Frameworks, fusion methods, and Python versions
These are the machine learning frameworks and model fusion methods that are available for building a Federated Learning model. Each framework and software spec is also compatible with specific Python versions.
Frameworks and fusion methods
This table lists the supported software frameworks for building Federated Learning models. For each framework, you can see the supported model types, fusion methods, and hyperparameter options. A sketch of the two averaging-based fusion methods follows the table.
| Frameworks | Model Type | Fusion Method | Description | Hyperparameters |
|---|---|---|---|---|
| TensorFlow. Used to build neural networks. See Save the Tensorflow model. | Any | Simple Avg | Simplest aggregation that is used as a baseline where all parties' model updates are equally weighted. | Rounds, Termination predicate (Optional), Quorum (Optional), Max Timeout (Optional) |
| | Any | Weighted Avg | Weights the average of updates based on the number of each party's samples. Use with training data sets of widely differing sizes. | Rounds, Termination predicate (Optional), Quorum (Optional), Max Timeout (Optional) |
| Scikit-learn. Used for predictive data analysis. See Save the Scikit-learn model. | Classification | Simple Avg | Simplest aggregation that is used as a baseline where all parties' model updates are equally weighted. | Rounds, Termination predicate (Optional) |
| | Classification | Weighted Avg | Weights the average of updates based on the number of each party's samples. Use with training data sets of widely differing sizes. | Rounds, Termination predicate (Optional) |
| | Regression | Simple Avg | Simplest aggregation that is used as a baseline where all parties' model updates are equally weighted. | |
| | Regression | Weighted Avg | Weights the average of updates based on the number of each party's samples. Use with training data sets of widely differing sizes. | |
| | XGBoost | XGBoost Classification | Use to build classification models that use XGBoost. | Learning rate, Loss, Rounds, Number of classes |
| | XGBoost | XGBoost Regression | Use to build regression models that use XGBoost. | Learning rate, Rounds, Loss |
| | K-Means/SPAHM | K-Means/SPAHM | Used to train KMeans (unsupervised learning) models when parties have heterogeneous data sets. | Max Iter, N cluster |
| Pytorch. Used for training neural network models. See Save the Pytorch model. | Any | Simple Avg | Simplest aggregation that is used as a baseline where all parties' model updates are equally weighted. | Rounds, Epochs, Quorum (Optional), Max Timeout (Optional) |
| | Neural Networks | Probabilistic Federated Neural Matching (PFNM) | Communication-efficient method for fully connected neural networks when parties have heterogeneous data sets. | Rounds, Termination accuracy (Optional), Epochs, sigma, sigma0, gamma, iters |
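To make the two averaging-based fusion methods in the table concrete, here is a minimal sketch of the aggregation math only. It is not the IBM Federated Learning implementation; it assumes each party's model update arrives as a list of NumPy arrays (one per layer or coefficient block), and the function and variable names are illustrative.

```python
# Illustrative sketch of Simple Avg and Weighted Avg fusion (not the IBM
# Federated Learning implementation). Assumes each party's update is a list
# of NumPy arrays, one per layer or coefficient block.
import numpy as np

def simple_avg(party_updates):
    """Simple Avg: every party's update is weighted equally."""
    return [np.mean(layers, axis=0) for layers in zip(*party_updates)]

def weighted_avg(party_updates, sample_counts):
    """Weighted Avg: each party's update is weighted by its sample count."""
    weights = np.asarray(sample_counts, dtype=float)
    weights /= weights.sum()
    return [
        sum(w * layer for w, layer in zip(weights, layers))
        for layers in zip(*party_updates)
    ]

# Example: two parties, each sending one 2x2 weight matrix
party_a = [np.array([[1.0, 2.0], [3.0, 4.0]])]
party_b = [np.array([[5.0, 6.0], [7.0, 8.0]])]
print(simple_avg([party_a, party_b]))                # plain mean of both updates
print(weighted_avg([party_a, party_b], [100, 300]))  # party_b has 3x the samples
```

The weighted variant reflects the table's guidance: when training data sets differ widely in size, weighting by sample count gives each training example, rather than each party, roughly equal influence on the fused model.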
Software specifications and Python version by framework
This table lists the software specification and Python version available for each framework. A sketch for checking your locally installed package versions follows the table.
| Watson Studio frameworks | Python version | Software Spec | Python Client Extras | Framework package |
|---|---|---|---|---|
| scikit-learn | 3.11 | runtime-24.1-py3.11 | fl-rt23.1-py3.11 | scikit-learn 1.1.1 |
| TensorFlow | 3.11 | runtime-24.1-py3.11 | fl-rt23.1-py3.11 | tensorflow 2.12.0 |
| PyTorch | 3.11 | runtime-24.1-py3.11 | fl-rt23.1-py3.11 | torch 2.0.1 |
| scikit-learn | 3.10 | runtime-23.1-py3.10 | fl-rt23.1-py3.10 | scikit-learn 1.1.1 |
| TensorFlow | 3.10 | runtime-23.1-py3.10 | fl-rt23.1-py3.10 | tensorflow 2.12.0 |
| PyTorch | 3.10 | runtime-23.1-py3.10 | fl-rt23.1-py3.10 | torch 2.0.1 |
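Before training, it can help to confirm that your local environment matches the framework package versions in this table. This is a minimal sketch, assuming the packages are installed under the pip names shown in the Framework package column; adjust it to the frameworks you actually use.

```python
# Quick check that locally installed framework versions match the table above.
# Illustrative only; edit the dictionary for the frameworks you actually use.
from importlib.metadata import PackageNotFoundError, version

expected = {
    "scikit-learn": "1.1.1",
    "tensorflow": "2.12.0",
    "torch": "2.0.1",
}

for package, wanted in expected.items():
    try:
        installed = version(package)
    except PackageNotFoundError:
        installed = "not installed"
    status = "OK" if installed == wanted else "check"
    print(f"{package}: installed={installed}, expected={wanted} [{status}]")
```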
Parent topic: IBM Federated Learning