XGBoost
A privacy-preserving federated implementation of the XGBoost library for parallel decision tree boosting. The scikit-learn API is supported (🔗Python API Reference — xgboost 1.6.1 documentation).
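Below is a minimal sketch of the scikit-learn-style XGBoost API that the linked reference documents, shown in plain (non-federated) form; the synthetic data and the chosen hyperparameters are illustrative, not platform requirements.

```python
# Plain scikit-learn-style XGBoost usage (non-federated), for reference.
from xgboost import XGBClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=12, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

model = XGBClassifier(n_estimators=100, max_depth=4)
model.fit(X_train, y_train)

labels = model.predict(X_test)        # hard class labels
proba = model.predict_proba(X_test)   # per-class probabilities
```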
Operation
- Use the XGBoost `train()`, `predict()`, and `predict_proba()` methods.
- When using `add_agreement()` to forge an agreement on a trained XGBoost model, use `Operation.EXECUTE` for the `operation` parameter.
- When using `add_agreement()` to allow a counterparty to use your dataset for model training, or when using `create_job()` to train an XGBoost model, use `Operation.XGBOOST_TRAIN` for the `operation` parameter (see the sketch after this list).
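A minimal sketch of the two agreement paths described above. Everything except `add_agreement()`, `create_job()`, `Operation.EXECUTE`, `Operation.XGBOOST_TRAIN`, and the `operation` parameter is a hypothetical placeholder, not this SDK's actual surface.

```python
# Hypothetical scaffolding: the import path, Session, get_asset(), and the
# with_party/dataset parameter names are placeholders. Only add_agreement(),
# create_job(), Operation.EXECUTE, Operation.XGBOOST_TRAIN, and the
# `operation` parameter come from this page.
from platform_sdk import Operation, Session  # placeholder import path

session = Session(api_token="...")                  # hypothetical setup
dataset = session.get_asset("training-dataset")     # hypothetical lookup
trained_model = session.get_asset("xgboost-model")  # hypothetical lookup

# Allow a counterparty to run inference on an already trained model.
trained_model.add_agreement(
    with_party="counterparty-id",   # hypothetical parameter name
    operation=Operation.EXECUTE,
)

# Allow a counterparty to train an XGBoost model on your dataset ...
dataset.add_agreement(
    with_party="counterparty-id",
    operation=Operation.XGBOOST_TRAIN,
)

# ... or submit the training job yourself.
job = session.create_job(
    operation=Operation.XGBOOST_TRAIN,
    dataset=dataset,                # hypothetical parameter name
)
```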
Parameters
Training parameters
- `dataset_path: str = ""`
- `regression: bool = False`
- `xgboost_params: Dict[str, Union[int, str]] = None` (refer to the 🔗scikit-learn wrapper in the docs)
- `test_size: float = 0.0`
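As an illustration only, the training parameters above might be bundled as follows; packaging them in a dict and the specific values chosen are assumptions, while the keys and defaults match the list.

```python
# Training parameters from the list above; bundling them in a dict and the
# values chosen here are illustrative assumptions.
training_params = {
    "dataset_path": "/data/train.csv",  # path to the owner's training data
    "regression": False,                # False -> classification task
    "test_size": 0.2,                   # hold out 20% of rows for evaluation
    # Forwarded to the scikit-learn wrapper (e.g. XGBClassifier kwargs):
    "xgboost_params": {
        "n_estimators": 100,
        "max_depth": 4,
    },
}
```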
Inference parameters
- `max_depth: int = -1`
- `features: int = -1`
- `model_path: str = ""`
- `regression: bool = False`
- `predict_proba: bool = False`
- `dataset_path: str = ""`
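A similar sketch for the inference side; the explicit values here are assumptions (the `-1` defaults above presumably let the platform infer them from the model).

```python
# Inference parameters from the list above; explicit values are assumptions
# (the -1 defaults presumably mean "infer from the trained model").
inference_params = {
    "model_path": "/models/xgb.model",  # trained model to run
    "dataset_path": "/data/score.csv",  # records to score
    "max_depth": 4,                     # depth of the trained forest
    "features": 12,                     # number of input features
    "regression": False,
    "predict_proba": True,              # return probabilities, not labels
}
```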
Limitations
- Only the `gbtree` booster is supported in SMPC inference.
- Higher-depth forests will cause extreme memory usage when used via SMPC (a mitigation is sketched below).
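Given both limitations, one way to stay inside them is to pin the booster and cap tree depth at training time; the specific cap of 4 is an assumed safe value, not a documented limit.

```python
# SMPC-friendly xgboost_params: pin the supported booster and keep trees
# shallow. The depth cap of 4 is an assumption, not a documented limit.
smpc_friendly_xgboost_params = {
    "booster": "gbtree",  # the only booster supported for SMPC inference
    "max_depth": 4,       # shallow trees keep SMPC memory usage manageable
    "n_estimators": 50,
}
```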