XGBoost
A privacy-preserving federated implementation of the XGBoost library for parallel decision tree boosting. The scikit-learn API is supported (🔗 Python API Reference — xgboost 1.6.1 documentation).
Operation
- Use the XGBoost `train()`, `predict()`, and `predict_proba()` methods.
- When using `add_agreement()` to forge an agreement on a trained XGBoost model, use `Operation.EXECUTE` for the `operation` parameter.
- When using `add_agreement()` to allow a counterparty to use your dataset for model training, or when using `create_job()` to train an XGBoost model, use `Operation.XGBOOST_TRAIN` for the `operation` parameter (see the sketch after this list).
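The sketch below illustrates both agreement directions. It is a hedged example, not the SDK's exact signature: `add_agreement()`, `create_job()`, `Operation.EXECUTE`, and `Operation.XGBOOST_TRAIN` come from this page, while the `sdk` module name, the `Asset.find()` lookup, and the `with_org` keyword are hypothetical placeholders.

```python
# Hypothetical sketch; only add_agreement(), Operation.EXECUTE, and
# Operation.XGBOOST_TRAIN are confirmed names from this page.
import sdk                      # placeholder for the actual SDK module
from sdk import Operation

# Data owner: permit a counterparty to train on your dataset.
my_dataset = sdk.Asset.find("my-tabular-dataset")        # hypothetical lookup
my_dataset.add_agreement(
    with_org="counterparty-org-id",                      # hypothetical kwarg
    operation=Operation.XGBOOST_TRAIN,
)

# Model owner: permit inference against a trained XGBoost model.
trained_model = sdk.Asset.find("trained-xgboost-model")  # hypothetical lookup
trained_model.add_agreement(
    with_org="counterparty-org-id",
    operation=Operation.EXECUTE,
)
```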
Parameters
Training parameters
dataset_path: str = ""
regression: bool = False
xgboost_params: Dict[str, Union[int, str]] = None
- Refer to the 🔗 scikit-learn wrapper section of the XGBoost docs for the supported keys.
test_size: float = 0.0
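As a worked example of the training parameters, the snippet below is a hedged sketch: `create_job()` and `Operation.XGBOOST_TRAIN` are named on this page, but the call shape, the `params` keyword, and the specific `xgboost_params` keys (which follow the scikit-learn wrapper) are assumptions.

```python
# Hedged example; the create_job() call shape and params keyword are assumptions.
import sdk                      # placeholder module, as above
from sdk import Operation

xgboost_params = {
    "n_estimators": 100,        # example scikit-learn wrapper keys
    "max_depth": 4,
    "booster": "gbtree",
}

job = sdk.create_job(                       # assumed signature
    operation=Operation.XGBOOST_TRAIN,
    params={
        "dataset_path": "data/train.csv",   # dataset as registered with the SDK
        "regression": False,                # classification run in this example
        "xgboost_params": xgboost_params,
        "test_size": 0.2,                   # hold out 20% for evaluation
    },
)
```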
Inference parameters
max_depth: int = -1
features: int = -1
model_path: str = ""
regression: bool = False
predict_proba: bool = False
dataset_path: str = ""
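For illustration, a plausible inference configuration might look like the following. The keys are the parameters listed above; the example values, and the reading of the `-1`/`""` defaults as "unset" sentinels, are assumptions.

```python
# Assumed example values; parameter names come from this page.
inference_params = {
    "model_path": "models/xgb_model",   # trained model artifact
    "dataset_path": "data/infer.csv",   # rows to score
    "max_depth": 4,                     # depth of the trained forest
    "features": 12,                     # number of input features
    "regression": False,                # classification model
    "predict_proba": True,              # return class probabilities, not labels
}
```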
Limitations
- Only the `gbtree` booster is supported in SMPC inference.
- Higher-depth forests cause extreme memory usage when used via SMPC; see the configuration sketch below.
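Given these limitations, a cautious training configuration pins the booster and keeps trees shallow. The sketch below assumes scikit-learn wrapper keys in `xgboost_params`.

```python
# Conservative settings for later SMPC inference (keys assumed to follow
# the scikit-learn wrapper).
smpc_safe_params = {
    "booster": "gbtree",    # the only booster supported for SMPC inference
    "max_depth": 4,         # shallow trees limit SMPC memory usage
    "n_estimators": 100,
}
```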