secretflow.ml.linear.hess_sgd package#
Submodules#
secretflow.ml.linear.hess_sgd.model module#
Classes:

HESSLogisticRegression: Logistic regression linear model for the vertically partitioned dataset setting, using secret sharing and homomorphic encryption with a mini-batch SGD training solver.
- class secretflow.ml.linear.hess_sgd.model.HESSLogisticRegression(spu: SPU, heu_x: HEU, heu_y: HEU)[source]#
Bases:
object
This class provides a logistic regression linear model for the vertically partitioned dataset setting, using secret sharing and homomorphic encryption with a mini-batch SGD training solver. HESS-SGD is short for HE & secret sharing SGD training.
During training, the HEU is used to protect the weights and compute the predicted y, while the SPU is used to compute the sigmoid and the gradient.
SPU is a verifiable and measurable secure computing device that runs under various MPC protocols to provide provable security. More detail: https://spu.readthedocs.io/en/beta/
HEU is a secure computing device that implements HE encryption and decryption, and provides NumPy-like matrix operations, lowering the barrier to use. More detail: https://heu.readthedocs.io/en/latest/
For more detail, please refer to the KDD'21 paper: https://dl.acm.org/doi/10.1145/3447548.3467210
- Parameters
spu – SPU SPU device.
heu_x – HEU HEU device without label.
heu_y – HEU HEU device with label.
Notes
The training dataset should be normalized or standardized; otherwise the SGD solver will not converge.
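The note above matters in practice. A minimal plain-NumPy sketch of standardizing features before training (FedNdarray partitioning and secure-device handling are omitted; this only illustrates the preprocessing step, not the secretflow API):

```python
import numpy as np

# Toy feature matrix: rows are samples, columns are features.
# The second column has a much larger scale than the first.
x = np.array([[1.0, 200.0],
              [2.0, 400.0],
              [3.0, 600.0]])

# Standardize each column to zero mean and unit variance,
# as recommended before HESS-SGD training.
mean = x.mean(axis=0)
std = x.std(axis=0)
x_std = (x - mean) / std
```

In a real pipeline each party would standardize its own feature columns locally before the data is loaded into a FedNdarray or VDataFrame.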
Methods:

__init__(spu, heu_x, heu_y)
fit(x, y[, learning_rate, epochs, batch_size]): Fit linear model with Stochastic Gradient Descent.
save_model(): Save fit model in LinearModel format.
load_model(m): Load LinearModel format model.
predict(x): Probability estimates.
- fit(x: Union[FedNdarray, VDataFrame], y: Union[FedNdarray, VDataFrame], learning_rate=0.001, epochs=1, batch_size=None)[source]#
Fit linear model with Stochastic Gradient Descent.
- Parameters
x – {FedNdarray, VDataFrame} Input data; must be colocated with the SPU.
y – {FedNdarray, VDataFrame} Target data; must be located on self._heu_y.
learning_rate – float, default=1e-3. Learning rate.
epochs – int, default=1. Number of epochs to train the model.
batch_size – int, default=None. Number of samples per gradient update. If None, batch_size defaults to the total number of samples.
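As a point of reference for the parameters above, here is a plaintext mini-batch SGD logistic-regression loop in plain NumPy. It mirrors the gradient update that HESS-SGD performs, but without any secret sharing or HE protection, so it illustrates only the math of fit, not the secure protocol:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_plaintext(x, y, learning_rate=1e-3, epochs=1, batch_size=None):
    """Plain mini-batch SGD for logistic regression (weights + bias)."""
    n, d = x.shape
    if batch_size is None:      # mirrors the documented default:
        batch_size = n          # one batch containing all samples
    w = np.zeros(d + 1)         # last entry is the bias term
    xb = np.hstack([x, np.ones((n, 1))])
    for _ in range(epochs):
        for start in range(0, n, batch_size):
            bx = xb[start:start + batch_size]
            by = y[start:start + batch_size]
            pred = sigmoid(bx @ w)
            grad = bx.T @ (pred - by) / len(by)
            w -= learning_rate * grad
    return w

# Synthetic linearly separable data for a quick sanity check.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 3))
y = (x @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)
w = fit_plaintext(x, y, learning_rate=0.5, epochs=50, batch_size=32)
acc = ((sigmoid(np.hstack([x, np.ones((200, 1))]) @ w) > 0.5) == y).mean()
```

In HESS-SGD the same update is computed jointly: the weight and prediction arithmetic happens under HE on the HEU devices, while the sigmoid and gradient are evaluated inside the SPU.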
- save_model() LinearModel [source]#
Save fit model in LinearModel format.
- load_model(m: LinearModel) None [source]#
Load LinearModel format model.
- predict(x: Union[FedNdarray, VDataFrame]) PYUObject [source]#
Probability estimates.
- Parameters
x – {FedNdarray, VDataFrame} Predict samples.
- Returns
Probability of the sample for each class in the model.
- Return type
PYUObject
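predict returns per-sample probabilities, which are conceptually sigmoid outputs of the linear score. A plain-NumPy illustration of that computation (the real method returns a PYUObject on a secure device; the weights w and bias b below are hypothetical fitted values, not secretflow API objects):

```python
import numpy as np

def predict_proba(x, w, b):
    """Probability estimate for the positive class: sigmoid(x @ w + b)."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# Hypothetical fitted weights and bias for a 2-feature model.
w = np.array([0.8, -0.3])
b = 0.1
x = np.array([[1.0, 2.0],
              [0.0, 0.0]])
proba = predict_proba(x, w, b)
# Probabilities lie strictly in (0, 1); a zero linear score maps to 0.5.
```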
Module contents#
Classes:
HESSLogisticRegression: Logistic regression linear model for the vertically partitioned dataset setting, using secret sharing and homomorphic encryption with a mini-batch SGD training solver.
- class secretflow.ml.linear.hess_sgd.HESSLogisticRegression(spu: SPU, heu_x: HEU, heu_y: HEU)[source]#
Bases:
object
This class provides a logistic regression linear model for the vertically partitioned dataset setting, using secret sharing and homomorphic encryption with a mini-batch SGD training solver. HESS-SGD is short for HE & secret sharing SGD training.
During training, the HEU is used to protect the weights and compute the predicted y, while the SPU is used to compute the sigmoid and the gradient.
SPU is a verifiable and measurable secure computing device that runs under various MPC protocols to provide provable security. More detail: https://spu.readthedocs.io/en/beta/
HEU is a secure computing device that implements HE encryption and decryption, and provides NumPy-like matrix operations, lowering the barrier to use. More detail: https://heu.readthedocs.io/en/latest/
For more detail, please refer to the KDD'21 paper: https://dl.acm.org/doi/10.1145/3447548.3467210
- Parameters
spu – SPU SPU device.
heu_x – HEU HEU device without label.
heu_y – HEU HEU device with label.
Notes
The training dataset should be normalized or standardized; otherwise the SGD solver will not converge.
Methods:

__init__(spu, heu_x, heu_y)
fit(x, y[, learning_rate, epochs, batch_size]): Fit linear model with Stochastic Gradient Descent.
save_model(): Save fit model in LinearModel format.
load_model(m): Load LinearModel format model.
predict(x): Probability estimates.
- fit(x: Union[FedNdarray, VDataFrame], y: Union[FedNdarray, VDataFrame], learning_rate=0.001, epochs=1, batch_size=None)[source]#
Fit linear model with Stochastic Gradient Descent.
- Parameters
x – {FedNdarray, VDataFrame} Input data; must be colocated with the SPU.
y – {FedNdarray, VDataFrame} Target data; must be located on self._heu_y.
learning_rate – float, default=1e-3. Learning rate.
epochs – int, default=1. Number of epochs to train the model.
batch_size – int, default=None. Number of samples per gradient update. If None, batch_size defaults to the total number of samples.
- save_model() LinearModel [source]#
Save fit model in LinearModel format.
- load_model(m: LinearModel) None [source]#
Load LinearModel format model.
- predict(x: Union[FedNdarray, VDataFrame]) PYUObject [source]#
Probability estimates.
- Parameters
x – {FedNdarray, VDataFrame} Predict samples.
- Returns
Probability of the sample for each class in the model.
- Return type
PYUObject