# fedadam

`FedAdam(learning_rate=0.1, betas=(0.9, 0.999), t=0.001)`
Bases: `FedOpt`
Attributes:

| Name | Type | Description |
|---|---|---|
| `learning_rate` | `float` | Learning rate. Defaults to 0.1. |
| `betas` | `Tuple[float, float]` | Coefficients used for computing running averages of the pseudo gradient and its square. Defaults to (0.9, 0.999). |
| `t` | `float` | Adaptivity parameter. Defaults to 0.001. |
Source code in iflearner/business/homo/strategy/opt/fedadam.py
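Assuming this class follows the standard server-side FedAdam rule (Reddi et al., "Adaptive Federated Optimization"), `betas` = $(\beta_1, \beta_2)$ and `t` = $\tau$ enter the update as:

$$
m_t = \beta_1 m_{t-1} + (1 - \beta_1)\,\Delta_t
$$

$$
v_t = \beta_2 v_{t-1} + (1 - \beta_2)\,\Delta_t^2
$$

$$
x_{t+1} = x_t + \eta\,\frac{m_t}{\sqrt{v_t} + \tau}
$$

where $\Delta_t$ is the pseudo gradient, $\eta$ is `learning_rate`, and $\tau$ bounds the per-coordinate step size (larger `t` makes the update less adaptive).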
`step(pseudo_gradient)`

Perform one optimization step on the server model parameters using the pseudo gradient.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `pseudo_gradient` | `Dict[str, npt.NDArray[np.float32]]` | The pseudo gradient of the server model. | required |
Returns:

| Type | Description |
|---|---|
| `Dict[str, npt.NDArray[np.float32]]` | Parameters of the server model after the step. |
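As a minimal, hypothetical sketch (not the actual iflearner implementation), a FedAdam server step over per-layer arrays can be written in plain NumPy. The function name, the explicit moment dictionaries `m` and `v`, and the use of `t` as the adaptivity constant τ are assumptions made for illustration:

```python
from typing import Dict, Tuple
import numpy as np

def fedadam_step(
    params: Dict[str, np.ndarray],
    pseudo_gradient: Dict[str, np.ndarray],
    m: Dict[str, np.ndarray],
    v: Dict[str, np.ndarray],
    learning_rate: float = 0.1,
    betas: Tuple[float, float] = (0.9, 0.999),
    t: float = 0.001,
) -> Dict[str, np.ndarray]:
    """Hypothetical FedAdam server step: update the running moments
    m and v in place, then move the server parameters along the
    adaptive direction."""
    beta1, beta2 = betas
    new_params = {}
    for name, g in pseudo_gradient.items():
        # Running first and second moment estimates of the pseudo gradient.
        m[name] = beta1 * m.get(name, np.zeros_like(g)) + (1 - beta1) * g
        v[name] = beta2 * v.get(name, np.zeros_like(g)) + (1 - beta2) * g * g
        # t plays the role of the adaptivity parameter tau,
        # keeping the denominator away from zero.
        new_params[name] = params[name] + learning_rate * m[name] / (
            np.sqrt(v[name]) + t
        )
    return new_params
```

The moment dictionaries `m` and `v` carry state across federated rounds, so the same instances must be passed to every call, mirroring how an optimizer object would hold them as attributes.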