xmodaler.optim¶
- xmodaler.optim.build_optimizer(cfg: CfgNode, model: Module) → Optimizer [source]¶
Build an optimizer from config.
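The function dispatches on the config to construct one of the optimizer classes listed below. A minimal sketch of that name-to-class dispatch pattern, written against torch.optim directly; the dictionary, function name, and config handling here are illustrative stand-ins, not xmodaler's actual implementation:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the config-driven lookup: map an optimizer
# name (as a config would carry it) to a torch.optim class.
OPTIMIZERS = {
    "SGD": torch.optim.SGD,
    "Adam": torch.optim.Adam,
    "AdamW": torch.optim.AdamW,
}

def build_optimizer_sketch(name: str, model: nn.Module, lr: float):
    """Sketch of config-driven optimizer construction (illustrative only)."""
    opt_cls = OPTIMIZERS[name]
    return opt_cls(model.parameters(), lr=lr)

model = nn.Linear(4, 2)
optimizer = build_optimizer_sketch("Adam", model, lr=1e-3)
```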
- class xmodaler.optim.Adagrad(*, params, lr=0.01, lr_decay=0, weight_decay=0, initial_accumulator_value=0, eps=1e-10)[source]¶
Bases: Adagrad
- class xmodaler.optim.Adam(*, params, lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0, amsgrad=False)[source]¶
Bases: Adam
- class xmodaler.optim.AdamW(*, params, lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0.01, amsgrad=False)[source]¶
Bases: AdamW
- class xmodaler.optim.Adamax(*, params, lr=0.002, betas=(0.9, 0.999), eps=1e-08, weight_decay=0)[source]¶
Bases: Adamax
- class xmodaler.optim.RAdam(*, params, lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0)[source]¶
Bases: Optimizer
- step(closure=None)[source]¶
Performs a single optimization step (parameter update).
- Parameters:
closure (Callable) – A closure that reevaluates the model and returns the loss. Optional for most optimizers.
Note: Unless otherwise specified, this function should not modify the .grad field of the parameters.
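The closure protocol is the standard torch.optim one: the optimizer may call the closure to re-evaluate the loss before applying the update. A minimal sketch using torch.optim.SGD, which exposes the same step(closure) signature; RAdam is used the same way once constructed:

```python
import torch

# Minimize (w - 3)^2 with the closure form of step(). The closure
# zeroes the gradients, recomputes the loss, backpropagates, and
# returns the loss; step() applies the update without touching .grad.
w = torch.tensor([0.0], requires_grad=True)
opt = torch.optim.SGD([w], lr=0.1)

def closure():
    opt.zero_grad()
    loss = (w - 3.0).pow(2).sum()
    loss.backward()
    return loss

for _ in range(50):
    loss = opt.step(closure)
```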
- class xmodaler.optim.RMSprop(*, params, lr=0.01, alpha=0.99, eps=1e-08, weight_decay=0, momentum=0, centered=False)[source]¶
Bases: RMSprop
- class xmodaler.optim.SGD(*, params, lr=0.1, momentum=0, dampening=0, weight_decay=0, nesterov=False)[source]¶
Bases: SGD
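Note that every wrapper above takes keyword-only arguments (the leading * in each signature), so params and the hyperparameters must be passed by name. A minimal one-step training sketch, shown with torch.optim.SGD (which these classes wrap) and a toy regression batch:

```python
import torch
import torch.nn as nn

# One gradient step with SGD + momentum on a toy regression batch.
model = nn.Linear(3, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

x = torch.randn(8, 3)
y = torch.randn(8, 1)

w_before = model.weight.detach().clone()  # snapshot to see the update

opt.zero_grad()
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
opt.step()  # parameters move; gradients themselves are left intact
```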