xmodaler.optim

xmodaler.optim.build_optimizer(cfg: CfgNode, model: Module) → Optimizer[source]

Build an optimizer from config.
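build_optimizer reads the optimizer choice from the config and instantiates the matching class through its from_config classmethod. The snippet below is a minimal sketch of that config-driven dispatch pattern; the registry, config keys, and ToySGD class are illustrative assumptions, not xmodaler's actual code.

```python
# Sketch of config-to-optimizer dispatch. The registry and config
# shape here are assumptions for illustration, not xmodaler's code.

OPTIMIZER_REGISTRY = {}

def register(name):
    def deco(cls):
        OPTIMIZER_REGISTRY[name] = cls
        return cls
    return deco

@register("SGD")
class ToySGD:
    def __init__(self, params, lr):
        self.params, self.lr = params, lr

    @classmethod
    def from_config(cls, cfg, params):
        # Pull hyperparameters out of the config node.
        return cls(params, lr=cfg["LR"])

def build_optimizer(cfg, params):
    # Look up the optimizer class named in the config and build it.
    return OPTIMIZER_REGISTRY[cfg["NAME"]].from_config(cfg, params)

opt = build_optimizer({"NAME": "SGD", "LR": 0.1}, params=[0.0])
```

Each optimizer class below follows the same contract: a keyword-only constructor plus a from_config(cfg, params) classmethod that extracts its hyperparameters from the config node.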

class xmodaler.optim.Adagrad(*, params, lr=0.01, lr_decay=0, weight_decay=0, initial_accumulator_value=0, eps=1e-10)[source]

Bases: Adagrad

__init__(*, params, lr=0.01, lr_decay=0, weight_decay=0, initial_accumulator_value=0, eps=1e-10)[source]
classmethod from_config(cfg, params)[source]
class xmodaler.optim.Adam(*, params, lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0, amsgrad=False)[source]

Bases: Adam

__init__(*, params, lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0, amsgrad=False)[source]
classmethod from_config(cfg, params)[source]
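The defaults above (lr=0.001, betas=(0.9, 0.999), eps=1e-08) match the standard Adam update rule. As a worked sketch of what one Adam step computes, here is the update on a single scalar parameter; this illustrates the algorithm, not xmodaler's implementation.

```python
import math

def adam_step(p, g, state, lr=0.001, betas=(0.9, 0.999), eps=1e-8):
    # One Adam update on a scalar parameter p with gradient g.
    # state carries the running moment estimates and step count.
    b1, b2 = betas
    state["t"] += 1
    state["m"] = b1 * state["m"] + (1 - b1) * g      # first moment (mean)
    state["v"] = b2 * state["v"] + (1 - b2) * g * g  # second moment (variance)
    m_hat = state["m"] / (1 - b1 ** state["t"])      # bias correction
    v_hat = state["v"] / (1 - b2 ** state["t"])
    return p - lr * m_hat / (math.sqrt(v_hat) + eps)

state = {"t": 0, "m": 0.0, "v": 0.0}
p = adam_step(1.0, 0.5, state)  # first step moves p by roughly lr
```

Note that after bias correction the very first step has magnitude close to lr regardless of the gradient's scale.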
class xmodaler.optim.AdamW(*, params, lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0.01, amsgrad=False)[source]

Bases: AdamW

__init__(*, params, lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0.01, amsgrad=False)[source]
classmethod from_config(cfg, params)[source]
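AdamW's signature differs from Adam's in its default weight_decay=0.01, which is applied decoupled from the adaptive gradient step rather than folded into the gradient as L2 regularization. A scalar sketch of that decoupling (illustrative, not xmodaler's code):

```python
import math

def adamw_step(p, g, state, lr=0.001, betas=(0.9, 0.999),
               eps=1e-8, weight_decay=0.01):
    # AdamW: weight decay shrinks the parameter directly, decoupled
    # from the adaptive moment-based update.
    b1, b2 = betas
    state["t"] += 1
    state["m"] = b1 * state["m"] + (1 - b1) * g
    state["v"] = b2 * state["v"] + (1 - b2) * g * g
    m_hat = state["m"] / (1 - b1 ** state["t"])
    v_hat = state["v"] / (1 - b2 ** state["t"])
    p = p * (1 - lr * weight_decay)  # decoupled decay term
    return p - lr * m_hat / (math.sqrt(v_hat) + eps)

state = {"t": 0, "m": 0.0, "v": 0.0}
p = adamw_step(1.0, 0.5, state)
```

Because the decay multiplies the parameter itself, its strength is independent of the per-coordinate adaptive scaling, which is the point of AdamW.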
class xmodaler.optim.Adamax(*, params, lr=0.002, betas=(0.9, 0.999), eps=1e-08, weight_decay=0)[source]

Bases: Adamax

__init__(*, params, lr=0.002, betas=(0.9, 0.999), eps=1e-08, weight_decay=0)[source]
classmethod from_config(cfg, params)[source]
class xmodaler.optim.RAdam(*, params, lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0)[source]

Bases: Optimizer

__init__(*, params, lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0)[source]
classmethod from_config(cfg, params)[source]
step(closure=None)[source]

Performs a single optimization step (parameter update).

Parameters:

closure (Callable) – A closure that reevaluates the model and returns the loss. Optional for most optimizers.

Note

Unless otherwise specified, this function should not modify the .grad field of the parameters.
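RAdam's distinguishing behavior inside step is rectifying the adaptive learning rate while the second-moment estimate still has high variance (early steps). The following sketch computes the rectification term following the RAdam paper; it is an illustration, not xmodaler's exact code.

```python
import math

def radam_rect(t, beta2=0.999):
    # rho_t approximates the length of the SMA at step t; when it is
    # small (<= 4) the adaptive variance is intractable and the update
    # falls back to an un-adapted momentum step (returned as None).
    rho_inf = 2.0 / (1.0 - beta2) - 1.0
    rho_t = rho_inf - 2.0 * t * beta2 ** t / (1.0 - beta2 ** t)
    if rho_t <= 4.0:
        return rho_t, None
    # Rectification multiplier applied to the adaptive step size.
    r_t = math.sqrt(
        (rho_t - 4.0) * (rho_t - 2.0) * rho_inf
        / ((rho_inf - 4.0) * (rho_inf - 2.0) * rho_t)
    )
    return rho_t, r_t

rho1, r1 = radam_rect(1)      # early step: no adaptive update
rho100, r100 = radam_rect(100)  # later step: rectifier in (0, 1)
```

The rectifier r_t grows toward 1 as training proceeds, so RAdam smoothly converges to plain Adam behavior.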

class xmodaler.optim.RMSprop(*, params, lr=0.01, alpha=0.99, eps=1e-08, weight_decay=0, momentum=0, centered=False)[source]

Bases: RMSprop

__init__(*, params, lr=0.01, alpha=0.99, eps=1e-08, weight_decay=0, momentum=0, centered=False)[source]
classmethod from_config(cfg, params)[source]
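RMSprop scales each step by a running average of squared gradients, controlled by alpha, with optional momentum on the resulting step. A scalar sketch of the non-centered update (illustrative only):

```python
import math

def rmsprop_step(p, g, state, lr=0.01, alpha=0.99, eps=1e-8, momentum=0.0):
    # Exponential moving average of squared gradients normalizes the step.
    state["sq"] = alpha * state["sq"] + (1 - alpha) * g * g
    step = g / (math.sqrt(state["sq"]) + eps)
    if momentum:
        # Optional momentum buffer accumulates the normalized steps.
        state["buf"] = momentum * state.get("buf", 0.0) + step
        step = state["buf"]
    return p - lr * step

state = {"sq": 0.0}
p = rmsprop_step(1.0, 0.5, state)
```

With centered=True the variance estimate is additionally centered by subtracting the squared mean of gradients, which the sketch omits.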
class xmodaler.optim.SGD(*, params, lr=0.1, momentum=0, dampening=0, weight_decay=0, nesterov=False)[source]

Bases: SGD

__init__(*, params, lr=0.1, momentum=0, dampening=0, weight_decay=0, nesterov=False)[source]
classmethod from_config(cfg, params)[source]
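The momentum, dampening, and nesterov parameters interact in the standard torch-style SGD update. A scalar sketch (without dampening, which would scale the gradient's contribution to the buffer):

```python
def sgd_step(p, g, state, lr=0.1, momentum=0.0, nesterov=False):
    # Plain SGD with optional (Nesterov) momentum on a scalar parameter.
    if momentum:
        state["buf"] = momentum * state.get("buf", 0.0) + g
        # Nesterov looks ahead along the momentum direction.
        g = g + momentum * state["buf"] if nesterov else state["buf"]
    return p - lr * g

state = {}
p = sgd_step(1.0, 0.5, state)  # 1.0 - 0.1 * 0.5 = 0.95
```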
class xmodaler.optim.BertAdam(*, params, lr=0.001, b1=0.9, b2=0.999, eps=1e-06, weight_decay=0.01, max_grad_norm=1.0)[source]

Bases: Optimizer

__init__(*, params, lr=0.001, b1=0.9, b2=0.999, eps=1e-06, weight_decay=0.01, max_grad_norm=1.0)[source]
classmethod from_config(cfg, params)[source]
step(closure=None)[source]

Performs a single optimization step.

Parameters:

closure (Callable, optional) – A closure that reevaluates the model and returns the loss.
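Two differences from stock Adam are visible in BertAdam's signature: gradients are clipped against max_grad_norm before the update, and weight_decay is applied to the parameter inside the step. The sketch below shows this shape on a scalar parameter, without the bias correction that standard Adam applies; it is an assumption-laden illustration, not xmodaler's code (real norm clipping acts on the whole gradient tensor, not a scalar clamp).

```python
import math

def bert_adam_step(p, g, state, lr=0.001, b1=0.9, b2=0.999,
                   eps=1e-6, weight_decay=0.01, max_grad_norm=1.0):
    # Clip the gradient, update the (uncorrected) moment estimates,
    # then fold weight decay into the update. Scalar sketch only.
    if max_grad_norm > 0:
        g = max(-max_grad_norm, min(max_grad_norm, g))
    state["m"] = b1 * state["m"] + (1 - b1) * g
    state["v"] = b2 * state["v"] + (1 - b2) * g * g
    update = state["m"] / (math.sqrt(state["v"]) + eps)
    update += weight_decay * p
    return p - lr * update

state = {"m": 0.0, "v": 0.0}
p = bert_adam_step(1.0, 5.0, state)  # gradient of 5.0 is clipped to 1.0
```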