lr_scheduling

Helper functions for configuring learning rate scheduling using Hydra.

chain_schedulers(optimizer: Optimizer, schedulers: List[Callable]) → ChainedScheduler

Chain multiple schedulers together using torch.optim.lr_scheduler.ChainedScheduler. The point of this wrapper function is to make chained schedulers easier to use in Hydra config files: the optimizer is passed only once, in the same way as for a single scheduler.

See configs/ml/model/schedulers/warmup_linear.yaml for an example.
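For illustration, a config of that shape might look roughly as follows. This is a hypothetical sketch, not the contents of the actual warmup_linear.yaml; it assumes Hydra's standard _target_/_partial_ instantiation syntax, and the module path lr_scheduling.chain_schedulers is likewise an assumption:

    _target_: lr_scheduling.chain_schedulers
    schedulers:
      - _target_: torch.optim.lr_scheduler.LinearLR
        _partial_: true
        start_factor: 0.1
        total_iters: 500
      - _target_: torch.optim.lr_scheduler.CosineAnnealingLR
        _partial_: true
        T_max: 10000

With _partial_: true, Hydra instantiates each entry as a callable that still expects the optimizer, which chain_schedulers then supplies once for all of them.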

Parameters:
  • optimizer – The optimizer to be scheduled.

  • schedulers – A list of partially initialized schedulers; each is a callable that takes the optimizer and returns an LRScheduler.

Returns:

A ChainedScheduler wrapping the fully instantiated schedulers.
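
A minimal sketch of how such a helper can be implemented and called, assuming PyTorch >= 2.0 (where LRScheduler is the public base class; older versions expose it as _LRScheduler). This is an illustration of the described behavior, not the module's actual source:

    from functools import partial
    from typing import Callable, List

    import torch
    from torch.optim import SGD, Optimizer
    from torch.optim.lr_scheduler import (
        ChainedScheduler,
        CosineAnnealingLR,
        LinearLR,
        LRScheduler,
    )


    def chain_schedulers(
        optimizer: Optimizer,
        schedulers: List[Callable[[Optimizer], LRScheduler]],
    ) -> ChainedScheduler:
        # Bind the shared optimizer to each partial scheduler, then chain
        # the fully constructed schedulers with torch's ChainedScheduler.
        return ChainedScheduler([make_scheduler(optimizer) for make_scheduler in schedulers])


    model = torch.nn.Linear(4, 2)
    optimizer = SGD(model.parameters(), lr=0.1)

    # Each list entry is a callable that still expects the optimizer,
    # mirroring what Hydra produces for a _partial_: true config entry.
    scheduler = chain_schedulers(
        optimizer,
        [
            partial(LinearLR, start_factor=0.1, total_iters=500),
            partial(CosineAnnealingLR, T_max=10_000),
        ],
    )

Note that ChainedScheduler applies every wrapped scheduler on each step() call; for schedulers that should run one after another in time, torch.optim.lr_scheduler.SequentialLR is the appropriate primitive instead.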