You can define a custom optimizer by pointing the `optimizer._target_` key to a class that extends `torch.optim.Optimizer`. The optimizer is configured by specifying its constructor arguments under the `optimizer` config key; see `config/optimizer/adamw.yaml` for an example.
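As a rough sketch, such a config might look like the following (the exact keys depend on the optimizer's constructor; the values here are illustrative, not the repository's actual settings):

```yaml
# Hypothetical sketch of config/optimizer/adamw.yaml (illustrative values)
_target_: torch.optim.AdamW   # fully qualified class path to instantiate
lr: 3e-4                      # constructor arguments passed to AdamW
weight_decay: 0.01
betas: [0.9, 0.999]
```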
To use a custom optimizer class, extend `torch.optim.Optimizer` and reference that class in your configuration. You can either define your optimizer in a new yaml file under the `configs/optimizer` directory and include it by referencing the new file's name:
```yaml
optimizer: <your-new-optimizer.yaml>
```
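
The referenced file would then declare your class via `_target_`. A minimal sketch, assuming a hypothetical `my_project.optim.MyOptimizer` class and illustrative hyperparameters:

```yaml
# Hypothetical configs/optimizer/<your-new-optimizer>.yaml
_target_: my_project.optim.MyOptimizer   # hypothetical custom class extending torch.optim.Optimizer
lr: 1e-3                                 # constructor arguments of your optimizer
momentum: 0.9
```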
Alternatively, you can extend your experiment configuration file and set the `optimizer._target_` key to the fully qualified Python import path of your optimizer class.
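For example, the override in an experiment config could look roughly like this (again assuming the hypothetical `my_project.optim.MyOptimizer` class):

```yaml
# Hypothetical override inside an experiment configuration file
optimizer:
  _target_: my_project.optim.MyOptimizer  # fully qualified import path of the custom class
  lr: 1e-3                                # any constructor arguments you need
```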