Optimizer
Two considerations:
- AdamW - robust, but memory-hungry (keeps two extra optimizer states per parameter)
- Adafactor - much leaner on memory, but harder to get to converge - more likely to be used if the model is T5-like (see the sketch below)
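A minimal sketch of the trade-off in code, using PyTorch's AdamW and the Adafactor implementation shipped with transformers; the hyperparameters are illustrative, not recommendations:

```python
# Minimal sketch: AdamW vs Adafactor for the same model.
# Hyperparameters here are illustrative, not tuned recommendations.
import torch
from transformers import Adafactor

model = torch.nn.Linear(1024, 1024)  # stand-in for a real model

# AdamW keeps two extra fp32 states (exp_avg, exp_avg_sq) per parameter,
# i.e. roughly 8 extra bytes per parameter on top of weights and gradients.
adamw = torch.optim.AdamW(model.parameters(), lr=1e-4,
                          betas=(0.9, 0.999), weight_decay=0.01)

# Adafactor factorizes the second-moment statistics, so its state is much
# smaller, but convergence is more sensitive to its lr/step settings.
adafactor = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,   # let Adafactor derive its own step size schedule
    warmup_init=True,
    lr=None,              # lr must be None when relative_step=True
)
```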
HF
Defaults to AdamW; Adafactor can be selected through the training arguments.
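With the HF Trainer the choice is made via `TrainingArguments`; this sketch assumes a recent transformers version where the `optim` argument exists (older versions used an `adafactor=True` flag instead):

```python
from transformers import TrainingArguments

# Default: AdamW - no optimizer-related argument needed.
args_default = TrainingArguments(output_dir="output")

# Opt into Adafactor instead of the default AdamW.
args_adafactor = TrainingArguments(output_dir="output", optim="adafactor")
```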
DeepSpeed
Default: AdamW - the choice is set in the `optimizer` block of the DeepSpeed config.
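A sketch of how the optimizer choice looks in a DeepSpeed config, written here as the Python dict that would be passed to `deepspeed.initialize()` (the values are placeholders):

```python
ds_config = {
    "train_micro_batch_size_per_gpu": 1,
    "optimizer": {
        "type": "AdamW",           # DeepSpeed resolves this to its AdamW implementation
        "params": {
            "lr": 1e-4,
            "betas": [0.9, 0.999],
            "eps": 1e-8,
            "weight_decay": 0.01,
        },
    },
}
# The same content as a JSON file can be passed via --deepspeed when
# launching through the HF Trainer.
```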
Megatron
Has `--optimizer adam`, implemented via apex's fused Adam.
To add a new optimizer one needs to add a new choice to that argument and import the new optimizer class (see the sketch below).
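A rough sketch of the kind of change this involves; the file names and dispatch function follow Megatron-LM's layout at the time of writing and are simplified here, and the Adafactor branch is just a hypothetical example of a new optimizer:

```python
# 1. In megatron/arguments.py, extend the choices of --optimizer, e.g.:
#    group.add_argument('--optimizer', type=str, default='adam',
#                       choices=['adam', 'sgd', 'adafactor'])

# 2. In megatron/optimizer/__init__.py, import and dispatch to the new class.
from apex.optimizers import FusedAdam as Adam      # existing apex-based Adam
from transformers.optimization import Adafactor    # hypothetical new import

def build_base_optimizer(args, param_groups):
    # Simplified stand-in for Megatron's get_megatron_optimizer() dispatch.
    if args.optimizer == 'adam':
        return Adam(param_groups, lr=args.lr,
                    weight_decay=args.weight_decay,
                    betas=(args.adam_beta1, args.adam_beta2),
                    eps=args.adam_eps)
    elif args.optimizer == 'adafactor':
        return Adafactor(param_groups, lr=args.lr,
                         scale_parameter=False, relative_step=False)
    raise ValueError(f'unknown optimizer {args.optimizer}')
```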