
Optimizer

Two considerations:

  1. AdamW - robust, but memory-hungry
  2. Adafactor - leaner, but harder to get to converge - more likely to be used if the model is T5-like
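The memory gap comes from the optimizer state: AdamW keeps two full-size fp32 tensors per parameter (first and second moment), while Adafactor factors the second moment of each (r, c) weight matrix into a row vector plus a column vector (and, when run without momentum, stores no full first moment). A rough back-of-envelope sketch, with hypothetical transformer layer shapes as the only assumption:

```python
# Back-of-envelope comparison of optimizer state size (element counts).
# Assumes AdamW keeps 2 full-size states per parameter (exp_avg, exp_avg_sq)
# and Adafactor (without beta1 momentum) keeps only the factored second
# moment: r + c elements per (r, c) matrix.

def adamw_state_elems(shapes):
    """AdamW: two full-size state tensors per parameter."""
    return sum(2 * r * c for r, c in shapes)

def adafactor_state_elems(shapes):
    """Adafactor (no momentum): factored second moment, r + c per matrix."""
    return sum(r + c for r, c in shapes)

# Hypothetical layer shapes for illustration (hidden=4096, ffn=16384).
shapes = [(4096, 4096)] * 4 + [(4096, 16384), (16384, 4096)]

adamw = adamw_state_elems(shapes)
adafactor = adafactor_state_elems(shapes)
print(f"AdamW state elems:     {adamw:,}")
print(f"Adafactor state elems: {adafactor:,}")
print(f"ratio: {adamw / adafactor:,.0f}x")
```

Note the caveat: enabling Adafactor's beta1 momentum brings back a full-size first-moment tensor and erases much of this saving.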

HF

default AdamW

Deepspeed

default AdamW
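In DeepSpeed the optimizer is selected in the JSON config via the "optimizer" block; a minimal example (the hyperparameter values here are illustrative, not recommendations):

```json
{
  "optimizer": {
    "type": "AdamW",
    "params": {
      "lr": 3e-5,
      "betas": [0.9, 0.999],
      "eps": 1e-8,
      "weight_decay": 0.01
    }
  }
}
```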

Megatron

Supports --optimizer adam, implemented via apex.

To add a new optimizer, add a new option to that flag's choices and import the new optimizer where the existing ones are constructed.
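The pattern can be sketched generically: a string flag selects the optimizer, so extending it means registering the new class and adding its name to the CLI choices. This is a minimal stand-alone illustration of that pattern; the class and function names below are hypothetical, not Megatron's actual internals.

```python
import argparse

# Illustrative stand-ins for real optimizer classes (e.g. apex's FusedAdam).
class Adam:
    def __init__(self, params, lr):
        self.params, self.lr = params, lr

class SGD:
    def __init__(self, params, lr):
        self.params, self.lr = params, lr

# Step 1: register the optimizer; its key becomes a valid --optimizer choice.
OPTIMIZERS = {"adam": Adam, "sgd": SGD}

def add_optimizer_args(parser):
    parser.add_argument("--optimizer", default="adam",
                        choices=sorted(OPTIMIZERS),
                        help="which optimizer to build")
    return parser

# Step 2: dispatch on the flag value when building the optimizer.
def build_optimizer(args, params, lr=1e-3):
    return OPTIMIZERS[args.optimizer](params, lr)

parser = add_optimizer_args(argparse.ArgumentParser())
args = parser.parse_args(["--optimizer", "sgd"])
opt = build_optimizer(args, params=[0.0])
print(type(opt).__name__)  # -> SGD
```

With a registry like this, adding an optimizer is a two-line change: one entry in the dict and one import at the top of the file.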