## Instrumenting your run
We assume you're following the structure of the [arch-and-scaling template](https://github.com/bigscience-workshop/bigscience/blob/master/train/arch-and-scaling-template.slurm).
Go to https://huggingface.co/ and create two model repos (currently: your icon at the top right → "New model"):
- <YOUR_MODEL_NAME>-checkpoints
- <YOUR_MODEL_NAME>-logs
In your output path (`DATA_OUTPUT_PATH` in the arch-and-scaling template), `git clone` the logs repo and rename the folder to `logs` (`mv <YOUR_MODEL_NAME>-logs logs`).
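A minimal sketch of that step, assuming the repos were created under the `bigscience` organization as in the clone examples below (adjust the URL to wherever you created yours):
```
cd <DATA_OUTPUT_PATH>
git clone https://huggingface.co/bigscience/<YOUR_MODEL_NAME>-logs
mv <YOUR_MODEL_NAME>-logs logs
```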

## How to sync your logs with the hub
`python tools/hub-sync.py --repo-path <DATA_OUTPUT_PATH>/logs/tensorboard/ --patterns "*tfevent*"`
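If you want the hub copy to stay current while the run is in flight, one option (a sketch, not part of the template) is to re-run the same command periodically:
```
# hypothetical helper: re-sync the tensorboard logs once an hour
while true; do
    python tools/hub-sync.py --repo-path <DATA_OUTPUT_PATH>/logs/tensorboard/ --patterns "*tfevent*"
    sleep 3600
done
```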

## How to sync your checkpoints with the hub
This is the latest version of what was used in [training 1](https://github.com/bigscience-workshop/bigscience/tree/master/train/tr1-13B-base).

Go to your `checkpoints` folder, which should contain a bunch of `global_stepXXXXXX` folders. Open a long-running interactive shell:
```
srun -p compil --cpus-per-task=40 -A six@cpu --time=6:00:00 --pty bash
```
Then convert:

```
time find * -maxdepth 0 -type d -name "global_step*" -exec $six_ALL_CCFRWORK/code/Megatron-DeepSpeed/tools/convert_checkpoint/deepspeed_to_transformers.py --input_folder {} --output_folder ../hf-fixed/{} \;
```
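To sanity-check the conversion on a single checkpoint first (`global_step1000` is a hypothetical example name), the same converter can be invoked on its own:
```
python $six_ALL_CCFRWORK/code/Megatron-DeepSpeed/tools/convert_checkpoint/deepspeed_to_transformers.py \
    --input_folder global_step1000 --output_folder ../hf-fixed/global_step1000
```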
Then prepare the target dir:

```
#git -c http.extraHeader="Authorization: Basic " clone https://huggingface.co/bigscience/<YOUR_REPO>/
cd <YOUR_REPO>
huggingface-cli lfs-enable-largefiles .
git config --unset user.email
# '*bogus*' matches no files; presumably this run only sets up the hub credentials
~/prod/code/bigscience/tools/hub-sync.py --repo-path . --patterns '*bogus*'
```
We are going to put each checkpoint into its own branch with the same name (the dense `find` one-liners below are unrolled into a step-by-step sketch after this list).
- If you have added tokenizer files:

```
mv ../hf-fixed/global_step* .
time find * -maxdepth 0 -type d -name "global_step*" -exec git checkout main \; -exec git checkout -b {} \; -exec mv {}/config.json . \; -exec mv {}/pytorch_model.bin . \; -exec git add config.json pytorch_model.bin <TOKENIZER_FILES> \; -exec git commit -m "add {}" \; -exec git push --set-upstream origin {} \; -exec mv config.json {}/ \; -exec mv pytorch_model.bin {}/ \;
git checkout main
```
- If you just want to add the checkpoints, without tokenizer files:

```
mv ../hf-fixed/global_step* .
time find * -maxdepth 0 -type d -name "global_step*" -exec git checkout main \; -exec git checkout -b {} \; -exec mv {}/config.json . \; -exec mv {}/pytorch_model.bin . \; -exec git add config.json pytorch_model.bin \; -exec git commit -m "add {}" \; -exec git push --set-upstream origin {} \; -exec mv config.json {}/ \; -exec mv pytorch_model.bin {}/ \;
git checkout main
```
- If you want to add tokenizer files later:

```
time find * -maxdepth 0 -type d -name "global_step*" -exec git checkout main \; -exec git checkout {} \; -exec git add <TOKENIZER_FILES> \; -exec git commit -m "add {}" \; -exec git push --set-upstream origin {} \;
git checkout main
```
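For reference, here is roughly what those `find` one-liners do, unrolled for a single hypothetical `global_step1000` (first variant, with `<TOKENIZER_FILES>` omitted for brevity):
```
git checkout main                      # branch off main every time
git checkout -b global_step1000        # one branch per checkpoint, same name
mv global_step1000/config.json .       # from_pretrained expects the files at the repo root
mv global_step1000/pytorch_model.bin .
git add config.json pytorch_model.bin
git commit -m "add global_step1000"
git push --set-upstream origin global_step1000
mv config.json global_step1000/        # move the files back so the next checkpoint starts clean
mv pytorch_model.bin global_step1000/
```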
## Fast branch switching in case you messed up and want to fix all your checkpoints
The key is `export GIT_LFS_SKIP_SMUDGE=1`, which lets you switch branches without downloading the large LFS files (the model weights).
Here's an example that changes the activation function in the `config.json` files for each branch:
```
export GIT_LFS_SKIP_SMUDGE=1
git clone https://huggingface.co/bigscience/tr3e-1B3-c4-checkpoints
cd tr3e-1B3-c4-checkpoints
~/prod/code/bigscience/tools/hub-sync.py --repo-path . --patterns '*bogus*'
set +H  # disable history expansion so the "!" in the regex below isn't mangled
git branch -a | sort -V | perl -lne 'm|(global_step\d+)| && print qx[git checkout $1; perl -pi -e "s/gelu(?!_)/gelu_fast/" $1/config.json; git commit -m "gelu_fast is the correct activation_function" .; git push --set-upstream origin $1]'
export GIT_LFS_SKIP_SMUDGE=0
```
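The same loop unrolled into plain bash, if the one-liner is hard to parse (a sketch under the same assumptions; the perl version above is what was actually used):
```
export GIT_LFS_SKIP_SMUDGE=1
for branch in $(git branch -a | grep -o 'global_step[0-9]*' | sort -uV); do
    git checkout "$branch"
    # in this layout, config.json still lives inside the global_step* subfolder
    perl -pi -e 's/gelu(?!_)/gelu_fast/' "$branch/config.json"
    git commit -m "gelu_fast is the correct activation_function" .
    git push --set-upstream origin "$branch"
done
export GIT_LFS_SKIP_SMUDGE=0
```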
And here is an example that fixes checkpoints in the old format (weights still inside a `global_step*` subfolder, no tokenizer files) so that they're compatible with `from_pretrained`:
```
export GIT_LFS_SKIP_SMUDGE=1
my_callback () {
  INDEX=${1}
  BRANCH=${2}
  # only touch the checkpoint branches, not main/HEAD
  if [[ $BRANCH == origin/global_step* ]]; then
    git checkout "${BRANCH:7}"   # strip the "origin/" prefix
    git mv "${BRANCH:7}"/* .     # hoist the files out of the subfolder to the repo root
    cp ../gpt2_tokenizer/tokenizer.json .
    git add tokenizer.json
    git commit -m "fixed checkpoints to be from_pretrained-compatible"
    git push
  fi
}
get_branches () {
  git branch --all --format='%(refname:short)'
}
# mapfile -t -C my_callback -c 1 BRANCHES < <( get_branches ) # if you want the branches that were sent to mapfile in a new array as well
# echo "${BRANCHES[@]}"
mapfile -t -C my_callback -c 1 < <( get_branches )
```
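As in the previous example, re-enable smudging once you're done:
```
export GIT_LFS_SKIP_SMUDGE=0
```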