saneowl committed on
Commit ec7d67c · verified · 1 Parent(s): 9e7c31d

correct mistake in readme

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -55,7 +55,7 @@ The LLaMA architecture, developed by Meta AI, is a family of efficient transform
 ## Random-Llama-Small Specifics
 
 This model uses random weights and:
-- Has ~1.52B parameters across 22 layers.
+- Has ~2B parameters across 22 layers.
 - Uses a 2304 hidden size and 9216 FFN size.
 - Supports 128K+ vocab tokens and bfloat16 precision.
 - Supports extended context lengths of 131,072 tokens.
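The corrected figure can be sanity-checked with a back-of-the-envelope parameter count from the dimensions in the diff. This is only a sketch under assumptions the README does not state: full multi-head attention (no GQA), a gated LLaMA-style MLP (gate/up/down projections), tied input/output embeddings, and a vocab size of 128256 (the README only says "128K+").

```python
def llama_param_count(hidden=2304, ffn=9216, layers=22, vocab=128256):
    """Rough LLaMA-style parameter count for the dimensions in the diff.

    vocab=128256 is an assumed value; the README only says 128K+.
    """
    attn = 4 * hidden * hidden   # q, k, v, o projections (full MHA assumed, no GQA)
    mlp = 3 * hidden * ffn       # gate, up, down projections of a gated MLP
    norms = 2 * hidden           # two RMSNorm weight vectors per layer
    per_layer = attn + mlp + norms
    embed = vocab * hidden       # token embeddings (assumed tied with the LM head)
    return layers * per_layer + embed + hidden  # + final RMSNorm

total = llama_param_count()
print(f"~{total / 1e9:.2f}B parameters")
```

Under these assumptions the total lands near 2.16B, so "~2B across 22 layers" is plausible while the original "~1.52B" was not: the 22 transformer blocks alone contribute about 1.87B parameters.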