rajabmondal committed
Commit 196b9de · verified · 1 Parent(s): 558d010

updated heading

Files changed (1):
  1. README.md +1 -3
README.md CHANGED
@@ -39,7 +39,7 @@ duplicated_from: bigcode-data/starcoderbase-1b
  ---
 
 
- # NT-Java
+ # NT-Java-1.1B
 
 
  ## Table of Contents
@@ -56,9 +56,7 @@ duplicated_from: bigcode-data/starcoderbase-1b
  The Narrow Transformer (NT) model NT-Java-1.1B is an open-source specialized code model built by extending pre-training on StarCoderBase-1B, designed for coding tasks in Java programming. The model is a decoder-only transformer with Multi-Query-Attention and with a context length of 8192 tokens. The model was trained with Java subset of the StarCoderData dataset, which is ~22B tokens.
 
  - **Repository:** [bigcode/Megatron-LM](https://github.com/bigcode-project/Megatron-LM)
- - **Project Website:**
  - **Paper:**
- - **Point of Contact:**
  - **Language(s):** Java
 
  ## Use
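
The README text in this diff describes NT-Java-1.1B as a plain decoder-only causal language model (a StarCoderBase-1B extension), so it should be loadable through the standard Hugging Face `transformers` API. A minimal sketch follows; the model id `infosys/NT-Java-1.1B` is an assumption not confirmed by this commit, so substitute the actual repository id for the checkpoint.

```python
# Minimal usage sketch for a decoder-only code model via transformers.
# NOTE: the model id below is assumed, not confirmed by the diff above.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "infosys/NT-Java-1.1B"  # hypothetical model id

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# Plain left-to-right Java completion; the 8192-token context window
# described in the README leaves room for long prompts.
prompt = "public static int fibonacci(int n) {"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the checkpoint extends StarCoderBase-1B, fill-in-the-middle prompting with the StarCoder FIM tokens may also work, but that is not stated in this excerpt.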