patrickvonplaten committed on
Commit
d398819
·
1 Parent(s): 2828cfa

Update README.md

Files changed (1)
  1. README.md +2 -4
README.md CHANGED
@@ -148,13 +148,11 @@ learning rate warmup for 10,000 steps and linear decay of the learning rate afte
 
 ## Evaluation results
 
-When fine-tuned on downstream tasks, this model achieves the following results:
-
-Glue test results:
+According to [the official paper](https://arxiv.org/abs/2105.03824) (*cf.* Table 1 on page 7), this model achieves the following performance on the GLUE test data:
 
 | Task | MNLI-(m/mm) | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | Average |
 |:----:|:-----------:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|:-------:|
-| | 72/73 | 84 | 80 | 95 | 69 | 79 | 76 | 63 | 76.7 |
+| | 72/73 | 83 | 80 | 95 | 69 | 79 | 76 | 63 | 76.7 |
 
 
 The following table contains test results on the HuggingFace model in comparison with [bert-base-cased](https://hf.co/models/bert-base-cased). The training was done on a single 16GB NVIDIA Tesla V100 GPU. For MRPC/WNLI, the models were trained for 5 epochs, while for other tasks, the models were trained for 3 epochs. Please refer to the checkpoints linked with the scores. The sequence length used was 512 with batch size 16 and learning rate 2e-5.