Text Generation
Transformers
PyTorch
English
retnet
custom_code
syncdoth committed · verified
Commit ac26f6f · 1 Parent(s): af3b01e

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -10,7 +10,7 @@ language:
 
 This is a [RetNet](https://arxiv.org/abs/2307.08621) model, accompanying the paper [Cross-Architecture Transfer Learning for Linear-Cost Inference Transformers](https://arxiv.org/abs/2404.02684v1),
 In this work, we proposed to *not* train new Linear-Cost Inference models (e.g. RetNet) from scratch, but to transfer shared weight components from other PTLMs.
-The model's input/output embeddings, MLP weights, Layer Norms, Attention Output Projections ($W_O$) has been transferred from [pythia-1B](https://huggingface.co/EleutherAI/pythia-410m). For more detail, please refer to the paper.
+The model's input/output embeddings, MLP weights, Layer Norms, Attention Output Projections ($W_O$) has been transferred from [pythia-410m](https://huggingface.co/EleutherAI/pythia-410m). For more detail, please refer to the paper.
 
 ## Model Details
 
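
The transfer described in the README (copying the input/output embeddings, MLP weights, LayerNorms, and attention output projections $W_O$ from pythia-410m into the RetNet) can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the paper's actual code: the parameter-name filter and the commented-out RetNet loading lines are hypothetical placeholders.

```python
# Minimal sketch of the cross-architecture transfer described in the README diff above.
# Assumptions (not from this commit): the name filter below and the target RetNet
# loading lines are illustrative placeholders, not the paper's implementation.
from transformers import AutoModelForCausalLM

# Source PTLM whose shared components are reused (per the README: pythia-410m).
source = AutoModelForCausalLM.from_pretrained("EleutherAI/pythia-410m")
source_state = source.state_dict()

def is_shared(name: str) -> bool:
    # Heuristic filter for the components the README lists as transferred:
    # input/output embeddings, MLP weights, LayerNorms, and the attention
    # output projection (W_O, stored as `attention.dense` in GPT-NeoX models).
    return any(key in name for key in ("embed", "mlp", "layernorm", "attention.dense"))

transferred = {k: v for k, v in source_state.items() if is_shared(k)}
print(f"selected {len(transferred)} tensors for transfer")

# The RetNet target would then be loaded from this repo (it ships custom code,
# hence the `custom_code` tag) and initialized with the selected tensors, e.g.:
# retnet = AutoModelForCausalLM.from_pretrained("<this-repo>", trust_remote_code=True)
# retnet.load_state_dict(transferred, strict=False)
```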