Update README.md
This is a [RetNet](https://arxiv.org/abs/2307.08621) model, accompanying the paper [Cross-Architecture Transfer Learning for Linear-Cost Inference Transformers](https://arxiv.org/abs/2404.02684v1).

In this work, we propose *not* to train new Linear-Cost Inference models (e.g., RetNet) from scratch, but to transfer shared weight components from other pretrained language models (PTLMs).
The model's input/output embeddings, MLP weights, layer norms, and attention output projections ($W_O$) have been transferred from [pythia-410m](https://huggingface.co/EleutherAI/pythia-410m). For more detail, please refer to the paper.
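The transfer step can be sketched as a selective state-dict copy: shared components are copied over by name, while RetNet-specific parameters are left alone. The key names and patterns below are illustrative assumptions, not the actual checkpoint layout used in the paper.

```python
# Sketch: copy the weight components shared between a source Transformer
# (e.g. pythia-410m) and a target RetNet model. Key names and patterns
# are hypothetical; real checkpoints use their own naming.

SHARED_PATTERNS = (
    "embed_in",         # input embeddings
    "embed_out",        # output embeddings
    "mlp.",             # MLP weights
    "layernorm",        # layer norms
    "attention.dense",  # attention output projection W_O
)

def transfer_shared_weights(source_state, target_state, patterns=SHARED_PATTERNS):
    """Copy tensors whose names match a shared-component pattern and whose
    shapes agree in both state dicts; return the names that were copied."""
    transferred = []
    for name, weight in source_state.items():
        if name not in target_state:
            continue
        if not any(p in name for p in patterns):
            continue  # RetNet-specific parameter: leave as-is
        if getattr(weight, "shape", None) != getattr(target_state[name], "shape", None):
            continue  # dimensions differ: skip rather than copy
        target_state[name] = weight
        transferred.append(name)
    return transferred
```

Parameters not covered by the shared patterns (for instance, retention-specific weights) would then be initialized and trained as usual.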
## Model Details