Update README.md
README.md
CHANGED
@@ -10,7 +10,7 @@ We are a group of volunteer researchers dedicated to promoting equal access to m
The -m in Aurora-M2 refers to our focus on multimodal, multilingual, multidomain mixture-of-experts (MoE) models, each of which we aim to explore and develop through ongoing research.

- Building on our previous success—
+ Building on our previous success— [Aurora-M: Open Source Continual Pre-training for Multilingual Language and Code](https://aclanthology.org/2025.coling-industry.56/) — we are training a family of models aligned with laws, regulations, and policies for controllable AI. The series will include models with parameter sizes of 3B, 8B, and 21B, aligned with the comprehensive policy framework of the EU AI Act, specifically Annex III of the Act.

As part of our commitment to openness, we plan to open-source the entire training pipeline and experimental process—including data synthesis and the evolving methodologies we employ in model training. Stay with us!