Update README.md
README.md CHANGED
@@ -6,7 +6,7 @@ colorTo: yellow
 sdk: streamlit
 pinned: false
 ---
-We are a group of volunteer researchers dedicated to promoting equal access to multimodal and multilingual AI. Our goal is to build a permissive and open stack for developing multimodal LLMs. This initiative is a collaborative effort led by OntocordAI. We began as an effort named MDEL
+We are a group of volunteer researchers dedicated to promoting equal access to multimodal and multilingual AI. Our goal is to build a permissive and open stack for developing multimodal LLMs. This initiative is a collaborative effort led by OntocordAI. We began as an effort named MDEL [Multi-Domain Expert Learning](https://huggingface.co/Multi-Domain-Expert-Learning).
 
 The -m in Aurora-M2 refers to our focus on multimodal, multilingual, multidomain mixture-of-experts (MoE) models, each of which we aim to explore and develop through ongoing research.
 