Update README.md
README.md CHANGED
@@ -20,7 +20,10 @@ The config looks like this...(detailed version is in the files and versions):
 - [rwitz/go-bruins-v2](https://huggingface.co/rwitz/go-bruins-v2) - expert #2
 - [mlabonne/Beagle14-7B](https://huggingface.co/mlabonne/Beagle14-7B) - expert #3
 - [mlabonne/Beagle14-7B](https://huggingface.co/mlabonne/Beagle14-7B) - expert #4
+
+
 
+
 # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
 ### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
 
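
For readers following the "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)" link in the hunk above, the snippet below is a minimal sketch of the top-k routing idea that post describes, assuming a Mixtral-style top-2 router over four feed-forward experts (matching the number of experts listed in this config). It is an illustration only, not code from this repository; all class and parameter names are made up for the example.

```python
# Hedged illustration only: a minimal sparse MoE layer with a top-2 router,
# assuming 4 feed-forward experts. Not this repository's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoE(nn.Module):
    def __init__(self, d_model: int = 64, d_ff: int = 256, n_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # The router (gate) scores every expert for every token.
        self.router = nn.Linear(d_model, n_experts, bias=False)
        # Each "expert" is an ordinary feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Pick the top-k experts per token and mix their
        # outputs, weighted by the renormalized router probabilities.
        scores = self.router(x)                           # (tokens, n_experts)
        weights, chosen = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # renormalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e               # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out


if __name__ == "__main__":
    moe = SparseMoE()
    tokens = torch.randn(8, 64)
    print(moe(tokens).shape)  # torch.Size([8, 64])
```

At inference time only the selected experts run for each token, which is why a sparse MoE activates far fewer parameters per token than its total parameter count suggests.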