Update README.md
README.md
CHANGED
@@ -208,7 +208,7 @@ You can directly run and test the model with this [Colab notebook](https://colab
 You need to install [`vLLM`](https://github.com/vllm-project/vllm) v0.10.2 or a more recent version as follows:
 
 ```bash
-uv pip install vllm --extra-index-url https://wheels.vllm.ai/0.10.2/ --torch-backend=auto
+uv pip install vllm==0.10.2 --extra-index-url https://wheels.vllm.ai/0.10.2/ --torch-backend=auto
 ```
 
 Here is an example of how to use it for inference:
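As a quick sanity check after the pinned install, a minimal sketch (the expected version string assumes the `vllm==0.10.2` pin shown in the hunk above):

```python
# Verify that the pinned vLLM build is the one actually installed.
import vllm

print(vllm.__version__)  # expected: 0.10.2 with the pin above
```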
@@ -223,7 +223,7 @@ prompts = [
 ]
 sampling_params = SamplingParams(temperature=0.3, min_p=0.15, repetition_penalty=1.05)
 
-llm = LLM(model="LiquidAI/LFM2-
+llm = LLM(model="LiquidAI/LFM2-2.6B")
 
 outputs = llm.generate(prompts, sampling_params)
 
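For context, the hunk above shows only the changed line of the README's inference snippet. A minimal self-contained version might look like the sketch below; the import line, prompt text, and output loop are assumptions filled in around the lines visible in the diff:

```python
# Sketch of the full inference example around the lines shown in the diff.
# Only SamplingParams, LLM(...), and llm.generate(...) appear in the hunk;
# the rest is assumed scaffolding.
from vllm import LLM, SamplingParams

prompts = [
    "What is C. elegans?",  # hypothetical prompt, not from the README
]
sampling_params = SamplingParams(temperature=0.3, min_p=0.15, repetition_penalty=1.05)

llm = LLM(model="LiquidAI/LFM2-2.6B")

outputs = llm.generate(prompts, sampling_params)

# Print the generated completion for each prompt.
for output in outputs:
    print(output.outputs[0].text)
```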
@@ -257,10 +257,10 @@ LFM2 outperforms similar-sized models across different evaluation categories. We
 | Model                  | MMLU  | GPQA  | IFEval | IFBench | GSM8K | MGSM  | MMMLU |
 | ---------------------- | ----- | ----- | ------ | ------- | ----- | ----- | ----- |
 | LFM2-2.6B              | 64.42 | 26.57 | 79.56  | 22.19   | 82.41 | 74.32 | 55.39 |
-
+| Llama-3.2-3B-Instruct  | 60.35 | 30.6  | 71.43  | 20.78   | 75.21 | 61.68 | 47.92 |
 | SmolLM3-3B             | 59.84 | 26.31 | 72.44  | 17.93   | 81.12 | 68.72 | 50.02 |
 | gemma-3-4b-it          | 58.35 | 29.51 | 76.85  | 23.53   | 89.92 | 87.28 | 50.14 |
-
+| Qwen3-4B-Instruct-2507 | 72.25 | 34.85 | 85.62  | 30.28   | 68.46 | 81.76 | 60.67 |
 
 ## 📬 Contact
 