Update README.md with distil-large-v3.5
README.md CHANGED
@@ -17,14 +17,15 @@ pinned: false
Distil-Whisper is a distilled version of Whisper that is **6 times faster**, 49% smaller, and performs **within 1% word
error rate (WER)** on out-of-distribution evaluation sets:

-| Model
-|
-| [large-v3](https://huggingface.co/openai/whisper-large-v3)
-|
-| [distil-large-v3](https://huggingface.co/distil-whisper/distil-large-v3)
-| [distil-large-
-| [distil-
-| [distil-
+| Model                                                                        | Params / M | Rel. Latency ↑ | Short-Form WER ↓ | Long-Form WER ↓ |
+|------------------------------------------------------------------------------|------------|----------------|------------------|-----------------|
+| [large-v3](https://huggingface.co/openai/whisper-large-v3)                   | 1550       | 1.0            | 8.4              | 11.0            |
+|                                                                              |            |                |                  |                 |
+| [distil-large-v3.5](https://huggingface.co/distil-whisper/distil-large-v3.5) | 756        |                | **7.08**         | 11.39           |
+| [distil-large-v3](https://huggingface.co/distil-whisper/distil-large-v3)     | 756        | 6.3            | 9.7              | **10.8**        |
+| [distil-large-v2](https://huggingface.co/distil-whisper/distil-large-v2)     | 756        | 5.8            | 10.1             | 11.6            |
+| [distil-medium.en](https://huggingface.co/distil-whisper/distil-medium.en)   | 394        | **6.8**        | 11.1             | 12.4            |
+| [distil-small.en](https://huggingface.co/distil-whisper/distil-small.en)     | **166**    | 5.6            | 12.1             | 12.8            |

For most applications, we recommend the latest [distil-large-v3](https://huggingface.co/distil-whisper/distil-large-v3) checkpoint,
since it is the most performant distilled checkpoint and compatible across all Whisper libraries. The only exception is
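
As a usage reference for the checkpoints compared above, here is a minimal sketch of transcribing an audio file with the Hugging Face Transformers `pipeline` API; the audio path `sample.flac` is a placeholder, and the model id can be swapped for any checkpoint in the table (e.g. `distil-whisper/distil-large-v3.5`).

```python
# Minimal sketch: transcribe a local audio file with the recommended checkpoint.
# Assumes the standard Hugging Face Transformers ASR pipeline; "sample.flac" is a placeholder path.
import torch
from transformers import pipeline

device = "cuda:0" if torch.cuda.is_available() else "cpu"
torch_dtype = torch.float16 if torch.cuda.is_available() else torch.float32

asr = pipeline(
    "automatic-speech-recognition",
    model="distil-whisper/distil-large-v3",  # or "distil-whisper/distil-large-v3.5"
    torch_dtype=torch_dtype,
    device=device,
)

# chunk_length_s splits long audio into windows so long-form files can be transcribed
result = asr("sample.flac", chunk_length_s=25, batch_size=8)
print(result["text"])
```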