CTranslate2 does not currently support converting Mistral or Zephyr models out of the box.
This repository contains a custom-converted Mistral model, made possible by some changes to the CTranslate2 codebase. It was developed mainly for internal use, but you are welcome to use it if you are running into the same limitation.
Note: the model was converted with BFloat16 quantization.
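For reference, loading a CTranslate2-converted decoder-only model in Python typically looks like the sketch below. The model directory, tokenizer name, and generation parameters are placeholders and assumptions, not part of this repo; adjust them to your setup.

```python
# Sketch: loading a CTranslate2-converted Mistral model and generating text.
# Paths and tokenizer names below are illustrative assumptions.

def load_generator(model_dir: str, device: str = "cpu"):
    """Build a CTranslate2 Generator for the converted model.

    The import is deferred so this sketch can be read without
    ctranslate2 installed (`pip install ctranslate2`).
    """
    import ctranslate2
    # BFloat16 weights can be loaded directly; compute_type selects the
    # numeric type used at inference time.
    return ctranslate2.Generator(model_dir, device=device, compute_type="bfloat16")


def generate(generator, tokenizer, prompt: str, max_length: int = 64) -> str:
    """Tokenize a prompt, run generation, and decode the result."""
    tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(prompt))
    results = generator.generate_batch([tokens], max_length=max_length)
    return tokenizer.decode(
        tokenizer.convert_tokens_to_ids(results[0].sequences[0])
    )
```

A matching tokenizer (e.g. via `transformers.AutoTokenizer.from_pretrained(...)` for the original Mistral checkpoint) is needed alongside the converted weights, since CTranslate2 stores only the model itself.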
License: Apache-2.0