Model not found?

#1, opened by balnazzar (MLX Community org):

I'm trying to use the model with:

mlx_lm.generate --model mlx-community/Qwen3-235B-A22B-8bit --prompt "tell me a story"

The download actually starts, but after a short while a "model not found" exception is raised repeatedly.

(screenshot of the exception attached: Screenshot 2025-05-12 at 19.23.32.png)
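For anyone reproducing this, here is a minimal sketch of the same call through mlx_lm's Python API (load and generate are the package's public entry points; the repo id and prompt are taken from the command above). Running it can help separate a CLI problem from a download problem:

from mlx_lm import load, generate

# Downloads the repo to the Hugging Face cache on first use, then loads the weights.
model, tokenizer = load("mlx-community/Qwen3-235B-A22B-8bit")

# Streams the generated text to stdout when verbose=True.
text = generate(model, tokenizer, prompt="tell me a story", verbose=True)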

awni (MLX Community org):

The error message is likely wrong (fixed in https://github.com/ml-explore/mlx-lm/pull/173). Possibly you ran out of disk space, or some other failure happened.
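One way to check both free space and how much of the model actually landed on disk, assuming the default Hugging Face hub cache location (the models--org--name directory layout is the hub's standard; adjust the path if HF_HOME or HF_HUB_CACHE is set):

import shutil
from pathlib import Path

# Default Hugging Face hub cache.
cache = Path.home() / ".cache" / "huggingface" / "hub"

# Free space on the volume holding the cache.
total, used, free = shutil.disk_usage(cache if cache.exists() else Path.home())
print(f"free: {free / 1e9:.1f} GB")

# Size of whatever has been downloaded for this model so far.
snapshot = cache / "models--mlx-community--Qwen3-235B-A22B-8bit"
if snapshot.exists():
    size = sum(f.stat().st_size for f in snapshot.rglob("*") if f.is_file())
    print(f"cached so far: {size / 1e9:.1f} GB")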

balnazzar (MLX Community org):

Thanks for your reply, Awni.
No, I have plenty of disk space available, so I'd say it's some other failure. I tried the 4-bit quant and it fails too.
Strangely, the 3-6 bit mixed quant works. Every other model I've pulled from mlx-community also works fine.
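If a truncated or corrupted earlier download is the cause, re-fetching the repo can rule that out. A sketch using huggingface_hub (snapshot_download and its force_download flag are part of that library's public API; the repo id comes from this thread):

from huggingface_hub import snapshot_download

# force_download=True re-downloads the files even if they are already cached,
# replacing any truncated or corrupted copies from an interrupted pull.
path = snapshot_download(
    repo_id="mlx-community/Qwen3-235B-A22B-8bit",
    force_download=True,
)
print(path)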
