AutoRound (https://github.com/intel/auto-round) Now Supports GGUF Export & Custom Bit Settings!
We're excited to announce that AutoRound now supports:
✅ GGUF format export – for seamless compatibility with popular inference engines.
✅ Custom bit settings – tailor quantization to your needs for optimal performance.
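For anyone who wants to try it, here is a minimal sketch of how the flow might look. The keyword names, the layer name used in the per-layer override, and the exact GGUF format string are assumptions on my part; check the AutoRound README for the options your installed version supports.

```python
# Minimal sketch (assumed API, not an official example): quantize a model with
# AutoRound, apply a custom per-layer bit setting, and export to GGUF.
from auto_round import AutoRound

# Any Hugging Face model id; this one is chosen purely for illustration.
model_name = "Qwen/Qwen2.5-0.5B-Instruct"

# Hypothetical per-layer override: keep one projection at 8 bits while the
# rest of the model is quantized at the default 4 bits.
layer_config = {
    "model.layers.0.self_attn.v_proj": {"bits": 8},
}

autoround = AutoRound(
    model_name,
    bits=4,                    # default bit width for all other layers
    group_size=128,
    layer_config=layer_config, # custom bit settings per layer
)

# Export directly to GGUF; "gguf:q4_k_m" is shown as an example format string.
autoround.quantize_and_save("./qwen2.5-0.5b-gguf", format="gguf:q4_k_m")
```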
Check out these newly released models:
🔹 Intel/Qwen3-235B-A22B-Instruct-2507-gguf-q4km-AutoRound
🔹 Intel/Qwen3-235B-A22B-Instruct-2507-gguf-q2ks-mixed-AutoRound
🔹 Intel/Kimi-K2-Instruct-gguf-q2ks-mixed-AutoRound
Stay tuned! An even more advanced algorithm for certain configurations is coming soon.