Has anyone ever gotten this to work?
Do I just have bad luck? I've tried a bunch of repos (most recently THUDM/SWE-Dev-9B) and have always had it error out at some point.
Well, I reported exactly when the error happens here, and also noted that it worked in the past:
https://huggingface.co/spaces/ggml-org/gguf-my-repo/discussions/158
But people keep opening new discussions or leaving new comments instead of upvoting it, so this place has become a mess. You, for example, don't even say what your error is, so I have to guess it's the same one I already reported.
I guess the project is abandoned if it hasn't been fixed by now.
For those who need features like local Windows support, lower-bit IQ quants, and a download-before-upload workflow, I've created an enhanced fork of this script.
You can find it here: https://huggingface.co/spaces/Fentible/gguf-repo-suite
Clone the repo to your own HF Space or locally using the Quick Start guides.
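If you'd rather do either step programmatically than through the web UI, a rough sketch with the huggingface_hub client would look like this (the local directory name is just an example, and duplicating a Space needs a write token):

```python
# Sketch: duplicate the fork to your own account, or pull a local copy.
# Requires: pip install huggingface_hub
from huggingface_hub import duplicate_space, snapshot_download

# Option 1: duplicate the Space under your own namespace (runs on HF hardware).
duplicate_space("Fentible/gguf-repo-suite")

# Option 2: download the Space repo locally to run the script on your own machine.
snapshot_download(
    repo_id="Fentible/gguf-repo-suite",
    repo_type="space",
    local_dir="gguf-repo-suite",  # example path, pick whatever you like
)
```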
I could not get it to work on a free HF Space, but it might be possible with a paid one. I tested on Windows 10 and made some quants of mlabonne's abliterated Gemma 3.
The known bug: ggml-rpc.dll is very finicky, and you may need to compile your own build of llama-imatrix to fix it.
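If you do end up rebuilding it, the usual route is building the llama-imatrix target from a llama.cpp checkout. Here's a rough sketch of that, assuming git, CMake, and a compiler toolchain are already on your PATH; the plain CPU build flags are just an assumption, so adjust them for your GPU backend:

```python
# Sketch: build llama-imatrix from a llama.cpp checkout (works on Windows with MSVC or MinGW).
import subprocess

subprocess.run(["git", "clone", "https://github.com/ggml-org/llama.cpp"], check=True)
subprocess.run(["cmake", "-B", "build"], cwd="llama.cpp", check=True)
subprocess.run(
    ["cmake", "--build", "build", "--config", "Release", "--target", "llama-imatrix"],
    cwd="llama.cpp",
    check=True,
)
# The resulting binary lands under llama.cpp/build/bin/ (in a Release/ subfolder on MSVC builds).
```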