Testing - Don't download yet.

Inference:

pip install git+https://github.com/redmoe-moutain/vllm.git

vllm serve justinjja/dots.llm1.inst-int4-w4a16 --max-model-len 8000
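
Once the server is running, you can sanity-check it through vLLM's OpenAI-compatible API. The request below is a minimal sketch, assuming the server is reachable at localhost on vLLM's default port 8000 (note that --max-model-len 8000 sets the context length, not the port):

curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "justinjja/dots.llm1.inst-int4-w4a16", "messages": [{"role": "user", "content": "Hello"}], "max_tokens": 64}'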

Model size: 22B params (Safetensors; tensor types: I64, I32, BF16)