---
title: Smol VLM 256m Instruct Docker
emoji: π
colorFrom: purple
colorTo: yellow
sdk: docker
pinned: false
short_description: API endpoint for SmolVLM-256M
---
# SmolVLM-256M: Vision + Language Inference API
This Space demonstrates how to deploy and serve the **SmolVLM-256M-Instruct** multimodal language model using a Docker-based backend. The API provides an OpenAI-style `chat/completions` endpoint for image + text understanding, similar to how ChatGPT Vision works.
An example frontend app can be found here: https://text-rec-api.glitch.me/
## Docker Setup
This Space uses a custom Dockerfile that downloads and launches the SmolVLM model with vision support using [llama.cpp](https://github.com/ggerganov/llama.cpp).
### Dockerfile
```Dockerfile
FROM ghcr.io/ggml-org/llama.cpp:full
# Install wget
RUN apt-get update && apt-get install -y wget
# Download the GGUF model file
RUN wget "https://huggingface.co/ggml-org/SmolVLM-256M-Instruct-GGUF/resolve/main/SmolVLM-256M-Instruct-Q8_0.gguf" -O /smoll.gguf
# Download the mmproj (multimodal projection) file
RUN wget "https://huggingface.co/ggml-org/SmolVLM-256M-Instruct-GGUF/resolve/main/mmproj-SmolVLM-256M-Instruct-Q8_0.gguf" -O /mmproj.gguf
# Run the server on port 7860 with moderate generation settings
CMD [ "--server", "-m", "/smoll.gguf", "--mmproj", "/mmproj.gguf", "--port", "7860", "--host", "0.0.0.0", "-n", "512", "-t", "2" ]
```
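Once the container is running, you can verify that the server is up before sending requests. The sketch below is a minimal readiness check in Python; it assumes the Space is reachable at `http://localhost:7860` (swap in your Space's public URL when deployed) and uses the `/health` endpoint exposed by llama.cpp's server.

```python
import urllib.request

# Readiness check against the llama.cpp server (assumed reachable at
# http://localhost:7860; replace with your Space's URL when deployed).
with urllib.request.urlopen("http://localhost:7860/health") as resp:
    print(resp.status, resp.read().decode("utf-8"))  # expect HTTP 200 once the model is loaded
```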
## API Usage
The server exposes a `POST /v1/chat/completions` endpoint compatible with the OpenAI API format.
### Request Format
Send a JSON payload structured like this:
```json
{
  "model": "SmolVLM-256M-Instruct",
  "messages": [
    {
      "role": "user",
      "content": [
        { "type": "text", "text": "What is in this image?" },
        {
          "type": "image_url",
          "image_url": {
            "url": "data:image/jpeg;base64,/9j/4AAQSkZJRgABAQABAAD..."
          }
        }
      ]
    }
  ]
}
```
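For reference, here is a minimal Python client sketch that builds the payload above and reads back the reply. It assumes the server is reachable at `http://localhost:7860` and that a local `example.jpg` exists; adjust the URL and image path for your setup.

```python
import base64
import json
import urllib.request

API_URL = "http://localhost:7860/v1/chat/completions"  # assumed local endpoint; use your Space's URL when deployed

# Encode a local image as a base64 data URL (example.jpg is a placeholder path).
with open("example.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {
    "model": "SmolVLM-256M-Instruct",
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                },
            ],
        }
    ],
}

req = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# OpenAI-style responses put the generated text in the first choice.
print(result["choices"][0]["message"]["content"])
```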