# Grok-2 Tokenizer

A 🤗-compatible version of the Grok-2 tokenizer (adapted from xai-org/grok-2).

This means it can be used with Hugging Face libraries including Transformers, Tokenizers, and Transformers.js.

## Motivation

Grok 2.5, a.k.a. xai-org/grok-2, was recently released on the 🤗 Hub with native SGLang support. However, the checkpoints on the Hub don't ship with a Hugging Face compatible tokenizer, but rather with a tiktoken-based JSON export, which SGLang reads and patches internally.

This repository contains the Hugging Face compatible export, so that users can easily interact and play around with the Grok-2 tokenizer. It also allows using the tokenizer with SGLang without having to manually pull the model repository from the Hub and point to the local tokenizer path, meaning Grok-2 can be deployed directly as:

```bash
python3 -m sglang.launch_server --model-path xai-org/grok-2 --tokenizer-path alvarobartt/grok-2-tokenizer --tp-size 8 --quantization fp8 --attention-backend triton
```

Rather than the former two-step process:

```bash
hf download xai-org/grok-2 --local-dir /local/grok-2

python3 -m sglang.launch_server --model-path /local/grok-2 --tokenizer-path /local/grok-2/tokenizer.tok.json --tp-size 8 --quantization fp8 --attention-backend triton
```

## Example

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("alvarobartt/grok-2-tokenizer")

assert tokenizer.encode("Human: What is Deep Learning?<|separator|>\n\n") == [
    35406,
    186,
    2171,
    458,
    17454,
    14803,
    191,
    1,
    417,
]

assert (
    tokenizer.apply_chat_template(
        [{"role": "user", "content": "What is the capital of France?"}], tokenize=False
    )
    == "Human: What is the capital of France?<|separator|>\n\n"
)
```

This repository was inspired by earlier similar work by Xenova on Xenova/grok-1-tokenizer.
