# LiquidAI/LFM2-Tokenizer
## Formatted text
```
<|startoftext|><|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
Hello! How are you?<|im_end|>
<|im_start|>assistant
I'm doing well, thank you!<|im_end|>
<|im_start|>user
What's the weather like?<|im_end|>
<|im_start|>assistant
```
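The formatted text above is what the tokenizer's chat template produces for a multi-turn conversation. Below is a minimal sketch of reproducing it with `transformers`, assuming the tokenizer is published on the Hugging Face Hub under the repository name in the title and that its bundled chat template matches the layout shown:

```python
from transformers import AutoTokenizer

# Assumption: the tokenizer is available on the Hub under this repository name.
tok = AutoTokenizer.from_pretrained("LiquidAI/LFM2-Tokenizer")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello! How are you?"},
    {"role": "assistant", "content": "I'm doing well, thank you!"},
    {"role": "user", "content": "What's the weather like?"},
]

# add_generation_prompt=True appends the trailing "<|im_start|>assistant"
# header so the model can continue with its reply.
text = tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(text)  # should match the formatted text shown above
```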
## Special tokens
- bos_token: <|startoftext|>
- eos_token: <|im_end|>
- pad_token: <|pad|>
- sep_token: None
- cls_token: None
- mask_token: None
## Added special tokens
- "<|pad|>": 0,
- "<|startoftext|>": 1,
- "<|endoftext|>": 2,
- "<|fim_pre|>": 3,
- "<|fim_mid|>": 4,
- "<|fim_suf|>": 5,
- "<|im_start|>": 6,
- "<|im_end|>": 7,
- "<|tool_list_start|>": 8,
- "<|tool_list_end|>": 9,
- "<|tool_call_start|>": 10,
- "<|tool_call_end|>": 11,
- "<|tool_response_start|>": 12,
- "<|tool_response_end|>": 13,
- "<|cot_start|>": 64394,
- "<|cot_end|>": 64395,
- "<|review_start|>": 64396,
- "<|review_end|>": 64397,
- "<|file_start|>": 64398,
- "<|file_end|>": 64399