docs: rely on `tokenizer` as chat template source
README.md CHANGED
@@ -55,7 +55,6 @@ MODELS=`[
   "name": "Local microsoft/Phi-3-mini-4k-instruct-gguf",
   "tokenizer": "microsoft/Phi-3-mini-4k-instruct-gguf",
   "preprompt": "",
-  "chatPromptTemplate": "<s>{{preprompt}}{{#each messages}}{{#ifUser}}<|user|>\n{{content}}<|end|>\n<|assistant|>\n{{/ifUser}}{{#ifAssistant}}{{content}}<|end|>\n{{/ifAssistant}}{{/each}}",
   "parameters": {
     "stop": ["<|end|>", "<|endoftext|>", "<|assistant|>"],
     "temperature": 0.7,
@@ -70,6 +69,8 @@ MODELS=`[
 ]`
 ```
 
+The `tokenizer` field will be used to find the appropriate chat template for the model. Make sure to fill in a valid model from the Hugging Face hub.
+
 Read more [here](https://huggingface.co/docs/chat-ui/configuration/models/providers/llamacpp).
 
 **Step 3 (make sure you have MongoDb running locally):**
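After this change, the llama.cpp model entry in `.env.local` relies solely on `tokenizer` to resolve the chat template. A minimal sketch of the resulting entry, reconstructed only from the lines visible in this hunk (the surrounding object braces are assumed, and any additional parameters or endpoint configuration that follow `temperature` in the full README are omitted here):

```
MODELS=`[
  {
    "name": "Local microsoft/Phi-3-mini-4k-instruct-gguf",
    "tokenizer": "microsoft/Phi-3-mini-4k-instruct-gguf",
    "preprompt": "",
    "parameters": {
      "stop": ["<|end|>", "<|endoftext|>", "<|assistant|>"],
      "temperature": 0.7
    }
  }
]`
```

With no `chatPromptTemplate` defined, the chat template is looked up from the Hub repository named in `tokenizer`, so that field must point to a valid model on the Hugging Face hub.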