Cole Medin committed
Commit · e7ce257
Parent(s): 64e95a0

Instructions on making Ollama models work well
- .gitignore +2 -0
- README.md +25 -0
.gitignore CHANGED

@@ -29,3 +29,5 @@ dist-ssr
 *.vars
 .wrangler
 _worker.bundle
+
+Modelfile
README.md CHANGED

@@ -183,6 +183,31 @@ sudo npm install -g pnpm
 pnpm run dev
 ```
 
+## Super Important Note on Running Ollama Models
+
+Ollama models by default only have 2048 tokens for their context window, even for large models that can easily handle far more.
+This is not a large enough window to handle the Bolt.new/oTToDev prompt! You have to create a version of any model you want
+to use where you specify a larger context window. Luckily, it's super easy to do that.
+
+All you have to do is:
+
+- Create a file called "Modelfile" (no file extension) anywhere on your computer
+- Put in the two lines:
+
+```
+FROM [Ollama model ID such as qwen2.5-coder:7b]
+PARAMETER num_ctx 32768
+```
+
+- Run the command:
+
+```
+ollama create -f Modelfile [your new model ID, can be whatever you want (example: qwen2.5-coder-extra-ctx:7b)]
+```
+
+Now you have a new Ollama model that isn't heavily limited in context length the way Ollama models are by default.
+You'll see this new model in the list of Ollama models along with all the others you pulled!
+
 ## Adding New LLMs:
 
 To make new LLMs available to use in this version of Bolt.new, head on over to `app/utils/constants.ts` and find the constant MODEL_LIST. Each element in this array is an object that has the model ID for the name (get this from the provider's API documentation), a label for the frontend model dropdown, and the provider.
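Based on that description, a MODEL_LIST entry pairs a model name, a dropdown label, and a provider. The commit itself doesn't show the exact shape of the array, so the field names below are assumptions; an entry for the larger-context Ollama model created above might look roughly like this in `app/utils/constants.ts`:

```
// Hypothetical sketch only: the field names (name/label/provider) are assumed
// from the description above, not copied from the actual constants.ts.
interface ModelInfo {
  name: string;     // model ID as the provider's API expects it
  label: string;    // text shown in the frontend model dropdown
  provider: string; // which backend serves the model, e.g. 'Ollama'
}

const MODEL_LIST: ModelInfo[] = [
  {
    name: 'qwen2.5-coder-extra-ctx:7b',       // the larger-context model built with `ollama create`
    label: 'Qwen 2.5 Coder 7B (32k context)', // what users see in the dropdown
    provider: 'Ollama',
  },
];
```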