Kofi committed on
Update README.md

Fixed Readme to be up to date with prompting fixes

README.md CHANGED
@@ -11,6 +11,7 @@ This fork of bolt.new allows you to choose the LLM that you use for each prompt!
 - ✅ Autogenerate Ollama models from what is downloaded (@mosquet)
 - ✅ Filter models by provider (@jasonm23)
 - ✅ Download project as ZIP (@fabwaseem)
+- ✅ Improvements to the main Bolt.new prompt in `app\lib\.server\llm\prompts.ts` (@kofi-bhr)
 - ⬜ LM Studio Integration
 - ⬜ DeepSeek API Integration
 - ⬜ Together Integration
@@ -19,7 +20,6 @@ This fork of bolt.new allows you to choose the LLM that you use for each prompt!
 - ⬜ Run agents in the backend instead of a single model call
 - ⬜ Publish projects directly to GitHub
 - ⬜ Load local projects into the app
-- ⬜ Improvements to the main Bolt.new prompt in `app\lib\.server\llm\prompts.ts` (there is definitely opportunity there)

 # Bolt.new: AI-Powered Full-Stack Web Development in the Browser
