LlamaFinetuneGGUF committed
Commit de64007 (unverified) · Parents: 906cc38, 0e7937b

docs: toc for readme

docs: added a TOC for the README

Files changed (1): README.md (+18 -7)
README.md CHANGED
@@ -1,6 +1,5 @@
-[![bolt.diy: AI-Powered Full-Stack Web Development in the Browser](./public/social_preview_index.jpg)](https://bolt.diy)
-
 # bolt.diy (Previously oTToDev)
+[![bolt.diy: AI-Powered Full-Stack Web Development in the Browser](./public/social_preview_index.jpg)](https://bolt.diy)
 
 Welcome to bolt.diy, the official open source version of Bolt.new (previously known as oTToDev and bolt.new ANY LLM), which allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
 
@@ -10,12 +9,24 @@ We have also launched an experimental agent called the "bolt.diy Expert" that ca
 
 bolt.diy was originally started by [Cole Medin](https://www.youtube.com/@ColeMedin) but has quickly grown into a massive community effort to build the BEST open source AI coding assistant!
 
-## Join the community for bolt.diy!
+## Table of Contents
+
+- [Join the Community](#join-the-community)
+- [Requested Additions](#requested-additions)
+- [Features](#features)
+- [Setup](#setup)
+- [Run the Application](#run-the-application)
+- [Available Scripts](#available-scripts)
+- [Contributing](#contributing)
+- [Roadmap](#roadmap)
+- [FAQ](#faq)
+
+## Join the community
 
-https://thinktank.ottomator.ai
+[Join the bolt.diy community here, in the thinktank on ottomator.ai!](https://thinktank.ottomator.ai)
 
 
-## Requested Additions - Feel Free to Contribute!
+## Requested Additions
 
 - ✅ OpenRouter Integration (@coleam00)
 - ✅ Gemini Integration (@jonathands)
@@ -62,7 +73,7 @@ https://thinktank.ottomator.ai
 - ⬜ Perplexity Integration
 - ⬜ Vertex AI Integration
 
-## bolt.diy Features
+## Features
 
 - **AI-powered full-stack web development** directly in your browser.
 - **Support for multiple LLMs** with an extensible architecture to integrate additional models.
@@ -72,7 +83,7 @@ https://thinktank.ottomator.ai
 - **Download projects as ZIP** for easy portability.
 - **Integration-ready Docker support** for a hassle-free setup.
 
-## Setup bolt.diy
+## Setup
 
 If you're new to installing software from GitHub, don't worry! If you encounter any issues, feel free to submit an "issue" using the provided links or improve this documentation by forking the repository, editing the instructions, and submitting a pull request. The following instruction will help you get the stable branch up and running on your local machine in no time.
 
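Note on the TOC entries added above: GitHub-flavored Markdown links such as `[Join the Community](#join-the-community)` resolve by slugifying the target heading text. The sketch below is a hypothetical TypeScript helper, not part of this repository, and its slug rules are only an approximation of GitHub's behavior; it just illustrates how a heading like `## Join the community` lines up with the `#join-the-community` anchor used in the new Table of Contents.

```ts
// Hypothetical helper: approximates GitHub-style heading slugs so TOC links
// can be sanity-checked against their headings. Slug rules are an assumption.
function headingToAnchor(heading: string): string {
  return heading
    .replace(/^#+\s*/, "")      // drop the leading "##" markers
    .trim()
    .toLowerCase()
    .replace(/[^\w\s-]/g, "")   // strip punctuation (approximation)
    .replace(/\s+/g, "-");      // spaces become hyphens
}

// Both headings added in this commit map onto their TOC anchors:
console.log(headingToAnchor("## Join the community"));    // "join-the-community"
console.log(headingToAnchor("## Requested Additions"));   // "requested-additions"
```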