Cole Medin committed
Commit 9b38dbd · unverified · Parents: 5b35213, 8ac7931

Merge branch 'main' into main

Files changed (4)
  1. .env.example +1 -1
  2. CONTRIBUTING.md +62 -74
  3. README.md +18 -8
  4. app/components/chat/BaseChat.tsx +1 -1
.env.example CHANGED
@@ -26,7 +26,7 @@ OPEN_ROUTER_API_KEY=
  GOOGLE_GENERATIVE_AI_API_KEY=

  # You only need this environment variable set if you want to use oLLAMA models
- #EXAMPLE http://localhost:11434
+ # EXAMPLE http://localhost:11434
  OLLAMA_API_BASE_URL=

  # Include this environment variable if you want more logging for debugging locally
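For illustration, a `.env.local` that follows the EXAMPLE comment added above might look like this sketch (values are placeholders; set only the providers you actually use):

```shell
# Illustrative .env.local sketch — values here are placeholders, not defaults.
# Base URL for a locally running Ollama server, per the EXAMPLE comment above:
OLLAMA_API_BASE_URL=http://localhost:11434
# Optional: more verbose logging while debugging locally:
VITE_LOG_LEVEL=debug
```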
CONTRIBUTING.md CHANGED
@@ -1,110 +1,98 @@
- [![Bolt Open Source Codebase](./public/social_preview_index.jpg)](https://bolt.new)
-
- > Welcome to the **Bolt** open-source codebase! This repo contains a simple example app using the core components from bolt.new to help you get started building **AI-powered software development tools** powered by StackBlitz’s **WebContainer API**.
-
- ### Why Build with Bolt + WebContainer API
-
- By building with the Bolt + WebContainer API you can create browser-based applications that let users **prompt, run, edit, and deploy** full-stack web apps directly in the browser, without the need for virtual machines. With WebContainer API, you can build apps that give AI direct access and full control over a **Node.js server**, **filesystem**, **package manager** and **dev terminal** inside your users browser tab. This powerful combination allows you to create a new class of development tools that support all major JavaScript libraries and Node packages right out of the box, all without remote environments or local installs.
-
- ### What’s the Difference Between Bolt (This Repo) and [Bolt.new](https://bolt.new)?
-
- - **Bolt.new**: This is the **commercial product** from StackBlitz—a hosted, browser-based AI development tool that enables users to prompt, run, edit, and deploy full-stack web applications directly in the browser. Built on top of the [Bolt open-source repo](https://github.com/stackblitz/bolt.new) and powered by the StackBlitz **WebContainer API**.
-
- - **Bolt (This Repo)**: This open-source repository provides the core components used to make **Bolt.new**. This repo contains the UI interface for Bolt as well as the server components, built using [Remix Run](https://remix.run/). By leveraging this repo and StackBlitz’s **WebContainer API**, you can create your own AI-powered development tools and full-stack applications that run entirely in the browser.
-
- # Get Started Building with Bolt
-
- Bolt combines the capabilities of AI with sandboxed development environments to create a collaborative experience where code can be developed by the assistant and the programmer together. Bolt combines [WebContainer API](https://webcontainers.io/api) with [Claude Sonnet 3.5](https://www.anthropic.com/news/claude-3-5-sonnet) using [Remix](https://remix.run/) and the [AI SDK](https://sdk.vercel.ai/).
-
- ### WebContainer API
-
- Bolt uses [WebContainers](https://webcontainers.io/) to run generated code in the browser. WebContainers provide Bolt with a full-stack sandbox environment using [WebContainer API](https://webcontainers.io/api). WebContainers run full-stack applications directly in the browser without the cost and security concerns of cloud hosted AI agents. WebContainers are interactive and editable, and enables Bolt's AI to run code and understand any changes from the user.
-
- The [WebContainer API](https://webcontainers.io) is free for personal and open source usage. If you're building an application for commercial usage, you can learn more about our [WebContainer API commercial usage pricing here](https://stackblitz.com/pricing#webcontainer-api).
-
- ### Remix App
-
- Bolt is built with [Remix](https://remix.run/) and
- deployed using [CloudFlare Pages](https://pages.cloudflare.com/) and
- [CloudFlare Workers](https://workers.cloudflare.com/).
-
- ### AI SDK Integration
-
- Bolt uses the [AI SDK](https://github.com/vercel/ai) to integrate with AI
- models. At this time, Bolt supports using Anthropic's Claude Sonnet 3.5.
- You can get an API key from the [Anthropic API Console](https://console.anthropic.com/) to use with Bolt.
- Take a look at how [Bolt uses the AI SDK](https://github.com/stackblitz/bolt.new/tree/main/app/lib/.server/llm)
-
- ## Prerequisites
-
- Before you begin, ensure you have the following installed:
-
- - Node.js (v20.15.1)
- - pnpm (v9.4.0)
-
- ## Setup
-
- 1. Clone the repository (if you haven't already):

  ```bash
- git clone https://github.com/stackblitz/bolt.new.git
  ```

  2. Install dependencies:
-
  ```bash
  pnpm install
  ```

- 3. Create a `.env.local` file in the root directory and add your Anthropic API key:
-
- ```
  ANTHROPIC_API_KEY=XXX
  ```
-
- Optionally, you can set the debug level:
-
- ```
  VITE_LOG_LEVEL=debug
  ```
-
  **Important**: Never commit your `.env.local` file to version control. It's already included in .gitignore.

- ## Available Scripts
-
- - `pnpm run dev`: Starts the development server.
- - `pnpm run build`: Builds the project.
- - `pnpm run start`: Runs the built application locally using Wrangler Pages. This script uses `bindings.sh` to set up necessary bindings so you don't have to duplicate environment variables.
- - `pnpm run preview`: Builds the project and then starts it locally, useful for testing the production build. Note, HTTP streaming currently doesn't work as expected with `wrangler pages dev`.
- - `pnpm test`: Runs the test suite using Vitest.
- - `pnpm run typecheck`: Runs TypeScript type checking.
- - `pnpm run typegen`: Generates TypeScript types using Wrangler.
- - `pnpm run deploy`: Builds the project and deploys it to Cloudflare Pages.
-
- ## Development
-
- To start the development server:
-
  ```bash
  pnpm run dev
  ```

- This will start the Remix Vite development server.
-
- ## Testing

- Run the test suite with:

- ```bash
- pnpm test
- ```
-
- ## Deployment
-
- To deploy the application to Cloudflare Pages:
-
- ```bash
- pnpm run deploy
- ```

- Make sure you have the necessary permissions and Wrangler is correctly configured for your Cloudflare account.

+ # Contributing to Bolt.new Fork
+
+ First off, thank you for considering contributing to Bolt.new! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.new a better tool for developers worldwide.
+
+ ## 📋 Table of Contents
+ - [Code of Conduct](#code-of-conduct)
+ - [How Can I Contribute?](#how-can-i-contribute)
+ - [Pull Request Guidelines](#pull-request-guidelines)
+ - [Coding Standards](#coding-standards)
+ - [Development Setup](#development-setup)
+ - [Project Structure](#project-structure)
+
+ ## Code of Conduct
+
+ This project and everyone participating in it is governed by our Code of Conduct. By participating, you are expected to uphold this code. Please report unacceptable behavior to the project maintainers.
+
+ ## How Can I Contribute?
+
+ ### 🐞 Reporting Bugs and Feature Requests
+ - Check the issue tracker to avoid duplicates
+ - Use the issue templates when available
+ - Include as much relevant information as possible
+ - For bugs, add steps to reproduce the issue
+
+ ### 🔧 Code Contributions
+ 1. Fork the repository
+ 2. Create a new branch for your feature/fix
+ 3. Write your code
+ 4. Submit a pull request
+
+ ### Becoming a Core Contributor
+ We're looking for dedicated contributors to help maintain and grow this project. If you're interested in becoming a core contributor, please fill out our [Contributor Application Form](https://forms.gle/TBSteXSDCtBDwr5m7).
+
+ ## Pull Request Guidelines
+
+ ### 📝 PR Checklist
+ - [ ] Branch from the main branch
+ - [ ] Update documentation if needed
+ - [ ] Manually verify all new functionality works as expected
+ - [ ] Keep PRs focused and atomic
+
+ ### 👀 Review Process
+ 1. Manually test the changes
+ 2. At least one maintainer review required
+ 3. Address all review comments
+ 4. Maintain clean commit history
+
+ ## Coding Standards
+
+ ### 💻 General Guidelines
+ - Follow existing code style
+ - Comment complex logic
+ - Keep functions focused and small
+ - Use meaningful variable names
+
+ ## Development Setup
+
+ ### 🔄 Initial Setup
+ 1. Clone the repository:
  ```bash
+ git clone https://github.com/coleam00/bolt.new-any-llm.git
  ```

  2. Install dependencies:
  ```bash
  pnpm install
  ```

+ 3. Set up environment variables:
+ - Rename `.env.example` to `.env.local`
+ - Add your LLM API keys (only set the ones you plan to use):
+ ```bash
+ GROQ_API_KEY=XXX
+ OPENAI_API_KEY=XXX
  ANTHROPIC_API_KEY=XXX
+ ...
+ ```
+ - Optionally set debug level:
+ ```bash
  VITE_LOG_LEVEL=debug
  ```
  **Important**: Never commit your `.env.local` file to version control. It's already included in .gitignore.

+ ### 🚀 Running the Development Server
  ```bash
  pnpm run dev
  ```

+ **Note**: You will need Google Chrome Canary to run this locally if you use Chrome! It's an easy install and a good browser for web development anyway.

+ ## Questions?
+
+ For any questions about contributing, please:
+ 1. Check existing documentation
+ 2. Search through issues
+ 3. Create a new issue with the question label
+
+ Thank you for contributing to Bolt.new! 🚀
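The fork-and-branch flow described under "Code Contributions" in the new CONTRIBUTING.md maps to commands like the following sketch (`<your-username>` and the branch name are placeholders, not values from the commit):

```shell
# Illustrative sketch of the contribution flow — fork the repo on GitHub first.
git clone https://github.com/<your-username>/bolt.new-any-llm.git
cd bolt.new-any-llm
git checkout -b feat/my-change        # branch from main for your feature/fix
# ...write your code, then commit and push...
git add -A
git commit -m "Describe your change"
git push -u origin feat/my-change
# Finally, open a pull request against the main branch on GitHub.
```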
README.md CHANGED
@@ -2,24 +2,34 @@

  # Bolt.new Fork by Cole Medin

- This fork of bolt.new allows you to choose the LLM that you use for each prompt! Currently you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See instructions below for running this locally and extending to include more models.

  # Requested Additions to this Fork - Feel Free to Contribute!!

  - ✅ OpenRouter Integration (@coleam00)
  - ✅ Gemini Integration (@jonathands)
- - ✅ Autogenerate Ollama models from what is downloaded (@mosquet)
  - ✅ Filter models by provider (@jasonm23)
  - ✅ Download project as ZIP (@fabwaseem)
  - ✅ Improvements to the main Bolt.new prompt in `app\lib\.server\llm\prompts.ts` (@kofi-bhr)
  - ⬜ LM Studio Integration
  - ⬜ DeepSeek API Integration
  - ⬜ Together Integration
  - ⬜ Better prompting for smaller LLMs (code window sometimes doesn't start)
  - ⬜ Attach images to prompts
- - ⬜ Run agents in the backend instead of a single model call
  - ⬜ Publish projects directly to GitHub
  - ⬜ Load local projects into the app

  # Bolt.new: AI-Powered Full-Stack Web Development in the Browser

@@ -27,7 +37,7 @@ Bolt.new is an AI-powered web development agent that allows you to prompt, run,

  ## What Makes Bolt.new Different

- Claude, v0, etc are incredible- but you can't install packages, run backends or edit code. That’s where Bolt.new stands out:

  - **Full-Stack in the Browser**: Bolt.new integrates cutting-edge AI models with an in-browser development environment powered by **StackBlitz’s WebContainers**. This allows you to:
  - Install and run npm tools and libraries (like Vite, Next.js, and more)

@@ -36,9 +46,9 @@ Claude, v0, etc are incredible- but you can't install packages, run backends or
  - Deploy to production from chat
  - Share your work via a URL

- - **AI with Environment Control**: Unlike traditional dev environments where the AI can only assist in code generation, Bolt.new gives AI models **complete control** over the entire environment including the filesystem, node server, package manager, terminal, and browser console. This empowers AI agents to handle the entire app lifecycle—from creation to deployment.

- Whether you’re an experienced developer, a PM or designer, Bolt.new allows you to build production-grade full-stack applications with ease.

  For developers interested in building their own AI-powered development tools with WebContainers, check out the open-source Bolt codebase in this repo!

@@ -81,7 +91,7 @@ VITE_LOG_LEVEL=debug

  ## Adding New LLMs:

- To make new LLMs available to use in this version of Bolt.new, head on over to `app/utils/constants.ts` and find the constant MODEL_LIST. Each element in this array is an object that has the model ID for the name (get this from the provider's API documentation), a lable for the frontend model dropdown, and the provider.

  By default, Anthropic, OpenAI, Groq, and Ollama are implemented as providers, but the YouTube video for this repo covers how to extend this to work with more providers if you wish!

@@ -106,7 +116,7 @@ To start the development server:
  pnpm run dev
  ```

- This will start the Remix Vite development server. You will need Google Chrome Canary to run this locally! It's a very easy install and a good browser for web development anyway.

  ## Tips and Tricks

  # Bolt.new Fork by Cole Medin

+ This fork of Bolt.new allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.

  # Requested Additions to this Fork - Feel Free to Contribute!!

  - ✅ OpenRouter Integration (@coleam00)
  - ✅ Gemini Integration (@jonathands)
+ - ✅ Autogenerate Ollama models from what is downloaded (@yunatamos)
  - ✅ Filter models by provider (@jasonm23)
  - ✅ Download project as ZIP (@fabwaseem)
  - ✅ Improvements to the main Bolt.new prompt in `app\lib\.server\llm\prompts.ts` (@kofi-bhr)
  - ⬜ LM Studio Integration
  - ⬜ DeepSeek API Integration
  - ⬜ Together Integration
+ - ⬜ Azure Open AI API Integration
+ - ⬜ HuggingFace Integration
+ - ⬜ Perplexity Integration
+ - ⬜ Containerize the application with Docker for easy installation
  - ⬜ Better prompting for smaller LLMs (code window sometimes doesn't start)
  - ⬜ Attach images to prompts
+ - ⬜ Run agents in the backend as opposed to a single model call
  - ⬜ Publish projects directly to GitHub
+ - ⬜ Deploy directly to Vercel/Netlify/other similar platforms
  - ⬜ Load local projects into the app
+ - ⬜ Ability to revert code to earlier version
+ - ⬜ Prompt caching
+ - ⬜ Ability to enter API keys in the UI
+ - ⬜ Prevent Bolt from rewriting files as often
+ - ⬜ Have LLM plan the project in a MD file for better results/transparency

  # Bolt.new: AI-Powered Full-Stack Web Development in the Browser

  ## What Makes Bolt.new Different

+ Claude, v0, etc are incredible- but you can't install packages, run backends, or edit code. That’s where Bolt.new stands out:

  - **Full-Stack in the Browser**: Bolt.new integrates cutting-edge AI models with an in-browser development environment powered by **StackBlitz’s WebContainers**. This allows you to:
  - Install and run npm tools and libraries (like Vite, Next.js, and more)

  - Deploy to production from chat
  - Share your work via a URL

+ - **AI with Environment Control**: Unlike traditional dev environments where the AI can only assist in code generation, Bolt.new gives AI models **complete control** over the entire environment including the filesystem, node server, package manager, terminal, and browser console. This empowers AI agents to handle the whole app lifecycle—from creation to deployment.

+ Whether you’re an experienced developer, a PM, or a designer, Bolt.new allows you to easily build production-grade full-stack applications.

  For developers interested in building their own AI-powered development tools with WebContainers, check out the open-source Bolt codebase in this repo!

  ## Adding New LLMs:

+ To make new LLMs available to use in this version of Bolt.new, head on over to `app/utils/constants.ts` and find the constant MODEL_LIST. Each element in this array is an object that has the model ID for the name (get this from the provider's API documentation), a label for the frontend model dropdown, and the provider.

  By default, Anthropic, OpenAI, Groq, and Ollama are implemented as providers, but the YouTube video for this repo covers how to extend this to work with more providers if you wish!

  pnpm run dev
  ```

+ This will start the Remix Vite development server. You will need Google Chrome Canary to run this locally if you use Chrome! It's an easy install and a good browser for web development anyway.

  ## Tips and Tricks

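The MODEL_LIST constant described in the README's "Adding New LLMs" section might look like this sketch. The field names and entries here are assumptions for illustration; check `app/utils/constants.ts` in the actual repo for the real shape:

```typescript
// Hypothetical sketch of MODEL_LIST as described in "Adding New LLMs";
// field names and model IDs are illustrative, not taken from the repo.
interface ModelInfo {
  name: string;     // model ID, taken from the provider's API documentation
  label: string;    // text shown in the frontend model dropdown
  provider: string; // e.g. 'Anthropic', 'OpenAI', 'Groq', 'Ollama'
}

const MODEL_LIST: ModelInfo[] = [
  { name: 'claude-3-5-sonnet-20240620', label: 'Claude 3.5 Sonnet', provider: 'Anthropic' },
  { name: 'llama3', label: 'Llama 3 (Ollama)', provider: 'Ollama' },
];

// The "Filter models by provider" feature reduces to a simple filter:
const anthropicModels = MODEL_LIST.filter((m) => m.provider === 'Anthropic');
```

Adding a new model is then a matter of appending one object to the array with the ID from the provider's docs.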
app/components/chat/BaseChat.tsx CHANGED
@@ -116,7 +116,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
  data-chat-visible={showChat}
  >
  <ClientOnly>{() => <Menu />}</ClientOnly>
- <div ref={scrollRef} className="flex overflow-scroll w-full h-full">
+ <div ref={scrollRef} className="flex overflow-y-auto w-full h-full">
  <div className={classNames(styles.Chat, 'flex flex-col flex-grow min-w-[var(--chat-min-width)] h-full')}>
  {!chatStarted && (
  <div id="intro" className="mt-[26vh] max-w-chat mx-auto">