Eduards committed on
Commit 17e5003 · 2 Parent(s): 5d4b860 64eee11
README.md CHANGED
@@ -1,12 +1,14 @@
1
  [![Bolt.new: AI-Powered Full-Stack Web Development in the Browser](./public/social_preview_index.jpg)](https://bolt.new)
2
 
3
- # Bolt.new Fork by Cole Medin - oTToDev
4
 
5
- This fork of Bolt.new (oTToDev) allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
6
 
7
- Check the [oTToDev Docs](https://coleam00.github.io/bolt.new-any-llm/) for more information.
8
 
9
- ## Join the community for oTToDev!
10
 
11
  https://thinktank.ottomator.ai
12
 
@@ -56,7 +58,7 @@ https://thinktank.ottomator.ai
56
 
57
  ## Bolt.new: AI-Powered Full-Stack Web Development in the Browser
58
 
59
- Bolt.new is an AI-powered web development agent that allows you to prompt, run, edit, and deploy full-stack applications directly from your browser—no local setup required. If you're here to build your own AI-powered web dev agent using the Bolt open source codebase, [click here to get started!](./CONTRIBUTING.md)
60
 
61
  ## What Makes Bolt.new Different
62
 
@@ -96,7 +98,7 @@ If you see usr/local/bin in the output then you're good to go.
96
  3. Clone the repository (if you haven't already) by opening a Terminal window (or CMD with admin permissions) and then typing in this:
97
 
98
  ```
99
- git clone https://github.com/coleam00/bolt.new-any-llm.git
100
  ```
101
 
102
  3. Rename .env.example to .env.local and add your LLM API keys. You will find this file on a Mac at "[your name]/bolt.new-any-llm/.env.example". For Windows and Linux the path will be similar.
@@ -225,11 +227,11 @@ pnpm run dev
225
 
226
  This will start the Remix Vite development server. You will need Google Chrome Canary to run this locally if you use Chrome! It's an easy install and a good browser for web development anyway.
227
 
228
- ## How do I contribute to oTToDev?
229
 
230
- [Please check out our dedicated page for contributing to oTToDev here!](CONTRIBUTING.md)
231
 
232
- ## What are the future plans for oTToDev?
233
 
234
  [Check out our Roadmap here!](https://roadmap.sh/r/ottodev-roadmap-2ovzo)
235
 
@@ -237,4 +239,4 @@ Lot more updates to this roadmap coming soon!
237
 
238
  ## FAQ
239
 
240
- [Please check out our dedicated page for FAQ's related to oTToDev here!](FAQ.md)
 
1
  [![Bolt.new: AI-Powered Full-Stack Web Development in the Browser](./public/social_preview_index.jpg)](https://bolt.new)
2
 
3
+ # Bolt.diy (Previously oTToDev)
4
 
5
+ Welcome to Bolt.diy, the official open source version of Bolt.new (previously known as oTToDev and Bolt.new ANY LLM), which allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
6
 
7
+ Check the [Bolt.diy Docs](https://stackblitz-labs.github.io/bolt.diy/) for more information. This documentation is still being updated after the transfer.
8
 
9
+ Bolt.diy was originally started by [Cole Medin](https://www.youtube.com/@ColeMedin) but has quickly grown into a massive community effort to build the BEST open source AI coding assistant!
10
+
11
+ ## Join the community for Bolt.diy!
12
 
13
  https://thinktank.ottomator.ai
14
 
 
58
 
59
  ## Bolt.new: AI-Powered Full-Stack Web Development in the Browser
60
 
61
+ Bolt.new (and by extension Bolt.diy) is an AI-powered web development agent that allows you to prompt, run, edit, and deploy full-stack applications directly from your browser—no local setup required. If you're here to build your own AI-powered web dev agent using the Bolt open source codebase, [click here to get started!](./CONTRIBUTING.md)
62
 
63
  ## What Makes Bolt.new Different
64
 
 
98
  3. Clone the repository (if you haven't already) by opening a Terminal window (or CMD with admin permissions) and then typing in this:
99
 
100
  ```
101
+ git clone https://github.com/stackblitz-labs/bolt.diy.git
102
  ```
103
 
104
  3. Rename .env.example to .env.local and add your LLM API keys. You will find this file on a Mac at "[your name]/bolt.new-any-llm/.env.example". For Windows and Linux the path will be similar.
 
227
 
228
  This will start the Remix Vite development server. You will need Google Chrome Canary to run this locally if you use Chrome! It's an easy install and a good browser for web development anyway.
229
 
230
+ ## How do I contribute to Bolt.diy?
231
 
232
+ [Please check out our dedicated page for contributing to Bolt.diy here!](CONTRIBUTING.md)
233
 
234
+ ## What are the future plans for Bolt.diy?
235
 
236
  [Check out our Roadmap here!](https://roadmap.sh/r/ottodev-roadmap-2ovzo)
237
 
 
239
 
240
  ## FAQ
241
 
242
+ [Please check out our dedicated page for FAQs related to Bolt.diy here!](FAQ.md)
app/commit.json CHANGED
@@ -1 +1 @@
1
- { "commit": "eddf5603c3865536f96774fc3358cf24760fb613" }
 
1
+ { "commit": "fd2c17c384a69ab5e7a40113342caa7de405b944" }
app/components/chat/BaseChat.module.scss CHANGED
@@ -18,82 +18,6 @@
18
  opacity: 1;
19
  }
20
 
21
- .RayContainer {
22
- --gradient-opacity: 0.85;
23
- --ray-gradient: radial-gradient(rgba(83, 196, 255, var(--gradient-opacity)) 0%, rgba(43, 166, 255, 0) 100%);
24
- transition: opacity 0.25s linear;
25
- position: fixed;
26
- inset: 0;
27
- pointer-events: none;
28
- user-select: none;
29
- }
30
-
31
- .LightRayOne {
32
- width: 480px;
33
- height: 680px;
34
- transform: rotate(80deg);
35
- top: -540px;
36
- left: 250px;
37
- filter: blur(110px);
38
- position: absolute;
39
- border-radius: 100%;
40
- background: var(--ray-gradient);
41
- }
42
-
43
- .LightRayTwo {
44
- width: 110px;
45
- height: 400px;
46
- transform: rotate(-20deg);
47
- top: -280px;
48
- left: 350px;
49
- mix-blend-mode: overlay;
50
- opacity: 0.6;
51
- filter: blur(60px);
52
- position: absolute;
53
- border-radius: 100%;
54
- background: var(--ray-gradient);
55
- }
56
-
57
- .LightRayThree {
58
- width: 400px;
59
- height: 370px;
60
- top: -350px;
61
- left: 200px;
62
- mix-blend-mode: overlay;
63
- opacity: 0.6;
64
- filter: blur(21px);
65
- position: absolute;
66
- border-radius: 100%;
67
- background: var(--ray-gradient);
68
- }
69
-
70
- .LightRayFour {
71
- position: absolute;
72
- width: 330px;
73
- height: 370px;
74
- top: -330px;
75
- left: 50px;
76
- mix-blend-mode: overlay;
77
- opacity: 0.5;
78
- filter: blur(21px);
79
- border-radius: 100%;
80
- background: var(--ray-gradient);
81
- }
82
-
83
- .LightRayFive {
84
- position: absolute;
85
- width: 110px;
86
- height: 400px;
87
- transform: rotate(-40deg);
88
- top: -280px;
89
- left: -10px;
90
- mix-blend-mode: overlay;
91
- opacity: 0.8;
92
- filter: blur(60px);
93
- border-radius: 100%;
94
- background: var(--ray-gradient);
95
- }
96
-
97
  .PromptEffectContainer {
98
  --prompt-container-offset: 50px;
99
  --prompt-line-stroke-width: 1px;
 
18
  opacity: 1;
19
  }
20
 
21
  .PromptEffectContainer {
22
  --prompt-container-offset: 50px;
23
  --prompt-line-stroke-width: 1px;
app/components/chat/BaseChat.tsx CHANGED
@@ -110,8 +110,10 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
110
  const [recognition, setRecognition] = useState<SpeechRecognition | null>(null);
111
  const [transcript, setTranscript] = useState('');
112
 
113
- // Update enabled providers when cookies change
114
- console.log(transcript);
115
  useEffect(() => {
116
  // Load API keys from cookies on component mount
117
  try {
@@ -274,19 +276,9 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
274
  const baseChat = (
275
  <div
276
  ref={ref}
277
- className={classNames(
278
- styles.BaseChat,
279
- 'relative flex flex-col lg:flex-row h-full w-full overflow-hidden bg-bolt-elements-background-depth-1',
280
- )}
281
  data-chat-visible={showChat}
282
  >
283
- <div className={classNames(styles.RayContainer)}>
284
- <div className={classNames(styles.LightRayOne)}></div>
285
- <div className={classNames(styles.LightRayTwo)}></div>
286
- <div className={classNames(styles.LightRayThree)}></div>
287
- <div className={classNames(styles.LightRayFour)}></div>
288
- <div className={classNames(styles.LightRayFive)}></div>
289
- </div>
290
  <ClientOnly>{() => <Menu />}</ClientOnly>
291
  <div ref={scrollRef} className="flex flex-col lg:flex-row overflow-y-auto w-full h-full">
292
  <div className={classNames(styles.Chat, 'flex flex-col flex-grow lg:min-w-[var(--chat-min-width)] h-full')}>
@@ -336,15 +328,15 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
336
  gradientUnits="userSpaceOnUse"
337
  gradientTransform="rotate(-45)"
338
  >
339
- <stop offset="0%" stopColor="#1488fc" stopOpacity="0%"></stop>
340
- <stop offset="40%" stopColor="#1488fc" stopOpacity="80%"></stop>
341
- <stop offset="50%" stopColor="#1488fc" stopOpacity="80%"></stop>
342
- <stop offset="100%" stopColor="#1488fc" stopOpacity="0%"></stop>
343
  </linearGradient>
344
  <linearGradient id="shine-gradient">
345
  <stop offset="0%" stopColor="white" stopOpacity="0%"></stop>
346
- <stop offset="40%" stopColor="#8adaff" stopOpacity="80%"></stop>
347
- <stop offset="50%" stopColor="#8adaff" stopOpacity="80%"></stop>
348
  <stop offset="100%" stopColor="white" stopOpacity="0%"></stop>
349
  </linearGradient>
350
  </defs>
 
110
  const [recognition, setRecognition] = useState<SpeechRecognition | null>(null);
111
  const [transcript, setTranscript] = useState('');
112
 
113
+ useEffect(() => {
114
+ console.log(transcript);
115
+ }, [transcript]);
116
+
117
  useEffect(() => {
118
  // Load API keys from cookies on component mount
119
  try {
 
276
  const baseChat = (
277
  <div
278
  ref={ref}
279
+ className={classNames(styles.BaseChat, 'relative flex h-full w-full overflow-hidden')}
280
  data-chat-visible={showChat}
281
  >
282
  <ClientOnly>{() => <Menu />}</ClientOnly>
283
  <div ref={scrollRef} className="flex flex-col lg:flex-row overflow-y-auto w-full h-full">
284
  <div className={classNames(styles.Chat, 'flex flex-col flex-grow lg:min-w-[var(--chat-min-width)] h-full')}>
 
328
  gradientUnits="userSpaceOnUse"
329
  gradientTransform="rotate(-45)"
330
  >
331
+ <stop offset="0%" stopColor="#b44aff" stopOpacity="0%"></stop>
332
+ <stop offset="40%" stopColor="#b44aff" stopOpacity="80%"></stop>
333
+ <stop offset="50%" stopColor="#b44aff" stopOpacity="80%"></stop>
334
+ <stop offset="100%" stopColor="#b44aff" stopOpacity="0%"></stop>
335
  </linearGradient>
336
  <linearGradient id="shine-gradient">
337
  <stop offset="0%" stopColor="white" stopOpacity="0%"></stop>
338
+ <stop offset="40%" stopColor="#ffffff" stopOpacity="80%"></stop>
339
+ <stop offset="50%" stopColor="#ffffff" stopOpacity="80%"></stop>
340
  <stop offset="100%" stopColor="white" stopOpacity="0%"></stop>
341
  </linearGradient>
342
  </defs>
app/components/chat/ModelSelector.tsx CHANGED
@@ -121,8 +121,8 @@ export const ModelSelector = ({
121
  >
122
  {[...modelList]
123
  .filter((e) => e.provider == provider?.name && e.name)
124
- .map((modelOption) => (
125
- <option key={modelOption.name} value={modelOption.name}>
126
  {modelOption.label}
127
  </option>
128
  ))}
 
121
  >
122
  {[...modelList]
123
  .filter((e) => e.provider == provider?.name && e.name)
124
+ .map((modelOption, index) => (
125
+ <option key={index} value={modelOption.name}>
126
  {modelOption.label}
127
  </option>
128
  ))}
app/components/header/Header.tsx CHANGED
@@ -10,18 +10,17 @@ export function Header() {
10
 
11
  return (
12
  <header
13
- className={classNames(
14
- 'flex items-center bg-bolt-elements-background-depth-1 p-5 border-b h-[var(--header-height)]',
15
- {
16
- 'border-transparent': !chat.started,
17
- 'border-bolt-elements-borderColor': chat.started,
18
- },
19
- )}
20
  >
21
  <div className="flex items-center gap-2 z-logo text-bolt-elements-textPrimary cursor-pointer">
22
  <div className="i-ph:sidebar-simple-duotone text-xl" />
23
  <a href="/" className="text-2xl font-semibold text-accent flex items-center">
24
- <span className="i-bolt:logo-text?mask w-[46px] inline-block" />
25
  </a>
26
  </div>
27
  {chat.started && ( // Display ChatDescription and HeaderActionButtons only when the chat has started.
 
10
 
11
  return (
12
  <header
13
+ className={classNames('flex items-center p-5 border-b h-[var(--header-height)]', {
14
+ 'border-transparent': !chat.started,
15
+ 'border-bolt-elements-borderColor': chat.started,
16
+ })}
17
  >
18
  <div className="flex items-center gap-2 z-logo text-bolt-elements-textPrimary cursor-pointer">
19
  <div className="i-ph:sidebar-simple-duotone text-xl" />
20
  <a href="/" className="text-2xl font-semibold text-accent flex items-center">
21
+ {/* <span className="i-bolt:logo-text?mask w-[46px] inline-block" /> */}
22
+ <img src="/logo-light-styled.png" alt="logo" className="w-[90px] inline-block dark:hidden" />
23
+ <img src="/logo-dark-styled.png" alt="logo" className="w-[90px] inline-block hidden dark:block" />
24
  </a>
25
  </div>
26
  {chat.started && ( // Display ChatDescription and HeaderActionButtons only when the chat has started.
app/components/settings/Settings.module.scss CHANGED
@@ -46,7 +46,7 @@
46
  padding: 1rem;
47
  margin-bottom: 1rem;
48
  border-style: solid;
49
- border-color: var(--bolt-elements-button-danger-backgroundHover) ;
50
  border-width: thin;
51
 
52
  button {
@@ -60,4 +60,4 @@
60
  background-color: var(--bolt-elements-button-danger-backgroundHover);
61
  }
62
  }
63
- }
 
46
  padding: 1rem;
47
  margin-bottom: 1rem;
48
  border-style: solid;
49
+ border-color: var(--bolt-elements-button-danger-backgroundHover);
50
  border-width: thin;
51
 
52
  button {
 
60
  background-color: var(--bolt-elements-button-danger-backgroundHover);
61
  }
62
  }
63
+ }
app/components/settings/SettingsWindow.tsx CHANGED
@@ -83,7 +83,7 @@ export const SettingsWindow = ({ open, onClose }: SettingsProps) => {
83
  ))}
84
  <div className="mt-auto flex flex-col gap-2">
85
  <a
86
- href="https://github.com/coleam00/bolt.new-any-llm"
87
  target="_blank"
88
  rel="noopener noreferrer"
89
  className={classNames(styles['settings-button'], 'flex items-center gap-2')}
@@ -92,7 +92,7 @@ export const SettingsWindow = ({ open, onClose }: SettingsProps) => {
92
  GitHub
93
  </a>
94
  <a
95
- href="https://coleam00.github.io/bolt.new-any-llm"
96
  target="_blank"
97
  rel="noopener noreferrer"
98
  className={classNames(styles['settings-button'], 'flex items-center gap-2')}
 
83
  ))}
84
  <div className="mt-auto flex flex-col gap-2">
85
  <a
86
+ href="https://github.com/stackblitz-labs/bolt.diy"
87
  target="_blank"
88
  rel="noopener noreferrer"
89
  className={classNames(styles['settings-button'], 'flex items-center gap-2')}
 
92
  GitHub
93
  </a>
94
  <a
95
+ href="https://stackblitz-labs.github.io/bolt.diy/"
96
  target="_blank"
97
  rel="noopener noreferrer"
98
  className={classNames(styles['settings-button'], 'flex items-center gap-2')}
app/components/ui/BackgroundRays/index.tsx ADDED
@@ -0,0 +1,18 @@
1
+ import styles from './styles.module.scss';
2
+
3
+ const BackgroundRays = () => {
4
+ return (
5
+ <div className={`${styles.rayContainer} `}>
6
+ <div className={`${styles.lightRay} ${styles.ray1}`}></div>
7
+ <div className={`${styles.lightRay} ${styles.ray2}`}></div>
8
+ <div className={`${styles.lightRay} ${styles.ray3}`}></div>
9
+ <div className={`${styles.lightRay} ${styles.ray4}`}></div>
10
+ <div className={`${styles.lightRay} ${styles.ray5}`}></div>
11
+ <div className={`${styles.lightRay} ${styles.ray6}`}></div>
12
+ <div className={`${styles.lightRay} ${styles.ray7}`}></div>
13
+ <div className={`${styles.lightRay} ${styles.ray8}`}></div>
14
+ </div>
15
+ );
16
+ };
17
+
18
+ export default BackgroundRays;
app/components/ui/BackgroundRays/styles.module.scss ADDED
@@ -0,0 +1,246 @@
1
+ .rayContainer {
2
+ // Theme-specific colors
3
+ --ray-color-primary: color-mix(in srgb, var(--primary-color), transparent 30%);
4
+ --ray-color-secondary: color-mix(in srgb, var(--secondary-color), transparent 30%);
5
+ --ray-color-accent: color-mix(in srgb, var(--accent-color), transparent 30%);
6
+
7
+ // Theme-specific gradients
8
+ --ray-gradient-primary: radial-gradient(var(--ray-color-primary) 0%, transparent 70%);
9
+ --ray-gradient-secondary: radial-gradient(var(--ray-color-secondary) 0%, transparent 70%);
10
+ --ray-gradient-accent: radial-gradient(var(--ray-color-accent) 0%, transparent 70%);
11
+
12
+ position: fixed;
13
+ inset: 0;
14
+ overflow: hidden;
15
+ animation: fadeIn 1.5s ease-out;
16
+ pointer-events: none;
17
+ z-index: 0;
18
+ // background-color: transparent;
19
+
20
+ :global(html[data-theme='dark']) & {
21
+ mix-blend-mode: screen;
22
+ }
23
+
24
+ :global(html[data-theme='light']) & {
25
+ mix-blend-mode: multiply;
26
+ }
27
+ }
28
+
29
+ .lightRay {
30
+ position: absolute;
31
+ border-radius: 100%;
32
+
33
+ :global(html[data-theme='dark']) & {
34
+ mix-blend-mode: screen;
35
+ }
36
+
37
+ :global(html[data-theme='light']) & {
38
+ mix-blend-mode: multiply;
39
+ opacity: 0.4;
40
+ }
41
+ }
42
+
43
+ .ray1 {
44
+ width: 600px;
45
+ height: 800px;
46
+ background: var(--ray-gradient-primary);
47
+ transform: rotate(65deg);
48
+ top: -500px;
49
+ left: -100px;
50
+ filter: blur(80px);
51
+ opacity: 0.6;
52
+ animation: float1 15s infinite ease-in-out;
53
+ }
54
+
55
+ .ray2 {
56
+ width: 400px;
57
+ height: 600px;
58
+ background: var(--ray-gradient-secondary);
59
+ transform: rotate(-30deg);
60
+ top: -300px;
61
+ left: 200px;
62
+ filter: blur(60px);
63
+ opacity: 0.6;
64
+ animation: float2 18s infinite ease-in-out;
65
+ }
66
+
67
+ .ray3 {
68
+ width: 500px;
69
+ height: 400px;
70
+ background: var(--ray-gradient-accent);
71
+ top: -320px;
72
+ left: 500px;
73
+ filter: blur(65px);
74
+ opacity: 0.5;
75
+ animation: float3 20s infinite ease-in-out;
76
+ }
77
+
78
+ .ray4 {
79
+ width: 400px;
80
+ height: 450px;
81
+ background: var(--ray-gradient-secondary);
82
+ top: -350px;
83
+ left: 800px;
84
+ filter: blur(55px);
85
+ opacity: 0.55;
86
+ animation: float4 17s infinite ease-in-out;
87
+ }
88
+
89
+ .ray5 {
90
+ width: 350px;
91
+ height: 500px;
92
+ background: var(--ray-gradient-primary);
93
+ transform: rotate(-45deg);
94
+ top: -250px;
95
+ left: 1000px;
96
+ filter: blur(45px);
97
+ opacity: 0.6;
98
+ animation: float5 16s infinite ease-in-out;
99
+ }
100
+
101
+ .ray6 {
102
+ width: 300px;
103
+ height: 700px;
104
+ background: var(--ray-gradient-accent);
105
+ transform: rotate(75deg);
106
+ top: -400px;
107
+ left: 600px;
108
+ filter: blur(75px);
109
+ opacity: 0.45;
110
+ animation: float6 19s infinite ease-in-out;
111
+ }
112
+
113
+ .ray7 {
114
+ width: 450px;
115
+ height: 600px;
116
+ background: var(--ray-gradient-primary);
117
+ transform: rotate(45deg);
118
+ top: -450px;
119
+ left: 350px;
120
+ filter: blur(65px);
121
+ opacity: 0.55;
122
+ animation: float7 21s infinite ease-in-out;
123
+ }
124
+
125
+ .ray8 {
126
+ width: 380px;
127
+ height: 550px;
128
+ background: var(--ray-gradient-secondary);
129
+ transform: rotate(-60deg);
130
+ top: -380px;
131
+ left: 750px;
132
+ filter: blur(58px);
133
+ opacity: 0.6;
134
+ animation: float8 14s infinite ease-in-out;
135
+ }
136
+
137
+ @keyframes float1 {
138
+ 0%,
139
+ 100% {
140
+ transform: rotate(65deg) translate(0, 0);
141
+ }
142
+ 25% {
143
+ transform: rotate(70deg) translate(30px, 20px);
144
+ }
145
+ 50% {
146
+ transform: rotate(60deg) translate(-20px, 40px);
147
+ }
148
+ 75% {
149
+ transform: rotate(68deg) translate(-40px, 10px);
150
+ }
151
+ }
152
+
153
+ @keyframes float2 {
154
+ 0%,
155
+ 100% {
156
+ transform: rotate(-30deg) scale(1);
157
+ }
158
+ 33% {
159
+ transform: rotate(-25deg) scale(1.1);
160
+ }
161
+ 66% {
162
+ transform: rotate(-35deg) scale(0.95);
163
+ }
164
+ }
165
+
166
+ @keyframes float3 {
167
+ 0%,
168
+ 100% {
169
+ transform: translate(0, 0) rotate(0deg);
170
+ }
171
+ 25% {
172
+ transform: translate(40px, 20px) rotate(5deg);
173
+ }
174
+ 75% {
175
+ transform: translate(-30px, 40px) rotate(-5deg);
176
+ }
177
+ }
178
+
179
+ @keyframes float4 {
180
+ 0%,
181
+ 100% {
182
+ transform: scale(1) rotate(0deg);
183
+ }
184
+ 50% {
185
+ transform: scale(1.15) rotate(10deg);
186
+ }
187
+ }
188
+
189
+ @keyframes float5 {
190
+ 0%,
191
+ 100% {
192
+ transform: rotate(-45deg) translate(0, 0);
193
+ }
194
+ 33% {
195
+ transform: rotate(-40deg) translate(25px, -20px);
196
+ }
197
+ 66% {
198
+ transform: rotate(-50deg) translate(-25px, 20px);
199
+ }
200
+ }
201
+
202
+ @keyframes float6 {
203
+ 0%,
204
+ 100% {
205
+ transform: rotate(75deg) scale(1);
206
+ filter: blur(75px);
207
+ }
208
+ 50% {
209
+ transform: rotate(85deg) scale(1.1);
210
+ filter: blur(65px);
211
+ }
212
+ }
213
+
214
+ @keyframes float7 {
215
+ 0%,
216
+ 100% {
217
+ transform: rotate(45deg) translate(0, 0);
218
+ opacity: 0.55;
219
+ }
220
+ 50% {
221
+ transform: rotate(40deg) translate(-30px, 30px);
222
+ opacity: 0.65;
223
+ }
224
+ }
225
+
226
+ @keyframes float8 {
227
+ 0%,
228
+ 100% {
229
+ transform: rotate(-60deg) scale(1);
230
+ }
231
+ 25% {
232
+ transform: rotate(-55deg) scale(1.05);
233
+ }
234
+ 75% {
235
+ transform: rotate(-65deg) scale(0.95);
236
+ }
237
+ }
238
+
239
+ @keyframes fadeIn {
240
+ from {
241
+ opacity: 0;
242
+ }
243
+ to {
244
+ opacity: 1;
245
+ }
246
+ }
app/routes/_index.tsx CHANGED
@@ -3,6 +3,7 @@ import { ClientOnly } from 'remix-utils/client-only';
3
  import { BaseChat } from '~/components/chat/BaseChat';
4
  import { Chat } from '~/components/chat/Chat.client';
5
  import { Header } from '~/components/header/Header';
 
6
 
7
  export const meta: MetaFunction = () => {
8
  return [{ title: 'Bolt' }, { name: 'description', content: 'Talk with Bolt, an AI assistant from StackBlitz' }];
@@ -12,7 +13,8 @@ export const loader = () => json({});
12
 
13
  export default function Index() {
14
  return (
15
- <div className="flex flex-col h-full w-full">
 
16
  <Header />
17
  <ClientOnly fallback={<BaseChat />}>{() => <Chat />}</ClientOnly>
18
  </div>
 
3
  import { BaseChat } from '~/components/chat/BaseChat';
4
  import { Chat } from '~/components/chat/Chat.client';
5
  import { Header } from '~/components/header/Header';
6
+ import BackgroundRays from '~/components/ui/BackgroundRays';
7
 
8
  export const meta: MetaFunction = () => {
9
  return [{ title: 'Bolt' }, { name: 'description', content: 'Talk with Bolt, an AI assistant from StackBlitz' }];
 
13
 
14
  export default function Index() {
15
  return (
16
+ <div className="flex flex-col h-full w-full bg-bolt-elements-background-depth-1">
17
+ <BackgroundRays />
18
  <Header />
19
  <ClientOnly fallback={<BaseChat />}>{() => <Chat />}</ClientOnly>
20
  </div>
app/styles/index.scss CHANGED
@@ -12,3 +12,13 @@ body {
12
  height: 100%;
13
  width: 100%;
14
  }
 
12
  height: 100%;
13
  width: 100%;
14
  }
15
+
16
+ :root {
17
+ --gradient-opacity: 0.8;
18
+ --primary-color: rgba(158, 117, 240, var(--gradient-opacity));
19
+ --secondary-color: rgba(138, 43, 226, var(--gradient-opacity));
20
+ --accent-color: rgba(128, 59, 239, var(--gradient-opacity));
21
+ // --primary-color: rgba(147, 112, 219, var(--gradient-opacity));
22
+ // --secondary-color: rgba(138, 43, 226, var(--gradient-opacity));
23
+ // --accent-color: rgba(180, 170, 220, var(--gradient-opacity));
24
+ }
app/utils/constants.ts CHANGED
@@ -1,6 +1,7 @@
1
  import Cookies from 'js-cookie';
2
  import type { ModelInfo, OllamaApiResponse, OllamaModel } from './types';
3
  import type { ProviderInfo, IProviderSetting } from '~/types/model';
 
4
 
5
  export const WORK_DIR_NAME = 'project';
6
  export const WORK_DIR = `/home/${WORK_DIR_NAME}`;
@@ -10,6 +11,8 @@ export const PROVIDER_REGEX = /\[Provider: (.*?)\]\n\n/;
10
  export const DEFAULT_MODEL = 'claude-3-5-sonnet-latest';
11
  export const PROMPT_COOKIE_KEY = 'cachedPrompt';
12
 
13
  const PROVIDER_LIST: ProviderInfo[] = [
14
  {
15
  name: 'Anthropic',
@@ -386,8 +389,8 @@ async function getOllamaModels(apiKeys?: Record<string, string>, settings?: IPro
386
  provider: 'Ollama',
387
  maxTokenAllowed: 8000,
388
  }));
389
- } catch (e) {
390
- console.error('Error getting Ollama models:', e);
391
  return [];
392
  }
393
  }
@@ -475,8 +478,8 @@ async function getLMStudioModels(_apiKeys?: Record<string, string>, settings?: I
475
  label: model.id,
476
  provider: 'LMStudio',
477
  }));
478
- } catch (e) {
479
- console.error('Error getting LMStudio models:', e);
480
  return [];
481
  }
482
  }
@@ -495,7 +498,7 @@ async function initializeModelList(providerSettings?: Record<string, IProviderSe
495
  }
496
  }
497
  } catch (error: any) {
498
- console.warn(`Failed to fetch apikeys from cookies:${error?.message}`);
499
  }
500
  MODEL_LIST = [
501
  ...(
 
1
  import Cookies from 'js-cookie';
2
  import type { ModelInfo, OllamaApiResponse, OllamaModel } from './types';
3
  import type { ProviderInfo, IProviderSetting } from '~/types/model';
4
+ import { createScopedLogger } from './logger';
5
 
6
  export const WORK_DIR_NAME = 'project';
7
  export const WORK_DIR = `/home/${WORK_DIR_NAME}`;
 
11
  export const DEFAULT_MODEL = 'claude-3-5-sonnet-latest';
12
  export const PROMPT_COOKIE_KEY = 'cachedPrompt';
13
 
14
+ const logger = createScopedLogger('Constants');
15
+
16
  const PROVIDER_LIST: ProviderInfo[] = [
17
  {
18
  name: 'Anthropic',
 
389
  provider: 'Ollama',
390
  maxTokenAllowed: 8000,
391
  }));
392
+ } catch (e: any) {
393
+ logger.warn('Failed to get Ollama models: ', e.message || '');
394
  return [];
395
  }
396
  }
 
478
  label: model.id,
479
  provider: 'LMStudio',
480
  }));
481
+ } catch (e: any) {
482
+ logger.warn('Failed to get LMStudio models: ', e.message || '');
483
  return [];
484
  }
485
  }
 
498
  }
499
  }
500
  } catch (error: any) {
501
+ logger.warn(`Failed to fetch apikeys from cookies: ${error?.message}`);
502
  }
503
  MODEL_LIST = [
504
  ...(
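The `app/utils/constants.ts` hunks above replace bare `console.error`/`console.warn` calls with a logger scoped to `'Constants'` via `createScopedLogger` from `./logger`. The sketch below is a hypothetical illustration of that pattern only; the project's actual `./logger` implementation may differ.

```ts
// Hypothetical sketch of a scoped-logger factory in the spirit of createScopedLogger;
// the project's real ./logger module may be implemented differently.
type LogLevel = 'debug' | 'info' | 'warn' | 'error';

export function createScopedLogger(scope: string) {
  const log = (level: LogLevel, ...args: unknown[]) => {
    // Prefix every message with its scope so log lines can be traced to a module.
    console[level](`[${scope}]`, ...args);
  };

  return {
    debug: (...args: unknown[]) => log('debug', ...args),
    info: (...args: unknown[]) => log('info', ...args),
    warn: (...args: unknown[]) => log('warn', ...args),
    error: (...args: unknown[]) => log('error', ...args),
  };
}

// Usage mirroring the hunks above:
const logger = createScopedLogger('Constants');
logger.warn('Failed to get Ollama models: ', 'connection refused');
```

Note that the diff also downgrades missing local providers (Ollama, LMStudio) from errors to warnings, which matches the existing `return []` fallback in those catch blocks.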
docs/docs/CONTRIBUTING.md CHANGED
@@ -4,7 +4,7 @@
4
 
5
  The `DEFAULT_NUM_CTX` environment variable can be used to limit the maximum number of context values used by the qwen2.5-coder model. For example, to limit the context to 24576 values (which uses 32GB of VRAM), set `DEFAULT_NUM_CTX=24576` in your `.env.local` file.
6
 
7
- First off, thank you for considering contributing to Bolt.new! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.new a better tool for developers worldwide.
8
 
9
  ## 📋 Table of Contents
10
  - [Code of Conduct](#code-of-conduct)
@@ -62,7 +62,7 @@ We're looking for dedicated contributors to help maintain and grow this project.
62
  ### 🔄 Initial Setup
63
  1. Clone the repository:
64
  ```bash
65
- git clone https://github.com/coleam00/bolt.new-any-llm.git
66
  ```
67
 
68
  2. Install dependencies:
 
4
 
5
  The `DEFAULT_NUM_CTX` environment variable can be used to limit the maximum number of context values used by the qwen2.5-coder model. For example, to limit the context to 24576 values (which uses 32GB of VRAM), set `DEFAULT_NUM_CTX=24576` in your `.env.local` file.
6
 
7
+ First off, thank you for considering contributing to Bolt.diy! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.diy a better tool for developers worldwide.
8
 
9
  ## 📋 Table of Contents
10
  - [Code of Conduct](#code-of-conduct)
 
62
  ### 🔄 Initial Setup
63
  1. Clone the repository:
64
  ```bash
65
+ git clone https://github.com/stackblitz-labs/bolt.diy.git
66
  ```
67
 
68
  2. Install dependencies:
docs/docs/FAQ.md CHANGED
@@ -1,15 +1,15 @@
1
  # Frequently Asked Questions (FAQ)
2
 
3
- ## How do I get the best results with oTToDev?
4
 
5
  - **Be specific about your stack**:
6
- Mention the frameworks or libraries you want to use (e.g., Astro, Tailwind, ShadCN) in your initial prompt. This ensures that oTToDev scaffolds the project according to your preferences.
7
 
8
  - **Use the enhance prompt icon**:
9
  Before sending your prompt, click the *enhance* icon to let the AI refine your prompt. You can edit the suggested improvements before submitting.
10
 
11
  - **Scaffold the basics first, then add features**:
12
- Ensure the foundational structure of your application is in place before introducing advanced functionality. This helps oTToDev establish a solid base to build on.
13
 
14
  - **Batch simple instructions**:
15
  Combine simple tasks into a single prompt to save time and reduce API credit consumption. For example:
@@ -17,19 +17,14 @@
17
 
18
  ---
19
 
20
- ## How do I contribute to oTToDev?
21
 
22
  Check out our [Contribution Guide](CONTRIBUTING.md) for more details on how to get involved!
23
 
24
  ---
25
 
26
- ## Do you plan on merging oTToDev back into the official Bolt.new repo?
27
 
28
- Stay tuned! We’ll share updates on this early next month.
29
-
30
- ---
31
-
32
- ## What are the future plans for oTToDev?
33
 
34
  Visit our [Roadmap](https://roadmap.sh/r/ottodev-roadmap-2ovzo) for the latest updates.
35
  New features and improvements are on the way!
@@ -38,13 +33,13 @@ New features and improvements are on the way!
38
 
39
  ## Why are there so many open issues/pull requests?
40
 
41
- oTToDev began as a small showcase project on @ColeMedin's YouTube channel to explore editing open-source projects with local LLMs. However, it quickly grew into a massive community effort!
42
 
43
  We’re forming a team of maintainers to manage demand and streamline issue resolution. The maintainers are rockstars, and we’re also exploring partnerships to help the project thrive.
44
 
45
  ---
46
 
47
- ## How do local LLMs compare to larger models like Claude 3.5 Sonnet for oTToDev/Bolt.new?
48
 
49
  While local LLMs are improving rapidly, larger models like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b still offer the best results for complex applications. Our ongoing focus is to improve prompts, agents, and the platform to better support smaller local LLMs.
50
 
 
1
  # Frequently Asked Questions (FAQ)
2
 
3
+ ## How do I get the best results with Bolt.diy?
4
 
5
  - **Be specific about your stack**:
6
+ Mention the frameworks or libraries you want to use (e.g., Astro, Tailwind, ShadCN) in your initial prompt. This ensures that Bolt.diy scaffolds the project according to your preferences.
7
 
8
  - **Use the enhance prompt icon**:
9
  Before sending your prompt, click the *enhance* icon to let the AI refine your prompt. You can edit the suggested improvements before submitting.
10
 
11
  - **Scaffold the basics first, then add features**:
12
+ Ensure the foundational structure of your application is in place before introducing advanced functionality. This helps Bolt.diy establish a solid base to build on.
13
 
14
  - **Batch simple instructions**:
15
  Combine simple tasks into a single prompt to save time and reduce API credit consumption. For example:
 
17
 
18
  ---
19
 
20
+ ## How do I contribute to Bolt.diy?
21
 
22
  Check out our [Contribution Guide](CONTRIBUTING.md) for more details on how to get involved!
23
 
24
  ---
25
 
 
26
 
27
+ ## What are the future plans for Bolt.diy?
28
 
29
  Visit our [Roadmap](https://roadmap.sh/r/ottodev-roadmap-2ovzo) for the latest updates.
30
  New features and improvements are on the way!
 
33
 
34
  ## Why are there so many open issues/pull requests?
35
 
36
+ Bolt.diy began as a small showcase project on @ColeMedin's YouTube channel to explore editing open-source projects with local LLMs. However, it quickly grew into a massive community effort!
37
 
38
  We’re forming a team of maintainers to manage demand and streamline issue resolution. The maintainers are rockstars, and we’re also exploring partnerships to help the project thrive.
39
 
40
  ---
41
 
42
+ ## How do local LLMs compare to larger models like Claude 3.5 Sonnet for Bolt.diy?
43
 
44
  While local LLMs are improving rapidly, larger models like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b still offer the best results for complex applications. Our ongoing focus is to improve prompts, agents, and the platform to better support smaller local LLMs.
45
 
docs/docs/index.md CHANGED
@@ -1,28 +1,28 @@
1
- # Welcome to OTTO Dev
2
- This fork of Bolt.new (oTToDev) allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
3
 
4
- Join the community for oTToDev!
5
 
6
  https://thinktank.ottomator.ai
7
 
8
- ## Whats Bolt.new
9
 
10
- Bolt.new is an AI-powered web development agent that allows you to prompt, run, edit, and deploy full-stack applications directly from your browser—no local setup required. If you're here to build your own AI-powered web dev agent using the Bolt open source codebase, [click here to get started!](./CONTRIBUTING.md)
11
 
12
- ## What Makes Bolt.new Different
13
 
14
- Claude, v0, etc are incredible- but you can't install packages, run backends, or edit code. That’s where Bolt.new stands out:
15
 
16
- - **Full-Stack in the Browser**: Bolt.new integrates cutting-edge AI models with an in-browser development environment powered by **StackBlitz’s WebContainers**. This allows you to:
17
  - Install and run npm tools and libraries (like Vite, Next.js, and more)
18
  - Run Node.js servers
19
  - Interact with third-party APIs
20
  - Deploy to production from chat
21
  - Share your work via a URL
22
 
23
- - **AI with Environment Control**: Unlike traditional dev environments where the AI can only assist in code generation, Bolt.new gives AI models **complete control** over the entire environment including the filesystem, node server, package manager, terminal, and browser console. This empowers AI agents to handle the whole app lifecycle—from creation to deployment.
24
 
25
- Whether you’re an experienced developer, a PM, or a designer, Bolt.new allows you to easily build production-grade full-stack applications.
26
 
27
  For developers interested in building their own AI-powered development tools with WebContainers, check out the open-source Bolt codebase in this repo!
28
 
@@ -47,10 +47,10 @@ If you see usr/local/bin in the output then you're good to go.
47
  3. Clone the repository (if you haven't already) by opening a Terminal window (or CMD with admin permissions) and then typing in this:
48
 
49
  ```
50
- git clone https://github.com/coleam00/bolt.new-any-llm.git
51
  ```
52
 
53
- 3. Rename .env.example to .env.local and add your LLM API keys. You will find this file on a Mac at "[your name]/bold.new-any-llm/.env.example". For Windows and Linux the path will be similar.
54
 
55
  ![image](https://github.com/user-attachments/assets/7e6a532c-2268-401f-8310-e8d20c731328)
56
 
@@ -150,7 +150,7 @@ pnpm run dev
150
 
151
  ## Adding New LLMs:
152
 
153
- To make new LLMs available to use in this version of Bolt.new, head on over to `app/utils/constants.ts` and find the constant MODEL_LIST. Each element in this array is an object that has the model ID for the name (get this from the provider's API documentation), a label for the frontend model dropdown, and the provider.
154
 
155
  By default, Anthropic, OpenAI, Groq, and Ollama are implemented as providers, but the YouTube video for this repo covers how to extend this to work with more providers if you wish!
156
 
@@ -179,7 +179,7 @@ This will start the Remix Vite development server. You will need Google Chrome C
179
 
180
  ## Tips and Tricks
181
 
182
- Here are some tips to get the most out of Bolt.new:
183
 
184
  - **Be specific about your stack**: If you want to use specific frameworks or libraries (like Astro, Tailwind, ShadCN, or any other popular JavaScript framework), mention them in your initial prompt to ensure Bolt scaffolds the project accordingly.
185
 
 
1
+ # Welcome to Bolt.diy
2
+ Bolt.diy allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
3
 
4
+ Join the community!
5
 
6
  https://thinktank.ottomator.ai
7
 
8
+ ## What's Bolt.diy?
9
 
10
+ Bolt.diy is an AI-powered web development agent that allows you to prompt, run, edit, and deploy full-stack applications directly from your browser—no local setup required. If you're here to build your own AI-powered web dev agent using the Bolt open source codebase, [click here to get started!](./CONTRIBUTING.md)
11
 
12
+ ## What Makes Bolt.diy Different
13
 
14
+ Claude, v0, etc. are incredible, but you can't install packages, run backends, or edit code. That’s where Bolt.diy stands out:
15
 
16
+ - **Full-Stack in the Browser**: Bolt.diy integrates cutting-edge AI models with an in-browser development environment powered by **StackBlitz’s WebContainers**. This allows you to:
17
  - Install and run npm tools and libraries (like Vite, Next.js, and more)
18
  - Run Node.js servers
19
  - Interact with third-party APIs
20
  - Deploy to production from chat
21
  - Share your work via a URL
22
 
23
+ - **AI with Environment Control**: Unlike traditional dev environments where the AI can only assist in code generation, Bolt.diy gives AI models **complete control** over the entire environment including the filesystem, node server, package manager, terminal, and browser console. This empowers AI agents to handle the whole app lifecycle—from creation to deployment.
24
 
25
+ Whether you’re an experienced developer, a PM, or a designer, Bolt.diy allows you to easily build production-grade full-stack applications.
26
 
27
  For developers interested in building their own AI-powered development tools with WebContainers, check out the open-source Bolt codebase in this repo!
28
 
 
47
  3. Clone the repository (if you haven't already) by opening a Terminal window (or CMD with admin permissions) and then typing in this:
48
 
49
  ```
50
+ git clone https://github.com/stackblitz-labs/bolt.diy.git
51
  ```
52
 
53
+ 3. Rename .env.example to .env.local and add your LLM API keys. You will find this file on a Mac at "[your name]/bolt.diy/.env.example". For Windows and Linux the path will be similar.
54
 
55
  ![image](https://github.com/user-attachments/assets/7e6a532c-2268-401f-8310-e8d20c731328)
56
 
 
150
 
151
  ## Adding New LLMs:
152
 
153
+ To make new LLMs available to use in this version of Bolt.diy, head on over to `app/utils/constants.ts` and find the constant MODEL_LIST. Each element in this array is an object that has the model ID for the name (get this from the provider's API documentation), a label for the frontend model dropdown, and the provider.
154
 
155
  By default, Anthropic, OpenAI, Groq, and Ollama are implemented as providers, but the YouTube video for this repo covers how to extend this to work with more providers if you wish!
156
 
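For reference, a MODEL_LIST entry along the lines described above might look like the following. This is a hypothetical example: the field names (`name`, `label`, `provider`, `maxTokenAllowed`) are inferred from the `app/utils/constants.ts` hunks in this commit and may not cover the full `ModelInfo` type.

```ts
// Hypothetical MODEL_LIST entry; field names are inferred from the
// app/utils/constants.ts hunks in this commit and may differ from ModelInfo.
const exampleModel = {
  name: 'claude-3-5-sonnet-latest', // model ID from the provider's API documentation
  label: 'Claude 3.5 Sonnet',       // text shown in the frontend model dropdown
  provider: 'Anthropic',            // must match a provider name in PROVIDER_LIST
  maxTokenAllowed: 8000,            // token ceiling, as used for the Ollama entries above
};
```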
 
179
 
180
  ## Tips and Tricks
181
 
182
+ Here are some tips to get the most out of Bolt.diy:
183
 
184
  - **Be specific about your stack**: If you want to use specific frameworks or libraries (like Astro, Tailwind, ShadCN, or any other popular JavaScript framework), mention them in your initial prompt to ensure Bolt scaffolds the project accordingly.
185
 
docs/mkdocs.yml CHANGED
@@ -1,4 +1,4 @@
1
- site_name: Bolt.Local Docs
2
  site_dir: ../site
3
  theme:
4
  name: material
@@ -31,19 +31,19 @@ theme:
31
  repo: fontawesome/brands/github
32
  # logo: assets/logo.png
33
  # favicon: assets/logo.png
34
- repo_name: Bolt.Local
35
- repo_url: https://github.com/coleam00/bolt.new-any-llm
36
  edit_uri: ""
37
 
38
  extra:
39
  generator: false
40
  social:
41
  - icon: fontawesome/brands/github
42
- link: https://github.com/coleam00/bolt.new-any-llm
43
- name: Bolt.Local
44
  - icon: fontawesome/brands/discourse
45
  link: https://thinktank.ottomator.ai/
46
- name: Bolt.Local Discourse
47
 
48
 
49
  markdown_extensions:
 
1
+ site_name: Bolt.diy Docs
2
  site_dir: ../site
3
  theme:
4
  name: material
 
31
  repo: fontawesome/brands/github
32
  # logo: assets/logo.png
33
  # favicon: assets/logo.png
34
+ repo_name: Bolt.diy
35
+ repo_url: https://github.com/stackblitz-labs/bolt.diy
36
  edit_uri: ""
37
 
38
  extra:
39
  generator: false
40
  social:
41
  - icon: fontawesome/brands/github
42
+ link: https://github.com/stackblitz-labs/bolt.diy
43
+ name: Bolt.diy
44
  - icon: fontawesome/brands/discourse
45
  link: https://thinktank.ottomator.ai/
46
+ name: Bolt.diy Discourse
47
 
48
 
49
  markdown_extensions:
public/favicon.svg CHANGED
public/logo-dark-styled.png ADDED
public/logo-dark.png ADDED
public/logo-light-styled.png ADDED
public/logo-light.png ADDED
public/logo.svg CHANGED
public/social_preview_index.jpg CHANGED
uno.config.ts CHANGED
@@ -35,17 +35,17 @@ const BASE_COLORS = {
35
  950: '#0A0A0A',
36
  },
37
  accent: {
38
- 50: '#EEF9FF',
39
- 100: '#D8F1FF',
40
- 200: '#BAE7FF',
41
- 300: '#8ADAFF',
42
- 400: '#53C4FF',
43
- 500: '#2BA6FF',
44
- 600: '#1488FC',
45
- 700: '#0D6FE8',
46
- 800: '#1259BB',
47
- 900: '#154E93',
48
- 950: '#122F59',
49
  },
50
  green: {
51
  50: '#F0FDF4',
 
35
  950: '#0A0A0A',
36
  },
37
  accent: {
38
+ 50: '#F8F5FF',
39
+ 100: '#F0EBFF',
40
+ 200: '#E1D6FF',
41
+ 300: '#CEBEFF',
42
+ 400: '#B69EFF',
43
+ 500: '#9C7DFF',
44
+ 600: '#8A5FFF',
45
+ 700: '#7645E8',
46
+ 800: '#6234BB',
47
+ 900: '#502D93',
48
+ 950: '#2D1959',
49
  },
50
  green: {
51
  50: '#F0FDF4',
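The accent palette swap above is what moves the UI from the old blue to the new purple. Assuming these values are registered as `colors.accent` in the UnoCSS theme (as the surrounding config suggests), shade utilities such as `bg-accent-100` or `text-accent-700` in components pick up the new hue without any markup changes. A small hypothetical usage sketch:

```tsx
import type { ReactNode } from 'react';

// Hypothetical component: with the accent scale above registered as a UnoCSS theme
// color, these shade utilities resolve to the new purple values automatically.
export function AccentBadge({ children }: { children: ReactNode }) {
  return (
    <span className="bg-accent-100 text-accent-700 border border-accent-300 rounded px-2 py-1">
      {children}
    </span>
  );
}
```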