codacus committed
Commit da37d94
2 Parent(s): 3c7b125 bfaa3f7

Merge branch 'main' into context-optimization
CONTRIBUTING.md CHANGED
@@ -1,6 +1,6 @@
 # Contributing to oTToDev
 
-First off, thank you for considering contributing to oTToDev! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make oTToDev a better tool for developers worldwide.
+First off, thank you for considering contributing to Bolt.diy! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.diy a better tool for developers worldwide.
 
 ## 📋 Table of Contents
 - [Code of Conduct](#code-of-conduct)
FAQ.md CHANGED
@@ -1,45 +1,45 @@
 [![Bolt.new: AI-Powered Full-Stack Web Development in the Browser](./public/social_preview_index.jpg)](https://bolt.new)
 
-# Bolt.new Fork by Cole Medin - oTToDev
+# Bolt.new Fork by Cole Medin - Bolt.diy
 
 ## FAQ
 
-### How do I get the best results with oTToDev?
+### How do I get the best results with Bolt.diy?
 
 - **Be specific about your stack**: If you want to use specific frameworks or libraries (like Astro, Tailwind, ShadCN, or any other popular JavaScript framework), mention them in your initial prompt to ensure Bolt scaffolds the project accordingly.
 
 - **Use the enhance prompt icon**: Before sending your prompt, try clicking the 'enhance' icon to have the AI model help you refine your prompt, then edit the results before submitting.
 
-- **Scaffold the basics first, then add features**: Make sure the basic structure of your application is in place before diving into more advanced functionality. This helps oTToDev understand the foundation of your project and ensure everything is wired up right before building out more advanced functionality.
+- **Scaffold the basics first, then add features**: Make sure the basic structure of your application is in place before diving into more advanced functionality. This helps Bolt.diy understand the foundation of your project and ensure everything is wired up right before building out more advanced functionality.
 
-- **Batch simple instructions**: Save time by combining simple instructions into one message. For example, you can ask oTToDev to change the color scheme, add mobile responsiveness, and restart the dev server, all in one go, saving you time and reducing API credit consumption significantly.
+- **Batch simple instructions**: Save time by combining simple instructions into one message. For example, you can ask Bolt.diy to change the color scheme, add mobile responsiveness, and restart the dev server, all in one go, saving you time and reducing API credit consumption significantly.
 
-### Do you plan on merging oTToDev back into the official Bolt.new repo?
+### Do you plan on merging Bolt.diy back into the official Bolt.new repo?
 
 More news on this coming early next month - stay tuned!
 
 ### Why are there so many open issues/pull requests?
 
-oTToDev was started simply to showcase how to edit an open source project and to do something cool with local LLMs on my (@ColeMedin) YouTube channel! However, it quickly
+Bolt.diy was started simply to showcase how to edit an open source project and to do something cool with local LLMs on my (@ColeMedin) YouTube channel! However, it quickly
 grew into a massive community project that I am working hard to keep up with the demand of by forming a team of maintainers and getting as many people involved as I can.
 That effort is going well and all of our maintainers are ABSOLUTE rockstars, but it still takes time to organize everything so we can efficiently get through all
 the issues and PRs. But rest assured, we are working hard and even working on some partnerships behind the scenes to really help this project take off!
 
-### How do local LLMs fare compared to larger models like Claude 3.5 Sonnet for oTToDev/Bolt.new?
+### How do local LLMs fare compared to larger models like Claude 3.5 Sonnet for Bolt.diy/Bolt.new?
 
 As much as the gap is quickly closing between open source and massive closed source models, you’re still going to get the best results with the very large models like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b. This is one of the big tasks we have at hand - figuring out how to prompt better, use agents, and improve the platform as a whole to make it work better for even the smaller local LLMs!
 
 ### I'm getting the error: "There was an error processing this request"
 
-If you see this error within oTToDev, that is just the application telling you there is a problem at a high level, and this could mean a number of different things. To find the actual error, please check BOTH the terminal where you started the application (with Docker or pnpm) and the developer console in the browser. For most browsers, you can access the developer console by pressing F12 or right-clicking anywhere in the browser and selecting “Inspect”. Then go to the “console” tab in the top right.
+If you see this error within Bolt.diy, that is just the application telling you there is a problem at a high level, and this could mean a number of different things. To find the actual error, please check BOTH the terminal where you started the application (with Docker or pnpm) and the developer console in the browser. For most browsers, you can access the developer console by pressing F12 or right-clicking anywhere in the browser and selecting “Inspect”. Then go to the “console” tab in the top right.
 
 ### I'm getting the error: "x-api-key header missing"
 
-We have seen this error a couple of times, and for some reason just restarting the Docker container has fixed it. This seems to be Ollama specific. Another thing to try is to run oTToDev with Docker or pnpm, whichever you didn’t run first. We are still on the hunt for why this happens once in a while!
+We have seen this error a couple of times, and for some reason just restarting the Docker container has fixed it. This seems to be Ollama specific. Another thing to try is to run Bolt.diy with Docker or pnpm, whichever you didn’t run first. We are still on the hunt for why this happens once in a while!
 
-### I'm getting a blank preview when oTToDev runs my app!
+### I'm getting a blank preview when Bolt.diy runs my app!
 
-We promise you that we are constantly testing new PRs coming into oTToDev and the preview is core functionality, so the application is not broken! When you get a blank preview or don’t get a preview, this is generally because the LLM hallucinated bad code or incorrect commands. We are working on making this more transparent so it is obvious. Sometimes the error will appear in the developer console too, so check that as well.
+We promise you that we are constantly testing new PRs coming into Bolt.diy and the preview is core functionality, so the application is not broken! When you get a blank preview or don’t get a preview, this is generally because the LLM hallucinated bad code or incorrect commands. We are working on making this more transparent so it is obvious. Sometimes the error will appear in the developer console too, so check that as well.
 
 ### How to add an LLM:
 
app/commit.json CHANGED
@@ -1 +1 @@
-{ "commit": "154935cdeb054d2cc22dfb0c7e6cf084f02b95d0" }
+{ "commit": "8f3b4cd08249d26b14397e66241b9d099d3eb205" }
app/components/chat/BaseChat.tsx CHANGED
@@ -17,7 +17,6 @@ import Cookies from 'js-cookie';
17
  import * as Tooltip from '@radix-ui/react-tooltip';
18
 
19
  import styles from './BaseChat.module.scss';
20
- import type { ProviderInfo } from '~/utils/types';
21
  import { ExportChatButton } from '~/components/chat/chatExportAndImport/ExportChatButton';
22
  import { ImportButtons } from '~/components/chat/chatExportAndImport/ImportButtons';
23
  import { ExamplePrompts } from '~/components/chat/ExamplePrompts';
@@ -26,6 +25,7 @@ import GitCloneButton from './GitCloneButton';
26
  import FilePreview from './FilePreview';
27
  import { ModelSelector } from '~/components/chat/ModelSelector';
28
  import { SpeechRecognitionButton } from '~/components/chat/SpeechRecognition';
 
29
 
30
  const TEXTAREA_MIN_HEIGHT = 76;
31
 
@@ -45,6 +45,7 @@ interface BaseChatProps {
45
  setModel?: (model: string) => void;
46
  provider?: ProviderInfo;
47
  setProvider?: (provider: ProviderInfo) => void;
 
48
  handleStop?: () => void;
49
  sendMessage?: (event: React.UIEvent, messageInput?: string) => void;
50
  handleInputChange?: (event: React.ChangeEvent<HTMLTextAreaElement>) => void;
@@ -70,6 +71,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
70
  setModel,
71
  provider,
72
  setProvider,
 
73
  input = '',
74
  enhancingPrompt,
75
  handleInputChange,
@@ -108,48 +110,10 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
108
  const [recognition, setRecognition] = useState<SpeechRecognition | null>(null);
109
  const [transcript, setTranscript] = useState('');
110
 
111
- // Load enabled providers from cookies
112
- const [enabledProviders, setEnabledProviders] = useState(() => {
113
- const savedProviders = Cookies.get('providers');
114
-
115
- if (savedProviders) {
116
- try {
117
- const parsedProviders = JSON.parse(savedProviders);
118
- return PROVIDER_LIST.filter((p) => parsedProviders[p.name]);
119
- } catch (error) {
120
- console.error('Failed to parse providers from cookies:', error);
121
- return PROVIDER_LIST;
122
- }
123
- }
124
-
125
- return PROVIDER_LIST;
126
- });
127
-
128
- // Update enabled providers when cookies change
129
- useEffect(() => {
130
- const updateProvidersFromCookies = () => {
131
- const savedProviders = Cookies.get('providers');
132
-
133
- if (savedProviders) {
134
- try {
135
- const parsedProviders = JSON.parse(savedProviders);
136
- setEnabledProviders(PROVIDER_LIST.filter((p) => parsedProviders[p.name]));
137
- } catch (error) {
138
- console.error('Failed to parse providers from cookies:', error);
139
- }
140
- }
141
- };
142
-
143
- updateProvidersFromCookies();
144
-
145
- const interval = setInterval(updateProvidersFromCookies, 1000);
146
-
147
- return () => clearInterval(interval);
148
- }, [PROVIDER_LIST]);
149
-
150
  useEffect(() => {
151
  console.log(transcript);
152
  }, [transcript]);
 
153
  useEffect(() => {
154
  // Load API keys from cookies on component mount
155
  try {
@@ -169,7 +133,26 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
169
  Cookies.remove('apiKeys');
170
  }
171
 
172
- initializeModelList().then((modelList) => {
173
  setModelList(modelList);
174
  });
175
 
@@ -369,10 +352,10 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
369
  modelList={modelList}
370
  provider={provider}
371
  setProvider={setProvider}
372
- providerList={PROVIDER_LIST}
373
  apiKeys={apiKeys}
374
  />
375
- {enabledProviders.length > 0 && provider && (
376
  <APIKeyManager
377
  provider={provider}
378
  apiKey={apiKeys[provider.name] || ''}
@@ -468,7 +451,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
468
  <SendButton
469
  show={input.length > 0 || isStreaming || uploadedFiles.length > 0}
470
  isStreaming={isStreaming}
471
- disabled={enabledProviders.length === 0}
472
  onClick={(event) => {
473
  if (isStreaming) {
474
  handleStop?.();
@@ -528,7 +511,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
528
  !isModelSettingsCollapsed,
529
  })}
530
  onClick={() => setIsModelSettingsCollapsed(!isModelSettingsCollapsed)}
531
- disabled={enabledProviders.length === 0}
532
  >
533
  <div className={`i-ph:caret-${isModelSettingsCollapsed ? 'right' : 'down'} text-lg`} />
534
  {isModelSettingsCollapsed ? <span className="text-xs">{model}</span> : <span />}
 
17
  import * as Tooltip from '@radix-ui/react-tooltip';
18
 
19
  import styles from './BaseChat.module.scss';
 
20
  import { ExportChatButton } from '~/components/chat/chatExportAndImport/ExportChatButton';
21
  import { ImportButtons } from '~/components/chat/chatExportAndImport/ImportButtons';
22
  import { ExamplePrompts } from '~/components/chat/ExamplePrompts';
 
25
  import FilePreview from './FilePreview';
26
  import { ModelSelector } from '~/components/chat/ModelSelector';
27
  import { SpeechRecognitionButton } from '~/components/chat/SpeechRecognition';
28
+ import type { IProviderSetting, ProviderInfo } from '~/types/model';
29
 
30
  const TEXTAREA_MIN_HEIGHT = 76;
31
 
 
45
  setModel?: (model: string) => void;
46
  provider?: ProviderInfo;
47
  setProvider?: (provider: ProviderInfo) => void;
48
+ providerList?: ProviderInfo[];
49
  handleStop?: () => void;
50
  sendMessage?: (event: React.UIEvent, messageInput?: string) => void;
51
  handleInputChange?: (event: React.ChangeEvent<HTMLTextAreaElement>) => void;
 
71
  setModel,
72
  provider,
73
  setProvider,
74
+ providerList,
75
  input = '',
76
  enhancingPrompt,
77
  handleInputChange,
 
110
  const [recognition, setRecognition] = useState<SpeechRecognition | null>(null);
111
  const [transcript, setTranscript] = useState('');
112
 
113
  useEffect(() => {
114
  console.log(transcript);
115
  }, [transcript]);
116
+
117
  useEffect(() => {
118
  // Load API keys from cookies on component mount
119
  try {
 
133
  Cookies.remove('apiKeys');
134
  }
135
 
136
+ let providerSettings: Record<string, IProviderSetting> | undefined = undefined;
137
+
138
+ try {
139
+ const savedProviderSettings = Cookies.get('providers');
140
+
141
+ if (savedProviderSettings) {
142
+ const parsedProviderSettings = JSON.parse(savedProviderSettings);
143
+
144
+ if (typeof parsedProviderSettings === 'object' && parsedProviderSettings !== null) {
145
+ providerSettings = parsedProviderSettings;
146
+ }
147
+ }
148
+ } catch (error) {
149
+ console.error('Error loading Provider Settings from cookies:', error);
150
+
151
+ // Clear invalid cookie data
152
+ Cookies.remove('providers');
153
+ }
154
+
155
+ initializeModelList(providerSettings).then((modelList) => {
156
  setModelList(modelList);
157
  });
158
 
 
352
  modelList={modelList}
353
  provider={provider}
354
  setProvider={setProvider}
355
+ providerList={providerList || PROVIDER_LIST}
356
  apiKeys={apiKeys}
357
  />
358
+ {(providerList || []).length > 0 && provider && (
359
  <APIKeyManager
360
  provider={provider}
361
  apiKey={apiKeys[provider.name] || ''}
 
451
  <SendButton
452
  show={input.length > 0 || isStreaming || uploadedFiles.length > 0}
453
  isStreaming={isStreaming}
454
+ disabled={!providerList || providerList.length === 0}
455
  onClick={(event) => {
456
  if (isStreaming) {
457
  handleStop?.();
 
511
  !isModelSettingsCollapsed,
512
  })}
513
  onClick={() => setIsModelSettingsCollapsed(!isModelSettingsCollapsed)}
514
+ disabled={!providerList || providerList.length === 0}
515
  >
516
  <div className={`i-ph:caret-${isModelSettingsCollapsed ? 'right' : 'down'} text-lg`} />
517
  {isModelSettingsCollapsed ? <span className="text-xs">{model}</span> : <span />}
app/components/chat/Chat.client.tsx CHANGED
@@ -17,8 +17,9 @@ import { cubicEasingFn } from '~/utils/easings';
17
  import { createScopedLogger, renderLogger } from '~/utils/logger';
18
  import { BaseChat } from './BaseChat';
19
  import Cookies from 'js-cookie';
20
- import type { ProviderInfo } from '~/utils/types';
21
  import { debounce } from '~/utils/debounce';
 
 
22
 
23
  const toastAnimation = cssTransition({
24
  enter: 'animated fadeInRight',
@@ -92,6 +93,8 @@ export const ChatImpl = memo(
92
  const [uploadedFiles, setUploadedFiles] = useState<File[]>([]); // Move here
93
  const [imageDataList, setImageDataList] = useState<string[]>([]); // Move here
94
  const files = useStore(workbenchStore.files);
 
 
95
  const [model, setModel] = useState(() => {
96
  const savedModel = Cookies.get('selectedModel');
97
  return savedModel || DEFAULT_MODEL;
@@ -317,6 +320,7 @@ export const ChatImpl = memo(
317
  setModel={handleModelChange}
318
  provider={provider}
319
  setProvider={handleProviderChange}
 
320
  messageRef={messageRef}
321
  scrollRef={scrollRef}
322
  handleInputChange={(e) => {
 
17
  import { createScopedLogger, renderLogger } from '~/utils/logger';
18
  import { BaseChat } from './BaseChat';
19
  import Cookies from 'js-cookie';
 
20
  import { debounce } from '~/utils/debounce';
21
+ import { useSettings } from '~/lib/hooks/useSettings';
22
+ import type { ProviderInfo } from '~/types/model';
23
 
24
  const toastAnimation = cssTransition({
25
  enter: 'animated fadeInRight',
 
93
  const [uploadedFiles, setUploadedFiles] = useState<File[]>([]); // Move here
94
  const [imageDataList, setImageDataList] = useState<string[]>([]); // Move here
95
  const files = useStore(workbenchStore.files);
96
+ const { activeProviders } = useSettings();
97
+
98
  const [model, setModel] = useState(() => {
99
  const savedModel = Cookies.get('selectedModel');
100
  return savedModel || DEFAULT_MODEL;
 
320
  setModel={handleModelChange}
321
  provider={provider}
322
  setProvider={handleProviderChange}
323
+ providerList={activeProviders}
324
  messageRef={messageRef}
325
  scrollRef={scrollRef}
326
  handleInputChange={(e) => {
app/components/settings/Settings.module.scss CHANGED
@@ -46,7 +46,7 @@
46
  padding: 1rem;
47
  margin-bottom: 1rem;
48
  border-style: solid;
49
- border-color: var(--bolt-elements-button-danger-backgroundHover) ;
50
  border-width: thin;
51
 
52
  button {
@@ -60,4 +60,4 @@
60
  background-color: var(--bolt-elements-button-danger-backgroundHover);
61
  }
62
  }
63
- }
 
46
  padding: 1rem;
47
  margin-bottom: 1rem;
48
  border-style: solid;
49
+ border-color: var(--bolt-elements-button-danger-backgroundHover);
50
  border-width: thin;
51
 
52
  button {
 
60
  background-color: var(--bolt-elements-button-danger-backgroundHover);
61
  }
62
  }
63
+ }
app/components/settings/SettingsWindow.tsx CHANGED
@@ -1,17 +1,16 @@
1
  import * as RadixDialog from '@radix-ui/react-dialog';
2
  import { motion } from 'framer-motion';
3
- import { useState } from 'react';
4
  import { classNames } from '~/utils/classNames';
5
  import { DialogTitle, dialogVariants, dialogBackdropVariants } from '~/components/ui/Dialog';
6
  import { IconButton } from '~/components/ui/IconButton';
7
- import { providersList } from '~/lib/stores/settings';
8
- import { db, getAll, deleteById } from '~/lib/persistence';
9
- import { toast } from 'react-toastify';
10
- import { useNavigate } from '@remix-run/react';
11
- import commit from '~/commit.json';
12
- import Cookies from 'js-cookie';
13
  import styles from './Settings.module.scss';
14
- import { Switch } from '~/components/ui/Switch';
15
 
16
  interface SettingsProps {
17
  open: boolean;
@@ -21,206 +20,27 @@ interface SettingsProps {
21
  type TabType = 'chat-history' | 'providers' | 'features' | 'debug' | 'connection';
22
 
23
  // Providers that support base URL configuration
24
- const URL_CONFIGURABLE_PROVIDERS = ['Ollama', 'LMStudio', 'OpenAILike'];
25
-
26
  export const SettingsWindow = ({ open, onClose }: SettingsProps) => {
27
- const navigate = useNavigate();
28
  const [activeTab, setActiveTab] = useState<TabType>('chat-history');
29
- const [isDebugEnabled, setIsDebugEnabled] = useState(() => {
30
- const savedDebugState = Cookies.get('isDebugEnabled');
31
- return savedDebugState === 'true';
32
- });
33
- const [searchTerm, setSearchTerm] = useState('');
34
- const [isDeleting, setIsDeleting] = useState(false);
35
- const [githubUsername, setGithubUsername] = useState(Cookies.get('githubUsername') || '');
36
- const [githubToken, setGithubToken] = useState(Cookies.get('githubToken') || '');
37
- const [isLocalModelsEnabled, setIsLocalModelsEnabled] = useState(() => {
38
- const savedLocalModelsState = Cookies.get('isLocalModelsEnabled');
39
- return savedLocalModelsState === 'true';
40
- });
41
-
42
- // Load base URLs from cookies
43
- const [baseUrls, setBaseUrls] = useState(() => {
44
- const savedUrls = Cookies.get('providerBaseUrls');
45
-
46
- if (savedUrls) {
47
- try {
48
- return JSON.parse(savedUrls);
49
- } catch (error) {
50
- console.error('Failed to parse base URLs from cookies:', error);
51
- return {
52
- Ollama: 'http://localhost:11434',
53
- LMStudio: 'http://localhost:1234',
54
- OpenAILike: '',
55
- };
56
- }
57
- }
58
-
59
- return {
60
- Ollama: 'http://localhost:11434',
61
- LMStudio: 'http://localhost:1234',
62
- OpenAILike: '',
63
- };
64
- });
65
-
66
- const handleBaseUrlChange = (provider: string, url: string) => {
67
- setBaseUrls((prev: Record<string, string>) => {
68
- const newUrls = { ...prev, [provider]: url };
69
- Cookies.set('providerBaseUrls', JSON.stringify(newUrls));
70
 
71
- return newUrls;
72
- });
73
- };
74
-
75
- const tabs: { id: TabType; label: string; icon: string }[] = [
76
- { id: 'chat-history', label: 'Chat History', icon: 'i-ph:book' },
77
- { id: 'providers', label: 'Providers', icon: 'i-ph:key' },
78
- { id: 'features', label: 'Features', icon: 'i-ph:star' },
79
- { id: 'connection', label: 'Connection', icon: 'i-ph:link' },
80
- ...(isDebugEnabled ? [{ id: 'debug' as TabType, label: 'Debug Tab', icon: 'i-ph:bug' }] : []),
81
  ];
82
 
83
- // Load providers from cookies on mount
84
- const [providers, setProviders] = useState(() => {
85
- const savedProviders = Cookies.get('providers');
86
-
87
- if (savedProviders) {
88
- try {
89
- const parsedProviders = JSON.parse(savedProviders);
90
-
91
- // Merge saved enabled states with the base provider list
92
- return providersList.map((provider) => ({
93
- ...provider,
94
- isEnabled: parsedProviders[provider.name] || false,
95
- }));
96
- } catch (error) {
97
- console.error('Failed to parse providers from cookies:', error);
98
- }
99
- }
100
-
101
- return providersList;
102
- });
103
-
104
- const handleToggleProvider = (providerName: string, enabled: boolean) => {
105
- setProviders((prevProviders) => {
106
- const newProviders = prevProviders.map((provider) =>
107
- provider.name === providerName ? { ...provider, isEnabled: enabled } : provider,
108
- );
109
-
110
- // Save to cookies
111
- const enabledStates = newProviders.reduce(
112
- (acc, provider) => ({
113
- ...acc,
114
- [provider.name]: provider.isEnabled,
115
- }),
116
- {},
117
- );
118
- Cookies.set('providers', JSON.stringify(enabledStates));
119
-
120
- return newProviders;
121
- });
122
- };
123
-
124
- const filteredProviders = providers
125
- .filter((provider) => {
126
- const isLocalModelProvider = ['OpenAILike', 'LMStudio', 'Ollama'].includes(provider.name);
127
- return isLocalModelsEnabled || !isLocalModelProvider;
128
- })
129
- .filter((provider) => provider.name.toLowerCase().includes(searchTerm.toLowerCase()))
130
- .sort((a, b) => a.name.localeCompare(b.name));
131
-
132
- const handleCopyToClipboard = () => {
133
- const debugInfo = {
134
- OS: navigator.platform,
135
- Browser: navigator.userAgent,
136
- ActiveFeatures: providers.filter((provider) => provider.isEnabled).map((provider) => provider.name),
137
- BaseURLs: {
138
- Ollama: process.env.REACT_APP_OLLAMA_URL,
139
- OpenAI: process.env.REACT_APP_OPENAI_URL,
140
- LMStudio: process.env.REACT_APP_LM_STUDIO_URL,
141
- },
142
- Version: versionHash,
143
- };
144
- navigator.clipboard.writeText(JSON.stringify(debugInfo, null, 2)).then(() => {
145
- alert('Debug information copied to clipboard!');
146
- });
147
- };
148
-
149
- const downloadAsJson = (data: any, filename: string) => {
150
- const blob = new Blob([JSON.stringify(data, null, 2)], { type: 'application/json' });
151
- const url = URL.createObjectURL(blob);
152
- const link = document.createElement('a');
153
- link.href = url;
154
- link.download = filename;
155
- document.body.appendChild(link);
156
- link.click();
157
- document.body.removeChild(link);
158
- URL.revokeObjectURL(url);
159
- };
160
-
161
- const handleDeleteAllChats = async () => {
162
- if (!db) {
163
- toast.error('Database is not available');
164
- return;
165
- }
166
-
167
- try {
168
- setIsDeleting(true);
169
-
170
- const allChats = await getAll(db);
171
-
172
- // Delete all chats one by one
173
- await Promise.all(allChats.map((chat) => deleteById(db!, chat.id)));
174
-
175
- toast.success('All chats deleted successfully');
176
- navigate('/', { replace: true });
177
- } catch (error) {
178
- toast.error('Failed to delete chats');
179
- console.error(error);
180
- } finally {
181
- setIsDeleting(false);
182
- }
183
- };
184
-
185
- const handleExportAllChats = async () => {
186
- if (!db) {
187
- toast.error('Database is not available');
188
- return;
189
- }
190
-
191
- try {
192
- const allChats = await getAll(db);
193
- const exportData = {
194
- chats: allChats,
195
- exportDate: new Date().toISOString(),
196
- };
197
-
198
- downloadAsJson(exportData, `all-chats-${new Date().toISOString()}.json`);
199
- toast.success('Chats exported successfully');
200
- } catch (error) {
201
- toast.error('Failed to export chats');
202
- console.error(error);
203
- }
204
- };
205
-
206
- const versionHash = commit.commit; // Get the version hash from commit.json
207
-
208
- const handleSaveConnection = () => {
209
- Cookies.set('githubUsername', githubUsername);
210
- Cookies.set('githubToken', githubToken);
211
- toast.success('GitHub credentials saved successfully!');
212
- };
213
-
214
- const handleToggleDebug = (enabled: boolean) => {
215
- setIsDebugEnabled(enabled);
216
- Cookies.set('isDebugEnabled', String(enabled));
217
- };
218
-
219
- const handleToggleLocalModels = (enabled: boolean) => {
220
- setIsLocalModelsEnabled(enabled);
221
- Cookies.set('isLocalModelsEnabled', String(enabled));
222
- };
223
-
224
  return (
225
  <RadixDialog.Root open={open}>
226
  <RadixDialog.Portal>
@@ -272,7 +92,7 @@ export const SettingsWindow = ({ open, onClose }: SettingsProps) => {
272
  GitHub
273
  </a>
274
  <a
275
- href="https://coleam00.github.io/bolt.new-any-llm"
276
  target="_blank"
277
  rel="noopener noreferrer"
278
  className={classNames(styles['settings-button'], 'flex items-center gap-2')}
@@ -284,192 +104,7 @@ export const SettingsWindow = ({ open, onClose }: SettingsProps) => {
284
  </div>
285
 
286
  <div className="flex-1 flex flex-col p-8 pt-10 bg-bolt-elements-background-depth-2">
287
- <div className="flex-1 overflow-y-auto">
288
- {activeTab === 'chat-history' && (
289
- <div className="p-4">
290
- <h3 className="text-lg font-medium text-bolt-elements-textPrimary mb-4">Chat History</h3>
291
- <button
292
- onClick={handleExportAllChats}
293
- className={classNames(
294
- 'bg-bolt-elements-button-primary-background',
295
- 'rounded-lg px-4 py-2 mb-4 transition-colors duration-200',
296
- 'hover:bg-bolt-elements-button-primary-backgroundHover',
297
- 'text-bolt-elements-button-primary-text',
298
- )}
299
- >
300
- Export All Chats
301
- </button>
302
-
303
- <div
304
- className={classNames(
305
- 'text-bolt-elements-textPrimary rounded-lg py-4 mb-4',
306
- styles['settings-danger-area'],
307
- )}
308
- >
309
- <h4 className="font-semibold">Danger Area</h4>
310
- <p className="mb-2">This action cannot be undone!</p>
311
- <button
312
- onClick={handleDeleteAllChats}
313
- disabled={isDeleting}
314
- className={classNames(
315
- 'bg-bolt-elements-button-danger-background',
316
- 'rounded-lg px-4 py-2 transition-colors duration-200',
317
- isDeleting
318
- ? 'opacity-50 cursor-not-allowed'
319
- : 'hover:bg-bolt-elements-button-danger-backgroundHover',
320
- 'text-bolt-elements-button-danger-text',
321
- )}
322
- >
323
- {isDeleting ? 'Deleting...' : 'Delete All Chats'}
324
- </button>
325
- </div>
326
- </div>
327
- )}
328
- {activeTab === 'providers' && (
329
- <div className="p-4">
330
- <div className="flex mb-4">
331
- <input
332
- type="text"
333
- placeholder="Search providers..."
334
- value={searchTerm}
335
- onChange={(e) => setSearchTerm(e.target.value)}
336
- className="w-full bg-white dark:bg-bolt-elements-background-depth-4 relative px-2 py-1.5 rounded-md focus:outline-none placeholder-bolt-elements-textTertiary text-bolt-elements-textPrimary dark:text-bolt-elements-textPrimary border border-bolt-elements-borderColor"
337
- />
338
- </div>
339
- {filteredProviders.map((provider) => (
340
- <div
341
- key={provider.name}
342
- className="flex flex-col mb-2 provider-item hover:bg-bolt-elements-bg-depth-3 p-4 rounded-lg border border-bolt-elements-borderColor "
343
- >
344
- <div className="flex items-center justify-between mb-2">
345
- <span className="text-bolt-elements-textPrimary">{provider.name}</span>
346
- <Switch
347
- className="ml-auto"
348
- checked={provider.isEnabled}
349
- onCheckedChange={(enabled) => handleToggleProvider(provider.name, enabled)}
350
- />
351
- </div>
352
- {/* Base URL input for configurable providers */}
353
- {URL_CONFIGURABLE_PROVIDERS.includes(provider.name) && provider.isEnabled && (
354
- <div className="mt-2">
355
- <label className="block text-sm text-bolt-elements-textSecondary mb-1">Base URL:</label>
356
- <input
357
- type="text"
358
- value={baseUrls[provider.name]}
359
- onChange={(e) => handleBaseUrlChange(provider.name, e.target.value)}
360
- placeholder={`Enter ${provider.name} base URL`}
361
- className="w-full bg-white dark:bg-bolt-elements-background-depth-4 relative px-2 py-1.5 rounded-md focus:outline-none placeholder-bolt-elements-textTertiary text-bolt-elements-textPrimary dark:text-bolt-elements-textPrimary border border-bolt-elements-borderColor"
362
- />
363
- </div>
364
- )}
365
- </div>
366
- ))}
367
- </div>
368
- )}
369
- {activeTab === 'features' && (
370
- <div className="p-4 bg-bolt-elements-bg-depth-2 border border-bolt-elements-borderColor rounded-lg mb-4">
371
- <div className="mb-6">
372
- <h3 className="text-lg font-medium text-bolt-elements-textPrimary mb-4">Optional Features</h3>
373
- <div className="flex items-center justify-between mb-2">
374
- <span className="text-bolt-elements-textPrimary">Debug Info</span>
375
- <Switch className="ml-auto" checked={isDebugEnabled} onCheckedChange={handleToggleDebug} />
376
- </div>
377
- </div>
378
-
379
- <div className="mb-6 border-t border-bolt-elements-borderColor pt-4">
380
- <h3 className="text-lg font-medium text-bolt-elements-textPrimary mb-4">
381
- Experimental Features
382
- </h3>
383
- <p className="text-sm text-bolt-elements-textSecondary mb-4">
384
- Disclaimer: Experimental features may be unstable and are subject to change.
385
- </p>
386
- <div className="flex items-center justify-between mb-2">
387
- <span className="text-bolt-elements-textPrimary">Enable Local Models</span>
388
- <Switch
389
- className="ml-auto"
390
- checked={isLocalModelsEnabled}
391
- onCheckedChange={handleToggleLocalModels}
392
- />
393
- </div>
394
- </div>
395
- </div>
396
- )}
397
- {activeTab === 'debug' && isDebugEnabled && (
398
- <div className="p-4">
399
- <h3 className="text-lg font-medium text-bolt-elements-textPrimary mb-4">Debug Tab</h3>
400
- <button
401
- onClick={handleCopyToClipboard}
402
- className="bg-blue-500 text-white rounded-lg px-4 py-2 hover:bg-blue-600 mb-4 transition-colors duration-200"
403
- >
404
- Copy to Clipboard
405
- </button>
406
-
407
- <h4 className="text-md font-medium text-bolt-elements-textPrimary">System Information</h4>
408
- <p className="text-bolt-elements-textSecondary">OS: {navigator.platform}</p>
409
- <p className="text-bolt-elements-textSecondary">Browser: {navigator.userAgent}</p>
410
-
411
- <h4 className="text-md font-medium text-bolt-elements-textPrimary mt-4">Active Features</h4>
412
- <ul>
413
- {providers
414
- .filter((provider) => provider.isEnabled)
415
- .map((provider) => (
416
- <li key={provider.name} className="text-bolt-elements-textSecondary">
417
- {provider.name}
418
- </li>
419
- ))}
420
- </ul>
421
-
422
- <h4 className="text-md font-medium text-bolt-elements-textPrimary mt-4">Base URLs</h4>
423
- <ul>
424
- <li className="text-bolt-elements-textSecondary">Ollama: {process.env.REACT_APP_OLLAMA_URL}</li>
425
- <li className="text-bolt-elements-textSecondary">OpenAI: {process.env.REACT_APP_OPENAI_URL}</li>
426
- <li className="text-bolt-elements-textSecondary">
427
- LM Studio: {process.env.REACT_APP_LM_STUDIO_URL}
428
- </li>
429
- </ul>
430
-
431
- <h4 className="text-md font-medium text-bolt-elements-textPrimary mt-4">Version Information</h4>
432
- <p className="text-bolt-elements-textSecondary">Version Hash: {versionHash}</p>
433
- </div>
434
- )}
435
- {activeTab === 'connection' && (
436
- <div className="p-4 mb-4 border border-bolt-elements-borderColor rounded-lg bg-bolt-elements-background-depth-3">
437
- <h3 className="text-lg font-medium text-bolt-elements-textPrimary mb-4">GitHub Connection</h3>
438
- <div className="flex mb-4">
439
- <div className="flex-1 mr-2">
440
- <label className="block text-sm text-bolt-elements-textSecondary mb-1">
441
- GitHub Username:
442
- </label>
443
- <input
444
- type="text"
445
- value={githubUsername}
446
- onChange={(e) => setGithubUsername(e.target.value)}
447
- className="w-full bg-white dark:bg-bolt-elements-background-depth-4 relative px-2 py-1.5 rounded-md focus:outline-none placeholder-bolt-elements-textTertiary text-bolt-elements-textPrimary dark:text-bolt-elements-textPrimary border border-bolt-elements-borderColor"
448
- />
449
- </div>
450
- <div className="flex-1">
451
- <label className="block text-sm text-bolt-elements-textSecondary mb-1">
452
- Personal Access Token:
453
- </label>
454
- <input
455
- type="password"
456
- value={githubToken}
457
- onChange={(e) => setGithubToken(e.target.value)}
458
- className="w-full bg-white dark:bg-bolt-elements-background-depth-4 relative px-2 py-1.5 rounded-md focus:outline-none placeholder-bolt-elements-textTertiary text-bolt-elements-textPrimary dark:text-bolt-elements-textPrimary border border-bolt-elements-borderColor"
459
- />
460
- </div>
461
- </div>
462
- <div className="flex mb-4">
463
- <button
464
- onClick={handleSaveConnection}
465
- className="bg-bolt-elements-button-primary-background rounded-lg px-4 py-2 mr-2 transition-colors duration-200 hover:bg-bolt-elements-button-primary-backgroundHover text-bolt-elements-button-primary-text"
466
- >
467
- Save Connection
468
- </button>
469
- </div>
470
- </div>
471
- )}
472
- </div>
473
  </div>
474
  </div>
475
  <RadixDialog.Close asChild onClick={onClose}>
 
1
  import * as RadixDialog from '@radix-ui/react-dialog';
2
  import { motion } from 'framer-motion';
3
+ import { useState, type ReactElement } from 'react';
4
  import { classNames } from '~/utils/classNames';
5
  import { DialogTitle, dialogVariants, dialogBackdropVariants } from '~/components/ui/Dialog';
6
  import { IconButton } from '~/components/ui/IconButton';
 
7
  import styles from './Settings.module.scss';
8
+ import ChatHistoryTab from './chat-history/ChatHistoryTab';
9
+ import ProvidersTab from './providers/ProvidersTab';
10
+ import { useSettings } from '~/lib/hooks/useSettings';
11
+ import FeaturesTab from './features/FeaturesTab';
12
+ import DebugTab from './debug/DebugTab';
13
+ import ConnectionsTab from './connections/ConnectionsTab';
14
 
15
  interface SettingsProps {
16
  open: boolean;
 
20
  type TabType = 'chat-history' | 'providers' | 'features' | 'debug' | 'connection';
21
 
22
  // Providers that support base URL configuration
 
 
23
  export const SettingsWindow = ({ open, onClose }: SettingsProps) => {
24
+ const { debug } = useSettings();
25
  const [activeTab, setActiveTab] = useState<TabType>('chat-history');
26
 
27
+ const tabs: { id: TabType; label: string; icon: string; component?: ReactElement }[] = [
28
+ { id: 'chat-history', label: 'Chat History', icon: 'i-ph:book', component: <ChatHistoryTab /> },
29
+ { id: 'providers', label: 'Providers', icon: 'i-ph:key', component: <ProvidersTab /> },
30
+ { id: 'features', label: 'Features', icon: 'i-ph:star', component: <FeaturesTab /> },
31
+ { id: 'connection', label: 'Connection', icon: 'i-ph:link', component: <ConnectionsTab /> },
32
+ ...(debug
33
+ ? [
34
+ {
35
+ id: 'debug' as TabType,
36
+ label: 'Debug Tab',
37
+ icon: 'i-ph:bug',
38
+ component: <DebugTab />,
39
+ },
40
+ ]
41
+ : []),
42
  ];
43
 
44
  return (
45
  <RadixDialog.Root open={open}>
46
  <RadixDialog.Portal>
 
92
  GitHub
93
  </a>
94
  <a
95
+ href="https://stackblitz-labs.github.io/bolt.diy/"
96
  target="_blank"
97
  rel="noopener noreferrer"
98
  className={classNames(styles['settings-button'], 'flex items-center gap-2')}
 
104
  </div>
105
 
106
  <div className="flex-1 flex flex-col p-8 pt-10 bg-bolt-elements-background-depth-2">
107
+ <div className="flex-1 overflow-y-auto">{tabs.find((tab) => tab.id === activeTab)?.component}</div>
108
  </div>
109
  </div>
110
  <RadixDialog.Close asChild onClick={onClose}>
app/components/settings/chat-history/ChatHistoryTab.tsx ADDED
@@ -0,0 +1,105 @@
1
+ import { useNavigate } from '@remix-run/react';
2
+ import React, { useState } from 'react';
3
+ import { toast } from 'react-toastify';
4
+ import { db, deleteById, getAll } from '~/lib/persistence';
5
+ import { classNames } from '~/utils/classNames';
6
+ import styles from '~/components/settings/Settings.module.scss';
7
+
8
+ export default function ChatHistoryTab() {
9
+ const navigate = useNavigate();
10
+ const [isDeleting, setIsDeleting] = useState(false);
11
+ const downloadAsJson = (data: any, filename: string) => {
12
+ const blob = new Blob([JSON.stringify(data, null, 2)], { type: 'application/json' });
13
+ const url = URL.createObjectURL(blob);
14
+ const link = document.createElement('a');
15
+ link.href = url;
16
+ link.download = filename;
17
+ document.body.appendChild(link);
18
+ link.click();
19
+ document.body.removeChild(link);
20
+ URL.revokeObjectURL(url);
21
+ };
22
+
23
+ const handleDeleteAllChats = async () => {
24
+ if (!db) {
25
+ toast.error('Database is not available');
26
+ return;
27
+ }
28
+
29
+ try {
30
+ setIsDeleting(true);
31
+
32
+ const allChats = await getAll(db);
33
+
34
+ // Delete all chats one by one
35
+ await Promise.all(allChats.map((chat) => deleteById(db!, chat.id)));
36
+
37
+ toast.success('All chats deleted successfully');
38
+ navigate('/', { replace: true });
39
+ } catch (error) {
40
+ toast.error('Failed to delete chats');
41
+ console.error(error);
42
+ } finally {
43
+ setIsDeleting(false);
44
+ }
45
+ };
46
+
47
+ const handleExportAllChats = async () => {
48
+ if (!db) {
49
+ toast.error('Database is not available');
50
+ return;
51
+ }
52
+
53
+ try {
54
+ const allChats = await getAll(db);
55
+ const exportData = {
56
+ chats: allChats,
57
+ exportDate: new Date().toISOString(),
58
+ };
59
+
60
+ downloadAsJson(exportData, `all-chats-${new Date().toISOString()}.json`);
61
+ toast.success('Chats exported successfully');
62
+ } catch (error) {
63
+ toast.error('Failed to export chats');
64
+ console.error(error);
65
+ }
66
+ };
67
+
68
+ return (
69
+ <>
70
+ <div className="p-4">
71
+ <h3 className="text-lg font-medium text-bolt-elements-textPrimary mb-4">Chat History</h3>
72
+ <button
73
+ onClick={handleExportAllChats}
74
+ className={classNames(
75
+ 'bg-bolt-elements-button-primary-background',
76
+ 'rounded-lg px-4 py-2 mb-4 transition-colors duration-200',
77
+ 'hover:bg-bolt-elements-button-primary-backgroundHover',
78
+ 'text-bolt-elements-button-primary-text',
79
+ )}
80
+ >
81
+ Export All Chats
82
+ </button>
83
+
84
+ <div
85
+ className={classNames('text-bolt-elements-textPrimary rounded-lg py-4 mb-4', styles['settings-danger-area'])}
86
+ >
87
+ <h4 className="font-semibold">Danger Area</h4>
88
+ <p className="mb-2">This action cannot be undone!</p>
89
+ <button
90
+ onClick={handleDeleteAllChats}
91
+ disabled={isDeleting}
92
+ className={classNames(
93
+ 'bg-bolt-elements-button-danger-background',
94
+ 'rounded-lg px-4 py-2 transition-colors duration-200',
95
+ isDeleting ? 'opacity-50 cursor-not-allowed' : 'hover:bg-bolt-elements-button-danger-backgroundHover',
96
+ 'text-bolt-elements-button-danger-text',
97
+ )}
98
+ >
99
+ {isDeleting ? 'Deleting...' : 'Delete All Chats'}
100
+ </button>
101
+ </div>
102
+ </div>
103
+ </>
104
+ );
105
+ }
app/components/settings/connections/ConnectionsTab.tsx ADDED
@@ -0,0 +1,48 @@
1
+ import React, { useState } from 'react';
2
+ import { toast } from 'react-toastify';
3
+ import Cookies from 'js-cookie';
4
+
5
+ export default function ConnectionsTab() {
6
+ const [githubUsername, setGithubUsername] = useState(Cookies.get('githubUsername') || '');
7
+ const [githubToken, setGithubToken] = useState(Cookies.get('githubToken') || '');
8
+
9
+ const handleSaveConnection = () => {
10
+ Cookies.set('githubUsername', githubUsername);
11
+ Cookies.set('githubToken', githubToken);
12
+ toast.success('GitHub credentials saved successfully!');
13
+ };
14
+
15
+ return (
16
+ <div className="p-4 mb-4 border border-bolt-elements-borderColor rounded-lg bg-bolt-elements-background-depth-3">
17
+ <h3 className="text-lg font-medium text-bolt-elements-textPrimary mb-4">GitHub Connection</h3>
18
+ <div className="flex mb-4">
19
+ <div className="flex-1 mr-2">
20
+ <label className="block text-sm text-bolt-elements-textSecondary mb-1">GitHub Username:</label>
21
+ <input
22
+ type="text"
23
+ value={githubUsername}
24
+ onChange={(e) => setGithubUsername(e.target.value)}
25
+ className="w-full bg-white dark:bg-bolt-elements-background-depth-4 relative px-2 py-1.5 rounded-md focus:outline-none placeholder-bolt-elements-textTertiary text-bolt-elements-textPrimary dark:text-bolt-elements-textPrimary border border-bolt-elements-borderColor"
26
+ />
27
+ </div>
28
+ <div className="flex-1">
29
+ <label className="block text-sm text-bolt-elements-textSecondary mb-1">Personal Access Token:</label>
30
+ <input
31
+ type="password"
32
+ value={githubToken}
33
+ onChange={(e) => setGithubToken(e.target.value)}
34
+ className="w-full bg-white dark:bg-bolt-elements-background-depth-4 relative px-2 py-1.5 rounded-md focus:outline-none placeholder-bolt-elements-textTertiary text-bolt-elements-textPrimary dark:text-bolt-elements-textPrimary border border-bolt-elements-borderColor"
35
+ />
36
+ </div>
37
+ </div>
38
+ <div className="flex mb-4">
39
+ <button
40
+ onClick={handleSaveConnection}
41
+ className="bg-bolt-elements-button-primary-background rounded-lg px-4 py-2 mr-2 transition-colors duration-200 hover:bg-bolt-elements-button-primary-backgroundHover text-bolt-elements-button-primary-text"
42
+ >
43
+ Save Connection
44
+ </button>
45
+ </div>
46
+ </div>
47
+ );
48
+ }
app/components/settings/debug/DebugTab.tsx ADDED
@@ -0,0 +1,69 @@
1
+ import React, { useCallback, useEffect, useState } from 'react';
2
+ import { useSettings } from '~/lib/hooks/useSettings';
3
+ import commit from '~/commit.json';
4
+
5
+ const versionHash = commit.commit; // Get the version hash from commit.json
6
+
7
+ export default function DebugTab() {
8
+ const { providers } = useSettings();
9
+ const [activeProviders, setActiveProviders] = useState<string[]>([]);
10
+ useEffect(() => {
11
+ setActiveProviders(
12
+ Object.entries(providers)
13
+ .filter(([_key, provider]) => provider.settings.enabled)
14
+ .map(([_key, provider]) => provider.name),
15
+ );
16
+ }, [providers]);
17
+
18
+ const handleCopyToClipboard = useCallback(() => {
19
+ const debugInfo = {
20
+ OS: navigator.platform,
21
+ Browser: navigator.userAgent,
22
+ ActiveFeatures: activeProviders,
23
+ BaseURLs: {
24
+ Ollama: process.env.REACT_APP_OLLAMA_URL,
25
+ OpenAI: process.env.REACT_APP_OPENAI_URL,
26
+ LMStudio: process.env.REACT_APP_LM_STUDIO_URL,
27
+ },
28
+ Version: versionHash,
29
+ };
30
+ navigator.clipboard.writeText(JSON.stringify(debugInfo, null, 2)).then(() => {
31
+ alert('Debug information copied to clipboard!');
32
+ });
33
+ }, [providers]);
34
+
35
+ return (
36
+ <div className="p-4">
37
+ <h3 className="text-lg font-medium text-bolt-elements-textPrimary mb-4">Debug Tab</h3>
38
+ <button
39
+ onClick={handleCopyToClipboard}
40
+ className="bg-blue-500 text-white rounded-lg px-4 py-2 hover:bg-blue-600 mb-4 transition-colors duration-200"
41
+ >
42
+ Copy to Clipboard
43
+ </button>
44
+
45
+ <h4 className="text-md font-medium text-bolt-elements-textPrimary">System Information</h4>
46
+ <p className="text-bolt-elements-textSecondary">OS: {navigator.platform}</p>
47
+ <p className="text-bolt-elements-textSecondary">Browser: {navigator.userAgent}</p>
48
+
49
+ <h4 className="text-md font-medium text-bolt-elements-textPrimary mt-4">Active Features</h4>
50
+ <ul>
51
+ {activeProviders.map((name) => (
52
+ <li key={name} className="text-bolt-elements-textSecondary">
53
+ {name}
54
+ </li>
55
+ ))}
56
+ </ul>
57
+
58
+ <h4 className="text-md font-medium text-bolt-elements-textPrimary mt-4">Base URLs</h4>
59
+ <ul>
60
+ <li className="text-bolt-elements-textSecondary">Ollama: {process.env.REACT_APP_OLLAMA_URL}</li>
61
+ <li className="text-bolt-elements-textSecondary">OpenAI: {process.env.REACT_APP_OPENAI_URL}</li>
62
+ <li className="text-bolt-elements-textSecondary">LM Studio: {process.env.REACT_APP_LM_STUDIO_URL}</li>
63
+ </ul>
64
+
65
+ <h4 className="text-md font-medium text-bolt-elements-textPrimary mt-4">Version Information</h4>
66
+ <p className="text-bolt-elements-textSecondary">Version Hash: {versionHash}</p>
67
+ </div>
68
+ );
69
+ }
app/components/settings/features/FeaturesTab.tsx ADDED
@@ -0,0 +1,29 @@
1
+ import React from 'react';
2
+ import { Switch } from '~/components/ui/Switch';
3
+ import { useSettings } from '~/lib/hooks/useSettings';
4
+
5
+ export default function FeaturesTab() {
6
+ const { debug, enableDebugMode, isLocalModel, enableLocalModels } = useSettings();
7
+ return (
8
+ <div className="p-4 bg-bolt-elements-bg-depth-2 border border-bolt-elements-borderColor rounded-lg mb-4">
9
+ <div className="mb-6">
10
+ <h3 className="text-lg font-medium text-bolt-elements-textPrimary mb-4">Optional Features</h3>
11
+ <div className="flex items-center justify-between mb-2">
12
+ <span className="text-bolt-elements-textPrimary">Debug Info</span>
13
+ <Switch className="ml-auto" checked={debug} onCheckedChange={enableDebugMode} />
14
+ </div>
15
+ </div>
16
+
17
+ <div className="mb-6 border-t border-bolt-elements-borderColor pt-4">
18
+ <h3 className="text-lg font-medium text-bolt-elements-textPrimary mb-4">Experimental Features</h3>
19
+ <p className="text-sm text-bolt-elements-textSecondary mb-4">
20
+ Disclaimer: Experimental features may be unstable and are subject to change.
21
+ </p>
22
+ <div className="flex items-center justify-between mb-2">
23
+ <span className="text-bolt-elements-textPrimary">Enable Local Models</span>
24
+ <Switch className="ml-auto" checked={isLocalModel} onCheckedChange={enableLocalModels} />
25
+ </div>
26
+ </div>
27
+ </div>
28
+ );
29
+ }
app/components/settings/providers/ProvidersTab.tsx ADDED
@@ -0,0 +1,78 @@
1
+ import React, { useEffect, useState } from 'react';
2
+ import { Switch } from '~/components/ui/Switch';
3
+ import { useSettings } from '~/lib/hooks/useSettings';
4
+ import { LOCAL_PROVIDERS, URL_CONFIGURABLE_PROVIDERS } from '~/lib/stores/settings';
5
+ import type { IProviderConfig } from '~/types/model';
6
+
7
+ export default function ProvidersTab() {
8
+ const { providers, updateProviderSettings, isLocalModel } = useSettings();
9
+ const [filteredProviders, setFilteredProviders] = useState<IProviderConfig[]>([]);
10
+
11
+ // Load base URLs from cookies
12
+ const [searchTerm, setSearchTerm] = useState('');
13
+
14
+ useEffect(() => {
15
+ let newFilteredProviders: IProviderConfig[] = Object.entries(providers).map(([key, value]) => ({
16
+ ...value,
17
+ name: key,
18
+ }));
19
+
20
+ if (searchTerm && searchTerm.length > 0) {
21
+ newFilteredProviders = newFilteredProviders.filter((provider) =>
22
+ provider.name.toLowerCase().includes(searchTerm.toLowerCase()),
23
+ );
24
+ }
25
+
26
+ if (!isLocalModel) {
27
+ newFilteredProviders = newFilteredProviders.filter((provider) => !LOCAL_PROVIDERS.includes(provider.name));
28
+ }
29
+
30
+ newFilteredProviders.sort((a, b) => a.name.localeCompare(b.name));
31
+
32
+ setFilteredProviders(newFilteredProviders);
33
+ }, [providers, searchTerm, isLocalModel]);
34
+
35
+ return (
36
+ <div className="p-4">
37
+ <div className="flex mb-4">
38
+ <input
39
+ type="text"
40
+ placeholder="Search providers..."
41
+ value={searchTerm}
42
+ onChange={(e) => setSearchTerm(e.target.value)}
43
+ className="w-full bg-white dark:bg-bolt-elements-background-depth-4 relative px-2 py-1.5 rounded-md focus:outline-none placeholder-bolt-elements-textTertiary text-bolt-elements-textPrimary dark:text-bolt-elements-textPrimary border border-bolt-elements-borderColor"
44
+ />
45
+ </div>
46
+ {filteredProviders.map((provider) => (
47
+ <div
48
+ key={provider.name}
49
+ className="flex flex-col mb-2 provider-item hover:bg-bolt-elements-bg-depth-3 p-4 rounded-lg border border-bolt-elements-borderColor "
50
+ >
51
+ <div className="flex items-center justify-between mb-2">
52
+ <span className="text-bolt-elements-textPrimary">{provider.name}</span>
53
+ <Switch
54
+ className="ml-auto"
55
+ checked={provider.settings.enabled}
56
+ onCheckedChange={(enabled) => updateProviderSettings(provider.name, { ...provider.settings, enabled })}
57
+ />
58
+ </div>
59
+ {/* Base URL input for configurable providers */}
60
+ {URL_CONFIGURABLE_PROVIDERS.includes(provider.name) && provider.settings.enabled && (
61
+ <div className="mt-2">
62
+ <label className="block text-sm text-bolt-elements-textSecondary mb-1">Base URL:</label>
63
+ <input
64
+ type="text"
65
+ value={provider.settings.baseUrl || ''}
66
+ onChange={(e) =>
67
+ updateProviderSettings(provider.name, { ...provider.settings, baseUrl: e.target.value })
68
+ }
69
+ placeholder={`Enter ${provider.name} base URL`}
70
+ className="w-full bg-white dark:bg-bolt-elements-background-depth-4 relative px-2 py-1.5 rounded-md focus:outline-none placeholder-bolt-elements-textTertiary text-bolt-elements-textPrimary dark:text-bolt-elements-textPrimary border border-bolt-elements-borderColor"
71
+ />
72
+ </div>
73
+ )}
74
+ </div>
75
+ ))}
76
+ </div>
77
+ );
78
+ }
app/lib/.server/llm/model.ts CHANGED
@@ -11,6 +11,7 @@ import { createOpenRouter } from '@openrouter/ai-sdk-provider';
11
  import { createMistral } from '@ai-sdk/mistral';
12
  import { createCohere } from '@ai-sdk/cohere';
13
  import type { LanguageModelV1 } from 'ai';
 
14
 
15
  export const DEFAULT_NUM_CTX = process.env.DEFAULT_NUM_CTX ? parseInt(process.env.DEFAULT_NUM_CTX, 10) : 32768;
16
 
@@ -127,14 +128,20 @@ export function getXAIModel(apiKey: OptionalApiKey, model: string) {
127
  return openai(model);
128
  }
129
 
130
- export function getModel(provider: string, model: string, env: Env, apiKeys?: Record<string, string>) {
131
  /*
132
  * let apiKey; // Declare first
133
  * let baseURL;
134
  */
135
 
136
  const apiKey = getAPIKey(env, provider, apiKeys); // Then assign
137
- const baseURL = getBaseURL(env, provider);
138
 
139
  switch (provider) {
140
  case 'Anthropic':
 
11
  import { createMistral } from '@ai-sdk/mistral';
12
  import { createCohere } from '@ai-sdk/cohere';
13
  import type { LanguageModelV1 } from 'ai';
14
+ import type { IProviderSetting } from '~/types/model';
15
 
16
  export const DEFAULT_NUM_CTX = process.env.DEFAULT_NUM_CTX ? parseInt(process.env.DEFAULT_NUM_CTX, 10) : 32768;
17
 
 
128
  return openai(model);
129
  }
130
 
131
+ export function getModel(
132
+ provider: string,
133
+ model: string,
134
+ env: Env,
135
+ apiKeys?: Record<string, string>,
136
+ providerSettings?: Record<string, IProviderSetting>,
137
+ ) {
138
  /*
139
  * let apiKey; // Declare first
140
  * let baseURL;
141
  */
142
 
143
  const apiKey = getAPIKey(env, provider, apiKeys); // Then assign
144
+ const baseURL = providerSettings?.[provider].baseUrl || getBaseURL(env, provider);
145
 
146
  switch (provider) {
147
  case 'Anthropic':
app/lib/.server/llm/stream-text.ts CHANGED
@@ -4,6 +4,7 @@ import { MAX_TOKENS } from './constants';
4
  import { getSystemPrompt } from './prompts';
5
  import { DEFAULT_MODEL, DEFAULT_PROVIDER, getModelList, MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants';
6
  import ignore from 'ignore';
 
7
 
8
  interface ToolResult<Name extends string, Args, Result> {
9
  toolCallId: string;
@@ -131,16 +132,18 @@ function extractPropertiesFromMessage(message: Message): { model: string; provid
131
  return { model, provider, content: cleanedContent };
132
  }
133
 
134
- export async function streamText(
135
- messages: Messages,
136
- env: Env,
137
- options?: StreamingOptions,
138
- apiKeys?: Record<string, string>,
139
- files?: FileMap,
140
- ) {
141
  let currentModel = DEFAULT_MODEL;
142
  let currentProvider = DEFAULT_PROVIDER.name;
143
- const MODEL_LIST = await getModelList(apiKeys || {});
144
  const processedMessages = messages.map((message) => {
145
  if (message.role === 'user') {
146
  const { model, provider, content } = extractPropertiesFromMessage(message);
@@ -175,7 +178,7 @@ export async function streamText(
175
  }
176
 
177
  return _streamText({
178
- model: getModel(currentProvider, currentModel, env, apiKeys) as any,
179
  system: systemPrompt,
180
  maxTokens: dynamicMaxTokens,
181
  messages: convertToCoreMessages(processedMessages as any),
 
4
  import { getSystemPrompt } from './prompts';
5
  import { DEFAULT_MODEL, DEFAULT_PROVIDER, getModelList, MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants';
6
  import ignore from 'ignore';
7
+ import type { IProviderSetting } from '~/types/model';
8
 
9
  interface ToolResult<Name extends string, Args, Result> {
10
  toolCallId: string;
 
132
  return { model, provider, content: cleanedContent };
133
  }
134
 
135
+ export async function streamText(props: {
136
+ messages: Messages;
137
+ env: Env;
138
+ options?: StreamingOptions;
139
+ apiKeys?: Record<string, string>;
140
+ files?: FileMap;
141
+ providerSettings?: Record<string, IProviderSetting>;
142
+ }) {
143
+ const { messages, env, options, apiKeys, files, providerSettings } = props;
144
  let currentModel = DEFAULT_MODEL;
145
  let currentProvider = DEFAULT_PROVIDER.name;
146
+ const MODEL_LIST = await getModelList(apiKeys || {}, providerSettings);
147
  const processedMessages = messages.map((message) => {
148
  if (message.role === 'user') {
149
  const { model, provider, content } = extractPropertiesFromMessage(message);
 
178
  }
179
 
180
  return _streamText({
181
+ model: getModel(currentProvider, currentModel, env, apiKeys, providerSettings) as any,
182
  system: systemPrompt,
183
  maxTokens: dynamicMaxTokens,
184
  messages: convertToCoreMessages(processedMessages as any),
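
For orientation only (not shown in the diff itself): after this refactor a call site passes a single props object instead of positional arguments, roughly as below; the surrounding variables are assumed to be in scope as in the route handlers further down:

```ts
const result = await streamText({
  messages,                    // Messages from the request body
  env: context.cloudflare.env, // environment bindings holding provider API keys
  options,                     // StreamingOptions such as an onFinish handler
  apiKeys,                     // per-provider keys parsed from the `apiKeys` cookie
  files,                       // optional FileMap used for context optimization
  providerSettings,            // per-provider overrides parsed from the `providers` cookie
});
```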
app/lib/hooks/useSettings.tsx ADDED
@@ -0,0 +1,97 @@
+ import { useStore } from '@nanostores/react';
+ import { isDebugMode, isLocalModelsEnabled, LOCAL_PROVIDERS, providersStore } from '~/lib/stores/settings';
+ import { useCallback, useEffect, useState } from 'react';
+ import Cookies from 'js-cookie';
+ import type { IProviderSetting, ProviderInfo } from '~/types/model';
+
+ export function useSettings() {
+ const providers = useStore(providersStore);
+ const debug = useStore(isDebugMode);
+ const isLocalModel = useStore(isLocalModelsEnabled);
+ const [activeProviders, setActiveProviders] = useState<ProviderInfo[]>([]);
+
+ // reading values from cookies on mount
+ useEffect(() => {
+ const savedProviders = Cookies.get('providers');
+
+ if (savedProviders) {
+ try {
+ const parsedProviders: Record<string, IProviderSetting> = JSON.parse(savedProviders);
+ Object.keys(parsedProviders).forEach((provider) => {
+ const currentProvider = providers[provider];
+ providersStore.setKey(provider, {
+ ...currentProvider,
+ settings: {
+ ...parsedProviders[provider],
+ enabled: parsedProviders[provider].enabled || true,
+ },
+ });
+ });
+ } catch (error) {
+ console.error('Failed to parse providers from cookies:', error);
+ }
+ }
+
+ // load debug mode from cookies
+ const savedDebugMode = Cookies.get('isDebugEnabled');
+
+ if (savedDebugMode) {
+ isDebugMode.set(savedDebugMode === 'true');
+ }
+
+ // load local models from cookies
+ const savedLocalModels = Cookies.get('isLocalModelsEnabled');
+
+ if (savedLocalModels) {
+ isLocalModelsEnabled.set(savedLocalModels === 'true');
+ }
+ }, []);
+
+ // writing values to cookies on change
+ useEffect(() => {
+ const providers = providersStore.get();
+ const providerSetting: Record<string, IProviderSetting> = {};
+ Object.keys(providers).forEach((provider) => {
+ providerSetting[provider] = providers[provider].settings;
+ });
+ Cookies.set('providers', JSON.stringify(providerSetting));
+ }, [providers]);
+
+ useEffect(() => {
+ let active = Object.entries(providers)
+ .filter(([_key, provider]) => provider.settings.enabled)
+ .map(([_k, p]) => p);
+
+ if (!isLocalModel) {
+ active = active.filter((p) => !LOCAL_PROVIDERS.includes(p.name));
+ }
+
+ setActiveProviders(active);
+ }, [providers, isLocalModel]);
+
+ // helper function to update settings
+ const updateProviderSettings = useCallback((provider: string, config: IProviderSetting) => {
+ const settings = providers[provider].settings;
+ providersStore.setKey(provider, { ...providers[provider], settings: { ...settings, ...config } });
+ }, []);
+
+ const enableDebugMode = useCallback((enabled: boolean) => {
+ isDebugMode.set(enabled);
+ Cookies.set('isDebugEnabled', String(enabled));
+ }, []);
+
+ const enableLocalModels = useCallback((enabled: boolean) => {
+ isLocalModelsEnabled.set(enabled);
+ Cookies.set('isLocalModelsEnabled', String(enabled));
+ }, []);
+
+ return {
+ providers,
+ activeProviders,
+ updateProviderSettings,
+ debug,
+ enableDebugMode,
+ isLocalModel,
+ enableLocalModels,
+ };
+ }
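
A rough usage sketch (not part of the commit) of the new hook from a settings UI; the component name and markup here are hypothetical:

```tsx
import { useSettings } from '~/lib/hooks/useSettings';

export function ProviderToggles() {
  const { providers, updateProviderSettings, isLocalModel, enableLocalModels } = useSettings();

  return (
    <div>
      <label>
        <input type="checkbox" checked={isLocalModel} onChange={(e) => enableLocalModels(e.target.checked)} />
        Enable local providers
      </label>
      {Object.values(providers).map((provider) => (
        <label key={provider.name}>
          <input
            type="checkbox"
            checked={!!provider.settings.enabled}
            onChange={(e) => updateProviderSettings(provider.name, { enabled: e.target.checked })}
          />
          {provider.name}
        </label>
      ))}
    </div>
  );
}
```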
app/lib/stores/settings.ts CHANGED
@@ -1,5 +1,7 @@
- import { map } from 'nanostores';
+ import { atom, map } from 'nanostores';
  import { workbenchStore } from './workbench';
+ import { PROVIDER_LIST } from '~/utils/constants';
+ import type { IProviderConfig } from '~/types/model';
 
  export interface Shortcut {
  key: string;
@@ -15,32 +17,10 @@ export interface Shortcuts {
  toggleTerminal: Shortcut;
  }
 
- export interface Provider {
- name: string;
- isEnabled: boolean;
- }
-
- export interface Settings {
- shortcuts: Shortcuts;
- providers: Provider[];
- }
+ export const URL_CONFIGURABLE_PROVIDERS = ['Ollama', 'LMStudio', 'OpenAILike'];
+ export const LOCAL_PROVIDERS = ['OpenAILike', 'LMStudio', 'Ollama'];
 
- export const providersList: Provider[] = [
- { name: 'Groq', isEnabled: false },
- { name: 'HuggingFace', isEnabled: false },
- { name: 'OpenAI', isEnabled: false },
- { name: 'Anthropic', isEnabled: false },
- { name: 'OpenRouter', isEnabled: false },
- { name: 'Google', isEnabled: false },
- { name: 'Ollama', isEnabled: false },
- { name: 'OpenAILike', isEnabled: false },
- { name: 'Together', isEnabled: false },
- { name: 'Deepseek', isEnabled: false },
- { name: 'Mistral', isEnabled: false },
- { name: 'Cohere', isEnabled: false },
- { name: 'LMStudio', isEnabled: false },
- { name: 'xAI', isEnabled: false },
- ];
+ export type ProviderSetting = Record<string, IProviderConfig>;
 
  export const shortcutsStore = map<Shortcuts>({
  toggleTerminal: {
@@ -50,14 +30,17 @@ export const shortcutsStore = map<Shortcuts>({
  },
  });
 
- export const settingsStore = map<Settings>({
- shortcuts: shortcutsStore.get(),
- providers: providersList,
+ const initialProviderSettings: ProviderSetting = {};
+ PROVIDER_LIST.forEach((provider) => {
+ initialProviderSettings[provider.name] = {
+ ...provider,
+ settings: {
+ enabled: false,
+ },
+ };
  });
+ export const providersStore = map<ProviderSetting>(initialProviderSettings);
 
- shortcutsStore.subscribe((shortcuts) => {
- settingsStore.set({
- ...settingsStore.get(),
- shortcuts,
- });
- });
+ export const isDebugMode = atom(false);
+
+ export const isLocalModelsEnabled = atom(true);
app/routes/api.chat.ts CHANGED
@@ -3,6 +3,7 @@ import { MAX_RESPONSE_SEGMENTS, MAX_TOKENS } from '~/lib/.server/llm/constants';
  import { CONTINUE_PROMPT } from '~/lib/.server/llm/prompts';
  import { streamText, type Messages, type StreamingOptions } from '~/lib/.server/llm/stream-text';
  import SwitchableStream from '~/lib/.server/llm/switchable-stream';
+ import type { IProviderSetting } from '~/types/model';
 
  export async function action(args: ActionFunctionArgs) {
  return chatAction(args);
@@ -38,6 +39,9 @@ async function chatAction({ context, request }: ActionFunctionArgs) {
 
  // Parse the cookie's value (returns an object or null if no cookie exists)
  const apiKeys = JSON.parse(parseCookies(cookieHeader || '').apiKeys || '{}');
+ const providerSettings: Record<string, IProviderSetting> = JSON.parse(
+ parseCookies(cookieHeader || '').providers || '{}',
+ );
 
  const stream = new SwitchableStream();
 
@@ -60,13 +64,27 @@ async function chatAction({ context, request }: ActionFunctionArgs) {
  messages.push({ role: 'assistant', content });
  messages.push({ role: 'user', content: CONTINUE_PROMPT });
 
- const result = await streamText(messages, context.cloudflare.env, options, apiKeys, files);
+ const result = await streamText({
+ messages,
+ env: context.cloudflare.env,
+ options,
+ apiKeys,
+ files,
+ providerSettings,
+ });
 
  return stream.switchSource(result.toAIStream());
  },
  };
 
- const result = await streamText(messages, context.cloudflare.env, options, apiKeys, files);
+ const result = await streamText({
+ messages,
+ env: context.cloudflare.env,
+ options,
+ apiKeys,
+ files,
+ providerSettings,
+ });
 
  stream.switchSource(result.toAIStream());
 
app/routes/api.enhancer.ts CHANGED
@@ -2,7 +2,7 @@ import { type ActionFunctionArgs } from '@remix-run/cloudflare';
  import { StreamingTextResponse, parseStreamPart } from 'ai';
  import { streamText } from '~/lib/.server/llm/stream-text';
  import { stripIndents } from '~/utils/stripIndent';
- import type { ProviderInfo } from '~/types/model';
+ import type { IProviderSetting, ProviderInfo } from '~/types/model';
 
  const encoder = new TextEncoder();
  const decoder = new TextDecoder();
@@ -11,8 +11,28 @@ export async function action(args: ActionFunctionArgs) {
  return enhancerAction(args);
  }
 
+ function parseCookies(cookieHeader: string) {
+ const cookies: any = {};
+
+ // Split the cookie string by semicolons and spaces
+ const items = cookieHeader.split(';').map((cookie) => cookie.trim());
+
+ items.forEach((item) => {
+ const [name, ...rest] = item.split('=');
+
+ if (name && rest) {
+ // Decode the name and value, and join value parts in case it contains '='
+ const decodedName = decodeURIComponent(name.trim());
+ const decodedValue = decodeURIComponent(rest.join('=').trim());
+ cookies[decodedName] = decodedValue;
+ }
+ });
+
+ return cookies;
+ }
+
  async function enhancerAction({ context, request }: ActionFunctionArgs) {
- const { message, model, provider, apiKeys } = await request.json<{
+ const { message, model, provider } = await request.json<{
  message: string;
  model: string;
  provider: ProviderInfo;
@@ -36,9 +56,17 @@ async function enhancerAction({ context, request }: ActionFunctionArgs) {
  });
  }
 
+ const cookieHeader = request.headers.get('Cookie');
+
+ // Parse the cookie's value (returns an object or null if no cookie exists)
+ const apiKeys = JSON.parse(parseCookies(cookieHeader || '').apiKeys || '{}');
+ const providerSettings: Record<string, IProviderSetting> = JSON.parse(
+ parseCookies(cookieHeader || '').providers || '{}',
+ );
+
  try {
- const result = await streamText(
- [
+ const result = await streamText({
+ messages: [
  {
  role: 'user',
  content:
@@ -73,10 +101,10 @@ async function enhancerAction({ context, request }: ActionFunctionArgs) {
  `,
  },
  ],
- context.cloudflare.env,
- undefined,
+ env: context.cloudflare.env,
  apiKeys,
- );
+ providerSettings,
+ });
 
  const transformStream = new TransformStream({
  transform(chunk, controller) {
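
A short, hypothetical round-trip of the `parseCookies` helper added above (the header value is invented for illustration): each cookie is split on `;`, the name/value pair is URL-decoded, and the JSON payloads written by the client are then parsed:

```ts
const header = 'apiKeys=%7B%22OpenAI%22%3A%22sk-xxx%22%7D; providers=%7B%22Ollama%22%3A%7B%22enabled%22%3Atrue%7D%7D';
const cookies = parseCookies(header);

// cookies.apiKeys   -> '{"OpenAI":"sk-xxx"}'
// cookies.providers -> '{"Ollama":{"enabled":true}}'
const apiKeys = JSON.parse(cookies.apiKeys || '{}');
const providerSettings: Record<string, IProviderSetting> = JSON.parse(cookies.providers || '{}');
```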
app/types/model.ts CHANGED
@@ -3,9 +3,17 @@ import type { ModelInfo } from '~/utils/types';
  export type ProviderInfo = {
  staticModels: ModelInfo[];
  name: string;
- getDynamicModels?: (apiKeys?: Record<string, string>) => Promise<ModelInfo[]>;
+ getDynamicModels?: (apiKeys?: Record<string, string>, providerSettings?: IProviderSetting) => Promise<ModelInfo[]>;
  getApiKeyLink?: string;
  labelForGetApiKey?: string;
  icon?: string;
- isEnabled?: boolean;
+ };
+
+ export interface IProviderSetting {
+ enabled?: boolean;
+ baseUrl?: string;
+ }
+
+ export type IProviderConfig = ProviderInfo & {
+ settings: IProviderSetting;
  };
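
For orientation (illustrative, not from the diff): a provider entry satisfying the new `IProviderConfig` shape combines the static `ProviderInfo` fields with a `settings` object, for example:

```ts
const ollamaConfig: IProviderConfig = {
  name: 'Ollama',
  staticModels: [],
  settings: {
    enabled: true,
    baseUrl: 'http://localhost:11434', // optional override; omitted fields fall back to env defaults
  },
};
```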
app/utils/constants.ts CHANGED
@@ -1,6 +1,6 @@
  import Cookies from 'js-cookie';
  import type { ModelInfo, OllamaApiResponse, OllamaModel } from './types';
- import type { ProviderInfo } from '~/types/model';
+ import type { ProviderInfo, IProviderSetting } from '~/types/model';
  import { createScopedLogger } from './logger';
 
  export const WORK_DIR_NAME = 'project';
@@ -126,6 +126,7 @@ const PROVIDER_LIST: ProviderInfo[] = [
  name: 'Google',
  staticModels: [
  { name: 'gemini-1.5-flash-latest', label: 'Gemini 1.5 Flash', provider: 'Google', maxTokenAllowed: 8192 },
+ { name: 'gemini-2.0-flash-exp', label: 'Gemini 2.0 Flash', provider: 'Google', maxTokenAllowed: 8192 },
  { name: 'gemini-1.5-flash-002', label: 'Gemini 1.5 Flash-002', provider: 'Google', maxTokenAllowed: 8192 },
  { name: 'gemini-1.5-flash-8b', label: 'Gemini 1.5 Flash-8b', provider: 'Google', maxTokenAllowed: 8192 },
  { name: 'gemini-1.5-pro-latest', label: 'Gemini 1.5 Pro', provider: 'Google', maxTokenAllowed: 8192 },
@@ -298,13 +299,16 @@ const staticModels: ModelInfo[] = PROVIDER_LIST.map((p) => p.staticModels).flat(
 
  export let MODEL_LIST: ModelInfo[] = [...staticModels];
 
- export async function getModelList(apiKeys: Record<string, string>) {
+ export async function getModelList(
+ apiKeys: Record<string, string>,
+ providerSettings?: Record<string, IProviderSetting>,
+ ) {
  MODEL_LIST = [
  ...(
  await Promise.all(
  PROVIDER_LIST.filter(
  (p): p is ProviderInfo & { getDynamicModels: () => Promise<ModelInfo[]> } => !!p.getDynamicModels,
- ).map((p) => p.getDynamicModels(apiKeys)),
+ ).map((p) => p.getDynamicModels(apiKeys, providerSettings?.[p.name])),
  )
  ).flat(),
  ...staticModels,
@@ -312,9 +316,9 @@ export async function getModelList(apiKeys: Record<string, string>) {
  return MODEL_LIST;
  }
 
- async function getTogetherModels(apiKeys?: Record<string, string>): Promise<ModelInfo[]> {
+ async function getTogetherModels(apiKeys?: Record<string, string>, settings?: IProviderSetting): Promise<ModelInfo[]> {
  try {
- const baseUrl = import.meta.env.TOGETHER_API_BASE_URL || '';
+ const baseUrl = settings?.baseUrl || import.meta.env.TOGETHER_API_BASE_URL || '';
  const provider = 'Together';
 
  if (!baseUrl) {
@@ -353,8 +357,8 @@ async function getTogetherModels(apiKeys?: Record<string, string>): Promise<Mode
  }
  }
 
- const getOllamaBaseUrl = () => {
- const defaultBaseUrl = import.meta.env.OLLAMA_API_BASE_URL || 'http://localhost:11434';
+ const getOllamaBaseUrl = (settings?: IProviderSetting) => {
+ const defaultBaseUrl = settings?.baseUrl || import.meta.env.OLLAMA_API_BASE_URL || 'http://localhost:11434';
 
  // Check if we're in the browser
  if (typeof window !== 'undefined') {
@@ -368,7 +372,7 @@ const getOllamaBaseUrl = () => {
  return isDocker ? defaultBaseUrl.replace('localhost', 'host.docker.internal') : defaultBaseUrl;
  };
 
- async function getOllamaModels(): Promise<ModelInfo[]> {
+ async function getOllamaModels(apiKeys?: Record<string, string>, settings?: IProviderSetting): Promise<ModelInfo[]> {
  /*
  * if (typeof window === 'undefined') {
  * return [];
@@ -376,7 +380,7 @@ async function getOllamaModels(): Promise<ModelInfo[]> {
  */
 
  try {
- const baseUrl = getOllamaBaseUrl();
+ const baseUrl = getOllamaBaseUrl(settings);
  const response = await fetch(`${baseUrl}/api/tags`);
  const data = (await response.json()) as OllamaApiResponse;
 
@@ -392,20 +396,21 @@ async function getOllamaModels(): Promise<ModelInfo[]> {
  }
  }
 
- async function getOpenAILikeModels(): Promise<ModelInfo[]> {
+ async function getOpenAILikeModels(
+ apiKeys?: Record<string, string>,
+ settings?: IProviderSetting,
+ ): Promise<ModelInfo[]> {
  try {
- const baseUrl = import.meta.env.OPENAI_LIKE_API_BASE_URL || '';
+ const baseUrl = settings?.baseUrl || import.meta.env.OPENAI_LIKE_API_BASE_URL || '';
 
  if (!baseUrl) {
  return [];
  }
 
- let apiKey = import.meta.env.OPENAI_LIKE_API_KEY ?? '';
-
- const apikeys = JSON.parse(Cookies.get('apiKeys') || '{}');
+ let apiKey = '';
 
- if (apikeys && apikeys.OpenAILike) {
- apiKey = apikeys.OpenAILike;
+ if (apiKeys && apiKeys.OpenAILike) {
+ apiKey = apiKeys.OpenAILike;
  }
 
  const response = await fetch(`${baseUrl}/models`, {
@@ -459,13 +464,13 @@ async function getOpenRouterModels(): Promise<ModelInfo[]> {
  }));
  }
 
- async function getLMStudioModels(): Promise<ModelInfo[]> {
+ async function getLMStudioModels(_apiKeys?: Record<string, string>, settings?: IProviderSetting): Promise<ModelInfo[]> {
  if (typeof window === 'undefined') {
  return [];
  }
 
  try {
- const baseUrl = import.meta.env.LMSTUDIO_API_BASE_URL || 'http://localhost:1234';
+ const baseUrl = settings?.baseUrl || import.meta.env.LMSTUDIO_API_BASE_URL || 'http://localhost:1234';
  const response = await fetch(`${baseUrl}/v1/models`);
  const data = (await response.json()) as any;
 
@@ -480,7 +485,7 @@ async function getLMStudioModels(): Promise<ModelInfo[]> {
  }
  }
 
- async function initializeModelList(): Promise<ModelInfo[]> {
+ async function initializeModelList(providerSettings?: Record<string, IProviderSetting>): Promise<ModelInfo[]> {
  let apiKeys: Record<string, string> = {};
 
  try {
@@ -501,7 +506,7 @@ async function initializeModelList(): Promise<ModelInfo[]> {
  await Promise.all(
  PROVIDER_LIST.filter(
  (p): p is ProviderInfo & { getDynamicModels: () => Promise<ModelInfo[]> } => !!p.getDynamicModels,
- ).map((p) => p.getDynamicModels(apiKeys)),
+ ).map((p) => p.getDynamicModels(apiKeys, providerSettings?.[p.name])),
  )
  ).flat(),
  ...staticModels,
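
An illustrative call (not part of the commit) showing how `getModelList` now forwards each provider's own settings entry to its dynamic-model fetcher; the override value and `apiKeys` variable are assumed:

```ts
const providerSettings: Record<string, IProviderSetting> = {
  Ollama: { enabled: true, baseUrl: 'http://host.docker.internal:11434' },
};

// Internally this maps to calls such as getOllamaModels(apiKeys, providerSettings['Ollama']),
// while providers without an entry receive undefined and keep their env-based defaults.
const models = await getModelList(apiKeys, providerSettings);
```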
app/utils/types.ts CHANGED
@@ -26,12 +26,3 @@ export interface ModelInfo {
  provider: string;
  maxTokenAllowed: number;
  }
-
- export interface ProviderInfo {
- staticModels: ModelInfo[];
- name: string;
- getDynamicModels?: () => Promise<ModelInfo[]>;
- getApiKeyLink?: string;
- labelForGetApiKey?: string;
- icon?: string;
- }
@@ -4,7 +4,7 @@
4
 
5
  The `DEFAULT_NUM_CTX` environment variable can be used to limit the maximum number of context values used by the qwen2.5-coder model. For example, to limit the context to 24576 values (which uses 32GB of VRAM), set `DEFAULT_NUM_CTX=24576` in your `.env.local` file.
6
 
7
- First off, thank you for considering contributing to Bolt.new! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.new a better tool for developers worldwide.
8
 
9
  ## 📋 Table of Contents
10
  - [Code of Conduct](#code-of-conduct)
@@ -62,7 +62,7 @@ We're looking for dedicated contributors to help maintain and grow this project.
62
  ### 🔄 Initial Setup
63
  1. Clone the repository:
64
  ```bash
65
- git clone https://github.com/coleam00/bolt.new-any-llm.git
66
  ```
67
 
68
  2. Install dependencies:
 
4
 
5
  The `DEFAULT_NUM_CTX` environment variable can be used to limit the maximum number of context values used by the qwen2.5-coder model. For example, to limit the context to 24576 values (which uses 32GB of VRAM), set `DEFAULT_NUM_CTX=24576` in your `.env.local` file.
6
 
7
+ First off, thank you for considering contributing to Bolt.diy! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.diy a better tool for developers worldwide.
8
 
9
  ## 📋 Table of Contents
10
  - [Code of Conduct](#code-of-conduct)
 
62
  ### 🔄 Initial Setup
63
  1. Clone the repository:
64
  ```bash
65
+ git clone https://github.com/stackblitz-labs/bolt.diy.git
66
  ```
67
 
68
  2. Install dependencies:
docs/docs/FAQ.md CHANGED
@@ -1,15 +1,15 @@
  # Frequently Asked Questions (FAQ)
 
- ## How do I get the best results with oTToDev?
+ ## How do I get the best results with Bolt.diy?
 
  - **Be specific about your stack**:
- Mention the frameworks or libraries you want to use (e.g., Astro, Tailwind, ShadCN) in your initial prompt. This ensures that oTToDev scaffolds the project according to your preferences.
+ Mention the frameworks or libraries you want to use (e.g., Astro, Tailwind, ShadCN) in your initial prompt. This ensures that Bolt.diy scaffolds the project according to your preferences.
 
  - **Use the enhance prompt icon**:
  Before sending your prompt, click the *enhance* icon to let the AI refine your prompt. You can edit the suggested improvements before submitting.
 
  - **Scaffold the basics first, then add features**:
- Ensure the foundational structure of your application is in place before introducing advanced functionality. This helps oTToDev establish a solid base to build on.
+ Ensure the foundational structure of your application is in place before introducing advanced functionality. This helps Bolt.diy establish a solid base to build on.
 
  - **Batch simple instructions**:
  Combine simple tasks into a single prompt to save time and reduce API credit consumption. For example:
@@ -17,19 +17,14 @@
 
  ---
 
- ## How do I contribute to oTToDev?
+ ## How do I contribute to Bolt.diy?
 
  Check out our [Contribution Guide](CONTRIBUTING.md) for more details on how to get involved!
 
  ---
 
- ## Do you plan on merging oTToDev back into the official Bolt.new repo?
 
- Stay tuned! We’ll share updates on this early next month.
-
- ---
-
- ## What are the future plans for oTToDev?
+ ## What are the future plans for Bolt.diy?
 
  Visit our [Roadmap](https://roadmap.sh/r/ottodev-roadmap-2ovzo) for the latest updates.
  New features and improvements are on the way!
@@ -38,13 +33,13 @@
 
  ## Why are there so many open issues/pull requests?
 
- oTToDev began as a small showcase project on @ColeMedin's YouTube channel to explore editing open-source projects with local LLMs. However, it quickly grew into a massive community effort!
+ Bolt.diy began as a small showcase project on @ColeMedin's YouTube channel to explore editing open-source projects with local LLMs. However, it quickly grew into a massive community effort!
 
  We’re forming a team of maintainers to manage demand and streamline issue resolution. The maintainers are rockstars, and we’re also exploring partnerships to help the project thrive.
 
  ---
 
- ## How do local LLMs compare to larger models like Claude 3.5 Sonnet for oTToDev/Bolt.new?
+ ## How do local LLMs compare to larger models like Claude 3.5 Sonnet for Bolt.diy?
 
  While local LLMs are improving rapidly, larger models like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b still offer the best results for complex applications. Our ongoing focus is to improve prompts, agents, and the platform to better support smaller local LLMs.
 
docs/docs/index.md CHANGED
@@ -1,28 +1,28 @@
- # Welcome to OTTO Dev
- This fork of Bolt.new (oTToDev) allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
+ # Welcome to Bolt DIY
+ Bolt.diy allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
 
- Join the community for oTToDev!
+ Join the community!
 
  https://thinktank.ottomator.ai
 
- ## Whats Bolt.new
+ ## Whats Bolt.diy
 
- Bolt.new is an AI-powered web development agent that allows you to prompt, run, edit, and deploy full-stack applications directly from your browser—no local setup required. If you're here to build your own AI-powered web dev agent using the Bolt open source codebase, [click here to get started!](./CONTRIBUTING.md)
+ Bolt.diy is an AI-powered web development agent that allows you to prompt, run, edit, and deploy full-stack applications directly from your browser—no local setup required. If you're here to build your own AI-powered web dev agent using the Bolt open source codebase, [click here to get started!](./CONTRIBUTING.md)
 
- ## What Makes Bolt.new Different
+ ## What Makes Bolt.diy Different
 
- Claude, v0, etc are incredible- but you can't install packages, run backends, or edit code. That’s where Bolt.new stands out:
+ Claude, v0, etc are incredible- but you can't install packages, run backends, or edit code. That’s where Bolt.diy stands out:
 
- - **Full-Stack in the Browser**: Bolt.new integrates cutting-edge AI models with an in-browser development environment powered by **StackBlitz’s WebContainers**. This allows you to:
+ - **Full-Stack in the Browser**: Bolt.diy integrates cutting-edge AI models with an in-browser development environment powered by **StackBlitz’s WebContainers**. This allows you to:
  - Install and run npm tools and libraries (like Vite, Next.js, and more)
  - Run Node.js servers
  - Interact with third-party APIs
  - Deploy to production from chat
  - Share your work via a URL
 
- - **AI with Environment Control**: Unlike traditional dev environments where the AI can only assist in code generation, Bolt.new gives AI models **complete control** over the entire environment including the filesystem, node server, package manager, terminal, and browser console. This empowers AI agents to handle the whole app lifecycle—from creation to deployment.
+ - **AI with Environment Control**: Unlike traditional dev environments where the AI can only assist in code generation, Bolt.diy gives AI models **complete control** over the entire environment including the filesystem, node server, package manager, terminal, and browser console. This empowers AI agents to handle the whole app lifecycle—from creation to deployment.
 
- Whether you’re an experienced developer, a PM, or a designer, Bolt.new allows you to easily build production-grade full-stack applications.
+ Whether you’re an experienced developer, a PM, or a designer, Bolt.diy allows you to easily build production-grade full-stack applications.
 
  For developers interested in building their own AI-powered development tools with WebContainers, check out the open-source Bolt codebase in this repo!
 
@@ -47,10 +47,10 @@ If you see usr/local/bin in the output then you're good to go.
  3. Clone the repository (if you haven't already) by opening a Terminal window (or CMD with admin permissions) and then typing in this:
 
  ```
- git clone https://github.com/coleam00/bolt.new-any-llm.git
+ git clone https://github.com/stackblitz-labs/bolt.diy.git
  ```
 
- 3. Rename .env.example to .env.local and add your LLM API keys. You will find this file on a Mac at "[your name]/bold.new-any-llm/.env.example". For Windows and Linux the path will be similar.
+ 3. Rename .env.example to .env.local and add your LLM API keys. You will find this file on a Mac at "[your name]/bolt.diy/.env.example". For Windows and Linux the path will be similar.
 
  ![image](https://github.com/user-attachments/assets/7e6a532c-2268-401f-8310-e8d20c731328)
 
@@ -150,7 +150,7 @@ pnpm run dev
 
  ## Adding New LLMs:
 
- To make new LLMs available to use in this version of Bolt.new, head on over to `app/utils/constants.ts` and find the constant MODEL_LIST. Each element in this array is an object that has the model ID for the name (get this from the provider's API documentation), a label for the frontend model dropdown, and the provider.
+ To make new LLMs available to use in this version of Bolt.diy, head on over to `app/utils/constants.ts` and find the constant MODEL_LIST. Each element in this array is an object that has the model ID for the name (get this from the provider's API documentation), a label for the frontend model dropdown, and the provider.
 
  By default, Anthropic, OpenAI, Groq, and Ollama are implemented as providers, but the YouTube video for this repo covers how to extend this to work with more providers if you wish!
 
@@ -179,7 +179,7 @@ This will start the Remix Vite development server. You will need Google Chrome C
 
  ## Tips and Tricks
 
- Here are some tips to get the most out of Bolt.new:
+ Here are some tips to get the most out of Bolt.diy:
 
  - **Be specific about your stack**: If you want to use specific frameworks or libraries (like Astro, Tailwind, ShadCN, or any other popular JavaScript framework), mention them in your initial prompt to ensure Bolt scaffolds the project accordingly.
 
docs/mkdocs.yml CHANGED
@@ -1,4 +1,4 @@
- site_name: Bolt.Local Docs
+ site_name: Bolt.diy Docs
  site_dir: ../site
  theme:
  name: material
@@ -31,19 +31,19 @@ theme:
  repo: fontawesome/brands/github
  # logo: assets/logo.png
  # favicon: assets/logo.png
- repo_name: Bolt.Local
- repo_url: https://github.com/coleam00/bolt.new-any-llm
+ repo_name: Bolt.diy
+ repo_url: https://github.com/stackblitz-labs/bolt.diy
  edit_uri: ""
 
  extra:
  generator: false
  social:
  - icon: fontawesome/brands/github
- link: https://github.com/coleam00/bolt.new-any-llm
- name: Bolt.Local
+ link: https://github.com/stackblitz-labs/bolt.diy
+ name: Bolt.diy
  - icon: fontawesome/brands/discourse
  link: https://thinktank.ottomator.ai/
- name: Bolt.Local Discourse
+ name: Bolt.diy Discourse
 
 
  markdown_extensions: