atrokhym committed
Commit 7cdb56a · 2 Parent(s): 809b54e 9b62edd

merge with upstream/main
.husky/pre-commit ADDED
@@ -0,0 +1,17 @@
+ #!/bin/sh
+
+ echo "🔍 Running pre-commit hook to check the code looks good... 🔍"
+
+ if ! pnpm typecheck; then
+   echo "❌ Type checking failed! Please review TypeScript types."
+   echo "Once you're done, don't forget to add your changes to the commit! 🚀"
+   exit 1
+ fi
+
+ if ! pnpm lint; then
+   echo "❌ Linting failed! 'pnpm lint:check' will help you fix the easy ones."
+   echo "Once you're done, don't forget to add your beautification to the commit! 🤩"
+   exit 1
+ fi
+
+ echo "👍 All good! Committing changes..."
CONTRIBUTING.md CHANGED
@@ -1,9 +1,6 @@
- # Contributing to Bolt.new Fork
- ## DEFAULT_NUM_CTX
+ # Contributing to oTToDev
 
- The `DEFAULT_NUM_CTX` environment variable can be used to limit the maximum number of context values used by the qwen2.5-coder model. For example, to limit the context to 24576 values (which uses 32GB of VRAM), set `DEFAULT_NUM_CTX=24576` in your `.env.local` file.
-
- First off, thank you for considering contributing to Bolt.new! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.new a better tool for developers worldwide.
+ First off, thank you for considering contributing to oTToDev! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make oTToDev a better tool for developers worldwide.
 
  ## 📋 Table of Contents
  - [Code of Conduct](#code-of-conduct)
@@ -56,6 +53,8 @@ We're looking for dedicated contributors to help maintain and grow this project.
  - Comment complex logic
  - Keep functions focused and small
  - Use meaningful variable names
+ - Lint your code. This repo contains a pre-commit hook that will verify your code is linted properly,
+   so set up your IDE to do that for you!
 
  ## Development Setup
README.md CHANGED
@@ -29,6 +29,8 @@ https://thinktank.ottomator.ai
  - ✅ Bolt terminal to see the output of LLM run commands (@thecodacus)
  - ✅ Streaming of code output (@thecodacus)
  - ✅ Ability to revert code to earlier version (@wonderwhy-er)
+ - ✅ Cohere Integration (@hasanraiyan)
+ - ✅ Dynamic model max token length (@hasanraiyan)
  - ⬜ **HIGH PRIORITY** - Prevent Bolt from rewriting files as often (file locking and diffs)
  - ⬜ **HIGH PRIORITY** - Better prompting for smaller LLMs (code window sometimes doesn't start)
  - ⬜ **HIGH PRIORITY** - Load local projects into the app
@@ -39,8 +41,6 @@ https://thinktank.ottomator.ai
  - ⬜ Azure Open AI API Integration
  - ⬜ Perplexity Integration
  - ⬜ Vertex AI Integration
- - ✅ Cohere Integration (@hasanraiyan)
- - ✅ Dynamic model max token length (@hasanraiyan)
  - ⬜ Deploy directly to Vercel/Netlify/other similar platforms
  - ⬜ Prompt caching
  - ⬜ Better prompt enhancing
@@ -246,14 +246,55 @@ pnpm run dev
 
  This will start the Remix Vite development server. You will need Google Chrome Canary to run this locally if you use Chrome! It's an easy install and a good browser for web development anyway.
 
- ## Tips and Tricks
+ ## FAQ
 
- Here are some tips to get the most out of Bolt.new:
+ ### How do I get the best results with oTToDev?
 
  - **Be specific about your stack**: If you want to use specific frameworks or libraries (like Astro, Tailwind, ShadCN, or any other popular JavaScript framework), mention them in your initial prompt to ensure Bolt scaffolds the project accordingly.
 
  - **Use the enhance prompt icon**: Before sending your prompt, try clicking the 'enhance' icon to have the AI model help you refine your prompt, then edit the results before submitting.
 
- - **Scaffold the basics first, then add features**: Make sure the basic structure of your application is in place before diving into more advanced functionality. This helps Bolt understand the foundation of your project and ensure everything is wired up right before building out more advanced functionality.
+ - **Scaffold the basics first, then add features**: Make sure the basic structure of your application is in place before diving into more advanced functionality. This helps oTToDev understand the foundation of your project and ensures everything is wired up correctly before you build out more advanced features.
 
- - **Batch simple instructions**: Save time by combining simple instructions into one message. For example, you can ask Bolt to change the color scheme, add mobile responsiveness, and restart the dev server, all in one go saving you time and reducing API credit consumption significantly.
+ - **Batch simple instructions**: Save time by combining simple instructions into one message. For example, you can ask oTToDev to change the color scheme, add mobile responsiveness, and restart the dev server, all in one go, saving you time and significantly reducing API credit consumption.
+
+ ### How do I contribute to oTToDev?
+
+ [Please check out our dedicated page for contributing to oTToDev here!](CONTRIBUTING.md)
+
+ ### Do you plan on merging oTToDev back into the official Bolt.new repo?
+
+ More news on this coming early next month - stay tuned!
+
+ ### What are the future plans for oTToDev?
+
+ [Check out our Roadmap here!](https://roadmap.sh/r/ottodev-roadmap-2ovzo)
+
+ Lots more updates to this roadmap coming soon!
+
+ ### Why are there so many open issues/pull requests?
+
+ oTToDev was started simply to showcase how to edit an open source project and to do something cool with local LLMs on my (@ColeMedin) YouTube channel! However, it quickly grew into a massive community project. I am working hard to keep up with the demand by forming a team of maintainers and getting as many people involved as I can. That effort is going well, and all of our maintainers are ABSOLUTE rockstars, but it still takes time to organize everything so we can efficiently get through all the issues and PRs. Rest assured, we are working hard and are even working on some partnerships behind the scenes to really help this project take off!
+
+ ### How do local LLMs fare compared to larger models like Claude 3.5 Sonnet for oTToDev/Bolt.new?
+
+ While the gap between open source and massive closed-source models is closing quickly, you're still going to get the best results with the very large models like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b. This is one of the big tasks we have at hand - figuring out how to prompt better, use agents, and improve the platform as a whole to make it work better for even the smaller local LLMs!
+
+ ### I'm getting the error: "There was an error processing this request"
+
+ If you see this error within oTToDev, the application is just telling you that there is a problem at a high level, which could mean a number of different things. To find the actual error, please check BOTH the terminal where you started the application (with Docker or pnpm) and the developer console in the browser. In most browsers, you can open the developer console by pressing F12 or by right-clicking anywhere in the page and selecting "Inspect", then going to the "Console" tab.
+
+ ### I'm getting the error: "x-api-key header missing"
+
+ We have seen this error a couple of times, and for some reason just restarting the Docker container has fixed it. It seems to be Ollama specific. Another thing to try is running oTToDev with Docker or pnpm, whichever you didn't use first. We are still on the hunt for why this happens once in a while!
+
+ ### I'm getting a blank preview when oTToDev runs my app!
+
+ We promise you that we are constantly testing new PRs coming into oTToDev, and the preview is core functionality, so the application is not broken! When you get a blank preview, or no preview at all, it is generally because the LLM hallucinated bad code or incorrect commands. We are working on making this more transparent so it is obvious. Sometimes the error will appear in the developer console too, so check that as well.
+
+ ### Everything works but the results are bad
+
+ This goes back to the point above: local LLMs are getting very powerful, but you are still going to see better (sometimes much better) results with the largest LLMs like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b. If you are using smaller LLMs like Qwen-2.5-Coder, consider them more experimental and educational at this point. They can build smaller applications really well, which is super impressive for a local LLM, but for larger-scale applications you still want to use the larger LLMs!
app/components/chat/APIKeyManager.tsx CHANGED
@@ -10,6 +10,7 @@ interface APIKeyManagerProps {
    labelForGetApiKey?: string;
  }
 
+ // eslint-disable-next-line @typescript-eslint/naming-convention
  export const APIKeyManager: React.FC<APIKeyManagerProps> = ({ provider, apiKey, setApiKey }) => {
    const [isEditing, setIsEditing] = useState(false);
    const [tempKey, setTempKey] = useState(apiKey);
app/components/chat/BaseChat.tsx CHANGED
@@ -1,47 +1,45 @@
1
- // @ts-nocheck
2
- // Preventing TS checks with files presented in the video for a better presentation.
 
 
3
  import type { Message } from 'ai';
4
- import React, { type RefCallback, useEffect } from 'react';
5
  import { ClientOnly } from 'remix-utils/client-only';
6
  import { Menu } from '~/components/sidebar/Menu.client';
7
  import { IconButton } from '~/components/ui/IconButton';
8
  import { Workbench } from '~/components/workbench/Workbench.client';
9
  import { classNames } from '~/utils/classNames';
10
- import { MODEL_LIST, DEFAULT_PROVIDER, PROVIDER_LIST, initializeModelList } from '~/utils/constants';
11
  import { Messages } from './Messages.client';
12
  import { SendButton } from './SendButton.client';
13
- import { useState } from 'react';
14
  import { APIKeyManager } from './APIKeyManager';
15
  import Cookies from 'js-cookie';
 
16
 
17
  import styles from './BaseChat.module.scss';
18
  import type { ProviderInfo } from '~/utils/types';
 
 
 
19
 
20
  import FilePreview from './FilePreview';
21
 
22
- const EXAMPLE_PROMPTS = [
23
- { text: 'Build a todo app in React using Tailwind' },
24
- { text: 'Build a simple blog using Astro' },
25
- { text: 'Create a cookie consent form using Material UI' },
26
- { text: 'Make a space invaders game' },
27
- { text: 'How do I center a div?' },
28
- ];
29
-
30
- const providerList = PROVIDER_LIST;
31
-
32
  const ModelSelector = ({ model, setModel, provider, setProvider, modelList, providerList, apiKeys }) => {
33
  return (
34
  <div className="mb-2 flex gap-2 flex-col sm:flex-row">
35
  <select
36
  value={provider?.name}
37
  onChange={(e) => {
38
- setProvider(providerList.find((p) => p.name === e.target.value));
 
39
  const firstModel = [...modelList].find((m) => m.provider == e.target.value);
40
  setModel(firstModel ? firstModel.name : '');
41
  }}
42
  className="flex-1 p-2 rounded-lg border border-bolt-elements-borderColor bg-bolt-elements-prompt-background text-bolt-elements-textPrimary focus:outline-none focus:ring-2 focus:ring-bolt-elements-focus transition-all"
43
  >
44
- {providerList.map((provider) => (
45
  <option key={provider.name} value={provider.name}>
46
  {provider.name}
47
  </option>
@@ -75,6 +73,7 @@ interface BaseChatProps {
75
  chatStarted?: boolean;
76
  isStreaming?: boolean;
77
  messages?: Message[];
 
78
  enhancingPrompt?: boolean;
79
  promptEnhanced?: boolean;
80
  input?: string;
@@ -86,6 +85,8 @@ interface BaseChatProps {
86
  sendMessage?: (event: React.UIEvent, messageInput?: string) => void;
87
  handleInputChange?: (event: React.ChangeEvent<HTMLTextAreaElement>) => void;
88
  enhancePrompt?: () => void;
 
 
89
  uploadedFiles?: File[];
90
  setUploadedFiles?: (files: File[]) => void;
91
  imageDataList?: string[];
@@ -111,12 +112,13 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
111
  enhancePrompt,
112
  sendMessage,
113
  handleStop,
114
- uploadedFiles,
 
 
115
  setUploadedFiles,
116
- imageDataList,
117
  setImageDataList,
118
  messages,
119
- children, // Add this
120
  },
121
  ref,
122
  ) => {
@@ -128,14 +130,17 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
128
  // Load API keys from cookies on component mount
129
  try {
130
  const storedApiKeys = Cookies.get('apiKeys');
 
131
  if (storedApiKeys) {
132
  const parsedKeys = JSON.parse(storedApiKeys);
 
133
  if (typeof parsedKeys === 'object' && parsedKeys !== null) {
134
  setApiKeys(parsedKeys);
135
  }
136
  }
137
  } catch (error) {
138
  console.error('Error loading API keys from cookies:', error);
 
139
  // Clear invalid cookie data
140
  Cookies.remove('apiKeys');
141
  }
@@ -149,6 +154,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
149
  try {
150
  const updatedApiKeys = { ...apiKeys, [provider]: key };
151
  setApiKeys(updatedApiKeys);
 
152
  // Save updated API keys to cookies with 30 day expiry and secure settings
153
  Cookies.set('apiKeys', JSON.stringify(updatedApiKeys), {
154
  expires: 30, // 30 days
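The hunk above persists API keys as one JSON object in a cookie and, on load, discards malformed data. A minimal standalone sketch of that parse-and-validate step (the helper names are illustrative, not from the repo):

```typescript
// Illustrative helpers for the cookie persistence pattern in BaseChat:
// keys live in a single JSON object, and corrupt cookie data degrades to
// an empty key map instead of throwing.
type ApiKeys = Record<string, string>;

function parseStoredApiKeys(raw: string | undefined): ApiKeys {
  if (!raw) {
    return {};
  }

  try {
    const parsed: unknown = JSON.parse(raw);

    // Only accept plain objects; strings, numbers, and arrays are rejected.
    if (typeof parsed === 'object' && parsed !== null && !Array.isArray(parsed)) {
      return parsed as ApiKeys;
    }
  } catch {
    // Malformed JSON: treat it as if no keys were stored.
  }

  return {};
}

function withUpdatedKey(keys: ApiKeys, provider: string, key: string): ApiKeys {
  return { ...keys, [provider]: key };
}
```

`withUpdatedKey` mirrors the `{ ...apiKeys, [provider]: key }` spread in `handleApiKeyChange`; its result is what gets re-serialized with `JSON.stringify` before `Cookies.set`.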
@@ -161,11 +167,6 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
161
  }
162
  };
163
 
164
- const handleRemoveFile = () => {
165
- setUploadedFiles([]);
166
- setImageDataList([]);
167
- };
168
-
169
  const handleFileUpload = () => {
170
  const input = document.createElement('input');
171
  input.type = 'file';
@@ -173,8 +174,10 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
173
 
174
  input.onchange = async (e) => {
175
  const file = (e.target as HTMLInputElement).files?.[0];
 
176
  if (file) {
177
  const reader = new FileReader();
 
178
  reader.onload = (e) => {
179
  const base64Image = e.target?.result as string;
180
  setUploadedFiles?.([...uploadedFiles, file]);
@@ -187,7 +190,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
187
  input.click();
188
  };
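The upload handler above reads the chosen file with `FileReader.readAsDataURL` and stores the resulting base64 data URL. Outside the browser, the same encoding can be sketched with Node's `Buffer` (an illustration of the format, not the component's code):

```typescript
// Builds the same `data:<mime>;base64,<payload>` string that
// FileReader.readAsDataURL produces in the browser.
function toDataUrl(bytes: Uint8Array, mimeType: string): string {
  const base64 = Buffer.from(bytes).toString('base64');
  return `data:${mimeType};base64,${base64}`;
}
```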
189
 
190
- return (
191
  <div
192
  ref={ref}
193
  className={classNames(
@@ -300,6 +303,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
300
  handleStop?.();
301
  return;
302
  }
 
303
  if (input.length > 0 || uploadedFiles.length > 0) {
304
  sendMessage?.(event);
305
  }
@@ -309,22 +313,19 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
309
  </ClientOnly>
310
  <div className="flex justify-between items-center text-sm p-4 pt-2">
311
  <div className="flex gap-1 items-center">
312
- <IconButton
313
- title="Upload file"
314
- className="transition-all"
315
- onClick={() => handleFileUpload()}
316
- >
317
  <div className="i-ph:paperclip text-xl"></div>
318
  </IconButton>
319
-
320
  <IconButton
321
  title="Enhance prompt"
322
  disabled={input.length === 0 || enhancingPrompt}
323
- className={classNames('transition-all', {
324
- 'opacity-100!': enhancingPrompt,
325
- 'text-bolt-elements-item-contentAccent! pr-1.5 enabled:hover:bg-bolt-elements-item-backgroundAccent!':
326
- promptEnhanced,
327
- })}
 
 
328
  onClick={() => enhancePrompt?.()}
329
  >
330
  {enhancingPrompt ? (
@@ -339,6 +340,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
339
  </>
340
  )}
341
  </IconButton>
 
342
  </div>
343
  {input.length > 3 ? (
344
  <div className="text-xs text-bolt-elements-textTertiary">
@@ -351,30 +353,14 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
351
  </div>
352
  </div>
353
  </div>
354
- {!chatStarted && (
355
- <div id="examples" className="relative w-full max-w-xl mx-auto mt-8 flex justify-center">
356
- <div className="flex flex-col space-y-2 [mask-image:linear-gradient(to_bottom,black_0%,transparent_180%)] hover:[mask-image:none]">
357
- {EXAMPLE_PROMPTS.map((examplePrompt, index) => {
358
- return (
359
- <button
360
- key={index}
361
- onClick={(event) => {
362
- sendMessage?.(event, examplePrompt.text);
363
- }}
364
- className="group flex items-center w-full gap-2 justify-center bg-transparent text-bolt-elements-textTertiary hover:text-bolt-elements-textPrimary transition-theme"
365
- >
366
- {examplePrompt.text}
367
- <div className="i-ph:arrow-bend-down-left" />
368
- </button>
369
- );
370
- })}
371
- </div>
372
- </div>
373
- )}
374
  </div>
375
  <ClientOnly>{() => <Workbench chatStarted={chatStarted} isStreaming={isStreaming} />}</ClientOnly>
376
  </div>
377
  </div>
378
  );
 
 
379
  },
380
  );
 
1
+ /*
2
+ * @ts-nocheck
3
+ * Preventing TS checks with files presented in the video for a better presentation.
4
+ */
5
  import type { Message } from 'ai';
6
+ import React, { type RefCallback, useEffect, useState } from 'react';
7
  import { ClientOnly } from 'remix-utils/client-only';
8
  import { Menu } from '~/components/sidebar/Menu.client';
9
  import { IconButton } from '~/components/ui/IconButton';
10
  import { Workbench } from '~/components/workbench/Workbench.client';
11
  import { classNames } from '~/utils/classNames';
12
+ import { MODEL_LIST, PROVIDER_LIST, initializeModelList } from '~/utils/constants';
13
  import { Messages } from './Messages.client';
14
  import { SendButton } from './SendButton.client';
 
15
  import { APIKeyManager } from './APIKeyManager';
16
  import Cookies from 'js-cookie';
17
+ import * as Tooltip from '@radix-ui/react-tooltip';
18
 
19
  import styles from './BaseChat.module.scss';
20
  import type { ProviderInfo } from '~/utils/types';
21
+ import { ExportChatButton } from '~/components/chat/chatExportAndImport/ExportChatButton';
22
+ import { ImportButtons } from '~/components/chat/chatExportAndImport/ImportButtons';
23
+ import { ExamplePrompts } from '~/components/chat/ExamplePrompts';
24
 
25
  import FilePreview from './FilePreview';
26
 
27
+ // @ts-ignore TODO: Introduce proper types
28
+ // eslint-disable-next-line @typescript-eslint/no-unused-vars
29
  const ModelSelector = ({ model, setModel, provider, setProvider, modelList, providerList, apiKeys }) => {
30
  return (
31
  <div className="mb-2 flex gap-2 flex-col sm:flex-row">
32
  <select
33
  value={provider?.name}
34
  onChange={(e) => {
35
+ setProvider(providerList.find((p: ProviderInfo) => p.name === e.target.value));
36
+
37
  const firstModel = [...modelList].find((m) => m.provider == e.target.value);
38
  setModel(firstModel ? firstModel.name : '');
39
  }}
40
  className="flex-1 p-2 rounded-lg border border-bolt-elements-borderColor bg-bolt-elements-prompt-background text-bolt-elements-textPrimary focus:outline-none focus:ring-2 focus:ring-bolt-elements-focus transition-all"
41
  >
42
+ {providerList.map((provider: ProviderInfo) => (
43
  <option key={provider.name} value={provider.name}>
44
  {provider.name}
45
  </option>
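When the provider `<select>` changes, `ModelSelector` picks the first model belonging to the newly selected provider. That selection rule, extracted into a standalone helper (names and types are illustrative, not from the repo):

```typescript
interface ModelInfo {
  name: string;
  provider: string;
}

// Mirrors `[...modelList].find((m) => m.provider == e.target.value)` in
// ModelSelector: the first matching model wins, otherwise the selection clears.
function firstModelFor(models: ModelInfo[], provider: string): string {
  const first = models.find((m) => m.provider === provider);
  return first ? first.name : '';
}
```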
 
73
  chatStarted?: boolean;
74
  isStreaming?: boolean;
75
  messages?: Message[];
76
+ description?: string;
77
  enhancingPrompt?: boolean;
78
  promptEnhanced?: boolean;
79
  input?: string;
 
85
  sendMessage?: (event: React.UIEvent, messageInput?: string) => void;
86
  handleInputChange?: (event: React.ChangeEvent<HTMLTextAreaElement>) => void;
87
  enhancePrompt?: () => void;
88
+ importChat?: (description: string, messages: Message[]) => Promise<void>;
89
+ exportChat?: () => void;
90
  uploadedFiles?: File[];
91
  setUploadedFiles?: (files: File[]) => void;
92
  imageDataList?: string[];
 
112
  enhancePrompt,
113
  sendMessage,
114
  handleStop,
115
+ importChat,
116
+ exportChat,
117
+ uploadedFiles = [],
118
  setUploadedFiles,
119
+ imageDataList = [],
120
  setImageDataList,
121
  messages,
 
122
  },
123
  ref,
124
  ) => {
 
130
  // Load API keys from cookies on component mount
131
  try {
132
  const storedApiKeys = Cookies.get('apiKeys');
133
+
134
  if (storedApiKeys) {
135
  const parsedKeys = JSON.parse(storedApiKeys);
136
+
137
  if (typeof parsedKeys === 'object' && parsedKeys !== null) {
138
  setApiKeys(parsedKeys);
139
  }
140
  }
141
  } catch (error) {
142
  console.error('Error loading API keys from cookies:', error);
143
+
144
  // Clear invalid cookie data
145
  Cookies.remove('apiKeys');
146
  }
 
154
  try {
155
  const updatedApiKeys = { ...apiKeys, [provider]: key };
156
  setApiKeys(updatedApiKeys);
157
+
158
  // Save updated API keys to cookies with 30 day expiry and secure settings
159
  Cookies.set('apiKeys', JSON.stringify(updatedApiKeys), {
160
  expires: 30, // 30 days
 
167
  }
168
  };
169
 
 
 
 
 
 
170
  const handleFileUpload = () => {
171
  const input = document.createElement('input');
172
  input.type = 'file';
 
174
 
175
  input.onchange = async (e) => {
176
  const file = (e.target as HTMLInputElement).files?.[0];
177
+
178
  if (file) {
179
  const reader = new FileReader();
180
+
181
  reader.onload = (e) => {
182
  const base64Image = e.target?.result as string;
183
  setUploadedFiles?.([...uploadedFiles, file]);
 
190
  input.click();
191
  };
192
 
193
+ const baseChat = (
194
  <div
195
  ref={ref}
196
  className={classNames(
 
303
  handleStop?.();
304
  return;
305
  }
306
+
307
  if (input.length > 0 || uploadedFiles.length > 0) {
308
  sendMessage?.(event);
309
  }
 
313
  </ClientOnly>
314
  <div className="flex justify-between items-center text-sm p-4 pt-2">
315
  <div className="flex gap-1 items-center">
316
+ <IconButton title="Upload file" className="transition-all" onClick={() => handleFileUpload()}>
 
 
 
 
317
  <div className="i-ph:paperclip text-xl"></div>
318
  </IconButton>
 
319
  <IconButton
320
  title="Enhance prompt"
321
  disabled={input.length === 0 || enhancingPrompt}
322
+ className={classNames(
323
+ 'transition-all',
324
+ enhancingPrompt ? 'opacity-100' : '',
325
+ promptEnhanced ? 'text-bolt-elements-item-contentAccent' : '',
326
+ promptEnhanced ? 'pr-1.5' : '',
327
+ promptEnhanced ? 'enabled:hover:bg-bolt-elements-item-backgroundAccent' : '',
328
+ )}
329
  onClick={() => enhancePrompt?.()}
330
  >
331
  {enhancingPrompt ? (
 
340
  </>
341
  )}
342
  </IconButton>
343
+ {chatStarted && <ClientOnly>{() => <ExportChatButton exportChat={exportChat} />}</ClientOnly>}
344
  </div>
345
  {input.length > 3 ? (
346
  <div className="text-xs text-bolt-elements-textTertiary">
 
353
  </div>
354
  </div>
355
  </div>
356
+ {!chatStarted && ImportButtons(importChat)}
357
+ {!chatStarted && ExamplePrompts(sendMessage)}
 
358
  </div>
359
  <ClientOnly>{() => <Workbench chatStarted={chatStarted} isStreaming={isStreaming} />}</ClientOnly>
360
  </div>
361
  </div>
362
  );
363
+
364
+ return <Tooltip.Provider delayDuration={200}>{baseChat}</Tooltip.Provider>;
365
  },
366
  );
app/components/chat/Chat.client.tsx CHANGED
@@ -1,5 +1,7 @@
1
- // @ts-nocheck
2
- // Preventing TS checks with files presented in the video for a better presentation.
 
 
3
  import { useStore } from '@nanostores/react';
4
  import type { Message } from 'ai';
5
  import { useChat } from 'ai/react';
@@ -7,10 +9,9 @@ import { useAnimate } from 'framer-motion';
7
  import { memo, useEffect, useRef, useState } from 'react';
8
  import { cssTransition, toast, ToastContainer } from 'react-toastify';
9
  import { useMessageParser, usePromptEnhancer, useShortcuts, useSnapScroll } from '~/lib/hooks';
10
- import { useChatHistory } from '~/lib/persistence';
11
  import { chatStore } from '~/lib/stores/chat';
12
  import { workbenchStore } from '~/lib/stores/workbench';
13
- import { fileModificationsToHTML } from '~/utils/diff';
14
  import { DEFAULT_MODEL, DEFAULT_PROVIDER, PROVIDER_LIST } from '~/utils/constants';
15
  import { cubicEasingFn } from '~/utils/easings';
16
  import { createScopedLogger, renderLogger } from '~/utils/logger';
@@ -28,11 +29,20 @@ const logger = createScopedLogger('Chat');
28
  export function Chat() {
29
  renderLogger.trace('Chat');
30
 
31
- const { ready, initialMessages, storeMessageHistory } = useChatHistory();
 
32
 
33
  return (
34
  <>
35
- {ready && <ChatImpl initialMessages={initialMessages} storeMessageHistory={storeMessageHistory} />}
36
  <ToastContainer
37
  closeButton={({ closeToast }) => {
38
  return (
@@ -67,247 +77,255 @@ export function Chat() {
67
  interface ChatProps {
68
  initialMessages: Message[];
69
  storeMessageHistory: (messages: Message[]) => Promise<void>;
 
 
 
70
  }
71
 
72
- export const ChatImpl = memo(({ initialMessages, storeMessageHistory }: ChatProps) => {
73
- useShortcuts();
74
-
75
- const textareaRef = useRef<HTMLTextAreaElement>(null);
76
- const [chatStarted, setChatStarted] = useState(initialMessages.length > 0);
77
- const [uploadedFiles, setUploadedFiles] = useState<File[]>([]); // Move here
78
- const [imageDataList, setImageDataList] = useState<string[]>([]); // Move here
79
-
80
-
81
- const [model, setModel] = useState(() => {
82
- const savedModel = Cookies.get('selectedModel');
83
- return savedModel || DEFAULT_MODEL;
84
- });
85
- const [provider, setProvider] = useState(() => {
86
- const savedProvider = Cookies.get('selectedProvider');
87
- return PROVIDER_LIST.find(p => p.name === savedProvider) || DEFAULT_PROVIDER;
88
- });
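The `useState` initializers above restore the last-used model and provider from cookies, falling back to defaults when nothing (or an unknown name) was saved. A sketch of the provider lookup (types simplified for illustration):

```typescript
interface ProviderInfo {
  name: string;
}

// Mirrors `PROVIDER_LIST.find(p => p.name === savedProvider) || DEFAULT_PROVIDER`:
// an unknown or missing saved name falls back to the default provider.
function restoreProvider(
  saved: string | undefined,
  providers: ProviderInfo[],
  fallback: ProviderInfo,
): ProviderInfo {
  return providers.find((p) => p.name === saved) ?? fallback;
}
```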
89
-
90
- const { showChat } = useStore(chatStore);
91
-
92
- const [animationScope, animate] = useAnimate();
93
-
94
- const [apiKeys, setApiKeys] = useState<Record<string, string>>({});
95
-
96
- const { messages, isLoading, input, handleInputChange, setInput, stop, append } = useChat({
97
- api: '/api/chat',
98
- body: {
99
- apiKeys
100
- },
101
- onError: (error) => {
102
- logger.error('Request failed\n\n', error);
103
- toast.error('There was an error processing your request: ' + (error.message ? error.message : "No details were returned"));
104
- },
105
- onFinish: () => {
106
- logger.debug('Finished streaming');
107
- },
108
- initialMessages,
109
- });
110
-
111
- const { enhancingPrompt, promptEnhanced, enhancePrompt, resetEnhancer } = usePromptEnhancer();
112
- const { parsedMessages, parseMessages } = useMessageParser();
113
-
114
- const TEXTAREA_MAX_HEIGHT = chatStarted ? 400 : 200;
115
-
116
- useEffect(() => {
117
- chatStore.setKey('started', initialMessages.length > 0);
118
- }, []);
119
-
120
- useEffect(() => {
121
- parseMessages(messages, isLoading);
122
-
123
- if (messages.length > initialMessages.length) {
124
- storeMessageHistory(messages).catch((error) => toast.error(error.message));
125
- }
126
- }, [messages, isLoading, parseMessages]);
127
-
128
- const scrollTextArea = () => {
129
- const textarea = textareaRef.current;
130
 
131
- if (textarea) {
132
- textarea.scrollTop = textarea.scrollHeight;
133
- }
134
- };
135
 
136
- const abort = () => {
137
- stop();
138
- chatStore.setKey('aborted', true);
139
- workbenchStore.abortAllActions();
140
- };
141
 
142
- useEffect(() => {
143
- const textarea = textareaRef.current;
 
144
 
145
- if (textarea) {
146
- textarea.style.height = 'auto';
147
 
148
- const scrollHeight = textarea.scrollHeight;
 
 
 
149
 
150
- textarea.style.height = `${Math.min(scrollHeight, TEXTAREA_MAX_HEIGHT)}px`;
151
- textarea.style.overflowY = scrollHeight > TEXTAREA_MAX_HEIGHT ? 'auto' : 'hidden';
152
- }
153
- }, [input, textareaRef]);
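The effect above auto-sizes the textarea: height tracks content up to `TEXTAREA_MAX_HEIGHT`, after which the box scrolls instead of growing. The sizing rule as a pure function (a sketch, not the component's code):

```typescript
// Pure form of the textarea auto-resize rule: grow with content until the
// cap, then freeze the height and let the content scroll instead.
function clampTextareaHeight(
  scrollHeight: number,
  maxHeight: number,
): { height: number; overflowY: 'auto' | 'hidden' } {
  return {
    height: Math.min(scrollHeight, maxHeight),
    overflowY: scrollHeight > maxHeight ? 'auto' : 'hidden',
  };
}
```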
154
 
155
- const runAnimation = async () => {
156
- if (chatStarted) {
157
- return;
158
- }
159
 
160
- await Promise.all([
161
- animate('#examples', { opacity: 0, display: 'none' }, { duration: 0.1 }),
162
- animate('#intro', { opacity: 0, flex: 1 }, { duration: 0.2, ease: cubicEasingFn }),
163
- ]);
 
164
 
165
- chatStore.setKey('started', true);
 
166
 
167
- setChatStarted(true);
168
- };
169
 
- const sendMessage = async (_event: React.UIEvent, messageInput?: string) => {
- const _input = messageInput || input;

- if (_input.length === 0 || isLoading) {
- return;
- }

- /**
- * @note (delm) Usually saving files shouldn't take long but it may take longer if there
- * many unsaved files. In that case we need to block user input and show an indicator
- * of some kind so the user is aware that something is happening. But I consider the
- * happy case to be no unsaved files and I would expect users to save their changes
- * before they send another message.
- */
- await workbenchStore.saveAllFiles();

- const fileModifications = workbenchStore.getFileModifcations();

- chatStore.setKey('aborted', false);

- runAnimation();

- if (fileModifications !== undefined) {
- const diff = fileModificationsToHTML(fileModifications);

- /**
- * If we have file modifications we append a new user message manually since we have to prefix
- * the user input with the file modifications and we don't want the new user input to appear
- * in the prompt. Using `append` is almost the same as `handleSubmit` except that we have to
- * manually reset the input and we'd have to manually pass in file attachments. However, those
- * aren't relevant here.
- */
- append({
- role: 'user',
- content: [
- {
- type: 'text',
- text: `[Model: ${model}]\n\n[Provider: ${provider.name}]\n\n${diff}\n\n${_input}`
- },
- ...(imageDataList.map(imageData => ({
- type: 'image',
- image: imageData
- })))
- ]
- });

  /**
- * After sending a new message we reset all modifications since the model
- * should now be aware of all the changes.
  */
- workbenchStore.resetAllFileModifications();
- } else {
- append({
- role: 'user',
- content: [
- {
- type: 'text',
- text: `[Model: ${model}]\n\n[Provider: ${provider.name}]\n\n${_input}`
- },
- ...(imageDataList.map(imageData => ({
- type: 'image',
- image: imageData
- })))
- ]
- });
- }
-
- setInput('');
- // Add file cleanup here
- setUploadedFiles([]);
- setImageDataList([]);
-
- resetEnhancer();
-
- textareaRef.current?.blur();
- };
- const [messageRef, scrollRef] = useSnapScroll();
-
- useEffect(() => {
- const storedApiKeys = Cookies.get('apiKeys');
- if (storedApiKeys) {
- setApiKeys(JSON.parse(storedApiKeys));
- }
- }, []);
-
- const handleModelChange = (newModel: string) => {
- setModel(newModel);
- Cookies.set('selectedModel', newModel, { expires: 30 });
- };
-
- const handleProviderChange = (newProvider: ProviderInfo) => {
- setProvider(newProvider);
- Cookies.set('selectedProvider', newProvider.name, { expires: 30 });
- };
-
- return (
- <BaseChat
- ref={animationScope}
- textareaRef={textareaRef}
- input={input}
- showChat={showChat}
- chatStarted={chatStarted}
- isStreaming={isLoading}
- enhancingPrompt={enhancingPrompt}
- promptEnhanced={promptEnhanced}
- sendMessage={sendMessage}
- model={model}
- setModel={handleModelChange}
- provider={provider}
- setProvider={handleProviderChange}
- messageRef={messageRef}
- scrollRef={scrollRef}
- handleInputChange={handleInputChange}
- handleStop={abort}
- messages={messages.map((message, i) => {
- if (message.role === 'user') {
- return message;
- }
-
- return {
- ...message,
- content: parsedMessages[i] || '',
- };
- })}
- enhancePrompt={() => {
- enhancePrompt(
- input,
- (input) => {
- setInput(input);
- scrollTextArea();
- },
- model,
- provider,
- apiKeys
- );
- }}
- uploadedFiles={uploadedFiles}
- setUploadedFiles={setUploadedFiles}
- imageDataList={imageDataList}
- setImageDataList={setImageDataList}
- />
- );
- });
+ /*
+ * @ts-nocheck
+ * Preventing TS checks with files presented in the video for a better presentation.
+ */
  import { useStore } from '@nanostores/react';
  import type { Message } from 'ai';
  import { useChat } from 'ai/react';

  import { memo, useEffect, useRef, useState } from 'react';
  import { cssTransition, toast, ToastContainer } from 'react-toastify';
  import { useMessageParser, usePromptEnhancer, useShortcuts, useSnapScroll } from '~/lib/hooks';
+ import { description, useChatHistory } from '~/lib/persistence';
  import { chatStore } from '~/lib/stores/chat';
  import { workbenchStore } from '~/lib/stores/workbench';

  import { DEFAULT_MODEL, DEFAULT_PROVIDER, PROVIDER_LIST } from '~/utils/constants';
  import { cubicEasingFn } from '~/utils/easings';
  import { createScopedLogger, renderLogger } from '~/utils/logger';

  export function Chat() {
  renderLogger.trace('Chat');

+ const { ready, initialMessages, storeMessageHistory, importChat, exportChat } = useChatHistory();
+ const title = useStore(description);

  return (
  <>
+ {ready && (
+ <ChatImpl
+ description={title}
+ initialMessages={initialMessages}
+ exportChat={exportChat}
+ storeMessageHistory={storeMessageHistory}
+ importChat={importChat}
+ />
+ )}
  <ToastContainer
  closeButton={({ closeToast }) => {
  return (

  interface ChatProps {
  initialMessages: Message[];
  storeMessageHistory: (messages: Message[]) => Promise<void>;
+ importChat: (description: string, messages: Message[]) => Promise<void>;
+ exportChat: () => void;
+ description?: string;
  }

+ export const ChatImpl = memo(
+ ({ description, initialMessages, storeMessageHistory, importChat, exportChat }: ChatProps) => {
+ useShortcuts();
+
+ const textareaRef = useRef<HTMLTextAreaElement>(null);
+ const [chatStarted, setChatStarted] = useState(initialMessages.length > 0);
+ const [uploadedFiles, setUploadedFiles] = useState<File[]>([]); // Move here
+ const [imageDataList, setImageDataList] = useState<string[]>([]); // Move here
+
+ const [model, setModel] = useState(() => {
+ const savedModel = Cookies.get('selectedModel');
+ return savedModel || DEFAULT_MODEL;
+ });
+ const [provider, setProvider] = useState(() => {
+ const savedProvider = Cookies.get('selectedProvider');
+ return PROVIDER_LIST.find((p) => p.name === savedProvider) || DEFAULT_PROVIDER;
+ });
+
+ const { showChat } = useStore(chatStore);
+
+ const [animationScope, animate] = useAnimate();
+
+ const [apiKeys, setApiKeys] = useState<Record<string, string>>({});
+
+ const { messages, isLoading, input, handleInputChange, setInput, stop, append } = useChat({
+ api: '/api/chat',
+ body: {
+ apiKeys,
+ },
+ onError: (error) => {
+ logger.error('Request failed\n\n', error);
+ toast.error(
+ 'There was an error processing your request: ' + (error.message ? error.message : 'No details were returned'),
+ );
+ },
+ onFinish: () => {
+ logger.debug('Finished streaming');
+ },
+ initialMessages,
+ });

+ const { enhancingPrompt, promptEnhanced, enhancePrompt, resetEnhancer } = usePromptEnhancer();
+ const { parsedMessages, parseMessages } = useMessageParser();

+ const TEXTAREA_MAX_HEIGHT = chatStarted ? 400 : 200;

+ useEffect(() => {
+ chatStore.setKey('started', initialMessages.length > 0);
+ }, []);

+ useEffect(() => {
+ parseMessages(messages, isLoading);

+ if (messages.length > initialMessages.length) {
+ storeMessageHistory(messages).catch((error) => toast.error(error.message));
+ }
+ }, [messages, isLoading, parseMessages]);

+ const scrollTextArea = () => {
+ const textarea = textareaRef.current;

+ if (textarea) {
+ textarea.scrollTop = textarea.scrollHeight;
+ }
+ };

+ const abort = () => {
+ stop();
+ chatStore.setKey('aborted', true);
+ workbenchStore.abortAllActions();
+ };

+ useEffect(() => {
+ const textarea = textareaRef.current;

+ if (textarea) {
+ textarea.style.height = 'auto';

+ const scrollHeight = textarea.scrollHeight;

+ textarea.style.height = `${Math.min(scrollHeight, TEXTAREA_MAX_HEIGHT)}px`;
+ textarea.style.overflowY = scrollHeight > TEXTAREA_MAX_HEIGHT ? 'auto' : 'hidden';
+ }
+ }, [input, textareaRef]);

+ const runAnimation = async () => {
+ if (chatStarted) {
+ return;
+ }

+ await Promise.all([
+ animate('#examples', { opacity: 0, display: 'none' }, { duration: 0.1 }),
+ animate('#intro', { opacity: 0, flex: 1 }, { duration: 0.2, ease: cubicEasingFn }),
+ ]);

+ chatStore.setKey('started', true);

+ setChatStarted(true);
+ };

+ const sendMessage = async (_event: React.UIEvent, messageInput?: string) => {
+ const _input = messageInput || input;

+ if (_input.length === 0 || isLoading) {
+ return;
+ }

  /**
+ * @note (delm) Usually saving files shouldn't take long but it may take longer if there
+ * many unsaved files. In that case we need to block user input and show an indicator
+ * of some kind so the user is aware that something is happening. But I consider the
+ * happy case to be no unsaved files and I would expect users to save their changes
+ * before they send another message.
  */
+ await workbenchStore.saveAllFiles();
+
+ const fileModifications = workbenchStore.getFileModifcations();
+
+ chatStore.setKey('aborted', false);
+
+ runAnimation();
+
+ if (fileModifications !== undefined) {
+ /**
+ * If we have file modifications we append a new user message manually since we have to prefix
+ * the user input with the file modifications and we don't want the new user input to appear
+ * in the prompt. Using `append` is almost the same as `handleSubmit` except that we have to
+ * manually reset the input and we'd have to manually pass in file attachments. However, those
+ * aren't relevant here.
+ */
+ append({
+ role: 'user',
+ content: [
+ {
+ type: 'text',
+ text: `[Model: ${model}]\n\n[Provider: ${provider.name}]\n\n${_input}`,
+ },
+ ...imageDataList.map((imageData) => ({
+ type: 'image',
+ image: imageData,
+ })),
+ ] as any, // Type assertion to bypass compiler check
+ });
+
+ /**
+ * After sending a new message we reset all modifications since the model
+ * should now be aware of all the changes.
+ */
+ workbenchStore.resetAllFileModifications();
+ } else {
+ append({
+ role: 'user',
+ content: [
+ {
+ type: 'text',
+ text: `[Model: ${model}]\n\n[Provider: ${provider.name}]\n\n${_input}`,
+ },
+ ...imageDataList.map((imageData) => ({
+ type: 'image',
+ image: imageData,
+ })),
+ ] as any, // Type assertion to bypass compiler check
+ });
+ }
+
+ setInput('');
+
+ // Add file cleanup here
+ setUploadedFiles([]);
+ setImageDataList([]);
+
+ resetEnhancer();
+
+ textareaRef.current?.blur();
+ };
+ const [messageRef, scrollRef] = useSnapScroll();
+
+ useEffect(() => {
+ const storedApiKeys = Cookies.get('apiKeys');
+
+ if (storedApiKeys) {
+ setApiKeys(JSON.parse(storedApiKeys));
+ }
+ }, []);
+
+ const handleModelChange = (newModel: string) => {
+ setModel(newModel);
+ Cookies.set('selectedModel', newModel, { expires: 30 });
+ };
+
+ const handleProviderChange = (newProvider: ProviderInfo) => {
+ setProvider(newProvider);
+ Cookies.set('selectedProvider', newProvider.name, { expires: 30 });
+ };
+
+ return (
+ <BaseChat
+ ref={animationScope}
+ textareaRef={textareaRef}
+ input={input}
+ showChat={showChat}
+ chatStarted={chatStarted}
+ isStreaming={isLoading}
+ enhancingPrompt={enhancingPrompt}
+ promptEnhanced={promptEnhanced}
+ sendMessage={sendMessage}
+ model={model}
+ setModel={handleModelChange}
+ provider={provider}
+ setProvider={handleProviderChange}
+ messageRef={messageRef}
+ scrollRef={scrollRef}
+ handleInputChange={handleInputChange}
+ handleStop={abort}
+ description={description}
+ importChat={importChat}
+ exportChat={exportChat}
+ messages={messages.map((message, i) => {
+ if (message.role === 'user') {
+ return message;
+ }

+ return {
+ ...message,
+ content: parsedMessages[i] || '',
+ };
+ })}
+ enhancePrompt={() => {
+ enhancePrompt(
+ input,
+ (input) => {
+ setInput(input);
+ scrollTextArea();
+ },
+ model,
+ provider,
+ apiKeys,
+ );
+ }}
+ uploadedFiles={uploadedFiles}
+ setUploadedFiles={setUploadedFiles}
+ imageDataList={imageDataList}
+ setImageDataList={setImageDataList}
+ />
+ );
+ },
+ );
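Both `sendMessage` branches above embed the model and provider into the message text as `[Model: …]` / `[Provider: …]` prefixes, which `UserMessage.tsx` later strips for display with `MODEL_REGEX` and `PROVIDER_REGEX`. A minimal sketch of that round trip; the two regexes here are illustrative stand-ins for the real ones in `~/utils/constants`, which this diff does not show:

```typescript
// Sketch of the [Model: …]/[Provider: …] prefix convention used by sendMessage
// and stripped for display by sanitizeUserMessage. MODEL_REGEX and
// PROVIDER_REGEX below are assumed shapes, not the actual constants.
const MODEL_REGEX = /\[Model: (.*?)\]\n\n/;
const PROVIDER_REGEX = /\[Provider: (.*?)\]\n\n/;

const model = 'qwen2.5-coder';
const providerName = 'Ollama';
const userInput = 'Build a todo app';

// What sendMessage puts on the wire:
const wireText = `[Model: ${model}]\n\n[Provider: ${providerName}]\n\n${userInput}`;

// What the UI recovers for display:
const displayText = wireText.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '');

console.log(displayText); // Build a todo app
```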
app/components/chat/ExamplePrompts.tsx ADDED
@@ -0,0 +1,32 @@
+ import React from 'react';
+
+ const EXAMPLE_PROMPTS = [
+ { text: 'Build a todo app in React using Tailwind' },
+ { text: 'Build a simple blog using Astro' },
+ { text: 'Create a cookie consent form using Material UI' },
+ { text: 'Make a space invaders game' },
+ { text: 'How do I center a div?' },
+ ];
+
+ export function ExamplePrompts(sendMessage?: { (event: React.UIEvent, messageInput?: string): void | undefined }) {
+ return (
+ <div id="examples" className="relative w-full max-w-xl mx-auto mt-8 flex justify-center">
+ <div className="flex flex-col space-y-2 [mask-image:linear-gradient(to_bottom,black_0%,transparent_180%)] hover:[mask-image:none]">
+ {EXAMPLE_PROMPTS.map((examplePrompt, index: number) => {
+ return (
+ <button
+ key={index}
+ onClick={(event) => {
+ sendMessage?.(event, examplePrompt.text);
+ }}
+ className="group flex items-center w-full gap-2 justify-center bg-transparent text-bolt-elements-textTertiary hover:text-bolt-elements-textPrimary transition-theme"
+ >
+ {examplePrompt.text}
+ <div className="i-ph:arrow-bend-down-left" />
+ </button>
+ );
+ })}
+ </div>
+ </div>
+ );
+ }
app/components/chat/FilePreview.tsx CHANGED
@@ -3,37 +3,37 @@ import React from 'react';

  // Rest of the interface remains the same
  interface FilePreviewProps {
- files: File[];
- imageDataList: string[];
- onRemove: (index: number) => void;
  }

  const FilePreview: React.FC<FilePreviewProps> = ({ files, imageDataList, onRemove }) => {
- if (!files || files.length === 0) {
- return null;
- }

- return (
- <div className="flex flex-row overflow-x-auto">
- {files.map((file, index) => (
- <div key={file.name + file.size} className="mr-2 relative">
- {imageDataList[index] && (
- <div className="relative">
- <img src={imageDataList[index]} alt={file.name} className="max-h-20" />
- <button
- onClick={() => onRemove(index)}
- className="absolute -top-2 -right-2 z-10 bg-white rounded-full p-1 shadow-md hover:bg-gray-100"
- >
- <div className="bg-black rounded-full p-1">
- <div className="i-ph:x w-3 h-3 text-gray-400" />
- </div>
- </button>
- </div>
- )}
  </div>
- ))}
  </div>
- );
  };

  export default FilePreview;

  // Rest of the interface remains the same
  interface FilePreviewProps {
+ files: File[];
+ imageDataList: string[];
+ onRemove: (index: number) => void;
  }

  const FilePreview: React.FC<FilePreviewProps> = ({ files, imageDataList, onRemove }) => {
+ if (!files || files.length === 0) {
+ return null;
+ }

+ return (
+ <div className="flex flex-row overflow-x-auto">
+ {files.map((file, index) => (
+ <div key={file.name + file.size} className="mr-2 relative">
+ {imageDataList[index] && (
+ <div className="relative">
+ <img src={imageDataList[index]} alt={file.name} className="max-h-20" />
+ <button
+ onClick={() => onRemove(index)}
+ className="absolute -top-2 -right-2 z-10 bg-white rounded-full p-1 shadow-md hover:bg-gray-100"
+ >
+ <div className="bg-black rounded-full p-1">
+ <div className="i-ph:x w-3 h-3 text-gray-400" />
  </div>
+ </button>
+ </div>
+ )}
  </div>
+ ))}
+ </div>
+ );
  };

  export default FilePreview;
app/components/chat/ImportFolderButton.tsx ADDED
@@ -0,0 +1,164 @@
+ import React from 'react';
+ import type { Message } from 'ai';
+ import { toast } from 'react-toastify';
+ import ignore from 'ignore';
+
+ interface ImportFolderButtonProps {
+ className?: string;
+ importChat?: (description: string, messages: Message[]) => Promise<void>;
+ }
+
+ // Common patterns to ignore, similar to .gitignore
+ const IGNORE_PATTERNS = [
+ 'node_modules/**',
+ '.git/**',
+ 'dist/**',
+ 'build/**',
+ '.next/**',
+ 'coverage/**',
+ '.cache/**',
+ '.vscode/**',
+ '.idea/**',
+ '**/*.log',
+ '**/.DS_Store',
+ '**/npm-debug.log*',
+ '**/yarn-debug.log*',
+ '**/yarn-error.log*',
+ ];
+
+ const ig = ignore().add(IGNORE_PATTERNS);
+ const generateId = () => Math.random().toString(36).substring(2, 15);
+
+ const isBinaryFile = async (file: File): Promise<boolean> => {
+ const chunkSize = 1024; // Read the first 1 KB of the file
+ const buffer = new Uint8Array(await file.slice(0, chunkSize).arrayBuffer());
+
+ for (let i = 0; i < buffer.length; i++) {
+ const byte = buffer[i];
+
+ if (byte === 0 || (byte < 32 && byte !== 9 && byte !== 10 && byte !== 13)) {
+ return true; // Found a binary character
+ }
+ }
+
+ return false;
+ };
+
+ export const ImportFolderButton: React.FC<ImportFolderButtonProps> = ({ className, importChat }) => {
+ const shouldIncludeFile = (path: string): boolean => {
+ return !ig.ignores(path);
+ };
+
+ const createChatFromFolder = async (files: File[], binaryFiles: string[]) => {
+ const fileArtifacts = await Promise.all(
+ files.map(async (file) => {
+ return new Promise<string>((resolve, reject) => {
+ const reader = new FileReader();
+
+ reader.onload = () => {
+ const content = reader.result as string;
+ const relativePath = file.webkitRelativePath.split('/').slice(1).join('/');
+ resolve(
+ `<boltAction type="file" filePath="${relativePath}">
+ ${content}
+ </boltAction>`,
+ );
+ };
+ reader.onerror = reject;
+ reader.readAsText(file);
+ });
+ }),
+ );
+
+ const binaryFilesMessage =
+ binaryFiles.length > 0
+ ? `\n\nSkipped ${binaryFiles.length} binary files:\n${binaryFiles.map((f) => `- ${f}`).join('\n')}`
+ : '';
+
+ const message: Message = {
+ role: 'assistant',
+ content: `I'll help you set up these files.${binaryFilesMessage}
+
+ <boltArtifact id="imported-files" title="Imported Files">
+ ${fileArtifacts.join('\n\n')}
+ </boltArtifact>`,
+ id: generateId(),
+ createdAt: new Date(),
+ };
+
+ const userMessage: Message = {
+ role: 'user',
+ id: generateId(),
+ content: 'Import my files',
+ createdAt: new Date(),
+ };
+
+ const description = `Folder Import: ${files[0].webkitRelativePath.split('/')[0]}`;
+
+ if (importChat) {
+ await importChat(description, [userMessage, message]);
+ }
+ };
+
+ return (
+ <>
+ <input
+ type="file"
+ id="folder-import"
+ className="hidden"
+ webkitdirectory=""
+ directory=""
+ onChange={async (e) => {
+ const allFiles = Array.from(e.target.files || []);
+ const filteredFiles = allFiles.filter((file) => shouldIncludeFile(file.webkitRelativePath));
+
+ if (filteredFiles.length === 0) {
+ toast.error('No files found in the selected folder');
+ return;
+ }
+
+ try {
+ const fileChecks = await Promise.all(
+ filteredFiles.map(async (file) => ({
+ file,
+ isBinary: await isBinaryFile(file),
+ })),
+ );
+
+ const textFiles = fileChecks.filter((f) => !f.isBinary).map((f) => f.file);
+ const binaryFilePaths = fileChecks
+ .filter((f) => f.isBinary)
+ .map((f) => f.file.webkitRelativePath.split('/').slice(1).join('/'));
+
+ if (textFiles.length === 0) {
+ toast.error('No text files found in the selected folder');
+ return;
+ }
+
+ if (binaryFilePaths.length > 0) {
+ toast.info(`Skipping ${binaryFilePaths.length} binary files`);
+ }
+
+ await createChatFromFolder(textFiles, binaryFilePaths);
+ } catch (error) {
+ console.error('Failed to import folder:', error);
+ toast.error('Failed to import folder');
+ }
+
+ e.target.value = ''; // Reset file input
+ }}
+ {...({} as any)} // if removed webkitdirectory will throw errors as unknow attribute
+ />
+ <button
+ onClick={() => {
+ const input = document.getElementById('folder-import');
+ input?.click();
+ }}
+ className={className}
+ >
+ <div className="i-ph:folder-simple-upload" />
+ Import Folder
+ </button>
+ </>
+ );
+ };
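The `isBinaryFile` helper added above classifies a file as binary if its first 1 KB contains a NUL byte or any control byte other than tab, LF, or CR. The same heuristic can be exercised outside the browser by factoring the byte scan into a pure function over raw bytes; `hasBinaryBytes` is an illustrative name, not part of this commit:

```typescript
// Pure-function sketch of the control-byte heuristic used by isBinaryFile,
// operating on a Uint8Array so it needs no browser File/Blob API.
function hasBinaryBytes(buffer: Uint8Array): boolean {
  for (let i = 0; i < buffer.length; i++) {
    const byte = buffer[i];

    // NUL, or a control character other than tab (9), LF (10), CR (13)
    if (byte === 0 || (byte < 32 && byte !== 9 && byte !== 10 && byte !== 13)) {
      return true;
    }
  }

  return false;
}

const text = new TextEncoder().encode('hello\tworld\n');
const binary = new Uint8Array([0x89, 0x50, 0x4e, 0x47, 0x00]); // PNG-like header with a NUL

console.log(hasBinaryBytes(text)); // false
console.log(hasBinaryBytes(binary)); // true
```

Like any sniffing heuristic, this can misclassify UTF-16 text (which contains NUL bytes) as binary; for the import use case that is a conservative failure mode, since such files are merely skipped.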
app/components/chat/Messages.client.tsx CHANGED
@@ -3,11 +3,11 @@ import React from 'react';
  import { classNames } from '~/utils/classNames';
  import { AssistantMessage } from './AssistantMessage';
  import { UserMessage } from './UserMessage';
- import * as Tooltip from '@radix-ui/react-tooltip';
  import { useLocation } from '@remix-run/react';
  import { db, chatId } from '~/lib/persistence/useChatHistory';
  import { forkChat } from '~/lib/persistence/db';
  import { toast } from 'react-toastify';

  interface MessagesProps {
  id?: string;
@@ -41,92 +41,66 @@ export const Messages = React.forwardRef<HTMLDivElement, MessagesProps>((props:
  };

  return (
- <Tooltip.Provider delayDuration={200}>
- <div id={id} ref={ref} className={props.className}>
- {messages.length > 0
- ? messages.map((message, index) => {
- const { role, content, id: messageId } = message;
- const isUserMessage = role === 'user';
- const isFirst = index === 0;
- const isLast = index === messages.length - 1;

- return (
- <div
- key={index}
- className={classNames('flex gap-4 p-6 w-full rounded-[calc(0.75rem-1px)]', {
- 'bg-bolt-elements-messages-background': isUserMessage || !isStreaming || (isStreaming && !isLast),
- 'bg-gradient-to-b from-bolt-elements-messages-background from-30% to-transparent':
- isStreaming && isLast,
- 'mt-4': !isFirst,
- })}
- >
- {isUserMessage && (
- <div className="flex items-center justify-center w-[34px] h-[34px] overflow-hidden bg-white text-gray-600 rounded-full shrink-0 self-start">
- <div className="i-ph:user-fill text-xl"></div>
- </div>
- )}
- <div className="grid grid-col-1 w-full">
- {isUserMessage ? <UserMessage content={content} /> : <AssistantMessage content={content} />}
  </div>
- {!isUserMessage && (
- <div className="flex gap-2 flex-col lg:flex-row">
- <Tooltip.Root>
- <Tooltip.Trigger asChild>
- {messageId && (
- <button
- onClick={() => handleRewind(messageId)}
- key="i-ph:arrow-u-up-left"
- className={classNames(
- 'i-ph:arrow-u-up-left',
- 'text-xl text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary transition-colors',
- )}
- />
  )}
- </Tooltip.Trigger>
- <Tooltip.Portal>
- <Tooltip.Content
- className="bg-bolt-elements-tooltip-background text-bolt-elements-textPrimary px-3 py-2 rounded-lg text-sm shadow-lg"
- sideOffset={5}
- style={{ zIndex: 1000 }}
- >
- Revert to this message
- <Tooltip.Arrow className="fill-bolt-elements-tooltip-background" />
- </Tooltip.Content>
- </Tooltip.Portal>
- </Tooltip.Root>

- <Tooltip.Root>
- <Tooltip.Trigger asChild>
- <button
- onClick={() => handleFork(messageId)}
- key="i-ph:git-fork"
- className={classNames(
- 'i-ph:git-fork',
- 'text-xl text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary transition-colors',
- )}
- />
- </Tooltip.Trigger>
- <Tooltip.Portal>
- <Tooltip.Content
- className="bg-bolt-elements-tooltip-background text-bolt-elements-textPrimary px-3 py-2 rounded-lg text-sm shadow-lg"
- sideOffset={5}
- style={{ zIndex: 1000 }}
- >
- Fork chat from this message
- <Tooltip.Arrow className="fill-bolt-elements-tooltip-background" />
- </Tooltip.Content>
- </Tooltip.Portal>
- </Tooltip.Root>
- </div>
- )}
- </div>
- );
- })
- : null}
- {isStreaming && (
- <div className="text-center w-full text-bolt-elements-textSecondary i-svg-spinners:3-dots-fade text-4xl mt-4"></div>
- )}
- </div>
- </Tooltip.Provider>
  );
  });

  import { classNames } from '~/utils/classNames';
  import { AssistantMessage } from './AssistantMessage';
  import { UserMessage } from './UserMessage';
  import { useLocation } from '@remix-run/react';
  import { db, chatId } from '~/lib/persistence/useChatHistory';
  import { forkChat } from '~/lib/persistence/db';
  import { toast } from 'react-toastify';
+ import WithTooltip from '~/components/ui/Tooltip';

  interface MessagesProps {
  id?: string;

  };

  return (
+ <div id={id} ref={ref} className={props.className}>
+ {messages.length > 0
+ ? messages.map((message, index) => {
+ const { role, content, id: messageId } = message;
+ const isUserMessage = role === 'user';
+ const isFirst = index === 0;
+ const isLast = index === messages.length - 1;

+ return (
+ <div
+ key={index}
+ className={classNames('flex gap-4 p-6 w-full rounded-[calc(0.75rem-1px)]', {
+ 'bg-bolt-elements-messages-background': isUserMessage || !isStreaming || (isStreaming && !isLast),
+ 'bg-gradient-to-b from-bolt-elements-messages-background from-30% to-transparent':
+ isStreaming && isLast,
+ 'mt-4': !isFirst,
+ })}
+ >
+ {isUserMessage && (
+ <div className="flex items-center justify-center w-[34px] h-[34px] overflow-hidden bg-white text-gray-600 rounded-full shrink-0 self-start">
+ <div className="i-ph:user-fill text-xl"></div>
  </div>
+ )}
+ <div className="grid grid-col-1 w-full">
+ {isUserMessage ? <UserMessage content={content} /> : <AssistantMessage content={content} />}
+ </div>
+ {!isUserMessage && (
+ <div className="flex gap-2 flex-col lg:flex-row">
+ <WithTooltip tooltip="Revert to this message">
+ {messageId && (
+ <button
+ onClick={() => handleRewind(messageId)}
+ key="i-ph:arrow-u-up-left"
+ className={classNames(
+ 'i-ph:arrow-u-up-left',
+ 'text-xl text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary transition-colors',
  )}
+ />
+ )}
+ </WithTooltip>

+ <WithTooltip tooltip="Fork chat from this message">
+ <button
+ onClick={() => handleFork(messageId)}
+ key="i-ph:git-fork"
+ className={classNames(
+ 'i-ph:git-fork',
+ 'text-xl text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary transition-colors',
+ )}
+ />
+ </WithTooltip>
+ </div>
+ )}
+ </div>
+ );
+ })
+ : null}
+ {isStreaming && (
+ <div className="text-center w-full text-bolt-elements-textSecondary i-svg-spinners:3-dots-fade text-4xl mt-4"></div>
+ )}
+ </div>
  );
  });
app/components/chat/UserMessage.tsx CHANGED
@@ -1,6 +1,7 @@
- // @ts-nocheck
- // Preventing TS checks with files presented in the video for a better presentation.
- import { modificationsRegex } from '~/utils/diff';
  import { MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants';
  import { Markdown } from './Markdown';

@@ -18,11 +19,11 @@ export function UserMessage({ content }: UserMessageProps) {
  );
  }

- function sanitizeUserMessage(content: string | Array<{type: string, text?: string, image_url?: {url: string}}>) {
  if (Array.isArray(content)) {
- const textItem = content.find(item => item.type === 'text');
  return textItem?.text?.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '') || '';
  }
-
  return content.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '');
- }

+ /*
+ * @ts-nocheck
+ * Preventing TS checks with files presented in the video for a better presentation.
+ */
  import { MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants';
  import { Markdown } from './Markdown';

  );
  }

+ function sanitizeUserMessage(content: string | Array<{ type: string; text?: string; image_url?: { url: string } }>) {
  if (Array.isArray(content)) {
+ const textItem = content.find((item) => item.type === 'text');
  return textItem?.text?.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '') || '';
  }
+
  return content.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '');
+ }
app/components/chat/chatExportAndImport/ExportChatButton.tsx ADDED
@@ -0,0 +1,13 @@
+ import WithTooltip from '~/components/ui/Tooltip';
+ import { IconButton } from '~/components/ui/IconButton';
+ import React from 'react';
+
+ export const ExportChatButton = ({ exportChat }: { exportChat?: () => void }) => {
+ return (
+ <WithTooltip tooltip="Export Chat">
+ <IconButton title="Export Chat" onClick={() => exportChat?.()}>
+ <div className="i-ph:download-simple text-xl"></div>
+ </IconButton>
+ </WithTooltip>
+ );
+ };
app/components/chat/chatExportAndImport/ImportButtons.tsx ADDED
@@ -0,0 +1,71 @@
+ import type { Message } from 'ai';
+ import { toast } from 'react-toastify';
+ import React from 'react';
+ import { ImportFolderButton } from '~/components/chat/ImportFolderButton';
+
+ export function ImportButtons(importChat: ((description: string, messages: Message[]) => Promise<void>) | undefined) {
+ return (
+ <div className="flex flex-col items-center justify-center flex-1 p-4">
+ <input
+ type="file"
+ id="chat-import"
+ className="hidden"
+ accept=".json"
+ onChange={async (e) => {
+ const file = e.target.files?.[0];
+
+ if (file && importChat) {
+ try {
+ const reader = new FileReader();
+
+ reader.onload = async (e) => {
+ try {
+ const content = e.target?.result as string;
+ const data = JSON.parse(content);
+
+ if (!Array.isArray(data.messages)) {
+ toast.error('Invalid chat file format');
+ }
+
+ await importChat(data.description, data.messages);
+ toast.success('Chat imported successfully');
+ } catch (error: unknown) {
+ if (error instanceof Error) {
+ toast.error('Failed to parse chat file: ' + error.message);
+ } else {
+ toast.error('Failed to parse chat file');
+ }
+ }
+ };
+ reader.onerror = () => toast.error('Failed to read chat file');
+ reader.readAsText(file);
+ } catch (error) {
+ toast.error(error instanceof Error ? error.message : 'Failed to import chat');
+ }
+ e.target.value = ''; // Reset file input
+ } else {
+ toast.error('Something went wrong');
+ }
+ }}
+ />
+ <div className="flex flex-col items-center gap-4 max-w-2xl text-center">
+ <div className="flex gap-2">
+ <button
+ onClick={() => {
+ const input = document.getElementById('chat-import');
+ input?.click();
+ }}
+ className="px-4 py-2 rounded-lg border border-bolt-elements-borderColor bg-bolt-elements-prompt-background text-bolt-elements-textPrimary hover:bg-bolt-elements-background-depth-3 transition-all flex items-center gap-2"
+ >
+ <div className="i-ph:upload-simple" />
+ Import Chat
+ </button>
+ <ImportFolderButton
+ importChat={importChat}
+ className="px-4 py-2 rounded-lg border border-bolt-elements-borderColor bg-bolt-elements-prompt-background text-bolt-elements-textPrimary hover:bg-bolt-elements-background-depth-3 transition-all flex items-center gap-2"
+ />
+ </div>
+ </div>
+ </div>
+ );
+ }
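One review note on the handler above: when `data.messages` is not an array it toasts 'Invalid chat file format' but does not return, so the import still runs with a malformed payload. A stricter guard would throw instead; `parseChatExport` below is a hypothetical helper sketching the file shape the handler expects (`{ description, messages }`), not part of this commit:

```typescript
// Hypothetical stricter parser for the exported chat JSON. Throwing on a bad
// shape stops the import, whereas the handler in the diff only shows a toast
// and then proceeds to call importChat anyway.
interface ChatExport {
  description: string;
  messages: Array<{ id: string; role: string; content: unknown }>;
}

function parseChatExport(content: string): ChatExport {
  const data = JSON.parse(content);

  if (!Array.isArray(data.messages)) {
    throw new Error('Invalid chat file format: `messages` must be an array');
  }

  return data as ChatExport;
}

const ok = parseChatExport('{"description":"demo","messages":[]}');
console.log(ok.description); // demo
console.log(ok.messages.length); // 0
```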
app/components/sidebar/HistoryItem.tsx CHANGED
@@ -1,70 +1,55 @@
 import * as Dialog from '@radix-ui/react-dialog';
-import { useEffect, useRef, useState } from 'react';
 import { type ChatHistoryItem } from '~/lib/persistence';
+import WithTooltip from '~/components/ui/Tooltip';
 
 interface HistoryItemProps {
   item: ChatHistoryItem;
   onDelete?: (event: React.UIEvent) => void;
   onDuplicate?: (id: string) => void;
+  exportChat: (id?: string) => void;
 }
 
-export function HistoryItem({ item, onDelete, onDuplicate }: HistoryItemProps) {
-  const [hovering, setHovering] = useState(false);
-  const hoverRef = useRef<HTMLDivElement>(null);
-
-  useEffect(() => {
-    let timeout: NodeJS.Timeout | undefined;
-
-    function mouseEnter() {
-      setHovering(true);
-
-      if (timeout) {
-        clearTimeout(timeout);
-      }
-    }
-
-    function mouseLeave() {
-      setHovering(false);
-    }
-
-    hoverRef.current?.addEventListener('mouseenter', mouseEnter);
-    hoverRef.current?.addEventListener('mouseleave', mouseLeave);
-
-    return () => {
-      hoverRef.current?.removeEventListener('mouseenter', mouseEnter);
-      hoverRef.current?.removeEventListener('mouseleave', mouseLeave);
-    };
-  }, []);
-
+export function HistoryItem({ item, onDelete, onDuplicate, exportChat }: HistoryItemProps) {
   return (
-    <div
-      ref={hoverRef}
-      className="group rounded-md text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary hover:bg-bolt-elements-background-depth-3 overflow-hidden flex justify-between items-center px-2 py-1"
-    >
+    <div className="group rounded-md text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary hover:bg-bolt-elements-background-depth-3 overflow-hidden flex justify-between items-center px-2 py-1">
       <a href={`/chat/${item.urlId}`} className="flex w-full relative truncate block">
         {item.description}
-        <div className="absolute right-0 z-1 top-0 bottom-0 bg-gradient-to-l from-bolt-elements-background-depth-2 group-hover:from-bolt-elements-background-depth-3 to-transparent w-10 flex justify-end group-hover:w-15 group-hover:from-45%">
-          {hovering && (
-            <div className="flex items-center p-1 text-bolt-elements-textSecondary">
-              {onDuplicate && (
+        <div className="absolute right-0 z-1 top-0 bottom-0 bg-gradient-to-l from-bolt-elements-background-depth-2 group-hover:from-bolt-elements-background-depth-3 box-content pl-3 to-transparent w-10 flex justify-end group-hover:w-15 group-hover:from-99%">
+          <div className="flex items-center p-1 text-bolt-elements-textSecondary opacity-0 group-hover:opacity-100 transition-opacity">
+            <WithTooltip tooltip="Export chat">
+              <button
+                type="button"
+                className="i-ph:download-simple scale-110 mr-2 hover:text-bolt-elements-item-contentAccent"
+                onClick={(event) => {
+                  event.preventDefault();
+                  exportChat(item.id);
+                }}
+                title="Export chat"
+              />
+            </WithTooltip>
+            {onDuplicate && (
+              <WithTooltip tooltip="Duplicate chat">
                 <button
-                  className="i-ph:copy scale-110 mr-2"
+                  type="button"
+                  className="i-ph:copy scale-110 mr-2 hover:text-bolt-elements-item-contentAccent"
                   onClick={() => onDuplicate?.(item.id)}
                   title="Duplicate chat"
                 />
-              )}
-              <Dialog.Trigger asChild>
+              </WithTooltip>
+            )}
+            <Dialog.Trigger asChild>
+              <WithTooltip tooltip="Delete chat">
                 <button
-                  className="i-ph:trash scale-110"
+                  type="button"
+                  className="i-ph:trash scale-110 hover:text-bolt-elements-button-danger-text"
                   onClick={(event) => {
-                    // we prevent the default so we don't trigger the anchor above
                     event.preventDefault();
                     onDelete?.(event);
                   }}
                 />
-              </Dialog.Trigger>
-            </div>
-          )}
+              </WithTooltip>
+            </Dialog.Trigger>
+          </div>
         </div>
       </a>
     </div>
app/components/sidebar/Menu.client.tsx CHANGED
@@ -2,7 +2,6 @@ import { motion, type Variants } from 'framer-motion';
 import { useCallback, useEffect, useRef, useState } from 'react';
 import { toast } from 'react-toastify';
 import { Dialog, DialogButton, DialogDescription, DialogRoot, DialogTitle } from '~/components/ui/Dialog';
-import { IconButton } from '~/components/ui/IconButton';
 import { ThemeSwitch } from '~/components/ui/ThemeSwitch';
 import { db, deleteById, getAll, chatId, type ChatHistoryItem, useChatHistory } from '~/lib/persistence';
 import { cubicEasingFn } from '~/utils/easings';
@@ -34,7 +33,7 @@ const menuVariants = {
 type DialogContent = { type: 'delete'; item: ChatHistoryItem } | null;
 
 export const Menu = () => {
-  const { duplicateCurrentChat } = useChatHistory();
+  const { duplicateCurrentChat, exportChat } = useChatHistory();
   const menuRef = useRef<HTMLDivElement>(null);
   const [list, setList] = useState<ChatHistoryItem[]>([]);
   const [open, setOpen] = useState(false);
@@ -102,7 +101,6 @@ export const Menu = () => {
   const handleDeleteClick = (event: React.UIEvent, item: ChatHistoryItem) => {
     event.preventDefault();
-
     setDialogContent({ type: 'delete', item });
   };
@@ -131,7 +129,7 @@ export const Menu = () => {
         </a>
       </div>
       <div className="text-bolt-elements-textPrimary font-medium pl-6 pr-5 my-2">Your Chats</div>
-      <div className="flex-1 overflow-scroll pl-4 pr-5 pb-5">
+      <div className="flex-1 overflow-auto pl-4 pr-5 pb-5">
        {list.length === 0 && <div className="pl-2 text-bolt-elements-textTertiary">No previous conversations</div>}
        <DialogRoot open={dialogContent !== null}>
          {binDates(list).map(({ category, items }) => (
@@ -143,6 +141,7 @@ export const Menu = () => {
             <HistoryItem
               key={item.id}
               item={item}
+              exportChat={exportChat}
               onDelete={(event) => handleDeleteClick(event, item)}
               onDuplicate={() => handleDuplicate(item.id)}
             />
@@ -186,4 +185,4 @@ export const Menu = () => {
       </div>
     </motion.div>
   );
-}
+};
app/components/ui/Tooltip.tsx ADDED
@@ -0,0 +1,73 @@
+import * as Tooltip from '@radix-ui/react-tooltip';
+
+interface TooltipProps {
+  tooltip: React.ReactNode;
+  children: React.ReactNode;
+  sideOffset?: number;
+  className?: string;
+  arrowClassName?: string;
+  tooltipStyle?: React.CSSProperties;
+  position?: 'top' | 'bottom' | 'left' | 'right';
+  maxWidth?: number;
+  delay?: number;
+}
+
+const WithTooltip = ({
+  tooltip,
+  children,
+  sideOffset = 5,
+  className = '',
+  arrowClassName = '',
+  tooltipStyle = {},
+  position = 'top',
+  maxWidth = 250,
+  delay = 0,
+}: TooltipProps) => {
+  return (
+    <Tooltip.Root delayDuration={delay}>
+      <Tooltip.Trigger asChild>{children}</Tooltip.Trigger>
+      <Tooltip.Portal>
+        <Tooltip.Content
+          side={position}
+          className={`
+            z-[2000]
+            px-2.5
+            py-1.5
+            max-h-[300px]
+            select-none
+            rounded-md
+            bg-bolt-elements-background-depth-3
+            text-bolt-elements-textPrimary
+            text-sm
+            leading-tight
+            shadow-lg
+            animate-in
+            fade-in-0
+            zoom-in-95
+            data-[state=closed]:animate-out
+            data-[state=closed]:fade-out-0
+            data-[state=closed]:zoom-out-95
+            ${className}
+          `}
+          sideOffset={sideOffset}
+          style={{
+            maxWidth,
+            ...tooltipStyle,
+          }}
+        >
+          <div className="break-words">{tooltip}</div>
+          <Tooltip.Arrow
+            className={`
+              fill-bolt-elements-background-depth-3
+              ${arrowClassName}
+            `}
+            width={12}
+            height={6}
+          />
+        </Tooltip.Content>
+      </Tooltip.Portal>
+    </Tooltip.Root>
+  );
+};
+
+export default WithTooltip;
app/components/workbench/EditorPanel.tsx CHANGED
@@ -239,7 +239,7 @@ export const EditorPanel = memo(
             <div className="i-ph:terminal-window-duotone text-lg" />
             Terminal {terminalCount > 1 && index}
           </button>
-        </React.Fragment>
+        </React.Fragment>
       )}
     </React.Fragment>
   );
@@ -255,6 +255,7 @@ export const EditorPanel = memo(
   </div>
   {Array.from({ length: terminalCount + 1 }, (_, index) => {
     const isActive = activeTerminal === index;
+
     if (index == 0) {
       logger.info('Starting bolt terminal');
@@ -273,6 +274,7 @@ export const EditorPanel = memo(
     />
   );
 }
+
 return (
   <Terminal
     key={index}
app/components/workbench/FileTree.tsx CHANGED
@@ -111,7 +111,7 @@ export const FileTree = memo(
   };
 
   return (
-    <div className={classNames('text-sm', className)}>
+    <div className={classNames('text-sm', className, 'overflow-y-auto')}>
       {filteredFileList.map((fileOrFolder) => {
         switch (fileOrFolder.kind) {
           case 'file': {
app/components/workbench/Workbench.client.tsx CHANGED
@@ -57,7 +57,6 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
   renderLogger.trace('Workbench');
 
   const [isSyncing, setIsSyncing] = useState(false);
-  const [isUploading, setIsUploading] = useState(false);
 
   const hasPreview = useStore(computed(workbenchStore.previews, (previews) => previews.length > 0));
   const showWorkbench = useStore(workbenchStore.showWorkbench);
@@ -120,60 +119,6 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
     }
   }, []);
 
-  const handleUploadFiles = useCallback(async () => {
-    setIsUploading(true);
-
-    try {
-      // const directoryHandle = await window.showDirectoryPicker();
-
-      // // First upload new files
-      // await workbenchStore.uploadFilesFromDisk(directoryHandle);
-
-      // // Get current files state
-      // const currentFiles = workbenchStore.files.get();
-
-      // // Create new modifications map with all files as "new"
-      // const newModifications = new Map();
-      // Object.entries(currentFiles).forEach(([path, file]) => {
-      //   if (file.type === 'file') {
-      //     newModifications.set(path, file.content);
-      //   }
-      // });
-
-      // // Update workbench state
-      // await workbenchStore.refreshFiles();
-      // workbenchStore.resetAllFileModifications();
-
-      // toast.success('Files uploaded successfully');
-      // } catch (error) {
-      //   toast.error('Failed to upload files');
-      // }
-      await handleUploadFilesFunc();
-    }
-
-    finally {
-      setIsUploading(false);
-    }
-  }, []);
-
-  async function handleUploadFilesFunc() {
-    try {
-      // First clean all statuses
-      await workbenchStore.saveAllFiles();
-      await workbenchStore.resetAllFileModifications();
-      await workbenchStore.refreshFiles();
-
-      // Now upload new files
-      const directoryHandle = await window.showDirectoryPicker();
-      await workbenchStore.uploadFilesFromDisk(directoryHandle);
-
-      toast.success('Files uploaded successfully');
-    } catch (error) {
-      console.error('Upload files error:', error);
-      toast.error('Failed to upload files');
-    }
-  }
-
   return (
     chatStarted && (
       <motion.div
@@ -213,10 +158,6 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
           {isSyncing ? <div className="i-ph:spinner" /> : <div className="i-ph:cloud-arrow-down" />}
           {isSyncing ? 'Syncing...' : 'Sync Files'}
         </PanelHeaderButton>
-        <PanelHeaderButton className="mr-1 text-sm" onClick={handleUploadFiles} disabled={isSyncing}>
-          {isSyncing ? <div className="i-ph:spinner" /> : <div className="i-ph:cloud-arrow-up" />}
-          {isSyncing ? 'Uploading...' : 'Upload Files'}
-        </PanelHeaderButton>
         <PanelHeaderButton
           className="mr-1 text-sm"
           onClick={() => {
@@ -233,16 +174,21 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
               'Please enter a name for your new GitHub repository:',
               'bolt-generated-project',
             );
+
             if (!repoName) {
               alert('Repository name is required. Push to GitHub cancelled.');
               return;
             }
+
            const githubUsername = prompt('Please enter your GitHub username:');
+
            if (!githubUsername) {
              alert('GitHub username is required. Push to GitHub cancelled.');
              return;
            }
+
            const githubToken = prompt('Please enter your GitHub personal access token:');
+
            if (!githubToken) {
              alert('GitHub token is required. Push to GitHub cancelled.');
              return;
app/lib/.server/llm/api-key.ts CHANGED
@@ -1,5 +1,7 @@
-// @ts-nocheck
-// Preventing TS checks with files presented in the video for a better presentation.
+/*
+ * @ts-nocheck
+ * Preventing TS checks with files presented in the video for a better presentation.
+ */
 import { env } from 'node:process';
 
 export function getAPIKey(cloudflareEnv: Env, provider: string, userApiKeys?: Record<string, string>) {
@@ -28,17 +30,19 @@ export function getAPIKey(cloudflareEnv: Env, provider: string, userApiKeys?: Re
     case 'OpenRouter':
       return env.OPEN_ROUTER_API_KEY || cloudflareEnv.OPEN_ROUTER_API_KEY;
     case 'Deepseek':
-      return env.DEEPSEEK_API_KEY || cloudflareEnv.DEEPSEEK_API_KEY
+      return env.DEEPSEEK_API_KEY || cloudflareEnv.DEEPSEEK_API_KEY;
     case 'Mistral':
-      return env.MISTRAL_API_KEY || cloudflareEnv.MISTRAL_API_KEY;
-    case "OpenAILike":
+      return env.MISTRAL_API_KEY || cloudflareEnv.MISTRAL_API_KEY;
+    case 'OpenAILike':
       return env.OPENAI_LIKE_API_KEY || cloudflareEnv.OPENAI_LIKE_API_KEY;
-    case "xAI":
+    case 'xAI':
       return env.XAI_API_KEY || cloudflareEnv.XAI_API_KEY;
-    case "Cohere":
+    case 'Cohere':
       return env.COHERE_API_KEY;
+    case 'AzureOpenAI':
+      return env.AZURE_OPENAI_API_KEY;
     default:
-      return "";
+      return '';
   }
 }
 
@@ -47,14 +51,17 @@ export function getBaseURL(cloudflareEnv: Env, provider: string) {
     case 'OpenAILike':
       return env.OPENAI_LIKE_API_BASE_URL || cloudflareEnv.OPENAI_LIKE_API_BASE_URL;
     case 'LMStudio':
-      return env.LMSTUDIO_API_BASE_URL || cloudflareEnv.LMSTUDIO_API_BASE_URL || "http://localhost:1234";
-    case 'Ollama':
-      let baseUrl = env.OLLAMA_API_BASE_URL || cloudflareEnv.OLLAMA_API_BASE_URL || "http://localhost:11434";
-      if (env.RUNNING_IN_DOCKER === 'true') {
-        baseUrl = baseUrl.replace("localhost", "host.docker.internal");
-      }
-      return baseUrl;
+      return env.LMSTUDIO_API_BASE_URL || cloudflareEnv.LMSTUDIO_API_BASE_URL || 'http://localhost:1234';
+    case 'Ollama': {
+      let baseUrl = env.OLLAMA_API_BASE_URL || cloudflareEnv.OLLAMA_API_BASE_URL || 'http://localhost:11434';
+
+      if (env.RUNNING_IN_DOCKER === 'true') {
+        baseUrl = baseUrl.replace('localhost', 'host.docker.internal');
+      }
+
+      return baseUrl;
+    }
     default:
-      return "";
+      return '';
   }
 }
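The `Ollama` branch of `getBaseURL` above resolves a configured URL with a local fallback, then rewrites `localhost` so a containerized app can reach the host's Ollama daemon. That logic can be isolated as a small pure function (a sketch mirroring the diff; `resolveOllamaBaseURL` is a name introduced here for illustration):

```typescript
// Resolve the Ollama base URL: fall back to the local default, and when
// running inside Docker rewrite `localhost` to `host.docker.internal`
// so requests reach the daemon on the host machine.
function resolveOllamaBaseURL(configured: string | undefined, runningInDocker: boolean): string {
  let baseUrl = configured || 'http://localhost:11434';

  if (runningInDocker) {
    baseUrl = baseUrl.replace('localhost', 'host.docker.internal');
  }

  return baseUrl;
}
```

Note that a URL already pointing at a non-localhost host (e.g. a named `ollama` container) passes through untouched, which is why the rewrite is safe to apply unconditionally when the Docker flag is set.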
app/lib/.server/llm/model.ts CHANGED
@@ -1,27 +1,29 @@
-// @ts-nocheck
-// Preventing TS checks with files presented in the video for a better presentation.
+/*
+ * @ts-nocheck
+ * Preventing TS checks with files presented in the video for a better presentation.
+ */
 import { getAPIKey, getBaseURL } from '~/lib/.server/llm/api-key';
 import { createAnthropic } from '@ai-sdk/anthropic';
 import { createOpenAI } from '@ai-sdk/openai';
 import { createGoogleGenerativeAI } from '@ai-sdk/google';
 import { ollama } from 'ollama-ai-provider';
-import { createOpenRouter } from "@openrouter/ai-sdk-provider";
+import { createOpenRouter } from '@openrouter/ai-sdk-provider';
 import { createMistral } from '@ai-sdk/mistral';
-import { createCohere } from '@ai-sdk/cohere'
+import { createCohere } from '@ai-sdk/cohere';
+import type { LanguageModelV1 } from 'ai';
 
-export const DEFAULT_NUM_CTX = process.env.DEFAULT_NUM_CTX ?
-  parseInt(process.env.DEFAULT_NUM_CTX, 10) :
-  32768;
+export const DEFAULT_NUM_CTX = process.env.DEFAULT_NUM_CTX ? parseInt(process.env.DEFAULT_NUM_CTX, 10) : 32768;
 
-export function getAnthropicModel(apiKey: string, model: string) {
+type OptionalApiKey = string | undefined;
+
+export function getAnthropicModel(apiKey: OptionalApiKey, model: string) {
   const anthropic = createAnthropic({
     apiKey,
   });
 
   return anthropic(model);
 }
-
-export function getOpenAILikeModel(baseURL: string, apiKey: string, model: string) {
+export function getOpenAILikeModel(baseURL: string, apiKey: OptionalApiKey, model: string) {
   const openai = createOpenAI({
     baseURL,
     apiKey,
@@ -30,7 +32,7 @@ export function getOpenAILikeModel(baseURL: string, apiKey: string, model: strin
   return openai(model);
 }
 
-export function getCohereAIModel(apiKey:string, model: string){
+export function getCohereAIModel(apiKey: OptionalApiKey, model: string) {
   const cohere = createCohere({
     apiKey,
   });
@@ -38,7 +40,7 @@ export function getCohereAIModel(apiKey:string, model: string){
   return cohere(model);
 }
 
-export function getOpenAIModel(apiKey: string, model: string) {
+export function getOpenAIModel(apiKey: OptionalApiKey, model: string) {
   const openai = createOpenAI({
     apiKey,
   });
@@ -46,15 +48,15 @@ export function getOpenAIModel(apiKey: string, model: string) {
   return openai(model);
 }
 
-export function getMistralModel(apiKey: string, model: string) {
+export function getMistralModel(apiKey: OptionalApiKey, model: string) {
   const mistral = createMistral({
-    apiKey
+    apiKey,
   });
 
   return mistral(model);
 }
 
-export function getGoogleModel(apiKey: string, model: string) {
+export function getGoogleModel(apiKey: OptionalApiKey, model: string) {
   const google = createGoogleGenerativeAI({
     apiKey,
   });
@@ -62,7 +64,7 @@ export function getGoogleModel(apiKey: string, model: string) {
   return google(model);
 }
 
-export function getGroqModel(apiKey: string, model: string) {
+export function getGroqModel(apiKey: OptionalApiKey, model: string) {
   const openai = createOpenAI({
     baseURL: 'https://api.groq.com/openai/v1',
     apiKey,
@@ -71,7 +73,7 @@ export function getGroqModel(apiKey: string, model: string) {
   return openai(model);
 }
 
-export function getHuggingFaceModel(apiKey: string, model: string) {
+export function getHuggingFaceModel(apiKey: OptionalApiKey, model: string) {
   const openai = createOpenAI({
     baseURL: 'https://api-inference.huggingface.co/v1/',
     apiKey,
@@ -81,15 +83,16 @@ export function getHuggingFaceModel(apiKey: string, model: string) {
 }
 
 export function getOllamaModel(baseURL: string, model: string) {
-  let Ollama = ollama(model, {
+  const ollamaInstance = ollama(model, {
     numCtx: DEFAULT_NUM_CTX,
-  });
+  }) as LanguageModelV1 & { config: any };
 
-  Ollama.config.baseURL = `${baseURL}/api`;
-  return Ollama;
+  ollamaInstance.config.baseURL = `${baseURL}/api`;
+
+  return ollamaInstance;
 }
 
-export function getDeepseekModel(apiKey: string, model: string) {
+export function getDeepseekModel(apiKey: OptionalApiKey, model: string) {
   const openai = createOpenAI({
     baseURL: 'https://api.deepseek.com/beta',
     apiKey,
@@ -98,9 +101,9 @@ export function getDeepseekModel(apiKey: string, model: string) {
   return openai(model);
 }
 
-export function getOpenRouterModel(apiKey: string, model: string) {
+export function getOpenRouterModel(apiKey: OptionalApiKey, model: string) {
   const openRouter = createOpenRouter({
-    apiKey
+    apiKey,
   });
 
   return openRouter.chat(model);
@@ -109,13 +112,13 @@ export function getOpenRouterModel(apiKey: string, model: string) {
 export function getLMStudioModel(baseURL: string, model: string) {
   const lmstudio = createOpenAI({
     baseUrl: `${baseURL}/v1`,
-    apiKey: "",
+    apiKey: '',
   });
 
   return lmstudio(model);
 }
 
-export function getXAIModel(apiKey: string, model: string) {
+export function getXAIModel(apiKey: OptionalApiKey, model: string) {
   const openai = createOpenAI({
     baseURL: 'https://api.x.ai/v1',
     apiKey,
@@ -125,11 +128,13 @@ export function getXAIModel(apiKey: string, model: string) {
 }
 
 export function getModel(provider: string, model: string, env: Env, apiKeys?: Record<string, string>) {
-  let apiKey; // Declare first
-  let baseURL;
+  /*
+   * let apiKey; // Declare first
+   * let baseURL;
+   */
 
-  apiKey = getAPIKey(env, provider, apiKeys); // Then assign
-  baseURL = getBaseURL(env, provider);
+  const apiKey = getAPIKey(env, provider, apiKeys); // Then assign
+  const baseURL = getBaseURL(env, provider);
 
   switch (provider) {
     case 'Anthropic':
@@ -159,4 +164,4 @@ export function getModel(provider: string, model: string, env: Env, apiKeys?: Re
     default:
       return getOllamaModel(baseURL, model);
   }
-}
+}
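The reformatted `DEFAULT_NUM_CTX` expression above reads the Ollama context window from the environment, falling back to 32768 (e.g. `DEFAULT_NUM_CTX=24576` caps VRAM usage). The resolution logic as a standalone sketch (`resolveNumCtx` is a name introduced here for illustration):

```typescript
// Mirror of the DEFAULT_NUM_CTX expression: parse the env value as a
// base-10 integer, falling back to 32768 when the variable is unset.
function resolveNumCtx(envValue: string | undefined): number {
  return envValue ? parseInt(envValue, 10) : 32768;
}
```

Note that, like the diff's expression, this passes a malformed value through as `NaN`; a hardened variant might guard with `Number.isFinite` before trusting the result.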
app/lib/.server/llm/stream-text.ts CHANGED
@@ -1,10 +1,11 @@
1
- // @ts-nocheck
2
- // Preventing TS checks with files presented in the video for a better presentation.
3
- import { streamText as _streamText, convertToCoreMessages } from 'ai';
 
4
  import { getModel } from '~/lib/.server/llm/model';
5
  import { MAX_TOKENS } from './constants';
6
  import { getSystemPrompt } from './prompts';
7
- import { MODEL_LIST, DEFAULT_MODEL, DEFAULT_PROVIDER, MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants';
8
 
9
  interface ToolResult<Name extends string, Args, Result> {
10
  toolCallId: string;
@@ -26,41 +27,41 @@ export type StreamingOptions = Omit<Parameters<typeof _streamText>[0], 'model'>;
26
 
27
  function extractPropertiesFromMessage(message: Message): { model: string; provider: string; content: string } {
28
  const textContent = Array.isArray(message.content)
29
- ? message.content.find(item => item.type === 'text')?.text || ''
30
  : message.content;
31
 
32
  const modelMatch = textContent.match(MODEL_REGEX);
33
  const providerMatch = textContent.match(PROVIDER_REGEX);
34
 
35
- // Extract model
36
- // const modelMatch = message.content.match(MODEL_REGEX);
 
 
37
  const model = modelMatch ? modelMatch[1] : DEFAULT_MODEL;
38
 
39
- // Extract provider
40
- // const providerMatch = message.content.match(PROVIDER_REGEX);
 
 
41
  const provider = providerMatch ? providerMatch[1] : DEFAULT_PROVIDER;
42
 
43
  const cleanedContent = Array.isArray(message.content)
44
- ? message.content.map(item => {
45
- if (item.type === 'text') {
46
- return {
47
- type: 'text',
48
- text: item.text?.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '')
49
- };
50
- }
51
- return item; // Preserve image_url and other types as is
52
- })
 
53
  : textContent.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '');
54
 
55
  return { model, provider, content: cleanedContent };
56
  }
57
 
58
- export function streamText(
59
- messages: Messages,
60
- env: Env,
61
- options?: StreamingOptions,
62
- apiKeys?: Record<string, string>
63
- ) {
64
  let currentModel = DEFAULT_MODEL;
65
  let currentProvider = DEFAULT_PROVIDER;
66
 
@@ ... @@
+// eslint-disable-next-line @typescript-eslint/ban-ts-comment
+// @ts-nocheck TODO: Provider proper types
+
+import { convertToCoreMessages, streamText as _streamText } from 'ai';
 import { getModel } from '~/lib/.server/llm/model';
 import { MAX_TOKENS } from './constants';
 import { getSystemPrompt } from './prompts';
+import { DEFAULT_MODEL, DEFAULT_PROVIDER, MODEL_LIST, MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants';
 
 interface ToolResult<Name extends string, Args, Result> {
   toolCallId: string;
@@ ... @@
 function extractPropertiesFromMessage(message: Message): { model: string; provider: string; content: string } {
   const textContent = Array.isArray(message.content)
+    ? message.content.find((item) => item.type === 'text')?.text || ''
     : message.content;
 
   const modelMatch = textContent.match(MODEL_REGEX);
   const providerMatch = textContent.match(PROVIDER_REGEX);
 
+  /*
+   * Extract model
+   * const modelMatch = message.content.match(MODEL_REGEX);
+   */
   const model = modelMatch ? modelMatch[1] : DEFAULT_MODEL;
 
+  /*
+   * Extract provider
+   * const providerMatch = message.content.match(PROVIDER_REGEX);
+   */
   const provider = providerMatch ? providerMatch[1] : DEFAULT_PROVIDER;
 
   const cleanedContent = Array.isArray(message.content)
+    ? message.content.map((item) => {
+        if (item.type === 'text') {
+          return {
+            type: 'text',
+            text: item.text?.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, ''),
+          };
+        }
+
+        return item; // Preserve image_url and other types as is
+      })
     : textContent.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '');
 
   return { model, provider, content: cleanedContent };
 }
 
+export function streamText(messages: Messages, env: Env, options?: StreamingOptions, apiKeys?: Record<string, string>) {
   let currentModel = DEFAULT_MODEL;
   let currentProvider = DEFAULT_PROVIDER;
@@ -76,15 +77,13 @@ export function streamText(
 
       return { ...message, content };
     }
+
    return message;
  });
 
  const modelDetails = MODEL_LIST.find((m) => m.name === currentModel);
 
-  const dynamicMaxTokens =
-    modelDetails && modelDetails.maxTokenAllowed
-      ? modelDetails.maxTokenAllowed
-      : MAX_TOKENS;
+  const dynamicMaxTokens = modelDetails && modelDetails.maxTokenAllowed ? modelDetails.maxTokenAllowed : MAX_TOKENS;
 
  return _streamText({
    ...options,
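The `extractPropertiesFromMessage` change above pulls `[Model: …]` / `[Provider: …]` tags out of the message text and falls back to defaults when no tag is present. A self-contained sketch of that extraction, using simplified regexes and default values standing in for the repo's actual `MODEL_REGEX`, `PROVIDER_REGEX`, `DEFAULT_MODEL`, and `DEFAULT_PROVIDER` (assumptions, not the real constants):

```typescript
// Simplified stand-ins for the constants imported from '~/utils/constants'.
const MODEL_REGEX = /\[Model: (.+?)\]/;
const PROVIDER_REGEX = /\[Provider: (.+?)\]/;
const DEFAULT_MODEL = 'claude-3-5-sonnet-latest';
const DEFAULT_PROVIDER = 'Anthropic';

// Extract the model/provider tags and return the text with the tags stripped,
// mirroring the string branch of extractPropertiesFromMessage.
function extractProperties(text: string): { model: string; provider: string; content: string } {
  const modelMatch = text.match(MODEL_REGEX);
  const providerMatch = text.match(PROVIDER_REGEX);

  return {
    model: modelMatch ? modelMatch[1] : DEFAULT_MODEL,
    provider: providerMatch ? providerMatch[1] : DEFAULT_PROVIDER,
    content: text.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '').trim(),
  };
}
```

The array branch in the diff applies the same `replace` calls per text item so that image parts survive untouched.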
app/lib/persistence/db.ts CHANGED
@@ -161,46 +161,48 @@ async function getUrlIds(db: IDBDatabase): Promise<string[]> {
 
 export async function forkChat(db: IDBDatabase, chatId: string, messageId: string): Promise<string> {
   const chat = await getMessages(db, chatId);
-  if (!chat) throw new Error('Chat not found');
+
+  if (!chat) {
+    throw new Error('Chat not found');
+  }
 
   // Find the index of the message to fork at
-  const messageIndex = chat.messages.findIndex(msg => msg.id === messageId);
-  if (messageIndex === -1) throw new Error('Message not found');
+  const messageIndex = chat.messages.findIndex((msg) => msg.id === messageId);
+
+  if (messageIndex === -1) {
+    throw new Error('Message not found');
+  }
 
   // Get messages up to and including the selected message
   const messages = chat.messages.slice(0, messageIndex + 1);
 
-  // Generate new IDs
-  const newId = await getNextId(db);
-  const urlId = await getUrlId(db, newId);
-
-  // Create the forked chat
-  await setMessages(
-    db,
-    newId,
-    messages,
-    urlId,
-    chat.description ? `${chat.description} (fork)` : 'Forked chat'
-  );
-
-  return urlId;
+  return createChatFromMessages(db, chat.description ? `${chat.description} (fork)` : 'Forked chat', messages);
 }
 
 export async function duplicateChat(db: IDBDatabase, id: string): Promise<string> {
   const chat = await getMessages(db, id);
+
   if (!chat) {
     throw new Error('Chat not found');
   }
 
+  return createChatFromMessages(db, `${chat.description || 'Chat'} (copy)`, chat.messages);
+}
+
+export async function createChatFromMessages(
+  db: IDBDatabase,
+  description: string,
+  messages: Message[],
+): Promise<string> {
   const newId = await getNextId(db);
   const newUrlId = await getUrlId(db, newId); // Get a new urlId for the duplicated chat
 
   await setMessages(
     db,
     newId,
-    chat.messages,
+    messages,
     newUrlId, // Use the new urlId
-    `${chat.description || 'Chat'} (copy)`
+    description,
   );
 
   return newUrlId; // Return the urlId instead of id for navigation
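The refactor above has `forkChat` keep only the messages up to and including the selected message before delegating to the new `createChatFromMessages` helper. The slicing step is easy to get off by one; a minimal illustration with a simplified message shape (assumed here, not the `ai` package's `Message` type):

```typescript
// Simplified message shape for illustration only.
interface Msg {
  id: string;
  content: string;
}

// Return the prefix of `messages` ending at (and including) `messageId`,
// mirroring the fork logic in forkChat above.
function messagesForFork(messages: Msg[], messageId: string): Msg[] {
  const messageIndex = messages.findIndex((msg) => msg.id === messageId);

  if (messageIndex === -1) {
    throw new Error('Message not found');
  }

  // slice's end index is exclusive, hence the +1 to include the selected message
  return messages.slice(0, messageIndex + 1);
}
```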
app/lib/persistence/useChatHistory.ts CHANGED
@@ -4,7 +4,15 @@ import { atom } from 'nanostores';
 import type { Message } from 'ai';
 import { toast } from 'react-toastify';
 import { workbenchStore } from '~/lib/stores/workbench';
-import { getMessages, getNextId, getUrlId, openDatabase, setMessages, duplicateChat } from './db';
+import {
+  getMessages,
+  getNextId,
+  getUrlId,
+  openDatabase,
+  setMessages,
+  duplicateChat,
+  createChatFromMessages,
+} from './db';
 
 export interface ChatHistoryItem {
   id: string;
@@ -99,7 +107,7 @@ export function useChatHistory() {
 
       await setMessages(db, chatId.get() as string, messages, urlId, description.get());
     },
-    duplicateCurrentChat: async (listItemId:string) => {
+    duplicateCurrentChat: async (listItemId: string) => {
       if (!db || (!mixedId && !listItemId)) {
         return;
       }
@@ -110,8 +118,48 @@
         toast.success('Chat duplicated successfully');
       } catch (error) {
         toast.error('Failed to duplicate chat');
+        console.log(error);
       }
-    }
+    },
+    importChat: async (description: string, messages: Message[]) => {
+      if (!db) {
+        return;
+      }
+
+      try {
+        const newId = await createChatFromMessages(db, description, messages);
+        window.location.href = `/chat/${newId}`;
+        toast.success('Chat imported successfully');
+      } catch (error) {
+        if (error instanceof Error) {
+          toast.error('Failed to import chat: ' + error.message);
+        } else {
+          toast.error('Failed to import chat');
+        }
+      }
+    },
+    exportChat: async (id = urlId) => {
+      if (!db || !id) {
+        return;
+      }
+
+      const chat = await getMessages(db, id);
+      const chatData = {
+        messages: chat.messages,
+        description: chat.description,
+        exportDate: new Date().toISOString(),
+      };
+
+      const blob = new Blob([JSON.stringify(chatData, null, 2)], { type: 'application/json' });
+      const url = URL.createObjectURL(blob);
+      const a = document.createElement('a');
+      a.href = url;
+      a.download = `chat-${new Date().toISOString()}.json`;
+      document.body.appendChild(a);
+      a.click();
+      document.body.removeChild(a);
+      URL.revokeObjectURL(url);
+    },
   };
 }
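The new `exportChat` above writes `{ messages, description, exportDate }` to a JSON file, and `importChat` feeds the same shape back into `createChatFromMessages`. A sketch of that serialization round trip, with a simplified `Chat` shape assumed for illustration (the browser Blob/download plumbing is omitted):

```typescript
// Simplified chat shape; the real ChatHistoryItem carries more fields.
interface Chat {
  messages: { id: string; content: string }[];
  description?: string;
}

// Produce the JSON payload exportChat writes to disk.
function serializeChat(chat: Chat): string {
  return JSON.stringify(
    { messages: chat.messages, description: chat.description, exportDate: new Date().toISOString() },
    null,
    2,
  );
}

// Recover the fields importChat needs from an exported file.
function deserializeChat(json: string): Chat {
  const data = JSON.parse(json);
  return { messages: data.messages, description: data.description };
}
```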
 
app/lib/runtime/action-runner.ts CHANGED
@@ -1,11 +1,10 @@
-import { WebContainer, type WebContainerProcess } from '@webcontainer/api';
+import { WebContainer } from '@webcontainer/api';
 import { atom, map, type MapStore } from 'nanostores';
 import * as nodePath from 'node:path';
 import type { BoltAction } from '~/types/actions';
 import { createScopedLogger } from '~/utils/logger';
 import { unreachable } from '~/utils/unreachable';
 import type { ActionCallbackData } from './message-parser';
-import type { ITerminal } from '~/types/terminal';
 import type { BoltShell } from '~/utils/shell';
 
 const logger = createScopedLogger('ActionRunner');
@@ -45,7 +44,6 @@ export class ActionRunner {
   constructor(webcontainerPromise: Promise<WebContainer>, getShellTerminal: () => BoltShell) {
     this.#webcontainer = webcontainerPromise;
     this.#shellTerminal = getShellTerminal;
-
   }
 
   addAction(data: ActionCallbackData) {
@@ -88,15 +86,16 @@
     if (action.executed) {
       return;
     }
+
     if (isStreaming && action.type !== 'file') {
       return;
     }
 
     this.#updateAction(actionId, { ...action, ...data.action, executed: !isStreaming });
 
-    return this.#currentExecutionPromise = this.#currentExecutionPromise
+    this.#currentExecutionPromise = this.#currentExecutionPromise
       .then(() => {
-        return this.#executeAction(actionId, isStreaming);
+        this.#executeAction(actionId, isStreaming);
       })
       .catch((error) => {
         console.error('Action failed:', error);
@@ -121,17 +120,23 @@
       case 'start': {
         // making the start app non blocking
 
-        this.#runStartAction(action).then(()=>this.#updateAction(actionId, { status: 'complete' }))
-        .catch(()=>this.#updateAction(actionId, { status: 'failed', error: 'Action failed' }))
-        // adding a delay to avoid any race condition between 2 start actions
-        // i am up for a better approch
-        await new Promise(resolve=>setTimeout(resolve,2000))
-        return
-        break;
+        this.#runStartAction(action)
+          .then(() => this.#updateAction(actionId, { status: 'complete' }))
+          .catch(() => this.#updateAction(actionId, { status: 'failed', error: 'Action failed' }));
+
+        /*
+         * adding a delay to avoid any race condition between 2 start actions
+         * i am up for a better approach
+         */
+        await new Promise((resolve) => setTimeout(resolve, 2000));
+
+        return;
       }
     }
 
-      this.#updateAction(actionId, { status: isStreaming ? 'running' : action.abortSignal.aborted ? 'aborted' : 'complete' });
+      this.#updateAction(actionId, {
+        status: isStreaming ? 'running' : action.abortSignal.aborted ? 'aborted' : 'complete',
+      });
     } catch (error) {
       this.#updateAction(actionId, { status: 'failed', error: 'Action failed' });
       logger.error(`[${action.type}]:Action failed\n\n`, error);
@@ -145,16 +150,19 @@
     if (action.type !== 'shell') {
       unreachable('Expected shell action');
     }
-    const shell = this.#shellTerminal()
-    await shell.ready()
+
+    const shell = this.#shellTerminal();
+    await shell.ready();
+
     if (!shell || !shell.terminal || !shell.process) {
       unreachable('Shell terminal not found');
     }
-    const resp = await shell.executeCommand(this.runnerId.get(), action.content)
-    logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`)
-    if (resp?.exitCode != 0) {
-      throw new Error("Failed To Execute Shell Command");
+
+    const resp = await shell.executeCommand(this.runnerId.get(), action.content);
+    logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`);
+
+    if (resp?.exitCode != 0) {
+      throw new Error('Failed To Execute Shell Command');
     }
   }
 
@@ -162,21 +170,26 @@
     if (action.type !== 'start') {
       unreachable('Expected shell action');
     }
+
     if (!this.#shellTerminal) {
       unreachable('Shell terminal not found');
     }
-    const shell = this.#shellTerminal()
-    await shell.ready()
+
+    const shell = this.#shellTerminal();
+    await shell.ready();
+
     if (!shell || !shell.terminal || !shell.process) {
       unreachable('Shell terminal not found');
    }
-    const resp = await shell.executeCommand(this.runnerId.get(), action.content)
-    logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`)
+
+    const resp = await shell.executeCommand(this.runnerId.get(), action.content);
+    logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`);
 
     if (resp?.exitCode != 0) {
-      throw new Error("Failed To Start Application");
+      throw new Error('Failed To Start Application');
    }
-    return resp
+
+    return resp;
  }
 
  async #runFileAction(action: ActionState) {
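`ActionRunner` above serializes actions by chaining each one onto `#currentExecutionPromise`, so actions run strictly in submission order even when they complete at different speeds. A minimal, framework-free sketch of that single-promise queue pattern (the names here are illustrative, not the class's actual members):

```typescript
// Single-promise execution queue: each enqueued task starts only after
// every previously enqueued task has settled.
function makeQueue() {
  let queue: Promise<void> = Promise.resolve();
  const order: number[] = [];

  const enqueue = (id: number, delayMs: number): Promise<void> => {
    queue = queue.then(
      () =>
        new Promise<void>((resolve) =>
          setTimeout(() => {
            order.push(id); // record completion order
            resolve();
          }, delayMs),
        ),
    );
    return queue;
  };

  return { enqueue, order };
}
```

Even though task 1 takes longest, it finishes first because task 2 is not started until task 1 resolves.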
app/lib/runtime/message-parser.ts CHANGED
@@ -55,7 +55,7 @@ interface MessageState {
 export class StreamingMessageParser {
   #messages = new Map<string, MessageState>();
 
-  constructor(private _options: StreamingMessageParserOptions = {}) { }
+  constructor(private _options: StreamingMessageParserOptions = {}) {}
 
   parse(messageId: string, input: string) {
     let state = this.#messages.get(messageId);
@@ -120,20 +120,20 @@
           i = closeIndex + ARTIFACT_ACTION_TAG_CLOSE.length;
         } else {
           if ('type' in currentAction && currentAction.type === 'file') {
-            let content = input.slice(i);
+            const content = input.slice(i);
 
             this._options.callbacks?.onActionStream?.({
               artifactId: currentArtifact.id,
               messageId,
               actionId: String(state.actionId - 1),
               action: {
-                ...currentAction as FileAction,
+                ...(currentAction as FileAction),
                 content,
                 filePath: currentAction.filePath,
               },
-
             });
           }
+
           break;
         }
       } else {
@@ -272,7 +272,7 @@
       }
 
       (actionAttributes as FileAction).filePath = filePath;
-    } else if (!(['shell', 'start'].includes(actionType))) {
+    } else if (!['shell', 'start'].includes(actionType)) {
       logger.warn(`Unknown action type '${actionType}'`);
     }
 
app/lib/stores/files.ts CHANGED
@@ -80,10 +80,6 @@ export class FilesStore {
     this.#modifiedFiles.clear();
   }
 
-  markFileAsNew(filePath: string) {
-    this.#modifiedFiles.set(filePath, '');
-  }
-
   async saveFile(filePath: string, content: string) {
     const webcontainer = await this.#webcontainer;
 
@@ -216,9 +212,5 @@ function isBinaryFile(buffer: Uint8Array | undefined) {
  * array buffer.
  */
 function convertToBuffer(view: Uint8Array): Buffer {
-  const buffer = new Uint8Array(view.buffer, view.byteOffset, view.byteLength);
-
-  Object.setPrototypeOf(buffer, Buffer.prototype);
-
-  return buffer as Buffer;
+  return Buffer.from(view.buffer, view.byteOffset, view.byteLength);
 }
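The `convertToBuffer` rewrite above replaces the prototype-swapping trick with `Buffer.from(arrayBuffer, byteOffset, length)`, which creates a `Buffer` over the same underlying memory as the view, honoring the view's offset. Runnable in Node:

```typescript
// Buffer.from(arrayBuffer, byteOffset, length) shares memory with the view
// instead of copying or mutating prototypes.
function convertToBuffer(view: Uint8Array): Buffer {
  return Buffer.from(view.buffer, view.byteOffset, view.byteLength);
}
```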
app/lib/stores/terminal.ts CHANGED
@@ -7,7 +7,7 @@ import { coloredText } from '~/utils/terminal';
 export class TerminalStore {
   #webcontainer: Promise<WebContainer>;
   #terminals: Array<{ terminal: ITerminal; process: WebContainerProcess }> = [];
-  #boltTerminal = newBoltShellProcess()
+  #boltTerminal = newBoltShellProcess();
 
   showTerminal: WritableAtom<boolean> = import.meta.hot?.data.showTerminal ?? atom(true);
 
@@ -27,8 +27,8 @@
   }
   async attachBoltTerminal(terminal: ITerminal) {
     try {
-      let wc = await this.#webcontainer
-      await this.#boltTerminal.init(wc, terminal)
+      const wc = await this.#webcontainer;
+      await this.#boltTerminal.init(wc, terminal);
     } catch (error: any) {
       terminal.write(coloredText.red('Failed to spawn bolt shell\n\n') + error.message);
       return;
app/lib/stores/workbench.ts CHANGED
@@ -11,9 +11,8 @@ import { PreviewsStore } from './previews';
 import { TerminalStore } from './terminal';
 import JSZip from 'jszip';
 import { saveAs } from 'file-saver';
-import { Octokit, type RestEndpointMethodTypes } from "@octokit/rest";
+import { Octokit, type RestEndpointMethodTypes } from '@octokit/rest';
 import * as nodePath from 'node:path';
-import type { WebContainerProcess } from '@webcontainer/api';
 import { extractRelativePath } from '~/utils/diff';
 
 export interface ArtifactState {
@@ -32,7 +31,6 @@ export type WorkbenchViewType = 'code' | 'preview';
 export class WorkbenchStore {
   #previewsStore = new PreviewsStore(webcontainer);
   #filesStore = new FilesStore(webcontainer);
-
   #editorStore = new EditorStore(this.#filesStore);
   #terminalStore = new TerminalStore(webcontainer);
 
@@ -43,7 +41,6 @@
   unsavedFiles: WritableAtom<Set<string>> = import.meta.hot?.data.unsavedFiles ?? atom(new Set<string>());
   modifiedFiles = new Set<string>();
   artifactIdList: string[] = [];
-  #boltTerminal: { terminal: ITerminal; process: WebContainerProcess } | undefined;
   #globalExecutionQueue = Promise.resolve();
   constructor() {
     if (import.meta.hot) {
@@ -55,7 +52,7 @@
   }
 
   addToExecutionQueue(callback: () => Promise<void>) {
-    this.#globalExecutionQueue = this.#globalExecutionQueue.then(() => callback())
+    this.#globalExecutionQueue = this.#globalExecutionQueue.then(() => callback());
   }
 
   get previews() {
@@ -97,7 +94,6 @@
     this.#terminalStore.attachTerminal(terminal);
   }
   attachBoltTerminal(terminal: ITerminal) {
-
     this.#terminalStore.attachBoltTerminal(terminal);
   }
 
@@ -262,7 +258,8 @@
     this.artifacts.setKey(messageId, { ...artifact, ...state });
   }
   addAction(data: ActionCallbackData) {
-    this._addAction(data)
+    this._addAction(data);
+
     // this.addToExecutionQueue(()=>this._addAction(data))
   }
   async _addAction(data: ActionCallbackData) {
@@ -279,10 +276,9 @@
 
   runAction(data: ActionCallbackData, isStreaming: boolean = false) {
     if (isStreaming) {
-      this._runAction(data, isStreaming)
-    }
-    else {
-      this.addToExecutionQueue(() => this._runAction(data, isStreaming))
+      this._runAction(data, isStreaming);
+    } else {
+      this.addToExecutionQueue(() => this._runAction(data, isStreaming));
     }
   }
   async _runAction(data: ActionCallbackData, isStreaming: boolean = false) {
@@ -293,16 +289,21 @@
     if (!artifact) {
       unreachable('Artifact not found');
     }
+
     if (data.action.type === 'file') {
-      let wc = await webcontainer
+      const wc = await webcontainer;
       const fullPath = nodePath.join(wc.workdir, data.action.filePath);
+
       if (this.selectedFile.value !== fullPath) {
         this.setSelectedFile(fullPath);
       }
+
       if (this.currentView.value !== 'code') {
         this.currentView.set('code');
       }
+
      const doc = this.#editorStore.documents.get()[fullPath];
+
      if (!doc) {
        await artifact.runner.runAction(data, isStreaming);
      }
@@ -382,63 +383,7 @@
     return syncedFiles;
   }
 
-  async uploadFilesFromDisk(sourceHandle: FileSystemDirectoryHandle) {
-    const loadedFiles = [];
-    const wc = await webcontainer;
-    const newFiles = {};
-
-    const processDirectory = async (handle: FileSystemDirectoryHandle, currentPath: string = '') => {
-      const entries = await Array.fromAsync(handle.values());
-
-      for (const entry of entries) {
-        const entryPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
-        const fullPath = `/${entryPath}`;
-
-        if (entry.kind === 'directory') {
-          await wc.fs.mkdir(fullPath, { recursive: true });
-          const subDirHandle = await handle.getDirectoryHandle(entry.name);
-          await processDirectory(subDirHandle, entryPath);
-        } else {
-          const file = await entry.getFile();
-          const content = await file.text();
-
-          // Write to WebContainer
-          await wc.fs.writeFile(fullPath, content);
-
-          // Mark file as new
-          this.#filesStore.markFileAsNew(fullPath);
-
-          // Update the files store with the current content
-          this.files.setKey(fullPath, { type: 'file', content, isBinary: false });
-
-          // Collect for editor store with actual content
-          newFiles[fullPath] = { type: 'file', content, isBinary: false };
-          loadedFiles.push(entryPath);
-        }
-      }
-    }
-
-    await processDirectory(sourceHandle);
-
-    return loadedFiles;
-  }
-
-  async refreshFiles() {
-    // Clear old state
-    this.modifiedFiles = new Set<string>();
-    this.artifactIdList = [];
-
-    // Reset stores
-    this.#filesStore = new FilesStore(webcontainer);
-    this.#editorStore = new EditorStore(this.#filesStore);
-
-    // Update UI state
-    this.currentView.set('code');
-    this.unsavedFiles.set(new Set<string>());
-  }
-
   async pushToGitHub(repoName: string, githubUsername: string, ghToken: string) {
-
     try {
       // Get the GitHub auth token from environment variables
       const githubToken = ghToken;
@@ -453,10 +398,11 @@
       const octokit = new Octokit({ auth: githubToken });
 
       // Check if the repository already exists before creating it
-      let repo: RestEndpointMethodTypes["repos"]["get"]["response"]['data']
+      let repo: RestEndpointMethodTypes['repos']['get']['response']['data'];
+
       try {
-        let resp = await octokit.repos.get({ owner: owner, repo: repoName });
-        repo = resp.data
+        const resp = await octokit.repos.get({ owner, repo: repoName });
+        repo = resp.data;
       } catch (error) {
         if (error instanceof Error && 'status' in error && error.status === 404) {
           // Repository doesn't exist, so create a new one
@@ -474,6 +420,7 @@
 
       // Get all files
       const files = this.files.get();
+
       if (!files || Object.keys(files).length === 0) {
         throw new Error('No files found to push');
       }
@@ -490,7 +437,9 @@
           });
           return { path: extractRelativePath(filePath), sha: blob.sha };
         }
-        })
+
+          return null;
+        }),
       );
 
       const validBlobs = blobs.filter(Boolean); // Filter out any undefined blobs
@@ -542,21 +491,6 @@
       console.error('Error pushing to GitHub:', error instanceof Error ? error.message : String(error));
     }
   }
-
-  async markFileAsModified(filePath: string) {
-    const file = this.#filesStore.getFile(filePath);
-    if (file?.type === 'file') {
-      // First collect all original content
-      const originalContent = file.content;
-      console.log(`Processing ${filePath}:`, originalContent);
-
-      // Then save modifications
-      await this.saveFile(filePath, originalContent);
-    }
-  }
-
-
-
 }
 
 export const workbenchStore = new WorkbenchStore();
app/routes/api.chat.ts CHANGED
@@ -1,5 +1,6 @@
-// @ts-nocheck
-// Preventing TS checks with files presented in the video for a better presentation.
+// eslint-disable-next-line @typescript-eslint/ban-ts-comment
+// @ts-nocheck TODO: Provider proper types
+
 import { type ActionFunctionArgs } from '@remix-run/cloudflare';
 import { MAX_RESPONSE_SEGMENTS, MAX_TOKENS } from '~/lib/.server/llm/constants';
 import { CONTINUE_PROMPT } from '~/lib/.server/llm/prompts';
@@ -14,14 +15,15 @@ function parseCookies(cookieHeader) {
   const cookies = {};
 
   // Split the cookie string by semicolons and spaces
-  const items = cookieHeader.split(";").map(cookie => cookie.trim());
+  const items = cookieHeader.split(';').map((cookie) => cookie.trim());
+
+  items.forEach((item) => {
+    const [name, ...rest] = item.split('=');
 
-  items.forEach(item => {
-    const [name, ...rest] = item.split("=");
     if (name && rest) {
       // Decode the name and value, and join value parts in case it contains '='
       const decodedName = decodeURIComponent(name.trim());
-      const decodedValue = decodeURIComponent(rest.join("=").trim());
+      const decodedValue = decodeURIComponent(rest.join('=').trim());
       cookies[decodedName] = decodedValue;
     }
   });
@@ -30,17 +32,15 @@
 }
 
 async function chatAction({ context, request }: ActionFunctionArgs) {
-
-  const { messages, imageData, model } = await request.json<{
-    messages: Messages,
-    imageData?: string[],
-    model: string
+  const { messages, model } = await request.json<{
+    messages: Messages;
+    model: string;
   }>();
 
-  const cookieHeader = request.headers.get("Cookie");
+  const cookieHeader = request.headers.get('Cookie');
 
   // Parse the cookie's value (returns an object or null if no cookie exists)
-  const apiKeys = JSON.parse(parseCookies(cookieHeader).apiKeys || "{}");
+  const apiKeys = JSON.parse(parseCookies(cookieHeader).apiKeys || '{}');
 
   const stream = new SwitchableStream();
 
@@ -87,7 +87,7 @@
       if (error.message?.includes('API key')) {
         throw new Response('Invalid or missing API key', {
           status: 401,
-          statusText: 'Unauthorized'
+          statusText: 'Unauthorized',
        });
      }
 
 
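The rewritten cookie parsing above can be sketched as a standalone function. This is a minimal sketch, not the route file's exact code: the real implementation builds the map inline inside `parseCookies` and also handles a `null` header; the `rest.length > 0` guard here is a slight tightening of the diff's `if (name && rest)` check.

```typescript
// Minimal sketch of the cookie parsing in the diff: split on ';', then split
// each pair on the first '=' only, re-joining the remainder so values that
// themselves contain '=' (e.g. URL-encoded JSON or base64) survive intact.
function parseCookies(cookieHeader: string): Record<string, string> {
  const cookies: Record<string, string> = {};

  const items = cookieHeader.split(';').map((cookie) => cookie.trim());

  items.forEach((item) => {
    const [name, ...rest] = item.split('=');

    if (name && rest.length > 0) {
      const decodedName = decodeURIComponent(name.trim());
      const decodedValue = decodeURIComponent(rest.join('=').trim());
      cookies[decodedName] = decodedValue;
    }
  });

  return cookies;
}

// Example: an apiKeys cookie whose JSON value contains '==' padding,
// which would be lost if the value were split on every '='.
const parsed = parseCookies('theme=dark; apiKeys=%7B%22openai%22%3A%22sk-abc%3D%3D%22%7D');
// parsed.theme === 'dark'
// JSON.parse(parsed.apiKeys).openai === 'sk-abc=='
```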
app/types/model.ts CHANGED
@@ -1,10 +1,10 @@
  import type { ModelInfo } from '~/utils/types';

  export type ProviderInfo = {
- staticModels: ModelInfo[],
- name: string,
- getDynamicModels?: () => Promise<ModelInfo[]>,
- getApiKeyLink?: string,
- labelForGetApiKey?: string,
- icon?:string,
  };

  import type { ModelInfo } from '~/utils/types';

  export type ProviderInfo = {
+ staticModels: ModelInfo[];
+ name: string;
+ getDynamicModels?: () => Promise<ModelInfo[]>;
+ getApiKeyLink?: string;
+ labelForGetApiKey?: string;
+ icon?: string;
  };
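The reformatted `ProviderInfo` type shows which fields a provider entry must supply and which are optional. A hypothetical entry, with `ModelInfo` mirrored inline so the snippet is self-contained (the real type lives in `~/utils/types`; `ExampleProvider` and its URL are invented for illustration):

```typescript
// Mirrored locally for a self-contained snippet; the real ModelInfo is in ~/utils/types.
type ModelInfo = { name: string; label: string; provider: string; maxTokenAllowed?: number };

type ProviderInfo = {
  staticModels: ModelInfo[];
  name: string;
  getDynamicModels?: () => Promise<ModelInfo[]>;
  getApiKeyLink?: string;
  labelForGetApiKey?: string;
  icon?: string;
};

// Hypothetical provider entry: only name and staticModels are required;
// dynamic-model fetching, key link, label, and icon are all optional.
const exampleProvider: ProviderInfo = {
  name: 'ExampleProvider',
  staticModels: [
    { name: 'example-model', label: 'Example Model', provider: 'ExampleProvider', maxTokenAllowed: 8000 },
  ],
  getApiKeyLink: 'https://example.com/keys',
};
```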
app/utils/constants.ts CHANGED
@@ -12,29 +12,42 @@ const PROVIDER_LIST: ProviderInfo[] = [
  {
  name: 'Anthropic',
  staticModels: [
- { name: 'claude-3-5-sonnet-latest', label: 'Claude 3.5 Sonnet (new)', provider: 'Anthropic', maxTokenAllowed: 8000 },
- { name: 'claude-3-5-sonnet-20240620', label: 'Claude 3.5 Sonnet (old)', provider: 'Anthropic', maxTokenAllowed: 8000 },
- { name: 'claude-3-5-haiku-latest', label: 'Claude 3.5 Haiku (new)', provider: 'Anthropic', maxTokenAllowed: 8000 },
  { name: 'claude-3-opus-latest', label: 'Claude 3 Opus', provider: 'Anthropic', maxTokenAllowed: 8000 },
  { name: 'claude-3-sonnet-20240229', label: 'Claude 3 Sonnet', provider: 'Anthropic', maxTokenAllowed: 8000 },
- { name: 'claude-3-haiku-20240307', label: 'Claude 3 Haiku', provider: 'Anthropic', maxTokenAllowed: 8000 }
  ],
- getApiKeyLink: "https://console.anthropic.com/settings/keys",
  },
  {
  name: 'Ollama',
  staticModels: [],
  getDynamicModels: getOllamaModels,
- getApiKeyLink: "https://ollama.com/download",
- labelForGetApiKey: "Download Ollama",
- icon: "i-ph:cloud-arrow-down",
- }, {
  name: 'OpenAILike',
- staticModels: [
- { name: 'o1-mini', label: 'o1-mini', provider: 'OpenAILike' },
- { name: 'gpt-4o-mini', label: 'GPT-4o Mini', provider: 'OpenAILike' },
- ],
- getDynamicModels: getOpenAILikeModels
  },
  {
  name: 'Cohere',
@@ -50,7 +63,7 @@ const PROVIDER_LIST: ProviderInfo[] = [
  { name: 'c4ai-aya-expanse-8b', label: 'c4AI Aya Expanse 8b', provider: 'Cohere', maxTokenAllowed: 4096 },
  { name: 'c4ai-aya-expanse-32b', label: 'c4AI Aya Expanse 32b', provider: 'Cohere', maxTokenAllowed: 4096 },
  ],
- getApiKeyLink: 'https://dashboard.cohere.com/api-keys'
  },
  {
  name: 'OpenRouter',
@@ -59,50 +72,145 @@ const PROVIDER_LIST: ProviderInfo[] = [
  {
  name: 'anthropic/claude-3.5-sonnet',
  label: 'Anthropic: Claude 3.5 Sonnet (OpenRouter)',
- provider: 'OpenRouter'
- , maxTokenAllowed: 8000
  },
- { name: 'anthropic/claude-3-haiku', label: 'Anthropic: Claude 3 Haiku (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
- { name: 'deepseek/deepseek-coder', label: 'Deepseek-Coder V2 236B (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
- { name: 'google/gemini-flash-1.5', label: 'Google Gemini Flash 1.5 (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
- { name: 'google/gemini-pro-1.5', label: 'Google Gemini Pro 1.5 (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
  { name: 'x-ai/grok-beta', label: 'xAI Grok Beta (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
- { name: 'mistralai/mistral-nemo', label: 'OpenRouter Mistral Nemo (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
- { name: 'qwen/qwen-110b-chat', label: 'OpenRouter Qwen 110b Chat (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
- { name: 'cohere/command', label: 'Cohere Command (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 4096 }
  ],
  getDynamicModels: getOpenRouterModels,
  getApiKeyLink: 'https://openrouter.ai/settings/keys',
-
- }, {
  name: 'Google',
- staticModels: [
- { name: 'gemini-exp-1121', label: 'Gemini Experimental 1121', provider: 'Google' },
- { name: 'gemini-1.5-pro-002', label: 'Gemini 1.5 Pro 002', provider: 'Google' },
- { name: 'gemini-1.5-flash-latest', label: 'Gemini 1.5 Flash', provider: 'Google' },
- { name: 'gemini-1.5-pro-latest', label: 'Gemini 1.5 Pro', provider: 'Google' }
  ],
- getApiKeyLink: 'https://aistudio.google.com/app/apikey'
- }, {
  name: 'Groq',
  staticModels: [
  { name: 'llama-3.1-70b-versatile', label: 'Llama 3.1 70b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
  { name: 'llama-3.1-8b-instant', label: 'Llama 3.1 8b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
  { name: 'llama-3.2-11b-vision-preview', label: 'Llama 3.2 11b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
  { name: 'llama-3.2-3b-preview', label: 'Llama 3.2 3b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
- { name: 'llama-3.2-1b-preview', label: 'Llama 3.2 1b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 }
  ],
- getApiKeyLink: 'https://console.groq.com/keys'
  },
  {
  name: 'HuggingFace',
  staticModels: [
- { name: 'Qwen/Qwen2.5-Coder-32B-Instruct', label: 'Qwen2.5-Coder-32B-Instruct (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 },
- { name: '01-ai/Yi-1.5-34B-Chat', label: 'Yi-1.5-34B-Chat (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 },
- { name: 'codellama/CodeLlama-34b-Instruct-hf', label: 'CodeLlama-34b-Instruct (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 },
- { name: 'NousResearch/Hermes-3-Llama-3.1-8B', label: 'Hermes-3-Llama-3.1-8B (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 }
  ],
- getApiKeyLink: 'https://huggingface.co/settings/tokens'
  },

  {
@@ -111,23 +219,24 @@ const PROVIDER_LIST: ProviderInfo[] = [
  { name: 'gpt-4o-mini', label: 'GPT-4o Mini', provider: 'OpenAI', maxTokenAllowed: 8000 },
  { name: 'gpt-4-turbo', label: 'GPT-4 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 },
  { name: 'gpt-4', label: 'GPT-4', provider: 'OpenAI', maxTokenAllowed: 8000 },
- { name: 'gpt-3.5-turbo', label: 'GPT-3.5 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 }
  ],
- getApiKeyLink: "https://platform.openai.com/api-keys",
- }, {
  name: 'xAI',
- staticModels: [
- { name: 'grok-beta', label: 'xAI Grok Beta', provider: 'xAI', maxTokenAllowed: 8000 }
- ],
- getApiKeyLink: 'https://docs.x.ai/docs/quickstart#creating-an-api-key'
- }, {
  name: 'Deepseek',
  staticModels: [
  { name: 'deepseek-coder', label: 'Deepseek-Coder', provider: 'Deepseek', maxTokenAllowed: 8000 },
- { name: 'deepseek-chat', label: 'Deepseek-Chat', provider: 'Deepseek', maxTokenAllowed: 8000 }
  ],
- getApiKeyLink: 'https://platform.deepseek.com/api_keys'
- }, {
  name: 'Mistral',
  staticModels: [
  { name: 'open-mistral-7b', label: 'Mistral 7B', provider: 'Mistral', maxTokenAllowed: 8000 },
@@ -138,27 +247,29 @@ const PROVIDER_LIST: ProviderInfo[] = [
  { name: 'ministral-8b-latest', label: 'Mistral 8B', provider: 'Mistral', maxTokenAllowed: 8000 },
  { name: 'mistral-small-latest', label: 'Mistral Small', provider: 'Mistral', maxTokenAllowed: 8000 },
  { name: 'codestral-latest', label: 'Codestral', provider: 'Mistral', maxTokenAllowed: 8000 },
- { name: 'mistral-large-latest', label: 'Mistral Large Latest', provider: 'Mistral', maxTokenAllowed: 8000 }
  ],
- getApiKeyLink: 'https://console.mistral.ai/api-keys/'
- }, {
  name: 'LMStudio',
  staticModels: [],
  getDynamicModels: getLMStudioModels,
  getApiKeyLink: 'https://lmstudio.ai/',
  labelForGetApiKey: 'Get LMStudio',
- icon: "i-ph:cloud-arrow-down",
- }
  ];

  export const DEFAULT_PROVIDER = PROVIDER_LIST[0];

- const staticModels: ModelInfo[] = PROVIDER_LIST.map(p => p.staticModels).flat();

  export let MODEL_LIST: ModelInfo[] = [...staticModels];

  const getOllamaBaseUrl = () => {
  const defaultBaseUrl = import.meta.env.OLLAMA_API_BASE_URL || 'http://localhost:11434';
  // Check if we're in the browser
  if (typeof window !== 'undefined') {
  // Frontend always uses localhost
@@ -168,47 +279,54 @@ const getOllamaBaseUrl = () => {
  // Backend: Check if we're running in Docker
  const isDocker = process.env.RUNNING_IN_DOCKER === 'true';

- return isDocker
- ? defaultBaseUrl.replace('localhost', 'host.docker.internal')
- : defaultBaseUrl;
  };

  async function getOllamaModels(): Promise<ModelInfo[]> {
  try {
- const base_url = getOllamaBaseUrl();
- const response = await fetch(`${base_url}/api/tags`);
- const data = await response.json() as OllamaApiResponse;

  return data.models.map((model: OllamaModel) => ({
  name: model.name,
  label: `${model.name} (${model.details.parameter_size})`,
  provider: 'Ollama',
- maxTokenAllowed:8000,
  }));
  } catch (e) {
  return [];
  }
  }

  async function getOpenAILikeModels(): Promise<ModelInfo[]> {
  try {
- const base_url = import.meta.env.OPENAI_LIKE_API_BASE_URL || '';
- if (!base_url) {
  return [];
  }
- const api_key = import.meta.env.OPENAI_LIKE_API_KEY ?? '';
- const response = await fetch(`${base_url}/models`, {
  headers: {
- Authorization: `Bearer ${api_key}`
- }
  });
- const res = await response.json() as any;
  return res.data.map((model: any) => ({
  name: model.id,
  label: model.id,
- provider: 'OpenAILike'
  }));
  } catch (e) {
  return [];
  }
  }
@@ -221,51 +339,71 @@ type OpenRouterModelsResponse = {
  pricing: {
  prompt: number;
  completion: number;
- }
- }[]
  };

  async function getOpenRouterModels(): Promise<ModelInfo[]> {
- const data: OpenRouterModelsResponse = await (await fetch('https://openrouter.ai/api/v1/models', {
- headers: {
- 'Content-Type': 'application/json'
- }
- })).json();

- return data.data.sort((a, b) => a.name.localeCompare(b.name)).map(m => ({
- name: m.id,
- label: `${m.name} - in:$${(m.pricing.prompt * 1_000_000).toFixed(
- 2)} out:$${(m.pricing.completion * 1_000_000).toFixed(2)} - context ${Math.floor(
- m.context_length / 1000)}k`,
- provider: 'OpenRouter',
- maxTokenAllowed:8000,
- }));
  }

  async function getLMStudioModels(): Promise<ModelInfo[]> {
  try {
- const base_url = import.meta.env.LMSTUDIO_API_BASE_URL || 'http://localhost:1234';
- const response = await fetch(`${base_url}/v1/models`);
- const data = await response.json() as any;
  return data.data.map((model: any) => ({
  name: model.id,
  label: model.id,
- provider: 'LMStudio'
  }));
  } catch (e) {
  return [];
  }
  }

-
-
  async function initializeModelList(): Promise<ModelInfo[]> {
- MODEL_LIST = [...(await Promise.all(
- PROVIDER_LIST
- .filter((p): p is ProviderInfo & { getDynamicModels: () => Promise<ModelInfo[]> } => !!p.getDynamicModels)
- .map(p => p.getDynamicModels())))
- .flat(), ...staticModels];
  return MODEL_LIST;
  }

- export { getOllamaModels, getOpenAILikeModels, getLMStudioModels, initializeModelList, getOpenRouterModels, PROVIDER_LIST };
  {
  name: 'Anthropic',
  staticModels: [
+ {
+ name: 'claude-3-5-sonnet-latest',
+ label: 'Claude 3.5 Sonnet (new)',
+ provider: 'Anthropic',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'claude-3-5-sonnet-20240620',
+ label: 'Claude 3.5 Sonnet (old)',
+ provider: 'Anthropic',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'claude-3-5-haiku-latest',
+ label: 'Claude 3.5 Haiku (new)',
+ provider: 'Anthropic',
+ maxTokenAllowed: 8000,
+ },
  { name: 'claude-3-opus-latest', label: 'Claude 3 Opus', provider: 'Anthropic', maxTokenAllowed: 8000 },
  { name: 'claude-3-sonnet-20240229', label: 'Claude 3 Sonnet', provider: 'Anthropic', maxTokenAllowed: 8000 },
+ { name: 'claude-3-haiku-20240307', label: 'Claude 3 Haiku', provider: 'Anthropic', maxTokenAllowed: 8000 },
  ],
+ getApiKeyLink: 'https://console.anthropic.com/settings/keys',
  },
  {
  name: 'Ollama',
  staticModels: [],
  getDynamicModels: getOllamaModels,
+ getApiKeyLink: 'https://ollama.com/download',
+ labelForGetApiKey: 'Download Ollama',
+ icon: 'i-ph:cloud-arrow-down',
+ },
+ {
  name: 'OpenAILike',
+ staticModels: [],
+ getDynamicModels: getOpenAILikeModels,
  },
  {
  name: 'Cohere',
  { name: 'c4ai-aya-expanse-8b', label: 'c4AI Aya Expanse 8b', provider: 'Cohere', maxTokenAllowed: 4096 },
  { name: 'c4ai-aya-expanse-32b', label: 'c4AI Aya Expanse 32b', provider: 'Cohere', maxTokenAllowed: 4096 },
  ],
+ getApiKeyLink: 'https://dashboard.cohere.com/api-keys',
  },
  {
  name: 'OpenRouter',
  {
  name: 'anthropic/claude-3.5-sonnet',
  label: 'Anthropic: Claude 3.5 Sonnet (OpenRouter)',
+ provider: 'OpenRouter',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'anthropic/claude-3-haiku',
+ label: 'Anthropic: Claude 3 Haiku (OpenRouter)',
+ provider: 'OpenRouter',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'deepseek/deepseek-coder',
+ label: 'Deepseek-Coder V2 236B (OpenRouter)',
+ provider: 'OpenRouter',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'google/gemini-flash-1.5',
+ label: 'Google Gemini Flash 1.5 (OpenRouter)',
+ provider: 'OpenRouter',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'google/gemini-pro-1.5',
+ label: 'Google Gemini Pro 1.5 (OpenRouter)',
+ provider: 'OpenRouter',
+ maxTokenAllowed: 8000,
+ },
  { name: 'x-ai/grok-beta', label: 'xAI Grok Beta (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
+ {
+ name: 'mistralai/mistral-nemo',
+ label: 'OpenRouter Mistral Nemo (OpenRouter)',
+ provider: 'OpenRouter',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'qwen/qwen-110b-chat',
+ label: 'OpenRouter Qwen 110b Chat (OpenRouter)',
+ provider: 'OpenRouter',
+ maxTokenAllowed: 8000,
+ },
+ { name: 'cohere/command', label: 'Cohere Command (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 4096 },
  ],
  getDynamicModels: getOpenRouterModels,
  getApiKeyLink: 'https://openrouter.ai/settings/keys',
+ },
+ {
  name: 'Google',
+ staticModels: [
+ { name: 'gemini-1.5-flash-latest', label: 'Gemini 1.5 Flash', provider: 'Google', maxTokenAllowed: 8192 },
+ { name: 'gemini-1.5-flash-002', label: 'Gemini 1.5 Flash-002', provider: 'Google', maxTokenAllowed: 8192 },
+ { name: 'gemini-1.5-flash-8b', label: 'Gemini 1.5 Flash-8b', provider: 'Google', maxTokenAllowed: 8192 },
+ { name: 'gemini-1.5-pro-latest', label: 'Gemini 1.5 Pro', provider: 'Google', maxTokenAllowed: 8192 },
+ { name: 'gemini-1.5-pro-002', label: 'Gemini 1.5 Pro-002', provider: 'Google', maxTokenAllowed: 8192 },
+ { name: 'gemini-exp-1121', label: 'Gemini exp-1121', provider: 'Google', maxTokenAllowed: 8192 },
  ],
+ getApiKeyLink: 'https://aistudio.google.com/app/apikey',
+ },
+ {
  name: 'Groq',
  staticModels: [
  { name: 'llama-3.1-70b-versatile', label: 'Llama 3.1 70b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
  { name: 'llama-3.1-8b-instant', label: 'Llama 3.1 8b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
  { name: 'llama-3.2-11b-vision-preview', label: 'Llama 3.2 11b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
  { name: 'llama-3.2-3b-preview', label: 'Llama 3.2 3b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
+ { name: 'llama-3.2-1b-preview', label: 'Llama 3.2 1b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
  ],
+ getApiKeyLink: 'https://console.groq.com/keys',
  },
  {
  name: 'HuggingFace',
  staticModels: [
+ {
+ name: 'Qwen/Qwen2.5-Coder-32B-Instruct',
+ label: 'Qwen2.5-Coder-32B-Instruct (HuggingFace)',
+ provider: 'HuggingFace',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: '01-ai/Yi-1.5-34B-Chat',
+ label: 'Yi-1.5-34B-Chat (HuggingFace)',
+ provider: 'HuggingFace',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'codellama/CodeLlama-34b-Instruct-hf',
+ label: 'CodeLlama-34b-Instruct (HuggingFace)',
+ provider: 'HuggingFace',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'NousResearch/Hermes-3-Llama-3.1-8B',
+ label: 'Hermes-3-Llama-3.1-8B (HuggingFace)',
+ provider: 'HuggingFace',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'Qwen/Qwen2.5-Coder-32B-Instruct',
+ label: 'Qwen2.5-Coder-32B-Instruct (HuggingFace)',
+ provider: 'HuggingFace',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'Qwen/Qwen2.5-72B-Instruct',
+ label: 'Qwen2.5-72B-Instruct (HuggingFace)',
+ provider: 'HuggingFace',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'meta-llama/Llama-3.1-70B-Instruct',
+ label: 'Llama-3.1-70B-Instruct (HuggingFace)',
+ provider: 'HuggingFace',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'meta-llama/Llama-3.1-405B',
+ label: 'Llama-3.1-405B (HuggingFace)',
+ provider: 'HuggingFace',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: '01-ai/Yi-1.5-34B-Chat',
+ label: 'Yi-1.5-34B-Chat (HuggingFace)',
+ provider: 'HuggingFace',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'codellama/CodeLlama-34b-Instruct-hf',
+ label: 'CodeLlama-34b-Instruct (HuggingFace)',
+ provider: 'HuggingFace',
+ maxTokenAllowed: 8000,
+ },
+ {
+ name: 'NousResearch/Hermes-3-Llama-3.1-8B',
+ label: 'Hermes-3-Llama-3.1-8B (HuggingFace)',
+ provider: 'HuggingFace',
+ maxTokenAllowed: 8000,
+ },
  ],
+ getApiKeyLink: 'https://huggingface.co/settings/tokens',
  },

  {
  { name: 'gpt-4o-mini', label: 'GPT-4o Mini', provider: 'OpenAI', maxTokenAllowed: 8000 },
  { name: 'gpt-4-turbo', label: 'GPT-4 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 },
  { name: 'gpt-4', label: 'GPT-4', provider: 'OpenAI', maxTokenAllowed: 8000 },
+ { name: 'gpt-3.5-turbo', label: 'GPT-3.5 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 },
  ],
+ getApiKeyLink: 'https://platform.openai.com/api-keys',
+ },
+ {
  name: 'xAI',
+ staticModels: [{ name: 'grok-beta', label: 'xAI Grok Beta', provider: 'xAI', maxTokenAllowed: 8000 }],
+ getApiKeyLink: 'https://docs.x.ai/docs/quickstart#creating-an-api-key',
+ },
+ {
  name: 'Deepseek',
  staticModels: [
  { name: 'deepseek-coder', label: 'Deepseek-Coder', provider: 'Deepseek', maxTokenAllowed: 8000 },
+ { name: 'deepseek-chat', label: 'Deepseek-Chat', provider: 'Deepseek', maxTokenAllowed: 8000 },
  ],
+ getApiKeyLink: 'https://platform.deepseek.com/apiKeys',
+ },
+ {
  name: 'Mistral',
  staticModels: [
  { name: 'open-mistral-7b', label: 'Mistral 7B', provider: 'Mistral', maxTokenAllowed: 8000 },
  { name: 'ministral-8b-latest', label: 'Mistral 8B', provider: 'Mistral', maxTokenAllowed: 8000 },
  { name: 'mistral-small-latest', label: 'Mistral Small', provider: 'Mistral', maxTokenAllowed: 8000 },
  { name: 'codestral-latest', label: 'Codestral', provider: 'Mistral', maxTokenAllowed: 8000 },
+ { name: 'mistral-large-latest', label: 'Mistral Large Latest', provider: 'Mistral', maxTokenAllowed: 8000 },
  ],
+ getApiKeyLink: 'https://console.mistral.ai/api-keys/',
+ },
+ {
  name: 'LMStudio',
  staticModels: [],
  getDynamicModels: getLMStudioModels,
  getApiKeyLink: 'https://lmstudio.ai/',
  labelForGetApiKey: 'Get LMStudio',
+ icon: 'i-ph:cloud-arrow-down',
+ },
  ];

  export const DEFAULT_PROVIDER = PROVIDER_LIST[0];

+ const staticModels: ModelInfo[] = PROVIDER_LIST.map((p) => p.staticModels).flat();

  export let MODEL_LIST: ModelInfo[] = [...staticModels];

  const getOllamaBaseUrl = () => {
  const defaultBaseUrl = import.meta.env.OLLAMA_API_BASE_URL || 'http://localhost:11434';
+
  // Check if we're in the browser
  if (typeof window !== 'undefined') {
  // Frontend always uses localhost

  // Backend: Check if we're running in Docker
  const isDocker = process.env.RUNNING_IN_DOCKER === 'true';

+ return isDocker ? defaultBaseUrl.replace('localhost', 'host.docker.internal') : defaultBaseUrl;
  };
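The `getOllamaBaseUrl` logic above can be sketched as a pure function. This is an illustration only: the environment checks (`typeof window`, `process.env.RUNNING_IN_DOCKER`) are lifted into parameters so the branching is testable, whereas the real helper reads them from module scope.

```typescript
// Pick the Ollama endpoint: the browser always talks to the configured
// localhost URL, while a Dockerised backend rewrites 'localhost' to
// 'host.docker.internal' so the container can reach Ollama on the host.
function resolveOllamaBaseUrl(defaultBaseUrl: string, isBrowser: boolean, isDocker: boolean): string {
  if (isBrowser) {
    // Frontend always uses the default (localhost) URL.
    return defaultBaseUrl;
  }

  return isDocker ? defaultBaseUrl.replace('localhost', 'host.docker.internal') : defaultBaseUrl;
}

// resolveOllamaBaseUrl('http://localhost:11434', false, true)
//   → 'http://host.docker.internal:11434'
```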

  async function getOllamaModels(): Promise<ModelInfo[]> {
+ if (typeof window === 'undefined') {
+ return [];
+ }
+
  try {
+ const baseUrl = getOllamaBaseUrl();
+ const response = await fetch(`${baseUrl}/api/tags`);
+ const data = (await response.json()) as OllamaApiResponse;

  return data.models.map((model: OllamaModel) => ({
  name: model.name,
  label: `${model.name} (${model.details.parameter_size})`,
  provider: 'Ollama',
+ maxTokenAllowed: 8000,
  }));
  } catch (e) {
+ console.error('Error getting Ollama models:', e);
  return [];
  }
  }

  async function getOpenAILikeModels(): Promise<ModelInfo[]> {
  try {
+ const baseUrl = import.meta.env.OPENAI_LIKE_API_BASE_URL || '';
+
+ if (!baseUrl) {
  return [];
  }
+
+ const apiKey = import.meta.env.OPENAI_LIKE_API_KEY ?? '';
+ const response = await fetch(`${baseUrl}/models`, {
  headers: {
+ Authorization: `Bearer ${apiKey}`,
+ },
  });
+ const res = (await response.json()) as any;
+
  return res.data.map((model: any) => ({
  name: model.id,
  label: model.id,
+ provider: 'OpenAILike',
  }));
  } catch (e) {
+ console.error('Error getting OpenAILike models:', e);
  return [];
  }
  }

  pricing: {
  prompt: number;
  completion: number;
+ };
+ }[];
  };

  async function getOpenRouterModels(): Promise<ModelInfo[]> {
+ const data: OpenRouterModelsResponse = await (
+ await fetch('https://openrouter.ai/api/v1/models', {
+ headers: {
+ 'Content-Type': 'application/json',
+ },
+ })
+ ).json();

+ return data.data
+ .sort((a, b) => a.name.localeCompare(b.name))
+ .map((m) => ({
+ name: m.id,
+ label: `${m.name} - in:$${(m.pricing.prompt * 1_000_000).toFixed(
+ 2,
+ )} out:$${(m.pricing.completion * 1_000_000).toFixed(2)} - context ${Math.floor(m.context_length / 1000)}k`,
+ provider: 'OpenRouter',
+ maxTokenAllowed: 8000,
+ }));
  }

  async function getLMStudioModels(): Promise<ModelInfo[]> {
+ if (typeof window === 'undefined') {
+ return [];
+ }
+
  try {
+ const baseUrl = import.meta.env.LMSTUDIO_API_BASE_URL || 'http://localhost:1234';
+ const response = await fetch(`${baseUrl}/v1/models`);
+ const data = (await response.json()) as any;
+
  return data.data.map((model: any) => ({
  name: model.id,
  label: model.id,
+ provider: 'LMStudio',
  }));
  } catch (e) {
+ console.error('Error getting LMStudio models:', e);
  return [];
  }
  }

  async function initializeModelList(): Promise<ModelInfo[]> {
+ MODEL_LIST = [
+ ...(
+ await Promise.all(
+ PROVIDER_LIST.filter(
+ (p): p is ProviderInfo & { getDynamicModels: () => Promise<ModelInfo[]> } => !!p.getDynamicModels,
+ ).map((p) => p.getDynamicModels()),
+ )
+ ).flat(),
+ ...staticModels,
+ ];
  return MODEL_LIST;
  }

+ export {
+ getOllamaModels,
+ getOpenAILikeModels,
+ getLMStudioModels,
+ initializeModelList,
+ getOpenRouterModels,
+ PROVIDER_LIST,
+ };
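The reformatted `initializeModelList` fetches every provider's dynamic models in parallel, flattens them, and places them ahead of the static models. A self-contained sketch of the same pattern — the provider list and static models are passed as parameters here purely for illustration, whereas the real function mutates the module-level `MODEL_LIST`:

```typescript
type ModelInfo = { name: string; label: string; provider: string };
type ProviderInfo = { name: string; staticModels: ModelInfo[]; getDynamicModels?: () => Promise<ModelInfo[]> };

// Providers with a getDynamicModels hook (Ollama, OpenAILike, OpenRouter,
// LMStudio in the real list) are queried concurrently via Promise.all; the
// type-guard filter narrows each entry so getDynamicModels is non-optional.
async function initializeModelList(providers: ProviderInfo[], staticModels: ModelInfo[]): Promise<ModelInfo[]> {
  const dynamicModels = (
    await Promise.all(
      providers
        .filter((p): p is ProviderInfo & { getDynamicModels: () => Promise<ModelInfo[]> } => !!p.getDynamicModels)
        .map((p) => p.getDynamicModels()),
    )
  ).flat();

  // Dynamic models come first, mirroring the spread order in the diff.
  return [...dynamicModels, ...staticModels];
}
```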
app/utils/shell.ts CHANGED
@@ -52,67 +52,77 @@ export async function newShellProcess(webcontainer: WebContainer, terminal: ITer
  return process;
  }

-

  export class BoltShell {
- #initialized: (() => void) | undefined
- #readyPromise: Promise<void>
- #webcontainer: WebContainer | undefined
- #terminal: ITerminal | undefined
- #process: WebContainerProcess | undefined
- executionState = atom<{ sessionId: string, active: boolean, executionPrms?: Promise<any> } | undefined>()
- #outputStream: ReadableStreamDefaultReader<string> | undefined
- #shellInputStream: WritableStreamDefaultWriter<string> | undefined
  constructor() {
  this.#readyPromise = new Promise((resolve) => {
- this.#initialized = resolve
- })
  }
  ready() {
  return this.#readyPromise;
  }
  async init(webcontainer: WebContainer, terminal: ITerminal) {
- this.#webcontainer = webcontainer
- this.#terminal = terminal
- let callback = (data: string) => {
- console.log(data)
- }
- let { process, output } = await this.newBoltShellProcess(webcontainer, terminal)
- this.#process = process
- this.#outputStream = output.getReader()
- await this.waitTillOscCode('interactive')
- this.#initialized?.()
  }
  get terminal() {
- return this.#terminal
  }
  get process() {
- return this.#process
  }
- async executeCommand(sessionId: string, command: string) {
  if (!this.process || !this.terminal) {
- return
  }
- let state = this.executionState.get()

- //interrupt the current execution
- // this.#shellInputStream?.write('\x03');
  this.terminal.input('\x03');
  if (state && state.executionPrms) {
- await state.executionPrms
  }
  //start a new execution
  this.terminal.input(command.trim() + '\n');

  //wait for the execution to finish
- let executionPrms = this.getCurrentExecutionResult()
- this.executionState.set({ sessionId, active: true, executionPrms })

- let resp = await executionPrms
- this.executionState.set({ sessionId, active: false })
- return resp
  }
  async newBoltShellProcess(webcontainer: WebContainer, terminal: ITerminal) {
  const args: string[] = [];

@@ -126,6 +136,7 @@ export class BoltShell {

  const input = process.input.getWriter();
  this.#shellInputStream = input;
  const [internalOutput, terminalOutput] = process.output.tee();

  const jshReady = withResolvers<void>();
@@ -162,34 +173,48 @@ export class BoltShell {

  return { process, output: internalOutput };
  }
- async getCurrentExecutionResult() {
- let { output, exitCode } = await this.waitTillOscCode('exit')
  return { output, exitCode };
  }
  async waitTillOscCode(waitCode: string) {
  let fullOutput = '';
  let exitCode: number = 0;
- if (!this.#outputStream) return { output: fullOutput, exitCode };
- let tappedStream = this.#outputStream

  while (true) {
  const { value, done } = await tappedStream.read();
- if (done) break;
  const text = value || '';
  fullOutput += text;

  // Check if command completion signal with exit code
- const [, osc, , pid, code] = text.match(/\x1b\]654;([^\x07=]+)=?((-?\d+):(\d+))?\x07/) || [];
  if (osc === 'exit') {
  exitCode = parseInt(code, 10);
  }
  if (osc === waitCode) {
  break;
  }
  }
  return { output: fullOutput, exitCode };
  }
  }
  export function newBoltShellProcess() {
  return new BoltShell();
  }
52
  return process;
53
  }
54
 
55
+ export type ExecutionResult = { output: string; exitCode: number } | undefined;
56
 
57
  export class BoltShell {
58
+ #initialized: (() => void) | undefined;
59
+ #readyPromise: Promise<void>;
60
+ #webcontainer: WebContainer | undefined;
61
+ #terminal: ITerminal | undefined;
62
+ #process: WebContainerProcess | undefined;
63
+ executionState = atom<{ sessionId: string; active: boolean; executionPrms?: Promise<any> } | undefined>();
64
+ #outputStream: ReadableStreamDefaultReader<string> | undefined;
65
+ #shellInputStream: WritableStreamDefaultWriter<string> | undefined;
66
+
67
  constructor() {
68
  this.#readyPromise = new Promise((resolve) => {
69
+ this.#initialized = resolve;
70
+ });
71
  }
72
+
73
  ready() {
74
  return this.#readyPromise;
75
  }
76
+
77
  async init(webcontainer: WebContainer, terminal: ITerminal) {
78
+ this.#webcontainer = webcontainer;
79
+ this.#terminal = terminal;
80
+
81
+ const { process, output } = await this.newBoltShellProcess(webcontainer, terminal);
82
+ this.#process = process;
83
+ this.#outputStream = output.getReader();
84
+ await this.waitTillOscCode('interactive');
85
+ this.#initialized?.();
 
 
86
  }
87
+
88
  get terminal() {
89
+    return this.#terminal;
   }
+
   get process() {
+    return this.#process;
   }
+
+  async executeCommand(sessionId: string, command: string): Promise<ExecutionResult> {
     if (!this.process || !this.terminal) {
+      return undefined;
     }
 
+    const state = this.executionState.get();
+
+    /*
+     * interrupt the current execution
+     * this.#shellInputStream?.write('\x03');
+     */
     this.terminal.input('\x03');
+
     if (state && state.executionPrms) {
+      await state.executionPrms;
     }
+
     //start a new execution
     this.terminal.input(command.trim() + '\n');
 
     //wait for the execution to finish
+    const executionPromise = this.getCurrentExecutionResult();
+    this.executionState.set({ sessionId, active: true, executionPrms: executionPromise });
 
+    const resp = await executionPromise;
+    this.executionState.set({ sessionId, active: false });
 
+    return resp;
   }
+
   async newBoltShellProcess(webcontainer: WebContainer, terminal: ITerminal) {
     const args: string[] = [];
⋮
     const input = process.input.getWriter();
     this.#shellInputStream = input;
+
     const [internalOutput, terminalOutput] = process.output.tee();
 
     const jshReady = withResolvers<void>();
⋮
     return { process, output: internalOutput };
   }
+
+  async getCurrentExecutionResult(): Promise<ExecutionResult> {
+    const { output, exitCode } = await this.waitTillOscCode('exit');
     return { output, exitCode };
   }
+
   async waitTillOscCode(waitCode: string) {
     let fullOutput = '';
     let exitCode: number = 0;
+
+    if (!this.#outputStream) {
+      return { output: fullOutput, exitCode };
+    }
+
+    const tappedStream = this.#outputStream;
 
     while (true) {
       const { value, done } = await tappedStream.read();
+
+      if (done) {
+        break;
+      }
+
       const text = value || '';
       fullOutput += text;
 
       // Check if command completion signal with exit code
+      const [, osc, , , code] = text.match(/\x1b\]654;([^\x07=]+)=?((-?\d+):(\d+))?\x07/) || [];
+
       if (osc === 'exit') {
         exitCode = parseInt(code, 10);
       }
+
       if (osc === waitCode) {
         break;
       }
     }
+
     return { output: fullOutput, exitCode };
   }
 }
+
 export function newBoltShellProcess() {
   return new BoltShell();
 }
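For reference, the `\x1b]654;…\x07` sequences matched in `waitTillOscCode` are OSC (Operating System Command) terminal escapes that the shell emits to signal events such as prompt and exit. Below is a minimal standalone sketch of that parsing step. The regex is copied from the diff; `parseOsc` and the sample payloads are hypothetical, and note that the destructuring reads the *second* number of the `exit=a:b` payload as the exit code, matching the diff.

```typescript
// Standalone sketch of the OSC parsing used in waitTillOscCode.
// The regex is taken from the diff; parseOsc and the sample strings are illustrative only.
const OSC_RE = /\x1b\]654;([^\x07=]+)=?((-?\d+):(\d+))?\x07/;

function parseOsc(chunk: string): { osc?: string; exitCode?: number } {
  // Same capture groups as the diff: group 1 is the OSC verb,
  // group 4 (the second number of the payload) is read as the exit code.
  const [, osc, , , code] = chunk.match(OSC_RE) || [];
  return { osc, exitCode: code === undefined ? undefined : parseInt(code, 10) };
}

// A completed command, and a bare marker with no payload:
console.log(parseOsc('\x1b]654;exit=12:0\x07')); // { osc: 'exit', exitCode: 0 }
console.log(parseOsc('\x1b]654;prompt\x07')); // { osc: 'prompt', exitCode: undefined }
```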
app/utils/types.ts CHANGED
@@ -1,4 +1,3 @@
-
 interface OllamaModelDetails {
   parent_model: string;
   format: string;
@@ -29,10 +28,10 @@ export interface ModelInfo {
 }
 
 export interface ProviderInfo {
-  staticModels: ModelInfo[],
-  name: string,
-  getDynamicModels?: () => Promise<ModelInfo[]>,
-  getApiKeyLink?: string,
-  labelForGetApiKey?: string,
-  icon?:string,
-};
+  staticModels: ModelInfo[];
+  name: string;
+  getDynamicModels?: () => Promise<ModelInfo[]>;
+  getApiKeyLink?: string;
+  labelForGetApiKey?: string;
+  icon?: string;
+}
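With the trailing commas replaced by semicolons and the stray `};` removed, `ProviderInfo` is a plain interface. A quick sketch of a conforming value follows; the `ModelInfo` fields and the Ollama entry are hypothetical stand-ins, since `ModelInfo`'s body is not shown in this hunk.

```typescript
// Hypothetical stand-in: ModelInfo's real fields are not visible in this hunk.
interface ModelInfo {
  name: string;
  label: string;
}

// ProviderInfo exactly as corrected by this commit.
interface ProviderInfo {
  staticModels: ModelInfo[];
  name: string;
  getDynamicModels?: () => Promise<ModelInfo[]>;
  getApiKeyLink?: string;
  labelForGetApiKey?: string;
  icon?: string;
}

// Example provider entry (values are illustrative).
const ollama: ProviderInfo = {
  name: 'Ollama',
  staticModels: [],
  getDynamicModels: async () => [{ name: 'qwen2.5-coder', label: 'Qwen 2.5 Coder' }],
};

ollama.getDynamicModels?.().then((models) => console.log(models.length)); // 1
```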
eslint.config.mjs CHANGED
@@ -12,6 +12,8 @@ export default [
       '@blitz/catch-error-name': 'off',
       '@typescript-eslint/no-this-alias': 'off',
       '@typescript-eslint/no-empty-object-type': 'off',
+      '@blitz/comment-syntax': 'off',
+      '@blitz/block-scope-case': 'off',
     },
   },
   {
package.json CHANGED
@@ -11,8 +11,8 @@
     "dev": "remix vite:dev",
     "test": "vitest --run",
     "test:watch": "vitest",
-    "lint": "eslint --cache --cache-location ./node_modules/.cache/eslint .",
-    "lint:fix": "npm run lint -- --fix",
+    "lint": "eslint --cache --cache-location ./node_modules/.cache/eslint app",
+    "lint:fix": "npm run lint -- --fix && prettier app --write",
     "start": "bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings",
     "dockerstart": "bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings --ip 0.0.0.0 --port 5173 --no-show-interactive-dev-session",
     "dockerrun": "docker run -it -d --name bolt-ai-live -p 5173:5173 --env-file .env.local bolt-ai",
@@ -20,7 +20,8 @@
     "dockerbuild": "docker build -t bolt-ai:development -t bolt-ai:latest --target bolt-ai-development .",
     "typecheck": "tsc",
     "typegen": "wrangler types",
-    "preview": "pnpm run build && pnpm run start"
+    "preview": "pnpm run build && pnpm run start",
+    "prepare": "husky"
   },
   "engines": {
     "node": ">=18.18.0"
@@ -70,6 +71,7 @@
     "diff": "^5.2.0",
     "file-saver": "^2.0.5",
     "framer-motion": "^11.2.12",
+    "ignore": "^6.0.2",
     "isbot": "^4.1.0",
     "istextorbinary": "^9.5.0",
     "jose": "^5.6.3",
@@ -101,6 +103,7 @@
     "@types/react": "^18.2.20",
     "@types/react-dom": "^18.2.7",
     "fast-glob": "^3.3.2",
+    "husky": "9.1.7",
     "is-ci": "^3.0.1",
     "node-fetch": "^3.3.2",
     "prettier": "^3.3.2",
pnpm-lock.yaml CHANGED
@@ -143,6 +143,9 @@ importers:
       framer-motion:
         specifier: ^11.2.12
+      ignore:
+        specifier: ^6.0.2
+        version: 6.0.2
       isbot:
         specifier: ^4.1.0
         version: 4.4.0
@@ -231,6 +234,9 @@ importers:
       fast-glob:
         specifier: ^3.3.2
         version: 3.3.2
+      husky:
+        specifier: 9.1.7
+        version: 9.1.7
       is-ci:
         specifier: ^3.0.1
         version: 3.0.1
@@ -2482,7 +2488,7 @@ packages:
     resolution: {integrity: sha512-HpGFw18DgFWlncDfjTa2rcQ4W88O1mC8e8yZ2AvQY5KDaktSTwo+KRf6nHK6FRI5FyRyb/5T6+TSxfP7QyGsmQ==}
 
-    resolution: {integrity: sha512-pMhOfFDPiv9t5jjIXkHosWmkSyQbvsgEVNkz0ERHbuLh2T/7j4Mqqpz523Fe8MVY89KC6Sh/QfS2sM+SjgFDcw==}
+    resolution: {integrity: sha1-0ygVQE1olpn4Wk6k+odV3ROpYEg=}
     engines: {node: '>= 0.8'}
 
@@ -3382,6 +3388,11 @@ packages:
     resolution: {integrity: sha512-AXcZb6vzzrFAUE61HnN4mpLqd/cSIwNQjtNWR0euPm6y0iqx3G4gOXaIDdtdDwZmhwe82LA6+zinmW4UBWVePQ==}
     engines: {node: '>=16.17.0'}
 
+    resolution: {integrity: sha512-5gs5ytaNjBrh5Ow3zrvdUUY+0VxIuWVL4i9irt6friV+BqdCfmV11CQTWMiBYWHbXhco+J1kHfTOUkePhCDvMA==}
+    engines: {node: '>=18'}
+    hasBin: true
+
     resolution: {integrity: sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==}
     engines: {node: '>=0.10.0'}
@@ -3399,6 +3410,10 @@ packages:
     resolution: {integrity: sha512-5Fytz/IraMjqpwfd34ke28PTVMjZjJG2MPn5t7OE4eUCUNf8BAa7b5WUS9/Qvr6mwOQS7Mk6vdsMno5he+T8Xw==}
     engines: {node: '>= 4'}
 
+    resolution: {integrity: sha512-InwqeHHN2XpumIkMvpl/DCJVrAHgCsG5+cn1XlnLWGwtZBm8QJfSusItfrwx81CTp5agNZqpKU2J/ccC5nGT4A==}
+    engines: {node: '>= 4'}
+
     resolution: {integrity: sha512-XXOFtyqDjNDAQxVfYxuF7g9Il/IbWmmlQg2MYKOH8ExIT1qg6xc4zyS3HaEEATgs1btfzxq15ciUiY7gjSXRGQ==}
@@ -9278,6 +9293,8 @@ snapshots:
+
+
   dependencies:
     safer-buffer: 2.1.2
@@ -9290,6 +9307,8 @@ snapshots:
+
+
worker-configuration.d.ts CHANGED
@@ -9,4 +9,7 @@ interface Env {
   OPENAI_LIKE_API_BASE_URL: string;
   DEEPSEEK_API_KEY: string;
   LMSTUDIO_API_BASE_URL: string;
+  GOOGLE_GENERATIVE_AI_API_KEY: string;
+  MISTRAL_API_KEY: string;
+  XAI_API_KEY: string;
 }
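The three new bindings extend the Worker's typed environment for the Google, Mistral, and xAI providers. A sketch of how such bindings might be consumed follows; `apiKeyFor` is not part of the diff, just an illustration of selecting a key per provider name, and only the three new `Env` fields are reproduced here.

```typescript
// Only the three bindings added by this commit; the full Env has more fields.
interface Env {
  GOOGLE_GENERATIVE_AI_API_KEY: string;
  MISTRAL_API_KEY: string;
  XAI_API_KEY: string;
}

// Hypothetical helper: pick the right key for a provider, defaulting to ''.
function apiKeyFor(provider: 'Google' | 'Mistral' | 'xAI', env: Partial<Env>): string {
  switch (provider) {
    case 'Google':
      return env.GOOGLE_GENERATIVE_AI_API_KEY ?? '';
    case 'Mistral':
      return env.MISTRAL_API_KEY ?? '';
    case 'xAI':
      return env.XAI_API_KEY ?? '';
  }
}

console.log(apiKeyFor('Mistral', { MISTRAL_API_KEY: 'test-key' })); // test-key
```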