Alex Parker committed
docs: updated env.example of OLLAMA & LMSTUDIO base url (#877)
* correct OLLAMA_API_BASE_URL
- .env.example +4 -2
.env.example

@@ -32,7 +32,8 @@ OPEN_ROUTER_API_KEY=
 GOOGLE_GENERATIVE_AI_API_KEY=
 
 # You only need this environment variable set if you want to use oLLAMA models
-#
+# DONT USE http://localhost:11434 due to IPV6 issues
+# USE EXAMPLE http://127.0.0.1:11434
 OLLAMA_API_BASE_URL=
 
 # You only need this environment variable set if you want to use OpenAI Like models
@@ -62,7 +63,8 @@ COHERE_API_KEY=
 
 # Get LMStudio Base URL from LM Studio Developer Console
 # Make sure to enable CORS
-#
+# DONT USE http://localhost:1234 due to IPV6 issues
+# Example: http://127.0.0.1:1234
 LMSTUDIO_API_BASE_URL=
 
 # Get your xAI API key
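The new comments steer users to `http://127.0.0.1:11434` (Ollama) and `http://127.0.0.1:1234` (LM Studio) instead of `localhost`. A minimal Python sketch of the underlying issue, assuming the servers bind only to the IPv4 loopback: on many systems `localhost` resolves to the IPv6 address `::1` first, so a client can try `::1` before `127.0.0.1` and fail. The helper name below is hypothetical, not part of the project.

```python
import socket

def resolved_families(host: str, port: int):
    """Return the address families the resolver offers for host:port,
    in the order a client would try them."""
    infos = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)
    return [family for family, *_ in infos]

# 127.0.0.1 pins the connection to IPv4, so every candidate is AF_INET.
print(resolved_families("127.0.0.1", 11434))

# "localhost" may list AF_INET6 (::1) first, depending on the OS resolver;
# if nothing listens on the IPv6 loopback, that first attempt fails.
print(resolved_families("localhost", 11434))
```

Using the literal `127.0.0.1` in `OLLAMA_API_BASE_URL` and `LMSTUDIO_API_BASE_URL` sidesteps the resolver ordering entirely.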