Dataset Viewer
Auto-converted to Parquet
Column schema (name: type, observed lengths/values):

url: stringlengths 51–54
repository_url: stringclasses (1 value)
labels_url: stringlengths 65–68
comments_url: stringlengths 60–63
events_url: stringlengths 58–61
html_url: stringlengths 39–44
id: int64 (1.78B–2.82B)
node_id: stringlengths 18–19
number: int64 (1–8.69k)
title: stringlengths 1–382
user: dict
labels: listlengths 0–5
state: stringclasses (2 values)
locked: bool (1 class)
assignee: dict
assignees: listlengths 0–2
milestone: null
comments: int64 (0–323)
created_at: timestamp[s]
updated_at: timestamp[s]
closed_at: timestamp[s]
author_association: stringclasses (4 values)
sub_issues_summary: dict
active_lock_reason: null
draft: bool (2 classes)
pull_request: dict
body: stringlengths 2–118k
closed_by: dict
reactions: dict
timeline_url: stringlengths 60–63
performed_via_github_app: null
state_reason: stringclasses (4 values)
is_pull_request: bool (2 classes)
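
The columns above follow the shape of GitHub's issue API objects. As a minimal sketch (not part of the original page), the dataset behind this preview could be loaded and inspected with the Hugging Face `datasets` library; the repo id `25b3nk/ollama-github-issues` and the column names come from this page, while the split name and the rest of the snippet are assumptions:

```python
# Minimal sketch: load the auto-converted Parquet dataset from the Hugging Face Hub
# and inspect the columns listed in the schema above.
# Assumptions: the `datasets` library is installed, the repo id matches this page,
# and the records live in a single "train" split (typical for auto-converted datasets).
from datasets import load_dataset

ds = load_dataset("25b3nk/ollama-github-issues", split="train")

print(ds.features)   # column names and types, matching the schema above
print(ds.num_rows)   # number of issue/PR records in the split

record = ds[0]       # one record: a single ollama/ollama issue or pull request
print(record["number"], record["title"], record["state"])

# The `is_pull_request` flag distinguishes plain issues from pull requests,
# mirroring the mix of issue and PR rows visible in the preview below.
issues = ds.filter(lambda r: not r["is_pull_request"])
pulls = ds.filter(lambda r: r["is_pull_request"])
print(f"{len(issues)} issues, {len(pulls)} pull requests")
```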
https://api.github.com/repos/ollama/ollama/issues/2979
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2979/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2979/comments
https://api.github.com/repos/ollama/ollama/issues/2979/events
https://github.com/ollama/ollama/issues/2979
2,173,683,721
I_kwDOJ0Z1Ps6Bj8gJ
2,979
Starcoder2 crashing ollama docker version 0.1.28
{ "login": "tilllt", "id": 1854364, "node_id": "MDQ6VXNlcjE4NTQzNjQ=", "avatar_url": "https://avatars.githubusercontent.com/u/1854364?v=4", "gravatar_id": "", "url": "https://api.github.com/users/tilllt", "html_url": "https://github.com/tilllt", "followers_url": "https://api.github.com/users/tilllt/foll...
[]
closed
false
null
[]
null
2
2024-03-07T11:51:04
2024-03-07T12:49:55
2024-03-07T12:49:54
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I noticed that the ollama version shipped as docker container has been updated to 0.1.28 and thus should run starcoder2 and gemma models - i am still not having luck running those, ollama just crashes... am i missing something? https://pastebin.com/ALJRfZZ5
{ "login": "tilllt", "id": 1854364, "node_id": "MDQ6VXNlcjE4NTQzNjQ=", "avatar_url": "https://avatars.githubusercontent.com/u/1854364?v=4", "gravatar_id": "", "url": "https://api.github.com/users/tilllt", "html_url": "https://github.com/tilllt", "followers_url": "https://api.github.com/users/tilllt/foll...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2979/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2979/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8660
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8660/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8660/comments
https://api.github.com/repos/ollama/ollama/issues/8660/events
https://github.com/ollama/ollama/issues/8660
2,818,256,756
I_kwDOJ0Z1Ps6n-y90
8,660
GPU Memory Not Released After Exiting deepseek-r1:32b Model
{ "login": "Sebjac06", "id": 172889704, "node_id": "U_kgDOCk4WaA", "avatar_url": "https://avatars.githubusercontent.com/u/172889704?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Sebjac06", "html_url": "https://github.com/Sebjac06", "followers_url": "https://api.github.com/users/Sebjac06/...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2025-01-29T13:41:07
2025-01-29T13:51:19
2025-01-29T13:51:19
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? - Ollama Version: 0.5.7 - Model: deepseek-r1:32b - GPU: NVIDIA RTX 3090 (24GB VRAM) - OS: Windows 11 (include build version if known) After running the `deepseek-r1:32b` model via `ollama run deepseek-r1:32b` and exiting with `/bye` in my terminal, the GPU's dedicated memory remains fully alloc...
{ "login": "Sebjac06", "id": 172889704, "node_id": "U_kgDOCk4WaA", "avatar_url": "https://avatars.githubusercontent.com/u/172889704?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Sebjac06", "html_url": "https://github.com/Sebjac06", "followers_url": "https://api.github.com/users/Sebjac06/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8660/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8660/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3362
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3362/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3362/comments
https://api.github.com/repos/ollama/ollama/issues/3362/events
https://github.com/ollama/ollama/issues/3362
2,208,545,720
I_kwDOJ0Z1Ps6Do7u4
3,362
Report better error on windows on port conflict with winnat
{ "login": "Canman1963", "id": 133131797, "node_id": "U_kgDOB-9uFQ", "avatar_url": "https://avatars.githubusercontent.com/u/133131797?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Canman1963", "html_url": "https://github.com/Canman1963", "followers_url": "https://api.github.com/users/Can...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 5860134234, "node_id": ...
open
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
4
2024-03-26T15:15:16
2024-04-28T19:01:09
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hi DevOps My Ollama was working fine for me until I tried to use it today not sure what has happened. The LOGS show this repeated Crash and attempt to reload in the app.log Time=2024-03-25T12:09:31.329-05:00 level=INFO source=logging.go:45 msg="ollama app started" time=2024-03-25T12:09:31.389-05:00 level=INFO sou...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3362/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3362/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/5953
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5953/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5953/comments
https://api.github.com/repos/ollama/ollama/issues/5953/events
https://github.com/ollama/ollama/issues/5953
2,430,295,149
I_kwDOJ0Z1Ps6Q21xt
5,953
Who are you?
{ "login": "t7aliang", "id": 11693120, "node_id": "MDQ6VXNlcjExNjkzMTIw", "avatar_url": "https://avatars.githubusercontent.com/u/11693120?v=4", "gravatar_id": "", "url": "https://api.github.com/users/t7aliang", "html_url": "https://github.com/t7aliang", "followers_url": "https://api.github.com/users/t7a...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-07-25T15:23:15
2024-07-26T14:01:03
2024-07-26T14:01:03
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? ![Screenshot 2024-07-25 at 11 18 10 PM](https://github.com/user-attachments/assets/5a06a6fd-3e8c-4db9-84f7-5f076acdd4be) ### OS macOS ### GPU Apple ### CPU Apple ### Ollama version 0.2.8
{ "login": "t7aliang", "id": 11693120, "node_id": "MDQ6VXNlcjExNjkzMTIw", "avatar_url": "https://avatars.githubusercontent.com/u/11693120?v=4", "gravatar_id": "", "url": "https://api.github.com/users/t7aliang", "html_url": "https://github.com/t7aliang", "followers_url": "https://api.github.com/users/t7a...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5953/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/ollama/ollama/issues/5953/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5602
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5602/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5602/comments
https://api.github.com/repos/ollama/ollama/issues/5602/events
https://github.com/ollama/ollama/issues/5602
2,400,911,034
I_kwDOJ0Z1Ps6PGv66
5,602
Running latest version 0.2.1 running slowly and not returning output for long text input
{ "login": "jillvillany", "id": 42828003, "node_id": "MDQ6VXNlcjQyODI4MDAz", "avatar_url": "https://avatars.githubusercontent.com/u/42828003?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jillvillany", "html_url": "https://github.com/jillvillany", "followers_url": "https://api.github.com/...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5808482718, "node_id": "LA_kwDOJ0Z1Ps8AAAABWjZpng...
open
false
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[ { "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.g...
null
3
2024-07-10T14:19:02
2024-10-16T16:18:22
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I am running ollama on an AWS ml.p3.2xlarge SageMaker notebook instance. When I install the latest version, 0.2.1, the response time on a langchain chain running an extract names prompt on a page of text using llama3:latest is about 8 seconds and doesn't return any names. However, when I ...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5602/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5602/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/1876
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1876/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1876/comments
https://api.github.com/repos/ollama/ollama/issues/1876/events
https://github.com/ollama/ollama/issues/1876
2,073,129,789
I_kwDOJ0Z1Ps57kXM9
1,876
ollama list flags help
{ "login": "iplayfast", "id": 751306, "node_id": "MDQ6VXNlcjc1MTMwNg==", "avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4", "gravatar_id": "", "url": "https://api.github.com/users/iplayfast", "html_url": "https://github.com/iplayfast", "followers_url": "https://api.github.com/users/ipla...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 6960960225, "node_id": ...
open
false
null
[]
null
5
2024-01-09T20:35:48
2024-10-26T21:58:36
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
There is no obvious way of seeing what flags are available for ollama list ``` ollama list --help List models Usage: ollama list [flags] Aliases: list, ls Flags: -h, --help help for list ```
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1876/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1876/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/2787
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2787/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2787/comments
https://api.github.com/repos/ollama/ollama/issues/2787/events
https://github.com/ollama/ollama/issues/2787
2,157,567,148
I_kwDOJ0Z1Ps6Amdys
2,787
bug? - session save does not save latest messages of the chat
{ "login": "FotisK", "id": 7896645, "node_id": "MDQ6VXNlcjc4OTY2NDU=", "avatar_url": "https://avatars.githubusercontent.com/u/7896645?v=4", "gravatar_id": "", "url": "https://api.github.com/users/FotisK", "html_url": "https://github.com/FotisK", "followers_url": "https://api.github.com/users/FotisK/foll...
[]
closed
false
null
[]
null
1
2024-02-27T20:43:59
2024-05-17T01:50:42
2024-05-17T01:50:42
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I was having a very long conversation with nollama/mythomax-l2-13b:Q5_K_S, saved the session and restored it and found that the latest 100-200 lines of the discussion were missing. I haven't tried to reproduce it (I don't have lengthy chats often), but I thought I'd report it. When I get another chance, I'll test it ag...
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2787/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2787/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5307
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5307/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5307/comments
https://api.github.com/repos/ollama/ollama/issues/5307/events
https://github.com/ollama/ollama/pull/5307
2,376,001,387
PR_kwDOJ0Z1Ps5zq6Yg
5,307
Ollama Show: Check for Projector Type
{ "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.github.com/users/royjha...
[]
closed
false
null
[]
null
1
2024-06-26T18:22:07
2024-06-28T18:30:19
2024-06-28T18:30:17
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5307", "html_url": "https://github.com/ollama/ollama/pull/5307", "diff_url": "https://github.com/ollama/ollama/pull/5307.diff", "patch_url": "https://github.com/ollama/ollama/pull/5307.patch", "merged_at": "2024-06-28T18:30:17" }
Fixes #5289 <img width="410" alt="Screenshot 2024-06-26 at 11 21 57 AM" src="https://github.com/ollama/ollama/assets/65097070/4ae18164-e5c2-453b-91d4-de54569b8e11">
{ "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.github.com/users/royjha...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5307/reactions", "total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5307/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7499
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7499/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7499/comments
https://api.github.com/repos/ollama/ollama/issues/7499/events
https://github.com/ollama/ollama/pull/7499
2,634,169,544
PR_kwDOJ0Z1Ps6A3i20
7,499
build: Make target improvements
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[]
closed
false
null
[]
null
59
2024-11-05T00:47:49
2025-01-18T02:06:48
2024-12-10T17:47:19
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7499", "html_url": "https://github.com/ollama/ollama/pull/7499", "diff_url": "https://github.com/ollama/ollama/pull/7499.diff", "patch_url": "https://github.com/ollama/ollama/pull/7499.patch", "merged_at": "2024-12-10T17:47:19" }
Add a few new targets and help for building locally. This also adjusts the runner lookup to favor local builds, then runners relative to the executable. Fixes #7491 Fixes #7483 Fixes #7452 Fixes #2187 Fixes #2205 Fixes #2281 Fixes #7457 Fixes #7622 Fixes #7577 Fixes #1756 Fixes #7817 Fixes #6857 ...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7499/reactions", "total_count": 17, "+1": 7, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 10, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7499/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/3528
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3528/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3528/comments
https://api.github.com/repos/ollama/ollama/issues/3528/events
https://github.com/ollama/ollama/pull/3528
2,229,959,636
PR_kwDOJ0Z1Ps5r8bOC
3,528
Update generate scripts with new `LLAMA_CUDA` variable, set `HIP_PLATFORM` on Windows to avoid compiler errors
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[]
closed
false
null
[]
null
0
2024-04-07T21:05:46
2024-04-07T23:29:52
2024-04-07T23:29:51
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3528", "html_url": "https://github.com/ollama/ollama/pull/3528", "diff_url": "https://github.com/ollama/ollama/pull/3528.diff", "patch_url": "https://github.com/ollama/ollama/pull/3528.patch", "merged_at": "2024-04-07T23:29:51" }
null
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3528/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3528/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6853
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6853/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6853/comments
https://api.github.com/repos/ollama/ollama/issues/6853/events
https://github.com/ollama/ollama/issues/6853
2,533,058,737
I_kwDOJ0Z1Ps6W-2ix
6,853
Setting temperature on any llava model makes the Ollama server hangs on REST calls
{ "login": "jluisreymejias", "id": 16193562, "node_id": "MDQ6VXNlcjE2MTkzNTYy", "avatar_url": "https://avatars.githubusercontent.com/u/16193562?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jluisreymejias", "html_url": "https://github.com/jluisreymejias", "followers_url": "https://api.gi...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6677367769, "node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q...
closed
false
null
[]
null
4
2024-09-18T08:23:25
2025-01-06T07:33:52
2025-01-06T07:33:52
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? When calling llava models from a REST client, setting temperature cause the ollama server hangs until process is killed. ### OS Windows ### GPU Nvidia ### CPU AMD ### Ollama version 0.3.10
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6853/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6853/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5160
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5160/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5160/comments
https://api.github.com/repos/ollama/ollama/issues/5160/events
https://github.com/ollama/ollama/issues/5160
2,363,800,007
I_kwDOJ0Z1Ps6M5LnH
5,160
Add HelpingAI-9B in it
{ "login": "OE-LUCIFER", "id": 158988478, "node_id": "U_kgDOCXn4vg", "avatar_url": "https://avatars.githubusercontent.com/u/158988478?v=4", "gravatar_id": "", "url": "https://api.github.com/users/OE-LUCIFER", "html_url": "https://github.com/OE-LUCIFER", "followers_url": "https://api.github.com/users/OE-...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
null
[]
null
1
2024-06-20T08:05:57
2024-06-20T21:14:20
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
HelpingAI-9B is an advanced language model designed for emotionally intelligent conversational interactions. This model excels in empathetic engagement, understanding user emotions, and providing supportive dialogue.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5160/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5160/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/3438
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3438/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3438/comments
https://api.github.com/repos/ollama/ollama/issues/3438/events
https://github.com/ollama/ollama/issues/3438
2,218,222,188
I_kwDOJ0Z1Ps6EN2Js
3,438
Bug in MODEL download directory and launching ollama service in Linux
{ "login": "ejgutierrez74", "id": 11474846, "node_id": "MDQ6VXNlcjExNDc0ODQ2", "avatar_url": "https://avatars.githubusercontent.com/u/11474846?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ejgutierrez74", "html_url": "https://github.com/ejgutierrez74", "followers_url": "https://api.githu...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5755339642, "node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg...
open
false
null
[]
null
15
2024-04-01T13:06:01
2024-07-18T09:58:57
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I write this post to add more information: 1 - As you mentioned : I edited `sudo systemctl edit ollama.service` ![imagen](https://github.com/ollama/ollama/assets/11474846/d82ca623-5b89-4e8c-8b25-81a82de0b7b3) And the /media/Samsung/ollama_models is empty.... ![imagen](https://github.com/ollama/ollama/asse...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3438/reactions", "total_count": 6, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 2 }
https://api.github.com/repos/ollama/ollama/issues/3438/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/6948
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6948/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6948/comments
https://api.github.com/repos/ollama/ollama/issues/6948/events
https://github.com/ollama/ollama/pull/6948
2,547,117,379
PR_kwDOJ0Z1Ps58nEbk
6,948
Fix Ollama silently failing on extra, unsupported openai parameters.
{ "login": "MadcowD", "id": 719535, "node_id": "MDQ6VXNlcjcxOTUzNQ==", "avatar_url": "https://avatars.githubusercontent.com/u/719535?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MadcowD", "html_url": "https://github.com/MadcowD", "followers_url": "https://api.github.com/users/MadcowD/fo...
[]
open
false
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[ { "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.g...
null
3
2024-09-25T07:00:07
2024-12-29T19:29:39
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6948", "html_url": "https://github.com/ollama/ollama/pull/6948", "diff_url": "https://github.com/ollama/ollama/pull/6948.diff", "patch_url": "https://github.com/ollama/ollama/pull/6948.patch", "merged_at": null }
Currently Ollama will just let you send unsupported api params to the openai compatible endpoint and just silently fail. This wreaks havoc on downstream uses causing unexpected behaviour. ```python # Retrieve the API key and ensure it's set ollama_api_key = os.getenv("OLLAMA_API_KEY") if not ollama_api_key: ...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6948/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6948/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/3427
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3427/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3427/comments
https://api.github.com/repos/ollama/ollama/issues/3427/events
https://github.com/ollama/ollama/issues/3427
2,217,052,006
I_kwDOJ0Z1Ps6EJYdm
3,427
prompt_eval_count in api is broken
{ "login": "drazdra", "id": 133811709, "node_id": "U_kgDOB_nN_Q", "avatar_url": "https://avatars.githubusercontent.com/u/133811709?v=4", "gravatar_id": "", "url": "https://api.github.com/users/drazdra", "html_url": "https://github.com/drazdra", "followers_url": "https://api.github.com/users/drazdra/foll...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
5
2024-03-31T15:39:03
2024-06-04T06:58:20
2024-06-04T06:58:20
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? prompt_eval_count parameter is absent on some calls, on other calls it returns wrong information. 1. i tried /api/chat with "stablelm2", no system prompt, prompt="hi". in result there is no field "prompt_eval_count" most of the time. sometimes it's there, randomly, but rarely. 2. when ...
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3427/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3427/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8366
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8366/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8366/comments
https://api.github.com/repos/ollama/ollama/issues/8366/events
https://github.com/ollama/ollama/issues/8366
2,778,471,158
I_kwDOJ0Z1Ps6lnBr2
8,366
deepseek v3
{ "login": "Morrigan-Ship", "id": 138357319, "node_id": "U_kgDOCD8qRw", "avatar_url": "https://avatars.githubusercontent.com/u/138357319?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Morrigan-Ship", "html_url": "https://github.com/Morrigan-Ship", "followers_url": "https://api.github.com/...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
3
2025-01-09T18:07:28
2025-01-10T22:28:46
2025-01-10T22:28:42
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
null
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8366/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8366/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4260
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4260/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4260/comments
https://api.github.com/repos/ollama/ollama/issues/4260/events
https://github.com/ollama/ollama/issues/4260
2,285,566,329
I_kwDOJ0Z1Ps6IOvl5
4,260
Error: could not connect to ollama app, is it running?
{ "login": "starMagic", "id": 4728358, "node_id": "MDQ6VXNlcjQ3MjgzNTg=", "avatar_url": "https://avatars.githubusercontent.com/u/4728358?v=4", "gravatar_id": "", "url": "https://api.github.com/users/starMagic", "html_url": "https://github.com/starMagic", "followers_url": "https://api.github.com/users/st...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5860134234, "node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg...
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
1
2024-05-08T13:11:05
2024-05-21T18:34:09
2024-05-21T18:34:09
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? When try run command "Ollama list", the following error occurs: server.log 2024/05/08 20:50:26 routes.go:989: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4260/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4260/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/551
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/551/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/551/comments
https://api.github.com/repos/ollama/ollama/issues/551/events
https://github.com/ollama/ollama/issues/551
1,901,647,151
I_kwDOJ0Z1Ps5xWNUv
551
Dockerfile.cuda fails to build server
{ "login": "jamesbraza", "id": 8990777, "node_id": "MDQ6VXNlcjg5OTA3Nzc=", "avatar_url": "https://avatars.githubusercontent.com/u/8990777?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jamesbraza", "html_url": "https://github.com/jamesbraza", "followers_url": "https://api.github.com/users...
[]
closed
false
null
[]
null
7
2023-09-18T19:58:22
2023-09-26T22:29:49
2023-09-26T22:29:49
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
On an AWS EC2 `g4dn.2xlarge` instance with Ollama https://github.com/jmorganca/ollama/tree/c345053a8bf47d5ef8f1fe15d385108059209fba: ```none > sudo docker buildx build . --file Dockerfile.cuda [+] Building 57.2s (7/16) ...
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/551/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/551/timeline
null
not_planned
false
https://api.github.com/repos/ollama/ollama/issues/5990
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5990/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5990/comments
https://api.github.com/repos/ollama/ollama/issues/5990/events
https://github.com/ollama/ollama/issues/5990
2,432,539,183
I_kwDOJ0Z1Ps6Q_Zov
5,990
Tools and properties.type Not Supporting Arrays
{ "login": "xonlly", "id": 4999786, "node_id": "MDQ6VXNlcjQ5OTk3ODY=", "avatar_url": "https://avatars.githubusercontent.com/u/4999786?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xonlly", "html_url": "https://github.com/xonlly", "followers_url": "https://api.github.com/users/xonlly/foll...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
open
false
null
[]
null
2
2024-07-26T16:06:52
2024-10-18T14:29:13
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? **Title:** Issue with `DynamicStructuredTool` and `properties.type` Not Supporting Arrays in LangchainJS **Description:** I am encountering an issue when using `DynamicStructuredTool` in LangchainJS. Specifically, the `type` property within `properties` does not currently support arrays. T...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5990/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5990/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/6953
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6953/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6953/comments
https://api.github.com/repos/ollama/ollama/issues/6953/events
https://github.com/ollama/ollama/issues/6953
2,547,721,251
I_kwDOJ0Z1Ps6X2yQj
6,953
AMD ROCm Card can not use flash attention
{ "login": "superligen", "id": 4199207, "node_id": "MDQ6VXNlcjQxOTkyMDc=", "avatar_url": "https://avatars.githubusercontent.com/u/4199207?v=4", "gravatar_id": "", "url": "https://api.github.com/users/superligen", "html_url": "https://github.com/superligen", "followers_url": "https://api.github.com/users...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 6433346500, "node_id": ...
open
false
null
[]
null
4
2024-09-25T11:26:27
2024-12-19T19:36:01
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? My cards is w7900, and rocm driver is 6.3 , I found the llama-cpp server started by Ollama always without -fa flag. I check the code , found : // only cuda (compute capability 7+) and metal support flash attention if g.Library != "metal" && (g.Library != "c...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6953/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6953/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/6670
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6670/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6670/comments
https://api.github.com/repos/ollama/ollama/issues/6670/events
https://github.com/ollama/ollama/issues/6670
2,509,778,404
I_kwDOJ0Z1Ps6VmC3k
6,670
expose slots data through API
{ "login": "aiseei", "id": 30615541, "node_id": "MDQ6VXNlcjMwNjE1NTQx", "avatar_url": "https://avatars.githubusercontent.com/u/30615541?v=4", "gravatar_id": "", "url": "https://api.github.com/users/aiseei", "html_url": "https://github.com/aiseei", "followers_url": "https://api.github.com/users/aiseei/fo...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
1
2024-09-06T07:58:13
2024-09-06T15:38:13
2024-09-06T15:38:13
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
hI Can the information that can be seen in the logs be exposed through /slots api per server/port ? We need this to manage queuing in our load balancer. This has been exposed by llama cpp already. https://github.com/ggerganov/llama.cpp/tree/master/examples/server#get-slots-returns-the-current-slots-processing-stat...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6670/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6670/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4248
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4248/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4248/comments
https://api.github.com/repos/ollama/ollama/issues/4248/events
https://github.com/ollama/ollama/issues/4248
2,284,620,426
I_kwDOJ0Z1Ps6ILIqK
4,248
error loading model architecture: unknown model architecture: 'qwen2moe'
{ "login": "li904775857", "id": 43633294, "node_id": "MDQ6VXNlcjQzNjMzMjk0", "avatar_url": "https://avatars.githubusercontent.com/u/43633294?v=4", "gravatar_id": "", "url": "https://api.github.com/users/li904775857", "html_url": "https://github.com/li904775857", "followers_url": "https://api.github.com/...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
null
[]
null
1
2024-05-08T03:50:18
2024-07-25T17:43:34
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Qwen1.5-MoE-A2.7B-Chat is installed by convert-hf-to-gguf.py according to the process. After 4-bit quantization, ollamamodelfile is created, but it is not supported when loading. What is the cause of this? ### OS Windows ### GPU Nvidia ### CPU AMD ### Ollama version 0.1.32
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4248/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4248/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/5188
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5188/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5188/comments
https://api.github.com/repos/ollama/ollama/issues/5188/events
https://github.com/ollama/ollama/pull/5188
2,364,778,595
PR_kwDOJ0Z1Ps5zF3gJ
5,188
fix: skip os.removeAll() if PID does not exist
{ "login": "joshyan1", "id": 76125168, "node_id": "MDQ6VXNlcjc2MTI1MTY4", "avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4", "gravatar_id": "", "url": "https://api.github.com/users/joshyan1", "html_url": "https://github.com/joshyan1", "followers_url": "https://api.github.com/users/jos...
[]
closed
false
null
[]
null
0
2024-06-20T15:54:26
2024-06-20T17:40:59
2024-06-20T17:40:59
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5188", "html_url": "https://github.com/ollama/ollama/pull/5188", "diff_url": "https://github.com/ollama/ollama/pull/5188.diff", "patch_url": "https://github.com/ollama/ollama/pull/5188.patch", "merged_at": "2024-06-20T17:40:59" }
previously deleted all directories in $TMPDIR starting with ollama. Added a "continue" to skip the directory removal if a PID doesn't exist. We do this to prevent accidentally deleting directories in tmpdir that share the ollama name but aren't created by us for processes resolves: https://github.com/ollama/ollama/i...
{ "login": "joshyan1", "id": 76125168, "node_id": "MDQ6VXNlcjc2MTI1MTY4", "avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4", "gravatar_id": "", "url": "https://api.github.com/users/joshyan1", "html_url": "https://github.com/joshyan1", "followers_url": "https://api.github.com/users/jos...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5188/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5188/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/4204
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4204/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4204/comments
https://api.github.com/repos/ollama/ollama/issues/4204/events
https://github.com/ollama/ollama/issues/4204
2,281,198,575
I_kwDOJ0Z1Ps6H-FPv
4,204
Support pull from habor registry in proxy mode an push to harbor
{ "login": "ptempier", "id": 6312537, "node_id": "MDQ6VXNlcjYzMTI1Mzc=", "avatar_url": "https://avatars.githubusercontent.com/u/6312537?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ptempier", "html_url": "https://github.com/ptempier", "followers_url": "https://api.github.com/users/ptemp...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
5
2024-05-06T15:49:57
2024-12-19T06:27:32
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Not sure why its not working, maybe i do something bad. From other ticket i understand it supposed to work with OCI registry. What i tried : ollama pull habor-server//ollama.com/library/llama3:text Error: pull model manifest: 400 ollama pull habor-server/ollama.com/llama3:text Error: pull model manifest: 4...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4204/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4204/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/1223
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1223/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1223/comments
https://api.github.com/repos/ollama/ollama/issues/1223/events
https://github.com/ollama/ollama/pull/1223
2,004,804,614
PR_kwDOJ0Z1Ps5gDJXg
1,223
Make alt+backspace delete word
{ "login": "kejcao", "id": 106453563, "node_id": "U_kgDOBlhaOw", "avatar_url": "https://avatars.githubusercontent.com/u/106453563?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kejcao", "html_url": "https://github.com/kejcao", "followers_url": "https://api.github.com/users/kejcao/follower...
[]
closed
false
null
[]
null
0
2023-11-21T17:29:44
2023-11-21T20:26:47
2023-11-21T20:26:47
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/1223", "html_url": "https://github.com/ollama/ollama/pull/1223", "diff_url": "https://github.com/ollama/ollama/pull/1223.diff", "patch_url": "https://github.com/ollama/ollama/pull/1223.patch", "merged_at": "2023-11-21T20:26:47" }
In GNU Readline you can press alt+backspace to delete word. I'm used to this behavior and so it's jarring not to be able to do it. This commit adds the feature.
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1223/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1223/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/3007
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3007/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3007/comments
https://api.github.com/repos/ollama/ollama/issues/3007/events
https://github.com/ollama/ollama/issues/3007
2,176,498,516
I_kwDOJ0Z1Ps6BurtU
3,007
Search on ollama.com/library is missing lots of models
{ "login": "maxtheman", "id": 2172753, "node_id": "MDQ6VXNlcjIxNzI3NTM=", "avatar_url": "https://avatars.githubusercontent.com/u/2172753?v=4", "gravatar_id": "", "url": "https://api.github.com/users/maxtheman", "html_url": "https://github.com/maxtheman", "followers_url": "https://api.github.com/users/ma...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6573197867, "node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw...
open
false
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
[ { "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api...
null
0
2024-03-08T17:46:15
2024-03-11T22:18:50
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Current behavior: Using @ehartford as an example since he's a prolific ollama model contributor: https://ollama.com/search?q=ehartford&p=1 Shows his models: <img width="1276" alt="Screenshot 2024-03-08 at 9 45 33 AM" src="https://github.com/ollama/ollama/assets/2172753/08b9dc80-5d94-4b86-82dd-37c0dddac326">...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3007/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3007/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/3227
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3227/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3227/comments
https://api.github.com/repos/ollama/ollama/issues/3227/events
https://github.com/ollama/ollama/issues/3227
2,192,660,065
I_kwDOJ0Z1Ps6CsVZh
3,227
ollama/ollama Docker image: committed modifications aren't saved
{ "login": "nicolasduminil", "id": 1037978, "node_id": "MDQ6VXNlcjEwMzc5Nzg=", "avatar_url": "https://avatars.githubusercontent.com/u/1037978?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nicolasduminil", "html_url": "https://github.com/nicolasduminil", "followers_url": "https://api.gith...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2024-03-18T16:14:49
2024-03-19T13:46:15
2024-03-19T08:48:59
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I'm using the Docker image `ollama/ollama:latest`. I'm running the image and, in the new created container, I'm pulling `llama2`. Once the pull operation finished, I'm checking its success using the `ollama list` command. Now, I commit the modification, I tag the new modified image and I push i...
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3227/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3227/timeline
null
not_planned
false
https://api.github.com/repos/ollama/ollama/issues/3293
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3293/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3293/comments
https://api.github.com/repos/ollama/ollama/issues/3293/events
https://github.com/ollama/ollama/issues/3293
2,202,706,972
I_kwDOJ0Z1Ps6DSqQc
3,293
ollama run in national user name
{ "login": "hgabor47", "id": 1212585, "node_id": "MDQ6VXNlcjEyMTI1ODU=", "avatar_url": "https://avatars.githubusercontent.com/u/1212585?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hgabor47", "html_url": "https://github.com/hgabor47", "followers_url": "https://api.github.com/users/hgabo...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
4
2024-03-22T15:01:10
2024-05-04T22:03:44
2024-05-04T22:03:43
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? ![image](https://github.com/ollama/ollama/assets/1212585/5a14f850-213b-471c-90c9-f1f0f8752c31) My Username has international characters like á and the ollama not handle it. ### What did you expect to see? RUN ### Steps to reproduce 1 Create a windows user with international charaters like: ...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3293/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3293/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6946
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6946/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6946/comments
https://api.github.com/repos/ollama/ollama/issues/6946/events
https://github.com/ollama/ollama/issues/6946
2,546,759,749
I_kwDOJ0Z1Ps6XzHhF
6,946
llama runner process has terminated: exit status 0xc0000005
{ "login": "viosay", "id": 16093380, "node_id": "MDQ6VXNlcjE2MDkzMzgw", "avatar_url": "https://avatars.githubusercontent.com/u/16093380?v=4", "gravatar_id": "", "url": "https://api.github.com/users/viosay", "html_url": "https://github.com/viosay", "followers_url": "https://api.github.com/users/viosay/fo...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA...
open
false
null
[]
null
5
2024-09-25T02:30:59
2024-11-02T17:12:45
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? It's again the https://github.com/ollama/ollama/issues/6011 issue. **The issue is with embedding call with the model converted using convert_hf_to_gguf.py.** litellm.llms.ollama.OllamaError: {"error":"llama runner process has terminated: exit status 0xc0000005"} ``` INFO [wmain] syst...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6946/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6946/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/3955
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3955/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3955/comments
https://api.github.com/repos/ollama/ollama/issues/3955/events
https://github.com/ollama/ollama/pull/3955
2,266,430,367
PR_kwDOJ0Z1Ps5t4K3P
3,955
return code `499` when user cancels request while a model is loading
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[]
closed
false
null
[]
null
0
2024-04-26T20:03:32
2024-04-26T21:38:30
2024-04-26T21:38:29
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3955", "html_url": "https://github.com/ollama/ollama/pull/3955", "diff_url": "https://github.com/ollama/ollama/pull/3955.diff", "patch_url": "https://github.com/ollama/ollama/pull/3955.patch", "merged_at": "2024-04-26T21:38:29" }
null
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3955/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3955/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/5963
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5963/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5963/comments
https://api.github.com/repos/ollama/ollama/issues/5963/events
https://github.com/ollama/ollama/pull/5963
2,431,037,605
PR_kwDOJ0Z1Ps52hIpP
5,963
Revert "llm(llama): pass rope factors"
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[]
closed
false
null
[]
null
0
2024-07-25T21:53:31
2024-07-25T22:24:57
2024-07-25T22:24:55
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5963", "html_url": "https://github.com/ollama/ollama/pull/5963", "diff_url": "https://github.com/ollama/ollama/pull/5963.diff", "patch_url": "https://github.com/ollama/ollama/pull/5963.patch", "merged_at": "2024-07-25T22:24:55" }
Reverts ollama/ollama#5924
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5963/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5963/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6172
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6172/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6172/comments
https://api.github.com/repos/ollama/ollama/issues/6172/events
https://github.com/ollama/ollama/issues/6172
2,447,790,062
I_kwDOJ0Z1Ps6R5k_u
6,172
.git file is missing
{ "login": "Haritha-Maturi", "id": 100990846, "node_id": "U_kgDOBgT_fg", "avatar_url": "https://avatars.githubusercontent.com/u/100990846?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Haritha-Maturi", "html_url": "https://github.com/Haritha-Maturi", "followers_url": "https://api.github.c...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
0
2024-08-05T07:12:11
2024-08-05T08:14:46
2024-08-05T08:14:45
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? As per the docker file present in repo there needs to be a file named .git in repo but it is missing. ![image](https://github.com/user-attachments/assets/731a02ea-df73-4b4c-873f-f40a110ac7f3) ### OS Linux ### GPU _No response_ ### CPU _No response_ ### Ollama version _No response_
{ "login": "Haritha-Maturi", "id": 100990846, "node_id": "U_kgDOBgT_fg", "avatar_url": "https://avatars.githubusercontent.com/u/100990846?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Haritha-Maturi", "html_url": "https://github.com/Haritha-Maturi", "followers_url": "https://api.github.c...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6172/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6172/timeline
null
completed
false
End of preview.
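
The nested cells shown in the preview (`user`, `labels`, `reactions`, `pull_request`) keep the structure of the GitHub API payloads even though the viewer truncates them. As a small sketch, assuming the `ds` object from the example above, the `reactions` dicts could be aggregated across all records; the key names are the ones visible in the preview cells:

```python
# Sketch: tally reaction counts across all records.
# Assumes `ds` from the load_dataset() example above; the key names below are the
# ones visible in the `reactions` cells of the preview.
from collections import Counter

totals = Counter()
for record in ds:
    reactions = record["reactions"] or {}
    for key in ("+1", "-1", "laugh", "hooray", "confused", "heart", "rocket", "eyes"):
        totals[key] += reactions.get(key, 0) or 0

print(totals.most_common())
```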
README.md exists but content is empty.
Downloads last month: 6