Commit 15c704c · Parent: 11c65a0
Dockerfile set to Python3.12-mini

Files changed:
- Dockerfile (+2 −1)
- main/logs/llm_api.log (+21 −0)
Dockerfile (CHANGED)

@@ -1,4 +1,5 @@
-
+# Use Python 3.12 slim image as base
+FROM python:3.12-slim
 
 RUN useradd -m -u 1000 user
 USER user
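Pinning the base image to `python:3.12-slim` sidesteps the "Dynamo is not supported on Python 3.13+" failure recorded later in the log. The commit only shows the top of the file; a sketch of how such a Dockerfile might continue for the FastAPI server seen in the logs (the `requirements.txt` path, working directory, and entrypoint are assumptions, not part of this commit — only the port follows the Hugging Face Spaces convention):

```dockerfile
# Use Python 3.12 slim image as base (as in the commit)
FROM python:3.12-slim

# Non-root user, matching the diff context
RUN useradd -m -u 1000 user
USER user

# --- everything below is an illustrative assumption ---
WORKDIR /home/user/app
COPY --chown=user requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY --chown=user . .
# Hugging Face Spaces expects the app to listen on port 7860
EXPOSE 7860
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "7860"]
```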
main/logs/llm_api.log (CHANGED)

@@ -51,3 +51,24 @@
 2025-01-09 17:15:14,956 - llm_api - INFO - Loading model from local path: main/models/phi-4
 2025-01-09 17:15:14,965 - llm_api - ERROR - Failed to initialize generation model microsoft/phi-4: CUDA is required but not available for bitsandbytes. Please consider installing the multi-platform enabled version of bitsandbytes, which is currently a work in progress. Please check currently supported platforms and installation instructions at https://huggingface.co/docs/bitsandbytes/main/en/installation#multi-backend
 2025-01-09 17:15:14,965 - api_routes - ERROR - Error initializing model: CUDA is required but not available for bitsandbytes. Please consider installing the multi-platform enabled version of bitsandbytes, which is currently a work in progress. Please check currently supported platforms and installation instructions at https://huggingface.co/docs/bitsandbytes/main/en/installation#multi-backend
+2025-01-13 16:04:32,247 - hf_validation - WARNING - No .env file found. Fine if you're on Huggingface, but you need one to run locally on your PC.
+2025-01-13 16:04:32,247 - hf_validation - ERROR - No HF_TOKEN found in environment variables
+2025-01-13 16:04:32,247 - main - INFO - Starting LLM API server
+2025-01-13 16:04:32,248 - llm_api - INFO - Initializing LLM API
+2025-01-13 16:04:32,248 - llm_api - INFO - LLM API initialized successfully
+2025-01-13 16:04:32,248 - api_routes - INFO - Router initialized with LLM API instance
+2025-01-13 16:04:32,252 - main - INFO - FastAPI application created successfully
+2025-01-13 16:05:27,996 - api_routes - INFO - Received request to download model: microsoft/Phi-3.5-mini-instruct
+2025-01-13 16:05:27,996 - llm_api - INFO - Starting download of model: microsoft/Phi-3.5-mini-instruct
+2025-01-13 16:05:27,996 - llm_api - INFO - Enabling stdout logging for download
+2025-01-13 16:08:46,773 - llm_api - INFO - Disabling stdout logging
+2025-01-13 16:08:46,773 - llm_api - INFO - Saving model to main/models/Phi-3.5-mini-instruct
+2025-01-13 16:10:23,543 - llm_api - INFO - Successfully downloaded model: microsoft/Phi-3.5-mini-instruct
+2025-01-13 16:10:24,432 - api_routes - INFO - Successfully downloaded model: microsoft/Phi-3.5-mini-instruct
+2025-01-13 16:18:45,409 - api_routes - INFO - Received request to initialize model: microsoft/Phi-3.5-mini-instruct
+2025-01-13 16:18:45,409 - llm_api - INFO - Initializing generation model: microsoft/Phi-3.5-mini-instruct
+2025-01-13 16:18:45,412 - llm_api - INFO - Loading model from local path: main/models/Phi-3.5-mini-instruct
+2025-01-13 16:18:45,982 - llm_api - ERROR - Failed to initialize generation model microsoft/Phi-3.5-mini-instruct: Failed to import transformers.integrations.bitsandbytes because of the following error (look up to see its traceback):
+Dynamo is not supported on Python 3.13+
+2025-01-13 16:18:45,982 - api_routes - ERROR - Error initializing model: Failed to import transformers.integrations.bitsandbytes because of the following error (look up to see its traceback):
+Dynamo is not supported on Python 3.13+
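Each entry in the log above follows a fixed `timestamp - logger - level - message` layout (the default `logging` formatter style); a minimal stdlib-only parser sketch, useful for pulling the ERROR lines out of a log like this one. The function name and field names are my own, not from the repository:

```python
import re

# Matches the "timestamp - logger - level - message" layout used in llm_api.log.
LOG_LINE = re.compile(
    r"^(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) - "
    r"(?P<logger>\S+) - (?P<level>[A-Z]+) - (?P<message>.*)$"
)

def parse_log_line(line: str):
    """Return a dict of fields, or None for continuation lines
    (e.g. the bare 'Dynamo is not supported on Python 3.13+' lines)."""
    match = LOG_LINE.match(line)
    return match.groupdict() if match else None

entry = parse_log_line(
    "2025-01-13 16:04:32,247 - hf_validation - ERROR - "
    "No HF_TOKEN found in environment variables"
)
# entry["logger"] == "hf_validation", entry["level"] == "ERROR"
```

Continuation lines from multi-line tracebacks return `None`, so a caller can append them to the previous entry's message when reassembling records.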