ateetvatan committed on
Commit 7a0a8c5 · 1 Parent(s): 2289445

final changes

Files changed (2)
  1. README.md +141 -2
  2. env.example +1 -0
README.md CHANGED
@@ -1,2 +1,141 @@
- # masx-openchat-llm
- Masx Openchat 3.5 llm interface
+ # MASX OpenChat LLM
+
+ > **A powerful, production-ready FastAPI service that brings the OpenChat-3.5 language model to life through a clean, scalable REST API.**
+
+ ## What is this?
+
+ MASX OpenChat LLM is your gateway to conversational AI powered by the state-of-the-art OpenChat-3.5 model. Think of it as your personal AI assistant that you can integrate into any application, website, or service through simple HTTP requests.
+
+ ### Key Features
+ - **🤖 Powered by OpenChat-3.5**: Latest conversational AI model with 7B parameters
+
+ ## 🚀 Quick Start
+
+ ### Requirements
+
+ - **8GB+ RAM** (16GB+ recommended for optimal performance)
+ - **GPU with 8GB+ VRAM** (optional, but recommended for speed; see the check below)
+
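+ If you want to confirm that a suitable GPU is actually visible before starting the server, a quick check like the one below can help. It is only a sketch: PyTorch is assumed to be available (it is not named explicitly in this README), and the service will simply fall back to CPU if no GPU is found.
+
+ ```python
+ import torch  # assumption: the installed requirements include PyTorch
+
+ if torch.cuda.is_available():
+     gpu = torch.cuda.get_device_properties(0)
+     # Report the detected card and its VRAM in GB.
+     print(f"GPU: {gpu.name}, {gpu.total_memory / 1024**3:.1f} GB VRAM")
+ else:
+     print("No CUDA GPU detected; generation will run on CPU and be slower.")
+ ```
+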
+ ### Dependencies
+ ```bash
+ pip install -r requirements.txt
+ ```
+
+ ### Config
+ ```bash
+ cp env.example .env
+ # Edit .env with your preferred settings
+ ```
+
+ ### Start the server
+ ```bash
+ python app.py
+ ```
+
+ **That's it!** Your AI service is now running at `http://localhost:8080`.
+
+ ## Usage
+
+ ### Basic Chat Request
+
+ ```bash
+ curl -X POST "http://localhost:8080/chat" \
+   -H "Content-Type: application/json" \
+   -d '{
+     "prompt": "Hello! Can you help me write a Python function?",
+     "max_tokens": 256,
+     "temperature": 0.7
+   }'
+ ```
+
+ ### Response Format
+
+ ```json
+ {
+   "response": "Of course! I'd be happy to help you write a Python function. What kind of function would you like to create? Please let me know what it should do, and I'll help you implement it with proper syntax and best practices."
+ }
+ ```
+
+ ### API Endpoints
+
+ | Endpoint | Method | Description |
+ |----------|--------|-------------|
+ | `/status` | GET | Check service health and get model info |
+ | `/chat` | POST | Generate AI responses |
+ | `/docs` | GET | Interactive API documentation (Swagger UI) |
+ | `/redoc` | GET | Alternative API documentation |
+
+ ### Request Parameters
+
+ | Parameter | Type | Default | Description |
+ |-----------|------|---------|-------------|
+ | `prompt` | string | **required** | Your input text/question |
+ | `max_tokens` | integer | 256 | Maximum tokens to generate |
+ | `temperature` | float | 0.0 | Creativity level (0.0 = deterministic, 2.0 = very creative) |
+
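+ For programmatic access, the same request can be made from Python. The snippet below is only a sketch, not code shipped with this repository: the `requests` dependency and variable names are assumptions, while the `/chat` endpoint and its parameters come from the tables above.
+
+ ```python
+ import requests
+
+ # Build a request using the documented parameters: prompt, max_tokens, temperature.
+ payload = {
+     "prompt": "Hello! Can you help me write a Python function?",
+     "max_tokens": 256,
+     "temperature": 0.7,
+ }
+
+ resp = requests.post("http://localhost:8080/chat", json=payload, timeout=120)
+ resp.raise_for_status()
+
+ # The service returns a JSON object with a single "response" field.
+ print(resp.json()["response"])
+ ```
+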
+ ## 🔧 Configuration
+
+ The service is highly configurable through environment variables. Copy `env.example` to `.env` and customize:
+
+ ### Essential Settings
+
+ ```bash
+ # Server Configuration
+ HOST=0.0.0.0
+ PORT=8080
+ LOG_LEVEL=info
+ ```
+
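+ As a rough illustration of how settings like these are typically consumed, the sketch below reads them with `os.getenv`. It is illustrative only and does not claim to match how `app.py` or `model_loader.py` actually load configuration; loading `.env` into the environment (e.g., with python-dotenv) is assumed to happen beforehand.
+
+ ```python
+ import os
+
+ # Defaults mirror the values shown above; override them via .env or the environment.
+ HOST = os.getenv("HOST", "0.0.0.0")
+ PORT = int(os.getenv("PORT", "8080"))
+ LOG_LEVEL = os.getenv("LOG_LEVEL", "info")
+ MODEL_NAME = os.getenv("MODEL_NAME", "openchat/openchat-3.5-1210")
+
+ print(f"Serving {MODEL_NAME} on {HOST}:{PORT} (log level: {LOG_LEVEL})")
+ ```
+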
+ ### Advanced Settings
+
+ ## 🐳 Docker Deployment
+
+ ## 📊 Monitoring & Health
+
+ ### Health Check
+
+ ```bash
+ curl http://localhost:8080/status
+ ```
+
+ Response:
+ ```json
+ {
+   "status": "ok",
+   "max_tokens": 4096
+ }
+ ```
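+
+ When embedding the service in a larger pipeline, it can be handy to wait until `/status` reports `ok` before sending chat requests. The helper below is a sketch under that assumption; it again uses `requests`, which is not part of this repository, and only relies on the `/status` response shown above.
+
+ ```python
+ import time
+ import requests
+
+ def wait_until_ready(base_url: str = "http://localhost:8080", timeout: float = 300.0) -> dict:
+     """Poll /status until the service reports ok, or raise after `timeout` seconds."""
+     deadline = time.time() + timeout
+     while time.time() < deadline:
+         try:
+             status = requests.get(f"{base_url}/status", timeout=5).json()
+             if status.get("status") == "ok":
+                 return status  # e.g. {"status": "ok", "max_tokens": 4096}
+         except requests.RequestException:
+             pass  # server not up yet; keep polling
+         time.sleep(2)
+     raise TimeoutError("MASX OpenChat LLM did not become ready in time")
+
+ print(wait_until_ready())
+ ```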
+
+ ### Logs
+
+ The service provides comprehensive logging:
+ - **Application logs**: `./logs/app.log`
+ - **Console output**: Real-time server logs
+ - **Error tracking**: Detailed error information with stack traces
+
+ ## 🛠️ Development
+
+ ### Project Structure
+
+ ```
+ masx-openchat-llm/
+ ├── app.py             # FastAPI application
+ ├── model_loader.py    # Model loading and configuration
+ ├── requirements.txt   # Python dependencies
+ ├── env.example        # Environment variables template
+ ├── .gitignore         # Git ignore rules
+ └── README.md          # This file
+ ```
+
+ ### Adding Features
+
+ 1. **New Endpoints**: Add routes in `app.py` (see the sketch below)
+ 2. **Model Configuration**: Modify `model_loader.py`
+ 3. **Dependencies**: Update `requirements.txt`
+ 4. **Environment Variables**: Add to `env.example`
+
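+ As an illustration of step 1, a new endpoint added to `app.py` would follow the usual FastAPI pattern. The route name, the request model, and the standalone `app` instance below are hypothetical; only the FastAPI and pydantic conventions themselves are standard, and in this project the real `app` object already lives in `app.py`.
+
+ ```python
+ from fastapi import FastAPI
+ from pydantic import BaseModel
+
+ app = FastAPI()  # hypothetical: reuse the existing app instance in app.py instead
+
+ class EchoRequest(BaseModel):
+     prompt: str
+
+ # Hypothetical example route: echoes the prompt back without calling the model.
+ @app.post("/echo")
+ def echo(req: EchoRequest) -> dict:
+     return {"response": req.prompt}
+ ```
+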
+ ---
+
+ **Made by the MASX AI team**
+
+ *Ready to build the future of AI-powered applications? Start with MASX OpenChat LLM!*
env.example ADDED
@@ -0,0 +1 @@
+ MODEL_NAME = "openchat/openchat-3.5-1210"