---
pinned: false
license: mit
short_description: coding copilot
---
# CodeLlama Copilot (Web AI Coding Assistant)

A conversational code assistant powered by Meta's CodeLlama-Instruct.

## Features

- Write, fix, and explain code interactively
- Multi-round chat that remembers earlier turns
- Runs entirely in your own cloud workspace

---
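Multi-round chat works by folding earlier turns back into each new prompt. A minimal sketch of that idea, assuming the Llama-2-style `[INST] ... [/INST]` delimiters that CodeLlama-Instruct was trained on (the `build_prompt` helper is our own illustration, not taken from the app files):

```python
def build_prompt(history, user_message):
    """Fold prior (user, assistant) turns into one CodeLlama-Instruct
    prompt using [INST] ... [/INST] delimiters (Llama-2 chat style)."""
    prompt = ""
    for user_turn, assistant_turn in history:
        # Each completed exchange is wrapped in its own <s> ... </s> span.
        prompt += f"<s>[INST] {user_turn} [/INST] {assistant_turn} </s>"
    # The newest question is left open for the model to complete.
    prompt += f"<s>[INST] {user_message} [/INST]"
    return prompt

# A second-round question carries the first exchange as context:
history = [("Write hello world in Python.", 'print("hello world")')]
prompt = build_prompt(history, "Now do it in C.")
```

The model's reply is then appended to `history`, so each round sees the whole conversation so far.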
## How to deploy on Hugging Face Spaces (for free)

Hugging Face provides **free hosting on shared CPUs**; a Pro account with GPU hardware is considerably faster.

### Step by step
1. **Sign up** or log in at [huggingface.co](https://huggingface.co).
2. Go to https://huggingface.co/spaces.
3. Click **New Space**.
4. Name the Space and select **Gradio** as the SDK.
5. Set it to **Public** *(private Spaces need a paid plan)*.
6. Once created, open the Space's repository UI.
7. Upload **app.py** and **requirements.txt** exactly as above.
8. **Commit** the files.
9. Wait a minute or two while the Space builds and launches. Its URL has the form `https://huggingface.co/spaces/<your-username>/<space-name>`.
10. Use it live and share the link.

---
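The **app.py** and **requirements.txt** mentioned in step 7 come from an earlier section that is not reproduced here. Purely as an illustration (the actual file and any pinned versions may differ), a requirements.txt for a Gradio + Transformers Space typically lists something like:

```text
gradio
transformers
torch
accelerate
```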
## Notes

- **Free Spaces run on CPU**, so large models can be slow or may time out.
- Upgrading to **Hugging Face Pro** (about $9/month) lets you choose GPU hardware (e.g. an A10G) in the Space's hardware settings for much better performance.
- No server administration needed; you only upload files.
---

## (Optional) Run locally

```bash
pip install -r requirements.txt
python app.py
```