Commit bf5275b by rogerscuall · verified · 1 parent: c23e128

Upload folder using huggingface_hub
.claude/settings.local.json ADDED
@@ -0,0 +1,13 @@
+{
+  "permissions": {
+    "allow": [
+      "mcp__container-use__environment_open",
+      "mcp__container-use__environment_update",
+      "mcp__container-use__environment_file_write",
+      "mcp__container-use__environment_file_read",
+      "mcp__container-use__environment_run_cmd",
+      "mcp__container-use__environment_file_list"
+    ],
+    "deny": []
+  }
+}
.container-use/AGENT.md ADDED
@@ -0,0 +1,6 @@
+This environment contains a chatbot project using HuggingFace's smol-agent and Gradio:
+- Python 3.10 is installed
+- Required packages: smolagents, gradio, python-dotenv
+- Use pip for dependency management
+- Environment variables are loaded from .env file using python-dotenv
+- Run the app with: python app_gradio.py
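The dotenv-based configuration mentioned above looks roughly like this in code (a minimal sketch; the variable names mirror `.env.example` elsewhere in this commit, and the fallback values shown are illustrative assumptions):

```python
# Minimal sketch: load chatbot settings from .env with python-dotenv.
# Variable names mirror .env.example; the fallback values are assumptions.
import os

from dotenv import load_dotenv

load_dotenv()  # reads KEY=value pairs from .env into the process environment

hf_api_token = os.getenv("HF_API_TOKEN")  # HuggingFace API token
model_name = os.getenv("MODEL_NAME", "HuggingFaceH4/zephyr-7b-beta")
max_new_tokens = int(os.getenv("MAX_NEW_TOKENS", "500"))
temperature = float(os.getenv("TEMPERATURE", "0.7"))

print(f"Using model {model_name} (temperature={temperature})")
```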
.container-use/environment.json ADDED
@@ -0,0 +1,9 @@
+{
+  "workdir": "/workdir",
+  "base_image": "python:3.10-slim",
+  "setup_commands": [
+    "apt-get update",
+    "apt-get install -y git",
+    "pip install gradio smolagents python-dotenv"
+  ]
+}
.github/copilot-instructions.md ADDED
@@ -0,0 +1,5 @@
+ALWAYS use ONLY Environments for ANY and ALL file, code, or shell operations—NO EXCEPTIONS—even for simple or generic requests.
+
+DO NOT install or use the git cli with the environment_run_cmd tool. All environment tools will handle git operations for you. Changing ".git" yourself will compromise the integrity of your environment.
+
+You MUST inform the user how to view your work using `cu log <env_id>` AND `cu checkout <env_id>`. Failure to do this will make your work inaccessible to others.
.gitignore ADDED
@@ -0,0 +1,35 @@
+# Environment variables
+.env
+
+# Python
+__pycache__/
+*.py[cod]
+*$py.class
+*.so
+.Python
+env/
+build/
+develop-eggs/
+dist/
+downloads/
+eggs/
+.eggs/
+lib/
+lib64/
+parts/
+sdist/
+var/
+*.egg-info/
+.installed.cfg
+*.egg
+
+# Virtual environments
+venv/
+ENV/
+.venv
+
+# IDE files
+.idea/
+.vscode/
+*.swp
+*.swo
.python-version ADDED
@@ -0,0 +1 @@
+3.12
CLAUDE.md ADDED
@@ -0,0 +1,54 @@
+# CLAUDE.md
+
+This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
+
+## Environment Instructions
+
+ALWAYS use ONLY Environments for ANY and ALL file, code, or shell operations—the ONLY EXCEPTIONS are modifications to CLAUDE.md or the /init command—even for simple or generic requests.
+
+DO NOT install or use the git cli with the environment_run_cmd tool. All environment tools will handle git operations for you. Changing ".git" yourself will compromise the integrity of your environment.
+
+You MUST inform the user how to view your work using `cu log <env_id>` AND `cu checkout <env_id>`. Failure to do this will make your work inaccessible to others.
+
+## Project Architecture
+
+This project is a simple chatbot implementation using HuggingFace's smol-agent:
+
+- **chatbot.py**: Core implementation using `smolagents.CodeAgent` and `HfApiModel` for LLM-based chat responses
+- **app.py**: API server for the chatbot (intended to be implemented)
+- **Configuration**: Environment variables loaded from `.env` file for API keys and model settings
+- **Reference**: @repomix-output.xml packs the whole project into a single file; update it often.
+
+The chatbot currently uses the "meta-llama/Llama-3.3-70B-Instruct" model via the HuggingFace API.
+
+## Common Commands
+
+### Environment Setup
+
+```bash
+# Copy example environment file and fill in your HuggingFace API key
+cp .env.example .env
+
+# Install dependencies using uv
+uv pip install .
+```
+
+### Running the Application
+
+```bash
+# Run the CLI chatbot
+python chatbot.py
+
+# Run the API server (once implemented)
+python app.py
+```
+
+### Development Workflow
+
+```bash
+# Update dependencies (when modifying pyproject.toml)
+uv pip install --upgrade .
+
+# Run with specific model settings
+HF_API_TOKEN=your_token MODEL_NAME=your_model python chatbot.py
+```
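The `HfApiModel`-based setup described in the Project Architecture section above corresponds roughly to the following minimal sketch (the model id and agent arguments mirror the files in this commit; note that the committed `chatbot.py` has since switched to Amazon Bedrock):

```python
# Minimal sketch of the HfApiModel-based agent described in CLAUDE.md.
# Mirrors the chatbot.py snapshot in repomix-output.xml; requires a valid
# HF_API_TOKEN in the environment for the HuggingFace Inference API.
from smolagents import CodeAgent, HfApiModel

model = HfApiModel(model_id="meta-llama/Llama-3.3-70B-Instruct")

agent = CodeAgent(
    tools=[],
    model=model,
    name="mychatbot",
    description="A chatbot for various tasks",
    max_steps=10,
    add_base_tools=False,
)

if __name__ == "__main__":
    print(agent.run("Introduce yourself in one sentence."))
```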
README.md CHANGED
@@ -1,12 +1,49 @@
 ---
-title: Huggingface Small Agent Test
-emoji: 📉
-colorFrom: gray
-colorTo: red
+title: huggingface-small-agent-test
+app_file: app.py
 sdk: gradio
 sdk_version: 5.34.2
-app_file: app.py
-pinned: false
 ---
+# HuggingFace Smol-Agent Chatbot
+
+A simple chatbot implementation using HuggingFace's smol-agent.
+
+## Setup
+
+1. Clone this repository
+2. Create a `.env` file based on `.env.example`:
+```bash
+cp .env.example .env
+```
+3. Add your HuggingFace API token to the `.env` file
+
+## Usage
+
+### Install dependencies
+
+```bash
+uv pip install .
+```
+
+### Run the CLI chatbot
+
+```bash
+python chatbot.py
+```
+
+### Run the API server
+
+```bash
+python app.py
+```
+
+The API will be available at http://localhost:8000
+
+## API Endpoints
+
+- **POST /chat**: Send a message to the chatbot
+  - Request: `{"message": "Hello, how are you?"}`
+  - Response: `{"response": "I'm doing well, thank you for asking!"}`
 
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+- **GET /health**: Check if the API is running
+  - Response: `{"status": "ok"}`
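As a concrete illustration of the endpoint contract documented above, here is a minimal client sketch; it assumes the API server described in the README is running on localhost:8000 (no such server is included in this commit):

```python
# Hypothetical client for the endpoints documented in README.md.
# Assumes an API server is listening on http://localhost:8000.
import requests

BASE_URL = "http://localhost:8000"

# GET /health should answer {"status": "ok"}
health = requests.get(f"{BASE_URL}/health", timeout=10)
print(health.json())

# POST /chat takes {"message": ...} and returns {"response": ...}
chat = requests.post(
    f"{BASE_URL}/chat",
    json={"message": "Hello, how are you?"},
    timeout=60,
)
print(chat.json()["response"])
```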
app.py ADDED
@@ -0,0 +1,39 @@
+import os
+import gradio as gr
+from dotenv import load_dotenv
+from smolagents import CodeAgent, HfApiModel
+
+# Load environment variables
+load_dotenv()
+
+# Get API token and model settings from environment variables
+hf_api_token = os.getenv("HF_API_TOKEN")
+model_name = os.getenv("MODEL_NAME", "meta-llama/Llama-3.3-70B-Instruct")
+max_new_tokens = int(os.getenv("MAX_NEW_TOKENS", 500))
+temperature = float(os.getenv("TEMPERATURE", 0.7))
+from chatbot import agent
+
+
+
+# Chat history to maintain conversation context
+def chatbot(message, history):
+    # Get response from agent
+    response = agent.run(message)
+    return response
+
+# Create Gradio interface
+demo = gr.ChatInterface(
+    fn=chatbot,
+    title="Smol-Agent Chatbot",
+    description="Ask me anything!",
+    examples=[
+        "What is machine learning?",
+        "How does a transformer model work?",
+        "Explain quantum computing in simple terms"
+    ],
+    theme=gr.themes.Soft()
+)
+
+# Launch the app
+if __name__ == "__main__":
+    demo.launch(mcp_server=True)
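Once the Gradio app above is launched, it can also be driven programmatically; a minimal sketch with `gradio_client` follows (the local URL and the `/chat` endpoint name are assumptions based on Gradio's usual `ChatInterface` defaults, not values taken from this commit):

```python
# Hedged sketch: call the running Gradio ChatInterface from Python.
# The URL and the "/chat" api_name are assumptions (Gradio defaults).
from gradio_client import Client

client = Client("http://127.0.0.1:7860/")
answer = client.predict("What is machine learning?", api_name="/chat")
print(answer)
```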
chatbot.py ADDED
@@ -0,0 +1,20 @@
+# /// script
+# dependencies = [
+#   "smolagents",
+# ]
+# ///
+from smolagents import CodeAgent, HfApiModel, LiteLLMModel, AmazonBedrockServerModel
+
+model = AmazonBedrockServerModel(model_id="us.anthropic.claude-3-7-sonnet-20250219-v1:0")
+
+
+# Managed Agent
+agent = CodeAgent(
+    tools=[],
+    model=model,
+    name="mychatbot",
+    description="A chatbot for various tasks",
+    verbosity_level=2,
+    max_steps=10,
+    add_base_tools=False,
+)
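Elsewhere in this commit the agent is consumed via `agent.run(...)` (see `app.py`); a minimal standalone usage sketch, assuming AWS credentials with Bedrock access are configured in the environment:

```python
# Minimal usage sketch for the agent defined in chatbot.py.
# Assumes AWS credentials with Bedrock access are available in the environment.
from chatbot import agent

answer = agent.run("Summarize what this chatbot can do in one sentence.")
print(answer)
```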
process1.md ADDED
@@ -0,0 +1,5 @@
+# Create a chat bot
+
+Using HF smol-agent, let's create a simple chatbot that reads a string and replies.
+Use python-dotenv to load all the variables from the .env file.
+Use uvx to manage Python dependencies.
pyproject.toml ADDED
@@ -0,0 +1,11 @@
+[project]
+name = "huggingface-small-agent-test"
+version = "0.1.0"
+description = "Add your description here"
+readme = "README.md"
+requires-python = ">=3.12"
+dependencies = [
+    "smolagents[aws-sdk,bedrock,litellm]>=1.18.0",
+    "gradio[mcp]>=4.0.0",
+    "python-dotenv>=1.0.0",
+]
repomix-output.xml ADDED
@@ -0,0 +1,243 @@
+This file is a merged representation of the entire codebase, combined into a single document by Repomix.
+
+<file_summary>
+This section contains a summary of this file.
+
+<purpose>
+This file contains a packed representation of the entire repository's contents.
+It is designed to be easily consumable by AI systems for analysis, code review,
+or other automated processes.
+</purpose>
+
+<file_format>
+The content is organized as follows:
+1. This summary section
+2. Repository information
+3. Directory structure
+4. Repository files (if enabled)
+5. Multiple file entries, each consisting of:
+  - File path as an attribute
+  - Full contents of the file
+</file_format>
+
+<usage_guidelines>
+- This file should be treated as read-only. Any changes should be made to the
+  original repository files, not this packed version.
+- When processing this file, use the file path to distinguish
+  between different files in the repository.
+- Be aware that this file may contain sensitive information. Handle it with
+  the same level of security as you would the original repository.
+</usage_guidelines>
+
+<notes>
+- Some files may have been excluded based on .gitignore rules and Repomix's configuration
+- Binary files are not included in this packed representation. Please refer to the Repository Structure section for a complete list of file paths, including binary files
+- Files matching patterns in .gitignore are excluded
+- Files matching default ignore patterns are excluded
+- Files are sorted by Git change count (files with more changes are at the bottom)
+</notes>
+
+</file_summary>
+
+<directory_structure>
+.claude/
+  settings.local.json
+.container-use/
+  AGENT.md
+  environment.json
+.env.example
+.gitignore
+.python-version
+chatbot.py
+CLAUDE.md
+process1.md
+pyproject.toml
+README.md
+</directory_structure>
+
+<files>
+This section contains the contents of the repository's files.
+
+<file path=".claude/settings.local.json">
+{
+  "permissions": {
+    "allow": [
+      "mcp__container-use__environment_open",
+      "mcp__container-use__environment_update",
+      "mcp__container-use__environment_file_write"
+    ],
+    "deny": []
+  }
+}
+</file>
+
+<file path=".env.example">
+# HF API Key for smol-agent
+HF_API_TOKEN=your_huggingface_api_token_here
+
+# Chatbot settings
+MODEL_NAME=HuggingFaceH4/zephyr-7b-beta
+MAX_NEW_TOKENS=500
+TEMPERATURE=0.7
+</file>
+
+<file path=".gitignore">
+# Environment variables
+.env
+
+# Python
+__pycache__/
+*.py[cod]
+*$py.class
+*.so
+.Python
+env/
+build/
+develop-eggs/
+dist/
+downloads/
+eggs/
+.eggs/
+lib/
+lib64/
+parts/
+sdist/
+var/
+*.egg-info/
+.installed.cfg
+*.egg
+
+# Virtual environments
+venv/
+ENV/
+.venv
+
+# IDE files
+.idea/
+.vscode/
+*.swp
+*.swo
+</file>
+
+<file path=".python-version">
+3.12
+</file>
+
+<file path="process1.md">
+# Create a chat bot
+
+Create HF smol-agent lets create a simple chatbot that read a string and reply.
+Use python-dotenv to load all the variables from the .env
+Use uvx to manage python dependencies.
+</file>
+
+<file path=".container-use/AGENT.md">
+This environment contains a chatbot project using HuggingFace's smol-agent.
+- Python 3.10 is installed
+- Use `uvx` for dependency management
+- Environment variables are loaded from .env file using python-dotenv
+- Use `uvx run` to run Python scripts with dependencies
+</file>
+
+<file path=".container-use/environment.json">
+{
+  "workdir": "/workdir",
+  "base_image": "python:3.10-slim",
+  "setup_commands": [
+    "apt-get update",
+    "apt-get install -y git",
+    "pip install uv",
+    "alias uvx=\"uv\""
+  ]
+}
+</file>
+
+<file path="CLAUDE.md">
+ALWAYS use ONLY Environments for ANY and ALL file, code, or shell operations—NO EXCEPTIONS—even for simple or generic requests.
+
+DO NOT install or use the git cli with the environment_run_cmd tool. All environment tools will handle git operations for you. Changing ".git" yourself will compromise the integrity of your environment.
+
+You MUST inform the user how to view your work using `cu log <env_id>` AND `cu checkout <env_id>`. Failure to do this will make your work inaccessible to others.
+</file>
+
+<file path="pyproject.toml">
+[project]
+name = "huggingface-small-agent-test"
+version = "0.1.0"
+description = "Add your description here"
+readme = "README.md"
+requires-python = ">=3.12"
+dependencies = [
+    "smolagents>=1.18.0",
+]
+</file>
+
+<file path="README.md">
+# HuggingFace Smol-Agent Chatbot
+
+A simple chatbot implementation using HuggingFace's smol-agent.
+
+## Setup
+
+1. Clone this repository
+2. Create a `.env` file based on `.env.example`:
+```bash
+cp .env.example .env
+```
+3. Add your HuggingFace API token to the `.env` file
+
+## Usage
+
+### Install dependencies
+
+```bash
+uv pip install .
+```
+
+### Run the CLI chatbot
+
+```bash
+python chatbot.py
+```
+
+### Run the API server
+
+```bash
+python api.py
+```
+
+The API will be available at http://localhost:8000
+
+## API Endpoints
+
+- **POST /chat**: Send a message to the chatbot
+  - Request: `{"message": "Hello, how are you?"}`
+  - Response: `{"response": "I'm doing well, thank you for asking!"}`
+
+- **GET /health**: Check if the API is running
+  - Response: `{"status": "ok"}`
+</file>
+
+<file path="chatbot.py">
+# /// script
+# dependencies = [
+#   "smolagents",
+# ]
+# ///
+from smolagents import CodeAgent, HfApiModel
+model_id = "meta-llama/Llama-3.3-70B-Instruct"
+
+
+# Managed Agent
+agent = CodeAgent(
+    tools=[],
+    model=HfApiModel(model_id=model_id),
+    name="mychatbot",
+    description="A chatbot for various tasks",
+    verbosity_level=2,
+    max_steps=10,
+    add_base_tools=False,
+)
+</file>
+
+</files>
requirements.txt ADDED
@@ -0,0 +1,4 @@
+smolagents>=1.18.0
+gradio>=4.0.0
+python-dotenv>=1.0.0
+smolagents[litellm]
uv.lock ADDED
The diff for this file is too large to render. See raw diff