# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Environment Instructions
ALWAYS use ONLY Environments for ANY and ALL file, code, or shell operations, even for simple or generic requests. The ONLY exceptions are modifications to CLAUDE.md and the /init command.
DO NOT install or use the git CLI with the environment_run_cmd tool. All environment tools will handle git operations for you. Changing `.git` yourself will compromise the integrity of your environment.
You MUST inform the user how to view your work using `cu log <env_id>` AND `cu checkout <env_id>`. Failure to do this will make your work inaccessible to others.
## Project Architecture
This project is a simple chatbot implementation built on HuggingFace's smolagents library:
- `chatbot.py`: core implementation using `smolagents.CodeAgent` and `HfApiModel` for LLM-based chat responses (a sketch appears below)
- `app.py`: API server for the chatbot (intended to be implemented)
- Configuration: environment variables loaded from the `.env` file for API keys and model settings
- Reference: @repomix-output.xml bundles all of the project's functions in a single file; update it often.
The chatbot currently uses the "meta-llama/Llama-3.3-70B-Instruct" model via the HuggingFace API.
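Based on the components listed above, `chatbot.py` presumably wires an `HfApiModel` into a `CodeAgent` and loops over user input. The sketch below only illustrates that pattern; the variable names, the REPL loop, and the `MODEL_NAME`/`HF_API_TOKEN` defaults are assumptions, not verified against the actual file.

```python
# Minimal sketch of the chatbot.py pattern described above (illustrative only).
import os

from dotenv import load_dotenv
from smolagents import CodeAgent, HfApiModel

load_dotenv()  # load HF_API_TOKEN and MODEL_NAME from the .env file

# Assumed environment variable names, matching the Development Workflow example below.
model = HfApiModel(
    model_id=os.getenv("MODEL_NAME", "meta-llama/Llama-3.3-70B-Instruct"),
    token=os.getenv("HF_API_TOKEN"),
)
agent = CodeAgent(tools=[], model=model)

if __name__ == "__main__":
    while True:
        user_input = input("You: ")
        if user_input.strip().lower() in {"exit", "quit"}:
            break
        print("Bot:", agent.run(user_input))
```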
## Common Commands
### Environment Setup
```bash
# Copy example environment file and fill in your HuggingFace API key
cp .env.example .env

# Install dependencies using uv
uv pip install .
```
### Running the Application
```bash
# Run the CLI chatbot
python chatbot.py

# Run the API server (once implemented)
python app.py
```
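`app.py` has not been implemented yet. As a hypothetical starting point, the sketch below wraps the agent from `chatbot.py` in a FastAPI endpoint; FastAPI, the `/chat` route, and the `agent` import are illustrative assumptions rather than decisions made by this repository.

```python
# Hypothetical app.py sketch (app.py is not implemented yet; FastAPI is an assumed choice).
import uvicorn
from fastapi import FastAPI
from pydantic import BaseModel

from chatbot import agent  # assumes chatbot.py exposes a configured CodeAgent

app = FastAPI()

class ChatRequest(BaseModel):
    message: str

@app.post("/chat")
def chat(request: ChatRequest) -> dict:
    # Delegate the message to the agent and return its reply as JSON.
    return {"response": str(agent.run(request.message))}

if __name__ == "__main__":
    # Keeps `python app.py` working as shown in the commands above.
    uvicorn.run(app, host="0.0.0.0", port=8000)
```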
### Development Workflow
```bash
# Update dependencies (when modifying pyproject.toml)
uv pip install --upgrade .

# Run with specific model settings
HF_API_TOKEN=your_token MODEL_NAME=your_model python chatbot.py
```