rogerscuall committed on
Commit
335d527
·
verified ·
1 Parent(s): 890d952

Upload folder using huggingface_hub

Files changed (2)
  1. README.md +7 -64
  2. requirements.txt +7 -11
README.md CHANGED
@@ -14,7 +14,7 @@ sdk_version: 5.30.0
 
 
 Queries network documentation with natural language.
-Recommend ports to users
+Recommend interface ports to users based on network fabric documentation.
 
 ## 🚀 Key Features
 
@@ -26,74 +26,27 @@ Recommend ports to users
 - **LLM Integration**: Powered by Google's Gemini Flash model
 - **Customizable Prompts**: Configure system and user prompts through YAML
 
-## 🛠️ Technical Stack
-
-- **Vector Search**: ChromaDB for efficient semantic search
-- **Embeddings**: Nomic AI's text embeddings (nomic-embed-text-v1)
-- **LLM Orchestration**: Smolagents CodeAgent for tool use and reasoning
-- **UI Framework**: Gradio for the web interface
-- **Model**: Google's Gemini 2.0 Flash via LiteLLM
-
 ## 📋 Requirements
 
 - Python 3.8+
 - Dependencies listed in `requirements.txt`
 
-## 🔧 Installation
-
-1. **Clone the repository**:
-```bash
-git clone https://github.com/yourusername/chat-with-avd-doc.git
-cd chat-with-avd-doc
-```
-
-2. **Create a virtual environment** (recommended):
-```bash
-python -m venv venv
-source venv/bin/activate  # On Windows: venv\Scripts\activate
-```
-
-3. **Install dependencies**:
-```bash
-pip install -r requirements.txt
-```
 
-4. **Configure environment variables** (if needed):
-   - For LiteLLM to access Gemini, you may need to set API keys
-   - Create a `.env` file or set them directly in your environment
 
 ## 🔍 Working with the Vector Database
 
-The application uses ChromaDB to store document embeddings. The database is created in the `vector_db` directory.
+The application uses FAISS to store document embeddings. The database is created in the `faiss_index` directory.
 
 - Documents are organized by "fabric" collection
 - Metadata includes source information that can identify device-specific documentation
 - The system distinguishes between global network information and device-specific details
 
-## 📝 Configuration
-
-### Prompt Templates
-
-Edit `prompts.yaml` to customize:
-
-- System instructions for the AI agent
-- Formatting of responses
-- Behavior and capabilities
-
-### Model Configuration
-
-The application uses Gemini Flash by default. To modify:
-
-```python
-# In app.py
-model = LiteLLMModel("gemini/gemini-2.0-flash")  # Change model here
-```
 
 ## 🚀 Usage
 
 1. **Start the application**:
 ```bash
-python app.py
+uv run app.py
 ```
 
 2. **Access the UI**:
@@ -103,6 +56,8 @@ model = LiteLLMModel("gemini/gemini-2.0-flash")  # Change model here
 - Ask natural language questions about your network
 - Example: "What is the loopback Pool address used by the fabric?"
 - Example: "How many IP addresses are in use in the management network?"
+- Example: "I need unused ports to connect 6 servers. Provide me a pair of connections per server."
+- Example: "What is the name server in this network? -> 8.8.8.8"
 
 ## 📂 Project Structure
 
@@ -111,19 +66,10 @@ chat-with-avd-doc/
 ├── app.py            # Main application with Retriever Tool and Gradio UI
 ├── prompts.yaml      # Configuration for AI prompts and behavior
 ├── requirements.txt  # Python dependencies
-└── vector_db/        # ChromaDB database files
-    ├── chroma.sqlite3    # SQLite database for ChromaDB
-    └── [UUID folders]    # Vector data storage
+└── faiss_index/      # FAISS database files
+    ├── faiss.index       # FAISS index file
 ```
 
-## 🧪 How It Works
-
-1. **User Query**: User enters a natural language question through the Gradio UI
-2. **Semantic Encoding**: The query is converted to a vector embedding
-3. **Vector Search**: ChromaDB searches for similar document vectors
-4. **Context Assembly**: Top matches are assembled with source metadata
-5. **LLM Processing**: The Gemini model processes the query with retrieved context
-6. **Response Generation**: The system returns a natural language answer
 
 ## 📄 License
 
@@ -139,6 +85,3 @@ Contributions are welcome! Please feel free to submit a Pull Request.
 4. Push to the branch (`git push origin feature/amazing-feature`)
 5. Open a Pull Request
 
-## Test Questions
-
-1. What is the name server in this network? 8.8.8.8
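The commit swaps ChromaDB for a FAISS index, but the retrieval flow the README describes is unchanged: embed the query, rank the stored document vectors by distance, and hand the top matches to the model. Below is a minimal sketch of that search step; the document texts and the tiny 2-D vectors are invented for illustration and stand in for real sentence-transformers embeddings stored in `faiss_index/`:

```python
import numpy as np

# Toy stand-ins for the embedded network documentation. In the real app
# these vectors come from a HuggingFace embedding model; the texts and
# 2-D vectors here are hypothetical.
doc_texts = [
    "Loopback pool 10.255.0.0/24 is assigned to fabric loopbacks",
    "The management network uses 192.168.0.0/24",
    "The name server for the fabric is 8.8.8.8",
]
doc_vecs = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])

def search(query_vec, k=1):
    # FAISS's IndexFlatL2 does the same thing at scale: exhaustive
    # L2-distance ranking of stored vectors against the query vector.
    dists = np.linalg.norm(doc_vecs - query_vec, axis=1)
    return [doc_texts[i] for i in np.argsort(dists)[:k]]

print(search(np.array([0.9, 0.1])))  # closest match: the loopback-pool doc
```

In the application this ranking happens inside the Retriever Tool; the agent then passes the top-k texts plus their source metadata to the LLM as context.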
requirements.txt CHANGED
@@ -1,12 +1,8 @@
 PyYAML
-chromadb
-sentence-transformers
-smolagents
-gradio
-smolagents[litellm]
-einops
-langchain-community
-langchain
-faiss-cpu
-unstructured
-gradio[mcp]
+langchain-community      # For FAISS, HuggingFaceEmbeddings
+langchain                # Core Langchain
+faiss-cpu                # FAISS vector store
+sentence-transformers    # For HuggingFaceEmbeddings
+openai-agents            # OpenAI Agents SDK
+gradio[mcp]
+unstructured             # Required by loader.py
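One of the README's new example queries asks the agent for "unused ports to connect 6 servers ... a pair of connections per server", which implies pairing up free interfaces for dual-homed servers. A toy sketch of that allocation logic; the interface names and the free-port list are invented, not taken from the application:

```python
def allocate_port_pairs(free_ports, num_servers):
    """Assign one pair of free switch ports per server (dual-homing sketch)."""
    needed = num_servers * 2
    if len(free_ports) < needed:
        raise ValueError(f"need {needed} free ports, only {len(free_ports)} available")
    # Take ports two at a time, in the order the documentation lists them.
    return [tuple(free_ports[2 * i: 2 * i + 2]) for i in range(num_servers)]

# Hypothetical unused interfaces gathered from the fabric documentation.
free = [f"Ethernet{n}" for n in range(10, 22)]
pairs = allocate_port_pairs(free, 6)
print(pairs[0])  # ('Ethernet10', 'Ethernet11')
```

In practice the agent derives the free-port list from the retrieved fabric documentation rather than a hard-coded list; this only illustrates the pairing step.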