Commit ac573bf by mtyrrell (parent: 8724cc4)

Files changed: README.md (+30 −0)

README.md CHANGED
@@ -21,3 +21,33 @@ This is an LLM-based generation service designed to be deployed as a modular com
   - HuggingFace: `HF_TOKEN`

2. Inference provider and model settings are accessible via `params.cfg`
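As an illustrative sketch only — the section and key names below are assumptions, not the actual contents of the file — a `params.cfg` for an inference service like this one might look like:

```ini
; Hypothetical params.cfg sketch. The real section and key names are
; defined by the service itself and are not documented in this README.
[inference]
provider = huggingface
model = meta-llama/Llama-3.1-8B-Instruct
max_new_tokens = 512
temperature = 0.2
```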

## MCP Endpoint

### Available Tools

#### `rag_generate`

Generates an answer to a user query from supplied context via retrieval-augmented generation (RAG): the query and the relevant context are passed to a language model, which produces an answer grounded in the provided information.

**Input Schema:**

| Parameter | Type | Description |
|-----------|------|-------------|
| `query` | string | The user's question or query |
| `context` | string | The relevant context/documents to use for answering |

**Returns:** The generated answer based on the query and context.

**Example Usage:**

```json
{
  "query": "What are the benefits of renewable energy?",
  "context": "Documents and information about renewable energy sources..."
}
```

---

*This tool uses an LLM to generate an answer from the most relevant information in the context, guided by the input query.*
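As a sketch of how a client might invoke the tool above, the snippet below builds a JSON-RPC 2.0 `tools/call` request for `rag_generate` following the MCP convention. The helper name is illustrative, and the transport (HTTP, stdio, etc.) and endpoint URL are deployment-specific assumptions not covered by this README:

```python
import json


def build_rag_generate_request(query: str, context: str, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request body for the `rag_generate` tool.

    Only the message framing is shown here; sending it over the service's
    actual MCP transport is deployment-specific.
    """
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "rag_generate",
            "arguments": {
                "query": query,
                "context": context,
            },
        },
    }
    return json.dumps(payload)


# Mirrors the Example Usage above.
request_body = build_rag_generate_request(
    "What are the benefits of renewable energy?",
    "Documents and information about renewable energy sources...",
)
```

The `arguments` object carries exactly the two parameters from the input schema; any extra keys would depend on how the server validates tool calls.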