---
title: VacAIgent
emoji: 🏖️
colorFrom: yellow
colorTo: purple
sdk: streamlit
sdk_version: 1.44.1
app_file: app.py
pinned: false
license: mit
short_description: Let AI agents plan your next vacation!
---
# 🏖️ VacAIgent: Streamlit-Integrated AI Crew for Trip Planning
VacAIgent leverages the CrewAI framework to automate and enhance the trip planning experience, integrating a user-friendly Streamlit interface. This project demonstrates how autonomous AI agents can collaborate and execute complex tasks efficiently.
Forked and enhanced from the crewAI examples repository. You can find the application hosted on Hugging Face Spaces here:

Check out the video below for a code walkthrough 👇

(Trip example originally developed by @joaomdmoura)
## CrewAI Framework
CrewAI simplifies the orchestration of role-playing AI agents. In VacAIgent, these agents collaboratively decide on cities and craft a complete itinerary for your trip based on specified preferences, all accessible via a Streamlit user interface.
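CrewAI itself supplies this orchestration; purely to illustrate the shape of it (agents with roles, tasks handed to them in sequence, each task seeing the previous output), here is a dependency-free sketch — this is not CrewAI's actual API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    role: str
    perform: Callable[[str], str]  # takes a task description, returns its output

@dataclass
class Task:
    description: str
    agent: Agent

def run_crew(tasks: list[Task]) -> list[str]:
    """Run tasks in order, feeding each agent its description plus prior context."""
    context, outputs = "", []
    for task in tasks:
        result = task.agent.perform(task.description + context)
        context = "\n" + result  # later agents build on earlier results
        outputs.append(result)
    return outputs
```

In VacAIgent the real agents (city selector, local expert, travel concierge) are LLM-backed, but the data flow between them follows this same pattern.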
## Running the Application
To experience the VacAIgent app:
### Prerequisites

- Get an API key from ScrapingAnt (scrapingant.com)
- Get an API key from Serper (serper.dev)
- Bring your OpenAI-compatible API key
- Bring the model endpoint URL and LLM model ID you want to use
### Deploy Trip Planner

#### Step 1

Clone the repository:

```bash
git clone https://github.com/opea-project/Enterprise-Inference/
cd Enterprise-Inference/examples/vacaigent
```
#### Step 2

Install dependencies:

```bash
pip install -r requirements.txt
```
#### Step 3

Add Streamlit secrets. Create a `.streamlit/secrets.toml` file and set the variables below:

```toml
SERPER_API_KEY=""
SCRAPINGANT_API_KEY=""
OPENAI_API_KEY=""
MODEL_ID="meta-llama/Llama-3.3-70B-Instruct"
MODEL_BASE_URL="https://api.inference.denvrdata.com/v1/"
```
Note: Alternatively, if you deploy the Streamlit application directly on Hugging Face, you can add these secrets under the Settings tab of your Space.
#### Step 4

Run the application:

```bash
streamlit run app.py
```

Your application should now be up and running in your web browser.
⚠️ Disclaimer: The application uses `meta-llama/Llama-3.3-70B-Instruct` by default. Ensure you have access to an OpenAI-compatible API and be aware of any associated costs.
## Details & Explanation

- Components:
  - `trip_tasks.py`: Contains the task prompts for the agents.
  - `trip_agents.py`: Manages the creation of agents.
  - `tools/` directory: Houses the tool classes used by the agents.
  - `app.py`: The heart of the frontend Streamlit app.
### LLM Model

To switch the LLM model being used, change the `MODEL_ID` value in the `.streamlit/secrets.toml` file.
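Under the hood, `MODEL_ID`, `MODEL_BASE_URL`, and `OPENAI_API_KEY` together describe a single OpenAI-compatible endpoint, so switching models is just a matter of changing those values. A hedged sketch of how such a client might be configured (the `llm_client_kwargs` helper is illustrative, not part of the app):

```python
def llm_client_kwargs(secrets: dict) -> dict:
    """Map the Streamlit secrets onto OpenAI-compatible client settings.

    Any endpoint that speaks the OpenAI chat API (a hosted provider,
    vLLM, etc.) can be targeted the same way.
    """
    return {
        "model": secrets["MODEL_ID"],
        "base_url": secrets["MODEL_BASE_URL"],
        "api_key": secrets["OPENAI_API_KEY"],
    }
```

For example, pointing at a local vLLM server would mean setting `MODEL_BASE_URL` to something like `http://localhost:8000/v1/` and `MODEL_ID` to the name of the served model.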
## Using Local Models with Ollama

For enhanced privacy and customization, you can integrate local models like Ollama:

### Setting Up Ollama

- Installation: Follow Ollama's guide for installation.
- Configuration: Customize the model as per your requirements.
### Integrating Ollama with CrewAI

Pass the Ollama model to agents in the CrewAI framework:

```python
from crewai import Agent
from langchain.llms import Ollama

# Tool classes live in the tools/ directory of this example
from tools.search_tools import SearchTools
from tools.browser_tools import BrowserTools

# Use a model you have pulled locally, e.g. via `ollama pull llama3`
ollama_model = Ollama(model="llama3")

class TripAgents:
    # ... existing methods

    def local_expert(self):
        return Agent(
            role='Local Expert',
            goal='Provide the best insights about the selected city',
            backstory='A knowledgeable local guide for the destination city',
            tools=[SearchTools.search_internet,
                   BrowserTools.scrape_and_summarize_website],
            llm=ollama_model,
            verbose=True
        )
```
### Benefits of Local Models

- Privacy: Process sensitive data in-house.
- Customization: Tailor models to fit specific needs.
- Performance: Potentially faster responses with on-premises models.
## License

VacAIgent is open-sourced under the MIT license.