---
title: VacAIgent
emoji: 🐨
colorFrom: yellow
colorTo: purple
sdk: streamlit
sdk_version: 1.44.1
app_file: app.py
pinned: false
license: mit
short_description: Let AI agents plan your next vacation!
---

# 🏖️ VacAIgent: Streamlit-Integrated AI Crew for Trip Planning

VacAIgent leverages the CrewAI framework to automate and enhance the trip-planning experience, integrating a user-friendly Streamlit interface. This project demonstrates how autonomous AI agents can collaborate and execute complex tasks efficiently.

_Forked and enhanced from the_ [_crewAI examples repository_](https://github.com/joaomdmoura/crewAI-examples/tree/main/trip_planner).

You can find the application hosted on Hugging Face Spaces here:

[![](images/hf_vacaigent.png)](https://huggingface.co/spaces/Intel/vacaigent)

**Check out the video below for a code walkthrough** 👇

Watch the video

(_Trip example originally developed by [@joaomdmoura](https://x.com/joaomdmoura)_)

## CrewAI Framework

CrewAI simplifies the orchestration of role-playing AI agents. In VacAIgent, these agents collaboratively decide on cities and craft a complete itinerary for your trip based on your specified preferences, all accessible via a Streamlit user interface.

## Running the Application

To experience the VacAIgent app:

### Pre-Requisites

1. Get a **ScrapingAnt** API key from [scrapingant.com](https://scrapingant.com/)
2. Get a **Serper** API key from [serper.dev](https://serper.dev/)
3. Bring your OpenAI-compatible API key
4. Bring the model endpoint URL and the LLM model ID that you want to use

### Deploy Trip Planner

#### Step 1

Clone the repository:

```sh
git clone https://github.com/opea-project/Enterprise-Inference/
cd Enterprise-Inference/examples/vacaigent
```

#### Step 2

Install the dependencies:

```sh
pip install -r requirements.txt
```

#### Step 3

Add Streamlit secrets.
Create a `.streamlit/secrets.toml` file and set the variables below:

```toml
SERPER_API_KEY = ""
SCRAPINGANT_API_KEY = ""
OPENAI_API_KEY = ""
MODEL_ID = "meta-llama/Llama-3.3-70B-Instruct"
MODEL_BASE_URL = "https://api.inference.denvrdata.com/v1/"
```

**Note**: If deploying the Streamlit application directly on Hugging Face, you can alternatively add these secrets to the Hugging Face Spaces Secrets, under the Settings tab.

#### Step 4

Run the application:

```sh
streamlit run app.py
```

Your application should now be up and running in your web browser.

★ **Disclaimer**: The application uses `meta-llama/Llama-3.3-70B-Instruct` by default. Ensure you have access to an OpenAI-compatible API and be aware of any associated costs.

## Details & Explanation

- **Components**:
  - [trip_tasks.py](trip_tasks.py): Contains task prompts for the agents.
  - [trip_agents.py](trip_agents.py): Manages the creation of agents.
  - [tools](tools) directory: Houses the tool classes used by agents.
  - [app.py](app.py): The heart of the frontend Streamlit app.

## LLM Model

To switch the LLM model being used, change `MODEL_ID` in the `.streamlit/secrets.toml` file.

## Using Local Models with Ollama

For enhanced privacy and customization, you can integrate local models such as those served by Ollama:

### Setting Up Ollama

- **Installation**: Follow Ollama's guide for installation.
- **Configuration**: Customize the model as per your requirements.

### Integrating Ollama with CrewAI

Pass the Ollama model to the agents in the CrewAI framework:

```python
from crewai import Agent
from langchain.llms import Ollama

# Adjust these import paths to match the tools/ directory of this repo
from tools.search_tools import SearchTools
from tools.browser_tools import BrowserTools

# Use any model you have pulled locally, e.g. `ollama pull llama3`
ollama_model = Ollama(model="llama3")

class TripAgents:
    # ... existing methods

    def local_expert(self):
        return Agent(
            role='Local Expert',
            tools=[SearchTools.search_internet,
                   BrowserTools.scrape_and_summarize_website],
            llm=ollama_model,
            verbose=True
        )
```

## Benefits of Local Models

- **Privacy**: Process sensitive data in-house.
- **Customization**: Tailor models to fit specific needs.
- **Performance**: Potentially faster responses with on-premises models.

## License

VacAIgent is open-sourced under the MIT license.