# Fashion Analyzer - Docker Compose Setup

This project provides a secure, containerized fashion analysis application using Ollama and FastAPI, orchestrated with Docker Compose.
## Architecture

The application consists of three services:

- **Ollama Service**: runs the Ollama server with the LLaVA model for vision analysis
- **FastAPI Service**: provides the web API and user interface
- **Model Loader**: one-time service to download the required LLaVA model
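The wiring of these three services might look roughly like the sketch below. The image tags, health-check command, volume name, and model-pull entrypoint are assumptions based on this README, not the exact contents of the project's `docker-compose.yml`:

```yaml
services:
  ollama:
    image: ollama/ollama:0.9.2          # pinned version, per the security notes below
    ports:
      - "11434:11434"
    volumes:
      - ollama_models:/root/.ollama     # persists downloaded models across restarts
    healthcheck:
      test: ["CMD", "ollama", "list"]   # assumed check; any command probing the API works
      interval: 10s
      retries: 5

  model-loader:
    image: ollama/ollama:0.9.2
    entrypoint: ["ollama", "pull", "llava"]   # one-time model download
    environment:
      - OLLAMA_HOST=ollama:11434
    depends_on:
      ollama:
        condition: service_healthy

  fastapi:
    build:
      context: .
      dockerfile: Dockerfile.fastapi
    ports:
      - "7860:7860"
    env_file: .env
    depends_on:
      ollama:
        condition: service_healthy

volumes:
  ollama_models:
```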
## Security Features

- ✅ **Pinned Ollama version (0.9.2)**: no critical vulnerabilities
- ✅ **Non-root user execution**: enhanced container security
- ✅ **Security updates**: latest package updates applied
- ✅ **Health checks**: service monitoring and restart policies
- ✅ **Network isolation**: services communicate via an internal network
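The non-root execution and security-update items could look like this in `Dockerfile.fastapi`. The base image, user name, and `uvicorn` start command are illustrative assumptions, not the actual file:

```dockerfile
FROM python:3.11-slim

# Apply the latest security updates
RUN apt-get update && apt-get upgrade -y && rm -rf /var/lib/apt/lists/*

# Create a dedicated non-root user
RUN useradd --create-home appuser
WORKDIR /home/appuser/app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY fast.py .

# Drop privileges before running the application
USER appuser

EXPOSE 7860
CMD ["uvicorn", "fast:app", "--host", "0.0.0.0", "--port", "7860"]
```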
## Quick Start

### Prerequisites

- Docker Engine 20.10+
- Docker Compose 2.0+
### 1. Start the Application

```bash
# Start all services
docker-compose up -d

# View logs
docker-compose logs -f

# Check service status
docker-compose ps
```
### 2. Access the Application

- **Web Interface**: http://localhost:7860
- **API Documentation**: http://localhost:7860/docs
- **Health Check**: http://localhost:7860/health
- **Ollama API**: http://localhost:11434
### 3. Stop the Application

```bash
# Stop all services
docker-compose down

# Stop and remove volumes (removes downloaded models)
docker-compose down -v
```
## Project Structure

```
AI/
├── docker-compose.yml   # Main orchestration file
├── Dockerfile.fastapi   # FastAPI service Dockerfile
├── .env                 # Environment variables
├── .dockerignore        # Docker build exclusions
├── fast.py              # FastAPI application
├── requirements.txt     # Python dependencies
└── logs/                # Application logs (created at runtime)
```
## Configuration

### Environment Variables (.env)

```env
OLLAMA_HOST=0.0.0.0:11434
OLLAMA_ORIGINS=*
OLLAMA_BASE_URL=http://ollama:11434
OLLAMA_PORT=11434
FASTAPI_PORT=7860
```
### Custom Configuration

To modify ports or other settings:

1. Edit the `.env` file
2. Restart the services: `docker-compose up -d`
## Development

### Building Images

```bash
# Build only the FastAPI service
docker-compose build fastapi

# Build with no cache
docker-compose build --no-cache
```
### Viewing Logs

```bash
# All services
docker-compose logs -f

# A specific service
docker-compose logs -f fastapi
docker-compose logs -f ollama
```
### Debugging

```bash
# Execute commands in running containers
docker-compose exec fastapi bash
docker-compose exec ollama bash

# Check service health
docker-compose exec fastapi curl http://localhost:7860/health
```
## Monitoring

### Health Checks

All services include health checks:

- **Ollama**: checks API availability
- **FastAPI**: checks application health and Ollama connectivity
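The FastAPI check (application health plus Ollama connectivity) can be sketched as a minimal handler. The function names, the `degraded` status value, and the use of a plain `urllib` probe are assumptions for illustration, not the actual `fast.py` code:

```python
import urllib.request


def ollama_reachable(base_url="http://ollama:11434", timeout=2):
    """Return True if the Ollama server answers an HTTP request."""
    try:
        urllib.request.urlopen(base_url, timeout=timeout)
        return True
    except OSError:
        return False


def health(check=ollama_reachable):
    """Payload for the /health endpoint: healthy only if Ollama is reachable."""
    status = "healthy" if check() else "degraded"
    return {"status": status}
```

Injecting the connectivity check as a parameter keeps the handler easy to exercise without a running Ollama instance.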
### Service Dependencies

- FastAPI waits for Ollama to be healthy before starting
- The model loader runs after Ollama is ready
## Troubleshooting

### Common Issues

- **Port conflicts**: change the ports in the `.env` file
- **Model download fails**: check the internet connection and the Ollama logs
- **FastAPI can't connect to Ollama**: verify the network configuration
### Reset Everything

```bash
# Stop and remove everything
docker-compose down -v --remove-orphans

# Remove images
docker-compose down --rmi all

# Start fresh
docker-compose up -d
```
## Scaling

To run multiple FastAPI instances:

```bash
# Scale the FastAPI service to three replicas
docker-compose up -d --scale fastapi=3
```

**Note**: You'll need a load balancer to distribute traffic across multiple instances.
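A minimal load-balancer sketch, assuming nginx sits in front of the scaled replicas. The upstream host names follow Docker Compose's default `<project>-<service>-<n>` container naming, and the fixed `7860` host-port mapping on the `fastapi` service would need to be removed before scaling to avoid port conflicts:

```nginx
# nginx.conf fragment: round-robin across three FastAPI replicas
upstream fastapi_backend {
    server ai-fastapi-1:7860;
    server ai-fastapi-2:7860;
    server ai-fastapi-3:7860;
}

server {
    listen 80;

    location / {
        proxy_pass http://fastapi_backend;
        proxy_set_header Host $host;
    }
}
```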
## Security Considerations

- Services run as non-root users
- Network isolation between services
- No sensitive data in environment variables
- Regular security updates applied
- Pinned dependency versions
## API Usage

### Upload and Analyze an Image

```bash
curl -X POST "http://localhost:7860/analyze-image" \
  -H "accept: application/json" \
  -H "Content-Type: multipart/form-data" \
  -F "file=@image.jpg"
```

### Health Check

```bash
curl http://localhost:7860/health
```
## Contributing

1. Make your changes to the code
2. Test with `docker-compose up --build`
3. Submit a pull request
## License

This project is licensed under the MIT License.