---
title: E2B API Proxy
emoji: 🚀
colorFrom: blue
colorTo: indigo
sdk: docker
pinned: false
app_port: 7860
---
# E2B API Proxy with FastAPI
This project is a FastAPI implementation of an API proxy for E2B (fragments.e2b.dev). It provides an OpenAI-compatible interface to models from multiple providers, including OpenAI, Google, and Anthropic.
## Description
The E2B API Proxy acts as middleware between your application and the E2B service, providing:

- Proxying of API requests to the E2B service
- Support for multiple AI model providers (OpenAI, Google Vertex AI, Anthropic)
- Streaming and non-streaming response handling
- CORS support for cross-origin requests
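For orientation, here is a minimal sketch of that proxying pattern, assuming an `httpx`-based forward to the upstream service. It is illustrative only: the real `app.py` also handles model mapping and response reshaping, and the upstream path used below (`/api/chat`) is a placeholder, not E2B's actual route.

```python
# Illustrative sketch only -- not the real app.py.
import os

import httpx
from fastapi import FastAPI, Request
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import JSONResponse, StreamingResponse

API_KEY = os.getenv("API_KEY", "sk-123456")
API_BASE_URL = os.getenv("API_BASE_URL", "https://fragments.e2b.dev")
UPSTREAM_PATH = "/api/chat"  # placeholder path; the real app targets E2B's actual endpoint

app = FastAPI()
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_methods=["*"],
    allow_headers=["*"],
)


@app.post("/hf/v1/chat/completions")
async def chat_completions(request: Request):
    # Simple bearer-token check against the configured API key.
    if request.headers.get("Authorization", "") != f"Bearer {API_KEY}":
        return JSONResponse(status_code=401, content={"error": "invalid API key"})

    body = await request.json()

    if body.get("stream", False):
        async def relay():
            # Stream upstream chunks back to the caller as they arrive.
            async with httpx.AsyncClient(base_url=API_BASE_URL, timeout=None) as client:
                async with client.stream("POST", UPSTREAM_PATH, json=body) as resp:
                    async for chunk in resp.aiter_bytes():
                        yield chunk

        return StreamingResponse(relay(), media_type="text/event-stream")

    # Non-streaming: wait for the full upstream response and return its JSON body.
    async with httpx.AsyncClient(base_url=API_BASE_URL, timeout=None) as client:
        resp = await client.post(UPSTREAM_PATH, json=body)
        return JSONResponse(content=resp.json())
```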
## Deployment on Hugging Face Spaces
This application is ready to be deployed on Hugging Face Spaces:
- Create a new Space on Hugging Face with the Docker SDK
- Upload these files to your Space
- Set the environment variables in the Space settings:
  - `API_KEY`: Your API key for authentication (default: `sk-123456`)
  - `API_BASE_URL`: The base URL for the E2B service (default: `https://fragments.e2b.dev`)
## API Endpoints
- `GET /hf/v1/models`: List available models
- `POST /hf/v1/chat/completions`: Send chat completion requests
- `GET /`: Root endpoint (health check)
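A quick way to exercise the two GET endpoints from Python, assuming the proxy is running locally on port 7860 as in the local-development setup below:

```python
import requests

BASE = "http://localhost:7860"

# Health check on the root endpoint.
print(requests.get(f"{BASE}/").status_code)

# List the models the proxy exposes.
# Assumes an OpenAI-style list payload ({"data": [{"id": ...}, ...]}).
models = requests.get(f"{BASE}/hf/v1/models").json()
print([m["id"] for m in models.get("data", [])])
```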
## Configuration
The main configuration is in the `app.py` file. You can customize:
- API key for authentication
- Base URL for the E2B service
- Model configurations
- Default headers for requests
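As a purely hypothetical illustration of the last two items (the actual structure in `app.py` may differ), a mapping from model name to provider metadata plus a default-header dictionary is one common shape:

```python
# Hypothetical example only; check app.py for the real structure.
MODEL_CONFIG = {
    "gpt-4o": {"provider": "openai"},
    "gemini-1.5-pro": {"provider": "google-vertex"},
    "claude-3-5-sonnet-latest": {"provider": "anthropic"},
}

# Default headers sent with proxied requests (values here are illustrative).
DEFAULT_HEADERS = {
    "Content-Type": "application/json",
}
```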
## Local Development
### Prerequisites
- Docker and Docker Compose
### Running the Application Locally
- Clone this repository
- Update the API key in `docker-compose.yml` (replace `sk-123456` with your actual key)
- Build and start the container:

  ```bash
  docker-compose up -d
  ```

- The API will be available at http://localhost:7860
### Testing the API
You can test the API using curl:
```bash
# Get available models
curl http://localhost:7860/hf/v1/models

# Send a chat completion request
curl -X POST http://localhost:7860/hf/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-123456" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ]
  }'
```
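Because the request format above matches the OpenAI chat-completions schema, an OpenAI-style client should also work against the proxy, assuming the responses follow the same schema. A sketch using the official `openai` Python package, pointed at the local proxy:

```python
from openai import OpenAI

# Point the client at the proxy's /hf/v1 prefix and use the proxy's API key.
client = OpenAI(base_url="http://localhost:7860/hf/v1", api_key="sk-123456")

# Non-streaming request
resp = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
print(resp.choices[0].message.content)

# Streaming request: print tokens as they arrive
for chunk in client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Tell me a short joke."}],
    stream=True,
):
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```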
## Supported Models
The API supports various models from different providers:
- OpenAI: o1-preview, o3-mini, gpt-4o, gpt-4.5-preview, gpt-4-turbo
- Google: gemini-1.5-pro, gemini-2.5-pro-exp-03-25, gemini-exp-1121, gemini-2.0-flash-exp
- Anthropic: claude-3-5-sonnet-latest, claude-3-7-sonnet-latest, claude-3-5-haiku-latest
## License
This project is open source and available under the MIT License.