# Hunyuan3D-2.1 API Documentation
This document describes the REST API endpoints for the Hunyuan3D-2.1 service.
## Base URL
```
http://localhost:7860
```
## Endpoints
### 1. Health Check
**GET** `/api/health`
Check if the service is running.
**Response:**
```json
{
  "status": "ok",
  "version": "2.1"
}
```
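For a quick connectivity check from Python, here is a minimal sketch using `requests` (assuming the service is reachable at the base URL above):

```python
import requests

# Query the health endpoint; a 200 response with {"status": "ok"} means the service is up.
resp = requests.get("http://localhost:7860/api/health", timeout=5)
resp.raise_for_status()
print(resp.json())  # e.g. {"status": "ok", "version": "2.1"}
```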
### 2. Generate 3D Model
**POST** `/api/generate`
Start a 3D model generation job.
**Request Body:**
```json
{
  "images": {
    "front": "base64_encoded_image",
    "back": "base64_encoded_image",   // Optional
    "left": "base64_encoded_image",   // Optional
    "right": "base64_encoded_image"   // Optional
  },
  "options": {
    "enable_pbr": true,
    "should_remesh": true,
    "should_texture": true
  }
}
```
**Response:**
```json
{
  "job_id": "uuid",
  "status": "queued"
}
```
**Notes:**
- The `front` image is required; `back`, `left`, and `right` are optional (see the sketch below)
- Images must be base64 encoded
- The `options` object is optional; omitted fields fall back to the defaults
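Since only `front` is mandatory, the request payload can be assembled from whichever views are on hand. A minimal sketch (the `encode_view` helper is illustrative, not part of the API):

```python
import base64

def encode_view(path):
    """Read an image file and return its base64-encoded contents."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode()

# Only include the views you actually have; "front" is mandatory.
payload = {
    "images": {"front": encode_view("front.png")},
    "options": {"enable_pbr": True, "should_remesh": True, "should_texture": True},
}
payload["images"]["back"] = encode_view("back.png")  # optional extra view
```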
### 3. Check Job Status
**GET** `/api/status?job_id=uuid`
Check the status of a generation job.
**Response:**
```json
{
  "status": "completed|processing|queued|failed",
  "progress": 0-100,
  "model_urls": {
    "glb": "url_to_glb_file"
  }
}
```
**Status Values:**
- `queued`: Job is waiting to be processed
- `processing`: Job is currently being processed
- `completed`: Job completed successfully
- `failed`: Job failed; the status response includes an `error` message describing the failure
## Usage Examples
### Python Example
```python
import base64
import time

import requests

# Encode the front view image
with open("image.png", "rb") as f:
    image_base64 = base64.b64encode(f.read()).decode()

# Start generation
response = requests.post("http://localhost:7860/api/generate", json={
    "images": {
        "front": image_base64
    },
    "options": {
        "enable_pbr": True,
        "should_texture": True
    }
})
job_id = response.json()["job_id"]

# Poll the status endpoint until the job completes or fails
while True:
    status_response = requests.get(f"http://localhost:7860/api/status?job_id={job_id}")
    data = status_response.json()
    if data["status"] == "completed":
        print(f"Model ready: {data['model_urls']['glb']}")
        break
    elif data["status"] == "failed":
        print(f"Generation failed: {data.get('error')}")
        break
    print(f"Progress: {data['progress']}%")
    time.sleep(5)
```
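Once the job reaches `completed`, the returned GLB URL can be saved to disk. A minimal sketch, assuming the URL is directly fetchable with a plain HTTP GET:

```python
import requests

glb_url = data["model_urls"]["glb"]  # from the completed status response above
with requests.get(glb_url, stream=True) as r:
    r.raise_for_status()
    with open("model.glb", "wb") as out:
        for chunk in r.iter_content(chunk_size=8192):
            out.write(chunk)
print("Saved model.glb")
```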
### cURL Example
```bash
# Health check
curl http://localhost:7860/api/health

# Generate model
curl -X POST http://localhost:7860/api/generate \
  -H "Content-Type: application/json" \
  -d '{
    "images": {
      "front": "base64_encoded_image_here"
    },
    "options": {
      "enable_pbr": true,
      "should_texture": true
    }
  }'

# Check status
curl "http://localhost:7860/api/status?job_id=your_job_id"
```
## Error Handling
The API returns appropriate HTTP status codes:
- `200`: Success
- `400`: Bad request (invalid input)
- `404`: Job not found
- `500`: Internal server error
Error responses include a detail message:
```json
{
  "detail": "Error message here"
}
```
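A short sketch of handling these errors from Python, assuming the `detail` field described above is present on non-200 responses:

```python
import requests

resp = requests.post("http://localhost:7860/api/generate", json={"images": {}})
if resp.ok:
    print("Job queued:", resp.json()["job_id"])
else:
    # Non-200 responses carry a human-readable "detail" message
    print(f"Request failed ({resp.status_code}): {resp.json().get('detail')}")
```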
## Testing
Use the provided test script to verify the API:
```bash
python test_api.py
```
This will test all endpoints using the demo image.
## Notes
- Jobs are processed asynchronously in the background
- The service maintains job state in memory (jobs are lost on restart)
- Generated models are served via static file URLs
- The texture generation step is optional and can be disabled via options