Add API endpoints for 3D model generation

- Add /api/generate endpoint for starting 3D generation jobs
- Add /api/status endpoint for checking job progress and results
- Add /api/health endpoint for service health checks
- Implement background job processing with progress tracking
- Add comprehensive API documentation in API_README.md
- Support base64 image encoding and multi-view inputs
- Include proper error handling and validation

Files changed:
- API_README.md +176 -0
- gradio_app.py +228 -1
API_README.md
ADDED
@@ -0,0 +1,176 @@
# Hunyuan3D-2.1 API Documentation

This document describes the REST API endpoints for the Hunyuan3D-2.1 service.

## Base URL
```
http://localhost:7860
```

## Endpoints

### 1. Health Check
**GET** `/api/health`

Check if the service is running.

**Response:**
```json
{
  "status": "ok",
  "version": "2.1"
}
```

### 2. Generate 3D Model
**POST** `/api/generate`

Start a 3D model generation job.

**Request Body:**
```json
{
  "images": {
    "front": "base64_encoded_image",
    "back": "base64_encoded_image",   // Optional
    "left": "base64_encoded_image",   // Optional
    "right": "base64_encoded_image"   // Optional
  },
  "options": {
    "enable_pbr": true,
    "should_remesh": true,
    "should_texture": true
  }
}
```

**Response:**
```json
{
  "job_id": "uuid",
  "status": "queued"
}
```

**Notes:**
- At least one image is required
- The `front` image is mandatory
- Images should be base64 encoded
- The `options` object is optional and will use defaults if not provided (a multi-view request with default options is sketched below)
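The commit also supports multi-view inputs. A minimal sketch of a multi-view request that relies on the default options (the file names here are hypothetical):

```python
import base64

import requests


def encode(path):
    """Read a view image and base64-encode it for the JSON payload."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode()


# Hypothetical file names; only "front" is required
payload = {
    "images": {
        "front": encode("front.png"),
        "back": encode("back.png"),
        "left": encode("left.png"),
        "right": encode("right.png"),
    }
}

response = requests.post("http://localhost:7860/api/generate", json=payload)
print(response.json())  # expected: {"job_id": "...", "status": "queued"}
```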

### 3. Check Job Status
**GET** `/api/status?job_id=uuid`

Check the status of a generation job.

**Response:**
```json
{
  "status": "completed|processing|queued|failed",
  "progress": 0-100,
  "model_urls": {
    "glb": "url_to_glb_file"
  }
}
```

**Status Values:**
- `queued`: Job is waiting to be processed
- `processing`: Job is currently being processed
- `completed`: Job completed successfully
- `failed`: Job failed with an error

## Usage Examples

### Python Example
```python
import base64
import time

import requests

# Encode image
with open("image.png", "rb") as f:
    image_base64 = base64.b64encode(f.read()).decode()

# Start generation
response = requests.post("http://localhost:7860/api/generate", json={
    "images": {
        "front": image_base64
    },
    "options": {
        "enable_pbr": True,
        "should_texture": True
    }
})

job_id = response.json()["job_id"]

# Poll until the job finishes
while True:
    status_response = requests.get(f"http://localhost:7860/api/status?job_id={job_id}")
    data = status_response.json()

    if data["status"] == "completed":
        print(f"Model ready: {data['model_urls']['glb']}")
        break
    elif data["status"] == "failed":
        print(f"Generation failed: {data.get('error')}")
        break

    print(f"Progress: {data['progress']}%")
    time.sleep(5)
```

### cURL Example
```bash
# Health check
curl http://localhost:7860/api/health

# Generate model
curl -X POST http://localhost:7860/api/generate \
  -H "Content-Type: application/json" \
  -d '{
    "images": {
      "front": "base64_encoded_image_here"
    },
    "options": {
      "enable_pbr": true,
      "should_texture": true
    }
  }'

# Check status
curl "http://localhost:7860/api/status?job_id=your_job_id"
```

## Error Handling

The API returns appropriate HTTP status codes:

- `200`: Success
- `400`: Bad request (invalid input)
- `404`: Job not found
- `500`: Internal server error

Error responses include a detail message:
```json
{
  "detail": "Error message here"
}
```
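A minimal client-side sketch of surfacing these errors, using the same `requests`-based calls as the Python example above (the payload is hypothetical):

```python
import requests

# Hypothetical payload; see the request body schema above
payload = {"images": {"front": "base64_encoded_image"}}

response = requests.post("http://localhost:7860/api/generate", json=payload)
if not response.ok:
    # Error responses carry a JSON body with a "detail" field
    detail = response.json().get("detail", "unknown error")
    raise RuntimeError(f"Generate request failed ({response.status_code}): {detail}")

job_id = response.json()["job_id"]
print(f"Queued job {job_id}")
```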

## Testing

Use the provided test script to verify the API:

```bash
python test_api.py
```

This will test all endpoints using the demo image.

## Notes

- Jobs are processed asynchronously in the background
- The service maintains job state in memory (jobs are lost on restart)
- Generated models are served via static file URLs (see the download sketch below)
- The texture generation step is optional and can be disabled via options
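For completeness, a sketch of downloading a finished model, assuming the relative `glb` path returned by `/api/status` is served by the same host:

```python
import requests

BASE_URL = "http://localhost:7860"

# Assumes the job has already completed; see the polling example above
status = requests.get(f"{BASE_URL}/api/status", params={"job_id": "your_job_id"}).json()
glb_url = status["model_urls"]["glb"]  # e.g. a path under /static/

# Download the GLB to disk
resp = requests.get(BASE_URL + glb_url)
resp.raise_for_status()
with open("model.glb", "wb") as f:
    f.write(resp.content)
```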
gradio_app.py
CHANGED
@@ -37,20 +37,184 @@ import subprocess
 import time
 from glob import glob
 from pathlib import Path
+import base64
+import json
+import threading
+from typing import Dict, Optional, Any
+from enum import Enum
 
 import gradio as gr
 import torch
 import trimesh
 import uvicorn
-from fastapi import FastAPI
+from fastapi import FastAPI, HTTPException, BackgroundTasks
 from fastapi.staticfiles import StaticFiles
+from fastapi.responses import JSONResponse
+from pydantic import BaseModel
 import uuid
 import numpy as np
+from PIL import Image
+import io
 
 from hy3dshape.utils import logger
 from hy3dpaint.convert_utils import create_glb_with_pbr_materials
 
 
+# API Models
+class JobStatus(Enum):
+    QUEUED = "queued"
+    PROCESSING = "processing"
+    COMPLETED = "completed"
+    FAILED = "failed"
+
+
+class GenerateRequest(BaseModel):
+    images: Dict[str, str]  # base64 encoded images
+    options: Dict[str, Any] = {
+        "enable_pbr": True,
+        "should_remesh": True,
+        "should_texture": True
+    }
+
+
+class JobInfo:
+    def __init__(self, job_id: str):
+        self.job_id = job_id
+        self.status = JobStatus.QUEUED
+        self.progress = 0
+        self.start_time = time.time()
+        self.end_time = None
+        self.error_message = None
+        self.model_urls = {}
+        self.images = {}
+        self.options = {}
+
+
+# Global job storage
+jobs: Dict[str, JobInfo] = {}
+
+
+def create_job() -> str:
+    """Create a new job and return its ID."""
+    job_id = str(uuid.uuid4())
+    jobs[job_id] = JobInfo(job_id)
+    return job_id
+
+
+def update_job_status(job_id: str, status: JobStatus, progress: int = None, error_message: str = None):
+    """Update job status and progress."""
+    if job_id in jobs:
+        jobs[job_id].status = status
+        if progress is not None:
+            jobs[job_id].progress = progress
+        if error_message is not None:
+            jobs[job_id].error_message = error_message
+        if status in [JobStatus.COMPLETED, JobStatus.FAILED]:
+            jobs[job_id].end_time = time.time()
+
+
+def base64_to_pil_image(base64_string: str) -> Image.Image:
+    """Convert base64 string to PIL Image."""
+    try:
+        # Remove data URL prefix if present
+        if base64_string.startswith('data:image'):
+            base64_string = base64_string.split(',')[1]
+
+        image_data = base64.b64decode(base64_string)
+        image = Image.open(io.BytesIO(image_data))
+        return image
+    except Exception as e:
+        raise HTTPException(status_code=400, detail=f"Invalid image data: {str(e)}")
+
+
+def process_generation_job(job_id: str, images: Dict[str, str], options: Dict[str, Any]):
+    """Background task to process generation job."""
+    global face_reduce_worker, tex_pipeline, HAS_TEXTUREGEN, SAVE_DIR
+
+    try:
+        update_job_status(job_id, JobStatus.PROCESSING, progress=10)
+
+        # Convert base64 images to PIL Images
+        pil_images = {}
+        for view, base64_img in images.items():
+            pil_images[view] = base64_to_pil_image(base64_img)
+
+        # Extract options
+        enable_pbr = options.get("enable_pbr", True)
+        should_remesh = options.get("should_remesh", True)
+        should_texture = options.get("should_texture", True)
+
+        update_job_status(job_id, JobStatus.PROCESSING, progress=20)
+
+        # Generate 3D mesh
+        mesh, main_image, save_folder, stats, seed = _gen_shape(
+            caption=None,
+            image=pil_images,
+            mv_image_front=pil_images.get('front'),
+            mv_image_back=pil_images.get('back'),
+            mv_image_left=pil_images.get('left'),
+            mv_image_right=pil_images.get('right'),
+            steps=30,
+            guidance_scale=7.5,
+            seed=1234,
+            octree_resolution=256,
+            check_box_rembg=True,
+            num_chunks=200000,
+            randomize_seed=False,
+        )
+
+        update_job_status(job_id, JobStatus.PROCESSING, progress=50)
+
+        # Export white mesh
+        white_mesh_path = export_mesh(mesh, save_folder, textured=False, type='obj')
+
+        # Face reduction
+        mesh = face_reduce_worker(mesh)
+        reduced_mesh_path = export_mesh(mesh, save_folder, textured=False, type='obj')
+
+        update_job_status(job_id, JobStatus.PROCESSING, progress=70)
+
+        # Texture generation if enabled
+        textured_mesh_path = None
+        if should_texture and HAS_TEXTUREGEN:
+            try:
+                text_path = os.path.join(save_folder, 'textured_mesh.obj')
+                textured_mesh_path = tex_pipeline(
+                    mesh_path=reduced_mesh_path,
+                    image_path=main_image,
+                    output_mesh_path=text_path,
+                    save_glb=False
+                )
+
+                # Convert to GLB
+                glb_path_textured = os.path.join(save_folder, 'textured_mesh.glb')
+                quick_convert_with_obj2gltf(textured_mesh_path, glb_path_textured)
+                textured_mesh_path = glb_path_textured
+
+            except Exception as e:
+                logger.error(f"Texture generation failed: {e}")
+                textured_mesh_path = None
+
+        update_job_status(job_id, JobStatus.PROCESSING, progress=90)
+
+        # Prepare model URLs
+        model_urls = {}
+        if textured_mesh_path and os.path.exists(textured_mesh_path):
+            model_urls["glb"] = f"/static/{os.path.relpath(textured_mesh_path, SAVE_DIR)}"
+        else:
+            # Fallback to white mesh
+            white_glb_path = export_mesh(mesh, save_folder, textured=False, type='glb')
+            model_urls["glb"] = f"/static/{os.path.relpath(white_glb_path, SAVE_DIR)}"
+
+        # Update job with results
+        jobs[job_id].model_urls = model_urls
+        update_job_status(job_id, JobStatus.COMPLETED, progress=100)
+
+    except Exception as e:
+        logger.error(f"Job {job_id} failed: {e}")
+        update_job_status(job_id, JobStatus.FAILED, error_message=str(e))
+
+
 MAX_SEED = 1e7
 ENV = "Huggingface" # "Huggingface"
 if ENV == 'Huggingface':
@@ -907,6 +1071,69 @@ if __name__ == '__main__':
 # create a FastAPI app
 app = FastAPI()
 
+# API Endpoints
+@app.post("/api/generate")
+async def generate_3d_model(request: GenerateRequest, background_tasks: BackgroundTasks):
+    """Generate 3D model from images."""
+    try:
+        # Validate input
+        if not request.images:
+            raise HTTPException(status_code=400, detail="At least one image is required")
+
+        if 'front' not in request.images:
+            raise HTTPException(status_code=400, detail="Front image is required")
+
+        # Create job
+        job_id = create_job()
+
+        # Store job data
+        jobs[job_id].images = request.images
+        jobs[job_id].options = request.options
+
+        # Start background task
+        background_tasks.add_task(
+            process_generation_job,
+            job_id,
+            request.images,
+            request.options
+        )
+
+        return JSONResponse({
+            "job_id": job_id,
+            "status": "queued"
+        })
+
+    except Exception as e:
+        raise HTTPException(status_code=500, detail=str(e))
+
+@app.get("/api/status")
+async def get_job_status(job_id: str):
+    """Get job status and results."""
+    if job_id not in jobs:
+        raise HTTPException(status_code=404, detail="Job not found")
+
+    job = jobs[job_id]
+
+    response = {
+        "status": job.status.value,
+        "progress": job.progress
+    }
+
+    if job.status == JobStatus.COMPLETED:
+        response["model_urls"] = job.model_urls
+    elif job.status == JobStatus.FAILED:
+        response["error"] = job.error_message
+
+    return JSONResponse(response)
+
+@app.get("/api/health")
+async def health_check():
+    """Health check endpoint."""
+    return JSONResponse({
+        "status": "ok",
+        "version": "2.1"
+    })
+
 # create a static directory to store the static files
 static_dir = Path(SAVE_DIR).absolute()
 static_dir.mkdir(parents=True, exist_ok=True)
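The relative `model_urls` built in `process_generation_job` assume `SAVE_DIR` is served under `/static`. The mount itself lives outside the hunks shown here, so the following is only a sketch of how such a mount typically looks in FastAPI, with a placeholder value for `SAVE_DIR`:

```python
# Sketch only: the actual mount in gradio_app.py is outside the hunks above.
from pathlib import Path

from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles

SAVE_DIR = "gradio_cache"  # placeholder for the app's configured save directory

app = FastAPI()
static_dir = Path(SAVE_DIR).absolute()
static_dir.mkdir(parents=True, exist_ok=True)

# Serving SAVE_DIR at /static makes the returned model_urls resolvable by clients
app.mount("/static", StaticFiles(directory=static_dir), name="static")
```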