change gpu duration calculation to dynamically adapt to the number of inference steps
This will scale the allocated ZeroGPU time for a run with the number of sampling steps, and reduce the number of `ZeroGPU worker error GPU Task Aborted` errors.
app.py
CHANGED

```diff
@@ -125,14 +125,7 @@ def get_duration(image,
                  seed,
                  progress):
     """Calculate dynamic GPU duration based on parameters."""
-
-        return 220
-    elif sampling_steps > 35 and duration_seconds >= 2:
-        return 180
-    elif sampling_steps < 35 or duration_seconds < 2:
-        return 105
-    else:
-        return 90
+    return sampling_steps * 15

 # --- 2. Gradio Inference Function ---
 @spaces.GPU(duration=get_duration)
```
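Under the new formula the requested GPU time grows linearly with the step count, e.g. a run with 30 sampling steps asks for 450 seconds, instead of the previous fixed ceiling of 220 seconds. Below is a minimal sketch of how this dynamic duration plugs into the Space: ZeroGPU lets `duration` be a callable that is invoked with the same arguments as the decorated function and returns the number of seconds to allocate. The full parameter list and the `generate` function name are assumptions for illustration; only `image`, `duration_seconds`, `sampling_steps`, `seed`, and `progress` appear in the diff.

```python
import spaces

def get_duration(image, duration_seconds, sampling_steps, seed, progress):
    """Calculate dynamic GPU duration based on parameters."""
    # Request 15 seconds of GPU time per inference step,
    # replacing the old fixed tiers (220/180/105/90 s).
    return sampling_steps * 15

# ZeroGPU calls get_duration with the same arguments the user passes to
# generate, then allocates that many seconds before running the function.
@spaces.GPU(duration=get_duration)
def generate(image, duration_seconds, sampling_steps, seed, progress):
    # ... run the actual inference on the allocated ZeroGPU device ...
    ...
```

The practical effect is that short, low-step runs no longer reserve more GPU time than they need, while long, high-step runs get enough headroom to finish instead of being aborted mid-task.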