Runtime error

Exit code: 1. Reason:

model.fp16.safetensors: 100%|██████████| 4.93G/4.93G [00:15<00:00, 317MB/s]
2025-06-21 07:10:49,125 - hy3dgen.shapgen - INFO - Loading model from /home/user/.cache/huggingface/hub/models--tencent--Hunyuan3D-2/snapshots/34e28261f71c32975727be8db0eace439a280f82/hunyuan3d-dit-v2-0/model.fp16.safetensors

Traceback (most recent call last):
  File "/home/user/app/gradio_app.py", line 740, in <module>
    i23d_worker = Hunyuan3DDiTFlowMatchingPipeline.from_pretrained(
  File "/home/user/app/hy3dgen/shapegen/pipelines.py", line 220, in from_pretrained
    return cls.from_single_file(
  File "/home/user/app/hy3dgen/shapegen/utils.py", line 83, in wrapper
    result = func(*args, **kwargs)
  File "/home/user/app/hy3dgen/shapegen/pipelines.py", line 191, in from_single_file
    return cls(
  File "/home/user/app/hy3dgen/shapegen/pipelines.py", line 246, in __init__
    self.to(device, dtype)
  File "/home/user/app/hy3dgen/shapegen/pipelines.py", line 303, in to
    self.vae.to(device)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1355, in to
    return self._apply(convert)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 915, in _apply
    module._apply(fn)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1003, in _apply
    self._buffers[key] = fn(buf)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1341, in convert
    return t.to(
  File "/usr/local/lib/python3.10/site-packages/torch/cuda/__init__.py", line 372, in _lazy_init
    torch._C._cuda_init()
RuntimeError: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx
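The model download completes; the failure happens when the pipeline moves the model to CUDA on a machine with no NVIDIA driver (e.g. a CPU-only container). A minimal defensive sketch, not the app's actual code, would probe for CUDA before choosing a device instead of hard-coding `"cuda"`:

```python
# Sketch (assumption, not the Hunyuan3D app's code): pick a device safely
# so model loading falls back to CPU when no NVIDIA driver is present.
try:
    import torch
    # torch.cuda.is_available() returns False (without raising) when
    # no GPU or driver is found, unlike eagerly calling .to("cuda").
    cuda_ok = torch.cuda.is_available()
except ImportError:
    # torch not installed at all; treat as CPU-only.
    cuda_ok = False

device = "cuda" if cuda_ok else "cpu"
print(f"Using device: {device}")
```

The chosen `device` string could then be passed wherever the pipeline hard-codes CUDA (whether `from_pretrained` accepts a device argument in this codebase is an assumption to verify against `hy3dgen/shapegen/pipelines.py`). Note that on a CPU-only host this only avoids the crash; the model itself may still be impractically slow without a GPU.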
