runtime error

Exit code: 1. Reason:
cal/lib/python3.10/site-packages/transformers/pipelines/base.py", line 311, in infer_framework_load_model
    model = model_class.from_pretrained(model, **fp32_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 277, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 5048, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 5316, in _load_pretrained_model
    load_state_dict(checkpoint_files[0], map_location="meta", weights_only=weights_only).keys()
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 508, in load_state_dict
    check_torch_load_is_safe()
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1647, in check_torch_load_is_safe
    raise ValueError(
ValueError: Due to a serious vulnerability issue in `torch.load`, even with `weights_only=True`, we now require users to upgrade torch to at least v2.6 in order to use the function. This version restriction does not apply when loading files with safetensors. See the vulnerability report here https://nvd.nist.gov/vuln/detail/CVE-2025-32434

Traceback (most recent call last):
  File "/app/app.py", line 112, in <module>
    net, feature_utils, seq_cfg = get_model()
  File "/app/app.py", line 95, in get_model
    with torch.cuda.device(device):
  File "/usr/local/lib/python3.10/site-packages/torch/cuda/__init__.py", line 382, in __init__
    self.idx = _get_device_index(device, optional=True)
  File "/usr/local/lib/python3.10/site-packages/torch/cuda/_utils.py", line 34, in _get_device_index
    raise ValueError(f"Expected a cuda device, but got: {device}")
ValueError: Expected a cuda device, but got: cpu

model.safetensors:  57%|█████▋    | 179M/312M [00:01<00:00, 149MB/s]
model.safetensors: 100%|██████████| 312M/312M [00:01<00:00, 228MB/s]
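The log shows two separate failures: `check_torch_load_is_safe()` refuses to run `torch.load` on a non-safetensors checkpoint because the installed torch is older than 2.6 (CVE-2025-32434), and `get_model()` in app.py enters `torch.cuda.device(device)` with `device` set to "cpu", which `torch.cuda.device` rejects. Below is a minimal sketch of how both could be handled; the `get_model` body, the `MODEL_ID` placeholder, and the `nullcontext` fallback are illustrative assumptions, not the Space's actual code.

```python
import torch
from contextlib import nullcontext
from transformers import AutoModel

MODEL_ID = "org/checkpoint-name"  # placeholder, not the Space's real model id

def get_model():
    # torch.cuda.device() raises for "cpu", so only enter it when CUDA exists.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    device_ctx = torch.cuda.device(device) if device == "cuda" else nullcontext()

    with device_ctx:
        # Prefer the safetensors weights so the torch.load safety check
        # (which requires torch >= 2.6, see CVE-2025-32434) is never hit.
        model = AutoModel.from_pretrained(MODEL_ID, use_safetensors=True)
    return model.to(device)
```

Alternatively, pinning torch>=2.6 in requirements.txt lifts the torch.load restriction even for non-safetensors checkpoints.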
