runtime error

Exit code: 1. Reason:

██| 9.09M/9.09M [00:00<00:00, 50.2MB/s]
special_tokens_map.json: 100%|██████████| 73.0/73.0 [00:00<00:00, 458kB/s]
config.json: 100%|██████████| 844/844 [00:00<00:00, 5.22MB/s]
You don't have a GPU available to load the model, the inference will be slow because of weight unpacking
model.safetensors: 100%|██████████| 1.18G/1.18G [00:03<00:00, 301MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 21, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 282, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4470, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4909, in _load_pretrained_model
    disk_offload_index, cpu_offload_index = _load_state_dict_into_meta_model(
  File "/usr/local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 735, in _load_state_dict_into_meta_model
    file_pointer = safe_open(shard_file, framework="pt", device=tensor_device)
safetensors_rust.SafetensorError: device disk is invalid
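The traceback ends in `safetensors_rust.SafetensorError: device disk is invalid`: with no GPU present, the automatic device map assigned some weight shards to the `"disk"` offload device, but `safe_open` cannot read a shard with `device="disk"`. A common workaround is to pin the whole model to CPU so no shard is ever routed to disk. The sketch below is an assumption-laden illustration, not the app's actual code: the model id in the comment is a placeholder, and the helper name `cpu_load_kwargs` is invented for this example.

```python
# Sketch of a workaround for "device disk is invalid" on a CPU-only container.
# Assumes transformers with accelerate installed, and that keeping all
# weights in CPU RAM is acceptable for this model size (~1.18 GB here).

def cpu_load_kwargs():
    """kwargs for from_pretrained that keep every weight on CPU,
    so no shard is assigned to the unsupported 'disk' device."""
    return {
        "device_map": {"": "cpu"},   # explicit map: entire model on CPU
        "low_cpu_mem_usage": True,   # stream shards instead of a full extra copy
    }

# In app.py (line 21 of the traceback) this would look roughly like:
# model = AutoModelForCausalLM.from_pretrained("MODEL_ID", **cpu_load_kwargs())

print(cpu_load_kwargs()["device_map"])
```

If RAM genuinely cannot hold the model, the alternative is to keep `device_map="auto"` but pass an `offload_folder` so accelerate has a real directory to spill weights into, rather than the bare `"disk"` device.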
