runtime error

Exit code: 1. Reason: r/local/lib/python3.10/site-packages/torch/nn/modules/transformer.py", line 20, in <module>
    device: torch.device = torch.device(torch._C._get_default_device()),  # torch.device('cpu'),
/usr/local/lib/python3.10/site-packages/torch/nn/modules/transformer.py:20: UserWarning: Failed to initialize NumPy: _ARRAY_API not found (Triggered internally at ../torch/csrc/utils/tensor_numpy.cpp:84.)
  device: torch.device = torch.device(torch._C._get_default_device()),  # torch.device('cpu'),
Note: Environment variable `HF_TOKEN` is set and is the current active token independently from the token you've just configured.

.gitattributes: 100%|██████████| 1.57k/1.57k [00:00<00:00, 7.17MB/s]
README.md: 100%|██████████| 10.8k/10.8k [00:00<00:00, 42.7MB/s]
adapter_config.json: 100%|██████████| 684/684 [00:00<00:00, 4.13MB/s]
adapter_model.safetensors: 100%|██████████| 7.69M/7.69M [00:00<00:00, 183MB/s]
run_inference.py: 100%|██████████| 831/831 [00:00<00:00, 5.72MB/s]
config.json: 100%|██████████| 624/624 [00:00<00:00, 4.46MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 31, in <module>
    base_config = AutoConfig.from_pretrained(base_config_path)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1039, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 734, in __getitem__
    raise KeyError(key)
KeyError: 'mistral'
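For context on the failure itself: the final `KeyError: 'mistral'` raised from `CONFIG_MAPPING` usually means the `transformers` release installed in the container predates Mistral support (which landed around v4.34), so `AutoConfig` does not recognize the `model_type` declared in the downloaded `config.json`. The earlier `_ARRAY_API not found` warning typically comes from NumPy 2.x being installed alongside a torch wheel built against NumPy 1.x. Below is a minimal sketch of a version guard around the failing call in app.py, under those assumptions; `base_config_path` here is a hypothetical placeholder for whatever app.py passes on line 31.

    # Sketch, not the Space's actual code. Assumptions: Mistral support first
    # shipped around transformers 4.34 (older releases raise KeyError: 'mistral'
    # when AutoConfig looks up the model_type), and base_config_path is a stand-in
    # for the value used in app.py line 31.
    from packaging import version

    import transformers
    from transformers import AutoConfig

    # Hypothetical placeholder; in app.py this points at the adapter's base model.
    base_config_path = "mistralai/Mistral-7B-v0.1"

    if version.parse(transformers.__version__) < version.parse("4.34.0"):
        raise RuntimeError(
            f"transformers {transformers.__version__} does not know the 'mistral' "
            "model type; pin a newer release (e.g. transformers>=4.34 in the "
            "Space's requirements.txt) and rebuild the container."
        )

    base_config = AutoConfig.from_pretrained(base_config_path)

The simpler fix is likely just pinning `transformers>=4.34` (and, if the NumPy warning should also go away, `numpy<2`) in the Space's requirements.txt and letting the container rebuild; the guard above only turns the opaque `KeyError` into an actionable message.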
