Runtime error
Exit code: 1. Reason:
Failed to import llava_llama from llava.language_model.llava_llama. Error: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
No module named 'torch.distributed.device_mesh'
Failed to import llava_qwen from llava.language_model.llava_qwen. Error: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
No module named 'torch.distributed.device_mesh'
Failed to import llava_mistral from llava.language_model.llava_mistral. Error: Failed to import transformers.models.mistral.modeling_mistral because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
No module named 'torch.distributed.device_mesh'
Failed to import llava_mixtral from llava.language_model.llava_mixtral. Error: Failed to import transformers.models.mixtral.modeling_mixtral because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
No module named 'torch.distributed.device_mesh'
Traceback (most recent call last):
  File "/home/user/app/app.py", line 16, in <module>
    from llava import conversation as conversation_lib
  File "/usr/local/lib/python3.10/site-packages/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
ImportError: cannot import name 'LlavaLlamaForCausalLM' from 'llava.model' (/usr/local/lib/python3.10/site-packages/llava/model/__init__.py)
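The root failure is the missing module 'torch.distributed.device_mesh': the installed transformers build expects it from torch, the installed torch build does not provide it, and every LLaVA model class import fails as a consequence, which is why LlavaLlamaForCausalLM never gets registered in llava.model. A minimal diagnostic sketch is below; it assumes the same container environment as the traceback and only verifies the version mismatch, it is not a confirmed fix.

# Minimal diagnostic sketch (assumption: run inside the same container/env
# as the traceback above). It reports the installed torch and transformers
# versions and checks whether torch.distributed.device_mesh can be imported,
# which is the module the cascading import errors point at.
import importlib

import torch
import transformers

print("torch version:", torch.__version__)
print("transformers version:", transformers.__version__)

try:
    importlib.import_module("torch.distributed.device_mesh")
    print("torch.distributed.device_mesh is available")
except ModuleNotFoundError as exc:
    # This is the root error that makes transformers.generation.utils,
    # and in turn the LLaVA model imports, fail.
    print("missing module:", exc)

If the check reports the module as missing, the torch version pinned in the Space's requirements is older than what the installed transformers release expects; aligning the two pins (either a newer torch or an older transformers) is the likely remedy, though the exact compatible pair is an assumption not stated in the log.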