text (string column, lengths 0–696)
KeyError: 'hunyuan_v1_dense'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/tmp/tencent_Hunyuan-0.5B-Instruct_039Ya9m.py", line 19, in <module>
    pipe = pipeline("text-generation", model="tencent/Hunyuan-0.5B-Instruct")
  File "/tmp/.cache/uv/environments-v2/2a6a6f0875dd3018/lib/python3.13/site-packages/transformers/pipelines/__init__.py", line 909, in pipeline
    config = AutoConfig.from_pretrained(
        model, _from_pipeline=task, code_revision=code_revision, **hub_kwargs, **model_kwargs
    )
  File "/tmp/.cache/uv/environments-v2/2a6a6f0875dd3018/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1273, in from_pretrained
    raise ValueError(
    ...<8 lines>...
    )
ValueError: The checkpoint you are trying to load has model type `hunyuan_v1_dense` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`
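
The same `hunyuan_v1_dense` failure repeats for the other Hunyuan dense checkpoints below. A minimal sketch of the remedy the error message itself suggests, assuming the upgraded Transformers build registers this model type (not verified here):

```python
# Sketch only: the failing call from the log, re-run after upgrading Transformers,
# e.g. `pip install --upgrade transformers` or
# `pip install git+https://github.com/huggingface/transformers.git`.
# Assumes the installed build has a config registered for `hunyuan_v1_dense`.
from transformers import pipeline

pipe = pipeline("text-generation", model="tencent/Hunyuan-0.5B-Instruct")
out = pipe("Write a haiku about GPUs.", max_new_tokens=32)
print(out[0]["generated_text"])
```
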
Traceback (most recent call last):
  File "/tmp/.cache/uv/environments-v2/89a254f735f44374/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1271, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
                   ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/.cache/uv/environments-v2/89a254f735f44374/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 966, in __getitem__
    raise KeyError(key)
KeyError: 'hunyuan_v1_dense'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/tmp/tencent_Hunyuan-1.8B-Instruct_0B3jGno.py", line 19, in <module>
    pipe = pipeline("text-generation", model="tencent/Hunyuan-1.8B-Instruct")
  File "/tmp/.cache/uv/environments-v2/89a254f735f44374/lib/python3.13/site-packages/transformers/pipelines/__init__.py", line 909, in pipeline
    config = AutoConfig.from_pretrained(
        model, _from_pipeline=task, code_revision=code_revision, **hub_kwargs, **model_kwargs
    )
  File "/tmp/.cache/uv/environments-v2/89a254f735f44374/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1273, in from_pretrained
    raise ValueError(
    ...<8 lines>...
    )
ValueError: The checkpoint you are trying to load has model type `hunyuan_v1_dense` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`
Traceback (most recent call last):
  File "/tmp/.cache/uv/environments-v2/4146d04fd154d95d/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1271, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
                   ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/.cache/uv/environments-v2/4146d04fd154d95d/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 966, in __getitem__
    raise KeyError(key)
KeyError: 'hunyuan_v1_dense'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/tmp/tencent_Hunyuan-7B-Instruct_0fvToDh.py", line 19, in <module>
    pipe = pipeline("text-generation", model="tencent/Hunyuan-7B-Instruct")
  File "/tmp/.cache/uv/environments-v2/4146d04fd154d95d/lib/python3.13/site-packages/transformers/pipelines/__init__.py", line 909, in pipeline
    config = AutoConfig.from_pretrained(
        model, _from_pipeline=task, code_revision=code_revision, **hub_kwargs, **model_kwargs
    )
  File "/tmp/.cache/uv/environments-v2/4146d04fd154d95d/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1273, in from_pretrained
    raise ValueError(
    ...<8 lines>...
    )
ValueError: The checkpoint you are trying to load has model type `hunyuan_v1_dense` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`
No suitable GPU found for tencent/Hunyuan-A13B-Instruct | 389.33 GB VRAM requirement
No suitable GPU found for tencent/Hunyuan-A13B-Instruct | 389.33 GB VRAM requirement
No suitable GPU found for tngtech/DeepSeek-TNG-R1T2-Chimera | 3315.10 GB VRAM requirement
No suitable GPU found for tngtech/DeepSeek-TNG-R1T2-Chimera | 3315.10 GB VRAM requirement
No suitable GPU found for trillionlabs/Tri-21B | 100.37 GB VRAM requirement
No suitable GPU found for trillionlabs/Tri-21B | 100.37 GB VRAM requirement
No suitable GPU found for trillionlabs/Tri-70B-preview-SFT | 341.38 GB VRAM requirement
No suitable GPU found for trillionlabs/Tri-70B-preview-SFT | 341.38 GB VRAM requirement
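
The log does not show how these VRAM figures were computed. As a purely hypothetical back-of-the-envelope sketch (the precision, bytes-per-parameter, and overhead factor below are my assumptions, not the tool's), one common rough estimate is parameter count times bytes per parameter times an overhead factor:

```python
# Hypothetical estimator, not the tool that produced the log lines above.
# Rule of thumb: weights (params * bytes/param) plus headroom for activations,
# KV cache, and framework buffers.
def estimate_vram_gb(num_params_billion: float,
                     bytes_per_param: float = 2.0,   # assumes fp16/bf16 weights
                     overhead: float = 1.2) -> float:  # assumes 20% headroom
    return num_params_billion * bytes_per_param * overhead

# e.g. a 70B-parameter model under these assumptions:
print(f"{estimate_vram_gb(70):.2f} GB")  # 168.00 GB
```

The figures in the log are noticeably higher than an fp16 estimate, so the tool behind them likely assumed a wider precision or a larger overhead factor.
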
Traceback (most recent call last):
  File "/tmp/vikhyatk_moondream2_0vgzNSQ.py", line 13, in <module>
    pipe = pipeline("image-text-to-text", model="vikhyatk/moondream2", trust_remote_code=True)
  File "/tmp/.cache/uv/environments-v2/2a90239f0d4c576b/lib/python3.13/site-packages/transformers/pipelines/__init__.py", line 931, in pipeline
    config = AutoConfig.from_pretrained(
        model, _from_pipeline=task, code_revision=code_revision, **hub_kwargs, **model_kwargs
    )
  File "/tmp/.cache/uv/environments-v2/2a90239f0d4c576b/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1211, in from_pretrained
    config_class = get_class_from_dynamic_module(
        class_ref, pretrained_model_name_or_path, code_revision=code_revision, **kwargs
    )
  File "/tmp/.cache/uv/environments-v2/2a90239f0d4c576b/lib/python3.13/site-packages/transformers/dynamic_module_utils.py", line 570, in get_class_from_dynamic_module
    final_module = get_cached_module_file(
        repo_id,
        ...<8 lines>...
        repo_type=repo_type,
    )
  File "/tmp/.cache/uv/environments-v2/2a90239f0d4c576b/lib/python3.13/site-packages/transformers/dynamic_module_utils.py", line 433, in get_cached_module_file
    get_cached_module_file(
    ~~~~~~~~~~~~~~~~~~~~~~^
        pretrained_model_name_or_path,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<8 lines>...
        _commit_hash=commit_hash,
        ^^^^^^^^^^^^^^^^^^^^^^^^^
    )
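
The moondream2 traceback is truncated before the final exception, but it fails while Transformers is fetching the repo's custom modeling code. A minimal sketch, assuming the issue is reproducible outside the pipeline wrapper (this is a debugging aid, not a confirmed fix):

```python
# Sketch only: load the repo directly with the Auto classes, still with
# trust_remote_code=True, so the same dynamic-module download happens
# without the pipeline layer and the underlying hub/network error is
# easier to inspect.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "vikhyatk/moondream2"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
```
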