Instructions for using timm/coat_small.in1k with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- timm
How to use timm/coat_small.in1k with timm:
```python
import timm

model = timm.create_model("hf_hub:timm/coat_small.in1k", pretrained=True)
```

- Transformers
How to use timm/coat_small.in1k with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-classification", model="timm/coat_small.in1k")
pipe("https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png")

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("timm/coat_small.in1k", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle