Instructions for using Intel/dpt-large with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Intel/dpt-large with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("depth-estimation", model="Intel/dpt-large")
```

```python
# Load model directly
from transformers import AutoImageProcessor, AutoModelForDepthEstimation

processor = AutoImageProcessor.from_pretrained("Intel/dpt-large")
model = AutoModelForDepthEstimation.from_pretrained("Intel/dpt-large")
```

- Notebooks
- Google Colab
- Kaggle
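DPT-style models such as this one predict relative depth rather than metric depth, so the raw `predicted_depth` tensor is usually normalized to the 0–255 range before viewing it as a grayscale image. Below is a minimal sketch of that normalization step; the `depth_to_image` helper is illustrative (not part of the Transformers API), and `fake_depth` is a stand-in array where the model's output would go:

```python
import numpy as np

def depth_to_image(depth: np.ndarray) -> np.ndarray:
    """Normalize a relative depth map to uint8 for visualization.

    Illustrative helper, not part of the Transformers API.
    """
    d_min, d_max = depth.min(), depth.max()
    # Guard against a constant depth map to avoid division by zero
    scale = (d_max - d_min) or 1.0
    normalized = (depth - d_min) / scale
    return (normalized * 255.0).astype(np.uint8)

# Stand-in for the model's predicted_depth output (real values come
# from pipe(image)["predicted_depth"] after converting to NumPy)
fake_depth = np.linspace(0.0, 10.0, 12, dtype=np.float32).reshape(3, 4)
img = depth_to_image(fake_depth)
```

The normalized array can then be passed to `PIL.Image.fromarray` to save or display the depth map.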
Community discussions:
- "ValueError: Could not load with any of the following classes" (#12, opened by hhold)
- "Some weights of DPTForDepthEstimation were not initialized from the model checkpoint at Intel/dpt-large" (#9, opened by jigu)
- "DPT-NYU or DPT-Kitti?" (#7, opened by FatemehBehrad)
- "Does this model predict absolute depth?" (#5, opened by OlliOlli)