runtime error

Exit code: 1. Reason:

[download progress omitted: the 4.23G model weights plus tokenizer_config.json, vocab.txt, tokenizer.json, and config.json all reached 100%]

The new embeddings will be initialized from a multivariate normal distribution that has old embeddings' mean and covariance. As described in this article: https://nlp.stanford.edu/~johnhew/vocab-expansion.html. To disable this, use `mean_resizing=False`

-------------- ./ram_plus_swin_large_14m.pth --------------

Traceback (most recent call last):
  File "/home/user/app/app.py", line 43, in <module>
    ram_model = ram_plus(
  File "/usr/local/lib/python3.10/site-packages/ram/models/ram_plus.py", line 408, in ram_plus
    model, msg = load_checkpoint_swinlarge(model, pretrained, kwargs)
  File "/usr/local/lib/python3.10/site-packages/ram/models/utils.py", line 258, in load_checkpoint_swinlarge
    raise RuntimeError('checkpoint url or path is invalid')
RuntimeError: checkpoint url or path is invalid
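The downloads above complete fine; the crash happens when `ram_plus` is given the relative path `./ram_plus_swin_large_14m.pth` and no such file exists in the container's working directory, so `load_checkpoint_swinlarge` rejects it. Below is a minimal sketch of one way the startup code could make sure the checkpoint is on disk before building the model. The repo id `xinyu1205/recognize-anything-plus-model`, the `image_size`/`vit` arguments, and the overall shape of `app.py` are assumptions, not taken from the actual Space.

```python
# Sketch (assumptions noted below): fetch the RAM++ checkpoint before calling ram_plus().
import os

from huggingface_hub import hf_hub_download
from ram.models import ram_plus  # assumed import path from the recognize-anything package

CHECKPOINT = "./ram_plus_swin_large_14m.pth"

if not os.path.exists(CHECKPOINT):
    # Download the weights so the path passed to ram_plus() actually resolves.
    # The repo id and filename below are assumptions based on the public RAM++ release.
    CHECKPOINT = hf_hub_download(
        repo_id="xinyu1205/recognize-anything-plus-model",
        filename="ram_plus_swin_large_14m.pth",
        local_dir=".",
    )

# load_checkpoint_swinlarge() raises "checkpoint url or path is invalid" when the
# pretrained argument is neither an existing file nor a URL it can fetch.
ram_model = ram_plus(
    pretrained=CHECKPOINT,
    image_size=384,  # assumed values from the RAM++ example scripts
    vit="swin_l",
)
```

Alternatively, committing the `.pth` file to the Space (or downloading it in the Dockerfile/build step) would resolve the same failure, since the only requirement is that the path handed to `ram_plus` exists at startup.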
