runtime error
Exit code: 1. Reason:

model.safetensors:   0%|          | 19.4M/4.91G [00:03<13:11, 6.18MB/s]
  ... (download progress lines omitted) ...
model.safetensors: 100%|██████████| 4.91G/4.91G [00:28<00:00, 170MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 19, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_name,
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 559, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4097, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/home/user/.cache/huggingface/modules/transformers_modules/AIDC-AI/Ovis2-2B/63d303670a24612b2d0dd368490fcd7f75989d73/modeling_ovis.py", line 293, in __init__
    version.parse(importlib.metadata.version("flash_attn")) >= version.parse("2.6.3")), \
AssertionError: Using `flash_attention_2` requires having `flash_attn>=2.6.3` installed.
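The assertion is raised by the Ovis2-2B remote code in modeling_ovis.py: the model was asked to use `flash_attention_2`, but the environment has no `flash_attn>=2.6.3` installed (per the error message, adding that package to the environment would also resolve it). As a minimal sketch of a defensive workaround, one could check the installed version before requesting flash attention and fall back to the eager implementation otherwise. `pick_attn_implementation` is a hypothetical helper, not part of the app or of transformers; it only mirrors the version check seen in the traceback.

```python
# Sketch (assumption, not the app's actual code): choose an attention
# implementation before model loading, mirroring the check that raises
# in modeling_ovis.py.
import importlib.metadata
from packaging import version  # same parser the traceback uses

def pick_attn_implementation(pkg: str = "flash_attn",
                             min_ver: str = "2.6.3") -> str:
    """Return 'flash_attention_2' only if `pkg` is installed at >= min_ver;
    otherwise fall back to the always-available 'eager' implementation."""
    try:
        installed = importlib.metadata.version(pkg)
    except importlib.metadata.PackageNotFoundError:
        return "eager"
    if version.parse(installed) >= version.parse(min_ver):
        return "flash_attention_2"
    return "eager"
```

The returned string could then be passed as the `attn_implementation=` argument of `AutoModelForCausalLM.from_pretrained`, so the Space degrades to eager attention instead of crashing when `flash_attn` is absent.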
Container logs: