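# Note (added comment): the flash-attn wheel below is prebuilt for CUDA 12.x, PyTorch 2.6,
# and CPython 3.10 on linux x86_64, matching the torch==2.6.0 pin further down; it is not
# expected to install on other Python versions or platforms.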
flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
git+https://github.com/huggingface/accelerate.git
git+https://github.com/huggingface/peft.git
transformers-stream-generator
gradio_pdf==0.0.22
huggingface_hub
albumentations
beautifulsoup4
qwen-vl-utils
pyvips-binary
sentencepiece
opencv-python
docling-core
transformers
torch==2.6.0
python-docx
torchvision
matplotlib
tokenizers
pdf2image
num2words
reportlab
html2text
easydict
protobuf
markdown
requests
pymupdf
loguru
hf_xet
spaces
pyvips
pillow
addict
gradio
einops
httpx
click
oss2
fpdf
timm
av