khopilot committed
Commit 331c30a · verified · 1 Parent(s): 73ae9c4

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +51 -5
README.md CHANGED
@@ -1,5 +1,51 @@
- gradio>=5.0.0
- torch>=1.12.0
- numpy>=1.21.0
- matplotlib>=3.5.0
- git+https://github.com/khopilot/asi-v25-longformer-core.git
+ ---
+ title: ASI V2.5 Live Demo
+ emoji: 🚀
+ colorFrom: blue
+ colorTo: red
+ sdk: gradio
+ sdk_version: "5.0.0"
+ app_file: app.py
+ pinned: false
+ license: mit
+ ---
+
+ # 🚀 ASI V2.5 Live Demo
+
+ Interactive demonstration of ASI V2.5 Ultra-Professional Linear Attention, achieving a **2.44x speedup** with 91.7% layer coverage.
+
+ ## Features
+
+ 🔥 **Live Benchmark**: Run real-time performance comparisons
+ 📊 **Interactive Charts**: Visualize speedup and timing results
+ 📋 **Installation Guide**: Copy-paste setup instructions
+ 🏆 **Validated Results**: Official performance metrics
+
+ ## Validated Performance
+
+ - ⚡ **2.44x speedup** on Longformer-4096
+ - 📈 **Linear complexity**: O(L) vs O(L²)
+ - 🎯 **91.7% layer coverage** in real models
+ - 🍎 Optimized for **Apple Silicon (MPS)**
+ - 🔧 **Production ready** with comprehensive testing
+
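The O(L) claim rests on the standard linear-attention reordering: with a positive feature map in place of softmax, the L×L score matrix is never materialized. A minimal numpy sketch of that reordering (illustrative only, not the ASI V2.5 implementation; the elu+1 feature map here is a common choice, assumed for the example):

```python
import numpy as np

# Kernelized linear attention sketch (NOT the ASI V2.5 code).
# By associativity, (phi(Q) @ phi(K).T) @ V equals phi(Q) @ (phi(K).T @ V);
# the first ordering costs O(L^2 * d), the second only O(L * d^2).
rng = np.random.default_rng(0)
L, d = 128, 16                      # toy sizes; ASI targets L = 4096
Q, K, V = (rng.standard_normal((L, d)) for _ in range(3))

def phi(x):
    """elu(x) + 1: a common positive feature map for linear attention."""
    return np.where(x > 0, x + 1.0, np.exp(x))

quadratic = (phi(Q) @ phi(K).T) @ V  # materializes the L x L score matrix
linear = phi(Q) @ (phi(K).T @ V)     # only ever forms a d x d state

print(np.allclose(quadratic, linear))  # -> True
```

Both orderings give the same output (up to floating-point error); only the cost changes, which is where the linear-in-L scaling comes from. Normalization is omitted here for brevity.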
+ ## Quick Installation
+
+ ```bash
+ pip install git+https://github.com/khopilot/asi-v25-longformer-core.git
+ ```
+
+ ## Usage
+
+ ```python
+ from asi_v25 import create_asi_attention
+
+ # Create ultra-fast attention
+ attention = create_asi_attention(use_extreme=True)
+ ```
+
+ ## Links
+
+ - 🐙 **Source**: [GitHub Repository](https://github.com/khopilot/asi-v25-longformer-core)
+ - 🤗 **Model Hub**: [HuggingFace Hub](https://huggingface.co/khopilot/asi-v25-longformer-core)
+ - 📋 **Examples**: Check `examples/` for reproduction scripts