Update README.md
README.md (changed):

````diff
@@ -1,5 +1,6 @@
 ---
 license: apache-2.0
+pipeline_tag: text-generation
 ---
 # MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts
 
@@ -105,4 +106,4 @@ If you find this paper useful, please consider staring 🌟 this repo and citing
 journal={arXiv preprint arXiv:2410.07348},
 year={2024}
 }
-```
+```
````
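The added `pipeline_tag: text-generation` line is model-card metadata: it declares the model's task on the Hub, so the repo is listed under text-generation and downstream tooling knows which pipeline to build. A minimal usage sketch, assuming a placeholder model id (not named in this diff) and that the checkpoint loads through the standard `transformers` pipeline, possibly with `trust_remote_code`:

```python
# Minimal sketch. Assumptions: the model id below is a placeholder, and the
# checkpoint may need trust_remote_code=True if it ships custom MoE++ code.
from transformers import pipeline

generator = pipeline(
    "text-generation",                          # task declared by the new pipeline_tag
    model="<org>/<moe-plus-plus-checkpoint>",   # placeholder id, not confirmed by this diff
    trust_remote_code=True,                     # assumption: custom architecture code may be required
)

out = generator("Mixture-of-Experts models can be accelerated by", max_new_tokens=32)
print(out[0]["generated_text"])
```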