---
license: apache-2.0
language:
- en
---

## Information

This is an Exl2 quantized version of [Magnum-Picaro-0.7-v2-12b](https://huggingface.co/Trappu/Magnum-Picaro-0.7-v2-12b).

Please refer to the original creator for more information.

Calibration dataset: Exl2 default

## Branches

- main: Measurement files
- 4bpw: 4 bits per weight
- 5bpw: 5 bits per weight
- 6bpw: 6 bits per weight

## Notes

- 6bpw is recommended for the best quality-to-VRAM-usage ratio, assuming you have enough VRAM (see the rough estimate after this list).
- Quants greater than 6bpw will not be created because they offer no improvement over 6bpw. If you really want them, ask someone else or make them yourself.
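
If you want a ballpark for those VRAM numbers, here's a quick back-of-envelope sketch (my own estimate, not from the original card): the weights take roughly `params × bpw / 8` bytes, before KV cache and runtime overhead.

```python
# Rough weight-size estimate per branch; assumes ~12e9 parameters
# for a 12b model and ignores KV cache / framework overhead.
PARAMS = 12e9

def weight_gib(bpw: float, params: float = PARAMS) -> float:
    """Approximate weight footprint in GiB at a given bits per weight."""
    return params * bpw / 8 / 1024**3

for bpw in (4, 5, 6):
    print(f"{bpw}bpw: ~{weight_gib(bpw):.1f} GiB of weights")
# -> 4bpw: ~5.6 GiB, 5bpw: ~7.0 GiB, 6bpw: ~8.4 GiB
```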

## Download

With [async-hf-downloader](https://github.com/theroyallab/async-hf-downloader), a lightweight and asynchronous HuggingFace downloader created by me:

```shell
./async-hf-downloader royallab/Magnum-Picaro-0.7-v2-12b-exl2 -r 6bpw -p Magnum-Picaro-0.7-v2-12b-exl2-6bpw
```

With the HuggingFace Hub CLI (`pip install huggingface_hub`):

```shell
huggingface-cli download royallab/Magnum-Picaro-0.7-v2-12b-exl2 --revision 6bpw --local-dir Magnum-Picaro-0.7-v2-12b-exl2-6bpw
```
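
If you'd rather script the download, `huggingface_hub` offers the same operation from Python; a minimal sketch (the local directory name is just an example):

```python
# Programmatic equivalent of the CLI command above.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="royallab/Magnum-Picaro-0.7-v2-12b-exl2",
    revision="6bpw",  # branch holding the 6 bits-per-weight quant
    local_dir="Magnum-Picaro-0.7-v2-12b-exl2-6bpw",
)
```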

## Run in TabbyAPI

TabbyAPI is a pure exllamav2 FastAPI server developed by us. You can find TabbyAPI's source code here: [https://github.com/theroyallab/TabbyAPI](https://github.com/theroyallab/TabbyAPI)

1. Inside TabbyAPI's config.yml, set `model_name` to `Magnum-Picaro-0.7-v2-12b-exl2-6bpw` (see the config sketch after this list)
   - Alternatively, pass `--model_name Magnum-Picaro-0.7-v2-12b-exl2-6bpw` as an argument on startup, or load the model through the `/v2/model/load` endpoint
2. Launch TabbyAPI inside your Python env by running `./start.bat` or `./start.sh`
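
For reference, a minimal sketch of what that config.yml entry might look like; the exact nesting and the `model_dir` default are assumptions based on a typical TabbyAPI setup, so match them against your own config file:

```yml
# config.yml (excerpt) -- key layout assumed, verify against your install
model:
  model_dir: models  # directory the downloaded quant folder lives in
  model_name: Magnum-Picaro-0.7-v2-12b-exl2-6bpw
```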

## Donate?

All my infrastructure and cloud expenses are paid out of pocket. If you'd like to donate, you can do so here: https://ko-fi.com/kingbri

You should not feel obligated to donate, but if you do, I'd appreciate it.

---