---
license: apache-2.0
---
<div style="width: 100%;">
  <img src="http://x-pai.algolet.com/bot/img/logo_core.png" alt="TigerBot" style="width: 20%; display: block; margin: auto;">
</div>
<p align="center">
  <font face="黑体" size="5"> A cutting-edge foundation for your very own LLM. </font>
</p>
<p align="center">
  🌐 <a href="https://tigerbot.com/" target="_blank">TigerBot</a> • 🤗 <a href="https://huggingface.co/TigerResearch" target="_blank">Hugging Face</a>
</p>

This is a 4-bit EXL2 quantization of [tigerbot-70b-chat-v6](https://huggingface.co/TigerResearch/tigerbot-70b-chat-v6).

It was quantized to 4 bits with [exllamav2](https://github.com/turboderp/exllamav2).
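
As a rough sanity check on hardware requirements, the footprint of the quantized weights can be estimated from the parameter count and the bits per weight. This is a back-of-the-envelope sketch only; it ignores the KV cache, activations, and quantization overhead:

```python
def approx_weight_gib(n_params: float, bits_per_weight: float) -> float:
    """Approximate size of the quantized weights alone, in GiB."""
    return n_params * bits_per_weight / 8 / 2**30

# ~70B parameters at 4.0 bits per weight:
print(round(approx_weight_gib(70e9, 4.0), 1))  # ≈ 32.6 GiB
```

In practice you need headroom beyond the raw weight size for the KV cache and activations, so plan for more VRAM than this estimate.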

## How to download and use this model

Setup instructions live in the TigerBot repository on GitHub: https://github.com/TigerResearch/TigerBot

Clone the TigerBot repository and install its dependencies:

```
conda create --name tigerbot python=3.8
conda activate tigerbot
conda install pytorch torchvision torchaudio pytorch-cuda=11.7 -c pytorch -c nvidia

git clone https://github.com/TigerResearch/TigerBot
cd TigerBot
pip install -r requirements.txt
```

### Inference with the command-line interface

Infer with exllamav2:
```
# install exllamav2
git clone https://github.com/turboderp/exllamav2
cd exllamav2
pip install -r requirements.txt

# infer command
CUDA_VISIBLE_DEVICES=0 python other_infer/exllamav2_hf_infer.py --model_path TigerResearch/tigerbot-70b-chat-v6-4bit-exl2
```
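
Under the hood, TigerBot's infer scripts wrap each user query in an instruction template before tokenization. A minimal sketch of that formatting step is below; the exact template strings are an assumption based on the TigerBot repository's inference code, so verify them against the repo before relying on them:

```python
# Assumed TigerBot single-turn prompt template (check the TigerBot
# repo's infer scripts before relying on these exact strings).
TOK_INS = "\n\n### Instruction:\n"
TOK_RES = "\n\n### Response:\n"

def build_prompt(query: str) -> str:
    """Wrap a raw user query in the instruction/response template."""
    return f"{TOK_INS}{query}{TOK_RES}"

print(repr(build_prompt("Who are you?")))
```

The model then generates a continuation after the response marker, which the script decodes and returns as the answer.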