base_model:
- mistralai/Mistral-Small-24B-Base-2501
---
Hey, this is my first time quanting a Dolphin model, so no idea how it will turn out. The R1 distillation likely introduces some slop, since DeepSeek has a lot of it while the Mistral Small base had far less. That said, I'm not going to complain about some variety, you know?
<br>
[This is the EXL2 6bpw version of this model. For the original model, go here](https://huggingface.co/cognitivecomputations/Dolphin3.0-R1-Mistral-24B)
<br>
[For the 8bpw version, go here](https://huggingface.co/Statuo/Dolphin3-R1-MS-24b-EXL2-8bpw)
<br>
[For the 4bpw version, go here](https://huggingface.co/Statuo/Dolphin3-R1-MS-24b-EXL2-4bpw)
<br>
# Dolphin 3.0 R1 Mistral 24B 🐬
Part of the [Dolphin 3.0 Collection](https://huggingface.co/collections/cognitivecomputations/dolphin-30-677ab47f73d7ff66743979a3)