Update README.md
README.md
CHANGED
@@ -43,6 +43,9 @@ language:
 > [!NOTE]
 > [GGUF/Exl2 quants](https://huggingface.co/collections/xxx777xxxASD/snowstorm-4x8b-664b52a1d2a12e515efb5680)
 
+> [!NOTE]
+> Check for [v1.15A](https://huggingface.co/xxx777xxxASD/L3-SnowStorm-v1.15-4x8B-A) and [v1.15B](https://huggingface.co/xxx777xxxASD/L3-SnowStorm-v1.15-4x8B-B)
+
 
 Experimental RP-oriented MoE; the idea was to get a model equal to or better than Mixtral 8x7B and its finetunes in RP/ERP tasks.
 