Update README.md
README.md CHANGED

@@ -49,7 +49,7 @@ language:
 
 Experimental RP-oriented MoE, the idea was to get a model that would be equal to or better than Mixtral 8x7B and its finetunes in RP/ERP tasks.
 
-### Llama 3 SnowStorm 4x8B
+### Llama 3 SnowStorm v1.0 4x8B
 ```
 base_model: NeverSleep_Llama-3-Lumimaid-8B-v0.1-OAS
 gate_mode: random
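For context, the truncated snippet in the hunk above is the opening of a mergekit-moe config. A minimal sketch of its usual overall shape follows; only `base_model` and `gate_mode` come from the diff, and the expert model names, `dtype`, and everything else are illustrative placeholders, not the actual SnowStorm recipe:

```yaml
# Hypothetical mergekit-moe config sketch. Only base_model and gate_mode
# are taken from the diff above; all other values are placeholder assumptions.
base_model: NeverSleep_Llama-3-Lumimaid-8B-v0.1-OAS
gate_mode: random        # router weights initialized randomly
dtype: bfloat16          # assumed output dtype
experts:
  - source_model: expert-model-1   # placeholder expert
  - source_model: expert-model-2   # placeholder expert
  - source_model: expert-model-3   # placeholder expert
  - source_model: expert-model-4   # placeholder expert
```

With `gate_mode: random` the router gates are initialized randomly rather than derived from prompts, which is why no `positive_prompts` entries appear under each expert; the other gate modes in mergekit derive routing weights from example prompts.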