---
base_model:
- ddh0/Cassiopeia-70B
- Sao10K/Llama-3.3-70B-Vulpecula-r1
base_model_relation: merge
license: unknown
thumbnail: https://huggingface.co/ddh0/Andromeda-70B/resolve/main/andromeda.png
---
# Andromeda-70B

**Andromeda-70B** is the result of an experimental SLERP merge of [Cassiopeia-70B](https://huggingface.co/ddh0/Cassiopeia-70B) and [Sao10K/Llama-3.3-70B-Vulpecula-r1](https://huggingface.co/Sao10K/Llama-3.3-70B-Vulpecula-r1). It is a coherent, unaligned model intended for creative tasks such as storywriting, brainstorming, and interactive roleplay.
## UPDATE
After more thorough testing by me and others, I don't think this model is very good. :( You should use Cassiopeia or Vulpecula instead.
## Merge composition
```yaml
models:
  - model: /opt/workspace/hf/Cassiopeia-70B
  - model: /opt/workspace/hf/Llama-3.3-70B-Vulpecula-r1
merge_method: slerp
base_model: /opt/workspace/hf/Cassiopeia-70B
parameters:
  t: 0.7
dtype: bfloat16
```
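For the curious: SLERP (spherical linear interpolation) blends two weight tensors along the arc between them rather than along a straight line, which better preserves each tensor's norm. The sketch below is a minimal, hedged illustration of the idea using NumPy on flattened vectors; it is not mergekit's actual implementation, and the exact way mergekit applies `t` per layer may differ.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight vectors.

    t=0 returns v0, t=1 returns v1; t=0.7 leans the result
    toward the second model, as in the merge config above.
    """
    # Work with unit directions to find the angle between the vectors
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two directions

    if omega < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1

    so = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Toy example with two orthogonal unit vectors
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
merged = slerp(0.7, a, b)  # stays on the unit circle, closer to b
```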
## Feedback
If you like this model, please support [Sao10k](https://sao10k.carrd.co).
Feedback on this merge is very welcome, good or bad! Please leave a comment in this discussion with your thoughts: [Andromeda-70B/discussions/1](https://huggingface.co/ddh0/Andromeda-70B/discussions/1)