---
base_model:
- aixonlab/Eurydice-24b-v2
- ReadyArt/Broken-Tutu-24B
- TheDrummer/Cydonia-24B-v2.1
- PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
- arcee-ai/Arcee-Blitz
library_name: transformers
tags:
- mergekit
- merge
---
# DarkHazard-v1.1-24b

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Inspiration

This merge was inspired by Yoesph/Haphazard-v1.1-24b.

### Changelog

v1.2

* Replaced Yoesph/Haphazard-v1.1-24b with TheDrummer/Cydonia-24B-v2.1.
* Replaced ReadyArt/Safeword-Abomination-of-Omega-Darker-Gaslight_The-Final-Forgotten-Transgression-24B with ReadyArt/Broken-Tutu-24B.

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [arcee-ai/Arcee-Blitz](https://huggingface.co/arcee-ai/Arcee-Blitz) as the base model.

### Models Merged

The following models were included in the merge:

* [aixonlab/Eurydice-24b-v2](https://huggingface.co/aixonlab/Eurydice-24b-v2)
* [ReadyArt/Broken-Tutu-24B](https://huggingface.co/ReadyArt/Broken-Tutu-24B)
* [TheDrummer/Cydonia-24B-v2.1](https://huggingface.co/TheDrummer/Cydonia-24B-v2.1)
* [PocketDoc/Dans-PersonalityEngine-V1.2.0-24b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-V1.2.0-24b)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: arcee-ai/Arcee-Blitz
merge_method: model_stock
dtype: bfloat16
models:
  - model: aixonlab/Eurydice-24b-v2                    # storytelling / RP
  - model: TheDrummer/Cydonia-24B-v2.1                 # uncensoring
  - model: ReadyArt/Broken-Tutu-24B                    # uncensoring + NSFW + Cydonia
  - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b # prompt adherence
```
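
To reproduce the merge, this configuration would typically be saved to a file (e.g. `config.yaml`) and run with mergekit's `mergekit-yaml` command, which writes the merged weights to an output directory. The snippet below is a minimal sketch of loading and sampling from the resulting checkpoint with Hugging Face Transformers; the local path, prompt, and generation settings are illustrative assumptions and not part of the merge output itself.

```python
# Minimal sketch: load the merged checkpoint with Hugging Face Transformers.
# The path below is an assumption; point it at the mergekit output directory
# or at the published Hugging Face repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./DarkHazard-v1.1-24b"  # assumed local mergekit output directory

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",           # requires `accelerate`; spreads layers across available GPUs
)

prompt = "Write the opening paragraph of a mystery story set in a lighthouse."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that loading a 24B-parameter model in bfloat16 requires roughly 48 GB of accelerator memory; on smaller GPUs a quantized load (for example via `bitsandbytes`) may be preferable.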