---
base_model:
- NousResearch/Hermes-3-Llama-3.2-3B
- tiiuae/Falcon3-3B-Instruct
library_name: transformers
tags:
- mergekit
- merge
---
# merge1

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the Passthrough merge method, with `merge` as the base model.

### Models Merged

The following models were included in the merge:

* [NousResearch/Hermes-3-Llama-3.2-3B](https://huggingface.co/NousResearch/Hermes-3-Llama-3.2-3B)
* [tiiuae/Falcon3-3B-Instruct](https://huggingface.co/tiiuae/Falcon3-3B-Instruct)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: passthrough
dtype: bfloat16
base_model: merge
slices:
- sources:
  - model: tiiuae/Falcon3-3B-Instruct
    layer_range: [0, 12]
- sources:
  - model: NousResearch/Hermes-3-Llama-3.2-3B
    layer_range: [12, 22]
- sources:
  - model: merge  # Reference the Step 1 output directory/model
    layer_range: [16, 28]
dtype: float16
```
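The passthrough method does no weighted averaging; it simply stacks the listed layer slices on top of each other, so the depth of the merged model is the sum of the slice widths (mergekit's `layer_range` is a half-open `[start, end)` interval). A minimal sketch of that depth calculation, using the slice definitions from the YAML above, is:

```python
# Slice definitions copied from the configuration above,
# expressed as (model, [start, end)) half-open layer ranges.
slices = [
    ("tiiuae/Falcon3-3B-Instruct", (0, 12)),
    ("NousResearch/Hermes-3-Llama-3.2-3B", (12, 22)),
    ("merge", (16, 28)),
]

# Passthrough concatenates each slice's layers in order, so the
# merged model's decoder depth is the sum of the slice widths.
total_layers = sum(end - start for _, (start, end) in slices)
print(total_layers)  # 12 + 10 + 12 = 34
```

Note that the stacked ranges need not be contiguous across models: the second and third slices overlap in index (layers 16-21 of `merge` follow layer 21 of Hermes), which is intentional in frankenmerges of this kind.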