---
base_model:
  - NousResearch/Hermes-3-Llama-3.2-3B
  - tiiuae/Falcon3-3B-Instruct
library_name: transformers
tags:
  - mergekit
  - merge
---

# merge1

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the passthrough merge method, with `merge` (the output of an earlier merge step) as the base model.

### Models Merged

The following models were included in the merge:

* tiiuae/Falcon3-3B-Instruct
* NousResearch/Hermes-3-Llama-3.2-3B

### Configuration

The following YAML configuration was used to produce this model:


```yaml
merge_method: passthrough
dtype: bfloat16
base_model: merge

slices:
  - sources:
      - model: tiiuae/Falcon3-3B-Instruct
        layer_range: [0, 12]
  - sources:
      - model: NousResearch/Hermes-3-Llama-3.2-3B
        layer_range: [12, 22]
  - sources:
      - model: merge  # Reference the Step 1 output directory/model
        layer_range: [16, 28]
```
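Read as layer slices, the configuration stacks three ranges end-to-end: layers 0–11 from Falcon3, 12–21 from Hermes-3, and 16–27 from the intermediate `merge` model, for 12 + 10 + 12 = 34 decoder layers in total. A toy sketch of passthrough stacking (labels stand in for real weight tensors; nothing here touches the actual checkpoints or the mergekit API):

```python
# Toy model of passthrough stacking: each slice copies a layer range
# verbatim from one source model; nothing is averaged or interpolated.

def passthrough_stack(slices):
    """slices: list of (layers, start, end); returns the concatenated stack."""
    merged = []
    for layers, start, end in slices:
        merged.extend(layers[start:end])  # plain copy, no weight arithmetic
    return merged

# Placeholder "models" with labeled layers instead of weight tensors.
falcon = [f"falcon.{i}" for i in range(28)]
hermes = [f"hermes.{i}" for i in range(28)]
base   = [f"merge.{i}"  for i in range(28)]  # stand-in for the Step 1 output

# Mirror the slice plan from the YAML configuration above.
stacked = passthrough_stack([(falcon, 0, 12), (hermes, 12, 22), (base, 16, 28)])
print(len(stacked))                            # 34 layers: 12 + 10 + 12
print(stacked[0], stacked[12], stacked[-1])    # falcon.0 hermes.12 merge.27
```

Note that indices 16–21 are selected in both the Hermes and base slices; since the layers come from different models, no layer is duplicated verbatim, which is typical of frankenmerge-style stacking.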