# pt-gemma-portuguese-luana-2b-x-gemma-2b-it-della_linear-50_50

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the Linear DELLA merge method, with google/gemma-2b-it as the base model.
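For intuition: Linear DELLA merges "task vectors" (the difference between each fine-tune and the base) by stochastically dropping low-magnitude delta entries, rescaling the survivors, and summing the results linearly. In the configuration below, `density` sets the expected fraction of entries kept, `epsilon` the spread of the magnitude-dependent drop probabilities, and `lambda` a final scaling of the merged delta. The sketch that follows is a minimal single-tensor illustration of that idea, not mergekit's implementation; the exact drop-probability schedule is an assumption based on the DELLA paper's magnitude-based pruning step.

```python
import torch

def della_linear(base, tuned, weights, density=0.5, epsilon=0.2, lam=1.0):
    """Illustrative Linear-DELLA merge of one tensor (NOT mergekit's code)."""
    merged_delta = torch.zeros_like(base)
    for ft, w in zip(tuned, weights):
        delta = ft - base  # "task vector" for this fine-tune
        # Rank entries by magnitude: rank 0 = smallest, 1 = largest.
        flat = delta.abs().flatten()
        ranks = flat.argsort().argsort().float() / max(flat.numel() - 1, 1)
        # Assumed schedule: drop probabilities form a band of width `epsilon`
        # centred on 1 - density; larger entries are dropped less often.
        p_drop = (1.0 - density) + epsilon * (0.5 - ranks)
        p_drop = p_drop.clamp(0.0, 1.0).reshape(delta.shape)
        keep = torch.bernoulli(1.0 - p_drop)
        # Rescale survivors so the delta is unbiased in expectation.
        merged_delta += w * delta * keep / (1.0 - p_drop).clamp_min(1e-8)
    return base + lam * merged_delta
```

With `normalize: true` and both weights at 0.5, the two deltas already contribute on equal footing, so the merged model sits halfway between the two fine-tunes in delta space.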

### Models Merged

The following models were included in the merge:

* [rhaymison/gemma-portuguese-luana-2b](https://huggingface.co/rhaymison/gemma-portuguese-luana-2b)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: della_linear
models:
- model: rhaymison/gemma-portuguese-luana-2b
  parameters:
    weight: 0.5
- model: google/gemma-2b-it
  parameters:
    weight: 0.5
parameters:
  normalize: true
  int8_mask: true
  density: 0.5
  lambda: 1.0
  epsilon: 0.2
dtype: bfloat16
tokenizer:
  source: union
base_model: google/gemma-2b-it
write_readme: README.md
```
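With mergekit installed, saving the configuration above as `config.yaml` and running a command like `mergekit-yaml config.yaml ./merged-model` reproduces the merge. The merged checkpoint then loads like any other Gemma model; the sketch below assumes the output was pushed under this card's repository name and that the union tokenizer retains Gemma's instruction chat template.

```python
# Minimal usage sketch: load the merged model with Hugging Face Transformers.
# The repo name is taken from this card's title and is an assumption here.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "gsjang/pt-gemma-portuguese-luana-2b-x-gemma-2b-it-della_linear-50_50"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.bfloat16)

# gemma-2b-it is instruction-tuned, so format prompts with the chat template.
messages = [{"role": "user", "content": "Explique o que é aprendizado de máquina."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```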