Source model

Lunar-Twilight-12B by Vortex5


Provided quantized models

ExLlamaV3: release v0.0.18

Requirements: a Python installation with the huggingface-hub module in order to use the CLI.
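As a minimal sketch, the quantized weights can also be fetched with the huggingface_hub Python API instead of the CLI (huggingface-cli download). The repository ID below is the EXL3 repo named on this page; the local directory name and the commented-out revision are illustrative assumptions only.

# Sketch: download the EXL3 quantized weights with huggingface_hub.
# Assumes `pip install huggingface_hub`; local_dir is an arbitrary example path.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="DeathGodlike/Vortex5_Lunar-Twilight-12B_EXL3",
    local_dir="Lunar-Twilight-12B_EXL3",  # destination folder (illustrative)
    # revision="main",  # select a specific branch/revision if the repo hosts several quant variants
)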

Licensing

License detected: unknown

The license for the provided quantized models is inherited from the source model (which in turn incorporates the license of its original base model). For definitive licensing information, please refer first to the pages of the source and base models. File and page backups of the source model are provided below.


Backups

Date: 09.01.2026

Source files

Source page

Lunar-Twilight-12B

Overview

Lunar-Twilight-12B was created by merging Starlit-Shadow-12B, Red-Synthesis-12B, Tlacuilo-12B, and Mystic-Matron-12B using a custom method.

Merge configuration
base_model: Vortex5/Starlit-Shadow-12B
models:
  - model: Vortex5/Starlit-Shadow-12B
  - model: Vortex5/Red-Synthesis-12B
  - model: allura-org/Tlacuilo-12B
  - model: Vortex5/Mystic-Matron-12B
merge_method: hpq
chat_template: auto
parameters:
  strength: 0.73
  flavor: 0.36
  steps: 12
  cube_dims: 22
  paradox: 0.43
  boost: 0.56
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
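The configuration above follows the mergekit YAML layout. As a hedged illustration only, the sketch below shows how such a config is typically consumed through mergekit's Python API. Note that hpq is a custom merge method and would have to be registered in the local mergekit installation for this to run; the config filename and output path are placeholders, not part of the original card.

# Sketch: run a mergekit-style merge from the YAML config shown above,
# assuming it was saved locally as lunar-twilight.yml and that the custom
# "hpq" merge method is available in the installed mergekit.
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("lunar-twilight.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Lunar-Twilight-12B",          # placeholder output directory
    options=MergeOptions(cuda=True, copy_tokenizer=True),
)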

Intended Use

📜 Storytelling
🎭 Roleplay
🌙 Creative Writing