Source model

Celestial-Queen-12B by Vortex5


Provided quantized models

ExLlamaV3: v0.0.28

Requirements: a Python installation with the huggingface-hub module to use the CLI.
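As a minimal sketch of the download step (the repo id is taken from this page; EXL3 uploads commonly keep each quant level in its own branch, and the `--revision` value below is a placeholder for whichever branch you want):

```shell
# install the CLI (ships with the huggingface-hub package)
pip install -U huggingface_hub

# download the quantized model into a local folder
huggingface-cli download DeathGodlike/Vortex5_Celestial-Queen-12B_EXL3 \
  --revision main \
  --local-dir Celestial-Queen-12B_EXL3
```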

Licensing

License detected: apache-2.0

The license for the provided quantized models is inherited from the source model (which in turn incorporates the license of its original base model). For definitive licensing information, please refer first to the pages of the source and base models. File and page backups of the source model are provided below.


Backups

Date: 04.04.2026

Source files

Source page

Celestial-Queen-12B

Overview

Celestial-Queen-12B was created through a multi-stage merge combining Crimson-Constellation-12B, Strawberry_Smoothie-12B-Model_Stock, MN-12B-Mag-Mell-R1 (used in two stages), LunaMaid-12B, Mahou-1.5-mistral-nemo-12B, MN-12B-Celeste-V1.9, Omega-Darker_The-Final-Directive-12B, and MegaMoon-Karcher-12B.

Multi-stage merge configuration
name: First
models:
  - model: Vortex5/Crimson-Constellation-12B
  - model: DreadPoor/Strawberry_Smoothie-12B-Model_Stock
  - model: inflatebot/MN-12B-Mag-Mell-R1
  - model: Vortex5/LunaMaid-12B
merge_method: saef
parameters:
  paradox: 0.40
  strength: 0.88
  boost: 0.28
  modes: 2
dtype: float32
tokenizer:
  source: Vortex5/LunaMaid-12B
---
name: Second
models:
  - model: flammenai/Mahou-1.5-mistral-nemo-12B
  - model: nothingiisreal/MN-12B-Celeste-V1.9
  - model: ReadyArt/Omega-Darker_The-Final-Directive-12B
merge_method: saef
parameters:
  paradox: 0.54
  strength: 0.9
  boost: 0.6
  modes: 2
dtype: float32
tokenizer:
  source: union
---
name: Nearswap1
models:
  - model: Vortex5/MegaMoon-Karcher-12B
merge_method: nearswap
base_model: First
parameters:
  t: 0.0008
dtype: float32
tokenizer:
  source: First
---
name: Nearswap2
models:
  - model: inflatebot/MN-12B-Mag-Mell-R1
merge_method: nearswap
base_model: Second
parameters:
  t: 0.0008
dtype: float32
tokenizer:
  source: Second
---
models:
  - model: Nearswap1
  - model: Nearswap2
merge_method: karcher
chat_template: auto
dtype: float32
out_dtype: bfloat16
parameters:
  tol: 1e-9
  max_iter: 1000
tokenizer:
  source: Vortex5/LunaMaid-12B
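The two less common merge operations above can be illustrated numerically. The following is a hypothetical NumPy sketch of what nearswap (similarity-gated interpolation controlled by t) and the final karcher step (an iterative Karcher/Fréchet mean on the unit sphere, with the same tol and max_iter parameters) compute per tensor, based on the methods' public descriptions; it is not mergekit's actual implementation.

```python
import numpy as np

def nearswap(base, secondary, t):
    """Similarity-gated interpolation: elements of `base` move toward
    `secondary` only where the two tensors are already close.
    Per-element blend weight is t / |base - secondary|, clipped to [0, 1]."""
    diff = np.abs(base - secondary)
    lweight = np.clip(t / np.maximum(diff, 1e-12), 0.0, 1.0)
    return base + lweight * (secondary - base)

def karcher_mean(points, tol=1e-9, max_iter=1000):
    """Karcher (Frechet) mean of unit vectors: average in the tangent
    space at the current estimate, map back to the sphere, repeat."""
    points = [p / np.linalg.norm(p) for p in points]
    mu = points[0]
    for _ in range(max_iter):
        tangents = []
        for p in points:
            dot = np.clip(np.dot(mu, p), -1.0, 1.0)
            theta = np.arccos(dot)        # geodesic distance mu -> p
            residual = p - dot * mu       # component of p orthogonal to mu
            rnorm = np.linalg.norm(residual)
            tangents.append(np.zeros_like(p) if rnorm < 1e-15
                            else theta * residual / rnorm)
        step = np.mean(tangents, axis=0)  # mean update in tangent space
        snorm = np.linalg.norm(step)
        if snorm < tol:                   # converged within tolerance
            break
        mu = np.cos(snorm) * mu + np.sin(snorm) * step / snorm  # exp map
        mu /= np.linalg.norm(mu)
    return mu
```

For two unit vectors the Karcher mean is simply the midpoint of the geodesic between them; with more inputs it minimizes the summed squared geodesic distances, which is why the final step above needs the tol/max_iter iteration controls.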

Intended Use

Storytelling: long-form narrative
Roleplay: emotion-forward interaction
Creative Writing: atmospheric fiction
Model tree for DeathGodlike/Vortex5_Celestial-Queen-12B_EXL3

Quantized
(3)
this model