🦅 Altair Stock 12B v1


This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the Model Stock merge method, with mistralai/Mistral-Nemo-Instruct-2407 as the base model.
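Model Stock averages the fine-tuned checkpoints and then interpolates back toward the base model, with the interpolation ratio derived from the angle between the fine-tuned weight deltas. Below is a minimal NumPy sketch of that idea for a single weight tensor — a simplification for illustration, not mergekit's actual implementation (`model_stock_merge` is a hypothetical helper name):

```python
import numpy as np

def model_stock_merge(base, finetuned):
    """Sketch of the Model Stock idea: average the fine-tuned weights,
    then pull the average back toward the base weights using a ratio
    computed from the angle between the fine-tuned deltas."""
    k = len(finetuned)
    deltas = [w - base for w in finetuned]

    # Average pairwise cosine similarity between the weight deltas.
    cos_vals = []
    for i in range(k):
        for j in range(i + 1, k):
            cos_vals.append(
                np.dot(deltas[i], deltas[j])
                / (np.linalg.norm(deltas[i]) * np.linalg.norm(deltas[j]))
            )
    cos_theta = float(np.mean(cos_vals))

    # Interpolation ratio: t -> 1 when the deltas agree (cos_theta -> 1),
    # t -> 0 when they are orthogonal, so divergent models fall back
    # toward the base weights.
    t = k * cos_theta / (1 + (k - 1) * cos_theta)

    w_avg = np.mean(finetuned, axis=0)
    return t * w_avg + (1 - t) * base

# Two orthogonal deltas cancel out: the merge stays at the base weights.
merged = model_stock_merge(
    np.zeros(4),
    [np.array([1., 0., 0., 0.]), np.array([0., 1., 0., 0.])],
)
```

In the real merge this weighting is applied per-tensor across all ten models listed in the configuration below, which is why divergent fine-tunes tend to be pulled back toward the shared Mistral-Nemo base rather than dominating the result.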

Models Merged

The following models were included in the merge:

- anthracite-org/magnum-v4-12b
- ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.2
- Fizzarolli/MN-12b-Rosier-v1
- HumanLLMs/Human-Like-Mistral-Nemo-Instruct-2407
- inflatebot/MN-12B-Mag-Mell-R1
- KOOWEEYUS/BlackSheep-RP-12B
- Lambent/Arsenic-Shahrazad-12B-v3
- SicariusSicariiStuff/Impish_Bloodmoon_12B
- SuperbEmphasis/MN-12b-RP-Ink-RP-Longform

🧩 Configuration

The following YAML configuration was used to produce this model:

architecture: MistralForCausalLM
base_model: B:\12B\!models--mistralai--Mistral-Nemo-Instruct-2407
merge_method: model_stock
models:
  - model: B:\12B\!models--mistralai--Mistral-Nemo-Instruct-2407
  - model: B:\12B\!models--anthracite-org--magnum-v4-12b
  - model: B:\12B\!models--ArliAI--Mistral-Nemo-12B-ArliAI-RPMax-v1.2
  - model: B:\12B\!models--Fizzarolli--MN-12b-Rosier-v1
  - model: B:\12B\!models--HumanLLMs--Human-Like-Mistral-Nemo-Instruct-2407
  - model: B:\12B\!models--inflatebot--MN-12B-Mag-Mell-R1
  - model: B:\12B\!models--KOOWEEYUS--BlackSheep-RP-12B
  - model: B:\12B\!models--Lambent--Arsenic-Shahrazad-12B-v3
  - model: B:\12B\!models--SicariusSicariiStuff--Impish_Bloodmoon_12B
  - model: B:\12B\!models--SuperbEmphasis--MN-12b-RP-Ink-RP-Longform
parameters:
  filter_wise: false
dtype: bfloat16
tokenizer:
  source: base
name: 🦅 Altair-Stock-12B-v1
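Saved to a file such as `altair.yaml` (a hypothetical filename), a configuration like the one above can be applied with mergekit's CLI — a sketch assuming mergekit is installed and the local model paths exist:

```shell
# Install mergekit, then run the merge described by the YAML config.
pip install mergekit
mergekit-yaml altair.yaml ./Altair-Stock-12B-v1
```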
Safetensors · Model size: 12B params · Tensor type: BF16

Model tree for EldritchLabs/Altair-Stock-12B-v1