---
base_model:
  - Delta-Vector/Francois-Huali-12B
  - grimjim/mistralai-Mistral-Nemo-Base-2407
  - grimjim/Magnolia-v3-12B
  - inflatebot/MN-12B-Mag-Mell-R1
library_name: transformers
pipeline_tag: text-generation
tags:
  - mergekit
  - merge
license: apache-2.0
---

# MagTie-v1-12B

This is a merge of pre-trained language models created using mergekit.

We used the pretrained base model as the base for a DARE-TIES merge, boosting the weights and densities of the contributing models to compensate for the sparsification and retain more of their fine-tuning.
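To make the weight/density trade-off concrete, here is a minimal NumPy sketch of the two stages of DARE-TIES: DARE randomly drops a fraction (1 − density) of each model's task vector and rescales the survivors, then TIES elects a per-parameter sign and keeps only agreeing contributions. The shapes, scales, and helper names are illustrative, not mergekit's actual implementation.

```python
import numpy as np

def dare_sparsify(delta, density, rng):
    """DARE step: randomly drop (1 - density) of a task vector's entries,
    then rescale survivors by 1/density so its expected value is unchanged."""
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

def ties_combine(deltas, weights):
    """TIES step: elect a per-parameter sign from the weighted sum, keep only
    contributions that agree with it, and normalize by the surviving weight."""
    stacked = np.stack([w * d for w, d in zip(weights, deltas)])
    elected = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected
    kept = np.where(agree, stacked, 0.0).sum(axis=0)
    surviving = np.where(agree, np.asarray(weights)[:, None], 0.0).sum(axis=0)
    return kept / np.maximum(surviving, 1e-12)

rng = np.random.default_rng(0)
base = rng.normal(size=4096)  # stand-in for one flattened base tensor
finetuned = [base + rng.normal(scale=0.1, size=4096) for _ in range(3)]

# density=0.75 and weight=0.85 mirror the values in the config below
deltas = [dare_sparsify(m - base, density=0.75, rng=rng) for m in finetuned]
merged = base + ties_combine(deltas, weights=[0.85, 0.85, 0.85])
```

At the default density of 0.5, half of each task vector is discarded; raising density to 0.75 and weight to 0.85 keeps more of each contributor's delta in the elected directions.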

## Merge Details

### Merge Method

This model was merged using the DARE-TIES merge method, with grimjim/mistralai-Mistral-Nemo-Base-2407 as the base.

### Models Merged

The following models were included in the merge:

* Delta-Vector/Francois-Huali-12B
* grimjim/Magnolia-v3-12B
* inflatebot/MN-12B-Mag-Mell-R1

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: grimjim/mistralai-Mistral-Nemo-Base-2407
models:
  - model: grimjim/mistralai-Mistral-Nemo-Base-2407
  - model: inflatebot/MN-12B-Mag-Mell-R1
    parameters:
      weight: 0.85
      density: 0.75
  - model: Delta-Vector/Francois-Huali-12B
    parameters:
      weight: 0.85
      density: 0.75
  - model: grimjim/Magnolia-v3-12B
    parameters:
      weight: 0.85
      density: 0.75
merge_method: dare_ties
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```
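To reproduce the merge locally, save the configuration above to a file and pass it to mergekit's CLI. This is a sketch: the config filename and output directory are placeholders, and `--cuda` is optional (GPU acceleration).

```shell
pip install mergekit
# save the YAML above as magtie-v1.yml, then run:
mergekit-yaml magtie-v1.yml ./MagTie-v1-12B --cuda
```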