---
base_model:
- Entropicengine/LatentDream-exp-delta-8b
- Entropicengine/LatentDream-exp-gamma-8b
- Entropicengine/LatentDream-exp-beta-8b
- Entropicengine/LatentDream-exp-alpha-8b
- Entropicengine/Luminatium-L3-8b
library_name: transformers
tags:
- mergekit
- merge
---
~ The culmination!
# LatentSoup-modelstock-8b
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
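
The merged model can be loaded like any other Llama-3-class causal LM with 🤗 Transformers. This is a minimal sketch, assuming the model is published under the repo id `Entropicengine/LatentSoup-modelstock-8b` and inherits the Llama 3 chat template from its constituent models:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Entropicengine/LatentSoup-modelstock-8b"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype in the config below
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a short poem about model soups."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```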
## Merge Details
### Merge Method
This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with Entropicengine/Luminatium-L3-8b as the base model.
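
For intuition: Model Stock (Jang et al., 2024) interpolates each weight tensor between the average of the fine-tuned models and the base model, choosing the interpolation ratio from the average pairwise cosine similarity of the fine-tuned deltas. The sketch below illustrates the paper's per-tensor formula; it is not mergekit's actual `model_stock` implementation, which may differ in details:

```python
import torch
import torch.nn.functional as F

def model_stock_merge(base: torch.Tensor, finetuned: list[torch.Tensor]) -> torch.Tensor:
    """Per-tensor Model Stock interpolation (sketch; assumes >= 2 fine-tuned models)."""
    k = len(finetuned)
    deltas = [(w - base).flatten() for w in finetuned]

    # Average pairwise cosine similarity between the fine-tuned deltas (task vectors).
    cos_sims = [
        F.cosine_similarity(deltas[i], deltas[j], dim=0)
        for i in range(k) for j in range(i + 1, k)
    ]
    cos_theta = torch.stack(cos_sims).mean()

    # Interpolation ratio from the paper: t = k*cos(theta) / ((k-1)*cos(theta) + 1).
    t = k * cos_theta / ((k - 1) * cos_theta + 1)

    # Blend the average of the fine-tuned weights back toward the base weights.
    w_avg = torch.stack(finetuned).mean(dim=0)
    return t * w_avg + (1 - t) * base
```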
### Models Merged
The following models were included in the merge:
- Entropicengine/LatentDream-exp-delta-8b
- Entropicengine/LatentDream-exp-gamma-8b
- Entropicengine/LatentDream-exp-beta-8b
- Entropicengine/LatentDream-exp-alpha-8b
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: Entropicengine/Luminatium-L3-8b
dtype: bfloat16
merge_method: model_stock
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 32]
        model: Entropicengine/LatentDream-exp-alpha-8b
      - layer_range: [0, 32]
        model: Entropicengine/LatentDream-exp-beta-8b
      - layer_range: [0, 32]
        model: Entropicengine/LatentDream-exp-gamma-8b
      - layer_range: [0, 32]
        model: Entropicengine/LatentDream-exp-delta-8b
      - layer_range: [0, 32]
        model: Entropicengine/Luminatium-L3-8b
```
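
To reproduce the merge, save the configuration above (e.g. as `config.yaml`) and run it through mergekit, either via the `mergekit-yaml` CLI or from Python. The snippet below is a rough sketch based on mergekit's documented library usage; entry-point names and options can differ between mergekit versions, and the output directory name is an assumption:

```python
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML merge configuration shown above.
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge and write the result to a local directory (assumed name).
run_merge(
    merge_config,
    out_path="./LatentSoup-modelstock-8b",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```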