---
base_model:
- mlabonne/NeuralBeagle14-7B
tags:
- mergekit
- merge
license: apache-2.0
---
# franken-Beagle-11B

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5fad8602b8423e1d80b8a965/KQTqm6n3bkV-uvfmXk2IT.png)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following model was included in the merge:
* [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
    - model: mlabonne/NeuralBeagle14-7B
      layer_range: [0, 24]
  - sources:
    - model: mlabonne/NeuralBeagle14-7B
      layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```
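The two slices stack layers 0–23 and 8–31 of NeuralBeagle14-7B, so layers 8–23 appear twice and the merged model has 48 transformer layers instead of the source model's 32 — roughly 10.5B parameters, hence the "11B" in the name. As a rough sketch of how to reproduce the merge, the snippet below uses mergekit's Python API as shown in its README (the `mergekit-yaml` CLI is the more common entry point); the config path and output directory are placeholders:

```python
# Sketch: run the passthrough merge programmatically with mergekit.
# Assumes mergekit is installed (pip install mergekit) and that the YAML
# config above is saved as ./config.yml. Paths are placeholders.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_PATH = "./config.yml"   # the YAML configuration from this card
OUTPUT_PATH = "./franken-Beagle-11B"  # where the merged weights are written

with open(CONFIG_PATH, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    OUTPUT_PATH,
    options=MergeOptions(
        copy_tokenizer=True,   # carry the source tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```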
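Once merged (or downloaded), the model loads like any other causal LM with 🤗 Transformers. A minimal usage sketch follows; the repo id is a placeholder for wherever the merged weights are hosted, and `bfloat16` matches the dtype used for the merge:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id — substitute the actual model path or hub id.
model_id = "franken-Beagle-11B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # same dtype as the merge config
    device_map="auto",
)

prompt = "Explain what a passthrough merge is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```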