---
license: apache-2.0
tags:
- merge
- mergekit
- vilm/vinallama-7b-chat
---

# vinallama-chat-merge-2

This model is a passthrough self-merge of [vilm/vinallama-7b-chat](https://huggingface.co/vilm/vinallama-7b-chat), made with [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing). Eight overlapping layer slices of the same base model are stacked back-to-back, turning the original 32-layer network into a deeper 76-layer model.

## 🧩 Configuration

```yaml
slices:
- sources:
  - model: vilm/vinallama-7b-chat
    layer_range: [0, 16]
- sources:
  - model: vilm/vinallama-7b-chat
    layer_range: [8, 16]
- sources:
  - model: vilm/vinallama-7b-chat
    layer_range: [8, 16]
- sources:
  - model: vilm/vinallama-7b-chat
    layer_range: [12, 24]
- sources:
  - model: vilm/vinallama-7b-chat
    layer_range: [12, 24]
- sources:
  - model: vilm/vinallama-7b-chat
    layer_range: [20, 28]
- sources:
  - model: vilm/vinallama-7b-chat
    layer_range: [20, 28]
- sources:
  - model: vilm/vinallama-7b-chat
    layer_range: [28, 32]
merge_method: passthrough
dtype: bfloat16
```
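Because the passthrough merge method simply concatenates the listed slices in order, the depth of the merged model follows directly from the config. A minimal sketch (pure Python, with the layer ranges copied from the YAML configuration) of that arithmetic:

```python
# Layer ranges taken from the merge config; passthrough concatenates
# these slices of vilm/vinallama-7b-chat in the order listed.
slices = [(0, 16), (8, 16), (8, 16), (12, 24),
          (12, 24), (20, 28), (20, 28), (28, 32)]

# Each slice contributes (end - start) transformer layers to the result.
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 76 layers, up from 32 in the base model
```

Note that ranges such as [8, 16] appear twice, so those layers are duplicated (with identical weights) in the merged checkpoint; this is what makes the model larger than the 7B base despite using only one source model.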