---
language:
- nl
license: apache-2.0
tags:
- generated_from_trainer
- GEITje
- mlx
datasets:
- Rijgersberg/no_robots_nl
- Rijgersberg/ultrachat_10k_nl
- BramVanroy/dutch_chat_datasets
base_model: Rijgersberg/GEITje-7B
pipeline_tag: conversational
model-index:
- name: GEITje-7B-chat-v2
  results: []
---
# Rijgersberg/GEITje-7B-chat-v2-mlx
This model was converted to MLX format from Rijgersberg/GEITje-7B-chat-v2.
Refer to the original model card for more details on the model.
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

model, tokenizer = load("Rijgersberg/GEITje-7B-chat-v2-mlx")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
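
Since this is a chat-tuned model, prompts generally work best when wrapped in the model's chat template rather than passed as raw text. The snippet below is a minimal sketch of that pattern; it assumes the tokenizer returned by `load` exposes the standard Hugging Face `apply_chat_template` method, and the example message is only illustrative.

```python
from mlx_lm import load, generate

model, tokenizer = load("Rijgersberg/GEITje-7B-chat-v2-mlx")

# Build a chat-formatted prompt (assumes the tokenizer ships a chat template).
messages = [{"role": "user", "content": "Schrijf een kort gedicht over de zee."}]
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```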