---
license: mit
base_model: moreh/MoMo-72B-lora-1.8.7-DPO
---
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64c14f6b02e1f8f67c73bd05/pf4d6FA7DriRtVq5HCkxd.png)

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64c14f6b02e1f8f67c73bd05/e4u8VYfDBh11u60rFYJHF.png)

This model is a fine-tune of Moreh's [MoMo-72B](https://huggingface.co/moreh/MoMo-72B-lora-1.8.7-DPO) model.

It was trained on new datasets with a new technique, which we will share with the community soon.

This model was not created using any form of model merging.
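For reference, below is a minimal usage sketch with 🤗 Transformers. The repository id is a placeholder (this card does not state the final repo name), and a multi-GPU or quantized setup is assumed, since 72B-parameter weights do not fit on a single consumer GPU.

```python
# Minimal usage sketch: the repo id is a placeholder, not this model's confirmed name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-momo-72b-finetune"  # placeholder: replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # half precision; 72B weights still need multiple GPUs or quantization
    device_map="auto",            # shard layers across the available devices
)

prompt = "Briefly explain what fine-tuning a language model means."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```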
### Evaluation Results

Coming soon.
### Contamination Results

Coming soon.