---
license: mit
base_model: moreh/MoMo-72B-lora-1.8.7-DPO
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64c14f6b02e1f8f67c73bd05/pf4d6FA7DriRtVq5HCkxd.png)


![image/png](https://cdn-uploads.huggingface.co/production/uploads/64c14f6b02e1f8f67c73bd05/e4u8VYfDBh11u60rFYJHF.png)

This model is a fine-tune of moreh's [MoMo-72B](https://huggingface.co/moreh/MoMo-72B-lora-1.8.7-DPO) model.
It was trained on new datasets with a new technique, which we will share with the community soon.
No form of model merging was used in creating this model.

### Evaluation Results

Coming soon.

### Contamination Results

Coming soon.