---
tags:
- medical
---
A medical Mixture of Experts (MoE) model built by merging three leading models in the medical domain: BioMistral, Meditron, and Medalpaca. The fusion was performed with the MergeKit library, a tool designed to blend multiple models' strengths into a single unified system.
## 🧩 Configuration
```yaml
base_model: BioMistral/BioMistral-7B
dtype: float16
gate_mode: cheap_embed
experts:
  - source_model: epfl-llm/meditron-7b
    positive_prompts: ["You are a helpful medical assistant."]
  - source_model: medalpaca/medalpaca-7b
    positive_prompts: ["You are assistant for medical question answering."]
```
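A config like the one above is normally applied with MergeKit's `mergekit-moe` command. A minimal sketch, assuming `mergekit` is installed and the config is saved locally as `config.yml` (the output directory name here is arbitrary; exact flags may vary by MergeKit version):

```shell
# Install MergeKit (the mergekit-moe script ships with the main package)
pip install mergekit

# Build the MoE from the YAML config; merged weights are written to the output directory
mergekit-moe config.yml ./merged-medical-moe
```

With `gate_mode: cheap_embed`, the router weights are initialized from embeddings of the `positive_prompts`, so no GPU-heavy hidden-state computation is required during the merge.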