BossRui committed on
Commit 814e3f2 • 1 Parent(s): 44ac6fe

Update README.md

Files changed (1): README.md (+24 −18)
@@ -59,16 +59,18 @@ tags:
 # Democratizing Medical LLMs For Much More Languages
 
 Covering 12 Major Languages including English, Chinese, French, Hindi, Spanish, Arabic, Russian, Japanese, Korean, German, Italian, Portuguese and 38 Minor Languages so far.
-<center>
 
 <p align="center">
-📃 <a href="https://arxiv.org/abs/2410.10626" target="_blank">Paper</a> • 🌐 <a href="" target="_blank">Demo</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloMoEDataset" target="_blank">ApolloMoEDataset</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloMoEBench" target="_blank">ApolloMoEBench</a> • 🤗 <a href="https://huggingface.co/collections/FreedomIntelligence/apollomoe-and-apollo2-670ddebe3bb1ba1aebabbf2c" target="_blank">Models</a> • 🌐 <a href="https://github.com/FreedomIntelligence/Apollo" target="_blank">Apollo</a>
+📃 <a href="https://arxiv.org/abs/2410.10626" target="_blank">Paper</a> • 🌐 <a href="" target="_blank">Demo</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloMoEDataset" target="_blank">ApolloMoEDataset</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloMoEBench" target="_blank">ApolloMoEBench</a> • 🤗 <a href="https://huggingface.co/collections/FreedomIntelligence/apollomoe-and-apollo2-670ddebe3bb1ba1aebabbf2c" target="_blank">Models</a> • 🌐 <a href="https://github.com/FreedomIntelligence/Apollo" target="_blank">Apollo</a> • 🌐 <a href="https://github.com/FreedomIntelligence/ApolloMoE" target="_blank">ApolloMoE</a>
 </p>
 
 ![Apollo](assets/apollo_medium_final.png)
 
 ## 🌈 Update
 
 * **[2024.10.15]** ApolloMoE repo is published!🎉
@@ -90,42 +92,47 @@
 <details>
 <summary>Click to view the MoE routing image</summary>
 
-![ApolloMoE](/assets/hybrid_routing.png)
+![ApolloMoE](assets/hybrid_routing.png)
 
 </details>
 
 ## Results
 
-### Dense
-🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-0.5B" target="_blank">Apollo2-0.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-1.5B" target="_blank">Apollo2-1.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-2B" target="_blank">Apollo2-2B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-3.8B" target="_blank">Apollo2-3.8B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-7B" target="_blank">Apollo2-7B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-9B" target="_blank">Apollo2-9B</a>
+#### Dense
+🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-0.5B" target="_blank">Apollo2-0.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-1.5B" target="_blank">Apollo2-1.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-2B" target="_blank">Apollo2-2B</a>
+
+🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-3.8B" target="_blank">Apollo2-3.8B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-7B" target="_blank">Apollo2-7B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-9B" target="_blank">Apollo2-9B</a>
 
 <details>
 <summary>Click to view the Dense Models Results</summary>
 
 ![ApolloMoE](assets/dense_results.png)
 
 </details>
 
-### Post-MoE
+#### Post-MoE
 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-MoE-0.5B" target="_blank">Apollo-MoE-0.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-MoE-1.5B" target="_blank">Apollo-MoE-1.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-MoE-7B" target="_blank">Apollo-MoE-7B</a>
 
 <details>
 <summary>Click to view the Post-MoE Models Results</summary>
 
 ![ApolloMoE](assets/post_moe_results.png)
 
 </details>
 
 ## Usage Format
-#### Apollo2
+##### Apollo2
 - 0.5B, 1.5B, 7B: User:{query}\nAssistant:{response}<|endoftext|>
 - 2B, 9B: User:{query}\nAssistant:{response}\<eos\>
 - 3.8B: <|user|>\n{query}<|end|><|assistant|>\n{response}<|end|>
 
-#### Apollo-MoE
+##### Apollo-MoE
 - 0.5B, 1.5B, 7B: User:{query}\nAssistant:{response}<|endoftext|>
 
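As an illustration of the Apollo2 template above, here is a minimal sketch (not the authors' official inference code) showing the "User:{query}\nAssistant:" format for the 0.5B/1.5B/7B variants with `transformers`; the example question and generation settings are assumptions.

```python
# Hedged sketch: apply the "User:{query}\nAssistant:" template from above.
# The model ID comes from the links in the Results section; the query and
# max_new_tokens are illustrative choices, not values from the repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FreedomIntelligence/Apollo2-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

query = "What are the first-line treatments for hypertension?"  # illustrative
prompt = f"User:{query}\nAssistant:"  # response ends with <|endoftext|>

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
# Strip the prompt tokens so only the model's response is printed.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```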
 ## Dataset & Evaluation
 
 - Dataset
@@ -139,12 +146,12 @@
 </details>
 
 - Evaluation
   🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloMoEBench" target="_blank">ApolloMoEBench</a>
 
   <details><summary>Click to expand</summary>
 
   - EN:
     - [MedQA-USMLE](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options)
     - [MedMCQA](https://huggingface.co/datasets/medmcqa/viewer/default/test)
@@ -180,17 +187,16 @@
 - PT: [BioInstructQA](https://huggingface.co/datasets/BioMistral/BioInstructQA): Portuguese part
 - RU: [RuMedBench](https://github.com/sb-ai-lab/MedBench)
 
 </details>
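To poke at the benchmark listed above before running a full evaluation, here is a minimal sketch, assuming ApolloMoEBench loads via the `datasets` library's default configuration; split and column names are not documented here and would need inspection on the Hub page.

```python
# Hedged sketch: load the benchmark and inspect its structure before use.
# Loading without a config name is an assumption; pass one if the Hub page lists several.
from datasets import load_dataset

bench = load_dataset("FreedomIntelligence/ApolloMoEBench")
print(bench)  # shows splits, column names, and row counts for inspection
```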
 
 ## Results reproduction
 <details><summary>Click to expand</summary>
 
 We take Apollo2-7B or Apollo-MoE-0.5B as an example.
 1. Download Dataset for project:
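For step 1, a minimal sketch, assuming the dataset is pulled as a full snapshot from the Hugging Face Hub; the local directory name is an arbitrary choice, not one the repo mandates.

```python
# Hedged sketch for step 1: fetch ApolloMoEDataset from the Hugging Face Hub.
# local_dir is an illustrative path, not prescribed by the project.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="FreedomIntelligence/ApolloMoEDataset",
    repo_type="dataset",
    local_dir="./ApolloMoEDataset",
)
```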
 
 