NouRed committed
Commit 128da05
1 Parent(s): cd0f0fe

Click MedQSum

Files changed (1)
  1. README.md +3 -2
README.md CHANGED
@@ -44,8 +44,9 @@ library_name: transformers
 ---
 
 ## MedQSum
-
-<img src="https://raw.githubusercontent.com/zekaouinoureddine/MedQSum/master/assets/models.png" alt="drawing" width="600"/>
+<a href="https://github.com/zekaouinoureddine/MedQSum">
+<img src="https://raw.githubusercontent.com/zekaouinoureddine/MedQSum/master/assets/models.png" alt="drawing" width="600"/>
+</a>
 
 ## TL;DR
 **medqsum-bart-large-xsum-meqsum** is the best fine-tuned model in the paper [Enhancing Large Language Models' Utility for Medical Question-Answering: A Patient Health Question Summarization Approach](), which introduces a solution to get the most out of LLMs when answering health-related questions. We address the challenge of crafting accurate prompts by summarizing consumer health questions (CHQs) to generate clear and concise medical questions. Our approach involves fine-tuning Transformer-based models, including Flan-T5, in resource-constrained environments on three medical question summarization datasets.
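
Since the card describes a fine-tuned summarization checkpoint, a minimal usage sketch with the `transformers` summarization `pipeline` may help readers. The Hub repo id `NouRed/medqsum-bart-large-xsum-meqsum` and the sample consumer health question are assumptions for illustration; they are not stated in the diff above.

```python
from transformers import pipeline

# Assumed Hub repo id for the checkpoint described in this card.
summarizer = pipeline(
    "summarization",
    model="NouRed/medqsum-bart-large-xsum-meqsum",
)

# Illustrative consumer health question (CHQ); any long patient question works.
chq = (
    "SUBJECT: blood pressure medication "
    "MESSAGE: I have been taking my blood pressure medication for two years "
    "and lately I feel dizzy in the mornings. Could the dosage be too high, "
    "and should I ask my doctor about switching to a different medication?"
)

# Condense the verbose CHQ into a short, focused medical question.
result = summarizer(chq, max_length=32, num_beams=4, truncation=True)
print(result[0]["summary_text"])
```

The `max_length` and `num_beams` values are illustrative generation settings, not figures from the paper.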