---
license: apache-2.0
datasets:
- HiTZ/casimedicos-squad
language:
- en
- es
- fr
- it
metrics:
- f1
pipeline_tag: token-classification
library_name: transformers
widget:
- text: >-
Paradoxical pulse is a drop in blood pressure > 10 mmHg during inspiration;
it represents an exaggeration of the physiological phenomenon consisting of
inspiratory lowering of BP (normal up to 10 mmHg). In cardiac tamponade,
inspiration, which causes an increase in blood flow to the right chambers,
increasing their volume, secondarily causes a displacement of the
interventricular septum to the left, so that the left heart lodges and
expels less blood during systole and the pulse, therefore, decreases. In a
normal heart this exaggerated displacement, caused by the pressure exerted
by the tamponade on the RV free wall, does not occur. Sinus X represents the
systolic collapse of the venous pulse, i.e., the pressure drop due to atrial
relaxation (also partly due to a downward displacement of the RV base during
systole). Sinus Y represents the diastolic collapse of the venous pulse,
i.e., the pressure drop that occurs from the moment blood enters the
tricuspid valve into the ventricle. In cardiac tamponade, the deep sinus X
is characteristic. In constrictive pericarditis, the deep Y sinus. For all
these reasons, the correct answer is 5.
---
# mDeBERTa-base for Multilingual Correct Explanation Extraction in the Medical Domain
This model is a fine-tuned version of [mdeberta-v3-base](https://huggingface.co/microsoft/mdeberta-v3-base) for a **novel extractive task**:
**identifying the explanation of the correct answer** as written by medical doctors. The model
was fine-tuned on the multilingual [casimedicos-squad](https://huggingface.co/datasets/HiTZ/casimedicos-squad) dataset,
which includes English, French, Italian, and Spanish.
## Performance
The model scores **74.64 F1 (partial match)**, as defined for the [SQuAD extractive QA task](https://huggingface.co/datasets/rajpurkar/squad_v2), averaged across the four languages.
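The partial-match F1 referenced above is the standard SQuAD token-overlap score between the predicted span and the gold span. A minimal sketch of the computation (simplified: it omits SQuAD's article and punctuation normalization):

```python
from collections import Counter

def squad_f1(prediction: str, gold: str) -> float:
    """Token-level (partial-match) F1 as used in SQuAD-style evaluation."""
    pred_tokens = prediction.lower().split()
    gold_tokens = gold.lower().split()
    # Multiset intersection counts each shared token at most as often
    # as it appears in both strings.
    num_same = sum((Counter(pred_tokens) & Counter(gold_tokens)).values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)
```

For example, a predicted span that covers 3 of the 6 gold tokens with no spurious tokens scores precision 1.0, recall 0.5, and F1 ≈ 0.67, even though it is not an exact match.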
### Fine-tuning hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 48
- eval_batch_size: 8
- seed: random
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20.0
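The hyperparameters above map roughly onto Hugging Face `TrainingArguments`; the sketch below is an approximation, with `output_dir` as a placeholder (the seed argument is omitted because the original run used a random seed):

```python
from transformers import TrainingArguments

# Approximate reconstruction of the fine-tuning configuration listed above.
args = TrainingArguments(
    output_dir="mdeberta-casimedicos",  # placeholder, not the original path
    learning_rate=5e-5,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=8,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20.0,
)
```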
### Framework versions
- Transformers 4.30.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.2
## Citation
If you use this model, please **cite the following paper**:
```bibtex
@misc{goenaga2023explanatory,
  title={Explanatory Argument Extraction of Correct Answers in Resident Medical Exams},
  author={Iakes Goenaga and Aitziber Atutxa and Koldo Gojenola and Maite Oronoz and Rodrigo Agerri},
  year={2023},
  eprint={2312.00567},
  archivePrefix={arXiv}
}
```
**Contact**: [Iakes Goenaga](http://www.hitz.eus/es/node/65) and [Rodrigo Agerri](https://ragerri.github.io/)
HiTZ Center - Ixa, University of the Basque Country UPV/EHU