Adapters / xlm-roberta

lenglaender committed on
Commit 0d104cb
1 Parent(s): 44990dc

Upload model

Files changed (1):
  1. README.md +17 -32

README.md CHANGED
@@ -4,72 +4,57 @@ tags:
  - xlm-roberta
  datasets:
  - UKPLab/m2qa
- - rajpurkar/squad_v2
  ---
 
- # M2QA Adapter: QA Head for MAD-X+Domain Setup
- This adapter is part of the M2QA publication to achieve language and domain transfer via adapters.
- 📃 Paper: [TODO](TODO)
- 🏗️ GitHub repo: [https://github.com/UKPLab/m2qa](https://github.com/UKPLab/m2qa)
- 💾 Hugging Face Dataset: [https://huggingface.co/UKPLab/m2qa](https://huggingface.co/UKPLab/m2qa)
 
- **Important:** This adapter only works together with the MAD-X language adapters and the M2QA MAD-X-Domain adapters. This QA adapter was trained on the SQuAD v2 dataset.
-
- This is an [adapter](https://adapterhub.ml) for the `xlm-roberta-base` model, trained using the **[Adapters](https://github.com/Adapter-Hub/adapters)** library. For training details, see our paper or our GitHub repository: [https://github.com/UKPLab/m2qa](https://github.com/UKPLab/m2qa). Evaluation results for this adapter on the M2QA dataset can be found in the GitHub repo and in the paper.
 
  ## Usage
 
- First, install `adapters`:
 
  ```
- pip install -U adapters
  ```
 
 
  Now, the adapter can be loaded and activated like this:
 
  ```python
- from adapters import AutoAdapterModel
- from adapters.composition import Stack
 
  model = AutoAdapterModel.from_pretrained("xlm-roberta-base")
-
- # 1. Load language adapter
- language_adapter_name = model.load_adapter("de/wiki@ukp")  # MAD-X+Domain uses the MAD-X language adapter
-
- # 2. Load domain adapter
- domain_adapter_name = model.load_adapter("AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-news")
-
- # 3. Load QA head adapter
- qa_adapter_name = model.load_adapter("AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-qa-head")
-
- # 4. Activate them via the adapter stack
- model.active_adapters = Stack(language_adapter_name, domain_adapter_name, qa_adapter_name)
  ```
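Conceptually, each adapter in such a stack is a small bottleneck module inserted into every transformer layer, and the stack chains them so each adapter refines the previous one's output. The following is a minimal numpy sketch of this idea only, not the Adapters library's internals; the dimensions, weights, and ReLU non-linearity are illustrative assumptions:

```python
# Conceptual sketch (hypothetical sizes/weights, NOT the Adapters library):
# a bottleneck adapter down-projects the hidden state, applies a
# non-linearity, up-projects, and adds a residual connection; a stack
# applies language, domain, and task adapters in sequence.
import numpy as np

rng = np.random.default_rng(0)
hidden_dim, bottleneck_dim = 8, 2  # illustrative dimensions

def make_adapter():
    down = rng.normal(scale=0.1, size=(hidden_dim, bottleneck_dim))
    up = rng.normal(scale=0.1, size=(bottleneck_dim, hidden_dim))
    def adapter(h):
        # residual + up-projected ReLU bottleneck
        return h + np.maximum(h @ down, 0.0) @ up
    return adapter

language_adapter = make_adapter()
domain_adapter = make_adapter()
qa_adapter = make_adapter()

def stacked(h):
    # Stack(language, domain, qa): each adapter transforms the previous output
    return qa_adapter(domain_adapter(language_adapter(h)))

h = rng.normal(size=(1, hidden_dim))
print(stacked(h).shape)  # (1, 8)
```

Because each adapter preserves the hidden dimension, any number of adapters can be chained this way without changing the backbone.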
 
  See our repository for more information: https://github.com/UKPLab/m2qa/tree/main/Experiments/mad-x-domain
 
- ## Contact
- Leon Engländer:
- - [HuggingFace Profile](https://huggingface.co/lenglaender)
- - [GitHub](https://github.com/lenglaender)
- - [Twitter](https://x.com/LeonEnglaender)
 
  ## Citation
 
  ```
  @article{englaender-etal-2024-m2qa,
  title="M2QA: Multi-domain Multilingual Question Answering",
- author={Engl{"a}nder, Leon and
  Sterz, Hannah and
  Poth, Clifton and
  Pfeiffer, Jonas and
  Kuznetsov, Ilia and
  Gurevych, Iryna},
  journal={arXiv preprint},
- url={TODO}
  year="2024"
  }
  ```
 
  - xlm-roberta
  datasets:
  - UKPLab/m2qa
  ---
 
+ # Adapter `AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-qa-head` for xlm-roberta-base
 
 
 
 
 
+ An [adapter](https://adapterhub.ml) for the `xlm-roberta-base` model that was trained on the [UKPLab/m2qa](https://huggingface.co/datasets/UKPLab/m2qa/) dataset and includes a prediction head for question answering.
 
+ This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.
 
  ## Usage
 
+ First, install `adapter-transformers`:
 
  ```
+ pip install -U adapter-transformers
  ```
+ _Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. [More](https://docs.adapterhub.ml/installation.html)_
 
  Now, the adapter can be loaded and activated like this:
 
  ```python
+ from transformers import AutoAdapterModel
 
  model = AutoAdapterModel.from_pretrained("xlm-roberta-base")
+ adapter_name = model.load_adapter("AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-qa-head", source="hf", set_active=True)
  ```
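The question-answering head loaded above predicts a start and an end logit for each context token, and an answer span is chosen as the pair of positions that maximizes their combined score. A minimal illustration of that decoding step with made-up logits follows; this is a sketch, not the library's built-in post-processing:

```python
# Illustration only: how extractive QA turns per-token start/end logits
# into an answer span. The logits below are invented; a real model
# produces one start and one end logit per context token.

def best_span(start_logits, end_logits, max_answer_len=15):
    """Return (start, end) token indices maximizing start+end score,
    with end >= start and a bounded answer length."""
    best = (0, 0)
    best_score = float("-inf")
    for s, s_logit in enumerate(start_logits):
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score = score
                best = (s, e)
    return best

start_logits = [0.1, 2.5, 0.3, 0.2, 1.9]
end_logits = [0.0, 0.4, 3.1, 0.2, 0.1]
print(best_span(start_logits, end_logits))  # (1, 2)
```

The answer text is then recovered by mapping the chosen token indices back to character offsets in the original context.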
 
+ ## Architecture & Training
+
 
  See our repository for more information: https://github.com/UKPLab/m2qa/tree/main/Experiments/mad-x-domain
 
+ ## Evaluation results
+
+ <!-- Add some description here -->
 
  ## Citation
 
+
  ```
  @article{englaender-etal-2024-m2qa,
  title="M2QA: Multi-domain Multilingual Question Answering",
+ author={Engl{\"a}nder, Leon and
  Sterz, Hannah and
  Poth, Clifton and
  Pfeiffer, Jonas and
  Kuznetsov, Ilia and
  Gurevych, Iryna},
  journal={arXiv preprint},
+ url="https://arxiv.org/abs/2407.01091",
+ month = jul,
  year="2024"
  }
  ```