Update README.md
README.md

The training and evaluation code is available at the official [TripPy repository](https://gitlab.cs.uni-duesseldorf.de/general/dsml/trippy-public).

The model was trained on MultiWOZ 2.1 data via supervised learning using the [TripPy codebase](https://gitlab.cs.uni-duesseldorf.de/general/dsml/trippy-public).
MultiWOZ 2.1 data was loaded via ConvLab-3's unified data format dataloader.
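
The sketch below is only an illustration of what that loading step might look like; the `load_dataset` / `load_ontology` helpers and the turn-level field names are assumptions about ConvLab-3's unified data format API, not code taken from this model card.

```python
# Minimal sketch (not this model's training code): pulling MultiWOZ 2.1 through
# ConvLab-3's unified data format. The helper names and turn fields below are
# assumptions about the ConvLab-3 API.
from convlab.util import load_dataset, load_ontology

dataset = load_dataset("multiwoz21")    # dict with "train" / "validation" / "test" splits
ontology = load_ontology("multiwoz21")  # domains, slots and candidate values

# Peek at the first training dialogue: user turns carry the accumulated dialogue state.
for turn in dataset["train"][0]["turns"]:
    if turn["speaker"] == "user":
        print(turn["utterance"])
        print(turn["state"])
```
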
The pre-trained encoder is [RoBERTa](https://huggingface.co/docs/transformers/model_doc/roberta) (base).
Fine-tuning the encoder and training the DST-specific classification heads was conducted for 10 epochs.
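
As a rough illustration of that setup, the sketch below loads the roberta-base encoder with Hugging Face Transformers and attaches a single placeholder classification head. The real DST heads (slot gates, span pointers, etc.), training loop, and hyperparameters live in the TripPy codebase linked above; everything here besides the roberta-base checkpoint is illustrative.

```python
# Minimal sketch, not the actual TripPy implementation: a roberta-base encoder
# with one placeholder classification head, fine-tuned end to end.
import torch
from transformers import RobertaModel, RobertaTokenizerFast

tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")
encoder = RobertaModel.from_pretrained("roberta-base")

num_labels = 3  # hypothetical label count; TripPy uses several DST-specific heads
head = torch.nn.Linear(encoder.config.hidden_size, num_labels)

optimizer = torch.optim.AdamW(
    list(encoder.parameters()) + list(head.parameters()),
    lr=1e-5,  # illustrative value only; see the hyperparameters section below
)

# One forward pass; in training this would run over the full dataset for 10 epochs.
batch = tokenizer("i need a cheap hotel in the centre", return_tensors="pt")
pooled = encoder(**batch).last_hidden_state[:, 0]  # first-token representation
logits = head(pooled)
```
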
### Training hyperparameters