DistilBERT
Fine-tuned model pruned to 1:4 structured sparsity. The model can be used for inference with sparsity optimization. Further details on the model and its usage will be available soon.
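As a rough illustration of the sparsity pattern (not the actual pruning recipe used to produce this model), 1:4 structured sparsity keeps one non-zero weight in every contiguous group of four, zeroing the rest. A minimal numpy sketch, using magnitude-based selection as an assumed criterion:

```python
import numpy as np

def prune_1_to_4(weights: np.ndarray) -> np.ndarray:
    """Keep only the largest-magnitude weight in each group of 4
    consecutive weights (1:4 structured sparsity, 75% zeros)."""
    flat = weights.reshape(-1, 4)           # split into groups of 4
    keep = np.abs(flat).argmax(axis=1)      # survivor index per group
    mask = np.zeros_like(flat)
    mask[np.arange(flat.shape[0]), keep] = 1
    return (flat * mask).reshape(weights.shape)

w = np.array([[0.1, -0.9, 0.2, 0.05],
              [0.4, 0.3, -0.8, 0.1]])
print(prune_1_to_4(w))
# → [[ 0.  -0.9  0.   0. ]
#    [ 0.   0.  -0.8  0. ]]
```

The fixed groups-of-four layout is what makes the sparsity "structured": hardware and runtime kernels can exploit the regular pattern, unlike unstructured (element-wise) pruning.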
We get the following results on the SQuAD v1.1 development set:

| Task | F1 |
| --- | --- |
| SQuADv1.1 | 87.00 |