
Base model: RoBERTa

Configs:

  • Vocab size: 10,000
  • Hidden size: 512
  • Max position embeddings: 512
  • Number of layers: 2
  • Number of heads: 4
  • Window size: 256
  • Intermediate size: 1024
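A minimal sketch of these hyperparameters as a transformers RobertaConfig. Two values are assumptions rather than facts from the published config: hidden_act="relu" is inferred from the model name, and window_size is not a stock RobertaConfig field, so it is only stored as an extra attribute here.

```python
from transformers import RobertaConfig

# Sketch of the configuration listed above.
# Assumptions: hidden_act="relu" is inferred from the model name;
# window_size is not a stock RoBERTa field and is merely stored on the
# config object as an extra attribute.
config = RobertaConfig(
    vocab_size=10_000,
    hidden_size=512,
    max_position_embeddings=512,
    num_hidden_layers=2,
    num_attention_heads=4,
    intermediate_size=1024,
    hidden_act="relu",
    window_size=256,
)
print(config)
```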

Results:

  • Task: GLUE, Score: 57.91, Confidence Interval: [56.98, 58.87]
  • Task: BLiMP, Score: 58.40, Confidence Interval: [57.37, 59.23]
Mask token: <mask>
The model can be loaded through the Hugging Face Inference API (serverless) or locally with the transformers library.
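A minimal local usage sketch with the transformers pipeline API, assuming the checkpoint and its tokenizer are published on the Hub under the id AISE-TUDelft/Custom-Activations-BERT-ReLU; the example sentence is illustrative only.

```python
from transformers import pipeline

# Fill-mask sketch; assumes the checkpoint and tokenizer are available on
# the Hugging Face Hub under the id shown on this page.
fill_mask = pipeline("fill-mask", model="AISE-TUDelft/Custom-Activations-BERT-ReLU")

# The model's mask token is <mask> (RoBERTa convention).
for prediction in fill_mask("The capital of France is <mask>."):
    print(prediction["token_str"], round(prediction["score"], 4))
```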
