
Model Card for distilbert-base-uncased-snli

Model Details

Model Description

A version of distilbert/distilbert-base-uncased fine-tuned on the stanford-nlp/snli dataset.

  • Developed by: Karl Weinmeister
  • Language(s) (NLP): en
  • License: apache-2.0
  • Finetuned from model: distilbert/distilbert-base-uncased
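
A minimal inference sketch using the transformers library. The label order below (entailment, neutral, contradiction) follows the standard SNLI convention and is an assumption; verify it against the checkpoint's `config.id2label` before relying on it:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "kweinmeister/distilbert-base-uncased-snli"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# SNLI inputs are sentence pairs: premise first, hypothesis second.
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Assumed SNLI label order; confirm with model.config.id2label.
labels = ["entailment", "neutral", "contradiction"]
pred = labels[logits.argmax(dim=-1).item()]
print(pred)
```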

Training Hyperparameters

  • Training regime: The model was trained for 5 epochs with a batch size of 128.
Model size: 67M params
Tensor type: F32 (Safetensors)

Model tree for kweinmeister/distilbert-base-uncased-snli

Finetuned
(6692)
this model
Merges
1 model
