
This checkpoint has been fine-tuned for named entity recognition (NER) on the CoNLL-2002 Spanish (conll2002-es) dataset.

It was created from Bertin Gaussian 512, a RoBERTa-base model trained from scratch in Spanish. Information on the base model is available on its own model card, with further detail on the main project card.
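As a minimal usage sketch, the checkpoint can be loaded with the Transformers `pipeline` API. The example sentence and the `aggregation_strategy` choice below are illustrative assumptions, not taken from this card:

```python
from transformers import pipeline

# Load the fine-tuned NER checkpoint by its Hub model ID (from this card).
ner = pipeline(
    "token-classification",
    model="bertin-project/bertin-base-ner-conll2002-es",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

# Illustrative Spanish input; prints a list of dicts with entity_group,
# score, word, start and end offsets for each detected entity.
print(ner("La sede de Google en España está en Madrid."))
```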

The training dataset for the base model is mC4, subsampled to a total of about 50 million examples. Sampling is biased towards average perplexity values (using a Gaussian function), discarding more often documents with very high perplexity (poor quality) or very low perplexity (short, repetitive texts).
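A rough sketch of this Gaussian perplexity sampling is shown below; the function shape and the example values are illustrative assumptions, and the actual subsampling procedure is described on the main project card:

```python
import numpy as np

def keep_probability(perplexity: np.ndarray, mean: float, std: float) -> np.ndarray:
    """Gaussian weight centred on the average perplexity: documents near the
    mean are kept most often, while very high (noisy) or very low (short,
    repetitive) perplexities are discarded more often."""
    return np.exp(-0.5 * ((perplexity - mean) / std) ** 2)

# Hypothetical per-document perplexities standing in for mC4 statistics.
perplexities = np.array([60.0, 180.0, 210.0, 950.0, 15.0])
mean, std = perplexities.mean(), perplexities.std()

rng = np.random.default_rng(seed=0)
keep = rng.random(perplexities.size) < keep_probability(perplexities, mean, std)
print(keep)  # boolean mask selecting the subsampled documents
```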

This model is part of the Flax/JAX Community Week, organised by HuggingFace, with TPU usage sponsored by Google.

