---
license: mit
base_model: camembert/camembert-large
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: NERmembert-large-3entities
results: []
datasets:
- CATIE-AQ/frenchNER_3entities
language:
- fr
widget:
- text: >-
Le dévoilement du logo officiel des JO s'est déroulé le 21 octobre 2019 au
Grand Rex. Ce nouvel emblème et cette nouvelle typographie ont été conçus
par le designer Sylvain Boyer avec les agences Royalties & Ecobranding.
Rond, il rassemble trois symboles : une médaille d'or, la flamme olympique
et Marianne, symbolisée par un visage de femme mais privée de son bonnet
phrygien caractéristique. La typographie dessinée fait référence à l'Art
déco, mouvement artistique des années 1920, décennie pendant laquelle ont eu
lieu pour la dernière fois les Jeux olympiques à Paris en 1924. Pour la
première fois, ce logo sera unique pour les Jeux olympiques et les Jeux
paralympiques.
library_name: transformers
pipeline_tag: token-classification
co2_eq_emissions: 90
new_version: CATIE-AQ/NERmemberta-3entities
---
# NERmembert-large-3entities
## Model Description
We present **NERmembert-large-3entities**, a [CamemBERT large](https://huggingface.co/camembert/camembert-large) model fine-tuned for Named Entity Recognition in French on five French NER datasets covering three entities (LOC, PER, ORG).
All these datasets were concatenated and cleaned into a single dataset that we call [frenchNER_3entities](https://huggingface.co/datasets/CATIE-AQ/frenchNER_3entities).
This represents a total of **420,264 rows, of which 346,071 are for training, 32,951 for validation and 41,242 for testing.**
Our methodology is described in a blog post available in [English](https://blog.vaniila.ai/en/NER_en/) or [French](https://blog.vaniila.ai/NER/).
## Dataset
The dataset used is [frenchNER_3entities](https://huggingface.co/datasets/CATIE-AQ/frenchNER_3entities), which contains ~420k French sentences annotated with four categories (PER, LOC, ORG, plus O for non-entity tokens):
| Label | Examples |
|:------|:-----------------------------------------------------------|
| PER | "La Bruyère", "Gaspard de Coligny", "Wittgenstein" |
| ORG | "UTBM", "American Airlines", "id Software" |
| LOC | "République du Cap-Vert", "Créteil", "Bordeaux" |
The distribution of the entities is as follows:
<table>
<thead>
<tr>
<th><br>Splits</th>
<th><br>O</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
</tr>
</thead>
<tbody>
<tr>
<td><br>train</td>
<td><br>8,398,765</td>
<td><br>327,393</td>
<td><br>303,722</td>
<td><br>151,490</td>
</tr>
<tr>
<td><br>validation</td>
<td><br>592,815</td>
<td><br>34,127</td>
<td><br>30,279</td>
<td><br>18,743</td>
</tr>
<tr>
<td><br>test</td>
<td><br>773,871</td>
<td><br>43,634</td>
<td><br>39,195</td>
<td><br>21,391</td>
</tr>
</tbody>
</table>
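The class imbalance visible in the table above can be quantified in a few lines of Python. This is an illustrative calculation (not part of the original card), using the train-split token counts copied from the table:

```python
# Token counts for the train split, copied from the table above
train_counts = {"O": 8_398_765, "PER": 327_393, "LOC": 303_722, "ORG": 151_490}

total = sum(train_counts.values())
entity_tokens = total - train_counts["O"]

# Share of training tokens that belong to a named entity
entity_share = entity_tokens / total
print(f"{entity_share:.1%} of training tokens are entity tokens")  # → 8.5%
```

This imbalance is why per-entity F1 (rather than plain token accuracy) is the headline metric in the tables below.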
## Evaluation results
The evaluation was carried out using the [**evaluate**](https://pypi.org/project/evaluate/) Python package.
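The summary tables report F1, the harmonic mean of precision and recall. As a sanity check, any summary cell can be recomputed from the precision and recall given in the full results (here the PER scores reported for NERmembert-base-3entities on frenchNER_3entities):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# PER precision/recall of NERmembert-base-3entities on frenchNER_3entities
print(round(f1(0.961, 0.972), 3))  # → 0.966
```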
### frenchNER_3entities
For space reasons, we show only the F1 of the different models. You can see the full results below the table.
<table>
<thead>
<tr>
<th><br>Model</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="1"><br><a href="https://hf.co/Jean-Baptiste/camembert-ner">Jean-Baptiste/camembert-ner</a></td>
<td><br>0.941</td>
<td><br>0.883</td>
<td><br>0.658</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/cmarkea/distilcamembert-base-ner">cmarkea/distilcamembert-base-ner</a></td>
<td><br>0.942</td>
<td><br>0.882</td>
<td><br>0.647</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-3entities">NERmembert-base-3entities</a></td>
<td><br>0.966</td>
<td><br>0.940</td>
<td><br>0.876</td>
</tr>
<tr>
<td rowspan="1"><br>NERmembert-large-3entities (this model)</td>
<td><br><b>0.969</b></td>
<td><br><b>0.947</b></td>
<td><br><b>0.890</b></td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-4entities">NERmembert-base-4entities</a></td>
<td><br>0.951</td>
<td><br>0.894</td>
<td><br>0.671</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-large-4entities">NERmembert-large-4entities</a></td>
<td><br>0.958</td>
<td><br>0.901</td>
<td><br>0.685</td>
</tr>
</tbody>
</table>
<details>
<summary>Full results</summary>
<table>
<thead>
<tr>
<th><br>Model</th>
<th><br>Metrics</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
<th><br>O</th>
<th><br>Overall</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="3"><br><a href="https://hf.co/Jean-Baptiste/camembert-ner">Jean-Baptiste/camembert-ner</a></td>
<td><br>Precision</td>
<td><br>0.918</td>
<td><br>0.860</td>
<td><br>0.831</td>
<td><br>0.992</td>
<td><br>0.974</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.964</td>
<td><br>0.908</td>
<td><br>0.544</td>
<td><br>0.964</td>
<td><br>0.948</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.941</td>
<td><br>0.883</td>
<td><br>0.658</td>
<td><br>0.978</td>
<td><br>0.961</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/cmarkea/distilcamembert-base-ner">cmarkea/distilcamembert-base-ner</a></td>
<td><br>Precision</td>
<td><br>0.929</td>
<td><br>0.861</td>
<td><br>0.813</td>
<td><br>0.991</td>
<td><br>0.974</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.956</td>
<td><br>0.905</td>
<td><br>0.537</td>
<td><br>0.965</td>
<td><br>0.948</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.942</td>
<td><br>0.882</td>
<td><br>0.647</td>
<td><br>0.978</td>
<td><br>0.961</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-3entities">NERmembert-base-3entities</a></td>
<td><br>Precision</td>
<td><br>0.961</td>
<td><br>0.935</td>
<td><br>0.877</td>
<td><br>0.995</td>
<td><br>0.986</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.972</td>
<td><br>0.946</td>
<td><br>0.876</td>
<td><br>0.994</td>
<td><br>0.986</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.966</td>
<td><br>0.940</td>
<td><br>0.876</td>
<td><br>0.994</td>
<td><br>0.986</td>
</tr>
<tr>
<td rowspan="3"><br>NERmembert-large-3entities (this model)</td>
<td><br>Precision</td>
<td><br>0.966</td>
<td><br>0.944</td>
<td><br>0.884</td>
<td><br>0.996</td>
<td><br>0.987</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.972</td>
<td><br>0.950</td>
<td><br>0.896</td>
<td><br>0.994</td>
<td><br>0.987</td>
</tr>
<tr>
<td>F1</td>
<td><br><b>0.969</b></td>
<td><br><b>0.947</b></td>
<td><br><b>0.890</b></td>
<td><br><b>0.995</b></td>
<td><br><b>0.987</b></td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-4entities">NERmembert-base-4entities</a></td>
<td><br>Precision</td>
<td><br>0.946</td>
<td><br>0.884</td>
<td><br>0.859</td>
<td><br>0.993</td>
<td><br>0.971</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.955</td>
<td><br>0.904</td>
<td><br>0.550</td>
<td><br>0.993</td>
<td><br>0.971</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.951</td>
<td><br>0.894</td>
<td><br>0.671</td>
<td><br>0.988</td>
<td><br>0.971</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-large-4entities">NERmembert-large-4entities</a></td>
<td><br>Precision</td>
<td><br>0.955</td>
<td><br>0.896</td>
<td><br>0.866</td>
<td><br>0.983</td>
<td><br>0.974</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.960</td>
<td><br>0.906</td>
<td><br>0.567</td>
<td><br>0.994</td>
<td><br>0.974</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.958</td>
<td><br>0.901</td>
<td><br>0.685</td>
<td><br>0.988</td>
<td><br>0.974</td>
</tr>
</tbody>
</table>
</details>
The results on each constituent dataset are detailed below.
### multiconer
For space reasons, we show only the F1 of the different models. You can see the full results below the table.
<table>
<thead>
<tr>
<th><br>Model</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="1"><br><a href="https://hf.co/Jean-Baptiste/camembert-ner">Jean-Baptiste/camembert-ner</a></td>
<td><br>0.940</td>
<td><br>0.761</td>
<td><br>0.723</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/cmarkea/distilcamembert-base-ner">cmarkea/distilcamembert-base-ner</a></td>
<td><br>0.921</td>
<td><br>0.748</td>
<td><br>0.694</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-3entities">NERmembert-base-3entities</a></td>
<td><br>0.960</td>
<td><br>0.887</td>
<td><br>0.876</td>
</tr>
<tr>
<td rowspan="1"><br>NERmembert-large-3entities (this model)</td>
<td><br><b>0.965</b></td>
<td><br><b>0.902</b></td>
<td><br><b>0.896</b></td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-4entities">NERmembert-base-4entities</a></td>
<td><br>0.960</td>
<td><br>0.890</td>
<td><br>0.867</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-large-4entities">NERmembert-large-4entities</a></td>
<td><br>0.969</td>
<td><br>0.919</td>
<td><br>0.904</td>
</tr>
</tbody>
</table>
<details>
<summary>Full results</summary>
<table>
<thead>
<tr>
<th><br>Model</th>
<th><br>Metrics</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
<th><br>O</th>
<th><br>Overall</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="3"><br><a href="https://hf.co/Jean-Baptiste/camembert-ner">Jean-Baptiste/camembert-ner</a></td>
<td><br>Precision</td>
<td><br>0.908</td>
<td><br>0.717</td>
<td><br>0.753</td>
<td><br>0.987</td>
<td><br>0.947</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.975</td>
<td><br>0.811</td>
<td><br>0.696</td>
<td><br>0.878</td>
<td><br>0.880</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.940</td>
<td><br>0.761</td>
<td><br>0.723</td>
<td><br>0.929</td>
<td><br>0.912</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/cmarkea/distilcamembert-base-ner">cmarkea/distilcamembert-base-ner</a></td>
<td><br>Precision</td>
<td><br>0.885</td>
<td><br>0.738</td>
<td><br>0.737</td>
<td><br>0.983</td>
<td><br>0.943</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.960</td>
<td><br>0.759</td>
<td><br>0.655</td>
<td><br>0.882</td>
<td><br>0.877</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.921</td>
<td><br>0.748</td>
<td><br>0.694</td>
<td><br>0.930</td>
<td><br>0.909</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-3entities">NERmembert-base-3entities</a></td>
<td><br>Precision</td>
<td><br>0.957</td>
<td><br>0.894</td>
<td><br>0.876</td>
<td><br>0.986</td>
<td><br>0.972</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.962</td>
<td><br>0.880</td>
<td><br>0.878</td>
<td><br>0.985</td>
<td><br>0.972</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.960</td>
<td><br>0.887</td>
<td><br>0.876</td>
<td><br>0.985</td>
<td><br>0.972</td>
</tr>
<tr>
<td rowspan="3"><br>NERmembert-large-3entities (this model)</td>
<td><br>Precision</td>
<td><br>0.960</td>
<td><br>0.903</td>
<td><br>0.916</td>
<td><br>0.987</td>
<td><br>0.976</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.969</td>
<td><br>0.900</td>
<td><br>0.877</td>
<td><br>0.987</td>
<td><br>0.976</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.965</td>
<td><br>0.902</td>
<td><br>0.896</td>
<td><br>0.987</td>
<td><br>0.976</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-4entities">NERmembert-base-4entities</a></td>
<td><br>Precision</td>
<td><br>0.954</td>
<td><br>0.893</td>
<td><br>0.851</td>
<td><br>0.988</td>
<td><br>0.972</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.967</td>
<td><br>0.887</td>
<td><br>0.883</td>
<td><br>0.984</td>
<td><br>0.972</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.960</td>
<td><br>0.890</td>
<td><br>0.867</td>
<td><br>0.986</td>
<td><br>0.972</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-large-4entities">NERmembert-large-4entities</a></td>
<td><br>Precision</td>
<td><br>0.964</td>
<td><br>0.922</td>
<td><br>0.904</td>
<td><br>0.990</td>
<td><br>0.978</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.975</td>
<td><br>0.917</td>
<td><br>0.904</td>
<td><br>0.988</td>
<td><br>0.978</td>
</tr>
<tr>
<td>F1</td>
<td><br><b>0.969</b></td>
<td><br><b>0.919</b></td>
<td><br><b>0.904</b></td>
<td><br><b>0.989</b></td>
<td><br><b>0.978</b></td>
</tr>
</tbody>
</table>
</details>
### multinerd
For space reasons, we show only the F1 of the different models. You can see the full results below the table.
<table>
<thead>
<tr>
<th><br>Model</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="1"><br><a href="https://hf.co/Jean-Baptiste/camembert-ner">Jean-Baptiste/camembert-ner</a></td>
<td><br>0.962</td>
<td><br>0.934</td>
<td><br>0.888</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/cmarkea/distilcamembert-base-ner">cmarkea/distilcamembert-base-ner</a></td>
<td><br>0.972</td>
<td><br>0.938</td>
<td><br>0.884</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-3entities">NERmembert-base-3entities</a></td>
<td><br>0.985</td>
<td><br>0.973</td>
<td><br>0.938</td>
</tr>
<tr>
<td rowspan="1"><br>NERmembert-large-3entities (this model)</td>
<td><br><b>0.987</b></td>
<td><br><b>0.979</b></td>
<td><br><b>0.953</b></td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-4entities">NERmembert-base-4entities</a></td>
<td><br>0.985</td>
<td><br>0.973</td>
<td><br>0.938</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-large-4entities">NERmembert-large-4entities</a></td>
<td><br><b>0.987</b></td>
<td><br>0.976</td>
<td><br>0.948</td>
</tr>
</tbody>
</table>
<details>
<summary>Full results</summary>
<table>
<thead>
<tr>
<th><br>Model</th>
<th><br>Metrics</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
<th><br>O</th>
<th><br>Overall</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="3"><br><a href="https://hf.co/Jean-Baptiste/camembert-ner">Jean-Baptiste/camembert-ner</a></td>
<td><br>Precision</td>
<td><br>0.931</td>
<td><br>0.893</td>
<td><br>0.827</td>
<td><br>0.999</td>
<td><br>0.988</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.994</td>
<td><br>0.980</td>
<td><br>0.959</td>
<td><br>0.973</td>
<td><br>0.974</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.962</td>
<td><br>0.934</td>
<td><br>0.888</td>
<td><br>0.986</td>
<td><br>0.981</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/cmarkea/distilcamembert-base-ner">cmarkea/distilcamembert-base-ner</a></td>
<td><br>Precision</td>
<td><br>0.954</td>
<td><br>0.908</td>
<td><br>0.817</td>
<td><br>0.999</td>
<td><br>0.990</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.991</td>
<td><br>0.969</td>
<td><br>0.963</td>
<td><br>0.975</td>
<td><br>0.975</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.972</td>
<td><br>0.938</td>
<td><br>0.884</td>
<td><br>0.987</td>
<td><br>0.983</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-3entities">NERmembert-base-3entities</a></td>
<td><br>Precision</td>
<td><br>0.974</td>
<td><br>0.965</td>
<td><br>0.910</td>
<td><br>0.999</td>
<td><br>0.995</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.995</td>
<td><br>0.981</td>
<td><br>0.968</td>
<td><br>0.996</td>
<td><br>0.995</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.985</td>
<td><br>0.973</td>
<td><br>0.938</td>
<td><br>0.998</td>
<td><br>0.995</td>
</tr>
<tr>
<td rowspan="3"><br>NERmembert-large-3entities (this model)</td>
<td><br>Precision</td>
<td><br>0.979</td>
<td><br>0.970</td>
<td><br>0.927</td>
<td><br>0.999</td>
<td><br>0.996</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.996</td>
<td><br>0.987</td>
<td><br>0.980</td>
<td><br>0.997</td>
<td><br>0.996</td>
</tr>
<tr>
<td>F1</td>
<td><br><b>0.987</b></td>
<td><br><b>0.979</b></td>
<td><br><b>0.953</b></td>
<td><br><b>0.998</b></td>
<td><br><b>0.996</b></td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-4entities">NERmembert-base-4entities</a></td>
<td><br>Precision</td>
<td><br>0.976</td>
<td><br>0.961</td>
<td><br>0.910</td>
<td><br>0.999</td>
<td><br>0.995</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.994</td>
<td><br>0.985</td>
<td><br>0.967</td>
<td><br>0.996</td>
<td><br>0.995</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.985</td>
<td><br>0.973</td>
<td><br>0.938</td>
<td><br>0.998</td>
<td><br>0.995</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-large-4entities">NERmembert-large-4entities</a></td>
<td><br>Precision</td>
<td><br>0.979</td>
<td><br>0.967</td>
<td><br>0.922</td>
<td><br>0.999</td>
<td><br>0.996</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.996</td>
<td><br>0.986</td>
<td><br>0.974</td>
<td><br>0.997</td>
<td><br>0.996</td>
</tr>
<tr>
<td>F1</td>
<td><br><b>0.987</b></td>
<td><br>0.976</td>
<td><br>0.948</td>
<td><br>0.998</td>
<td><br>0.996</td>
</tr>
</tbody>
</table>
</details>
### wikiner
For space reasons, we show only the F1 of the different models. You can see the full results below the table.
<table>
<thead>
<tr>
<th><br>Model</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="1"><br><a href="https://hf.co/Jean-Baptiste/camembert-ner">Jean-Baptiste/camembert-ner</a></td>
<td><br><b>0.986</b></td>
<td><br><b>0.966</b></td>
<td><br><b>0.938</b></td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/cmarkea/distilcamembert-base-ner">cmarkea/distilcamembert-base-ner</a></td>
<td><br>0.983</td>
<td><br>0.964</td>
<td><br>0.925</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-3entities">NERmembert-base-3entities</a></td>
<td><br>0.969</td>
<td><br>0.945</td>
<td><br>0.878</td>
</tr>
<tr>
<td rowspan="1"><br>NERmembert-large-3entities (this model)</td>
<td><br>0.972</td>
<td><br>0.950</td>
<td><br>0.893</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-4entities">NERmembert-base-4entities</a></td>
<td><br>0.970</td>
<td><br>0.945</td>
<td><br>0.876</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-large-4entities">NERmembert-large-4entities</a></td>
<td><br>0.975</td>
<td><br>0.953</td>
<td><br>0.896</td>
</tr>
</tbody>
</table>
<details>
<summary>Full results</summary>
<table>
<thead>
<tr>
<th><br>Model</th>
<th><br>Metrics</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
<th><br>O</th>
<th><br>Overall</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="3"><br><a href="https://hf.co/Jean-Baptiste/camembert-ner">Jean-Baptiste/camembert-ner</a></td>
<td><br>Precision</td>
<td><br>0.986</td>
<td><br>0.962</td>
<td><br>0.925</td>
<td><br>0.999</td>
<td><br>0.994</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.987</td>
<td><br>0.969</td>
<td><br>0.951</td>
<td><br>0.965</td>
<td><br>0.967</td>
</tr>
<tr>
<td>F1</td>
<td><br><b>0.986</b></td>
<td><br><b>0.966</b></td>
<td><br><b>0.938</b></td>
<td><br><b>0.982</b></td>
<td><br><b>0.980</b></td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/cmarkea/distilcamembert-base-ner">cmarkea/distilcamembert-base-ner</a></td>
<td><br>Precision</td>
<td><br>0.982</td>
<td><br>0.951</td>
<td><br>0.910</td>
<td><br>0.998</td>
<td><br>0.994</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.985</td>
<td><br>0.963</td>
<td><br>0.940</td>
<td><br>0.966</td>
<td><br>0.967</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.983</td>
<td><br>0.964</td>
<td><br>0.925</td>
<td><br>0.982</td>
<td><br>0.980</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-3entities">NERmembert-base-3entities</a></td>
<td><br>Precision</td>
<td><br>0.971</td>
<td><br>0.947</td>
<td><br>0.866</td>
<td><br>0.994</td>
<td><br>0.989</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.969</td>
<td><br>0.942</td>
<td><br>0.891</td>
<td><br>0.995</td>
<td><br>0.989</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.969</td>
<td><br>0.945</td>
<td><br>0.878</td>
<td><br>0.995</td>
<td><br>0.989</td>
</tr>
<tr>
<td rowspan="3"><br>NERmembert-large-3entities (this model)</td>
<td><br>Precision</td>
<td><br>0.973</td>
<td><br>0.953</td>
<td><br>0.873</td>
<td><br>0.996</td>
<td><br>0.990</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.971</td>
<td><br>0.948</td>
<td><br>0.913</td>
<td><br>0.995</td>
<td><br>0.990</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.972</td>
<td><br>0.950</td>
<td><br>0.893</td>
<td><br>0.996</td>
<td><br>0.990</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-4entities">NERmembert-base-4entities</a></td>
<td><br>Precision</td>
<td><br>0.970</td>
<td><br>0.944</td>
<td><br>0.872</td>
<td><br>0.995</td>
<td><br>0.988</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.970</td>
<td><br>0.947</td>
<td><br>0.880</td>
<td><br>0.995</td>
<td><br>0.988</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.970</td>
<td><br>0.945</td>
<td><br>0.876</td>
<td><br>0.995</td>
<td><br>0.988</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-large-4entities">NERmembert-large-4entities</a></td>
<td><br>Precision</td>
<td><br>0.975</td>
<td><br>0.957</td>
<td><br>0.872</td>
<td><br>0.996</td>
<td><br>0.991</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.975</td>
<td><br>0.949</td>
<td><br>0.922</td>
<td><br>0.996</td>
<td><br>0.991</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.975</td>
<td><br>0.953</td>
<td><br>0.896</td>
<td><br>0.996</td>
<td><br>0.991</td>
</tr>
</tbody>
</table>
</details>
### wikiann
For space reasons, we show only the F1 of the different models. You can see the full results below the table.
<table>
<thead>
<tr>
<th><br>Model</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="1"><br><a href="https://hf.co/Jean-Baptiste/camembert-ner">Jean-Baptiste/camembert-ner</a></td>
<td><br>0.867</td>
<td><br>0.722</td>
<td><br>0.451</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/cmarkea/distilcamembert-base-ner">cmarkea/distilcamembert-base-ner</a></td>
<td><br>0.862</td>
<td><br>0.722</td>
<td><br>0.451</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-3entities">NERmembert-base-3entities</a></td>
<td><br>0.947</td>
<td><br>0.906</td>
<td><br>0.886</td>
</tr>
<tr>
<td rowspan="1"><br>NERmembert-large-3entities (this model)</td>
<td><br><b>0.949</b></td>
<td><br><b>0.912</b></td>
<td><br><b>0.899</b></td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-4entities">NERmembert-base-4entities</a></td>
<td><br>0.888</td>
<td><br>0.733</td>
<td><br>0.496</td>
</tr>
<tr>
<td rowspan="1"><br><a href="https://hf.co/CATIE-AQ/NERmembert-large-4entities">NERmembert-large-4entities</a></td>
<td><br>0.905</td>
<td><br>0.741</td>
<td><br>0.511</td>
</tr>
</tbody>
</table>
<details>
<summary>Full results</summary>
<table>
<thead>
<tr>
<th><br>Model</th>
<th><br>Metrics</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
<th><br>O</th>
<th><br>Overall</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="3"><br><a href="https://hf.co/Jean-Baptiste/camembert-ner">Jean-Baptiste/camembert-ner</a></td>
<td><br>Precision</td>
<td><br>0.862</td>
<td><br>0.700</td>
<td><br>0.864</td>
<td><br>0.867</td>
<td><br>0.832</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.871</td>
<td><br>0.746</td>
<td><br>0.305</td>
<td><br>0.950</td>
<td><br>0.772</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.867</td>
<td><br>0.722</td>
<td><br>0.451</td>
<td><br>0.907</td>
<td><br>0.801</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/cmarkea/distilcamembert-base-ner">cmarkea/distilcamembert-base-ner</a></td>
<td><br>Precision</td>
<td><br>0.862</td>
<td><br>0.700</td>
<td><br>0.864</td>
<td><br>0.867</td>
<td><br>0.832</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.871</td>
<td><br>0.746</td>
<td><br>0.305</td>
<td><br>0.950</td>
<td><br>0.772</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.867</td>
<td><br>0.722</td>
<td><br>0.451</td>
<td><br>0.907</td>
<td><br>0.800</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-3entities">NERmembert-base-3entities</a></td>
<td><br>Precision</td>
<td><br>0.948</td>
<td><br>0.900</td>
<td><br>0.893</td>
<td><br>0.979</td>
<td><br>0.942</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.946</td>
<td><br>0.911</td>
<td><br>0.878</td>
<td><br>0.982</td>
<td><br>0.942</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.947</td>
<td><br>0.906</td>
<td><br>0.886</td>
<td><br>0.980</td>
<td><br>0.942</td>
</tr>
<tr>
<td rowspan="3"><br>NERmembert-large-3entities (this model)</td>
<td><br>Precision</td>
<td><br>0.958</td>
<td><br>0.917</td>
<td><br>0.897</td>
<td><br>0.980</td>
<td><br><b>0.948</b></td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.940</td>
<td><br>0.915</td>
<td><br>0.901</td>
<td><br>0.983</td>
<td><br><b>0.948</b></td>
</tr>
<tr>
<td>F1</td>
<td><br><b>0.949</b></td>
<td><br><b>0.912</b></td>
<td><br><b>0.899</b></td>
<td><br><b>0.983</b></td>
<td><br><b>0.948</b></td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-base-4entities">NERmembert-base-4entities</a></td>
<td><br>Precision</td>
<td><br>0.895</td>
<td><br>0.727</td>
<td><br>0.903</td>
<td><br>0.766</td>
<td><br>0.794</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.881</td>
<td><br>0.740</td>
<td><br>0.342</td>
<td><br>0.984</td>
<td><br>0.794</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.888</td>
<td><br>0.733</td>
<td><br>0.496</td>
<td><br>0.861</td>
<td><br>0.794</td>
</tr>
<tr>
<td rowspan="3"><br><a href="https://hf.co/CATIE-AQ/NERmembert-large-4entities">NERmembert-large-4entities</a></td>
<td><br>Precision</td>
<td><br>0.922</td>
<td><br>0.738</td>
<td><br>0.923</td>
<td><br>0.766</td>
<td><br>0.802</td>
</tr>
<tr>
<td><br>Recall</td>
<td><br>0.888</td>
<td><br>0.743</td>
<td><br>0.353</td>
<td><br>0.988</td>
<td><br>0.802</td>
</tr>
<tr>
<td>F1</td>
<td><br>0.905</td>
<td><br>0.741</td>
<td><br>0.511</td>
<td><br>0.863</td>
<td><br>0.802</td>
</tr>
</tbody>
</table>
</details>
## Usage
### Code
```python
from transformers import pipeline
ner = pipeline(
    "token-classification",
    model="CATIE-AQ/NERmembert-large-3entities",
    tokenizer="CATIE-AQ/NERmembert-large-3entities",
    aggregation_strategy="simple",  # merge subword tokens into whole entities
)
result = ner(
"Le dévoilement du logo officiel des JO s'est déroulé le 21 octobre 2019 au Grand Rex. Ce nouvel emblème et cette nouvelle typographie ont été conçus par le designer Sylvain Boyer avec les agences Royalties & Ecobranding. Rond, il rassemble trois symboles : une médaille d'or, la flamme olympique et Marianne, symbolisée par un visage de femme mais privée de son bonnet phrygien caractéristique. La typographie dessinée fait référence à l'Art déco, mouvement artistique des années 1920, décennie pendant laquelle ont eu lieu pour la dernière fois les Jeux olympiques à Paris en 1924. Pour la première fois, ce logo sera unique pour les Jeux olympiques et les Jeux paralympiques."
)
print(result)
```
```python
[{'entity_group': 'LOC', 'score': 0.96300715, 'word': 'Grand Rex', 'start': 74, 'end': 84},
{'entity_group': 'PER', 'score': 0.84991235, 'word': 'Sylvain Boyer', 'start': 164, 'end': 178},
{'entity_group': 'ORG', 'score': 0.63318396, 'word': 'Royalties & Ecobranding', 'start': 195, 'end': 219}]
```
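The pipeline returns one dict per detected entity. A small post-processing sketch (an illustration using the example output above, not part of the original card) groups the surface forms by label:

```python
from collections import defaultdict

# Example pipeline output, copied from above (scores truncated)
result = [
    {"entity_group": "LOC", "score": 0.963, "word": "Grand Rex", "start": 74, "end": 84},
    {"entity_group": "PER", "score": 0.850, "word": "Sylvain Boyer", "start": 164, "end": 178},
    {"entity_group": "ORG", "score": 0.633, "word": "Royalties & Ecobranding", "start": 195, "end": 219},
]

# Group the detected surface forms by entity label
entities = defaultdict(list)
for item in result:
    entities[item["entity_group"]].append(item["word"])

print(dict(entities))
# → {'LOC': ['Grand Rex'], 'PER': ['Sylvain Boyer'], 'ORG': ['Royalties & Ecobranding']}
```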
### Try it through Space
A Space has been created to test the model. It is available [here](https://huggingface.co/spaces/CATIE-AQ/NERmembert).
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:------:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0299 | 1.0 | 43650 | 0.0970 | 0.9837 | 0.9837 | 0.9837 | 0.9837 |
| 0.0164 | 2.0 | 87300 | 0.0835 | 0.9864 | 0.9864 | 0.9864 | 0.9864 |
| 0.0108 | 3.0 | 130950 | 0.0846 | 0.9874 | 0.9874 | 0.9874 | 0.9874 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.15.0
## Environmental Impact
*Carbon emissions were estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware type, runtime, cloud provider, and compute region were used to estimate the carbon impact.*
- **Hardware Type:** A100 PCIe 40/80GB
- **Hours used:** 4h31min
- **Cloud Provider:** Private Infrastructure
- **Carbon Efficiency (kg/kWh):** 0.077 (estimated from [electricitymaps](https://app.electricitymaps.com/zone/FR) for the day of January 12, 2024.)
- **Carbon Emitted** *(Power consumption x Time x Carbon produced based on location of power grid)*: 0.009 kg eq. CO2
## Citations
### NERmembert-large-3entities
```
@misc {NERmembert2024,
author = { {BOURDOIS, Loïck} },
organization = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
title = { NERmembert-large-3entities },
year = 2024,
url = { https://huggingface.co/CATIE-AQ/NERmembert-large-3entities },
doi = { 10.57967/hf/1752 },
publisher = { Hugging Face }
}
```
### multiconer
```
@inproceedings{multiconer2-report,
title={{SemEval-2023 Task 2: Fine-grained Multilingual Named Entity Recognition (MultiCoNER 2)}},
author={Fetahu, Besnik and Kar, Sudipta and Chen, Zhiyu and Rokhlenko, Oleg and Malmasi, Shervin},
booktitle={Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023)},
year={2023},
publisher={Association for Computational Linguistics}}
@article{multiconer2-data,
title={{MultiCoNER v2: a Large Multilingual dataset for Fine-grained and Noisy Named Entity Recognition}},
author={Fetahu, Besnik and Chen, Zhiyu and Kar, Sudipta and Rokhlenko, Oleg and Malmasi, Shervin},
year={2023}}
```
### multinerd
```
@inproceedings{tedeschi-navigli-2022-multinerd,
title = "{M}ulti{NERD}: A Multilingual, Multi-Genre and Fine-Grained Dataset for Named Entity Recognition (and Disambiguation)",
author = "Tedeschi, Simone and Navigli, Roberto",
booktitle = "Findings of the Association for Computational Linguistics: NAACL 2022",
month = jul,
year = "2022",
address = "Seattle, United States",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.findings-naacl.60",
doi = "10.18653/v1/2022.findings-naacl.60",
pages = "801--812"}
```
### pii-masking-200k
```
@misc {ai4privacy_2023,
author = { {ai4Privacy} },
title = { pii-masking-200k (Revision 1d4c0a1) },
year = 2023,
url = { https://huggingface.co/datasets/ai4privacy/pii-masking-200k },
doi = { 10.57967/hf/1532 },
publisher = { Hugging Face }}
```
### wikiann
```
@inproceedings{rahimi-etal-2019-massively,
title = "Massively Multilingual Transfer for {NER}",
author = "Rahimi, Afshin and Li, Yuan and Cohn, Trevor",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P19-1015",
pages = "151--164"}
```
### wikiner
```
@article{NOTHMAN2013151,
title = {Learning multilingual named entity recognition from Wikipedia},
journal = {Artificial Intelligence},
volume = {194},
pages = {151-175},
year = {2013},
note = {Artificial Intelligence, Wikipedia and Semi-Structured Resources},
issn = {0004-3702},
doi = {https://doi.org/10.1016/j.artint.2012.03.006},
url = {https://www.sciencedirect.com/science/article/pii/S0004370212000276},
author = {Joel Nothman and Nicky Ringland and Will Radford and Tara Murphy and James R. Curran}}
```
### frenchNER_3entities
```
@misc {frenchNER2024,
author = { {BOURDOIS, Loïck} },
organization = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
title = { frenchNER_3entities },
year = 2024,
url = { https://huggingface.co/CATIE-AQ/frenchNER_3entities },
doi = { 10.57967/hf/1751 },
publisher = { Hugging Face }
}
```
### CamemBERT
```
@inproceedings{martin2020camembert,
title={CamemBERT: a Tasty French Language Model},
author={Martin, Louis and Muller, Benjamin and Su{\'a}rez, Pedro Javier Ortiz and Dupont, Yoann and Romary, Laurent and de la Clergerie, {\'E}ric Villemonte and Seddah, Djam{\'e} and Sagot, Beno{\^\i}t},
booktitle={Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},
year={2020}}
```
## License
MIT