This paper presents a fine-tuned Spanish BERT model (BETO) for the Named Entity Recognition (NER) task. The model was trained on the CONLL Corpora ES dataset and achieved an F1 score of 90.17%. The authors also compared their model with other state-of-the-art models, including a multilingual BERT and a TinyBERT model, and demonstrated its effectiveness in identifying entities in Spanish text.
This model is a version of the Spanish BERT cased model (BETO) fine-tuned on NER-C for the NER downstream task.
I preprocessed the dataset and split it into train/dev sets (80/20).
| Dataset | # Examples |
|---|---|
| Train | 8.7 K |
| Dev | 2.2 K |
Labels covered:
- B-LOC
- B-MISC
- B-ORG
- B-PER
- I-LOC
- I-MISC
- I-ORG
- I-PER
- O
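These labels follow the standard BIO tagging scheme: `B-` marks the first token of an entity, `I-` a continuation, and `O` a token outside any entity. A minimal sketch of grouping per-token tags into entity spans (the tokens and tags below are illustrative, not actual model output):

```python
# Group BIO-tagged tokens into (entity_type, text) spans.
def group_entities(tokens, tags):
    entities = []
    current = None  # (entity type, list of tokens)
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                entities.append((current[0], " ".join(current[1])))
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and tag[2:] == current[0]:
            current[1].append(token)
        else:  # "O", or an I- tag that does not continue the current entity
            if current:
                entities.append((current[0], " ".join(current[1])))
            current = None
    if current:
        entities.append((current[0], " ".join(current[1])))
    return entities

# Illustrative example (not model output):
tokens = ["Manuel", "Romero", "vive", "en", "Madrid"]
tags = ["B-PER", "I-PER", "O", "O", "B-LOC"]
print(group_entities(tokens, tags))
# [('PER', 'Manuel Romero'), ('LOC', 'Madrid')]
```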
| Metric | Score (%) |
|---|---|
| F1 | 90.17 |
| Precision | 89.86 |
| Recall | 90.47 |
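As a sanity check, the reported F1 is the harmonic mean of the precision and recall above (the small difference is rounding):

```python
# F1 is the harmonic mean of precision and recall.
precision = 89.86
recall = 90.47
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # 90.16, matching the reported 90.17 up to rounding
```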
| Model | F1 score | Size (MB) |
|---|---|---|
| bert-base-spanish-wwm-cased (BETO) | 88.43 | 421 |
| bert-spanish-cased-finetuned-ner (this one) | 90.17 | 420 |
| Best Multilingual BERT | 87.38 | 681 |
| TinyBERT-spanish-uncased-finetuned-ner | 70.00 | 55 |
Created by Manuel Romero/@mrm8488
Made with ♥ in Spain