
neuralmind/bert-large-portuguese-cased

BERTimbau Large is a pretrained BERT model for Brazilian Portuguese that achieves state-of-the-art performance on three downstream NLP tasks. It is available in two sizes: Base and Large. The model can be used for various NLP tasks, such as masked language modeling prediction and producing BERT embeddings.


Input

A text prompt that includes exactly one [MASK] token


Output

where is my father? (0.09)

where is my mother? (0.08)
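The masked-prediction input/output above can be reproduced locally. A minimal sketch using the Hugging Face transformers fill-mask pipeline (the Portuguese example sentence is an assumption, not from the original card):

```python
from transformers import pipeline

# Fill-mask pipeline for BERTimbau Large; the input must contain
# exactly one [MASK] token, as noted above.
pipe = pipeline("fill-mask", model="neuralmind/bert-large-portuguese-cased")

# Example sentence (assumption): "It had snowed a lot in [MASK]."
predictions = pipe("Tinha nevado muito em [MASK].")

# Each prediction carries the completed sequence and its probability,
# mirroring the "where is my father? (0.09)" style output shown above.
for p in predictions:
    print(f"{p['sequence']} ({p['score']:.2f})")
```

By default the pipeline returns the top 5 candidate fills, each with a score between 0 and 1.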

BERTimbau Large (aka "bert-large-portuguese-cased")

(Image: Bert holding a berimbau)

Introduction

BERTimbau Large is a pretrained BERT model for Brazilian Portuguese that achieves state-of-the-art performances on three downstream NLP tasks: Named Entity Recognition, Sentence Textual Similarity and Recognizing Textual Entailment. It is available in two sizes: Base and Large.

For further information or requests, please go to BERTimbau repository.

Available models

| Model | Arch. | #Layers | #Params |
|---|---|---|---|
| neuralmind/bert-base-portuguese-cased | BERT-Base | 12 | 110M |
| neuralmind/bert-large-portuguese-cased | BERT-Large | 24 | 335M |
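Beyond masked prediction, the card mentions BERT embeddings. A minimal sketch of extracting contextual token embeddings with `AutoModel`/`AutoTokenizer` (the example sentence is an assumption):

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "neuralmind/bert-large-portuguese-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

# Example sentence (assumption): "The weather is nice today."
inputs = tokenizer("O tempo está bom hoje.", return_tensors="pt")

# The last hidden states serve as contextual token embeddings.
with torch.no_grad():
    outputs = model(**inputs)

embeddings = outputs.last_hidden_state  # shape: (1, seq_len, 1024) for BERT-Large
print(embeddings.shape)
```

The hidden size is 1024 for the Large model (768 for Base); averaging or taking the `[CLS]` vector yields a sentence-level embedding.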

Citation

If you use our work, please cite:

@inproceedings{souza2020bertimbau,
  author    = {F{\'a}bio Souza and
               Rodrigo Nogueira and
               Roberto Lotufo},
  title     = {{BERT}imbau: pretrained {BERT} models for {B}razilian {P}ortuguese},
  booktitle = {9th Brazilian Conference on Intelligent Systems, {BRACIS}, Rio Grande do Sul, Brazil, October 20-23 (to appear)},
  year      = {2020}
}