GroNLP/bert-base-dutch-cased

We present BERTje, a Dutch pre-trained BERT model developed at the University of Groningen. BERTje achieved state-of-the-art results on several NLP tasks such as named entity recognition and part-of-speech tagging. We also provide a detailed comparison of BERTje with other pre-trained models such as mBERT and RobBERT.

Public
$0.0005/sec

Input

Text prompt; it must include exactly one [MASK] token.

Output

where is my father? (0.09)

where is my mother? (0.08)
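
For reference, a minimal sketch of querying the model locally with the Hugging Face transformers fill-mask pipeline (an assumption: the hosted Deep Infra endpoint is a separate, authenticated API, and the example sentence is only illustrative). It prints sentence/score pairs in the same format as the output shown above.

```python
# Minimal sketch, assuming `transformers` and a backend such as PyTorch
# are installed; this runs the model locally rather than through the
# Deep Infra hosted API.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="GroNLP/bert-base-dutch-cased")

# The input must contain exactly one [MASK] token.
results = fill_mask("where is my [MASK]?")

for r in results:
    # Each result holds the completed sentence and its probability,
    # matching the "sentence (probability)" format above.
    print(f"{r['sequence']} ({r['score']:.2f})")
```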

 

