We present BERTje, a pre-trained Dutch BERT model developed at the University of Groningen. BERTje achieves state-of-the-art results on several Dutch NLP tasks, such as named entity recognition and part-of-speech tagging. We also provide a detailed comparison of BERTje with other pre-trained models, such as mBERT and RobBERT.
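Since BERTje is a masked language model, it can be queried by giving it a prompt containing exactly one [MASK] token and inspecting the predicted completions with their probabilities. A minimal sketch using the Hugging Face `transformers` fill-mask pipeline, assuming the model is published under the identifier `GroNLP/bert-base-dutch-cased`:

```python
# Hedged sketch: querying BERTje as a masked language model.
# Assumption: the model is available on the Hugging Face Hub as
# "GroNLP/bert-base-dutch-cased". The prompt must contain exactly
# one [MASK] token.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="GroNLP/bert-base-dutch-cased")

# Dutch prompt: "Where is my [MASK]?"
predictions = fill_mask("Waar is mijn [MASK]?")

for pred in predictions:
    # Each prediction holds the completed sequence and its probability.
    print(f"{pred['sequence']} ({pred['score']:.2f})")
```

Each returned entry is a dict with the filled-in `sequence`, the predicted `token_str`, and a `score` between 0 and 1, so the top candidates can be ranked directly by probability.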