bert-base-multilingual-cased

A pre-trained multilingual model that uses a masked language modeling (MLM) objective to learn a bidirectional representation of text. It was trained on the 104 languages with the largest Wikipedias, and its inputs take the form [CLS] Sentence A [SEP] Sentence B [SEP]. The model is primarily intended to be fine-tuned on tasks that use the whole sentence, potentially masked, to make decisions.

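For local experimentation, here is a minimal sketch using the Hugging Face transformers fill-mask pipeline with this checkpoint; the pipeline's tokenizer adds the [CLS] and [SEP] special tokens automatically, so you only pass the raw sentence.

```python
from transformers import pipeline

# Load the fill-mask pipeline with the multilingual cased checkpoint.
# The tokenizer wraps the input as [CLS] ... [SEP] automatically.
unmasker = pipeline("fill-mask", model="bert-base-multilingual-cased")

# The prompt must contain exactly one [MASK] token.
for prediction in unmasker("where is my [MASK]?"):
    # Each prediction contains the completed sentence and its score.
    print(prediction["sequence"], round(prediction["score"], 2))
```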

Public
$0.0005/sec

Input

text prompt; it should include exactly one [MASK] token
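As a sketch of calling the hosted endpoint over HTTP: the snippet below assumes an inference URL of the form https://api.deepinfra.com/v1/inference/<model>, a bearer API token in the DEEPINFRA_TOKEN environment variable, and a JSON body with an input field. These request and response details are assumptions, so check the Deep Infra API docs for the exact shapes.

```python
import os
import requests

# Assumed endpoint and payload shape; verify against the Deep Infra docs.
API_URL = "https://api.deepinfra.com/v1/inference/bert-base-multilingual-cased"
headers = {"Authorization": f"bearer {os.environ['DEEPINFRA_TOKEN']}"}

# The prompt should contain exactly one [MASK] token.
payload = {"input": "where is my [MASK]?"}

response = requests.post(API_URL, headers=headers, json=payload)
response.raise_for_status()
print(response.json())  # candidate completions with scores (exact shape may vary)
```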


Output

where is my father? (0.09)

where is my mother? (0.08)


