
distilbert-base-multilingual-cased

The DistilBERT base multilingual (cased) model is a distilled version of the BERT base multilingual model. It was trained on 104 languages and has 6 layers, a hidden size of 768, and 12 attention heads. The model can be used for masked language modeling and next sentence prediction, and is mostly intended to be fine-tuned on downstream natural language processing tasks. It should not be used to intentionally create hostile or alienating environments for people, and users should be aware of its risks, biases, and limitations.
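
For quick local experimentation, the sketch below uses the Hugging Face transformers fill-mask pipeline with this checkpoint. It is a minimal usage sketch, not the hosted Deep Infra API, and assumes transformers and a PyTorch backend are installed.

```python
# Minimal local sketch: masked-language-model inference with the
# Hugging Face fill-mask pipeline (assumes `pip install transformers torch`).
from transformers import pipeline

unmasker = pipeline("fill-mask", model="distilbert-base-multilingual-cased")

# The prompt must contain exactly one [MASK] token.
for candidate in unmasker("where is my [MASK]?"):
    # Each candidate carries the filled-in sentence and its probability score.
    print(f"{candidate['sequence']} ({candidate['score']:.2f})")
```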


Public
$0.0005/sec

Input

A text prompt; it must contain exactly one [MASK] token.


Output

where is my father? (0.09)

where is my mother? (0.08)
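
The output lists candidate completions for the [MASK] position together with their probabilities. A minimal sketch of calling the hosted model over HTTP is shown below; the endpoint path, the "input" field name, the DEEPINFRA_API_TOKEN variable, and the response shape are assumptions about Deep Infra's generic inference API, not details confirmed by this page.

```python
# Hedged sketch: querying the hosted model over HTTP.
# The URL path, request field names, and response format are assumptions.
import os
import requests

token = os.environ["DEEPINFRA_API_TOKEN"]  # hypothetical environment variable
url = "https://api.deepinfra.com/v1/inference/distilbert-base-multilingual-cased"

resp = requests.post(
    url,
    headers={"Authorization": f"bearer {token}"},
    json={"input": "where is my [MASK]?"},  # exactly one [MASK] token
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # inspect the returned candidate fills and their scores
```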

 


