A multilingual MiniLMv2 model trained on 16 languages, with a shared vocabulary and language-specific embeddings. The model is based on the Transformer architecture and was developed by Microsoft Research. It supports a range of natural language processing tasks, such as machine translation, question answering, and text classification.
The input is a text prompt and must include exactly one <mask> token.
Example fill-mask predictions (candidate completion, with score):
where is my father? (0.09)
where is my mother? (0.08)
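Because the model accepts prompts with exactly one <mask> token, a caller can validate input before sending it to a fill-mask pipeline. A minimal sketch, assuming the Hugging Face fill-mask pipeline is used downstream (the pipeline call is shown commented out, and any checkpoint name you substitute is your own assumption, not taken from this card):

```python
def validate_prompt(prompt: str, mask_token: str = "<mask>") -> str:
    """Return the prompt unchanged if it contains exactly one mask token,
    otherwise raise ValueError (mirrors the widget's input constraint)."""
    if prompt.count(mask_token) != 1:
        raise ValueError(
            f"prompt must contain exactly one {mask_token} token, "
            f"found {prompt.count(mask_token)}"
        )
    return prompt


# Hypothetical downstream usage with transformers' fill-mask pipeline;
# replace the model name placeholder with the actual checkpoint.
# from transformers import pipeline
# fill = pipeline("fill-mask", model=...)
# for pred in fill(validate_prompt("where is my <mask>?")):
#     print(pred["sequence"], round(pred["score"], 2))
```

Validating up front gives a clear error message instead of an opaque failure from the model server.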