A multilingual MiniLMv2 model trained on 16 languages, using a shared vocabulary and language-specific embeddings. The model is based on the Transformer architecture and was developed by Microsoft Research. It supports a range of natural language processing tasks, such as translation, question answering, and text classification.
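
As a quick orientation, the sketch below shows how a checkpoint like this is typically loaded with the Hugging Face `transformers` library to produce multilingual contextual embeddings. The model identifier used here is an assumption for illustration only; substitute the actual Hub name of this checkpoint.

```python
# Minimal usage sketch. The checkpoint name below is an ASSUMPTION
# (a commonly published multilingual MiniLMv2 distillation); replace it
# with the actual identifier for this model.
from transformers import AutoTokenizer, AutoModel

model_name = "nreimers/mMiniLMv2-L12-H384-distilled-from-XLMR-Large"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode the same sentence in two languages; the shared vocabulary lets
# a single tokenizer handle inputs from any of the supported languages.
inputs = tokenizer(
    ["MiniLMv2 is a distilled Transformer.",
     "MiniLMv2 est un Transformer distillé."],
    padding=True,
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```

The encoder outputs can then be fed to a task-specific head (for example, a classification or span-prediction layer) for fine-tuning on downstream tasks.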