
bert-base-multilingual-uncased

A transformer-based language model pretrained on the 102 languages with the largest Wikipedias. It was introduced in a research paper by Google Research and has been widely used for a variety of natural language processing tasks. The model is trained with a masked language modeling objective: 15% of the input tokens are masked, and the model learns to predict the missing tokens.
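Because the model fills in a single [MASK] token, a request is just a text prompt containing that token. Below is a minimal cURL sketch, assuming Deep Infra's standard inference endpoint and a DEEPINFRA_TOKEN environment variable holding your API token; treat the exact path and form-field style as assumptions to verify against the schema sections further down.

    # Send the prompt as a multipart form field; the model predicts the
    # token at the [MASK] position. Endpoint path and auth scheme are
    # assumed from Deep Infra's standard inference API.
    curl -X POST \
      -H "Authorization: bearer $DEEPINFRA_TOKEN" \
      -F input="the capital of france is [MASK]." \
      'https://api.deepinfra.com/v1/inference/bert-base-multilingual-uncased'

Note that the model is uncased: input is lowercased (and accents are stripped) during tokenization, so casing in the prompt does not matter.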

Public
$0.0005/sec

HTTP/cURL API


Input fields

input (string)

Text prompt; must include exactly one [MASK] token.


webhook (file)

The webhook to call when inference is done. By default, the output is returned in the response of your inference request.
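If you supply a webhook, the result is delivered to that URL when inference finishes instead of arriving inline in the response. A hedged sketch, with https://example.com/callback standing in for your own endpoint and the webhook passed as a plain form value:

    # Same call as above, but results are POSTed to the webhook URL
    # once inference completes. The callback URL is a placeholder.
    curl -X POST \
      -H "Authorization: bearer $DEEPINFRA_TOKEN" \
      -F input="deep learning is a subfield of [MASK] learning." \
      -F webhook="https://example.com/callback" \
      'https://api.deepinfra.com/v1/inference/bert-base-multilingual-uncased'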

Input Schema

Output Schema


