A transformer-based language model pretrained on the 102 languages with the largest Wikipedias. It was introduced in a research paper by Google Research and has been widely used for a range of natural language processing tasks. The model is trained with a masked language modeling (MLM) objective: 15% of the input tokens are masked, and the model learns to predict them.
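As a minimal sketch of the MLM objective in use, the snippet below runs a fill-mask pipeline from the Hugging Face `transformers` library; the checkpoint name `bert-base-multilingual-uncased` is an assumption about which multilingual model is meant here.

```python
from transformers import pipeline

# Fill-mask: the model predicts the token hidden behind [MASK],
# mirroring the masked language modeling objective it was pretrained with.
# Checkpoint name is an assumption; swap in the model you are actually using.
fill_mask = pipeline("fill-mask", model="bert-base-multilingual-uncased")

for prediction in fill_mask("Hello, I'm a [MASK] model."):
    print(prediction["token_str"], round(prediction["score"], 3))
```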
webhook (optional): The webhook to call when inference is done. By default, the output is returned in the response to your inference request.
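As a rough illustration of how such a parameter might be passed, here is a sketch of an inference request that sets a webhook for asynchronous delivery. The endpoint URL, payload schema, and callback URL are all placeholders, not a documented API; adapt them to your actual deployment.

```python
import requests

# Hypothetical endpoint for illustration; substitute the real inference URL.
API_URL = "https://api.example.com/inference"

payload = {
    "inputs": "Hello, I'm a [MASK] model.",
    # With a webhook set, the result is delivered to this URL when inference
    # finishes, instead of being read from the synchronous response.
    "webhook": "https://my-service.example.com/callbacks/inference-done",
}

response = requests.post(API_URL, json=payload, timeout=30)
print(response.status_code, response.json())
```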