ruRoberta-large is an encoder model trained by the SberDevices team for mask-filling tasks; it uses a BBPE tokenizer, has 355 million parameters, and was trained on 250 GB of data. It was developed by the NLP Core Team RnD, with contributions from Dmitry Zmitrovich.
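A minimal sketch of mask filling with this model via the Hugging Face `transformers` pipeline. The hub id `ai-forever/ruRoberta-large` is an assumption (the text above does not state where the weights are hosted), and the Russian prompt is only an illustration.

```python
# Sketch: mask filling with ruRoberta-large through the `transformers` pipeline.
# Assumption: the weights are published on the Hugging Face Hub as
# "ai-forever/ruRoberta-large".
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="ai-forever/ruRoberta-large")

# Use the tokenizer's own mask token instead of hard-coding "<mask>".
mask = fill_mask.tokenizer.mask_token
predictions = fill_mask(f"Привет! Меня зовут {mask}.")

# Each prediction carries the filled token and its score.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each entry in `predictions` is a dict with the candidate token, its score, and the fully filled sequence, sorted by score.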
webhook — The webhook to call when inference is done. By default, the output is returned in the response of your inference request.