The DistilBERT model is a distilled version of the BERT base multilingual model. It was trained on 104 languages and has 6 layers, a hidden size of 768, and 12 attention heads. It is designed for masked language modeling and next sentence prediction tasks, with potential applications in natural language processing and downstream tasks. However, it should not be used to intentionally create hostile or alienating environments for people, and users should be aware of its risks, biases, and limitations.
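The masked language modeling use described above can be illustrated with a short sketch using the Transformers fill-mask pipeline. The checkpoint name distilbert-base-multilingual-cased is an assumption inferred from the architecture described here, not stated in this section.

```python
# Minimal masked language modeling sketch.
# Assumption: the checkpoint is "distilbert-base-multilingual-cased".
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="distilbert-base-multilingual-cased")

# The pipeline fills the [MASK] token and returns the top candidates with scores.
predictions = fill_mask("Hello, I'm a [MASK] model.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

For downstream tasks such as classification, the same checkpoint would typically be fine-tuned rather than used raw.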
webhook: The webhook to call when inference is done; by default, you will get the output in the response of your inference request.
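As a rough illustration of the webhook parameter, the sketch below sends an inference request with a callback URL in the payload. The endpoint URL, auth header, and exact field names are assumptions for illustration only, not a documented API.

```python
# Hypothetical sketch: endpoint, auth header, and payload layout are assumptions.
# It illustrates supplying a webhook so the output is delivered to that URL
# instead of being returned in the inference response.
import requests

API_URL = "https://example.com/inference"          # hypothetical endpoint
HEADERS = {"Authorization": "Bearer YOUR_TOKEN"}   # hypothetical auth header

payload = {
    "inputs": "Hello, I'm a [MASK] model.",
    # Optional callback; omit it to receive the output directly in the response.
    "webhook": "https://example.com/my-callback",
}

response = requests.post(API_URL, headers=HEADERS, json=payload)
print(response.status_code, response.text)
```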