
distilbert-base-uncased

DistilBERT is a smaller, faster, and cheaper version of BERT, a popular language model. It was trained on the same data as BERT, including BookCorpus and English Wikipedia, but with a few key differences in the preprocessing and training procedures. Despite its smaller size, DistilBERT achieves similar results to BERT on various natural language processing tasks.

Public · $0.0005/sec

HTTP/cURL API

Input fields

- input (string): the text prompt; it must include exactly one [MASK] token
- webhook (file): the webhook to call when inference is done; by default, you will get the output in the response to your inference request
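A minimal sketch of a request, assuming Deep Infra's standard inference endpoint pattern (https://api.deepinfra.com/v1/inference/<model>) and an API token in the DEEPINFRA_TOKEN environment variable; the exact URL and authentication for your account may differ:

    curl -X POST \
      -H "Authorization: Bearer $DEEPINFRA_TOKEN" \
      -H "Content-Type: application/json" \
      -d '{"input": "The capital of France is [MASK]."}' \
      https://api.deepinfra.com/v1/inference/distilbert-base-uncased

The response contains the model's predictions for the [MASK] position, returned directly unless a webhook is supplied.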


