microsoft/deberta-base

DeBERTa is a variant of BERT that uses disentangled attention and an enhanced mask decoder to improve performance on natural language understanding (NLU) tasks. In the original paper, DeBERTa trained on 80GB of data outperformed BERT and RoBERTa on a majority of NLU tasks, with particularly strong results on SQuAD 1.1/2.0 and MNLI.

Public
$0.0005/sec

HTTP/cURL API

Input fields

input (string)

Text prompt; should include exactly one [MASK] token.


webhook (file)

The webhook to call when inference is done; by default, the output is returned in the response to your inference request.
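
As a sketch, a request might look like the cURL call below. The endpoint path, the Authorization header format, and the $DEEPINFRA_TOKEN placeholder are assumptions based on Deep Infra's general inference API conventions, not details shown on this page:

    # Fill-mask request: the prompt contains exactly one [MASK] token
    curl -X POST \
        -d '{"input": "Paris is the [MASK] of France."}' \
        -H "Authorization: bearer $DEEPINFRA_TOKEN" \
        -H "Content-Type: application/json" \
        "https://api.deepinfra.com/v1/inference/microsoft/deberta-base"

An optional webhook value can be added to the same JSON body if you want the result delivered to your endpoint rather than in the HTTP response.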

