A pre-trained language model based on RoBERTa, fine-tuned on the SQuAD2.0 dataset for extractive question answering. It achieved an exact match score of 79.87% and an F1 score of 82.91% on the SQuAD2.0 dev set. Deepset is the company behind the open-source NLP framework Haystack and offers other resources such as a distilled roberta-base-squad2, German BERT, and the GermanQuAD datasets and models.
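As a minimal sketch, the snippet below shows one common way to run extractive question answering with this model locally through the Hugging Face transformers pipeline (this assumes the transformers library and a PyTorch backend are installed; the model ID deepset/roberta-base-squad2 is taken from the model card).

```python
# Minimal sketch: extractive QA with deepset/roberta-base-squad2 via the
# Hugging Face transformers question-answering pipeline.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="deepset/roberta-base-squad2",
    tokenizer="deepset/roberta-base-squad2",
)

result = qa(
    question="Who develops Haystack?",
    context=(
        "Deepset is the company behind the open-source NLP framework "
        "Haystack and also releases German BERT and GermanQuAD."
    ),
)

# The pipeline returns a dict with 'answer', 'score', 'start', and 'end'.
print(result["answer"], result["score"])
```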
webhook (file): The webhook to call when inference is done. By default, you will get the output in the response of your inference request.
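The request below is a hypothetical sketch of how a webhook might be supplied alongside the model inputs for asynchronous delivery of the result; the endpoint URL, authentication header, and field names are assumptions for illustration only and should be checked against the provider's API reference.

```python
# Hypothetical sketch of an inference request that supplies a webhook.
# The URL and field names are assumptions, not the documented API.
import requests

resp = requests.post(
    "https://api.example.com/v1/inference/deepset/roberta-base-squad2",  # assumed endpoint
    headers={"Authorization": "Bearer YOUR_API_TOKEN"},
    json={
        "question": "Who develops Haystack?",
        "context": "Deepset is the company behind the open-source NLP framework Haystack.",
        # If omitted, the answer comes back in this request's response;
        # if set, the provider calls this URL when inference is done.
        "webhook": "https://example.com/my-callback",
    },
)
print(resp.status_code)
```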