
bert-large-uncased-whole-word-masking-finetuned-squad

bert-large-uncased-whole-word-masking-finetuned-squad is a transformer-based language model pretrained on a large corpus of English data with a masked language modeling objective: 15% of the tokens in each sentence were randomly masked, and the model had to predict them. Under whole word masking, all of the subword tokens that make up a word are masked together rather than independently. The model was then fine-tuned on the SQuAD dataset for extractive question answering, where it achieves high F1 and exact match scores.

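As a usage illustration, here is a minimal sketch that runs the same model locally with the Hugging Face transformers library; the question and context strings are made-up examples, and it assumes transformers with a PyTorch backend is installed:

from transformers import pipeline

# Load the fine-tuned QA model from the Hugging Face Hub.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

# The pipeline extracts the answer span from the context and scores it.
result = qa(
    question="What jumps over the lazy dog?",
    context="The quick brown fox jumps over the lazy dog.",
)
print(result["answer"], result["score"])  # e.g. fox 0.18

The printed score is the model's confidence in the extracted span, matching the output format shown below.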

Public
$0.0005/sec

Input

question — the question to answer about the given context

context — the source material from which the answer is extracted

Output

fox (0.18) — the extracted answer span, with its confidence score
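The model can also be invoked remotely over HTTP. The sketch below is a hypothetical request in Python: the endpoint path, the question/context field names, and the response shape are assumptions inferred from the input/output description above, not confirmed API documentation, and YOUR_API_TOKEN is a placeholder:

import requests

# Assumed Deep Infra inference endpoint for this model (unverified).
URL = "https://api.deepinfra.com/v1/inference/bert-large-uncased-whole-word-masking-finetuned-squad"

resp = requests.post(
    URL,
    headers={"Authorization": "Bearer YOUR_API_TOKEN"},  # placeholder credential
    json={
        "question": "What jumps over the lazy dog?",  # assumed field name
        "context": "The quick brown fox jumps over the lazy dog.",  # assumed field name
    },
)
resp.raise_for_status()
print(resp.json())  # expected to contain the answer span and its score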

 


