bert-base-uncased

A transformers model pretrained on a large corpus of English data in a self-supervised fashion. It was trained on BookCorpus, a dataset consisting of 11,038 unpublished books, and English Wikipedia, excluding lists, tables, and headers. The model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks.

Public
$0.0005/sec

Input

Text prompt; it should include exactly one [MASK] token.

Output

where is my father? (0.09)

where is my mother? (0.08)
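
The outputs above are the model's top fill-in candidates for the masked position, each with its probability. As a minimal local sketch of the same input/output contract, the Hugging Face transformers fill-mask pipeline can be used directly; the prompt text below is an assumed example matching the sample outputs, and this does not show the Deep Infra HTTP endpoint itself.

```python
# Minimal sketch: masked-token prediction with bert-base-uncased,
# using the Hugging Face transformers fill-mask pipeline locally.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The prompt must contain exactly one [MASK] token.
predictions = fill_mask("where is my [MASK]?")

for p in predictions:
    # Each prediction includes the completed sequence and its probability score.
    print(f"{p['sequence']} ({p['score']:.2f})")
```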
