
Rostlab/prot_bert_bfd

A language model pretrained on protein sequences with a masked language modeling objective. It achieves strong results on downstream tasks such as secondary structure prediction and subcellular localization. The model was trained on a large corpus of protein sequences in a self-supervised fashion, without human labeling, using a BERT architecture with a vocabulary size of 21.

Public
$0.0005/sec

Input

A text prompt that must include exactly one [MASK] token.
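Protein BERT models typically expect the sequence as space-separated single-letter amino-acid codes, with the residue to predict replaced by [MASK]. The helper below is a minimal sketch of that prompt formatting; the commented-out HTTP call is an assumption about the inference endpoint shape (`make_prompt`, the URL, and the JSON payload keys are illustrative, not confirmed by this page — check the provider's API docs before use).

```python
def make_prompt(sequence: str, mask_index: int) -> str:
    """Format a raw amino-acid sequence for a masked-LM query:
    residues separated by spaces, with exactly one [MASK] token
    at the position to predict."""
    tokens = list(sequence.upper())
    tokens[mask_index] = "[MASK]"
    return " ".join(tokens)

# Hypothetical request sketch (endpoint and payload keys are assumptions):
# import requests
# resp = requests.post(
#     "https://api.deepinfra.com/v1/inference/Rostlab/prot_bert_bfd",
#     headers={"Authorization": f"Bearer {API_KEY}"},
#     json={"input": make_prompt("MKTAYIAKQR", 3)},
# )

print(make_prompt("MKTAYIAKQR", 3))  # M K T [MASK] Y I A K Q R
```

The model then returns its top predictions for the masked position, each with a probability score, as shown in the output example below.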


Output

where is my father? (0.09)

where is my mother? (0.08)



© 2023 Deep Infra. All rights reserved.
