
Rostlab/prot_bert_bfd

A language model pretrained on protein sequences with a masked language modeling objective. It achieved high scores on downstream tasks such as secondary structure prediction and subcellular localization. The model was trained on a large corpus of protein sequences in a self-supervised fashion, without human labeling, using the BERT architecture with a vocabulary size of 21.
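As an illustrative sketch of the masked language modeling objective described above: residues in a protein sequence are randomly replaced with a mask token, and the model is trained to recover the originals. The vocabulary here (20 standard amino acids plus a `[MASK]` token, giving size 21) and the `mask_sequence` helper are assumptions for demonstration; the real model's tokenizer and special tokens come from its own preprocessing pipeline.

```python
# Sketch of masked-language-modeling preprocessing for protein sequences.
# Assumed vocabulary: 20 standard amino acids + [MASK] = 21 tokens.
import random

AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")  # 20 standard residues
VOCAB = {aa: i for i, aa in enumerate(AMINO_ACIDS)}
VOCAB["[MASK]"] = len(VOCAB)  # mask token -> vocabulary size 21

def mask_sequence(seq, mask_prob=0.15, rng=None):
    """Randomly replace residues with [MASK]; return token ids and targets.

    Targets hold the original residue id at masked positions and -100
    (the conventional "ignore" label) everywhere else, so the training
    loss is computed only on masked positions.
    """
    rng = rng or random.Random(0)
    tokens, targets = [], []
    for aa in seq:
        if rng.random() < mask_prob:
            tokens.append(VOCAB["[MASK]"])
            targets.append(VOCAB[aa])
        else:
            tokens.append(VOCAB[aa])
            targets.append(-100)
    return tokens, targets

tokens, targets = mask_sequence("MKTAYIAKQR")
```

In pretraining, the model sees only `tokens` and is scored on predicting the residue ids stored in `targets` at the masked positions; no human labels are required, which is what makes the setup self-supervised.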

Public · $0.0005 / sec · demoapi

Version: 6c5c8a55a52ff08a664dfd584aa1773f125a0487
Updated: 2023-03-03T22:57:07+00:00