
Rostlab/prot_bert

A pre-trained language model developed specifically for protein sequences using a masked language modeling (MLM) objective. It achieves strong results when fine-tuned on downstream tasks such as secondary structure prediction and sub-cellular localization. The model was trained on uppercase amino acids only, uses a vocabulary size of 21, and expects inputs of the form "[CLS] Protein Sequence A [SEP] Protein Sequence B [SEP]".

Public
$0.0005/sec

HTTP/cURL API


Input fields

input (string)

Text prompt; should include exactly one [MASK] token.


webhook (file)

The webhook to call when inference is done; by default, the output is returned in the response of your inference request.
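Putting the input field together, a request body can be built and validated like this. The endpoint URL below follows Deep Infra's usual inference-path pattern but should be checked against the current API documentation; the sequence is a made-up example.

```python
import json

# Assumed endpoint pattern; verify against the Deep Infra API docs.
API_URL = "https://api.deepinfra.com/v1/inference/Rostlab/prot_bert"

def build_request(prompt: str) -> str:
    """Validate the prompt and serialize the JSON body for the inference call."""
    if prompt.count("[MASK]") != 1:
        raise ValueError("prompt must include exactly one [MASK] token")
    return json.dumps({"input": prompt})

body = build_request("M K T [MASK] Y I A K")
print(body)  # {"input": "M K T [MASK] Y I A K"}
```

The resulting body can then be POSTed to the endpoint with cURL or any HTTP client, passing your API token in an `Authorization: bearer ...` header.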



