
emilyalsentzer/Bio_ClinicalBERT

The Bio+Clinical BERT model is initialized from BioBERT and pre-trained on all MIMIC notes. Notes were preprocessed with a rules-based section splitter and the SciSpacy tokenizer; pre-training used a batch size of 32, a maximum sequence length of 128, and a learning rate of 5e-5 for 150,000 steps.
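As a rough sense of scale, the hyperparameters above imply an upper bound on the number of tokens processed during pre-training. This is back-of-the-envelope arithmetic only; the true count depends on padding and sequence packing, which are not specified here.

```python
# Pre-training scale implied by the stated hyperparameters.
# Illustrative arithmetic, not from the model card itself.
batch_size = 32
max_seq_len = 128
steps = 150_000
learning_rate = 5e-5  # as stated above

# Upper bound on tokens per optimizer step (assumes every
# sequence is filled to the maximum length, which real data is not).
tokens_per_step = batch_size * max_seq_len

# Upper bound on total tokens seen across all steps.
total_tokens = tokens_per_step * steps

print(tokens_per_step)  # 4096
print(total_tokens)     # 614400000 (~6.1e8 tokens)
```

So at most roughly 0.6B tokens were processed, which is consistent with continued pre-training of an already-initialized BERT checkpoint rather than training from scratch.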


Version: 41943bf7f983007123c758373c5246305cc536ec

Updated: 2023-03-03T02:55:14+00:00