
emilyalsentzer/Bio_Discharge_Summary_BERT

Bio+Discharge Summary BERT is initialized from BioBERT and trained only on discharge summaries from MIMIC. Pre-training used a rules-based section splitter and the SentencePiece tokenizer, with a batch size of 32, a maximum sequence length of 128, and a learning rate of 5·10^-5 for 150,000 steps.
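Since the model is published under the `emilyalsentzer/Bio_Discharge_Summary_BERT` identifier, it can be loaded with the Hugging Face `transformers` library. A minimal sketch, assuming `transformers` and `torch` are installed and the Hugging Face Hub is reachable; the input sentence is an invented example:

```python
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "emilyalsentzer/Bio_Discharge_Summary_BERT"

# Download the tokenizer and encoder weights from the Hub.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

# Encode a sample clinical sentence (hypothetical input) and run the encoder.
inputs = tokenizer("Patient was discharged in stable condition.",
                   return_tensors="pt")
outputs = model(**inputs)

# Contextual embeddings: one vector per token, BERT-base hidden size of 768.
print(outputs.last_hidden_state.shape)
```

The pre-training hyperparameters above (sequence length 128) matter here: inputs longer than the maximum sequence length the model was trained with should be truncated or chunked before encoding.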


Public
$0.0005 / sec

affde836a50e4d333f15dae9270f5a856d59540b

2023-03-03T06:41:23+00:00