
smanjil/German-MedBERT

This paper presents a German Medical BERT model fine-tuned for the medical domain, achieving improved performance on the NTS-ICD-10 text classification task. The model was trained using PyTorch and the Hugging Face library on a Colab GPU, with standard parameter settings and up to 25 epochs for classification. Evaluation shows a clear improvement in micro precision, recall, and F1 score over the base German BERT model.


Input

A text prompt containing exactly one [MASK] token.

Example output (top predictions with scores)

  • where is my father? (0.09)
  • where is my mother? (0.08)

German Medical BERT

This is German BERT fine-tuned on German-language medical text. The model has been trained only on the masked language modeling (MLM) objective to improve in-domain performance; it can then be fine-tuned for a downstream task of your choice, as I did for the NTS-ICD-10 text classification task.
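As a sketch of how the model might be queried for masked-token prediction: the card requires exactly one [MASK] token per prompt, which the helper below checks. The pipeline call assumes the `transformers` library and network access to the Hub, and the German example sentence is hypothetical.

```python
def has_exactly_one_mask(prompt: str, mask_token: str = "[MASK]") -> bool:
    """The model expects prompts with exactly one [MASK] token."""
    return prompt.count(mask_token) == 1

# Hedged usage via the Hugging Face fill-mask pipeline (downloads the
# model from the Hub; the German sentence is a hypothetical example):
#
#   from transformers import pipeline
#   fill_mask = pipeline("fill-mask", model="smanjil/German-MedBERT")
#   prompt = "Der Patient leidet an [MASK]."  # "The patient suffers from [MASK]."
#   assert has_exactly_one_mask(prompt)
#   for pred in fill_mask(prompt):
#       print(f"{pred['sequence']} ({pred['score']:.2f})")
```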

Overview

Language model: bert-base-german-cased

Language: German

Fine-tuning data: medical articles (diseases, symptoms, therapies, etc.)

Eval data: NTS-ICD-10 dataset (Classification)

Infrastructure: Google Colab

Details

  • Fine-tuned using PyTorch with the Hugging Face library on a Colab GPU.
  • Standard hyperparameter settings for fine-tuning, as recommended in the original BERT paper.
  • Classification fine-tuning, however, required up to 25 epochs.

Performance (micro precision, recall, and F1 score for multilabel code classification)

| Model                           | P     | R     | F1    |
|---------------------------------|-------|-------|-------|
| German BERT                     | 86.04 | 75.82 | 80.60 |
| German MedBERT-256 (fine-tuned) | 87.41 | 77.97 | 82.42 |
| German MedBERT-512 (fine-tuned) | 87.75 | 78.26 | 82.73 |
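The micro-averaged scores above pool true positives, false positives, and false negatives across all label codes before computing precision, recall, and F1. A minimal sketch of that computation (the ICD-10 codes in the toy example are hypothetical):

```python
def micro_prf(y_true, y_pred):
    """Micro-averaged precision/recall/F1 for multilabel classification.

    y_true, y_pred: lists of sets of label codes, one set per document.
    """
    tp = sum(len(t & p) for t, p in zip(y_true, y_pred))  # correct codes
    fp = sum(len(p - t) for t, p in zip(y_true, y_pred))  # spurious codes
    fn = sum(len(t - p) for t, p in zip(y_true, y_pred))  # missed codes
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy example with hypothetical ICD-10 codes:
p, r, f = micro_prf([{"E11", "I10"}, {"J45"}], [{"E11"}, {"J45", "I10"}])
# tp=2, fp=1, fn=1, so precision = recall = F1 = 2/3
```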

Author

Manjil Shrestha: shresthamanjil21 [at] gmail.com

Get in touch: LinkedIn