
smanjil/German-MedBERT

This paper presents a fine-tuned German Medical BERT model for the medical domain, achieving improved performance on the NTS-ICD-10 text classification task. The model was trained with PyTorch and the Hugging Face library on a Colab GPU, using standard parameter settings and up to 25 epochs for classification. Evaluation results show significant improvement in micro precision, recall, and F1 score compared to the base German BERT model.

Public
$0.0005 / sec

HTTP/cURL API

You can use cURL or any other HTTP client to run inferences:

curl -X POST \
    -d '{"input": "Where is my [MASK]?"}'  \
    -H "Authorization: bearer $DEEPINFRA_TOKEN"  \
    -H 'Content-Type: application/json'  \
    'https://api.deepinfra.com/v1/inference/smanjil/German-MedBERT'

which will give you back something similar to:

{
  "results": [
    {
      "sequence": "where is my father?",
      "score": 0.08898820728063583,
      "token": 2269,
      "token_str": "father"
    },
    {
      "sequence": "where is my mother?",
      "score": 0.07864926755428314,
      "token": 2388,
      "token_str": "mother"
    }
  ],
  "request_id": null,
  "inference_status": {
    "status": "unknown",
    "runtime_ms": 0,
    "cost": 0.0,
    "tokens_generated": 0,
    "tokens_input": 0
  }
}
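
You can make the same request from Python with any HTTP client. The sketch below uses the requests package (just one option, not required by the API) and assumes DEEPINFRA_TOKEN is set in your environment, mirroring the cURL call above:

import os
import requests

# Fill-mask inference endpoint for this model (same URL as the cURL example).
API_URL = "https://api.deepinfra.com/v1/inference/smanjil/German-MedBERT"

resp = requests.post(
    API_URL,
    headers={"Authorization": f"bearer {os.environ['DEEPINFRA_TOKEN']}"},
    json={"input": "Where is my [MASK]?"},
)
resp.raise_for_status()

# Print the predicted tokens for the [MASK] position and their scores.
for result in resp.json()["results"]:
    print(result["token_str"], result["score"])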

Input fields

input (string)

Text prompt; it should include exactly one [MASK] token.


webhook (file)

The webhook to call when inference is done. By default, you will get the output in the response of your inference request.
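
For example, assuming the webhook is passed as a callback URL, a request body combining both fields could look like this (the URL below is a hypothetical placeholder):

{
  "input": "Where is my [MASK]?",
  "webhook": "https://example.com/inference-callback"
}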
