KB/bert-base-swedish-cased

The National Library of Sweden has released three pre-trained language models based on BERT and ALBERT for Swedish text. The models include a BERT base model, a BERT fine-tuned for named entity recognition, and an experimental ALBERT model. They were trained on approximately 15-20 GB of text data from various sources such as books, news, government publications, Swedish Wikipedia, and internet forums.
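
As an illustration, the base model can be loaded with the Hugging Face transformers library. This is a minimal sketch, assuming transformers with a PyTorch backend is installed and that the checkpoints are fetched from the National Library of Sweden's `KB` organization on the Hugging Face Hub; the example sentence is arbitrary Swedish text.

```python
from transformers import AutoModel, AutoTokenizer, pipeline

# Load the base BERT model and its tokenizer from the KB organization.
tokenizer = AutoTokenizer.from_pretrained("KB/bert-base-swedish-cased")
model = AutoModel.from_pretrained("KB/bert-base-swedish-cased")

# Encode a Swedish sentence and run it through the model.
inputs = tokenizer("Idag släpper KB tre språkmodeller.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, 768) for a BERT base model

# The NER-fine-tuned variant can be used via the token-classification pipeline.
ner = pipeline("ner", model="KB/bert-base-swedish-cased-ner")
print(ner("Kalle bor i Stockholm."))
```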

Visibility: Public
Price: $0.0005/sec
Deployment: demoapi
Version: 81c7baa04742a30cb6732c181e678721868cb42e
Updated: 2023-03-03T06:31:41+00:00

