The National Library of Sweden has released three pre-trained language models based on BERT and ALBERT for Swedish text. The models include a BERT base model, a BERT model fine-tuned for named entity recognition, and an experimental ALBERT model. They were trained on approximately 15-20 GB of text data from various sources such as books, news, government publications, Swedish Wikipedia, and internet forums.
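For readers who want to try the models, here is a minimal sketch of loading them through the Hugging Face transformers library. The model identifiers used below (the KB/... hub names) are assumptions based on the National Library of Sweden's public Hugging Face namespace and may differ from the actual release names.

```python
# Minimal sketch: loading the Swedish BERT models with Hugging Face transformers.
# The "KB/..." model ids are assumptions, not confirmed by the announcement.
from transformers import AutoModel, AutoTokenizer, pipeline

# Base Swedish BERT model (assumed hub id).
tokenizer = AutoTokenizer.from_pretrained("KB/bert-base-swedish-cased")
model = AutoModel.from_pretrained("KB/bert-base-swedish-cased")

# The NER fine-tuned variant can be run through the token-classification
# pipeline (assumed hub id).
ner = pipeline(
    "ner",
    model="KB/bert-base-swedish-cased-ner",
    tokenizer="KB/bert-base-swedish-cased-ner",
)
print(ner("Kungliga biblioteket ligger i Stockholm."))
```

The pipeline call returns a list of detected entities with their labels and scores, which is the typical way to consume a token-classification model without writing a custom inference loop.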