uer/albert-base-chinese-cluecorpussmall

We present a Chinese version of ALBERT, a lite BERT-based language model, pre-trained on the CLUECorpusSmall corpus using the UER-py toolkit. The model achieves state-of-the-art results on various Chinese NLP tasks. Training proceeds in two stages: first with a sequence length of 128, then with a sequence length of 512.
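A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under the repository name in the title and that the `transformers` library is installed. UER-py ALBERT checkpoints on the Hub are conventionally loaded with `BertTokenizer` (character-level Chinese vocabulary) alongside `AlbertForMaskedLM`; the function name `predict_masked` is illustrative, not part of any published API:

```python
MODEL_ID = "uer/albert-base-chinese-cluecorpussmall"

def predict_masked(text: str):
    """Fill a [MASK] token in `text`; downloads weights on first call."""
    # Imported inside the function so the module can be loaded without
    # transformers installed or network access available.
    from transformers import AlbertForMaskedLM, BertTokenizer, FillMaskPipeline

    tokenizer = BertTokenizer.from_pretrained(MODEL_ID)
    model = AlbertForMaskedLM.from_pretrained(MODEL_ID)
    unmasker = FillMaskPipeline(model=model, tokenizer=tokenizer)
    return unmasker(text)

if __name__ == "__main__":
    # Example: predict the masked character in "The capital of China is [MASK]jing."
    for candidate in predict_masked("中国的首都是[MASK]京。"):
        print(candidate["token_str"], candidate["score"])
```

The pairing of a BERT tokenizer with an ALBERT model reflects how UER-py conversions are usually distributed; outputs are a ranked list of candidate tokens with scores.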

Visibility: Public
Price: $0.0005 / sec
Owner: demoapi
Revision: 8634d166f98a6c337bdc6e9fba197df932605cdf
Updated: 2023-03-03T22:26:30+00:00