
hfl/chinese-bert-wwm-ext

Chinese pre-trained BERT with Whole Word Masking (WWM), usable for downstream NLP tasks such as question answering, sentiment analysis, and named entity recognition. It keeps the original BERT architecture, but during pre-training it masks all the characters that make up a whole Chinese word together, rather than masking individual characters independently, which yields better word-level representations.
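The whole-word-masking idea can be sketched in plain Python. This is an illustration only, not code from this model: the `whole_word_mask` helper and the toy word segmentation are assumptions. It shows how, given a word-segmented sentence, every character of a chosen word is masked together before the character-level token sequence is fed to BERT.

```python
def whole_word_mask(words, word_ids_to_mask, mask_token="[MASK]"):
    """Emit the character-level token sequence BERT consumes,
    masking every character of each selected word (whole word masking)."""
    tokens = []
    for i, word in enumerate(words):
        if i in word_ids_to_mask:
            # Mask the whole word: one [MASK] per constituent character.
            tokens.extend([mask_token] * len(word))
        else:
            # Chinese BERT tokenizes to individual characters.
            tokens.extend(list(word))
    return tokens

# "使用语言模型" segmented (hypothetically) as ["使用", "语言", "模型"];
# masking word 1 ("语言") masks both of its characters at once.
print(whole_word_mask(["使用", "语言", "模型"], {1}))
# → ['使', '用', '[MASK]', '[MASK]', '模', '型']
```

Under character-level masking, only one of the two characters of "语言" might be masked, letting the model guess it trivially from the other; WWM removes that shortcut.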

Visibility: Public
Price: $0.0005 / sec
API: demoapi
Version: 2a995a880017c60e4683869e817130d8af548486
Date: 2023-03-03T02:39:03+00:00