
aubmindlab/bert-base-arabertv02

An Arabic pretrained language model based on Google's BERT architecture, released in two versions, AraBERTv1 and AraBERTv2. This checkpoint uses the BERT-Base configuration and was trained on a large corpus of about 200 million sentences (roughly 8.6 billion words), including OSCAR-unshuffled, Arabic Wikipedia, and Assafir news articles. The model is available as TensorFlow 1.x checkpoints and on the Hugging Face models hub.
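The checkpoint can be loaded directly from the Hugging Face hub. A minimal sketch, assuming the transformers library is installed (pip install transformers):

    from transformers import AutoModelForMaskedLM, AutoTokenizer

    # Download the AraBERTv0.2 tokenizer and masked-LM weights from the Hugging Face hub
    model_name = "aubmindlab/bert-base-arabertv02"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForMaskedLM.from_pretrained(model_name)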


Public
$0.0005/sec

Input

A text prompt; it must include exactly one [MASK] token.


Output

where is my father? (0.09)

where is my mother? (0.08)
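These ranked completions (with their scores) can be reproduced locally with the transformers fill-mask pipeline. A minimal sketch; the Arabic prompt is an assumed example, not taken from this page:

    from transformers import pipeline

    # Fill-mask pipeline over AraBERTv0.2; the prompt must contain exactly one [MASK] token
    fill_mask = pipeline("fill-mask", model="aubmindlab/bert-base-arabertv02")

    # Hypothetical prompt (roughly "where is [MASK]?"); prints candidates with probabilities
    for candidate in fill_mask("أين [MASK] ؟"):
        print(candidate["sequence"], round(candidate["score"], 2))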



