We present LEGAL-BERT, a family of BERT models for the legal domain, designed to assist legal NLP research, computational law, and legal technology applications. Our models are trained on a large corpus of legal texts and demonstrate improved performance compared to using BERT out of the box. We release our models and pre-training corpora to facilitate further research and development in this field.
The models can be queried with a fill-mask pipeline; the input text prompt must include exactly one [MASK] token.
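A minimal sketch of such a query using the Hugging Face `transformers` fill-mask pipeline, assuming the publicly released `nlpaueb/legal-bert-base-uncased` checkpoint; the prompt below is a hypothetical legal sentence chosen for illustration:

```python
# Sketch: querying LEGAL-BERT through the fill-mask pipeline.
# Assumes the nlpaueb/legal-bert-base-uncased checkpoint; swap in
# another LEGAL-BERT variant as needed.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="nlpaueb/legal-bert-base-uncased")

# The input must contain exactly one [MASK] token.
prompt = (
    "The applicant submitted that her husband was subjected to "
    "treatment amounting to [MASK]."
)

for prediction in fill_mask(prompt):
    # Each prediction holds the completed sequence and its probability score.
    print(f"{prediction['sequence']}  ({prediction['score']:.2f})")
```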