We present LEGAL-BERT, a family of BERT models for the legal domain, designed to assist legal NLP research, computational law, and legal technology applications. Our models are trained on a large corpus of legal texts and demonstrate improved performance compared to using BERT out of the box. We release our models and pre-training corpora to facilitate further research and development in this field.