We present LEGAL-BERT, a family of BERT models for the legal domain, designed to assist legal NLP research, computational law, and legal technology applications. Our models are trained on a large corpus of legal texts and demonstrate improved performance compared to using BERT out of the box. We release our models and pre-training corpora to facilitate further research and development in this field.