LEGAL-BERT is a family of BERT models for the legal domain, designed to assist legal NLP research, computational law, and legal technology applications. The family comprises five variants, among them LEGAL-BERT-BASE, which outperformed the other models on several downstream tasks. The authors suggest possible applications such as developing question answering systems for databases, ontologies, document collections, and the web; natural language generation from databases and ontologies; text classification; information extraction and opinion mining; and machine learning in natural language processing.
When used for masked-language-model inference, the input text prompt should include exactly one [MASK] token.
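
For reference, a minimal sketch of querying a LEGAL-BERT checkpoint through the transformers fill-mask pipeline. The checkpoint identifier `nlpaueb/legal-bert-base-uncased` and the example sentence are assumptions for illustration, not taken from this card:

```python
# Minimal sketch, assuming the LEGAL-BERT-BASE checkpoint is available on the
# Hugging Face Hub under the (assumed) identifier "nlpaueb/legal-bert-base-uncased".
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="nlpaueb/legal-bert-base-uncased")

# The prompt must contain exactly one [MASK] token.
predictions = fill_mask("The agreement shall be governed by the laws of [MASK].")

# Each prediction carries the completed sequence and its probability score.
for p in predictions:
    print(f"{p['sequence']}  (score: {p['score']:.2f})")
```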