FLAN-T5 is a family of instruction-finetuned T5 models, with sizes ranging up to 11B parameters. The models are finetuned on more than 1,000 tasks spanning many domains and multiple languages. FLAN-T5 outperforms its T5 predecessors of comparable size on a wide range of NLP benchmarks while remaining computationally efficient.
inputs: the text to generate from
max_length: maximum length of the generated text (default: 200; 1 ≤ max_length ≤ 2048)
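The parameters above can be exercised locally with the Hugging Face transformers library. This is a minimal sketch, not an official snippet: the model name "google/flan-t5-small" and the `validate_max_length` helper are illustrative assumptions, and only the documented bound on max_length comes from the text above.

```python
def validate_max_length(max_length=200):
    """Check the documented bound: 1 <= max_length <= 2048 (default 200).

    This helper is an illustrative assumption, not part of any official API.
    """
    if not 1 <= max_length <= 2048:
        raise ValueError("max_length must be between 1 and 2048")
    return max_length


if __name__ == "__main__":
    # Assumes the transformers library is installed and the (hypothetical
    # choice of) checkpoint "google/flan-t5-small" can be downloaded.
    from transformers import pipeline

    generator = pipeline("text2text-generation", model="google/flan-t5-small")
    out = generator(
        "Translate to German: How old are you?",
        max_length=validate_max_length(200),
    )
    print(out[0]["generated_text"])
```

Passing a max_length outside the documented range raises a ValueError before the model is ever invoked, which mirrors the constraint stated above.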