
google/flan-t5-small

FLAN-T5 is a family of instruction-finetuned T5 models ranging from 80M (small) up to 11B (XXL) parameters. They are finetuned on more than 1,000 additional tasks covering many domains and languages. FLAN-T5 outperforms its predecessor T5 on a wide range of NLP tasks while remaining computationally efficient.


Public
$0.0005/sec

Input

- text to generate from
- maximum length of the generated text (default: 200; 1 ≤ max_length ≤ 2048)
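The documented bounds above can be enforced client-side before sending a request. The sketch below builds a JSON request body; note that the field name `input` is an assumption inferred from the parameter description (only `max_length` is named in the docs), so check the API reference before relying on it.

```python
import json


def build_payload(text: str, max_length: int = 200) -> str:
    """Build a JSON request body for the text-generation endpoint.

    Enforces the documented constraint 1 <= max_length <= 2048
    (default 200). The "input" field name is an assumption, not
    taken from the official API docs.
    """
    if not (1 <= max_length <= 2048):
        raise ValueError("max_length must be between 1 and 2048")
    return json.dumps({"input": text, "max_length": max_length})
```

Validating locally avoids a round trip to the server just to learn that a parameter was out of range.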


Output

Haiku is a Japanese poem that is around 108 characters long. A tweet is ...

© 2023 Deep Infra. All rights reserved.
