
google/flan-ul2

Flan-UL2 is an encoder-decoder model based on the T5 architecture. It uses the same configuration as the UL2 model released the previous year. It was fine-tuned using the "Flan" prompt tuning and dataset collection. The original UL2 model was trained with a receptive field of only 512 tokens, which made it non-ideal for N-shot prompting where N is large.
Public
$0.0005/sec

Input

text to generate from

maximum length of the generated text (Default: 200, 1 ≤ max_length ≤ 2048)
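As a sketch, the two inputs above could be packaged into a request payload like this. The field names `input` and `max_length` are assumptions inferred from the parameter descriptions on this page, not a confirmed API schema:

```python
def build_payload(text, max_length=200):
    """Build a request body for the model's text-generation endpoint.

    Field names are assumptions based on this page's parameter
    descriptions; check the provider's API reference before use.
    """
    # The page states: default 200, and 1 <= max_length <= 2048.
    if not 1 <= max_length <= 2048:
        raise ValueError("max_length must be between 1 and 2048")
    return {"input": text, "max_length": max_length}

payload = build_payload("Write a haiku about autumn.", max_length=256)
```

The payload can then be sent as JSON in an authenticated POST request to the model's inference endpoint.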


Output

Haiku is a Japanese poem that is around 108 characters long. A tweet is ...
© 2023 Deep Infra. All rights reserved.
