
gpt2

GPT-2 is a transformer-based language model developed by OpenAI that utilizes a causal language modeling (CLM) objective. It was trained on a 40GB dataset called WebText, which consists of texts from various websites, excluding Wikipedia. Without fine-tuning, GPT-2 achieved impressive zero-shot results on several benchmark datasets such as LAMBADA, CBT-CN, CBT-NE, WikiText2, PTB, enwiki8, and text8.

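The causal language modeling objective mentioned above trains the model to predict each token from the tokens before it, minimizing the average negative log-likelihood of the true next token. A minimal, self-contained sketch of that loss on a toy vocabulary (the logits and token IDs here are made up for illustration, not GPT-2's actual outputs):

```python
import math

def clm_loss(logits, token_ids):
    """Average negative log-likelihood of each token given its predecessors.

    logits[t] holds unnormalized vocabulary scores predicted from
    tokens 0..t; token_ids[t + 1] is the target at step t.
    """
    total = 0.0
    steps = 0
    for t in range(len(token_ids) - 1):
        scores = logits[t]
        # Numerically stable softmax over the toy vocabulary.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        prob = exps[token_ids[t + 1]] / z
        total += -math.log(prob)
        steps += 1
    return total / steps

# Toy example: 3-token vocabulary, sequence [0, 2, 1].
logits = [
    [2.0, 0.5, 1.0],  # scores predicted after seeing token 0
    [0.1, 3.0, 0.2],  # scores predicted after seeing tokens 0, 2
]
loss = clm_loss(logits, [0, 2, 1])
```

A lower loss means the model assigns higher probability to the tokens that actually follow; GPT-2 was trained to minimize exactly this kind of next-token loss over WebText, which is what makes its zero-shot benchmark results possible without task-specific fine-tuning.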

