Qwen/Qwen2.5-72B-Instruct

Qwen2.5 is pretrained on a large-scale dataset of up to 18 trillion tokens and offers significant improvements in knowledge, coding, mathematics, and instruction following over its predecessor, Qwen2. It also provides stronger long-text generation, better understanding of structured data, more reliable generation of structured outputs, and multilingual support for over 29 languages.