
Qwen/Qwen2.5-Coder-7B

Qwen2.5-Coder-7B is a code-specific large language model with 7.61 billion parameters, designed for code generation, code reasoning, and code fixing. It covers 92 programming languages and was trained on 5.5 trillion tokens, including source code, text-code grounding data, and synthetic data.

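As a base code model, Qwen2.5-Coder supports fill-in-the-middle (FIM) completion alongside ordinary left-to-right generation. A minimal sketch of assembling a FIM prompt is below; the special-token names follow the Qwen2.5-Coder release notes, so verify them against the official tokenizer config before use:

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt.

    Token names (<|fim_prefix|> etc.) are assumed from the
    Qwen2.5-Coder release; the model is expected to generate
    the text that belongs between prefix and suffix.
    """
    return (
        "<|fim_prefix|>" + prefix
        + "<|fim_suffix|>" + suffix
        + "<|fim_middle|>"
    )

# Example: ask the model to fill in the body of a function.
prompt = build_fim_prompt(
    "def quicksort(arr):\n    if len(arr) <= 1:\n        return arr\n",
    "    return quicksort(left) + [pivot] + quicksort(right)\n",
)
print(prompt)
```

The assembled string would then be sent to the model as a raw completion prompt (not a chat prompt); the model's output is the code that fits between the prefix and suffix.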

Visibility: Public
Price: $0.055 / Mtoken
Context length: 32,768 tokens
Links: Project · Paper · License
Tabs: Demo · API

Revision: 097b213c52760d22753af1aa5cbdba94b5c99506

Updated: 2024-09-20T21:37:50+00:00