A 15.5B-parameter model trained on 80+ programming languages from The Stack (v1.2) dataset, using a GPT-2 architecture with multi-query attention and a Fill-in-the-Middle training objective. The model can generate code snippets given some surrounding context, but the generated code is not guaranteed to work as intended and may contain bugs or exploits. The model is licensed under the BigCode OpenRAIL-M v1 license agreement.
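As an illustration, below is a minimal sketch of prompting the model through the Hugging Face transformers library, covering both plain left-to-right completion and Fill-in-the-Middle. The checkpoint name `bigcode/starcoder` and the `<fim_prefix>`/`<fim_suffix>`/`<fim_middle>` sentinel tokens are assumptions based on the BigCode release, not details stated above.

```python
# Minimal usage sketch (assumptions: checkpoint name "bigcode/starcoder"
# and the BigCode FIM sentinel tokens; verify against the actual release).
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# 1) Plain completion: the model continues a code prefix left to right.
inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0]))

# 2) Fill-in-the-Middle: the prompt supplies a prefix and a suffix, and the
# model generates the span between them after the <fim_middle> token.
fim_prompt = (
    "<fim_prefix>def print_hello():\n    "
    "<fim_suffix>\n    return None"
    "<fim_middle>"
)
inputs = tokenizer(fim_prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0]))
```

Per the note above, any generated snippet should be reviewed before use, since outputs may contain bugs or exploits.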