CODEFUSION: A Pre-trained Diffusion Model for Code Generation
- Mukul Singh,
- José Cambronero,
- Sumit Gulwani,
- Vu Le,
- Gust Verbruggen,
- Carina Negreanu
EMNLP 2023
Imagine a developer who can only change their last line of code — how often would they have to start writing a function from scratch before it is correct? Auto-regressive models for code generation from natural language have a similar limitation: they do not easily allow reconsidering earlier tokens generated. We introduce CODEFUSION, a pre-trained diffusion code generation model that addresses this limitation by iteratively denoising a complete program conditioned on the encoded natural language. We evaluate CODEFUSION on the task of natural language to code generation for Bash, Python, and Microsoft Excel conditional formatting (CF) rules. Experiments show that CODEFUSION (75M parameters) performs on par with state-of-the-art auto-regressive systems (350M to 175B parameters) in top-1 accuracy and outperforms them in top-3 and top-5 accuracy, due to its better balance in diversity versus quality.
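To make the contrast with auto-regressive decoding concrete, here is a minimal conceptual sketch of diffusion-style generation: a complete sequence of token embeddings starts as pure noise and is refined over several denoising steps, each conditioned on an encoding of the natural language prompt. This is an illustrative toy, not the actual CODEFUSION architecture; `encode_nl`, `denoise_step`, the shapes, and the step schedule are all assumptions standing in for learned components.

```python
import hashlib
import numpy as np

SEQ_LEN, DIM, STEPS = 8, 16, 10
rng = np.random.default_rng(0)

def encode_nl(prompt: str) -> np.ndarray:
    """Stand-in for a learned NL encoder: deterministically map the
    prompt to a conditioning matrix (hypothetical, for illustration)."""
    seed = int.from_bytes(hashlib.sha256(prompt.encode()).digest()[:4], "big")
    return np.random.default_rng(seed).standard_normal((SEQ_LEN, DIM))

def denoise_step(x: np.ndarray, cond: np.ndarray, t: int) -> np.ndarray:
    """Stand-in for a learned denoiser: nudge the noisy embeddings toward
    the conditioning signal. A real model would predict this update."""
    alpha = 1.0 / (STEPS - t)  # later steps take proportionally larger moves
    return x + alpha * (cond - x)

def generate(prompt: str) -> np.ndarray:
    cond = encode_nl(prompt)
    x = rng.standard_normal((SEQ_LEN, DIM))  # start from pure noise
    for t in range(STEPS):
        # Unlike left-to-right decoding, every position in the program
        # is revised on every step, so early "tokens" can be reconsidered.
        x = denoise_step(x, cond, t)
    return x
```

The key point the sketch captures: each step rewrites the entire sequence, so no position is ever frozen the way earlier tokens are in auto-regressive generation.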
EMNLP 2023 Oral Presentation for CODEFUSION: A Pre-trained Diffusion Model for Code Generation
Check out the full paper: CODEFUSION: A Pre-trained Diffusion Model for Code Generation - Microsoft Research