PwR: Exploring the Role of Representations in Conversational Programming

Large Language Models (LLMs) have revolutionized programming and software engineering. AI programming assistants such as GitHub Copilot X enable conversational programming, narrowing the gap between human intent and code generation. However, prior literature has identified a key challenge: there is a gap between the user's mental model of the system's understanding after a sequence of natural language utterances and the AI system's actual understanding. To address this, we introduce Programming with Representations (PwR), an approach that uses representations to convey the system's understanding back to the user in natural language. We conducted an in-lab, task-centered study with 14 users of varying programming proficiency and found that representations significantly improved understandability and instilled a sense of agency among our participants. Expert programmers used them for verification, while intermediate programmers benefited from confirmation. Natural language-based development with LLMs, coupled with representations, promises to transform software development, making it more accessible and efficient.

Publication Downloads

PwR Studio

January 19, 2024

PwR Studio converts natural language (NL) instructions into robust code. Programming with Representations (PwR, pronounced “power”) is a software development approach that relies on a domain-specific language (DSL), or representation, defined by a developer who specializes in a particular domain. This representation includes built-in guardrails that are automatically enforced throughout the software development process.
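
To make the idea concrete, below is a minimal, hypothetical sketch of what such a representation with guardrails might look like. The DSL shape (a chatbot workflow as states and transitions), the field names, and the specific guardrail checks are illustrative assumptions, not PwR Studio's actual format; the point is that the representation itself carries checks that any code generated from it inherits.

```python
# Hypothetical sketch of a "representation" with guardrails (not PwR Studio's
# actual DSL): a chatbot workflow captured as named states and transitions,
# validated before any code is generated from it.
from dataclasses import dataclass, field


@dataclass
class State:
    name: str
    prompt: str                                       # what the bot says in this state
    transitions: dict = field(default_factory=dict)   # user intent -> next state name


@dataclass
class Representation:
    start: str
    states: dict = field(default_factory=dict)        # state name -> State

    def add_state(self, state: State) -> None:
        self.states[state.name] = state

    def check_guardrails(self) -> list:
        """Guardrails built into the representation: every transition must
        target a defined state, and every state must be reachable."""
        errors = []
        for state in self.states.values():
            for intent, target in state.transitions.items():
                if target not in self.states:
                    errors.append(f"{state.name}: '{intent}' targets unknown state '{target}'")
        # Breadth-first reachability check from the start state.
        reachable = {self.start}
        frontier = [self.start]
        while frontier:
            current = self.states.get(frontier.pop())
            if current is None:
                continue
            for target in current.transitions.values():
                if target in self.states and target not in reachable:
                    reachable.add(target)
                    frontier.append(target)
        for name in self.states:
            if name not in reachable:
                errors.append(f"state '{name}' is unreachable from '{self.start}'")
        return errors


# A few NL instructions paraphrased into the representation (in PwR, the LLM
# would perform this step and the representation would be echoed back to the user).
rep = Representation(start="greet")
rep.add_state(State("greet", "Hi! Do you want to book a ticket?", {"yes": "collect_date", "no": "goodbye"}))
rep.add_state(State("collect_date", "Which date would you like?", {"date_given": "confirm"}))
rep.add_state(State("confirm", "Shall I confirm the booking?", {"yes": "goodbye"}))
rep.add_state(State("goodbye", "Thanks, goodbye!"))

print(rep.check_guardrails() or "representation passes all guardrails")
```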