Wermelinger, Michel (2023). DOI: https://doi.org/10.1145/3545945.3569830
Abstract
The teaching and assessment of introductory programming involves writing code that solves a problem described by text. Previous research found that OpenAI's Codex, a natural language machine learning model trained on billions of lines of code, performs well on many programming problems, often generating correct and readable Python code. GitHub's version of Codex, Copilot, is freely available to students. This raises pedagogic and academic integrity concerns. Educators need to know what Copilot is capable of, in order to adapt their teaching to AI-powered programming assistants. Previous research evaluated the most performant Codex model quantitatively, e.g. how many problems have at least one correct suggestion that passes all tests. Here I evaluate Copilot instead, to see if and how it differs from Codex, and look qualitatively at the generated suggestions, to understand the limitations of Copilot. I also report on the experience of using Copilot for other activities asked of students in programming courses: explaining code, generating tests and fixing bugs. The paper concludes with a discussion of the implications of the observed capabilities for the teaching of programming.
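To make the quantitative evaluation criterion mentioned above concrete, the sketch below illustrates what "a correct suggestion that passes all tests" means in practice: a short textbook-style problem statement is given as a prompt, and a generated solution counts as correct only if it satisfies the accompanying tests. The problem, function name and tests are illustrative assumptions, not examples taken from the paper.

```python
# Hypothetical illustration of the pass-all-tests criterion.
# The docstring plays the role of the problem description given to Copilot;
# the function body stands in for a generated suggestion.

def count_vowels(text: str) -> int:
    """Return how many characters of `text` are vowels (a, e, i, o, u),
    ignoring case."""
    return sum(1 for ch in text.lower() if ch in "aeiou")

# A suggestion is judged correct only if all of these tests pass.
assert count_vowels("") == 0
assert count_vowels("Python") == 1
assert count_vowels("AEIOU aeiou") == 10
print("all tests passed")
```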
About
- Item ORO ID: 86438
- Item Type: Conference or Workshop Item
- ISBN: 1-4503-9431-0, 978-1-4503-9431-4
- Keywords: code generation; test generation; code explanation; programming exercises; programming patterns; novice programming; introductory programming; academic integrity; OpenAI Codex
- Academic Unit or School: Faculty of Science, Technology, Engineering and Mathematics (STEM) > Computing and Communications; Faculty of Science, Technology, Engineering and Mathematics (STEM)
- Copyright Holders: © 2023 Copyright held by the owner/author(s).
- Related URLs:
  - https://sigcse2023.sigcse.org/ (Other)
  - https://doi.org/10.1145/3545945 (Publication)
- Depositing User: Michel Wermelinger