Thu 16 Mar 2023 15:45 - 16:10 at 701B - Auto-Grading Chair(s): Zijian Ding

Traditionally, exams in introductory programming courses have been multiple choice or "paper-based" coding exams in which students handwrite code. This does not reflect how students typically write and are assessed on programming assignments, where they write code on a computer and can validate and assess it using an auto-grading system.

Executable exams are exams in which students are given programming problems, write code on a computer within a development environment, and have their submissions digitally validated or executed. This format is far more consistent with how students engage in programming assignments.

This paper explores the executable exam format and attempts to gauge the state of the practice and how prevalent it is. First, we formulate a taxonomy of characteristics of executable exams, identifying common aspects and various levels of flexibility. We then present two case studies: one in which executable exams have been used for nearly 10 years and another in which they have recently been adopted. Finally, we provide results from faculty surveys offering evidence that, though not standard practice, the use of executable exams is not uncommon and appears to be on the rise.

Thu 16 Mar

Displayed time zone: Eastern Time (US & Canada)

15:45 - 17:00
Auto-Grading (Papers) at 701B
Chair(s): Zijian Ding University of Maryland, College Park
15:45
25m
Paper
Executable Exams: Taxonomy, Implementation and Prospects (In-Person, Global)
Papers
Chris Bourke University of Nebraska-Lincoln, Yael Erez Technion - Israel Institute of Technology, Orit Hazzan Technion - Israel Institute of Technology
DOI
16:10
25m
Paper
Studying The Impact Of Auto-Graders Giving Immediate Feedback In Programming Assignments (In-Person)
Papers
Joydeep Mitra Stony Brook University
DOI
16:35
25m
Paper
The Programming Exercise Markup Language: Towards Reducing the Effort Needed to Use Automated Grading Tools (In-Person)
Papers
Divyansh Mishra Virginia Tech, Stephen Edwards Virginia Tech
DOI