FSE 2025
Mon 23 - Fri 27 June 2025 Trondheim, Norway
Tue 24 Jun 2025 16:00 - 16:20 at Sirius - Assessment, Review, and Peer Feedback Chair(s): Kathryn Stolee

An online judge system (OJS) automatically compiles, verifies, and executes programs submitted by users for specific problems, and provides feedback. OJSs offer a vast array of programming-related problems, enabling users to choose relevant problems and engage in self-directed learning of programming and algorithms. However, research on OJSs and learning support using them remains limited. We analyzed the submission histories of AtCoder, an OJS platform included in CodeNet, focusing on error resolution rates, the time required for resolution, and how these metrics vary with users' experience in solving problems of varying difficulty levels. We made the following findings: (1) The time limit exceeded error is the most difficult to resolve regardless of users' programming proficiency. (2) The proportions of compilation errors and time limit exceeded errors reverse before and after problems of a specific difficulty level. (3) Error resolution time decreases with users' programming proficiency regardless of error type. These findings help identify the errors for which users require support and enable competition organizers to assess the programming abilities of participants.

Tue 24 Jun

Displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

16:00 - 18:00
Assessment, Review, and Peer Feedback (Software Engineering Education) at Sirius
Chair(s): Kathryn Stolee North Carolina State University
16:00
20m
Talk
An Empirical Study of the Error Characteristics in an Online Judge System
Software Engineering Education
Shota Shimizu Ritsumeikan University, Erina Makihara Ritsumeikan University, Norihiro Yoshida Ritsumeikan University
16:20
20m
Talk
Applying Large Language Models to Enhance the Assessment of Java Programming Assignments
Software Engineering Education
Skyler Grandel Vanderbilt University, Douglas C. Schmidt Vanderbilt University, Kevin Leach Vanderbilt University
16:40
20m
Talk
Direct Automated Feedback Delivery for Student Submissions based on LLMs
Software Engineering Education
Maximilian Sölch Technical University of Munich, Felix T.J. Dietrich Technical University of Munich, Stephan Krusche Technical University of Munich
17:00
20m
Talk
Understanding Comparative Comprehension Barriers for Students during Code Review through Simplification
Software Engineering Education
Nick Case North Carolina State University, John-Paul Ore North Carolina State University, Kathryn Stolee North Carolina State University
17:20
10m
Talk
"Person is a person, a tool is a tool" - ChatGPT’s Role in Student Help-Seeking Behavior and Peer Support
Software Engineering Education
Sonja Hyrynsalmi LUT University, Micheal Tuape LUT University, Antti Knutas LUT University
17:30
10m
Talk
The Impact of Multi-Peer Feedback Summary Organization on Review and Implementation of Feedback
Software Engineering Education
Somayeh Bayat Esfandani Norwegian University of Science and Technology, Trond Aalberg Norwegian University of Science and Technology

Information for Participants
Info for room Sirius:

Sirius is located just behind the registration desk.

Facing the registration desk, its entrance is on the right.