ICSE 2023
Sun 14 - Sat 20 May 2023 Melbourne, Australia

Linting is the process of using static analysis tools to scan source code and detect coding patterns that are considered bad programming practice. Given their importance, linters have been introduced in classrooms to teach students how to detect, and potentially avoid, these code anti-patterns. However, little is known about their effectiveness in raising students' awareness, since linters tend to generate a large number of false positives.

To understand and raise awareness of coding issues that violate coding standards, in this paper we reflect on our experience teaching the use of a static analysis tool and evaluating its effect on code review with respect to software quality, from the point of view of educators, in the context of Master's students in SE/CS analyzing Java-based software projects. The paper reports on a classroom experiment spanning 3 academic semesters and 65 submissions, in which students carried out code review against 690 PMD rules.

The results of our quantitative and qualitative analysis show that the presence of certain PMD quality issues influences whether issues are accepted or rejected, that issues in the design and best-practices categories take longer to resolve, and that students acknowledge the potential of using static analysis tools during code review. Through such an experiment, code review can become a vital part of the computing curriculum. We envision our findings enabling educators to support students with code review strategies, raising students' awareness of static analysis tools and scaffolding their coding skills.
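To make concrete the kind of anti-pattern such a linter flags, here is a minimal Java sketch (the class and method names are invented for illustration, not taken from the study) showing an empty catch block, the sort of issue PMD reports through rules such as EmptyCatchBlock, alongside a version a reviewer would accept:

```java
// Illustrative only: a hypothetical snippet of the kind PMD would flag.
public class LintExample {

    // Anti-pattern: the catch block silently swallows the exception,
    // so a malformed input disappears without a trace.
    static int parsePortBad(String s) {
        try {
            return Integer.parseInt(s);
        } catch (NumberFormatException e) {
            // empty catch block -- typically flagged by a linter
        }
        return -1;
    }

    // Revised version: the failure is made visible (a log line plus an
    // explicit fallback), so the anti-pattern is gone.
    static int parsePortGood(String s) {
        try {
            return Integer.parseInt(s);
        } catch (NumberFormatException e) {
            System.err.println("invalid port '" + s + "', using default");
            return -1;
        }
    }

    public static void main(String[] args) {
        System.out.println(parsePortBad("8080"));
        System.out.println(parsePortGood("oops"));
    }
}
```

Both methods behave the same at runtime for valid input; the point of the linter warning is that the first version hides the error path from readers and reviewers.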