Journal
2017 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC)
Pages 101-105
Publisher
IEEE
Funding
- Austrian Science Fund (FWF) [I2144]
- German Research Foundation (DFG) [JA 2095/4-1]
- National Science Foundation, Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations [1645136]
- National Science Foundation, Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations [1564162]
Peer code reviews are important for giving and receiving peer feedback, but the code review process is time consuming. Static analysis tools can help reduce reviewer effort by catching common mistakes prior to peer code review. Ideally, contributors would run static analysis tools before submitting a pull request, so that common mistakes are addressed before a reviewer is invoked. To explore the potential efficiency gains for peer reviewers, we examine the overlap between reviewer comments on pull requests and warnings from the PMD static analysis tool. In an empirical study of 274 comments from 92 pull requests on GitHub, we observed that PMD warnings overlapped with nearly 16% of the reviewer comments, indicating a time benefit to the reviewer had static analysis been applied prior to pull request submission. Using the non-overlapping set of comments, we identify four additional rules that, if implemented, could further reduce reviewer effort.
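As an illustration of the kind of overlap measurement the abstract describes, the sketch below estimates the fraction of reviewer comments that coincide with a static-analysis warning. This is a minimal hypothetical example, assuming comments and warnings can both be keyed by (file, line); the authors' actual matching procedure and data model may differ.

```python
def overlap_fraction(reviewer_comments, tool_warnings):
    """Fraction of reviewer comments that coincide with a tool warning
    at the same (file, line) location."""
    warned = {(w["file"], w["line"]) for w in tool_warnings}
    hits = sum(1 for c in reviewer_comments
               if (c["file"], c["line"]) in warned)
    return hits / len(reviewer_comments) if reviewer_comments else 0.0

# Hypothetical data: two reviewer comments, one of which lands on a
# line that a PMD-style warning also flags.
comments = [
    {"file": "Foo.java", "line": 10},   # also flagged by the tool
    {"file": "Foo.java", "line": 42},   # reviewer-only feedback
]
warnings = [{"file": "Foo.java", "line": 10, "rule": "EmptyCatchBlock"}]

print(overlap_fraction(comments, warnings))  # → 0.5
```

Under this matching scheme, a comment counts as "overlapping" only when a warning exists at exactly the same location; looser schemes (same file, or same diff hunk) would yield higher overlap estimates.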