Conference Paper: ICST'17, March 2017
Developers of performance-sensitive production software face a dilemma: performance regression tests are too costly to run at each commit, but skipping them delays and complicates the detection of performance regressions. Ideally, developers would have a system that predicts whether a given commit is likely to impact performance and suggests which tests to run to detect a potential regression. Prior approaches to this problem require static or dynamic analyses that limit their generality and applicability. This paper presents an approach that is simple and general, and that works surprisingly well for real applications.
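As a rough sketch of the idea the abstract describes (not the paper's actual algorithm), a commit-level test selector can be as simple as intersecting the functions a commit touches with the functions each performance test is known to execute, e.g., collected once from a profiling run. All names below (select_tests, changed_functions, coverage) are hypothetical:

def select_tests(changed_functions, coverage):
    # Select every performance test that executes at least one function
    # the commit modifies; these are the tests the change could affect.
    return [test for test, executed in coverage.items()
            if executed & changed_functions]

if __name__ == "__main__":
    # Hypothetical per-test coverage, e.g., gathered once by profiling.
    coverage = {
        "bench_parse": {"parse", "tokenize"},
        "bench_render": {"render", "layout"},
    }
    changed_functions = {"tokenize", "logging_setup"}
    print(select_tests(changed_functions, coverage))  # -> ['bench_parse']

The shape of this sketch mirrors the trade-off in the abstract: a cheap per-commit prediction step stands in for running the full performance suite at every commit, at the risk of missing regressions the predictor does not cover.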
@INPROCEEDINGS{7927967,
  author={A. B. D. Oliveira and S. Fischmeister and A. Diwan and M. Hauswirth and P. F. Sweeney},
  booktitle={2017 IEEE International Conference on Software Testing, Verification and Validation (ICST)},
  title={Perphecy: Performance Regression Test Selection Made Simple but Effective},
  year={2017},
  month={March},
  pages={103-113},
  doi={10.1109/ICST.2017.17},
  keywords={program diagnostics; program testing; dynamic analysis; performance regression test selection; performance-sensitive production software; Perphecy; static analysis; benchmark testing},
}