Tracking Down Performance Variation Against Source Code Evolution
Little is known about how software performance evolves across software revisions. This gap matters because (i) most performance variations appear to be introduced accidentally, and (ii) addressing a performance regression is challenging, especially once functional code has been built on top of the offending change.
This paper reports an empirical study on the performance evolution of 19 applications, totaling over 19 MLOC. Running our 49 benchmarks took 52 days. By relating performance variation to source code revisions, we found that: (i) 1 out of every 3 application revisions introduces a performance variation, (ii) performance variations can be classified into 9 patterns, and (iii) the most prominent cause of performance regressions involves loops and collections. We carefully describe the patterns we identified and detail how we addressed the numerous challenges we faced to complete our experiment.
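To make the loop-and-collection finding concrete, here is a minimal sketch in Python (a hypothetical illustration, not code taken from the study): a revision that drops an auxiliary set and tests membership against a growing list instead preserves the program's behavior but silently turns a linear loop into a quadratic one.

```python
import timeit

# Hypothetical example of a loop/collection regression: both functions
# deduplicate a sequence, but their complexity differs.

def before(items):
    # Membership tests against a set: O(1) per lookup, O(n) overall.
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

def after(items):
    # Same result, but membership tests against a growing list:
    # O(n) per lookup, so the loop becomes O(n^2).
    out = []
    for x in items:
        if x not in out:
            out.append(x)
    return out

if __name__ == "__main__":
    data = list(range(5000))
    for fn in (before, after):
        t = timeit.timeit(lambda: fn(data), number=10)
        print(f"{fn.__name__}: {t:.3f}s")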
Tue 27 Oct (times in Eastern Time, US & Canada)

15:30 - 17:30

15:30 | 24m Talk | Measuring Polymorphism in Python Programs (DLS)
15:54 | 24m Talk | Tracking Down Performance Variation Against Source Code Evolution (DLS)
16:18 | 24m Talk | Server-Side Type Profiling for Optimizing Client-Side JavaScript Engines (DLS) | Madhukar Kedlaya (University of California, Santa Barbara), Behnam Robatmili (Qualcomm Research), Ben Hardekopf (UC Santa Barbara)
16:42 | 24m Talk | An Empirical Investigation of the Effects of Type Systems and Code Completion on API Usability using TypeScript and JavaScript in MS Visual Studio (DLS) | Lars Fischer (University of Duisburg-Essen, Essen, Germany), Stefan Hanenberg (University of Duisburg-Essen)
17:06 | 24m Talk | Access Control to Reflection with Object Ownership (DLS) | Camille Teruel (INRIA), Stéphane Ducasse (INRIA, France), Damien Cassou (Lille 1 University), Marcus Denker (INRIA Lille)