Characterizing and Detecting Performance Bugs for Smartphone Applications

Author(s): Yepang Liu, Chang Xu, Shing-Chi Cheung
Venue: International Conference on Software Engineering 2014
Date: 2014

Type of Experiment: Survey/Multi-Case Study
Sample Size: 8
Class/Experience Level: Professional
Participant Selection: The 8 studied programs were drawn from large open source projects and were chosen for how clearly performance bugs are labeled in their issue trackers.
Data Collection Method: Observation, Code Metric, Project Artifact(s)

Quality
4

Characterizing and Detecting Performance Bugs for Smartphone Applications attempts to answer the following questions about mobile application development:

  • What are the common software bugs that cause performance issues in mobile applications?
  • How do performance bugs manifest themselves?
  • Are performance bugs more difficult to debug and fix than non-performance bugs? What information or tools can help with this?
  • Are there common causes of performance bugs? Can we distill common bug patterns to facilitate performance analysis and bug detection?
  • Can we leverage our empirical findings, e.g., common bug patterns, to help developers identify performance optimization opportunities in real-world Android applications?

Liu, Xu, and Cheung analyze a selection of 8 open source Android applications that met their criteria of high popularity, high bug-report visibility, and well-maintained records of performance-related issues. From these 8 applications, they chose to focus on 70 performance bugs. They find that the large majority of performance issues (94.7%) fall into three categories: GUI lagging, memory bloat, and power drain. The authors also find that while two thirds of the performance bugs manifest under small-scale user input, the remaining third require complicated sequences of user interactions, making them difficult to expose through automated or system testing.

The authors suggest that effective performance testing for mobile applications needs: (1) new coverage criteria to assess testing adequacy, (2) effective techniques for generating the user-interaction sequences that manifest performance bugs, and (3) automated oracles to judge performance degradation over time.

In addition to studying the categorization and manifestation of bugs, the authors also explore how difficult performance bugs are to fix. Liu, Xu, and Cheung find that performance bugs remain unfixed for significantly more days than non-performance bugs. They also find that performance bugs generate more discussion, and that patches fixing performance bugs are larger than those fixing non-performance bugs, suggesting that performance bugs are harder to fix than normal bugs.

The authors collected comments and analyzed how performance bugs were closed. They find that the information provided by profilers and performance measurement tools is more helpful for debugging than traditional information such as stack traces. Given how useful runtime profiles are during debugging, the authors suggest that existing profilers could be improved to automatically analyze, aggregate, simplify, and visualize the profiles they collect.

Liu, Xu, and Cheung also analyze the root causes of performance bugs. They find that the most common causes are: (1) lengthy operations in the main thread, (2) wasted computation for invisible GUI elements, and (3) frequently invoked heavy-weight callbacks. The first pattern is illustrated in the sketch below.

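To make the first root cause concrete, here is a minimal sketch, not taken from the paper, of a lengthy operation blocking the Android main thread and the standard fix of moving the work onto a background thread with AsyncTask (the era-appropriate API). The ReportActivity class and its downloadReport method are hypothetical stand-ins for an app performing slow work such as network I/O.

    import android.app.Activity;
    import android.os.AsyncTask;
    import android.view.View;
    import android.widget.TextView;

    public class ReportActivity extends Activity {
        private TextView textView;

        // Anti-pattern: blocking work on the main (UI) thread freezes the
        // GUI until the call returns, which users perceive as lagging.
        public void loadReportBlocking(View clicked) {
            textView.setText(downloadReport());
        }

        // Fix: run the lengthy operation on a background thread; only the
        // lightweight UI update is posted back to the main thread.
        public void loadReportAsync(View clicked) {
            new AsyncTask<Void, Void, String>() {
                @Override
                protected String doInBackground(Void... unused) {
                    return downloadReport(); // runs off the main thread
                }

                @Override
                protected void onPostExecute(String result) {
                    textView.setText(result); // runs on the main thread
                }
            }.execute();
        }

        // Hypothetical slow operation (network, database, file parsing).
        private String downloadReport() {
            try { Thread.sleep(2000); } catch (InterruptedException ignored) {}
            return "report";
        }
    }
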
Finally, Liu, Xu, and Cheung apply their findings to detect performance bugs in current Android applications. They encoded two of the most common bug patterns, lengthy operations in the main thread and violations of the View Holder pattern (see the sketch below), into a static analyzer, PerfChecker, that automatically detects performance bugs in Android applications. Their tool correctly identified 68 performance issues in real-world Android applications, with 58 false positives.

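For the second pattern, here is a minimal sketch, again not taken from the paper, of a list adapter that follows the View Holder pattern; the violation the authors detect amounts to re-running the findViewById lookups on every getView call instead of caching them. The ItemAdapter class and its use of a built-in row layout are illustrative assumptions.

    import android.content.Context;
    import android.view.LayoutInflater;
    import android.view.View;
    import android.view.ViewGroup;
    import android.widget.ArrayAdapter;
    import android.widget.TextView;
    import java.util.List;

    public class ItemAdapter extends ArrayAdapter<String> {

        // Caches child-view lookups so they happen once per inflated row
        // rather than on every scroll-driven getView() call.
        private static class ViewHolder {
            TextView title;
        }

        public ItemAdapter(Context context, List<String> items) {
            super(context, android.R.layout.simple_list_item_1, items);
        }

        @Override
        public View getView(int position, View convertView, ViewGroup parent) {
            ViewHolder holder;
            if (convertView == null) {
                // Inflate a new row only when no recycled row is available.
                convertView = LayoutInflater.from(getContext())
                        .inflate(android.R.layout.simple_list_item_1, parent, false);
                holder = new ViewHolder();
                holder.title = (TextView) convertView.findViewById(android.R.id.text1);
                convertView.setTag(holder); // remember the lookup result
            } else {
                // Violating the pattern would mean calling findViewById()
                // again here; instead, reuse the cached holder.
                holder = (ViewHolder) convertView.getTag();
            }
            holder.title.setText(getItem(position));
            return convertView;
        }
    }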