How to Be Maniacal About User Experience with the Help of Performance Tests

Hike is a user-centric, mobile-first product, where user experience is the top priority. Hike started as an instant messaging app, but over the course of the last four years it has evolved into a messaging platform that offers a bouquet of features and services on top of messaging.

As part of the quality team, our primary goal is to make sure the app performs well for a 100+ million user base whose average smartphone costs $120. As a product, we want to solve a lot of user problems, but within the constraint of an app running on limited device resources.

So the question arises: how do we achieve this goal?

Now, every night, every sprint (2 weeks), and every public release (monthly), we run our automated perf tests. We measure our app's performance across several app-specific use cases in 4 key areas:

  • CPU
  • Memory
  • Battery
  • Network (data consumption)
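The proposal doesn't describe Hike's exact tooling, but on Android these metrics are commonly pulled from `adb shell dumpsys`. A minimal sketch of a memory collector, assuming `adb` is on the PATH and using `com.bsb.hike` only as an illustrative package name, might parse the TOTAL PSS line from `dumpsys meminfo`:

```python
import re
import subprocess
from typing import Optional

def total_pss_kb(package: str, dumpsys_output: Optional[str] = None) -> int:
    """Return the app's TOTAL PSS in kB, parsed from `dumpsys meminfo <package>`.

    If dumpsys_output is None, shell out to adb (a device must be attached);
    otherwise parse the given text, which makes the parser easy to unit-test.
    """
    if dumpsys_output is None:
        dumpsys_output = subprocess.run(
            ["adb", "shell", "dumpsys", "meminfo", package],
            capture_output=True, text=True, check=True,
        ).stdout
    # The summary section contains a line like "  TOTAL    51084  ..."
    match = re.search(r"TOTAL\s+(\d+)", dumpsys_output)
    if not match:
        raise ValueError("TOTAL PSS line not found in dumpsys output")
    return int(match.group(1))
```

Sampling this value at a fixed interval during a scripted use case yields the per-scenario memory curve; CPU can be collected the same way from `dumpsys cpuinfo` or `top`.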

Hike's CPU Utilization

We also benchmark the following scenarios for app latency:

  • App launch time upon Force Stop
  • App launch time upon Force Kill
  • App's busiest screen opening time
  • Scrolling latency in different parts of the app
  • Contact loading time in Compose screen
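For launch-time scenarios like the ones above, a common technique (again a sketch, not necessarily Hike's implementation) is to launch the activity with `adb shell am start -W`, which blocks until the launch completes and reports `ThisTime` / `TotalTime` / `WaitTime`. The package and activity names below are illustrative assumptions:

```python
import re
import subprocess
from typing import Optional

def launch_time_ms(package: str, activity: str,
                   am_output: Optional[str] = None) -> int:
    """Launch latency in ms, parsed from `adb shell am start -W`.

    TotalTime covers the full launch, so running this after a force-stop
    measures a cold start, and after a plain back-out a warm start.
    """
    if am_output is None:
        am_output = subprocess.run(
            ["adb", "shell", "am", "start", "-W", "-n",
             f"{package}/{activity}"],
            capture_output=True, text=True, check=True,
        ).stdout
    match = re.search(r"TotalTime:\s*(\d+)", am_output)
    if not match:
        raise ValueError("TotalTime not found; did the activity launch?")
    return int(match.group(1))
```

Forcing a cold start first (`adb shell am force-stop <package>`) and averaging several runs reduces noise from the device's scheduler and caches.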

Hike App Benchmark

Embracing the fail-fast, fail-early principle, we now provide the necessary feedback to the developers as quickly as possible to avoid downstream delays.
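The fast-feedback loop described above implies some form of regression gate. One common pattern, sketched here with assumed metric names and a 10% tolerance (the proposal does not state Hike's thresholds), is to compare each nightly run against a stored baseline and fail the build on any metric that regresses beyond tolerance:

```python
from typing import Dict, List

def failed_metrics(baseline: Dict[str, float],
                   current: Dict[str, float],
                   tolerance_pct: float = 10.0) -> List[str]:
    """Names of metrics that regressed beyond tolerance.

    All metrics are assumed to be "higher is worse" (latency ms, PSS kB,
    mAh drained, bytes transferred); a metric missing from `current`
    is treated as unchanged.
    """
    limit = 1.0 + tolerance_pct / 100.0
    return [
        name for name, base in baseline.items()
        if current.get(name, base) > base * limit
    ]
```

A CI step can then call this with the night's numbers and exit non-zero when the list is non-empty, so the offending commit range is flagged the next morning rather than at release time.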

 
 

Outline/structure of the Session

  1. Driving automated tests to measure performance on mobile.
  2. How Hike ensures performance.
  3. Tool selection.
  4. How and what to benchmark in performance.
  5. App-specific parameters to benchmark.
  6. Challenges with the utilization of mobile resources.
  7. Next steps in our journey: making perf tests part of CI builds, and running perf tests in the field with the instrumentation in place.
  8. Advantages and disadvantages of the current perf suite, and next steps.

Learning Outcome

  1. Benchmarking a mobile app on parameters such as CPU utilization, memory utilization, data consumption, and battery usage.
  2. Benchmarking app-specific tests.
  3. Automating perf tests.
  4. Making perf tests more realistic and evolved.

Target Audience

Performance Testers, QA Managers, QA Leads, Tool Developers, Data Driven Testers

Submitted 8 months ago

Comments

  • By Jutta Eckstein  ~  8 months ago

    Thanks for your proposal. Actually I wonder how it fits into this theme - maybe it would be a better fit for lean product discovery?

    Jutta