

Mobile analytics: The smart way to raise your app's star rating

Christopher Null, Freelance writer

What is the measure of an app? Is it the number of downloads it receives? Whether it earns a “featured” spot on a mobile app store? The size of its marketing budget?

For most observers, the most telling metric of a mobile app’s “success” is its star rating, the closest thing to a quantitative measure of user sentiment that the industry has yet devised.

What goes into the star rating of an application? That’s tough to say, and it differs for each app on the market. For many, it might be compatibility and stability. (Crash-prone apps are regularly saddled with one-star reviews and are quickly uninstalled.) For others, performance or graphical fidelity may be key. Whatever the key metric or metrics, it all boils down to user experience.

So how does a developer analyze and monitor the user experience of an app before it becomes laden with negative reviews? That answer is increasingly found through two simple strategies: properly testing your app before and during its release, and measuring its performance through analytics once it’s in the wild so it can be continuously refined.

How do you test an app?

It feels almost obvious to say it: Testing your app is a key part of the development and release process. Of course, no one would release an app without testing it. That’s a given. But how does one properly “test” an app today?

Do you load the app on your phone and recruit a few friends with compatible devices to see if it works? Or perhaps you think a simple emulator will give you all the answers you need.

In today’s world, neither of these approaches is workable. In the Android universe, rampant fragmentation all but assures that your app will not work on some subset of handheld devices. In a recent survey, OpenSignal detected over 24,000 distinct Android devices, representing a vast range of manufacturers, operating systems, carriers, display sizes, and integrated sensors. Your app may work perfectly on an old Samsung phone but crash hopelessly on a new LG, or vice versa. Even other apps installed on a handset can impact whether your app works.

The move to interactive testing

Issues such as physical location, localization settings (including language), and other device preferences can affect an app’s stability and performance. “Keeping up with just the top ten devices can easily be a full-time job,” says Julian Harty, co-author of The Mobile Analytics Playbook.

Clearly it has become impossible to test every app on every phone, and that fact led to the rise of automated testing. Automated testing tends to rely on an emulator to mimic various devices in the real world. It’s a fine first step, and one that can be useful during the development process, but it’s still a tool with limited utility.
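To make that concrete, here is a minimal sketch of the kind of automated UI test that typically runs against an emulator, written with Android’s Espresso framework. (MainActivity and the view IDs are hypothetical; any real app would substitute its own.)

```kotlin
import androidx.test.espresso.Espresso.onView
import androidx.test.espresso.action.ViewActions.click
import androidx.test.espresso.assertion.ViewAssertions.matches
import androidx.test.espresso.matcher.ViewMatchers.isDisplayed
import androidx.test.espresso.matcher.ViewMatchers.withId
import androidx.test.ext.junit.rules.ActivityScenarioRule
import org.junit.Rule
import org.junit.Test

class SearchFlowTest {
    // Launch the app's main screen before each test (MainActivity is hypothetical).
    @get:Rule
    val activityRule = ActivityScenarioRule(MainActivity::class.java)

    @Test
    fun tappingSearchShowsResults() {
        // The script follows one predetermined path -- useful for catching
        // regressions, but blind to anything off that path.
        onView(withId(R.id.searchButton)).perform(click())
        onView(withId(R.id.resultsList)).check(matches(isDisplayed()))
    }
}
```

A test like this will happily pass on an emulator while the same flow stutters or crashes on real hardware, which is exactly the limitation that pushed testers further.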

Because of that, Harty became a pioneer and proponent of interactive testing, which, as he describes it, “involves a human being interacting with an actual device rather than an emulator on a computer.” Interactive testing also stands in contrast to manual testing, which typically relies on testers following a script of predetermined actions to determine if an app works as intended. The problem with manual testing — which Harty likens to the pejorative of “manual labor” — is that it is prescriptive and rigid. “A manual test schedule may tell a tester to open the home page, log in, tap a button, and order a pizza,” says Harty. “But most people don’t interact with a device reliably and repeatably, and the pizza may not be the only thing we want to know about.”

Interactive testing gives testers freer rein over the way they interact with a device. This not only captures bugs and flaws in the program, but also lets testers assess the “human aspects” of an app. Are there quirks in the interface that make it difficult to use? Are certain functions hard to find, or do they simply not look right on the screen? These are elements that interactive testing can capture that manual and automated testing cannot.

A variety of more advanced techniques — including rapid testing practices, the development of personas, and heuristics — can all help to make interactive testing more thorough and its results more comprehensive.

The evolution of mobile analytics

How then does a developer capture and quantify the findings of interactive testing? The next logical step is the integration of mobile analytics into the application.

“Mobile analytics lets you learn much more about what’s happening with your applications” than any amount of testing can reveal, says Harty. Even the most sophisticated level of testing will necessarily be limited. There is simply no way to capture every error or uncover every bug through simulated testing. “If you only rely on testing, you will miss a whole lot of stuff that is important. It is also wasteful technology.”

Mobile analytics takes many forms, but it primarily involves integrating custom code into an application that sends automated (and anonymized) messages back to a web server about how the app is being used. Data can be collected by Google Analytics, Flurry, or another analytics platform.
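The mechanics behind those messages are straightforward. Here is a bare-bones Kotlin sketch of the idea, assuming a hypothetical collection endpoint; production SDKs such as Google Analytics and Flurry layer batching, retries, and offline queuing on top of this.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Hypothetical endpoint; a real SDK would manage this for you.
private const val COLLECT_URL = "https://analytics.example.com/collect"

// Posts one small, anonymized event record. On Android this must run off
// the main thread (e.g., on a background executor).
fun sendAnalyticsEvent(eventName: String, screen: String) {
    val payload =
        """{"event":"$eventName","screen":"$screen","ts":${System.currentTimeMillis()}}"""
    val conn = URL(COLLECT_URL).openConnection() as HttpURLConnection
    try {
        conn.requestMethod = "POST"
        conn.doOutput = true
        conn.setRequestProperty("Content-Type", "application/json")
        conn.outputStream.use { it.write(payload.toByteArray()) }
        conn.responseCode // forces the request to complete
    } finally {
        conn.disconnect()
    }
}
```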

“We’re talking about structured data,” says Harty, just a few hundred bytes of data at a time that are sent on an occasional basis. Analytics will report things like “the user is at a search page,” “the user is at the ‘buy it now’ page,” and so on. Wherever this analytics code is embedded, notifications are sent on a regular basis and aggregated at the server for later review. Harty notes that HPE’s mobile analytics system is unique in that it does not require a developer to manually write this code. Rather, it can automatically add this information to the app and be set to send reports without a programmer’s involvement.
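When a developer does instrument an app by hand, each report is typically a one-line call. Here is a sketch using Google’s Firebase Analytics SDK as one common example, reporting the kind of “search” and “buy it now” screens described above:

```kotlin
import android.os.Bundle
import com.google.firebase.analytics.FirebaseAnalytics

// Each call sends a few hundred bytes of structured data; the SDK batches
// the records and uploads them in the background.
fun logScreen(analytics: FirebaseAnalytics, screenName: String) {
    val params = Bundle().apply {
        // e.g., "search" or "buy_it_now"
        putString(FirebaseAnalytics.Param.SCREEN_NAME, screenName)
    }
    analytics.logEvent(FirebaseAnalytics.Event.SCREEN_VIEW, params)
}
```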

The goal: Learn how the app is being used

The goal of mobile analytics is to help the developer determine how an application is really being used. “Testing is limited by the tester’s perspective and understanding,” says Harty. “Testers are making informed guesses about what to test and what not to test, and as a result they’re really not capturing the human experience out in the field.”

At a simple level, mobile analytics can give you insight into which devices are being used with your app specifically. It’s one thing to know that the Samsung Galaxy S5 is the most popular device across the industry, says Harty, but another to know whether that device is actually the most common one being used with your specific application.
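With a typical SDK much of this device data is collected automatically, but the mechanism amounts to attaching device properties to every event so reports can be segmented by the hardware your users actually run. A sketch, again using Firebase Analytics, with hypothetical property names:

```kotlin
import android.os.Build
import com.google.firebase.analytics.FirebaseAnalytics

// Tag every subsequent event with the device make and model.
fun tagDeviceInfo(analytics: FirebaseAnalytics) {
    analytics.setUserProperty("device_manufacturer", Build.MANUFACTURER) // e.g., "samsung"
    analytics.setUserProperty("device_model", Build.MODEL)               // e.g., "SM-G900F"
}
```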

Geolocation and language provide another good example. While IP logs can reveal the portion of your users who reside in, say, Montreal, they will not reveal how many of those users speak French and how many speak English. Mobile analytics code embedded in the app can report on language settings, which can, in turn, be used to drive future development decisions.
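A sketch of that language report, with a hypothetical property name:

```kotlin
import com.google.firebase.analytics.FirebaseAnalytics
import java.util.Locale

// Report the device's language setting -- something IP-based geolocation
// can't see.
fun tagLanguage(analytics: FirebaseAnalytics) {
    val lang = Locale.getDefault().language // "fr" for French, "en" for English
    analytics.setUserProperty("ui_language", lang)
}
```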

Understanding the journey a user takes through your app — commonly called the “user flow” — is another major use case for mobile analytics. “Consider Gmail,” says Harty. “The first thing I may do is look through the inbox, but others may start a session by sending a message. Do users delete email or archive it? Do they use regular search or advanced search?”
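A sketch of what such flow instrumentation might look like, with hypothetical event names; logging each step with a shared prefix lets the back end reconstruct per-session flows:

```kotlin
import com.google.firebase.analytics.FirebaseAnalytics

// Record one step in the user's journey through a mail app.
fun logMailAction(analytics: FirebaseAnalytics, action: String) {
    // action: "open_inbox", "compose", "delete", "archive",
    // "search_basic", or "search_advanced"
    analytics.logEvent("mail_$action", null)
}
```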

Raise your star rating

Once a developer knows how users are really working with the app, that information can inform an interactive testing regimen. For example, if analytics reveal that 20 percent of users of a shopping app check out with four or more items in their shopping basket, an interactive test might be configured to examine what happens when a user puts 20 items in the basket. Does the app crash or slow down? At what point do problems develop?
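Here is what such an analytics-informed check might look like as an Espresso sketch; the activity and view IDs are hypothetical:

```kotlin
import androidx.test.espresso.Espresso.onView
import androidx.test.espresso.action.ViewActions.click
import androidx.test.espresso.assertion.ViewAssertions.matches
import androidx.test.espresso.matcher.ViewMatchers.isDisplayed
import androidx.test.espresso.matcher.ViewMatchers.withId
import androidx.test.ext.junit.rules.ActivityScenarioRule
import org.junit.Rule
import org.junit.Test

class LargeBasketTest {
    // ShoppingActivity and the view IDs below are hypothetical.
    @get:Rule
    val activityRule = ActivityScenarioRule(ShoppingActivity::class.java)

    @Test
    fun checkoutSurvivesTwentyItems() {
        // Analytics showed baskets of four-plus items are common, so push
        // well past the observed ceiling.
        repeat(20) {
            onView(withId(R.id.addToBasketButton)).perform(click())
        }
        onView(withId(R.id.checkoutButton)).perform(click())
        // A crash or hang here surfaces the problem before a one-star review does.
        onView(withId(R.id.orderConfirmation)).check(matches(isDisplayed()))
    }
}
```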

Done smartly, mobile analytics and interactive testing will feed one another, ultimately improving an app’s viability, star rating, and overall user experience.

Additional insights into the intersection of interactive testing and mobile analytics can be found in The Mobile Analytics Playbook, co-authored by Julian Harty and Antoine Aymer. You can download The Mobile Analytics Playbook for free here.

Image credit: Roman Drits
