
6 reasons to store and visualize your test automation data

Ajeet Dhaliwal Co-founder, Tesults

Your testing team has made a considerable investment in developing automation infrastructure: writing the product integration code, the test hooks, a test case strategy, and the test scripts themselves, as well as doing all of the DevOps work required for continuous integration.

And those are noble and valuable efforts. But based on my experience, many teams fail to get the maximum return on their investment in automated testing.

If your team is using automated tests as part of its continuous integration system, then you have evolved quickly and are ahead of many others. But you may still be unaware of one key insight: You can extract even more value from the test result data you are generating, far beyond the basic analysis you may currently be performing.

Here's why your organization should store and make your automated test data accessible—and visible.

 

Properly stored test data delivers

Examining results for a single test run in isolation provides some value, but storing results from multiple runs yields compounding value. With data from previous test runs stored, and with the right reporting, a team can get solid insight into the health of its system like never before. Here are some of the most obvious benefits.

Identify regression faster

Being able to effectively perform a diff on any two test run results with associated revision numbers makes it possible to identify regression faster and more easily by highlighting new passes (fixes), new failures, and continuing failures. It also becomes possible to identify the specific revision, or range of revisions, where an issue was introduced. This reduces development costs and saves time.
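To make the idea concrete, here is a minimal sketch of such a diff, assuming each stored run is a simple mapping of test name to pass/fail status (the run data and names here are hypothetical):

```python
# Minimal sketch: diff two stored test runs, keyed by test name.
# The run data is hypothetical, for illustration only.

def diff_runs(previous, current):
    """Compare two runs (dicts of test name -> 'pass'/'fail')."""
    new_failures = [t for t, s in current.items()
                    if s == "fail" and previous.get(t) == "pass"]
    new_passes = [t for t, s in current.items()
                  if s == "pass" and previous.get(t) == "fail"]
    continuing = [t for t, s in current.items()
                  if s == "fail" and previous.get(t) == "fail"]
    return new_failures, new_passes, continuing

run_r1204 = {"login": "pass", "checkout": "pass", "search": "fail"}
run_r1205 = {"login": "pass", "checkout": "fail", "search": "fail"}

new_fail, new_pass, cont_fail = diff_runs(run_r1204, run_r1205)
print("New failures:", new_fail)          # ['checkout'] -> introduced in r1205
print("New passes (fixes):", new_pass)    # []
print("Continuing failures:", cont_fail)  # ['search']
```

Running the same comparison across a range of stored runs narrows down the exact revision where a new failure first appeared.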

Identify recurring failures more easily

Recurring failures can be identified more easily. The results of a "flaky" test should not be set aside too quickly, because they may indicate underlying issues, such as memory mismanagement or race conditions, that are often hard to detect. Good reporting also makes it possible to identify when failures are introduced, how often they recur, and perhaps who is responsible. This information can be used to make process changes that boost productivity.
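As a rough sketch of how stored history makes this possible, the hypothetical snippet below counts how often each test flips between pass and fail across consecutive runs; frequent flippers are flagged as flaky:

```python
# Hypothetical sketch: flag "flaky" tests from a history of stored runs.
# A test that keeps flipping between pass and fail is a candidate for a
# closer look (race conditions and memory issues often surface this way).

def flaky_tests(history, min_flips=2):
    """history: list of runs, each a dict of test name -> 'pass'/'fail'."""
    flips = {}
    for prev, curr in zip(history, history[1:]):
        for test, status in curr.items():
            if test in prev and prev[test] != status:
                flips[test] = flips.get(test, 0) + 1
    return {t: n for t, n in flips.items() if n >= min_flips}

history = [
    {"upload": "pass", "sync": "pass"},
    {"upload": "fail", "sync": "pass"},
    {"upload": "pass", "sync": "pass"},
    {"upload": "fail", "sync": "pass"},
]
print(flaky_tests(history))  # {'upload': 3} -- 'sync' is stable
```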

Understand the performance impact of changes

In performance testing, charting metrics for the same test case over multiple test runs can be helpful in understanding the performance impact that changes are having over time. Comparing screen captures between runs is handy too. All of this analysis is possible only if the relevant data is retained.
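For example, a hypothetical sketch like the one below can walk a stored series of timings for one test case and flag runs where the metric regressed beyond a tolerance (the revisions, numbers, and threshold are invented):

```python
# Hypothetical sketch: flag performance regressions for one test case
# across stored runs. Revisions, timings, and tolerance are invented.

def flag_regressions(samples, tolerance=0.10):
    """samples: list of (revision, duration_ms) tuples, oldest first."""
    flagged = []
    for (_, prev_ms), (rev, ms) in zip(samples, samples[1:]):
        if ms > prev_ms * (1 + tolerance):  # more than 10% slower
            flagged.append((rev, prev_ms, ms))
    return flagged

checkout_timings = [("r101", 220), ("r102", 225), ("r103", 310), ("r104", 305)]
for rev, before, after in flag_regressions(checkout_timings):
    print(f"{rev}: {before} ms -> {after} ms")  # r103: 225 ms -> 310 ms
```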

Make test result data visible across the team

Having visibility of test results across the entire team improves the overall test strategy. Here are three of the most important reasons for creating and maintaining that visibility.

Visualize level of test coverage

Visibility allows us to see the level of test coverage across our product, and we can identify gaps in testing much more easily. This encourages more test cases to be written where needed, and helps us find more bugs earlier. As useful as a test case strategy document may be, there's nothing like up-to-date results to make clear what our coverage really looks like today.
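One simple way to surface such gaps, sketched here with invented area and test names, is to tally the latest results per product area; an area with zero tests stands out immediately:

```python
# Hypothetical sketch: tally the latest run's tests per product area to
# expose coverage gaps. Area and test names are invented for illustration.

from collections import Counter

latest_results = [
    ("auth/login", "pass"), ("auth/logout", "pass"),
    ("billing/invoice", "fail"),
    # note: nothing at all under "reports/" -- a visible gap
]

tests_per_area = Counter(name.split("/")[0] for name, _ in latest_results)
for area in ["auth", "billing", "reports"]:
    print(f"{area}: {tests_per_area[area]} test(s)")
# auth: 2 test(s)
# billing: 1 test(s)
# reports: 0 test(s)
```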

At-a-glance reporting for management and team leads

The visibility of test result data helps a team understand the technical health status of the project at a glance. Having leads and management be aware of health at a high level can help set priorities against the ever-forward-marching force of new features. With better visibility across the team, everyone can own quality and think about the potential impact their changes are having on the wider system.

Ability to zero in on specific functionality

By carefully highlighting test results related to specific functionality and disciplines with the use of test suites and groups, team members can watch for any impact changes have on their core area as results are made available for the latest test run. This helps everyone, even on a large team, get a better sense of the robustness of various areas of a project, which means the team can stay on top of quality.
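A minimal sketch of that kind of grouping might look like this (suite and test names are invented):

```python
# Hypothetical sketch: group the latest run's results by suite so each
# team member can scan just the area they own. Names are invented.

from collections import defaultdict

results = [
    {"suite": "payments", "name": "refund", "status": "fail"},
    {"suite": "payments", "name": "charge", "status": "pass"},
    {"suite": "search", "name": "filters", "status": "pass"},
]

by_suite = defaultdict(list)
for r in results:
    by_suite[r["suite"]].append(r)

for suite, cases in sorted(by_suite.items()):
    failures = [c["name"] for c in cases if c["status"] == "fail"]
    print(f"{suite}: {len(cases)} test(s), failures: {failures or 'none'}")
# payments: 2 test(s), failures: ['refund']
# search: 1 test(s), failures: none
```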

Put your automated test result data to work

Automated testing has become more common with a shift in development techniques, including faster deployment via continuous integration and delivery. There are some types of testing, such as server load, performance, and profiling, that simply cannot be done without some level of automation. 

And, in addition to its labor-saving capability, automation generates output that can be analyzed for greater insights. Yet test result data is often looked at once and then discarded; even before that, it is often hidden away and viewed by only a select few. Assuming a team has tests as part of its continuous integration workflow, the key to maximizing value starts with properly capturing and storing the test result data generated on every build to enable better reporting and analysis.
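Capturing that data need not be elaborate. As a minimal sketch (the file name, fields, and format are assumptions, not a prescribed schema), each build could append one JSON record of results plus build metadata to a store:

```python
# Hypothetical sketch: append each build's results, with build metadata,
# to a JSON-lines store so later runs can be queried and compared.

import json
import time

def store_run(build_id, revision, results, path="test-results.jsonl"):
    """results: list of {'name': ..., 'status': ...} dicts."""
    record = {
        "build": build_id,
        "revision": revision,
        "timestamp": int(time.time()),
        "results": results,
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")  # one run per line

store_run("build-1205", "r1205",
          [{"name": "login", "status": "pass"},
           {"name": "checkout", "status": "fail"}])
```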

Even after all the effort to add automated reporting, results that live only in build-script console output or in a log file on a build server are limited in two ways. First, the data is largely inaccessible: Only a build engineer, or perhaps a DevOps person with access, can get to it. Second, the data is not in a state where it can be queried, analyzed, or used for historical or trend comparison.

Teams should work to address these issues. There is value in making the test result data visible to the wider team. And storing that data in an organized manner will help maximize the return on automation, because proper storage makes two things possible: better analysis and better visibility. Both ultimately lead to better quality assurance.

Don't overlook the value of your test data

Knowing that automated tests are running regularly, and knowing what they actually do, can be enough for individuals beyond the core build and programming team to appreciate both the investment that goes into maintaining tests, and the consequences of failures. But for the test team itself, there's even more to appreciate in the data derived from test results.

Teams that don't take advantage of the test result data they generate are missing out on an opportunity to maximize the value of their investment in automated testing. I recommend you put in place a plan to make fuller use of this data as a central part of your testing strategy. Treating test data as valuable—and storing, analyzing, and reporting results—should be a priority in moving to greater automation, not an afterthought.

 
