Don't be slow to move

World Quality Report 2017-18: The state of QA and testing

There's no question that today's quality executives and engineers almost universally recognize that the QA world needs to accelerate its adoption of agile and DevOps. But data from the World Quality Report 2017-18, just released by the Capgemini/Sogeti Group, shows that this recognition remains largely aspirational.

The reality is that initiatives like automation and test data management continue to see flat adoption rates in 2017. Experts warn that changes need to be made more swiftly if quality is to keep up with the pace of software engineering.

"There's not a dramatic change" versus last year, said Mark Buenen, leader and digital assurance and testing portfolio for Sogeti and lead author of the World Quality Report 2017-18. Problems with automation, test data and environments, and managing agile and DevOps remain the same, he explained.

"What's really disappointing is that organizations are not mastering better quality as compared to last year."
Mark Buenen

World Quality Report 2017-18

Testing automation adoption languishes

Part and parcel of making meaningful improvements is driving more automation. While automation has certainly garnered a lot of headlines over the last year, the 2017-18 World Quality Report shows that the needle hasn't budged globally on the rate of test automation: the average still stands at about 16% across the board.

"A lot of automation is happening, especially in [the] traditional regression testing space, but enterprise-level usage of automation remains a huge challenge."
—Buenen

Now, even the most enthusiastic QA experts will say that automation is a wonderful thing—but implementing automation for automation's sake is a fool's errand. Melissa Tondi, director of quality engineering for EMS Software, said customers shouldn't try to automate everything because maintenance overhead alone will eat the organization alive.

Nevertheless, even with a high degree of selectivity and rigor in deciding what to test, most organizations should be hitting much higher rates of testing automation than this year's average.

"It should reach levels above 50 percent," said Buenen. "In my opinion, the current numbers have a lot to do with the fact that due to the speed of things and the fact that we focus on sprints and iterations."

"The strategic approach to increasing automation is lacking."
—Buenen

Think beyond the sprint

As teams focus on individual sprints, he explained, they grab a tool and, if it works, immediately run with it. But if it doesn't, they fall back on manual processes for the quickest fix. As a result, the quality of the QA process as a whole suffers.

That method is certainly understandable in the short term, but it does nothing to build the full automation capability required for continuous quality, he warned. Accomplishing that requires a more disciplined approach, led by a quality engineer who can bring governance to the situation.

What to automate and why

The first step in this full approach, as EMS Software's Tondi explained, is coming up with documented criteria for automation—essentially outlining what to automate, and why. The candidates should be the "most important and valuable" tests, whether they are tied directly to customer flows or to revenue.

"Whatever your internal criteria, it is important just going through that exercise and saying, 'We're going to automate in this priority order because we know that these are the most important tests to maintain.'"
Melissa Tondi
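
Tondi's point is tool-agnostic, but a small sketch makes it concrete. The example below assumes a Python/pytest suite; the marker names and the checkout test are hypothetical. The idea is simply that the priority order is written down and executable, so the "automate these first" decision survives beyond a single sprint.

```python
# A minimal sketch of priority-tagged automated tests (hypothetical example).
# The marker names below (p0_revenue, p1_customer_flow) are illustrative;
# register them in pytest.ini to avoid unknown-marker warnings.
import pytest


@pytest.mark.p0_revenue          # directly tied to revenue: automate first
def test_checkout_charges_correct_total():
    cart = [("widget", 2, 9.99), ("gadget", 1, 24.50)]
    total = sum(qty * price for _, qty, price in cart)
    assert round(total, 2) == 44.48


@pytest.mark.p1_customer_flow    # core customer journey: automate next
def test_login_smoke():
    session = {"user": "demo", "authenticated": True}
    assert session["authenticated"]

# Run the tiers in priority order, for example:
#   pytest -m p0_revenue
#   pytest -m p1_customer_flow
```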

Get smart with automation

While many organizations still have a long way to go to bump up the average rate of test automation, they nevertheless have their sights set high. According to the latest report, between 40 and 50 percent of organizations are interested in implementing some level of cognitive automation, machine learning, or predictive analysis to improve quality.

That tracks with analysts' expectations. For example, Gartner predicts that by 2020, half of IT organizations will apply advanced analytics in application development to improve application quality. But, as Buenen explained, using smart analytics to make quality decisions is "at an initial stage" because the technology—and the processes around it—are still mostly emerging.
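
The report doesn't prescribe what those analytics should look like, but a common early use case is predicting which modules are most likely to harbor defects so that testing effort can be focused there. The sketch below is purely illustrative: it assumes Python with scikit-learn, and the change metrics and labels are invented for the example rather than drawn from the report.

```python
# Illustrative only: a toy defect-risk predictor built from change metrics.
# The features (lines churned, distinct authors, prior defects) and the data
# are invented; a real project would mine them from version control and the
# defect tracker.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Each row describes one module: [lines_churned, distinct_authors, prior_defects]
X = np.array([
    [500, 5, 3],
    [ 40, 1, 0],
    [900, 7, 6],
    [ 10, 1, 0],
    [300, 4, 2],
    [ 25, 2, 0],
])
# 1 = the module shipped a post-release defect, 0 = it did not
y = np.array([1, 0, 1, 0, 1, 0])

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)

# Score an incoming change to decide where to focus test effort
candidate = np.array([[450, 4, 1]])
risk = model.predict_proba(candidate)[0, 1]
print(f"Estimated defect risk: {risk:.2f}")
```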

In the interim, it may make sense to tackle some of the nuts-and-bolts problems that the QA world has been facing for some time. Most organizations—over 75 percent—are starting to do this by adopting cloud-based provisioning for non-functional test activities, and about 15 percent have adopted containerization. But streamlining those processes remains a struggle.

Data deluge remains a thorn in the side of QA

For example, this year's World Quality Report shows that yet again, issues surrounding test environments and test data remain huge thorns in the side of organizations seeking to improve their rate of test automation. The lack of data and environments is still the most serious problem, cited by 46% of respondents, up three points from last year.

More than half of organizations have challenges managing the size of test data. They also struggle with creating and maintaining synthetic data, and with test-data regulation compliance.

Between 20 and 30 percent of testing effort is lost on the issues people have with environments and data, Buenen said.

"Technologically there are solutions available, but in practice it is a huge challenge for organizations."
—Buenen
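
As Buenen notes, tooling for these problems does exist. One common mitigation on the synthetic-data side is to generate fake records so test environments never carry regulated production data. The sketch below is a hypothetical illustration in Python; the field names, seeding scheme, and fixture file are assumptions for the example, not anything the report prescribes.

```python
# A minimal sketch of synthetic test-data generation (hypothetical example).
# Records are fake and deterministic, so fixtures are reproducible and contain
# no regulated production data.
import csv
import random
import uuid


def synthetic_customers(n, seed=42):
    rng = random.Random(seed)  # seeded so every test run gets the same data
    first = ["Alex", "Sam", "Priya", "Wei", "Maria", "Kofi"]
    last = ["Nguyen", "Garcia", "Okafor", "Schmidt", "Tanaka", "Larsen"]
    for _ in range(n):
        yield {
            "customer_id": str(uuid.UUID(int=rng.getrandbits(128))),
            "name": f"{rng.choice(first)} {rng.choice(last)}",
            "email": f"user{rng.randrange(100000):05d}@example.test",
            "balance": round(rng.uniform(0, 5000), 2),
        }


# Write a small, shareable fixture instead of copying production data
with open("customers_fixture.csv", "w", newline="") as fh:
    writer = csv.DictWriter(
        fh, fieldnames=["customer_id", "name", "email", "balance"]
    )
    writer.writeheader()
    writer.writerows(synthetic_customers(100))
```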

These statistics pretty much confirm the anecdotal war stories heard from QA professionals all year round. Matt McComas, vice president of business platform operations at GM Financial, said this was exactly why his company made major investments in test-data management and service virtualization technology and processes over the last few years.

"We really got to a crossroads and we knew we couldn't go any further doing things the way we were doing them with long manual deployments, lots of steps and all kinds of resource problems."
Matt McComas

Fundamental to bringing in release-automation technology was employing solid test data management. "But, you know, at the end of the day, this is an application that requires data input and data output. And it is only going to be as good as the data it receives," McComas said.

Testing spend stays steady

Of course, these initiatives require an ample budget to pull off. Capgemini's analysis shows that budgets are largely holding steady at a reasonable level. This year, spending dedicated to quality and testing stands at about 26%, said Anand Moorthy, vice president and global head of Capgemini's testing practice.

True, that's five points lower than last year, but the difference may be a function of how quality-related spending is calculated more than anything else, Moorthy said.

"The way things are being done today, a lot of testing people are getting merged into the Scrum team itself." And how the test spend is calculated is "ever-evolving," he said.
