5 ways to blow your test automation budget

As test automation responsibilities become more embedded within agile development teams, the role of the test automation framework is evolving to become more atomized, with fewer organizations relying on a monolithic commercial framework, and more moving toward loosely federated open source tools that more closely align with individual teams' objectives.

This offers a greater degree of flexibility to the organization, but it also poses a bit of a cost management problem. Having a greater distribution of automation responsibilities and tools makes it more difficult for QA experts to estimate how much, exactly, their test automation framework costs.

But despite the lack of a fully fleshed out accounting of costs, testing veterans have experience with the most common costly automation scenarios. TechBeacon asked three testing professionals to share their thoughts on which practices unnecessarily raise the cost of building out and maintaining an effective software testing framework. Here are their top five.

1. Overengineering your test automation framework

There are many different opinions as to the various ways organizations can negatively affect the economics of test automation, but testing professionals unanimously agree on one: overengineering. There's no better way to raise costs than to try to automate everything.

"The outcome of test automation should not be to automate everything." —Melissa Tondi, director of quality engineering, EMS Software

"If that is your outcome, with no perceived value, or no built-in time for maintaining all the scripts, you will quickly get to the point where you have scripts that have been created but have months and months between when they're actually executed. To me that's waste," Tondi says. "Regardless of whether you're commercial or open source, the first step is to understand what your outcome is, to figure out and to create an automation criteria checklist or document that says, 'In our organization, this is what we automate and why.'"
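A criteria checklist like the one Tondi describes can even be encoded so candidate tests get scored consistently. Here is a minimal sketch in Python; the specific criteria (run frequency, feature stability, business criticality) and thresholds are hypothetical stand-ins for whatever your organization actually decides to put in its document:

```python
# Hypothetical automation-criteria checklist: each criterion is a yes/no
# question, and a candidate test is worth automating only if it passes all.
CRITERIA = [
    ("runs on every build or release", lambda t: t["runs_per_month"] >= 4),
    ("feature is stable, not churning", lambda t: t["ui_changes_per_month"] <= 1),
    ("covers a business-critical path", lambda t: t["business_critical"]),
]

def should_automate(test_case):
    """Return (decision, failed_criteria) for a candidate manual test."""
    failed = [name for name, check in CRITERIA if not check(test_case)]
    return (len(failed) == 0, failed)

checkout = {"runs_per_month": 30, "ui_changes_per_month": 0, "business_critical": True}
beta_page = {"runs_per_month": 1, "ui_changes_per_month": 5, "business_critical": False}

print(should_automate(checkout))   # (True, [])
print(should_automate(beta_page))  # fails every criterion, so don't automate it
```

The point is not the scoring logic itself but that the decision is written down and repeatable, so "this is what we automate and why" survives team turnover.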

"It's kind of pointless if you become the world’s best automator of useless tests." —Bas Dijkstra, test automation consultant, The Future Group

Greg Paskal, director of quality assurance for Dave Ramsey Solutions, says that when engaging with vendors, asking them to outline the intended outcome is a good litmus test. "Beware of a vendor who wants to automate the entire regression suite—that almost always leads to spending more money than necessary."

He suggests starting by automating a critical test suite of 15 to 25 tests.
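One lightweight way to carve out that small starter suite is to tag tests by priority and run only the critical tier first, growing it once those tests prove their value. The sketch below is illustrative; the decorator name and the placeholder tests are invented, not from Paskal:

```python
# Illustrative priority tagging: automate only a small "critical" tier first.
CRITICAL_SUITE = []

def critical(test_fn):
    """Decorator marking a test as part of the small starter suite."""
    CRITICAL_SUITE.append(test_fn)
    return test_fn

@critical
def test_login():
    assert "user" in {"user": "ok"}          # placeholder check

@critical
def test_checkout_total():
    assert round(19.99 + 5.00, 2) == 24.99   # placeholder check

def run_critical_suite():
    """Run only the critical tier and report pass/fail counts."""
    results = {"passed": 0, "failed": 0}
    for test in CRITICAL_SUITE:
        try:
            test()
            results["passed"] += 1
        except AssertionError:
            results["failed"] += 1
    return results

print(run_critical_suite())   # {'passed': 2, 'failed': 0}
```

In a real framework the same idea is usually expressed with test markers or tags rather than a hand-rolled registry, but the discipline is identical: a deliberately small, traceable suite before anything else.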

"A good automation test will be traceable back to a good manual test. The manual test should always precede the development of automated tests." —Greg Paskal

2. Having a consultant build automation for you

Another very expensive mistake organizations make is to have consultants build out a framework as a one-off project to "jump start" the automation process. "In almost every case I've seen [one-off projects], the automation winds up on the shelf—it is not used," Paskal says.

That's often because the QA manager (or whoever pulled the trigger on the investment) was sold on the project as a cost-saving measure—a way to downsize the number of people needed to run manual tests. But the moment the contractors leave, those tests start to need maintenance, and without the right resources in place, the organization is left in the lurch.

"Almost every time I've encountered these, we usually have to go in and do a lot of work to get them shored up so that we can make them sustainable over the long run," Paskal says, adding that many of these projects look like automation but aren't built to last.

"If you're not trained to know what it takes to build a sustainable test automation framework, you can have something that looks great in the demo, and that might even run for a week or two. But as soon as it begins to break, boy, things get very expensive real quick," Paskal says.

3. Ignoring the team when choosing between open source and commercial testing tools

Many organizations choose open source frameworks and automation tools to gain flexibility, but many others choose open source purely for financial reasons. Most technologists know that open source doesn't mean cost-free. With testing, though, many people don't realize how many skilled staffers they will need to put on a project to build out a usable framework or suite of automated tools.

The lesson: Consider the costs of acquiring developer and QA engineering expertise, or diverting it from some other part of the organization. Using open source tools often requires more in-depth knowledge of test automation and software development skills to program the tests, and there's a cost involved with that, says Dijkstra.

"One of the factors in choosing open source is evaluating the maturity and level of knowledge of the team that is going to be responsible for creating the automated tests." —Bas Dijkstra

4. Writing a framework from scratch

Even with the right blend of expertise in place to build out an effective framework, organizations risk wasting money if they attempt to reinvent the wheel.

Too often, leads see only two options: buying a vendor-based tool or creating a framework from scratch. Both are very expensive, says Paskal.

"Even when they go open source they’re wasting money because they’re starting from scratch. There are actually frameworks available that have all the piping or scaffolding available, and a lot of times engineers don’t take advantage of that." —Greg Paskal
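Python's standard library is one example of scaffolding that already exists: `unittest` ships discovery, assertions, execution, and reporting out of the box, so none of that plumbing needs to be written by hand. A minimal sketch, with a hypothetical smoke test standing in for a real one:

```python
import unittest

# Instead of hand-rolling a runner, reporter, and assertion library,
# lean on an existing framework that already provides all three.
class CheckoutSmokeTest(unittest.TestCase):
    def test_order_total_includes_shipping(self):
        subtotal, shipping = 19.99, 5.00
        self.assertAlmostEqual(subtotal + shipping, 24.99, places=2)

# The framework supplies test loading, execution, and result reporting.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(CheckoutSmokeTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all passed:", result.wasSuccessful())
```

The same argument applies to open source test frameworks generally: the piping and scaffolding Paskal mentions are usually a dependency away, and engineering effort is better spent on the tests themselves.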

5. Not having triage for when builds fail

Automation is great when it works, but it can quickly clog the gears of continuous software delivery if you don't have the right people and processes in place to fix those issues. Automation can quickly raise costs when an organization doesn't have a clear process for triage when builds fail, Tondi says.

If test suites run against every new build and a build fails, and you don't have the right people in place, "what we have is lots and lots of failed builds with nobody looking at it," she adds.

Having a consensus as to who the primary point person should be for failures, and having a process for addressing those failures, is critical. Otherwise, you'll have a lot of unnecessary scrambling going on, which translates into wasted time, she says.

"[The] next thing you know we have a queue of failed builds, and now we have to put more time into figuring out why those builds failed, versus having clear communication on the criteria to follow if and when your build fails." —Melissa Tondi
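That "clear communication on the criteria to follow" can be as simple as a routing table mapping each failing suite to a point person and a next action. The sketch below is hypothetical; the suite names, owners, and actions are invented for illustration:

```python
# Hypothetical triage table: when a build fails, the failing suite
# determines who owns the failure and what happens next.
TRIAGE_RULES = {
    "smoke":      {"owner": "on-call QA",      "action": "block merges, fix now"},
    "regression": {"owner": "feature team",    "action": "triage within one day"},
    "nightly":    {"owner": "automation lead", "action": "review next morning"},
}

def triage(failed_suite):
    """Route a build failure to its point person; unknown suites escalate."""
    rule = TRIAGE_RULES.get(failed_suite)
    if rule is None:
        return {"owner": "QA manager", "action": "assign an owner for this suite"}
    return rule

print(triage("smoke"))
print(triage("perf"))   # unmapped suite escalates until someone owns it
```

Whether this lives in code, a CI configuration, or a wiki page matters less than the consensus it records: every failure has a named owner before it ever occurs.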

Getting bang for your framework buck

Prioritizing the needs of the organization relative to the resources should be paramount if you expect to keep testing automation costs under control. The perfect framework is not necessarily the one with every bell and whistle that developers and testers want. Such wish fulfillment can easily end up with unnecessary baggage.

Instead, organizations should take the same iterative, minimum viable product (MVP) approach they use with other software: get something useful up and running quickly, then refine it as time goes on.
