
Are you getting the most from your testing tools?

Greg Paskal, Director of Quality Assurance – Automation, The Dave Ramsey Show

“We’re using this other test management tool. It’s got better reporting.” Those words were spoken by a senior QA analyst. I’d heard similar things countless times. Some technologist finds a new, shiny tool, and suddenly the one you purchased a year ago is vilified as the enemy of efficiency and progress. These tools might be used for test management, automation, performance, lifecycle, or other tasks, but all present a nearly identical justification: “This old tool stinks, but this new tool saves the day.”

When equipping their teams, most technology managers find themselves confronted with this common challenge: There’s a legitimate need to purchase the right tools without overspending on unnecessary or redundant tools. At a recent Google Test Automation Conference (GTAC), Google technology leaders shared how they evaluate and remove duplicate tools from their testing arsenal, a process that yields better results across their testing organization and helps them focus on getting the most out of individual tools.

The following principles will help you do the same.

1. Create a team tool strategy

Determine a team tool strategy, a process to use consistently in selecting and adding tools to your team’s technology stack. All tools—both commercial and open source—should go through some type of evaluation process, ranging from a few hours for minor tools to a few weeks for more complex ones.

Derive from the evaluation the top strengths and benefits of each tool, and compare those results to tools already in your technology stack. Look for tools that will save your team time without sacrificing quality. Identify cost benefits that would free up more of your budget for training or other activities, which would strengthen your testing organization.

Execute the following actions based upon the results of your tool evaluation:

  • Kill it: If the tool doesn’t have a significant advantage over other tools you’ve evaluated or the tools you already own, end the evaluation and uninstall the tool. No point putting more effort into a tool with only marginal advantages.
  • Continue it: If you don’t have an existing tool in this category, evaluate several similar tools. If the candidates are closely matched in capability and cost, continue the evaluation with specific steps to narrow the choices, and incorporate insights from peers. Consider qualities that might offer subtle advantages, such as ease of training and the product’s update history.
  • Keep it: Once you’ve identified a clear winner from the evaluation process, move forward with acquisition, training, and implementation. Be sure to sunset an existing tool if it's replaced by the new one, freeing up licensing and maintenance costs.

2. Educate your team

I’ve yet to encounter a technologist who doesn’t enjoy new gadgets. It’s in our DNA. But your team members have to understand how gadget "crushes" can interfere with fair evaluations.

During the “why” phase of your team tool strategy, talk about the importance of getting the team the right tools for its success. Explain that “tool debt,” in the form of unnecessary training and confusion about which tool to use, accrues when newly added tools do nearly the same thing as existing ones. Point out that focusing on a few well-differentiated tools will make it easier for team members to become power users. Those power users can then become champions for individual tools and teach the rest of the team about their intricacies.


3. Use as designed, not as discovered

A common scenario in many testing organizations: training is shortchanged, and the team turns to the Internet to figure out how to use a new tool. This drains away the new tool's benefits, with teams often building custom functionality because they didn’t know it was already in the tool. Skimping on training saves a few dollars up front, but that saving is dwarfed by the overspending it causes in the implementation and usage phases.

An entire chapter of my book Test Automation in the Real World is dedicated to the topic of using a tool as designed, not as discovered. When putting new tools in place, include the proper training.

  • Purchased tools: Opt for training from the vendor. More than anyone, the vendor knows how its tool is intended to be used and will give you the best return on your training investment. Training costs might be negotiable, and you might be able to send just a couple of team members to get training, with the expectation that they will teach the rest of the team and with the benefit that they will become subject matter experts.
  • Open-source tools: With open-source tools, documentation can be scarce, so factor that challenge into your evaluation. My approach is to identify champions of open-source tools that might offer training.

Cautions and awareness

A few caveats:

  • Impostor syndrome: Team members who feel intimidated may not ask questions for fear of losing credibility in front of their peers. Model for your team that questions are expected, and members will ask how to use and get the most out of new tools. Keep a teachable attitude across your team so everyone gets the greatest benefit from the tools you choose.
  • Internet as oracle: Be alert to what is going on with your team. If it is falling back on Google to learn how to do things with a new tool or functionality, go back to the training materials to ensure you’re using your tools as designed, not as discovered.
  • Tool of the month: Be cautious using tools that were just released. You want to add tools to your team technology stack that are well-developed and have a history of being updated on a regular basis. And when commercial and open-source tools are no longer receiving updates, it’s probably time to find an alternative.

Putting great tools into the hands of your testing team, along with the proper training, will set your organization up for success. Following a smart team tool strategy will ensure that the right process is in place for getting the very best tools while keeping redundancy out of your team tool stack.

Want more productivity-enhancing tips on getting maximum performance from your current testing tools? See the presentation I gave at the recent Automation Guild online automated testing conference (registration required to view). And check out the upcoming Performance Guild online conference.
