
How to create highly effective performance tests

Amber Race, Senior Software Development Engineer in Test, Adobe

The terms performance testing, load testing, and stress testing are often used interchangeably, but they answer different questions. Measuring the speed of a service is not the same as measuring how much load the service can handle, and confirming that a service can handle normal expected activity is different from seeing how it responds to a very high load. So how do you create highly effective performance scripts without succumbing to common performance-testing mistakes?

First, consider what your expectations are for a service, and understand what aspect of a service you are testing before designing your test script. Is your service a hare that needs to respond to requests as quickly as possible? Or is it a tortoise that wins the race by plodding along at a consistent pace?

In reality, web services have both hare and tortoise aspects, which must be balanced. These should be highlighted differently, depending on the nature of your test. Without a thorough understanding of your service, client application, customer base, and test goals, you'll waste many hours chasing down phantom issues, while the real ones slip through unnoticed.

Here's the process you should follow to get started.

Understand your service

Your test design and expectations for a new service won't necessarily be the same as for one that has been chugging along in production for years. A service that is heavy on database writes will perform differently from one that just reads from a database. Other things to consider:

  • Is your service part of a microservice architecture, with lots of dependencies on other services?
  • Do other services depend on your service?
  • Does your service need to handle larger assets, such as artwork or sound files?
  • Where is your service physically located relative to the database host and your client base?
  • How does your test environment compare to your production environment in terms of number of machines, CPU, memory, network, and so on?

Knowing the ins and outs of your service will help you pinpoint weaknesses and translate results from a lower environment to probable performance in production.

Understand your application

Once your service is in production, it isn't your script that will be generating load; it will be your client application in the hands of actual customers. Understanding how the application uses the service is key to designing a relevant performance-testing script. So before you open JMeter, run your application through a proxy and analyze the requests; a sketch of one way to summarize a captured session follows the list below. Some important things to look for include:

  • What is the sequence of calls at startup? Is it different for returning users?
  • What is the request sequence for the main usage scenario? Are there multiple common scenarios?
  • How is the request profile different for anonymous users versus logged-in users?
  • Are there points where the application is noticeably stalled, waiting for the server?
  • Is the application creating unnecessary load by duplicating requests or making requests too frequently?
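
One lightweight way to do that analysis is to export the captured session from your proxy (Charles, Fiddler, mitmproxy, or your browser's dev tools) as a HAR file and summarize the call sequence. Here is a minimal Python sketch of the idea; the file name is hypothetical, and the fields used are standard HAR fields:

```python
import json
from collections import Counter
from urllib.parse import urlparse

# "startup_session.har" is a hypothetical export from your proxy of one app session.
with open("startup_session.har") as f:
    har = json.load(f)

entries = har["log"]["entries"]

# Print the request sequence in the order it was captured.
for entry in entries:
    method = entry["request"]["method"]
    path = urlparse(entry["request"]["url"]).path
    print(f"{entry['startedDateTime']}  {method:6} {path}")

# Flag endpoints that are hit repeatedly -- possible duplicated or overly chatty requests.
counts = Counter(
    (e["request"]["method"], urlparse(e["request"]["url"]).path) for e in entries
)
for (method, path), n in counts.most_common(10):
    if n > 1:
        print(f"{n:4}x {method} {path}")
```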

Profiling your client application early in the test process will help you make a realistic script and find potential issues before they negatively impact your customers.

Understand your customers

To create a baseline for your service, you need to know the size and expected usage patterns of your customers. A service meant to handle worldwide shopping transactions on Cyber Monday will have different expectations from one supporting a few hundred beta-testers. Other things to consider:

  • Are there peak times when application usage is expected to spike? If so, how big is that spike?
  • How long is an average session within the application?
  • How many times a day does a typical customer use the application?
  • What percentage of your customer base is authenticated as opposed to anonymous?
  • What are your customers' expectations for application responsiveness?

Knowing the general size and makeup of your customer base will help you tune your script to ensure that your service matches the desired behavior.

Understand your test goals

It isn't enough to just have a task for "performance testing." Ask yourself and your team: What is the most important thing you need to learn from this round of testing? For example:

  • Will this service be able to handle the expected load?
  • What is the maximum throughput the service can process within an acceptable error rate?
  • How will the addition of a new feature affect overall server performance?
  • How quickly can the service respond to key requests? 
  • How quickly can the service respond to requests under the expected load?

Once you have determined what you want to learn, you can design a test to focus on those goals.
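
It can also help to turn each goal into a concrete, checkable threshold before you build the script, so everyone agrees on what "pass" means. The numbers and field names below are placeholders, not recommendations; a rough sketch:

```python
# Hypothetical goals expressed as pass/fail thresholds.
# All numbers are placeholders -- set them from your own expectations and SLAs.
GOALS = {
    "p95_response_ms": 250,     # "How quickly can the service respond under the expected load?"
    "max_error_rate": 0.01,     # "What error rate is acceptable at maximum throughput?"
    "min_throughput_rps": 500,  # "Will this service handle the expected load?"
}

def evaluate(results: dict) -> bool:
    """Compare measured results against the agreed goals."""
    ok = True
    if results["p95_response_ms"] > GOALS["p95_response_ms"]:
        print("FAIL: p95 latency too high")
        ok = False
    if results["error_rate"] > GOALS["max_error_rate"]:
        print("FAIL: error rate too high")
        ok = False
    if results["throughput_rps"] < GOALS["min_throughput_rps"]:
        print("FAIL: throughput below target")
        ok = False
    return ok

print(evaluate({"p95_response_ms": 210, "error_rate": 0.004, "throughput_rps": 620}))  # True
```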

Tips for performance testing

When creating a script that focuses on response speed, limit the number of requests you are analyzing in a single test. That way, it is easier to pinpoint issues with specific requests or APIs. Focus on lightweight ping or simple read requests to get an idea of the best possible system performance. This kind of testing can help uncover basic problems with configuration and machine setup.
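
As a rough illustration, here is a minimal sketch that times a single lightweight read endpoint in isolation and reports percentiles. The URL and sample count are hypothetical, and in practice you would run this kind of check from your load tool (JMeter, Gatling, and so on) rather than a hand-rolled loop:

```python
import statistics
import time
import requests  # third-party: pip install requests

URL = "https://staging.example.com/api/ping"  # hypothetical lightweight endpoint
SAMPLES = 200

timings_ms = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    resp = requests.get(URL, timeout=10)
    elapsed_ms = (time.perf_counter() - start) * 1000
    if resp.ok:
        timings_ms.append(elapsed_ms)

timings_ms.sort()
print(f"samples: {len(timings_ms)}")
print(f"median:  {statistics.median(timings_ms):.1f} ms")
print(f"p95:     {timings_ms[int(len(timings_ms) * 0.95) - 1]:.1f} ms")
print(f"max:     {timings_ms[-1]:.1f} ms")
```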

Another strategy is to concentrate on blocking calls that cause a noticeable delay in the client application. As the service is developed, you will be able to easily see any improvement or deterioration in these potential bottlenecks.

Tips for load testing

For general load testing, create a script that mimics as closely as possible all the different calls made by the client application in a typical session. If the service is already running in production, you will want to include a thread group that replicates the existing load. Once your script is ready, it is time to determine how many concurrent threads you need to represent a realistic server load.

As an example of calculating load, imagine an application that is expected to have 100,000 daily active users, with a peak of 30,000 users an hour. If each session lasts five minutes, that peak hour works out to 2,500 concurrent users at any given time (30,000 sessions per hour × 5/60 of an hour per session).
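
That figure comes from multiplying the peak arrival rate by the average session length (Little's law). A quick sketch of the arithmetic, using the numbers above:

```python
# Concurrent users = arrival rate * average session duration (Little's law).
peak_users_per_hour = 30_000
session_minutes = 5

concurrent_users = peak_users_per_hour * (session_minutes / 60)
print(concurrent_users)  # 2500.0

# If each simulated user runs one session at a time, this is also a reasonable
# starting point for the number of concurrent threads in the load test.
```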

Tips for stress testing

In a stress-testing run, you want to take your script and turn it up to 11, seeing how far you can push it before the system fails. The most obvious way to do this is to run your test with more and more threads until the server becomes unresponsive. Another strategy is to run your script overnight or over a weekend and see what happens. This is especially good to do with scripts that involve a lot of saving and retrieving of data.
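
If your tool doesn't give you a ramp-up profile out of the box, you can approximate the "turn it up to 11" approach by stepping the number of concurrent workers up until the error rate (or latency) blows past an agreed limit. A hedged sketch, with a hypothetical endpoint and placeholder thresholds:

```python
import concurrent.futures
import time
import requests  # third-party: pip install requests

URL = "https://staging.example.com/api/items"  # hypothetical read endpoint
STEP_SECONDS = 60        # how long to hold each load level
MAX_ERROR_RATE = 0.05    # stop once more than 5% of requests fail

def one_request() -> bool:
    try:
        return requests.get(URL, timeout=10).ok
    except requests.RequestException:
        return False

def run_step(workers: int) -> float:
    """Hammer the endpoint with `workers` threads for STEP_SECONDS; return the error rate."""
    deadline = time.time() + STEP_SECONDS
    results = []

    def worker():
        while time.time() < deadline:
            results.append(one_request())

    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(workers):
            pool.submit(worker)
    return 1 - (sum(results) / len(results)) if results else 1.0

for workers in (50, 100, 200, 400, 800):
    error_rate = run_step(workers)
    print(f"{workers} workers -> error rate {error_rate:.1%}")
    if error_rate > MAX_ERROR_RATE:
        print("Breaking point reached; stop and investigate.")
        break
```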

Now go forth and win the performance race

Creating a useful performance script requires a lot of research and discussion with your team and stakeholders. But by following the tips above, you can create a test set that will ultimately result in a stronger, more reliable service in production.

Want to know even more about making your performance tests as effective as possible? Attend my presentation at the PerfGuild online performance testing conference. Can't make the live event? Not to worry: Registered attendees can view presentations at their leisure after the event concludes.
