

How to create killer automated functional tests

Nikolay Advolodkin, CEO and Senior Test Automation Engineer, Ultimate QA

When it comes to functional test automation, do you know where to start? At what point do you stop? What are you going to automate? Do you have a strategy for creating robust test automation?

Throughout much of my career, I stumbled through functional test automation. I was improving, but often found myself lost in indecision. I was in a state of perpetual confusion because I lacked direction.

But then I began to connect the dots. Everything I learned enabled me to go further in functional test automation. And I recognized a pattern: Most functional test automation follows a similar process. If you simply follow that process, the quality of your test automation will be excellent. That is the process that I lay out here. It’s not perfect, but it can take you 90% of the way.

Your automation checklist

Think of this process as a guide and checklist for creating a suite of excellent automated functional tests. Follow it as outlined below and your results will be spectacular.

As with all engineering disciplines, there are gray areas that can be tough to navigate. The same applies to the principles outlined here. If you are working in a somewhat agile environment, with management that demands growth, then these steps will work for you. 

 

How to use this guide

Everything discussed here is critical to achieving great functional test automation. If you have a bunch of tests that only you can execute, then those tests are only valuable to you. Having many tests with unreliable results is also useless because nobody can trust the tests. You need a holistic approach that tackles what’s needed for great test automation from all sides.

In the first section below, I describe the two rules you must follow as you develop your own automation framework. Coding the framework and writing automated tests happen almost simultaneously. Once you learn the framework rules, you can integrate them into your automation test design.

In the second section, I break down the coding of automated functional tests into several steps. Be sure to follow every step, from the first to the last, for each functional test. Then repeat for each automated functional test you create.

Finally, I'll give you one more rule: when to stop functional test automation and spread your testing to other layers.

Rule 1: Plan for continuous integration from the start

One of the most painful points when it comes to software development is moving your source code from one environment to the next. At least once per month, I experience a situation where the code works differently on the dev server than it does on the test and production systems.

The root cause of these issues is usually a failure to deploy to different environments regularly. Moving between environments is hard, so too many teams do it infrequently, and that's a mistake. If those teams had more automation in place, the process might be easier, and they would probably promote code more often.

Therefore, as an automation engineer, think about how your functional test automation can plug into the continuous integration environment from the start. Write a single automated functional test, and do not move on to your next functional test until you have integrated it into your build pipeline. At a minimum, you should be able to run your functional test through a build system such as Jenkins. Test execution should happen automatically, at different intervals, in at least two environments, with no human intervention.

It is much easier to include your tests in the build process from the beginning. Going back and retrofitting hundreds of tests into the build cycle can be painful. This approach follows the same idea as deploying more frequently.

If you don't think about running your automated checks through a build cycle from the beginning, you may find yourself in a bind with environment and test data management. If you only ever write for a single environment, you will code your tests accordingly. And you will encounter issues when it's time to manage test data in different environments.

What's worse, it may be hard for you to manage different environments. I've seen scenarios in many companies where hundreds of tests are tightly coupled to an environment. If you can only run your automation in a single place, it loses its value.

So, write a single test, and make sure it can run in multiple environments. Use a build server such as Jenkins to execute the automated check, and do not move on to your second test until these steps are complete.
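
To make that concrete, here is a minimal sketch of one way to keep a test environment-agnostic: the base URL comes from an environment variable that the build server sets, instead of being hard-coded into the test. The TARGET_ENV variable name and the URLs are assumptions for illustration, not part of any particular framework.

using System;
using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

[TestFixture]
public class EnvironmentAwareTests
{
    private IWebDriver Driver;

    // The build server (Jenkins, for example) sets TARGET_ENV per job: "dev", "test", or "prod".
    private static string BaseUrl =>
        Environment.GetEnvironmentVariable("TARGET_ENV") switch
        {
            "prod" => "https://www.example.com",
            "test" => "https://test.example.com",
            _ => "https://dev.example.com"
        };

    [SetUp]
    public void SetUp() => Driver = new ChromeDriver();

    [TearDown]
    public void TearDown() => Driver.Quit();

    [Test]
    public void HomePageLoads()
    {
        // The same test runs unchanged in dev, test, and production.
        Driver.Navigate().GoToUrl(BaseUrl);
        Assert.That(Driver.Title, Is.Not.Empty);
    }
}

With this in place, a Jenkins job only needs to set TARGET_ENV and run the suite on its schedule; no test code changes between environments.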

Rule 2: Functional tests should guide framework design

I've written some disgusting automated checks in my life because I approached framework design in a backward manner. One of my biggest breakthroughs in test automation development occurred when I learned test-driven development (TDD).

TDD forces you to write your unit tests prior to writing a single line of code for your software. You cycle through writing a small portion of a test, then implementing just enough code to make that portion pass. By applying these principles to my functional test automation, I was able to create extremely robust and flexible automated functional tests.

To convey the importance of this principle, I had two automation engineers write the same test. The code below is from an engineer who did not use TDD to code his test. Instead, he used conventional methods to try to write the framework to fit the needs of the test. (In the comments section at the end of this article, tell me what you think this test does.)

[Test, TestCaseSource(nameof(DataSource))]
        [Category("LL"), Category("Cog")]
        public void RespondToAllItems(string accNumber, string dataSubject)
        {
            #region Parameters
            //if (accessionNumber.Contains(TestContext.DataRow["AccessionNumber"].ToString()))
            //{
            accNumber = accNumber;
            string subject = dataSubject;
            string loginId = DataLookup.GetLoginIdByAccessionNumber(accNumber);
            string username = DataLookup.GetUsernameByLoginId(loginId);
            string password = DataLookup.GetPasswordByStateCode(
                DataLookup.GetStateCodeByLoginId(username));
            string schoolName = DataLookup.GetSchoolNameByUserAndLogin(username, loginId);
            string sessionNumber = DataLookup.GetSessionNumberBySchoolAndLogin(schoolName, loginId);
            int lineNumber = DataLookup.GetBookletLineNumberByLoginId(loginId);
            #endregion
            #region Test Steps
            Assessment assess = new Assessment(
                UseAdminPageToGoToLocation(accNumber, loginId).Driver, true);
            try
            {
                assess.AnswerNonReadingWritingItem(subject);
                Reporter.LogTestStepAsPass("Responded to accession number " + assess.GetAccessionNumber()
                    + " for Item Type " + assess.itemTypeString);
            }
            catch (Exception ex)
            {
                ePScreenshot.SaveContentScreenshot(assess);
                exceptionString = ex.ToString();
                throw new Exception(exceptionString);
            }
            //    if (assess.itemTypeString == "Comp" ||
            //        assess.itemTypeString == "CompR")
            //    {
            //        assess.PageWait(0);
            //    }
            //}
            #endregion
        }

This next test does the same thing, but this engineer wrote it using TDD. Do you understand the purpose of this test?

[Test, TestCaseSource(nameof(DataSource))]
public void RespondToAllItems(string itemNumber, string dataSubject)
{
    var assessmentTestData = new StudentAssessmentTestData(itemNumber, dataSubject);
    var studentAssessmentPage = UseAdminPageToGoToLocation(assessmentTestData.ItemNumber, assessmentTestData.LoginId);
    bool gotRaiseHandError = studentAssessmentPage.AssessmentItem.AnswerNonReadingWritingItem(assessmentTestData.Subject);
    Assert.That(gotRaiseHandError, Is.False);
}
public class StudentAssessmentTestData
{
    public string ItemNumber { get; set; }
    public string Subject { get; set; }
    public string LoginId => BookletDataLookup.GetLoginIdByAccessionNumber(ItemNumber);
    public StudentAssessmentTestData(string dataItemNumber, string dataSubject)
    {
        ItemNumber = dataItemNumber;
        Subject = dataSubject;
    }
}
The difference between these two examples is drastic and should be obvious. And the disparity between the frameworks grows even greater when you apply test-driven development to guide the design of your framework.

Next, I'll take you through an example of how to use TDD to guide your framework design using this sample automation scenario.

Write your automated functional test first

The key here is to use plain English to write your automated functional test. It's also important to write only enough to make the test work, so you avoid over-engineering. It should look like this:

[TestFixture]
[Category("SmokeTest")]
public class SampleTestsFeature
{
    [Test]
    [Author("NikolayAdvolodkin")]
    [Description("Validating that sprint 1 form can be filled out")]
    [Property("TCID", "1")]
    public void TCID1()
    {
        Sprint1Page sprintOnePage = new Sprint1Page(Driver);
        sprintOnePage.GoTo();
        sprintOnePage.FillOutAndSubmitForm();
        UltimateQAHomePage ultimateQAHomePage = new UltimateQAHomePage(Driver);
        Assert.That(ultimateQAHomePage.IsOpen, Is.True);
    }
}
Now implement all of the classes, methods, and properties. Here is my implementation, from the example above:

public class Sprint1Page
{
    private IWebDriver Driver { get; set; }

    public Sprint1Page(IWebDriver driver)
    {
        Driver = driver;
    }

    public void GoTo()
    {
        Driver.Navigate().GoToUrl("https://www.ultimateqa.com/sample-application-lifecycle-sprint-1/");
    }

    public bool IsOpen =>
        Driver.Url.Equals("https://www.ultimateqa.com/sample-application-lifecycle-sprint-1/");

    public void FillOutAndSubmitForm()
    {
        Driver.FindElement(By.ClassName("firstname")).SendKeys("testUser");
        Driver.FindElement(By.XPath("//*[@type='submit']")).Click();
    }
}
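
The test above also references a second page object, UltimateQAHomePage, which isn't shown. A minimal sketch might look like the following; the exact URL that the form submission lands on is an assumption.

public class UltimateQAHomePage
{
    private IWebDriver Driver { get; set; }

    public UltimateQAHomePage(IWebDriver driver)
    {
        Driver = driver;
    }

    // Assumption: submitting the sprint 1 form redirects to the UltimateQA home page.
    public bool IsOpen =>
        Driver.Url.Equals("https://www.ultimateqa.com/");
}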
Note that you should only implement the test to make it work. Don't do any more than that. Don't try to predict the future, code for all scenarios, or otherwise over-engineer it. Meet the requirements, make sure your test is stable, and move on.

Notice how, with a single automated test, I created two page objects and have a complete framework ready. Of course, the framework needs enhancements. But it's there, ready to execute as needed.

How to start automation

There are three steps you need to follow to get started with your automation.

Step 1. Create a definition of "done" for your functional automated check

If you want to create a good suite of automated checks, you must have a baseline definition of "done" (DoD). Your DoD is critical for setting standards with your team from the beginning as to what a finished test should look like. I have worked with test automation engineers who have given me an automated check that worked 5% of the time. They weren't jerks; that was simply their DoD.

Some engineers believe that once they've coded a functional test, they are finished. Whether it passes or fails correctly is irrelevant to them, because they believe that their job is to crank out as many functional tests as possible. I disagree.

The job of a test automation engineer is to help deliver the highest-quality software possible in the shortest time feasible. An automation engineer does the job through test automation, as opposed to manual testing or software development. Regardless, quality is the ultimate goal.

By creating a DoD from the beginning, you will eliminate such a horrible mentality and limit the cowboy coders. Below is my DoD for every automated check your team writes. It contains many requirements, but most are best practices. Once you learn them, applying them will become second nature. Start here and then modify based on your needs. 

DoD for an automated functional test

  1. The automated functional test can be successfully executed in development, test, and production servers.
  2. The automated test produces accurate results more than 95% of the time.
  3. Your test uses the page object pattern.
  4. Your test does not contain Selenium commands (anything referencing OpenQA.Selenium).
  5. Any hard-coded sleeps (Thread.Sleep) require justification from the lead.
  6. A test longer than 20 lines of code requires special justification from the lead.
  7. Leave code cleaner than you found it—clean up one extra thing every time you touch the code.
  8. No dead code, a.k.a. commented-out code.
  9. No comments. If your code cannot explain what it does, refactor the code until it can.
  10. Your page object class is not more than 200 lines long.
  11. Your page object contains behavior to interact with the application.
  12. Your page object does not contain hard-coded sleeps (Thread.Sleep); see the wait sketch after this list.
  13. Page object classes cannot override default implicit wait timeout unless approved by the automation lead.
  14. Prefer CSS over XPath.
  15. Tests follow the hermetically sealed design pattern.
  16. All code follows the "Don’t repeat yourself" (DRY) principle.
  17. No function is more than 15 lines long.
  18. No class is more than 200 lines long.
  19. A function with three or more arguments requires approval from the automation lead.
  20. A function should never contain a Boolean as an argument.
  21. Prefer to follow DRY over single responsibility. Therefore, your method may do up to two things, if it removes duplication from the code. If it does not remove duplication, a method may only do one thing. For example, a constant step with an assertion following may be placed into a single method that performs a step, with the assertion immediately following.
  22. Finally, as exit criteria, your automated check must meet all of the criteria above.
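
For items 5, 12, and 13, the usual replacement for a hard-coded sleep is an explicit wait inside the page object. Here is a minimal sketch, assuming a page object that already holds an IWebDriver; the FormPage name, locator, and timeout are illustrative only.

using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Support.UI;

public class FormPage
{
    private IWebDriver Driver { get; }

    public FormPage(IWebDriver driver)
    {
        Driver = driver;
    }

    // Instead of Thread.Sleep, poll until the submit button is displayed and enabled.
    public void SubmitFormWhenReady()
    {
        var wait = new WebDriverWait(Driver, TimeSpan.FromSeconds(10));
        wait.IgnoreExceptionTypes(typeof(NoSuchElementException));
        var submitButton = wait.Until(driver =>
        {
            var element = driver.FindElement(By.CssSelector("[type='submit']"));
            return element.Displayed && element.Enabled ? element : null;
        });
        submitButton.Click();
    }
}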

 

How to choose what to automate

Here are a few tips to help you decide what should be automated.

  • Start your functional test automation with the easiest scenarios possible. A big benefit of automation is that it helps to save on manual testing efforts. That's why, in most instances, it is more beneficial to automate ten easy tests than one hard one. Easy tests are usually more stable and easier to maintain for automation engineers.
  • Apply iterative development to your automated functional tests. Use agile principles to develop your test automation. Rather than spending days or weeks working on a single test, spend a few hours. Release and execute your code often to demonstrate value to your employer. Releasing often also allows you to quickly fix any issues that occur in different environments.

Below is my step-by-step decision process for picking an automation candidate. If your application doesn't contain a certain test suite type, skip it and move on to the next. Also, try to set these expectations with management from the beginning. Unless management is highly technical with regard to functional test automation, stick with the plan.

Automate smoke tests

Smoke tests are meant to determine whether the system catches fire as soon as it turns on. If these tests don't pass, there is no point in testing the rest of the system. Smoke tests are the easiest to automate and provide a high return on investment. Usually, a good smoke test touches no more than about three pages.

Examples of smoke tests include:

  • Does the main page of the application load?
  • Can you log in as a user into the application?
  • Can you start the main flow of the application?

Here is a code example of an automated functional smoke test using this practice site.

[Test]
public void Test4()
{
    var complicatedPage = new ComplicatedPage(Driver);
    complicatedPage.GoTo();
    Assert.IsTrue(complicatedPage.IsAt(),
        "The complicated page did not open successfully");
}

These are not smoke tests:

  • Are all elements present on the Complicated Page?
  • Can you interact with every element on the Complicated Page?
  • Can you navigate from the beginning to the end of this sample application?

 

Automate regression tests

The automation of regression tests saves manual testing time for the team and prevents regressions in the application. Following the principles outlined at the beginning of this step, make sure that you start with what's easiest to automate. Again, in most cases with a regression suite, it is better to automate ten easy tests than a single difficult one.

You can measure the complexity of a regression test by the number of pages that it touches. Start with regression tests that touch only two pages, then automate regression tests that touch three pages, and so on.

Examples of regression tests:

  • Do all the buttons in the section "Section of Buttons" work on this page?
  • Can a user book a flight and a car package together using this practice site?
  • Can a user filter by five-star hotels?

Here is a code example for a simple regression test:

[Test]
public void Test4()
{
    var complicatedPage = new ComplicatedPage(Driver);
    complicatedPage.GoTo();
    Assert.IsTrue(complicatedPage.IsAt(),
        "The complicated page did not open successfully");
    complicatedPage.LeftSidebar.Search("selenium");
    Assert.That(complicatedPage.IsAt(), Is.False);
}

Here are the exit criteria for the regression suite:

  • The maximum number of automated GUI tests allowed was reached, or
  • It is easier to automate the next category of tests

 

Automate data-driven tests

Data-driven tests offer an excellent return on investment because they allow a single test to iterate through multiple permutations using different data points. This means that writing a single test method can yield 10, 100, or even 1,000 tests. Now, that's good ROI.

However, these are the least important when compared to the smoke and regression tests. In most cases, data-driven tests take a long time to run. Also, this type of testing can be performed at the unit test level more efficiently.

Good examples of data-driven tests include:

  • Open every page of the application and perform visual validation testing.
  • Iterate through all of the application's links to ensure that all work.
  • Check all of the boundary conditions for different fields.
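
To sketch how one method fans out into many checks, here is a minimal NUnit data-driven boundary test. The UsernameValidator class and its 20-character limit are hypothetical stand-ins for whatever field or rule your application exposes.

using NUnit.Framework;

// Hypothetical validator standing in for the application logic under test.
public static class UsernameValidator
{
    public static bool IsValid(string username) =>
        !string.IsNullOrEmpty(username) && username.Length <= 20;
}

[TestFixture]
public class UsernameBoundaryTests
{
    // Each TestCase row becomes its own test; coverage grows by adding rows, not methods.
    [TestCase("", false)]
    [TestCase("a", true)]
    [TestCase("abcdefghijklmnopqrst", true)]   // exactly 20 characters
    [TestCase("abcdefghijklmnopqrstu", false)] // 21 characters, one past the limit
    public void UsernameFieldEnforcesBoundaries(string username, bool shouldBeAccepted)
    {
        Assert.That(UsernameValidator.IsValid(username), Is.EqualTo(shouldBeAccepted));
    }
}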

 

Step 2. Refactor 

Refactoring is a critical step in creating good automated functional tests. You want to go through your DoD to make sure that all of your tests conform to that definition. Pay extra attention to the stability of your tests.

If this is your first test, refactor before moving on to the second test. If you have more than one test at this point, compare all of your tests to check for similarities and code duplication. Then take appropriate action to reduce duplication and code complexity.
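
As a small example of the kind of duplication to hunt for during this step, two tests that both start by opening and verifying the same page can push that shared opening into a single [SetUp] method. The fixture below reuses the ComplicatedPage object from the earlier examples and, like them, assumes a Driver supplied by a base test class.

[TestFixture]
public class ComplicatedPageTests
{
    private ComplicatedPage complicatedPage;

    [SetUp]
    public void OpenComplicatedPage()
    {
        // Shared steps extracted from the start of every test in this fixture.
        complicatedPage = new ComplicatedPage(Driver);
        complicatedPage.GoTo();
        Assert.IsTrue(complicatedPage.IsAt(),
            "The complicated page did not open successfully");
    }

    [Test]
    public void SearchNavigatesAwayFromThePage()
    {
        complicatedPage.LeftSidebar.Search("selenium");
        Assert.That(complicatedPage.IsAt(), Is.False);
    }
}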

Step 3. Repeat

Repeat the entire process for every test. Each test must meet the entire definition of done before a test engineer can move on to the next functional check. Each test must also adhere to the two rules established earlier.

Rule 3: When to stop functional test automation

If you follow the steps outlined above, you'll impress management, and you will receive constant requests for automation to be plugged into a software testing process. This is a great sign of success. Just remember that functional test automation must have an end. Otherwise, you will spend all of your days doing maintenance on a system that is too big.

Prefer stability and quality over quantity

Do not allow yourself to have more than 200 automated functional GUI tests per automation engineer. At this point, maintaining the quality of the code base becomes tough. Every test should always be monitored to follow the DoD. Furthermore, running all these tests and gathering results will take a long time.

At this point, automation engineers begin to spend all of their time maintaining the framework, while the application under test continues to evolve. The focus shifts from helping to improve the quality of the application to helping to maintain the test suite. That's not a good trade-off in terms of ROI.

Move to the API layer

Most systems have more than 200 test cases that need test automation. Therefore, you should do functional test automation at the API layer, rather than the GUI layer. Technically, API level tests should happen prior to GUI tests. But, sadly, most companies have the automation pyramid reversed. This approach will help to reverse that bad cycle of constantly adding to a giant GUI test suite. 
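
As an illustration of what a move to the API layer looks like in the same NUnit style, here is a minimal sketch using HttpClient; the endpoint URL is an assumption, so swap in your own service.

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using NUnit.Framework;

[TestFixture]
public class HotelApiTests
{
    private static readonly HttpClient Client = new HttpClient();

    [Test]
    public async Task GetFiveStarHotels_ReturnsOk()
    {
        // No browser and no page objects: one HTTP call and an assertion on the response.
        var response = await Client.GetAsync("https://api.example.com/hotels?stars=5");
        Assert.That(response.StatusCode, Is.EqualTo(HttpStatusCode.OK));
    }
}

Checks like this run in a fraction of the time of a GUI test and are far less brittle, which is why the test automation pyramid puts them below the GUI layer.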

Move to the unit test layer

Test automation engineers do not normally write unit tests. That makes it tough to penetrate the unit test layer, which is typically controlled by developers. This sad reality is even more painful because a great suite of unit tests offers the best ROI of any type of test automation. Unit tests are quick to write, quick to run, and the most robust. They don't need to interact with any web technologies, as GUI tests do.
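
To show how lightweight that layer is, here is a minimal unit test sketch; the PriceCalculator class is hypothetical and stands in for any piece of application logic a developer owns.

using NUnit.Framework;

// Hypothetical application logic; in practice you would test the real production class.
public class PriceCalculator
{
    public decimal ApplyDiscount(decimal price, decimal percent) =>
        price - (price * percent / 100m);
}

[TestFixture]
public class PriceCalculatorTests
{
    [Test]
    public void ApplyDiscount_TakesPercentOffThePrice()
    {
        // Runs in milliseconds with no browser, driver, or environment dependencies.
        var calculator = new PriceCalculator();
        Assert.That(calculator.ApplyDiscount(100m, 10m), Is.EqualTo(90m));
    }
}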

Be a champion of quality, and have conversations with your team about including more unit tests in the automation pipeline. You can even offer to write unit tests for the team to improve the quality of the application.

But at the end of the day, if your developers don't believe that unit tests have value, it will be tough to make anything happen at this layer. And unfortunately, the majority of developers still don't believe in the value of unit tests.

Living the dream

This guide can help you build robust functional test automation. If you follow the steps, you will have an excellent system that functions well for years to come. Your automated checks will be easy to create, run, and maintain. That's the dream of every test automation engineer.

Share your test automation thoughts and techniques in the comments section below.
