Creating a Bunch of Test Automation Scripts Is a Waste of Money

It is extremely important to develop a strategic plan when creating test automation scripts.  With the push toward Agile and DevOps, the prevailing mindset has become to automate as much testing as possible.  While that idea makes sense, it is important to determine how much test automation is actually needed.  Part of the problem is that those who manage Agile or DevOps teams often come from a very different background and don’t understand basic testing principles.  Instead of taking the time to learn them, they impose their own assumptions and push test engineering in the wrong direction.

Test Automation Planning

Test automation is certainly critical for Agile and DevOps.  There are things that can and should be automated, but before that work begins, a few questions must be asked:

  • How many times are these test automation scripts going to be executed?
  • Is the code stable or is it going to change a lot over the next several sprints?
  • Has the test been executed manually and has it passed successfully?
  • How much time will it take to maintain the test scripts?
  • Do you have the right resources to build and update the test scripts?
  • What types of tools are you going to use?
  • What type of framework are you going to build?
  • How much does the test automation software cost?

These questions will help lay the foundation of your test automation planning.  The next step will be to identify the number of test cases that need to be automated.

Testing That Should Not Be Automated

Here is a list of test cases that should not be automated:

  • Test cases that will only be executed a few times
  • Test cases that require a human touch, for example a review of a customer bill
  • Test cases that require a third-party system, such as a payment gateway, that doesn’t have a test environment
  • Test cases where the code is unstable and constantly changing

Test Case Quality

Test automation scripts help provide test coverage across applications, which allows faster deployments because less manual testing is required.  However, if you create hundreds of test cases and they never identify any defects, is it really worth maintaining and running them over and over again?  Probably not.  The test cases that are automated should be able to capture defects.  Therefore, it is important that you have some really sharp manual testers who can think through the testing scenarios and identify problem areas that need attention.  Without that system knowledge, you will not capture defects, and the effort will not be a good use of your resources.

Return on Investment

It is really important to understand how much cost is involved in creating automated tests.  If you create thousands of them, how much will it cost to update them when the GUI or back-end systems change?  If it is your department or team, as a manager or employee, you really need to be able to answer this challenging question.  At every organization, I make sure that I calculate the ROI.  There are a few ways to do this, but at a minimum you need to know the following:

  1. The cost of a member of your team, or the average rate per contractor or employee.  Most organizations will have that already.
  2. How many scripts are going to be automated and how many times they will be executed.
  3. How much time it takes to execute those test cases manually.
  4. How much time each test case takes to execute once it is automated.

These items will at least give you a basic ROI, as sketched in the example below.  Most ROI break-even points fall somewhere between 6 and 12 months; anything over a year probably isn’t worth it.  There are other variations that are more complex, but this will at least get you started.  Knowing this information is certainly powerful, and it will help justify the expense of having automation engineers.
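
To make the calculation concrete, here is a minimal sketch in Java of a break-even calculation based on the four items above.  Every number in it is an illustrative assumption (hourly rate, script count, build and maintenance effort, execution times), not data from a real project, so substitute your own figures.

    public class AutomationRoi {
        public static void main(String[] args) {
            // Illustrative assumptions only; replace with your own numbers.
            double hourlyRate = 50.0;                 // blended cost per tester hour
            int scriptCount = 200;                    // test cases to automate
            double buildHoursPerTest = 3.0;           // effort to automate one test
            double maintHoursPerTestPerMonth = 0.25;  // upkeep per test per month
            double manualHoursPerRun = 0.5;           // time to run one case manually
            double automatedHoursPerRun = 0.05;       // analysis time per automated run
            int runsPerMonth = 4;                     // e.g., one regression run per sprint

            double buildCost = scriptCount * buildHoursPerTest * hourlyRate;
            double monthlyMaintenance = scriptCount * maintHoursPerTestPerMonth * hourlyRate;
            double monthlySavings = scriptCount * runsPerMonth
                    * (manualHoursPerRun - automatedHoursPerRun) * hourlyRate;
            double breakEvenMonths = buildCost / (monthlySavings - monthlyMaintenance);

            System.out.printf("Build cost: $%.0f%n", buildCost);
            System.out.printf("Net monthly benefit: $%.0f%n", monthlySavings - monthlyMaintenance);
            System.out.printf("Break-even point: %.1f months%n", breakEvenMonths);
        }
    }

If the break-even point lands well past the 12-month mark, that is a strong signal to trim the scope of the automation effort.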

I cannot stress enough the importance of these basic steps.  With Agile and DevOps, everything tends to be rushed.  Perform this basic analysis first; a little time spent up front will yield large returns in the long run.

Role of an Agile QA Engineer

Agile software testing has literally transformed the role of an Agile QA Engineer.  There are several key adjustments that software testing organizations will need to make in order to operate effectively with Agile.

Documentation:  It isn’t practical within Agile for the Agile QA Engineer to have lots of documentation.  Don’t get me wrong, there does need to be sufficient documentation for the QA Engineers to do their job well.  If a developer tries to convince you that the Agile Manifesto doesn’t require documentation, Houston, we have a problem!

  • In terms of requirements, there needs to be enough documentation that the Agile QA Engineer knows how the system is supposed to operate.  On my team, we currently use a one-page document that provides that information for each story.  It really does help the agile testers know what is expected so they can plan adequately.
  • For the software test plan, it isn’t necessary to go through all the hoopla of creating a massive plan that nobody will read or follow.  A one-page document will be sufficient.  I encourage the creation of the one-page plan because it forces the agile tester to think about what testing is needed.
  • For the testing sign-off, a simple email will suffice.

Test Automation: One of the hottest topics, naturally, is test automation, especially within Agile software testing.  Test automation is without a doubt critical.  However, there needs to be a strategy that is going to work.  In previous organizations, I have built world-class automation teams.  We had thousands of automated test cases within our regression suites.  Were all those tests necessary?  Probably not.  Were they expensive to maintain?  Absolutely!  I believe having a centralized automation team that can build and execute the automated tests is most effective; the team prioritizes the work based upon the needs of the various agile teams, and I have found this to work well.  I usually recommend that the agile testers execute the tests manually within the current sprint and, once those have passed, hand them off to the automation team to automate and add to the regression suite.  Having an Agile QA Engineer do both the functional and automated tests within a given sprint is extremely challenging; often the functional testing gets done and the automation gets put on the back burner.

Performance Testing: Another challenge is system performance.  If automation takes a back seat, then performance testing is usually even further behind.  To deal with that challenge, a centralized performance team will help; usually 2-3 performance engineers will be sufficient.  I encourage a very close look at performance testing within the Agile context: if you don’t need it, don’t do it; if you do need it, you had better do it.
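
For teams that do not yet have a dedicated performance tool, a rough sketch like the one below can at least expose glaring latency problems early.  It is plain Java (11+); the endpoint URL and load numbers are hypothetical, and it is no substitute for a proper performance tool run by the performance team.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class SimpleLoadCheck {
        public static void main(String[] args) throws Exception {
            String url = "https://example.com/api/health"; // hypothetical endpoint
            int virtualUsers = 10;
            int requestsPerUser = 20;

            HttpClient client = HttpClient.newHttpClient();
            ExecutorService pool = Executors.newFixedThreadPool(virtualUsers);
            List<Future<Long>> results = new ArrayList<>();

            for (int u = 0; u < virtualUsers; u++) {
                results.add(pool.submit(() -> {
                    long worstMs = 0;
                    for (int i = 0; i < requestsPerUser; i++) {
                        long start = System.nanoTime();
                        HttpRequest req = HttpRequest.newBuilder(URI.create(url)).GET().build();
                        client.send(req, HttpResponse.BodyHandlers.discarding());
                        worstMs = Math.max(worstMs, (System.nanoTime() - start) / 1_000_000);
                    }
                    return worstMs; // worst-case latency seen by this virtual user, in ms
                }));
            }

            long worstOverall = 0;
            for (Future<Long> f : results) {
                worstOverall = Math.max(worstOverall, f.get());
            }
            pool.shutdown();
            System.out.println("Worst observed latency: " + worstOverall + " ms");
        }
    }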

UI versus Web Service:  Agile software testing requires some fundamental changes in terms of how things get tested.  More and more, agile software testing is taking a closer look at web service testing.  It is a critical layer that can find a lot of defects.  Agile teams need to spend more time in this area because of the number of applications that use APIs to transfer information.  It is a lot quicker to run hundreds of transactions through an API than through the UI, and agile software testing teams that are able to make this transition are simply more effective.
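
As a concrete illustration of the difference, here is a minimal sketch of a web service check using the standard Java 11+ HttpClient.  The endpoint, the expected field, and the order number are all hypothetical; the point is that one HTTP call and a couple of assertions replace an entire UI navigation flow.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class ApiSmokeTest {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://example.com/api/orders/12345")) // hypothetical endpoint
                    .header("Accept", "application/json")
                    .GET()
                    .build();

            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

            // Check the status code and a field in the payload instead of driving the UI.
            if (response.statusCode() != 200 || !response.body().contains("\"orderId\"")) {
                throw new AssertionError("API check failed with status " + response.statusCode());
            }
            System.out.println("Order API verified with a single HTTP call.");
        }
    }

In a real suite this would live in JUnit or TestNG and loop over many order records, which is exactly where the speed advantage over the UI shows up.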

Let’s face it, transforming your software testing organization is going to be very challenging.  Agile software testing is demanding, and the key is to have a strong team that is flexible and adaptable to constant change.  The teams that can do that most effectively will be successful.

Role of Agile QA Manager

If you are an Agile QA Manager, your role has probably changed a good bit from the more traditional Waterfall methodology.  It certainly requires a bit of adjustment, but if you are able to make the transition, it will certainly help ensure your organization delivers high-quality software.  Chances are your involvement will change, and it will require a different mindset and approach.  Here are a few items that will make an Agile QA Manager successful:

  1. Documentation: Certainly this is a huge change from Waterfall, where the documentation is a lot heavier and includes requirements, test plans, and design documents.  Documentation is important and, contrary to what you might hear, it is still an important aspect of Agile.  I still encourage my team to create a small requirements document for each story, and that really helps the QA testers.  For QA, I encourage creating a one-page QA test plan; that helps development and product owners understand what is tested and ensures everyone is on the same page.  In terms of QA sign-off, I simply require an email stating that QA has signed off on the story and release.  As an Agile QA Manager, I encourage you to keep things simple and lean, especially around the documentation.
  2. Collaboration: Collaboration within Agile is extremely important; it is one of the most important elements of delivering a high-quality product.  The QA tester needs to work hand in hand with the BAs, developers, and other QA resources.  Collaboration helps to get the best information and gain a solid understanding of the product in order to test the application more effectively.  Where possible, I encourage the QA resources to sit with the other members of the agile team.
  3. Automation:  It is important for automation to occur, but it needs to be done efficiently.  Maintaining automated test scripts is expensive, so it is important for the Agile QA Manager to take a very close look at the amount of automation in place.  I have found that automation is most effectively done by dedicated automation resources, and I have set them up as a separate team.  Many teams have the QA engineer perform both functional and automated testing, but I have found that when push comes to shove, the automation takes a back seat and isn’t a priority.
  4. Performance:  Performance testing is important and needed.  Some applications need more performance testing than others, so a one-size-fits-all model doesn’t necessarily work.  This is where the experience of the Agile QA Manager comes into play: if you think it is needed, then the effort certainly must be there.  I have also set up a small team of performance engineers that sits outside the agile teams, and that has worked well for me.
  5. Level of Involvement: It is important to know what the QA engineers are doing and what is being tested; however, sometimes it is important to take a back seat and not get involved in the nuts and bolts of the daily activities.  Chances are high that you will get involved when there are escalations and issues.
  6. Champion of Quality: While it is true within Agile that quality is the team’s responsibility, you need to ensure that, across the organization, you are personally involved in helping to build quality.  Things like documentation and testing are important and should be consistent across all the agile teams.  The better information your team has, the higher the chances they will find issues before they hit production.
  7. Metrics:  As an Agile QA Manager, metrics are your best friend.  They will help you pinpoint issues and help prevent production disasters.  I encourage you to place a strong emphasis on metrics so you base your decisions about quality on data rather than gut feel; a simple example is sketched below.
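
As one example of a metric worth tracking, here is a small sketch of a defect escape rate calculation in Java.  The defect counts and the 10% threshold are made-up illustrations; in practice you would pull the numbers from your defect tracker.

    public class QualityMetrics {
        public static void main(String[] args) {
            // Illustrative numbers for one release; pull real counts from your defect tracker.
            int defectsFoundInQa = 42;
            int defectsFoundInProduction = 3;

            double escapeRate = 100.0 * defectsFoundInProduction
                    / (defectsFoundInQa + defectsFoundInProduction);

            System.out.printf("Defect escape rate: %.1f%%%n", escapeRate);
            if (escapeRate > 10.0) { // threshold is an assumption; tune it for your context
                System.out.println("Escape rate is trending high; review coverage in this area.");
            }
        }
    }

Trended release over release, a number like this tells you far more about where quality is heading than gut feel ever will.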

I hope this information has been helpful, and I encourage you to take a close look at all of these items so you can become a more effective Agile QA Manager.

If you are looking for additional information on Agile roles, you can find it here:

Role of an Agile Development Manager

Role of an Agile QA Engineer

Role of an Agile Business Analyst

Agile Software Testing Transformation

Agile software testing has literally transformed how organizations test software.  There are several key adjustments that software testing organizations will need to make in order to operate effectively with Agile.

Documentation:  It isn’t practical within Agile to have lots of documentation.  Don’t get me wrong, there does need to be sufficient documentation for the agile software testers to do their job well.  If a developer tries to convince you that the Agile Manifesto doesn’t require documentation, Houston, we have a problem!

  • In terms of requirements, there needs to be enough documentation that the tester knows how the system is supposed to operate.  On my team, we currently use a one-page document that provides that information for each story.  It really does help the agile testers know what is expected so they can plan adequately.
  • For the software test plan, it isn’t necessary to go through all the hoopla of creating a massive plan that nobody will read or follow.  A one-page document will be sufficient.  I encourage the creation of the one-page plan because it forces the agile tester to think about what testing is needed.
  • For the testing sign-off, a simple email will suffice.

Automation: One of the hottest topics, naturally, is test automation, especially within Agile software testing.  Test automation is without a doubt critical.  However, there needs to be a strategy that is going to work.  In previous organizations, I have built world-class automation teams.  We had thousands of automated test cases within our regression suites.  Were all those tests necessary?  Probably not.  Were they expensive to maintain?  Absolutely!  I believe having a centralized automation team that can build and execute the automated tests is most effective; the team prioritizes the work based upon the needs of the various agile teams, and I have found this to work well.  I usually recommend that the agile testers execute the tests manually within the current sprint and, once those have passed, hand them off to the automation team to automate and add to the regression suite.  Having a tester do both the functional and automated tests within a given sprint is extremely challenging; often the functional testing gets done and the automation gets put on the back burner.

Performance: Another challenge is system performance.  If automation takes a back seat, then performance testing is usually even further behind.  To deal with that challenge, a centralized performance team will help; usually 2-3 performance engineers will be sufficient.  I encourage a very close look at performance testing within the Agile context: if you don’t need it, don’t do it; if you do need it, you had better do it.

UI versus Web Service: Agile software testing requires some fundamental changes in terms of how things get tested.  More and more, agile software testing is taking a closer look at web service testing.  It is a critical layer that can find a lot of defects.  Agile teams need to spend more time in this area because of the number of applications that use APIs to transfer information.  It is a lot quicker to run hundreds of transactions through an API than through the UI, and agile software testing teams that are able to make this transition are simply more effective.

Let’s face it, transforming your software testing organization is going to be very challenging.  Agile software testing is demanding, and the key is to have a strong team that is flexible and adaptable to constant change.  The teams that can do that most effectively will be successful.

If you are looking for additional information on Agile roles, you can find it here:

Role of an Agile QA Manager

Role of an Agile Development Manager

Role of an Agile QA Engineer

Role of an Agile Business Analyst

Test Automation: What Programming Language Should You Learn?

The most frequent question I receive by far from software testing professionals is which programming language they should learn.  To answer that question, I usually ask a series of questions to help them figure out which programming language would be most helpful for them.

  • What tool does your automation team use?
  • What types of applications are you testing?
  • Are you planning to use automation and performance testing tools?

This helps to set the context.  Once I have those questions answered, I can give a better answer.  If you are new to software testing but want to learn a language or a tool, I recommend that you simply pick a language and start learning.  The more time you spend trying to figure out the best programming language, the less time you will spend actually learning one.

The challenge is that there are many software testing tools on the market, and several of the big players use different programming languages.  HP QTP/UFT uses VBScript.  Selenium offers bindings for several languages, with Java being the most widely used.  If your organization uses one of these tools, it would be natural for you to learn the corresponding programming language.

Personally, if I were to learn a programming language today, I would focus on Java.  Why?  Because many companies use Selenium as their test automation tool, and many also use Java as their preferred programming language, so to me it just makes sense.  As a software testing professional, if you can master Java, you can naturally move into a Java programming role if you choose to head in that direction.
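
If Java plus Selenium is the direction you choose, a first script can be as small as the sketch below.  It assumes the Selenium Java bindings and a matching ChromeDriver are installed, and the login page, field names, and expected title are all hypothetical placeholders.

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    public class FirstSeleniumCheck {
        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();
            try {
                driver.get("https://example.com/login");  // hypothetical page under test
                driver.findElement(By.name("username")).sendKeys("qa.tester");
                driver.findElement(By.name("password")).sendKeys("not-a-real-password");
                driver.findElement(By.id("submit")).click();

                // A very simple check: did the login land on the expected page?
                if (!driver.getTitle().contains("Dashboard")) {
                    throw new AssertionError("Login did not navigate to the expected page");
                }
            } finally {
                driver.quit();
            }
        }
    }

Getting a script like this running end to end will teach you more Java than any amount of deliberating over which language to pick.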

Let’s face it, test automation is rapidly becoming a requirement for a quality engineering job.  You need some programming background to get to the next level in quality engineering.  Pick a programming language and start today!

Product Model Software Testing

The product model is rapidly gaining traction within the software development lifecycle.  Companies continue to look at options to improve the software delivery process, and I am seeing more and more organizations move in this direction.  The challenge is determining what role software testing will play in this process.

The product model leverages the Agile methodology to develop a product or series of products.  Typically the teams are 6-8 people; they develop solutions in short iterations (sprints) and deploy small code changes into production on a frequent basis (every 2-4 weeks).  The team typically includes business analysts, developers, and testers, and the numbers will vary based upon the needs within the product team.  I see the product model working well in companies that are small to mid-sized and have a relatively lean IT organization.  In addition, I believe companies that have only a few products will perform the best.  If you have a lot of products and those products interface with each other, you will have significant challenges with integration and moving data from one system to another.  This will require extreme collaboration to ensure interfaces and data are provided when they are needed; otherwise the release going into production could be delayed.

The product model team is typically driven by a product manager or product owner.  That individual is responsible for providing direction and helping to remove obstacles.  It is important that the team works well together so they can increase their velocity in delivering software.  If there are one or two team members who aren’t pulling their weight, it could affect the entire team; either those team members improve or they will need to be replaced with people who have the proper skills.  The team also must be self-sufficient and do everything possible not to be dependent on other groups inside or outside the company.  Certain pieces, such as infrastructure setup, might sit outside the team, but that requires constant collaboration to make things happen.  If there are multiple products, those teams will usually roll up to a program manager.

In terms of software testing within the product model, it will involve functional, automation, and performance testing.  Software testing organizations may differ in how they handle the test automation and performance activities.  I have set up a shared service where the functional testers reach out to automation and performance resources who can build and execute the automation and performance tests.  I have also seen examples where the automation and performance activities are handled by the testers within the team.  As long as those activities get done, either approach can work well.

If you would like more information on Agile, DevOps or Software Testing, please visit my Software Testing Blog or my Software Testing YouTube Channel.