Performance Testing: Common Myths/Confusions

Performance Testing, even in today’s changing times, is still considered a niche skill. Because of this, many myths linger around it, giving rise to false notions that ultimately affect the picture presented to the client. Performance testing is often associated with statements such as “delivery-impacting testing”, “redundant testing”, or “not sure if we have budget for such things”. Below are some of these myths and a possible resolution for each. Please do post a comment if you think otherwise or have something similar in mind.

#1-“I always consider the average response time while presenting to the client.” Question – “Why so?” Answer – “Not sure. That’s what my client wants.”

In most cases, the answer I have received to this question is “I will decide as per my client’s need.” I don’t think this is quite right, and if you agree with it you must have a strong reason to support it. This poses two issues. Firstly, is the reasoning “I will decide as per my client’s need” acceptable? And secondly, if not, how should we decide which reading to provide? Let us, for the sake of this discussion, concentrate on the second question.

Let’s consider we have 10 response-time readings (in seconds) for a transaction:
3 18 2 4 20 7 3 12 12 10

Arranging them in ascending order:
2 3 3 4 7 10 12 12 18 20

The average of these readings is 9.1 secs, and the 80th percentile is 12 secs.

Now this tells me:

  • 80 out of 100 times this transaction will complete in 12 secs or faster. Which percentile you report (75th, 80th, 90th) depends on the criticality of the functionality under test, but it gives you the range into which the majority of your transactions fall. The average cannot tell you this, because it carries no information about how the individual readings are distributed. Suppose that, due to some circumstance (inherent system behaviour such as an SQL statement taking longer, extreme paging, etc.), the two slowest transactions rise to 40 and 50 secs instead of 18 and 20. This change pushes the average from 9.1 up to 14.3 secs, while the 80th percentile remains at 12 secs. You might ask whether we have just ignored major issues like slow SQL or extreme paging. The answer is no. We know there is an issue, but, as said before, the percentile can be chosen to match the criticality. Had we reported the 90th percentile, it would be 40 secs and a concern to me and the client. The 80th percentile, however, says that 80 times out of 100 there is no issue; if the remaining 20 times are tolerable you can sign off the readings, and if not you can look closer.
  • Also, when a second round of the same test is conducted and the 80th percentile for this transaction comes out at 15 secs, you can be sure there is a 25% degradation (from 12 to 15 secs) in this transaction. This helps a lot if your system is undergoing tuning or performance improvements: this information tells you exactly how much better or worse the change has made your response times. The average, again, cannot give a reliable comparison, because it carries no information about the distribution of the readings behind it.
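To make the comparison concrete, here is a minimal Python sketch using the readings above. It uses the simple nearest-rank definition of a percentile; this is an assumption for illustration, as load-testing tools may interpolate percentiles differently:

```python
import math

def nearest_rank_percentile(readings, pct):
    """Nearest-rank percentile: the value at rank ceil(pct/100 * n) of the sorted data."""
    ordered = sorted(readings)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

readings = [3, 18, 2, 4, 20, 7, 3, 12, 12, 10]
print(sum(readings) / len(readings))          # 9.1
print(nearest_rank_percentile(readings, 80))  # 12

# Same run, but the two slowest transactions spike to 40 and 50 secs:
spiked = [3, 40, 2, 4, 50, 7, 3, 12, 12, 10]
print(sum(spiked) / len(spiked))              # 14.3 -- the average jumps
print(nearest_rank_percentile(spiked, 80))    # 12   -- unchanged
print(nearest_rank_percentile(spiked, 90))    # 40   -- the outliers surface here
```

The spike drags the average from 9.1 to 14.3 secs while the 80th percentile stays at 12 secs, and choosing the 90th percentile instead surfaces the 40-sec outlier, exactly as described above.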

#2-Once the bottleneck is found, the next step is not considered an integral part of Performance Testing

This mostly depends upon the experience level of the person involved in the project, and that should be respected; however, pointing out the issue should only be the first step of Performance Testing. The people involved should develop, with experience, the ability to understand the system and suggest possible workarounds for such scenarios. It might sound unpragmatic, but believe me, this is what the market demands and it should be answered. At a minimum, people should build up enough experience to suggest what a potential issue could be. The example below gives a better insight into the approach. Of course, it is just one possibility out of the ocean of such issues you might face.

Let’s assume, for example, that one of the transactions you are dealing with shows a higher response time. What should your first step be? Personally, I would check the database logs. Let’s further assume, at the risk of becoming specific, that the database involved is Oracle. Ask for AWR reports. Analysing an AWR report requires an understanding of SQL, execution plans, parsing, and bind variables. You can either go for the ADDM report (an automated analysis of the AWR report) or do the analysis yourself. Getting back to the question at hand: if the issue is response time, we can safely start with the top SQL statements in the AWR report. Just looking at the top SQL will not solve your problem, though. You must examine the query and check whether it corresponds to the transaction in question. What I mean is: if your UI transaction is performing a submit and the top SQL is a select query, it is not the right data to analyse.

Next, look at the query at the top of the pile by elapsed time, since response time is the question. If the query ran only once or twice during the execution period and its elapsed time is still quite high, this could be your potential bottleneck. You can also check the hard-parse to soft-parse ratio, which is normally high in cases where there is an SQL statement with a high elapsed time as discussed. If this is what you conclude, then one suggestion you can make as a possible resolution is to look for hard-coded values in the SQL queries.
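As an illustration of that last suggestion, the sketch below uses Python’s sqlite3 as a stand-in for Oracle (the table and queries are made up for this example). Literals embedded in the SQL text make every statement unique, which on Oracle would force a hard parse each time; a bind variable keeps one shared SQL text that is parsed once and reused:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, "OPEN") for i in range(100)])

# Hard-coded literal: each id yields a distinct SQL text, so the database
# would treat every statement as new and hard parse it.
hard_coded = {f"SELECT status FROM orders WHERE id = {i}" for i in range(100)}

# Bind variable: one SQL text, parsed once and reused with different values.
bound = {"SELECT status FROM orders WHERE id = ?" for _ in range(100)}

print(len(hard_coded), len(bound))  # 100 distinct texts vs. 1 shared text
row = conn.execute("SELECT status FROM orders WHERE id = ?", (42,)).fetchone()
print(row)  # ('OPEN',)
```

The 100-to-1 difference in distinct statement texts is what shows up in AWR as a poor hard-parse to soft-parse ratio.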

#3-It is assumed that all kinds of Performance Testing (load, stress, endurance, assisting in DR, etc.) will be executed if Performance Testing is employed for a particular project

This is something that always annoys me when I interview people. I get all kinds of answers: “I will do the load test first and, if that is successful, I will consider doing the stress and endurance tests.” This could be true, in the sense that if your load test is not successful you cannot go ahead with a stress or endurance test. But that is not the reason for including the other tests in your plan. As far as I understand, the Non-Functional Requirements should determine which tests are in scope. Consider, for example, NFRs like these:

“NFR # – The system should cater for 5% growth in user load YOY.” This clearly translates to you, as the Performance Tester, including a stress test in your plan. How you design this test is another topic.

“NFR # – The system should be able to sustain load for the full period of application uptime.” This translates to you including an endurance, or long-haul, test in your plan.
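As a quick sketch of how the first NFR might be turned into a test target (the 1,000-user baseline and the three-year horizon are assumptions for illustration, not figures from the NFR):

```python
# Hypothetical sizing for the YOY-growth NFR above: if today's peak is
# 1,000 concurrent users (an assumed baseline) and load grows 5% per year,
# the stress test should target the projected peak at the planning horizon.
baseline_users = 1_000
growth_rate = 0.05
years = 3

target_load = baseline_users * (1 + growth_rate) ** years
print(round(target_load))  # -> 1158 concurrent users
```

Designing the actual stress test around that projected figure is, as noted above, another topic.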

There are even more such myths associated with Performance Testing, as it is still considered a niche skill. But I strongly believe that experience and proactiveness are of utmost importance in resolving these myths and confusions and in changing how we deliver to the client.

About the Author

Mital Majmundar is a Quality Assurance Engineer with a primary focus on Performance and Automation testing.  Please visit his LinkedIn Profile here.

Role of an Agile QA Engineer

Agile Software Testing has literally transformed the role of an Agile QA Engineer.  There are several key differences to which software testing organizations will need to adjust in order to operate effectively with Agile.

Documentation:  It isn’t practical within Agile for the Agile QA Engineer to have lots of documentation.  Don’t get me wrong, there does need to be sufficient documentation for the QA Engineers to do their job well. If a developer tries to convince you that the Agile Manifesto doesn’t require documentation, Houston, we have a problem!

  • In terms of requirements, there needs to be enough documentation that the Agile QA Engineer knows how the system is supposed to operate.  In my team, we currently use a 1 page document that will provide that information for each story.  It really does help the agile testers know what is expected so they can adequately plan.
  • For the Software Test Plan, it isn’t necessary to go through all the hoopla of creating a massive plan that nobody will read or follow.  A one page document will be sufficient.  I encourage the creation of the one page plan because it forces the agile tester to think about what testing is needed.
  • For the testing sign off a simple email will suffice.

Test Automation: One of the hottest topics, naturally, is test automation, especially within Agile software testing.  Test automation without a doubt is critical.  However, there needs to be a strategy that is going to work.  In previous organizations, I have built world-class automation teams.  We had thousands of automated test cases within our regression suites.  Were all those tests necessary?  Probably not.  Were they expensive to maintain?  Absolutely!  I believe having a centralized automation team that can build and execute the automated tests is most effective.  The team prioritizes the work based upon the needs of the various agile teams.  I have found this to work most effectively.  I usually recommend that the agile testers execute the tests within the current sprint and, once those have passed, hand them off to the automation team to automate and put in the regression suite.  Having an Agile QA Engineer do both the functional and automated tests within a given sprint is extremely challenging; often the functional testing gets done and the automation gets put on the back burner.

Performance Testing: Another challenge is system performance.  If automation takes a back seat, then performance testing is usually behind even that.  To deal with this challenge, a centralized performance team will help; usually 2-3 performance engineers will be sufficient.  I encourage a very close look at performance testing within the Agile context: if you don’t need it, don’t do it; if you need it, you had better do it.

UI versus Web Service:  Agile software testing requires some fundamental changes in terms of how things get tested.  More and more, agile software testing is taking a closer look at web service testing.  It is a critical component that can find a lot of defects.  Agile teams need to spend more time in this area because of the number of applications that use APIs to transfer information.  It is a lot quicker to run hundreds of transactions through an API than through the UI.  Agile software testing teams who are able to make this transition are simply more effective.
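To illustrate why API-level transactions are so much cheaper than UI flows, here is a hedged Python sketch; the endpoint, URL, and `build_order_request` helper are all hypothetical, invented for this example:

```python
import json
import urllib.request

# Assumed endpoint for illustration only; the article names no real API.
BASE_URL = "https://api.example.com"

def build_order_request(item_id, qty):
    """Build the POST request for one 'create order' transaction."""
    payload = json.dumps({"item": item_id, "qty": qty}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/orders",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Preparing (and, with urllib.request.urlopen, firing) hundreds of these
# takes seconds; clicking the same flow through a browser UI takes orders
# of magnitude longer per transaction.
transactions = [build_order_request(item, 1) for item in range(100)]
print(len(transactions))  # 100 prepared API transactions
```

The same coverage through the UI would require rendering and navigating pages for each of the hundred orders, which is exactly the cost the API route avoids.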

Let’s face it, transforming your software testing organization is going to be very challenging.  Agile software testing is extremely challenging.  The key is to have a strong team who is flexible and is adaptable to constant change.  The teams that can do that most effectively will be successful.

Role of Agile QA Manager


If you are an Agile QA Manager, your role has probably changed a good bit from the more traditional Waterfall methodology.  It certainly requires a bit of adjustment, but if you are able to make the transition, it will certainly help ensure your organization delivers high-quality software.  Chances are your involvement will change, and it will require a different mindset and approach.  Here are a few items that will make an Agile QA Manager successful:

  1. Documentation: Certainly this is a huge change from waterfall.  In typical waterfall, the documentation is a lot heavier, and that includes requirements, test plans, and design documents.  Documentation is important and, contrary to what you might hear, it is still an important aspect of Agile.  I still encourage my team to create a small requirements document for each story, and that really helps the QA testers.  For QA, I encourage creating a 1-page QA test plan; that helps development and product owners understand what is tested and ensures everyone is on the same page.  In terms of QA sign-off, I simply require an email stating that QA has signed off on the story and release.  As an Agile QA Manager, I encourage you to keep things simple and lean, especially around the documentation.
  2. Collaboration: Collaboration within Agile is extremely important and it is one of the most important elements of having a high quality product.  It is important for the QA tester to work hand in hand with the BA, Developers, and other QA resources.  Collaboration helps to get the best information and gain a solid understanding of the product in order to be able to test the application more effectively.  Where possible, I encourage the QA resources to sit with the other members of the Agile team.
  3. Automation:  It is important for automation to occur, but it needs to be done efficiently.  Maintaining automation test scripts is expensive so it is important for the Agile QA Manager to take a very close look at the amount of automation in place.  I have found that the automation is most effectively done by dedicated automation resources and I have set them up as a separate team.  Many teams have the QA engineer perform both functional and automated testing, but I have found that when push comes to shove, the automation takes a back seat and isn’t a priority.
  4. Performance:  Performance testing is important and needed.  There might be applications that need more performance testing than others, so a one size fits all model doesn’t necessarily work.  This is where the experience of the Agile QA Manager comes into play, so if you think it is needed, then certainly the effort must be there.  I have also setup a small team of performance engineers that sit outside the agile team and that has worked well for me.
  5. Level of Involvement: It is important to know what the QA engineers are doing and what is being tested; however, sometimes it is important to take a back seat and not get involved in the nuts and bolts of the daily activities.  Chances are high that you will get involved when there are escalations and issues.
  6. Champion of Quality: While it is true within Agile that quality is the team’s responsibility, you need to ensure that, across the organization, you are personally involved in helping to build quality.  Things like documentation and testing are important and should be consistent across all the agile teams.  The better information your team has, the higher the chances they will find the issues before they hit production.
  7. Metrics:  As an Agile QA Manager, metrics are your best friend.  They will help you pinpoint when there are issues and help prevent production disasters.  I encourage you to place a strong emphasis on metrics so you base your decisions on quality data rather than gut feel.
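As a small sketch of the metrics point, one figure an Agile QA Manager might track is the defect escape rate; the counts below are assumed for illustration, not taken from any real project:

```python
# Hypothetical sketch: defect escape rate -- the share of defects that
# reached production rather than being caught in testing.
found_in_testing = 47     # assumed count for this example
found_in_production = 3   # assumed count for this example

escape_rate = found_in_production / (found_in_testing + found_in_production)
print(f"{escape_rate:.1%}")  # -> 6.0%
```

Tracking a number like this per release is one way to spot a quality slide from data rather than gut feel.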

I hope this information has been helpful and I encourage you to closely take a look at all of these items so you can become a more effective Agile QA Manager.

If you are looking for additional information on Agile roles, you can find it here:

Role of an Agile Development Manager

Role of an Agile QA Engineer

Role of an Agile Business Analyst

Agile Software Testing Transformation

Agile Software Testing has literally transformed how organizations test software.  There are several key differences to which software testing organizations will need to adjust in order to operate effectively with Agile.

Documentation:  It isn’t practical within Agile to have lots of documentation.  Don’t get me wrong, there does need to be sufficient documentation for the agile software testers to do their job well. If a developer tries to convince you that the Agile Manifesto doesn’t require documentation, Houston, we have a problem!

  • In terms of requirements, there needs to be enough documentation that the tester knows how the system is supposed to operate.  In my team, we currently use a 1 page document that will provide that information for each story.  It really does help the agile testers know what is expected so they can adequately plan.
  • For the Software Test Plan, it isn’t necessary to go through all the hoopla of creating a massive plan that nobody will read or follow.  A one page document will be sufficient.  I encourage the creation of the one page plan because it forces the agile tester to think about what testing is needed.
  • For the testing signoff a simple email will suffice.

Automation: One of the hottest topics, naturally, is test automation, especially within Agile software testing.  Test automation without a doubt is critical.  However, there needs to be a strategy that is going to work.  In previous organizations, I have built world-class automation teams.  We had thousands of automated test cases within our regression suites.  Were all those tests necessary?  Probably not.  Were they expensive to maintain?  Absolutely!  I believe having a centralized automation team that can build and execute the automated tests is most effective.  The team prioritizes the work based upon the needs of the various agile teams.  I have found this to work most effectively.  I usually recommend that the agile testers execute the tests within the current sprint and, once those have passed, hand them off to the automation team to automate and put in the regression suite.  Having a tester do both the functional and automated tests within a given sprint is extremely challenging; often the functional testing gets done and the automation gets put on the back burner.

Performance: Another challenge is system performance.  If automation takes a back seat, then performance testing is usually behind even that.  To deal with this challenge, a centralized performance team will help; usually 2-3 performance engineers will be sufficient.  I encourage a very close look at performance testing within the Agile context: if you don’t need it, don’t do it; if you need it, you had better do it.

UI versus Web Service:  Agile software testing requires some fundamental changes in terms of how things get tested.  More and more, agile software testing is taking a closer look at web service testing.  It is a critical component that can find a lot of defects.  Agile teams need to spend more time in this area because of the number of applications that use APIs to transfer information.  It is a lot quicker to run hundreds of transactions through an API than through the UI.  Agile software testing teams who are able to make this transition are simply more effective.

Let’s face it, transforming your software testing organization is going to be very challenging.  Agile software testing is extremely challenging.  The key is to have a strong team who is flexible and is adaptable to constant change.  The teams that can do that most effectively will be successful.

If you are looking for additional information on Agile roles, you can find it here:

Role of an Agile QA Manager

Role of an Agile Development Manager

Role of an Agile QA Engineer

Role of an Agile Business Analyst

Product Model Software Testing

The product model is rapidly gaining traction within the software development lifecycle.  Companies continue to look at options to improve the software development delivery process.  I am seeing more and more organizations move in this direction.  The challenge involved is determining how software testing will play a role in this process.

The product model leverages the agile methodology to help develop a product or series of products.  Typically the teams are 6-8 people; they develop solutions in short iterations (sprints) and deploy small code changes into production on a frequent basis (every 2-4 weeks).  The team will typically include business analysts, developers, and testers.  The numbers will vary based upon the needs within the product team.  I see the product model working well in companies that are small to mid-sized and have a relatively lean IT organization.  In addition, I believe companies that have only a few products will be able to perform the best.  If you have a lot of products and those products interface with each other, you will have significant challenges with integration and moving data from one system to another.  This will require extreme collaboration to ensure interfaces and data are provided when they are needed; otherwise the release going into production could potentially get delayed.

The product model team is typically driven by a product manager or product owner.  That individual is responsible for providing direction and helps to remove obstacles.  It is important that the team works well together so they can increase their velocity in delivering software products.  If there are one or two team members who aren’t pulling their weight, it could affect the entire team.  Either those team members improve or they will need to be replaced with people who have the proper skills.  The team also must be self-sufficient, and they need to do everything possible not to be dependent on other teams either inside or outside the organization.  Certain pieces, such as infrastructure setup, might sit outside the team, but that requires constant collaboration to make things happen.  If there are multiple products, those teams will usually roll up to a program manager.

In terms of software testing within the product model, it will involve functional, automation, and performance testing.  Some software testing organizations may differ in how test automation and performance activities are handled.  I have set up a shared service where the functional testers reach out to automation and performance resources who can build and execute automation and performance tests.  I have also seen examples where the automation and performance activities are accomplished by the testers within the team.  As long as those activities get done, it should work well either way.

If you would like more information on Agile, DevOps or Software Testing, please visit my Software Testing Blog or my Software Testing YouTube Channel.

7 Software Testing Trends in 2017

Software testing has been rapidly growing over the past several years, and here are some software testing trends to be aware of in 2017.

  1. More Open Source Tools: As more pressure is placed on reducing expenses, more software testing teams will look at alternative options for reducing software testing licensing costs.  Adoption of open source tools will continue as Agile and DevOps practices evolve.  This software testing trend will continue for the next several years.
  2. Test Automation Growth: Most software testing organizations already have some test automation practice in place; however, there will be a focused effort to increase the amount of automation in place.  Companies will continue to focus test automation on the areas of smoke testing and regression testing.  Where companies can, they will use automation tools such as Selenium and Appium.
  3. Performance Engineering: There will be additional attention in the area of performance engineering due to additional data demands and production-level failures related to system performance.  Companies will begin to require performance testing to be completed prior to production implementation.
  4. Digital Transformation: More attention will be placed on customer satisfaction, and this will require a digital transformation of the customer workflow.  This will require extensive amounts of software testing to ensure that the customer has a positive experience.  This transformation will span multiple systems and require security for the customer data.  In most cases there will be a mobile component to the digital transformation.
  5. Big Data: The explosion of demand for data, driven by IoT devices and the need for better business decisions, will require new technology to support the additional capacity.  Software testers will need to learn more about Big Data and will need to verify that the data is correct before it is consumed.  Platforms such as Hadoop will need to be learned.  This will require a solid test data management practice.
  6. Cloud: Cloud adoption will continue to grow as more traditional companies, such as insurance and finance, begin to slowly move their business to the cloud.  This transformation will continue to evolve for the next several years.  Software testers will need to understand more about public, private, and hybrid cloud solutions.  Software testing companies will also look for software testing platforms to help them accelerate software testing.
  7. Agile/DevOps: Most companies will have Agile as the default software development lifecycle.  As Agile continues to take shape, companies will begin to build DevOps practices.  This will also include Continuous Development and Continuous Integration.  The DevOps approach will help build collaboration across teams and increase the amount of automation used in developing and deploying software.  In addition, most companies will start to integrate security as a part of their DevOps practice.

I hope this list helps.  It will be interesting to see where software testing heads as we look at 2017 and beyond.

If you would like more information on Agile, DevOps or Software Testing, please visit my Software Testing Blog or my Software Testing YouTube Channel.