The product model is rapidly gaining traction within the software development lifecycle. As companies continue to look for ways to improve software delivery, I am seeing more and more organizations move in this direction. The challenge is determining what role software testing will play in this process.
The product model leverages the agile methodology to help develop a product or series of products. Typically the teams are 6-8 people; they develop solutions in short iterations (Sprints) and deploy small code changes into production frequently (every 2-4 weeks). The team typically includes business analysts, developers, and testers, though the numbers will vary based upon the needs of the product team. I see the product model working well in small to mid-sized companies with a relatively lean IT organization. In addition, I believe companies with only a few products will perform the best. If you have many products and those products interface with each other, you will face significant challenges with integration and moving data from one system to another. This requires extreme collaboration to ensure interfaces and data are provided when they are needed; otherwise the release going into production could get delayed.
The product model team is typically driven by a product manager or product owner, who is responsible for providing direction and helping to remove obstacles. It is important that the team works well together so it can increase its velocity in delivering software. If there are one or two team members who aren't pulling their weight, it can affect the entire team; either those members improve or they will need to be replaced with people who have the proper skills. The team must also be self-sufficient and do everything possible to avoid depending on other teams inside or outside the organization. Certain pieces, such as infrastructure setup, might sit outside the team, but that requires constant collaboration to make things happen. If there are multiple products, those teams will usually roll up to a program manager.
In terms of software testing within the product model, it will involve functional, automation, and performance testing. Organizations differ in how they handle test automation and performance activities. I have set up a shared service where the functional testers reach out to automation and performance resources who build and execute those tests. I have also seen examples where the automation and performance activities are handled by the testers within the team. As long as those activities get done, either approach works well.
Software testing has been growing rapidly over the past several years, and here are some software testing trends to be aware of in 2017.
More Open Source Tools-As more pressure is placed on reducing expenses, software testing teams will look at alternative options to reduce tool licensing costs. Adoption of open source tools will continue as Agile and DevOps practices evolve, and this trend will continue for the next several years.
Test Automation Growth-Most software testing organizations already have some test automation practice in place; however, there will be a focused effort to increase the amount of automation. Companies will continue to focus test automation on smoke testing and regression testing, using tools such as Selenium and Appium where they can.
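The smoke-versus-regression split above can be sketched as a small harness that tags each automated check with the layer it belongs to, so the fast smoke layer can run on every build and the fuller regression layer less often. This is a minimal illustration, not tied to Selenium or Appium; the suite names and check functions are hypothetical.

```python
# Registry of automated checks, tagged by suite. In a real setup each check
# would drive a browser via a tool like Selenium; here they are stubbed.
SUITES = {"smoke": [], "regression": []}

def check(suite):
    """Decorator that registers a check function under a suite tag."""
    def wrap(fn):
        SUITES[suite].append(fn)
        return fn
    return wrap

@check("smoke")
def login_page_loads():
    # Hypothetical fast check run on every build.
    return True

@check("regression")
def password_reset_flow():
    # Hypothetical deeper check run on the regression cycle.
    return True

def run(suite):
    """Execute every check in the given suite; return name -> pass/fail."""
    return {fn.__name__: fn() for fn in SUITES[suite]}
```

Running `run("smoke")` executes only the fast layer, which is what makes it cheap enough to gate every code change.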
Performance Engineering-There will be additional attention on performance engineering due to growing data demands and production failures related to system performance. Companies will begin to require performance testing to be completed prior to production implementation.
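Requiring performance testing before a production release usually comes down to a measurable gate. A minimal sketch of that idea, with an assumed 95th-percentile latency budget (the function names and the budget are illustrative, not a specific tool's API):

```python
import time
import statistics

def measure_latency(fn, iterations=100):
    """Time repeated calls to fn and report simple latency statistics."""
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    samples.sort()
    return {
        "mean_s": statistics.mean(samples),
        "p95_s": samples[int(0.95 * (len(samples) - 1))],
        "max_s": samples[-1],
    }

def meets_slo(stats, p95_budget_s):
    """The release gate: True only if p95 latency is within budget."""
    return stats["p95_s"] <= p95_budget_s
```

A release pipeline could then refuse to promote a build when `meets_slo` returns False, which is the "completed prior to production" requirement in executable form.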
Digital Transformation-More attention will be placed on customer satisfaction, and this will require a digital transformation of the customer workflow. This will require extensive software testing to ensure that the customer has a positive experience. The transformation will span multiple systems and require securing customer data. In most cases there will be a mobile component to the digital transformation.
Big Data-The explosion of demand for data, driven by IoT devices and the need for better business decisions, will require new technology to support the additional capacity. Software testers will need to learn more about Big Data platforms such as Hadoop and verify that data is correct before it is consumed. This will require a solid test data management practice.
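"Verify the data is correct before it is consumed" can be sketched as a validation pass that rejects malformed records before they reach downstream systems. The field names and rules below are illustrative assumptions (an IoT-style reading), not a real schema:

```python
# Hypothetical schema: each incoming record must carry these fields
# with these types before it may be consumed downstream.
REQUIRED_FIELDS = {"device_id": str, "reading": float, "timestamp": int}

def validate_record(record):
    """Return a list of problems; an empty list means the record is clean."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"wrong type for {field}")
    return problems

def split_clean_and_bad(records):
    """Partition a batch into consumable records and rejects."""
    clean, bad = [], []
    for r in records:
        (bad if validate_record(r) else clean).append(r)
    return clean, bad
```

Keeping the rejects (rather than silently dropping them) is what makes this useful to a test data management practice: the rejected records show testers exactly which upstream feeds are producing bad data.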
Cloud-Cloud adoption will continue to grow as more traditional companies, such as insurance and finance firms, slowly move their business to the cloud. This transformation will continue to evolve over the next several years. Software testers will need to understand more about public, private, and hybrid cloud solutions, and companies will also look for cloud-based software testing platforms to help them accelerate testing.
Agile/DevOps-Most companies will have Agile as their default software development lifecycle. As Agile continues to take shape, companies will begin to build DevOps practices, including Continuous Integration and Continuous Delivery. The DevOps approach will help build collaboration across teams and increase the amount of automation used in developing and deploying software. In addition, most companies will start to integrate security into their DevOps practice.
I hope this list helps. It will be interesting to see where software testing heads as we look at 2017 and beyond.
In Agile the team normally consists of a BA, a developer, and a tester. In some organizations, the tester is responsible for manual, automated, and performance testing activities. That is a lot of responsibility for one person, but it does happen in small IT organizations. In larger organizations it gets a little more complicated. I have set up multiple testing departments and created a shared services model for both automation and performance work. The tester works with those teams and identifies automation and performance testing efforts. If a test case is going to be automated, I require that it provide enough detail for the automation team to script it properly. I also like the manual tester to execute and pass the test case before it is automated. Input on the stability of the code is also needed; otherwise you can burn a lot of hours constantly updating automated scripts to keep up with changes. The beauty of this model is that it allows multiple Agile teams to leverage automation and performance skills. As more automation and performance testing is needed, those shared services teams will grow, creating a model that can scale.
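The handoff rule above (enough detail to script, and a manual pass first) can be sketched as a simple gate. This is a minimal illustration of the idea; the data structure and field names are my own assumptions, not a real tool:

```python
from dataclasses import dataclass

@dataclass
class TestStep:
    action: str    # the exact action to perform (e.g. input and control used)
    expected: str  # a concrete, verifiable expected result

@dataclass
class TestCase:
    name: str
    preconditions: list
    steps: list
    passed_manually: bool = False  # gate: automate only after a manual pass

def ready_for_automation(tc):
    """A test case is handed to the automation team only if it has passed
    manually and every step states both an action and an expected result."""
    return tc.passed_manually and all(s.action and s.expected for s in tc.steps)
```

For example, a case with the step "enter valid credentials and submit / dashboard page is shown" passes the gate once the manual run succeeds; a case with a vague step like "check login" and no expected result does not, which is exactly the kind of case that burns automation hours.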
The Agile teams typically don't need dedicated resources for automation and performance testing, so hours are only used when needed. To keep in line with the Agile methodology, I have both my automation and performance teams create Sprints for the work that is needed so I can closely measure how things are going within a given timeline. Since the effort is shared across teams, they follow standard Agile principles and hold their own series of meetings, demos, and retrospectives to ensure communication is flowing. They may also participate in some meetings, such as daily stand-ups, for the projects they are supporting.
I have adjusted this model a bit to fit the needs of each organization. It has worked well, the teams like it, and it has helped build stronger teams, so I will continue to use it moving forward.