I see these reasons why Testing and Development are converging:
- Automated tests will only be run if they are fast, and fast tests require all of the tricks of the trade involved in unit testing.
- Automated tests will only be maintained if the entire team is invested in them, so they must be reviewed by the team and understood by the team, which requires the whole team to have development skills.
- When developers change a feature, they will only run and maintain the automated tests testers wrote for it if those scripts use the same language, frameworks, and tools as their automated unit tests.
- Automated tests are more efficient than humans at highly repetitive testing that uses combinatorics to cover huge matrices of contingencies.
- Quality is not just 'Does the feature work?' but also 'Does it leak memory?', 'Is it fast?', 'Is it secure?', and 'Is it scalable?'. Questions like these require large numbers of datapoints spread over a great deal of time (or very small units of time) and are better measured by software than by humans.
- In an agile world where there are no written requirements documents (or tracking documents get lost or outdated within 2-3 sprints), you don't measure coverage by matching requirements to test cases; you measure it with code coverage tools.
- The human perspective provided by QA Engineers doing exploratory or acceptance testing is important, but it does not allow for much career growth.
- In small teams where the same tester would end up manually testing the same feature over multiple releases, the benefit of human eyes diminishes, and automation would very likely be more thorough and less error-prone than manual testing.
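The combinatorics point above can be sketched in a few lines. This is a minimal illustration, not any particular team's suite: the browsers, locales, auth modes, and the `check_login` placeholder are all hypothetical, and the point is simply that three short lists expand into a full matrix of contingencies no human would want to re-test by hand.

```python
# Sketch: combinatorial coverage of a contingency matrix.
# All names here (browsers, locales, auth_modes, check_login)
# are hypothetical placeholders for illustration only.
import itertools

browsers = ["chrome", "firefox", "safari"]
locales = ["en_US", "de_DE", "ja_JP"]
auth_modes = ["password", "sso"]

def check_login(browser, locale, auth_mode):
    # Stand-in for the real check; always passes in this sketch.
    return True

# 3 browsers x 3 locales x 2 auth modes = 18 combinations
# generated from three short lists.
matrix = list(itertools.product(browsers, locales, auth_modes))

def test_matrix():
    for browser, locale, auth_mode in matrix:
        assert check_login(browser, locale, auth_mode)
```

Adding one more locale to the list grows the matrix automatically, which is exactly the kind of growth that makes manual re-testing impractical.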
I also think QA Management and Product Management are converging. For big projects, QA Teams need to not only provide test plans and report defects originating from the test cases, but also create automation suites that meet the following requirements:
- Ensure exact pre-conditions and clean post-conditions, even in the event of failure.
- Can be run by Continuous Integration, other teams, or people across the globe.
- Are re-usable / can be maintained over multiple release cycles, by other teams.
- Are tracked by the same version control process being used by development.
- Provide results that can be interpreted by contractors, new hires, and/or people that stay behind when the writer goes on vacation.
- Can be run in parallel on the same machine or across different machines.
- Do not conflict with other tests written by other teams being run at the same time or on the same equipment.
- Interface with other systems in the SDLC (bug trackers, requirements trackers).
- Can be multiplexed to provide a variety of load scenarios.
- Run within the time limits imposed by the release cycle.
These requirements may be more complicated than those for some commercial development projects, and leading a team that can deliver an automation suite like that is going to be a lot like being a PM on a software project.
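The first requirement in the list, exact pre-conditions and clean post-conditions even on failure, is the one teams most often get wrong. A minimal sketch of one way to guarantee it, using a context manager whose cleanup runs in a `finally` block (the `state` dictionary and setup/teardown steps are hypothetical placeholders for real environment work):

```python
# Sketch: a test wrapper that establishes pre-conditions and
# always restores post-conditions, even when the test fails.
# The state dict and seed/cleanup steps are illustrative only.
from contextlib import contextmanager

@contextmanager
def clean_environment(state):
    state["db_seeded"] = True       # pre-condition: seed known data
    try:
        yield state
    finally:
        state["db_seeded"] = False  # post-condition: always cleaned up

def run_test(state, should_fail):
    try:
        with clean_environment(state):
            if should_fail:
                raise AssertionError("simulated test failure")
        return "passed"
    except AssertionError:
        return "failed"
```

Because the teardown sits in `finally`, a failing test leaves the environment in the same clean state as a passing one, which is what lets the suite be handed to CI, other teams, or people across the globe.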
I might also be able to make a case for QA Engineer and Tech Writer converging. Writing acceptance test plans detailed enough to be outsourced, and keeping them up to date, is a great deal of effort. Keeping customer-facing documentation up to date is also challenging. These two efforts could be combined: acceptance tests would be written as a list of workflows the software needs to support, the QA Engineer responsible for writing the tests would ensure that the software is well documented, and the contractors running the tests would then be verifying that a customer could learn to perform those workflows from the information in the user guide.
I see a lot of this convergence happening as more and more teams adopt agile. There is also a paradigm shift in 'Who owns quality?' It's no longer (in reality, it never was) only QA's job. Quality has to be built into the product from inception, and developers play a big role in that.