The Seven Deadly Sins of Agile Testing

by Brad on March 22, 2011

I had the honor of presenting at the 2011 SQuAD conference on The Seven Deadly Sins of Agile Testing. It was a highly interactive session where participants gathered in small groups to identify penances: ways to atone for, or avoid, the seven sins. I provided a set of cards with ideas, and participants generated their own ideas as well. Below is a summary of the solutions the participants came up with for each of the seven sins.

Sin #7: Separation of Requirements and Tests

Sin #7 may not be quite deadly, but it definitely is a big drag on team effectiveness. When requirements are separated from tests, it’s difficult to know which one is the source of truth, and maintaining the dreaded traceability matrix is time-consuming. Here are some ways to cleanse your soul from this sin.

Analysts & testers are best friends!

If we want to reduce the distance between requirements and tests, we need to reduce (or eliminate) the distance between the people responsible for them. Analysts can learn how to write tests, and testers can learn how to think like analysts. Closer collaboration between analysts, testers and programmers is a core value of agile methods and will help in many ways beyond this one.

Create an executable specification

The ultimate way to eliminate the separation between requirements and tests is to make an executable specification. The tests are the spec. This provides a single source of truth. Requirements can be written in the form of tests. If you’re using a test management tool, use it as your source for requirements as well. Even better, with tools like FitNesse and Cucumber, these executable requirements can be automated.
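
As a rough illustration of the idea in plain Python (rather than FitNesse or Cucumber), here is a hypothetical discount rule where the tests double as the requirements document. The rule itself is invented for this sketch:

```python
# A hypothetical requirement: orders of $100 or more get a 10% discount.
# The tests below ARE the specification: anyone can read them to learn
# the rule, and CI runs them to verify the code still honors it.

def discounted_total(subtotal):
    """Apply the (hypothetical) volume-discount rule to an order subtotal."""
    return round(subtotal * 0.9, 2) if subtotal >= 100 else subtotal

def test_orders_under_100_pay_full_price():
    assert discounted_total(99.0) == 99.0

def test_orders_of_100_or_more_get_10_percent_off():
    assert discounted_total(100.0) == 90.0
    assert discounted_total(200.0) == 180.0

if __name__ == "__main__":
    test_orders_under_100_pay_full_price()
    test_orders_of_100_or_more_get_10_percent_off()
    print("spec verified")
```

Because the requirement lives nowhere else, there is no traceability matrix to maintain: if the spec and the code disagree, the build fails.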

Specify requirements by example

Don’t limit requirements to abstract descriptions of rules and conditions. Give a set of specific examples to illustrate the requirements. Tools like FitNesse make this practical. As Albert Einstein reputedly said, “Example isn’t another way to teach, it is the only way to teach.”
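
To make this concrete, here is a sketch in the spirit of a FitNesse decision table, written as plain Python. The shipping rule and its tiers are hypothetical, purely for illustration:

```python
# Requirements expressed as a table of concrete examples: instead of
# prose like "light parcels ship cheaper", the examples pin down the
# boundaries exactly. The tiers below are invented for this sketch.

def shipping_cost(weight_kg):
    if weight_kg <= 1:
        return 5
    if weight_kg <= 5:
        return 10
    return 20

# (weight_kg, expected cost): the examples ARE the requirement.
EXAMPLES = [
    (0.5, 5),    # light parcel
    (1.0, 5),    # boundary: exactly 1 kg still ships cheap
    (1.1, 10),   # just over the first tier
    (5.0, 10),   # boundary of the middle tier
    (12.0, 20),  # heavy parcel
]

for weight, expected in EXAMPLES:
    assert shipping_cost(weight) == expected, (weight, expected)
print("all examples pass")
```

Notice how the boundary rows (1.0 and 5.0) resolve ambiguities that an abstract rule like “parcels over 1 kg cost more” would leave open.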

Practice ATDD and BDD

Acceptance test-driven development (ATDD) and behavior-driven development (BDD) codify the notion of executable specifications.

Team commitment to TDD and CI

Test-driven development and continuous integration are practices that provide a solid foundation for creating an executable specification.
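
As a minimal sketch of the TDD rhythm (red, green, refactor), using a stand-in fizzbuzz example rather than anything from a real project:

```python
# TDD in miniature: the tests are written first and drive the design.
# fizzbuzz is a stand-in example, invented for this sketch.

def fizzbuzz(n):
    # Step 2 (green): the simplest code that makes the tests pass,
    # written only after the assertions below already existed and failed.
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# Step 1 (red): these assertions failed before fizzbuzz was written.
# Step 3 (refactor): clean up freely; the tests keep you honest.
assert fizzbuzz(3) == "Fizz"
assert fizzbuzz(5) == "Buzz"
assert fizzbuzz(15) == "FizzBuzz"
assert fizzbuzz(7) == "7"
print("green")
```

With CI running this cycle on every check-in, the suite accumulated through TDD becomes exactly the executable specification described under Sin #7.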

Sin #6: Testing is one “sprint” behind coding

Scrum defines the output of a Sprint as a “potentially shippable product increment”. If it’s potentially shippable, it must be tested. Period. Here are some ways to atone for this sin.

Make user stories smaller

See Richard Lawrence’s great post on patterns for splitting user stories.

Here are some more ideas to bring testing and coding into the same sprint:

  • The Definition of Done for backlog items includes testing
  • Keep QA involved throughout the project – especially at the beginning of each sprint
  • Whole-team responsibility for quality
  • Closer collaboration between programmers and testers
  • Create an executable specification (see Sin #7 above)
  • Specify requirements by example
  • Include testing effort in release planning
  • Team commitment to TDD and CI
  • Minimize the effort to setup and deploy test environments
  • Programmers deliver working code to testers early in the sprint/iteration
  • Co-located teams
  • Pair development
  • Persevere through challenges
  • As a last resort, you may consider increasing the sprint length, but this option has many disadvantages also.
  • Consider a Scrumban/kanban process that is flow-based, rather than iteration-based. Even so, strive to keep backlog items small!

Sin #5: Unbalanced testing quadrants

The agile test quadrants define different types of testing necessary on most projects: unit testing, acceptance testing, exploratory testing, and “property” testing (performance, security, etc.). Our participants identified a few ways to absolve a team of this sin.

  • Balance the testing quadrants in your Definition of Done – at the user story and release level
  • Categorize tests to analyze quadrant coverage, determine proper weighting of each
  • Include testing in release planning
  • Automate acceptance tests, allowing more time for other quadrants
  • Keep QA involved throughout the project
  • Exploratory testing included in acceptance testing with the customer
  • Outsourcing or in-sourcing specialized testing (performance, -ility) earlier in project

Sin #4: Ignoring Test Failures

Strong agile teams tend to have a lot of automated tests that run multiple times per day via continuous integration. All that effort might be for naught if you don’t do anything when one of those tests fails! Pay your penance, as suggested below.

  • “Stop the line” when tests fail
  • The Definition of Done for backlog items includes testing
  • Stakeholders participate in testing
  • Invest in robust automated tests
  • Incentives for clean check-ins. A little peer pressure or friendly competition can help establish a culture where clean check-ins and passing tests are the expectation of everyone.

Sin #3: Lack of Test-Driven Development (TDD) and Continuous Integration (CI)

TDD and CI are core practices from Extreme Programming, and agile teams will quickly hit a brick wall without them. Here are some ways for a team to get on the path toward righteousness.

  • Achieve an explicit team commitment to do TDD and CI
  • Invest in legacy test automation
  • Make automated tests robust
  • Practice ATDD and BDD
  • Keep metrics on important quality indicators and monitor trends; use these metrics to demonstrate ROI on TDD and CI. Typical measures include rate of CI build failures, test coverage, manual effort level per test cycle, escaped defect counts, and customer satisfaction.
  • Prioritize test automation efforts to maximize ROI
  • Stakeholders participate in testing
  • “Stop the line” when tests fail
  • Include testing in release planning
  • INVEST in user stories: make them Independent, Negotiable, Valuable, Estimable, Small, and Testable
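
As one concrete (and entirely hypothetical) illustration of the metrics point above, a team might track escaped defects per release and use the trend to demonstrate ROI after adopting TDD and CI:

```python
# Hypothetical quality-trend data: escaped defects per release,
# before and after a team committed to TDD and CI. All numbers invented.
releases = ["1.0", "1.1", "1.2", "2.0", "2.1"]
escaped_defects = [42, 38, 35, 18, 11]

# A simple trend check: average of the last two releases
# versus the first two.
before = sum(escaped_defects[:2]) / 2
after = sum(escaped_defects[-2:]) / 2
improvement = (before - after) / before * 100

for rel, n in zip(releases, escaped_defects):
    print(f"release {rel}: {n} escaped defects")
print(f"escaped defects down {improvement:.0f}% since adopting TDD/CI")
```

Even a toy calculation like this is enough to turn a stakeholder conversation from “TDD slows us down” into a discussion of measured outcomes.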

Sin #2: Separate QA team

Scrum teams are cross-functional, meaning they include all the people and skills necessary to deliver a working product. The whole team is responsible for the quality of the product. Clearly, QA is part of the development team, rather than a separate team. Here are some virtues to strive for.

  • Cross functional teams include QA
  • Keep QA involved at all times
  • Whole-team quality responsibility
  • Closer collaboration between programmers and testers
  • Include testing in release planning
  • Include QA activities in the product backlog
  • Co-located teams
  • Open Communication

Sin #1: Waterscrumming

The last, and perhaps worst, of the seven sins is being agile in name only. High-quality, end-to-end-tested features are the output of every iteration/sprint. While large programs with multiple teams may very well need a single hardening sprint, multiple testing sprints are just another form of waterfall. Many of the penances listed for the other six sins also apply here.

  • Acquire deeper knowledge of the agile practices. Understand the lean and agile principles behind the process.
  • Do a pilot project. Be disciplined about all the practices and resist the temptation to modify the practices until you’ve first tried them “by the book” for a significant time.
  • Include testing in release planning. Testing is an integral part of the team and process, not an afterthought.
  • Include QA activities in the product backlog. Quadrants 3 and 4 in particular may require focused effort that is not directly related to individual features built during sprints.
  • Cross-functional teams include QA
  • Co-located teams
  • Whole-team responsibility for quality
  • The Definition of Done for backlog items includes testing
  • Close collaboration between programmers and testers throughout
  • Create an executable specification
  • Specify requirements by example
  • Pair development. In particular, pairing testers with programmers is a great way to cross-train and enable shorter cycles.
  • Prioritize test automation efforts to maximize ROI on test automation.
  • Invest in robust automated tests. Tests should be modular and robust in the face of changes in the system.
  • Better project management. Traditional PMs need to understand agile principles & practices and may need coaching to implement agile practices effectively.
  • Incentives for clean check-ins
  • Test First! Practice TDD, ATDD and/or BDD
  • Ask for forgiveness rather than permission. Can you take the initiative to apply agile practices within your sphere of control and then demonstrate the results, rather than asking for permission first?

I welcome input from my fellow sinners who have found their own paths to righteousness. Please comment with your ideas for salvation!

Richard Lawrence March 22, 2011 at 9:17 am

Thanks for the link to my story splitting post, Brad. Just wanted to let you know that there is an extra “http//” in the link, so it doesn’t actually work.

Brad March 22, 2011 at 9:30 am

Richard, I fixed the link. Thanks for finding my mistake!

Cliff Stockdill March 22, 2011 at 11:18 am

Great post Brad.
One thing to add to #6 to keep test up with the code, is to not be afraid of dropping scope/functionality from the sprint to avoid sacrificing testing. Too often when the clock starts to run out it is the testing that gets skipped or abbreviated in exchange for more functionality coding. It can be difficult to go back on scope commitment, but it must be communicated to the stakeholder that ‘coded’ functionality is not done until it is tested – so skipping the tests for more code doesn’t get him finished functionality.

Vin D'Amico March 26, 2011 at 5:33 am

You’ve made excellent points. Integrating the software engineers and the QA engineers into a single unit can be tough. Organizational boundaries and cultural issues get in the way.

There is also a pervasive attitude that QA’s job is to find bugs; if they don’t find many bugs, they are not doing their job. This is so wrong. QA’s job is to evaluate risk. Finding thousands of bugs or finding none at all doesn’t matter. What’s important is that the software is rigorously and thoroughly tested so that everyone has a good understanding of what is being shipped.

Mark Dalgarno March 26, 2011 at 10:34 am

Lots to digest here.

A valuable post.

Adam Yuret March 26, 2011 at 10:37 am

Excellent piece. Not just critical of common mistakes in agile teams but actually provides constructive solutions. The linked article about splitting user stories is also great. I’ve forwarded these posts to my engineering team.

We’re still new to agile but it was great to see things we’re doing right and solutions to things we aren’t in your posts.

Thanks!

Zhu Wei April 14, 2011 at 2:20 am

Hi Brad, this is a wonderful post!
For Sin #2, in the situation where QA is not co-located with the dev team, we created a customized workflow in Jira (e.g. by adding a “ready for test” status for each story and visualizing it on the Task Board) to integrate the QA and dev teams.

Brad April 15, 2011 at 11:47 am

This article was re-published in the online magazine Tea Time with Testers: http://issuu.com/teatimewithtesters/docs/tea-time_with_testers_april_2011__year_1__issue_ii

Preston June 3, 2011 at 1:19 am

Thanks for this great post. I have not sinned yet, in this context, because I have not yet practiced ‘Agile Testing’ but I am sure I will have my share of sins when my company goes ‘Agile’ soon. Your list will then help me avoid as many sins as possible. Thanks again.

sue June 23, 2011 at 6:30 am

Dead on, great article. Thanks Brad, your ‘7 Deadly..’ presentation at MiHigh was one of the most attended! I volunteered at the conference, and as I was the primary volunteer for the next session’s presenter, I was able to tell… your session was highly interactive! 🙂

Ed Felix December 10, 2012 at 2:25 pm

Great article! One of the best summaries I’ve read in a long time. I don’t know how many times I had to explain to testing teams that Waterscrumming and Sprint +1 aren’t very good Agile testing practices.

Andre October 16, 2013 at 6:43 pm

The problem I find is that if you write a test for everything, or 80% of what you do, actually running them takes hours! So how do you know what to write tests for, or what to run, at any given time? I am talking more about behaviour-type tests here.
