Session-based test management - Part 4

20 November, 2017

So you’ve collaborated with your team and created a feature file full of examples, you’ve analyzed those features to identify potential risks to test for, and you’ve carried out your first exploratory testing session based on one of those risks. What happens next? Before you jump into your next exploratory testing session, you will need to complete the SBTM process with an exploratory testing session debrief.

What is a debrief?

A debrief is a discussion around a recently completed exploratory testing session between two people, the:

  • Reporter - or the person who ran the exploratory testing session

  • Reviewer - or the person who learns about what happened during the exploratory testing session

During the debrief, the reporter shares information such as what they did and didn’t test, what they learned during testing, what issues they faced and what bugs they raised. 

As the reporter shares this information, the reviewer asks questions based on what has been shared, which can help you:

  • Make an informed decision - If you are a product owner or someone who is responsible for making decisions about the progress of your product, being well informed about the state of your product is vitally important. A debrief will give you confidence in the positives and negatives of the product, allowing you to plan the next steps.

  • Identify additional exploratory test sessions - It may be that the reporter shares that they couldn’t test everything in one session due to a wide range of influences such as time, environment or other distractions. Discussing what else is required may result in a new test session for the same charter being scheduled.

  • Identify new risks - As new information is discovered, new risks might be found, which in turn means new exploratory test sessions will need to be carried out. You may also discover that an exploratory test session actually covered multiple risks, meaning other exploratory test sessions are no longer required.

  • Discover opportunities for automation in testing - Repeated actions or activities that consume large amounts of time might be identified for automation, such as setting up an environment, creating data sets or running data-driven API checks against an HTTP endpoint.
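
That last point is easier to see with a concrete example. Below is a minimal sketch of a data-driven API check in Python using pytest and the requests library; the endpoint URL, payloads and expected status codes are hypothetical stand-ins for whatever repetitive checks your sessions keep revisiting.

```python
# A minimal sketch of a data-driven API check using pytest and requests.
# The endpoint, payloads and expected results are hypothetical; in practice
# they would come from the repetitive checks identified during your
# exploratory testing sessions and debriefs.
import pytest
import requests

BASE_URL = "https://example.com/api"  # hypothetical endpoint

# Each tuple is one data-driven case: (payload, expected HTTP status)
CASES = [
    ({"username": "alice", "role": "admin"}, 201),
    ({"username": "bob", "role": "viewer"}, 201),
    ({"username": "", "role": "admin"}, 400),          # missing username rejected
    ({"username": "carol", "role": "unknown"}, 400),    # invalid role rejected
]

@pytest.mark.parametrize("payload,expected_status", CASES)
def test_create_user(payload, expected_status):
    """Drive the same check with many data sets instead of repeating it by hand."""
    response = requests.post(f"{BASE_URL}/users", json=payload, timeout=10)
    assert response.status_code == expected_status
```

Automating a repetitive check like this frees up session time for the riskier, more exploratory work that a script can’t do.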

Getting started with debriefs

Running debriefs is an activity that doesn’t come easily; it requires effort from your team to adopt debriefs and run them successfully. You and your team need to encourage an environment of collaboration and communication and ensure a debrief is run regularly after each test session. You may also want to reflect this in your agile board or workflow, for example by adding an explicit debrief step.

Try to track when debriefs were and weren’t carried out, and why. When you first get started with debriefs you will want to regularly take a step back and assess how adoption is going, and metrics can help that conversation. Use retrospectives and team gatherings to ask what is and isn’t working during your debriefs, what might be preventing you from carrying out effective debriefs, and how you can improve your debriefs to make them more valuable to the team.

Running a successful debrief

From the perspective of the reporter

A debrief is not a critique of your skills as a tester, but an exercise in sharing what you’ve learnt, using that learning to identify other testing activities and enabling the reviewer to make an informed decision (if they are in a position to do so). The debrief is an opportunity to tell the story of your exploratory testing session.

Once you have finished your session you should review your test notes and session report to ensure that you are prepared for the debrief. We’ve provided a checklist of things you will want to ask yourself about your notes and session report before heading into a debrief session (an illustrative session sheet follows the list):

  • Have you compared your session to previous sessions that contain the same charter to determine differences in test coverage?

  • Have you added in timings for setup, execution and investigation?

  • Have you considered and recorded the quality and coverage of your session?

  • Are your notes in a state that would enable you to retell your testing story in the future?

  • Have you raised all the bugs you found during the session?
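
To show what a well-prepared report might look like, here is a rough sketch of a session sheet heading into a debrief. It is loosely modelled on the classic SBTM session report structure; the charter, tester, timings and bug reference are purely illustrative.

```
CHARTER
  Explore discount code handling at checkout to discover whether
  invalid or expired codes can reduce the order total.

TESTER:   J. Example          DURATION: 90 minutes

TASK BREAKDOWN (approximate)
  Session setup:                   15%
  Test design and execution:       60%
  Bug investigation and reporting: 25%

TEST NOTES
  - Expired codes are rejected with a clear message.
  - Two valid codes can be stacked, halving the total (see BUG-123).
  - Codes applied via the mobile app were not covered; candidate for
    a follow-up session.

BUGS
  BUG-123: Two discount codes can be combined at checkout.

ISSUES
  Test environment was rebuilt mid-session, costing roughly 10 minutes
  of extra setup.
```

Notes at this level of detail make it much easier to answer the checklist above and to retell your testing story weeks later.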

From the perspective of the reviewer

The goal of the reviewer is to learn about what has been discovered during the exploratory testing session, help the reporter tell their story, confirm whether the session was a success and identify whether any further actions are required. This means using skills such as active listening and questioning techniques to get as much value out of the debrief as possible.

Asking the right questions is key to getting the most out of a debrief, so that you’re in a position to make informed decisions. However, it takes skill and practice to ask the right questions during a debrief. We’ve compiled a checklist of questions to ask yourself to ensure you’ve got everything you need from the reporter. These are not questions to ask the reporter directly, but checks that the debrief has covered everything it should:

  • Have you been debriefed on positive information that was discovered during the session?

  • Have you been debriefed on negative information that was discovered during the session?

  • Have you been debriefed on any bugs that were found?

  • Do you feel you have been given sufficient information related to the charter to make a decision on releasing?

  • Have you discussed how the current session compared to previously executed sessions related to the same charter?

  • Have you confirmed that all required data has been added to the session report?

  • Have you discussed any issues that might have affected the setup, execution and investigation timings?

  • Have you discussed any new risks that were discovered?

The checklist we’ve created is a great place to get started with debriefs but you might want to consider adding in your own checklist items. Ours were based on the structure of SBTM session reports as well as James Bach’s own checklist. Keep in mind, though, that choosing the right questions or creating a checklist can be tricky. There is an excellent book called The Checklist Manifesto by Atul Gawande that you might want to read to learn how to build a checklist that works for you and your team.

Debriefs in summary

Debriefs can be a very effective and valuable tool for a tester and their team, but they take time and experience to get right. If you find them hard to run at first, don’t be disheartened: just remember these summary points and you and your team will be expertly sharing experience before you know it:

  1. Encourage regular debriefs after every exploratory testing session

  2. Foster a safe culture that promotes collaboration instead of criticism

  3. As the reporter, come prepared with good testing notes

  4. As the reviewer, use active listening and questioning techniques that open up the discussion

  5. Keep reviewing the quality of your sessions and work as a team to improve them and make them your own

Written by Alan Parkinson
