Recently I have run a number of workshops for my testing colleagues at Trade Me. One of the over-arching themes of these workshops was to help improve the way testers tell the story of their testing by becoming more conscious of the test techniques and heuristics they apply when they’re performing exploratory testing.
To aid this, I created and handed out some heuristic testing dice to each tester:
While partly a bit of fun, these dice collect together a series of exploratory testing heuristics to act as mental triggers, intended to:
- provide inspiration for the tester in the midst of exploration, whether because they want to switch up their strategy or because they’re having a mind-blank in their quest to uncover useful information; and
- encourage testers to be conscious of the heuristics that they use in their own testing, and to apply similar labels to them, enabling them to be used more consciously and thus more strategically and deliberately in service of their testing mission.
Since the workshops where I handed these out, I’ve had numerous testers approach me with the revelation, “I do this! I just didn’t know it had a name!”; or propose alternative heuristics that they have identified and given their own labels to – which is exactly what I had hoped would happen.
These heuristic dice are a reminder that to tell our testing story, we need to develop a language in which we can credibly construct it. To achieve that we must ourselves first be conscious of the thinking that happens when we are performing that testing. And sometimes, that might mean that we have to roll the dice.
Given this response, I decided that I’d share the nets for these dice here on my blog too. The dice as I created them are available below – but I’d also encourage you to have a think about the heuristics you find most useful in your own testing. Can you identify them, and create your own personal heuristic testing dice?
Note: I’d recommend printing these on heavy stock paper to add some more heft to the finished article. Double-sided tape or high-adhesive glue have both proven effective for construction.
As a reference, here’s a glossary of the heuristics included on the dice, since some of them are potentially pretty abstract or obscure:
|Heuristic|What does that mean?|
|---|---|
|Focus / Defocus|This heuristic is about switching up your focus level: if you’re zoomed in looking at some detail, zoom out and look at the big picture. If you’re thinking about multiple variables, try focusing more tightly on one or two.<br>Credit: I was first exposed to this heuristic via Rapid Software Testing with Michael Bolton.|
|Go galumphing!|Galumphing is a style of interaction where the tester deliberately behaves in an over-elaborate way, performing high volumes of inert actions that shouldn’t affect the outcome – but sometimes do.<br>Credit: this was identified by James Bach, here and here.|
|Variable analysis|This heuristic encourages you to consider the different types of variables that might be at play, and how you might manipulate them. Consider especially the subtle and indirectly accessible variables.<br>Credit: Elisabeth Hendrickson identified different types of variables in her wonderful book Explore It!|
|Push the boundaries!|Pretty obvious: identify where there might be boundaries between how the code handles different variables, and test ’em.|
|Choose your weapon!|This heuristic encourages the tester to switch up the input device used to interact with the product. Mouse, keyboard, touch pad or screen? How about voice commands or gestures?!|
|Be a rule breaker|What constraints or validations should exist for your product or feature? Can you violate them? Rules are made to be broken, after all…|
|The toddler heuristic|This heuristic invites the tester to imagine they are a user who is baby-sitting a toddler – and to continually interrupt their workflow with other tasks or distractions, and observe any ill effects.|
|Goldilocks heuristic|Try values that are too big, too small, or just right!<br>Credit: Elisabeth Hendrickson’s Test Heuristics Cheat Sheet.|
|Lévy Flight heuristic|A Lévy flight is a type of walk comprising series of short movements in one region, followed by a longer jump to a new region, where multiple short movements occur, and so on. Try similar “movements” within your product.<br>Credit: James Bach introduced this idea in his keynote at CAST 2014.|
|Stop! Hypothesize!|This heuristic encourages the tester to pause their exploration and formalise or make conscious the current experiment they are performing, and what they’re trying to prove or disprove. This can help ensure we’re on mission and not aimlessly wandering.|
|Half-time heuristic|At half time in the FIFA World Cup final or the Super Bowl, electricity grids are flooded when people turn on their kettles en masse. Is there a similar scenario where your product may get a sudden rush? Can you simulate this effect?|
|Bus Stop heuristic|You wait ages for one bus, then two come at once… the same thing can happen with software. How does your system react when two users try to call the same code at the same time?|
|Fresh eyes find failure|Sometimes it pays to step away and return with fresh eyes, or even grab someone else and have them look over what you’re testing – a fresh perspective will often spot something you’re too close to notice.<br>Credit: this features in Lessons Learned in Software Testing by Kaner, Bach and Pettichord.|
|Jackanory!|Jackanory, a famous British TV show, featured an actor reading a story; this heuristic encourages the tester to consider how they could tell their own story about the testing they have performed, to effectively frame their testing.|
|Label-maker|This somewhat meta heuristic encourages the tester to think about the techniques or approaches they’ve been using and to assign them their own heuristic labels.|
|Piñata heuristic|Michael Bolton introduced this as a heuristic for when to stop testing – when you’ve hit it so hard the candy comes out. No candy? Hit it harder!<br>Credit: Michael Bolton, ‘How Much is Enough?‘|
|OFAT/MFAT|Switch between manipulating one factor at a time (OFAT) or multiple factors at a time (MFAT), to expose bugs that may be caused by different combinations of variables.|
|WWXD?|This heuristic encourages the tester to ask “what would a user [x] do?” and to interact with the software in that style, utilising personas to test from multiple perspectives.|
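If you’d like a feel for what a Lévy flight “looks like” numerically, here’s a toy Python sketch (my own illustration, not part of the dice) that draws step sizes from a heavy-tailed Pareto distribution – mostly short local movements, with the occasional long jump:

```python
import random

def levy_flight_steps(n, alpha=1.5, seed=42):
    """Draw n step sizes from a heavy-tailed (Pareto) distribution.

    Most steps are short (local exploration of one region), but every
    so often a step is much longer (a jump to a new region) -- the
    hallmark of a Levy flight.
    """
    rng = random.Random(seed)
    return [rng.paretovariate(alpha) for _ in range(n)]

steps = levy_flight_steps(1000)
short_steps = sum(1 for s in steps if s < 2)   # local movements
long_jumps = sum(1 for s in steps if s >= 10)  # occasional big leaps
print(f"{short_steps} short steps, {long_jumps} long jumps")
```

Applied to testing, the analogy is to spend most of a session making small variations in one area of the product, with occasional deliberate leaps to somewhere entirely different.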
Note: I’ve credited the source of these heuristics where applicable. These sources are where I first encountered them, if you have an alternate source, let me know. Uncredited heuristics are either common ideas, or my own.