Coming into my fifth year of Software Testing, I began to rethink it as a discipline. The current debate is between traditional methods of testing and more modern schools of thought:
- Entrenched methods are represented by the International Software Testing Qualifications Board (ISTQB) and its many local equivalents. Certification is their path to expertise, with accompanying wares of training sessions, books, tests and standards.
- Context-driven Testing focuses more on a set of tools and skills that typify testing. Advocates offer classes at conferences, but certs and best practices are four-letter words. They hold that there are no best practices, and that a tester knows best how to apply the tools to explore, experiment, and learn about the system under test.
Something Just Didn't Feel Right

I strode onto this battleground in 2011 as a new manager and a new tester. Promoted from an application integration team, I was used to working with outside developers while using and abusing buggy products. This did little to prepare me for the reality of testing: limited time and endless defects! I dove into the online community in the hope that it would help me sort the good from the bad. The central influences I found were conferences, consultants, and blogs.
At testing conferences, half the talks were advertisements for bug trackers, test case repositories and automation frameworks. These were great for managing a project, but they didn't support the essence of testing: finding more defects. The expensive tutorials before the conferences carried the same taint: how to use this tool, best practices for testing this specific type of thing, certification in a process rather than a portable technique. The cost was the most surprising thing: thousands of dollars for something I couldn't justify even to myself, the eager tester with a training budget to burn.
Delving into webinars brought more despair. A demo of the latest automation tool invariably led to a pitch to get me to purchase something. A topic would purport to cover principles, but I ended up in a morass of industry jargon. I learned how to write, automate, and justify my time, but I was no closer to finding bugs more efficiently. What's more, vendors spammed me for months afterward. I found zero techniques that were universal.
Finally, the blogs showed a glimmer of hope. Some people wrote about techniques. Others wrote about test cases and how they managed their overhead. Still others advocated that QA should be involved earlier in the development process. Nowhere did anyone extol the virtues of their test management system, bug tracker or automation tool in helping them find bugs. This was a breath of fresh air, but it still felt stunted and directionless. My closest analogue, software development, had spawned a generation of people applying agile to everything from families to manufacturing, but there was no similarly powerful framework for testers. I started to feel that no one was excited about my new career outside of the context of getting paid for it.
This muddled world left me questioning: Shouldn't there be some "ethos of testing" to unify our efforts just like in agile software development, lean manufacturing, and so forth? Why do vendors have a place at the table at industry conferences? Why isn't anyone embarrassed that the main competitor to major test management software is a spreadsheet? Who cares about my expertise and not just my dollars?
"Answers" and Answers

For a long time, I thought the answer was certification. Surely, if ever there was an organization that could be a cheerleader for quality, it would be the ISTQB. The reality is much different. The manuals are filled with process, the conferences host vendors rather than practice sessions, and the training classes are about extracting fees, not teaching techniques. The certification exam guarantees your resume goes to the top of the pile, yet the organization that proctors it is a laughingstock.
The alternative came through an unlikely route: Twitter. Long a tool of celebrity publicists and companies looking to engage directly with individuals, Twitter also has a reputation as the way to communicate within a subculture. Have an idea? Publish it in under 140 characters. Want to learn the pulse of an industry? Follow its leaders. Computer security wonks, hackers, and now testers joined my follow list, and I was soon introduced to a new debate: is certification a waste of time? I'd found my people.
The new school touted something called Context-driven Testing. Instead of best practice, effective testing was supposed to be driven by context. Instead of test cases, testers were taught a set of tools to be applied depending on the product (mainframes are tested differently than mobile devices). Even among superficially similar products, the most effective testers would make judgment calls based on the needs of the customer and the time available. Testing was not a set of rigid processes but a scientific exploration of the software. The knowledge gained through testing increases the organization's confidence in the software. In other words, testing challenges the developers' assumption that the product works, within the time we have available. This sounded like the meat I was looking for, but the results were amazing too.
In an experiment, I had our work study group put down the ISTQB manual with its test cases, and I instead introduced them to exploratory techniques. We first learned about the requirements and returned a bunch of questions to the developer. Then we tested without scripts and tracked our coverage on a mind map. It was the first time we had been prompted to field-trial a technique in our study session. The best part was that a person with very little prior testing experience was able to pick up the technique almost organically.
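The mind-map idea is simple enough to sketch in a few lines of code. This is a minimal illustration, not a tool we actually used: the feature areas, branch names, and session notes below are hypothetical examples, and a real mind map would usually live in a drawing tool rather than a script.

```python
# Hypothetical sketch: exploratory-test coverage tracked as a tiny mind map.
# Each top-level area branches into aspects to explore; a branch holds None
# until a session note (observation, question, or suspected bug) is recorded.

mind_map = {
    "Login": {
        "Valid credentials": None,
        "Locked account": None,
        "Password reset": None,
    },
    "Search": {
        "Empty query": None,
        "Unicode input": None,
    },
}

def mark_tested(area, branch, note):
    """Record the outcome of an exploratory session on one branch."""
    mind_map[area][branch] = note

def coverage(area):
    """Fraction of branches under an area that have session notes."""
    branches = mind_map[area]
    done = sum(1 for v in branches.values() if v is not None)
    return done / len(branches)

mark_tested("Login", "Locked account",
            "Lockout after 3 tries, but no message to the user -- bug?")
print(f"Login coverage: {coverage('Login'):.0%}")  # Login coverage: 33%
```

The point is the shape of the record, not the code: unscripted sessions still leave an auditable trail of where you explored and what you noticed.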
This revelation about software testing was what we were all looking for, and it was delivered through experience instead of pronouncement. James Marcus Bach, one of the proponents of Context-driven Testing, compared the ISTQB and certification organizations to the medieval medicine of Galen. People wrote down what testing was and proceeded to bleed their employers without knowing why it wasn't finding bugs. Testers were outsourced instead of valued because their techniques were old or ineffective. Yet in spite of all this, the consultants and conferences kept making money printing outdated works of questionable value. Once context-driven techniques came to light, the old ways started dying. We can only hope this continues until meaningless certs are no longer valued by testers, managers, and HR alike.
Where Next?

After stumbling through the world of testing for a few years, I have abandoned certification as a path to expertise. As in computer security, network administration and technical support, certifications are a poor way of communicating true expertise. This realization places testers firmly among the indispensable elements of a development organization. They are not monkeys running scripts but knowledge workers with valuable investigative skills that challenge the product from all angles. They cannot be outsourced if you hope to be successful, and they cannot be replaced by automation.
I am beginning a new training regimen with my testing colleagues based on Context-driven techniques. We hope to learn the techniques, apply them to our current projects, and continually grow our skills in this new framework.
- A Context-driven Testing manifesto of sorts
- Black Box Software Testing, Coursework in Context-driven Testing
- Rapid Software Testing, James Marcus Bach's courses on Context-driven approaches