Over the last year I’ve had the opportunity to attend a number of extremely interesting and mind-expanding conferences focusing on emerging and somewhat disruptive technologies and companies: APIs, mobile, cloud, big data – the works. Coming from a quality background, it has struck me how little focus these companies give to testing. They talk plenty about continuous integration, agile methodologies, user groups and continuous deployment – but testing? Nope.
Why is that?
First, let me elaborate a little on what I mean (and don’t mean) by "testing." I don’t mean unit tests. I don’t mean BDD or TDD based on some cool-named framework using a custom DSL to describe scenarios. I don’t mean recorded and automated scripts of your website or application. Much of this is being done by many of these companies – which is great, and I’m positive it increases the quality of their products.
What I do mean by “testing” is testers who try to break stuff; who use your software in ways it was never intended to be used; who provoke the hardware hosting your application into behaving like it usually doesn’t; and who continuously challenge the design and architecture of your products by thinking outside the box – all applied in a methodical and structured way. When it comes to quality, these testers will be your greatest pain and your biggest gain. They take the quality of your products to the next level by doing all that crazy stuff your users do (and don’t do), giving your team the opportunity to fix the resulting defects first.
So, back to the question: why is it that these oh-so-crucial testers and the art of testing are so absent from these companies and conferences?
I have three possible explanations in my mind:
Strike One: Developers aren’t testers
Developers – I love you – but you aren’t testers. A tester’s instinct and talent is to find defects and, ultimately, to break stuff. Developers, on the other hand, want to make sure things work. This might sound like a small difference, but the implications are huge. The “developer” of a steering wheel will make sure it turns the car left or right when you turn the wheel left or right (at the required rate/degree/whatever). The tester, on the other hand, will jerk the wheel back and forth to throw it out of balance, will subject the wheel to extreme (but plausible) conditions at the North Pole and in the Sahara desert, and might even compare the wheel to the competitors’ and tell you why theirs is better. Developers confirm, testers overhaul. That’s just how it is. Unfortunately, though, developers usually take center stage in a development team, and often lack both insight into the craft of testing and the time it takes to do it right. You need both roles on your team for your quality to be top-notch – neither can stand in for the other.
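The steering-wheel contrast can be sketched in code. This is a minimal illustration, assuming a hypothetical `turn_wheel` function (the model, its ±540° range and 15:1 ratio are all invented for the example): the developer’s checks confirm the designed behaviour, while the tester’s checks deliberately slam into the boundaries and beyond them.

```python
def turn_wheel(angle_deg):
    """Hypothetical steering model: the wheel only turns within a
    mechanically possible range of +/-540 degrees."""
    if not -540 <= angle_deg <= 540:
        raise ValueError("angle out of mechanical range")
    return angle_deg / 15  # road-wheel angle at a 15:1 steering ratio

# The developer's checks: does the wheel turn the car as designed?
assert turn_wheel(150) == 10.0
assert turn_wheel(-150) == -10.0

# The tester's checks: jerk it to the exact limit and past it.
assert turn_wheel(540) == 36.0           # boundary value still works
for extreme in (541, -541, 10_000):
    try:
        turn_wheel(extreme)
        raise AssertionError("out-of-range angle was accepted")
    except ValueError:
        pass                             # rejected, as it should be
```

The first pair of assertions is what the confirming mindset naturally produces; everything after the boundary check is the overhauling mindset made explicit.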
Strike Two: Agile Testing = Automated Testing
Don’t get me wrong - agile development can be fantastic when executed correctly, and has surely improved the lives of many a developer/tester/product-owner out there. Unfortunately, though, agile teams (at least in their infancy) often put testing into the hands of developers (see strike one), who often believe either that all tests must be automatable or that a BDD/TDD specification is a valid substitute for testing. Neither is correct. Using a BDD/TDD specification as a test is just another way of checking that your software performs as required/designed/specified. And, as already argued above, exploratory testing is key to finding those out-of-bounds conditions and usage scenarios that need to be fixed before users encounter them.
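To make the distinction concrete, here is a minimal sketch using a hypothetical `parse_percentage` function (the function and its bug are invented for illustration). The spec-style assertion checks exactly what was designed; a crude exploratory-style loop that hurls random junk at the same function surfaces the inputs nobody specified.

```python
import random
import string

def parse_percentage(text):
    """Hypothetical function under test: parse '42%' into the integer 42."""
    return int(text.rstrip("%"))

# Spec-style check: confirms the designed behaviour, and nothing more.
assert parse_percentage("42%") == 42

# Exploratory-style check: feed it input it was never designed for
# and record what actually breaks.
random.seed(0)  # make the run reproducible
failures = []
for _ in range(200):
    junk = "".join(random.choices(string.printable, k=random.randint(0, 8)))
    try:
        parse_percentage(junk)
    except (ValueError, TypeError):
        failures.append(junk)

# The spec test passed, yet plenty of plausible inputs crash outright.
print(f"{len(failures)} of 200 random inputs raised an exception")
```

A human exploratory tester does this with far more intent and structure than a random loop, of course – but even this crude version finds crashes the specification-as-test never will.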
Strike Three: Cheapskating on quality
OK – you’ve convinced your agile team they need to do exploratory testing during their sprints, and your developers have reluctantly agreed that they aren’t testers at heart. So what happens when you approach the management team with a request to hire an expert tester?! Hands in the air if you think they might answer something like:
- "We have a deadline – we need to release – we’ll invest in development now and testers later."
- "Don’t our developers have 90% code coverage? Do we really need testers?"
- "Our users will help us iron out those out-of-bounds issues and quirks. That will be ample feedback for future improvements."
- <any other “explanation” that is based on the reluctance to spend money on quality>
No one raised their hands? Phew, that’s a relief! If any hands did go up: given the arguments already stated, this is an obvious mistake – and probably the most common one. Your storytelling talents will be put to the test. Hopefully you can convince management to make the investment.
What to learn from this mini-rant? To put it simply:
- Understand that testing, just like development, is a craft of its own
- Cherish your testers and their expertise
- Invest in quality – your users will love you for it.
Before I leave you, I have a confession to make: I’ve been guilty of all three of these. I’ve been the developer thinking that his code and unit tests will be good enough to handle out-of-bounds users and behavior. I’ve been on agile teams extolling the “holisticity” of automated tests. And I’ve released features to users with the expectation that they would help us iron out the final issues, instead of hiring a skilled tester to do the same (users, testers, everyone: I’m sorry!).
Trust me, don’t make these mistakes with your products. Hire some kick-ass testers who will turn your product upside down for you. Your users and customers will love and cherish you (and so will I).