One of my pet peeves in the industry is the often very lackluster ways I see testers being interviewed. So let’s talk about that a bit.
I already talked about this more than a little bit while discussing looking for broad skills in technical testers. I’ve also discussed my views on how testers, like most other professionals, are more than just their skills.
Speaking from my immediate experience: in one interview I recently had for a Chicago area opportunity, ostensibly for a "test position," I was asked about DNS, VPNs, virtual memory, and the difference between an abstract class and an interface. Whether or not I have picked those things up in my career has little to no bearing on my skills as a tester, even a so-called "technical tester."
Do you agree? Disagree? Well, let me start with my context when I’m interviewing someone.
The key to all interviewing for me is simply this: limited time, thus limited information. So the goal has to be getting information you can use. Do the questions you ask, and the answers you are likely to get, tell you something that helps you make a hiring decision? For any question I plan to ask of someone, I ask myself the following two questions, under the (admittedly unrealistic) assumption that I may only get to ask that question:
- Am I willing to hire someone because of a good answer to this question?
- Am I willing to reject someone because of a bad answer to this question?
Unless my response to at least one of those two questions is "yes," there's no point in asking the question. A question that falls outside those criteria may tell me something about the candidate (most questions will), but will it tell me something I can act on? My rule of thumb for determining what to ask is pretty simple:
Am I going to gain operationally useful information based on the answer to this?
The overall heuristic for me, then, is to try to gain operationally useful information with a minimum of questions. This provides a focus for how I ask my questions.
Okay, so that’s my basic context.
Now obviously I have to be flexible in applying this. While I can pretend each and every question may be the only question I get to ask, the reality is I will get to ask many more than one. But that doesn’t change my focus of treating each question as if it might be the only one I get to ask.
Speaking specifically for a tester role, I need the candidate to convince me they have “enough” of a well-rounded experience in testing and/or quality assurance such that, at the very least, they …
- … can adequately describe how to effectively test, both in writing tests and in executing them.
- … are aware of different ways that testing can be performed.
- … have been in the proverbial trenches (doing the work, feeling the pain, enjoying the discovery).
Obviously I calibrate what "enough" means against the candidate's length of time in the industry and in testing roles specifically. A junior-level person is clearly going to have a different experience base to draw on than a senior-level person. That means you do need some normalizing questions that anyone should be able to answer. Here are some of mine:
- What’s the value of testing?
- How do you distinguish a good test from a bad test?
- What’s the best way you’ve found to organize and structure tests?
- What do you think are some necessary skills that good testers must have?
- How do you keep up to date with your knowledge on how to test software?
- What’s your view of the proper relationship between tests and requirements?
- What are some common misconceptions you’ve encountered about testing?
I will sometimes try leading questions to see how they answer:
- Tell me why test plans can be a bad thing.
- Tell me why randomness in testing is always a bad thing.
- What would make detailed, step-by-step test cases detrimental?
- How can automated testing (vs. manual testing) derail a project?
My goal there is not to gauge whether they agree with what I stated, but rather how they respond. In fact, I like it when the candidate challenges the statement itself based on their experience.
Here are some other points I like to keep in mind:
- If the candidate has lots of experience working with test solutions within tools like Team Foundation Server (TFS), Microsoft Test Manager, Test Director, or Quality Center, I want to make sure they can think outside of those tools, so I'll focus on that a bit.
- If they have "this-and-that" phrasing in their resume (e.g., "performed black box and white box testing"), I like to pick a few of those pairings and ask them to explain the distinction. In particular, if they mention something like "performed functional and regression testing," I'll ask them what the difference is between the two.
- If they have been in the industry for at least a year and a half, and particularly if they have titles on their resume like "QA Analyst" and "Test Engineer," I'll ask what distinction they perceive between Quality Assurance and Testing, assuming they perceive such a distinction at all.
- If they have been in the industry for five years or more, and if their focus has been mostly on either development or testing, I'll ask what trends they have observed in the industry in that time, particularly related to how testing is done or how testing is structured.
- If they have been both a developer and a tester (in whatever order), I'll ask what differences in thinking and practice they found they had to apply as they transitioned between those roles.
- If the word "expertise" appears on their resume, I tend to jump on that. People throw this word around quite a bit. So if someone claims to be an "expert" or to have "expertise," I will at minimum ask them, "What's the difference between proficiency and expertise?"
- If the role is more technical-focused (meaning, building automated test solutions), then I also make sure they can discuss their technical skills coherently, with a clear focus on having actually designed and constructed test solutions rather than just used pre-existing ones. If they claim they have used a tool, then I want to make sure they have actually used that tool, ideally from setup/installation through full use.
I was asked nothing even remotely like any of this in the recent interview I mentioned. Nor was I engaged with anything beyond surface-level questions about testing.
The company in question, which of course shall remain unidentified, cannot, based on what the interviewers asked during the phone screen, have any idea how I view testing as an activity or tests as an artifact. Beyond what I chose to share in spite of their non-probing questions, they really have no idea of the philosophy behind my view of testing, or how I translate that philosophy into action via a series of techniques, some of which I encode in the test solutions I write. I believe my resume gives plenty of "hooks" into just these very topics.
In short, based solely on the questions they asked, they don’t know if I can actually solve problems related to testing.
But, hey, at least they found out I knew what a DNS server was and that I’ve used VPNs before, right?