By Lisa Crispin
Last May, we were asked to enlarge our small team by hiring another tester and another programmer. We’ve made a few hiring decisions in the past which turned out to be mistakes, so we decided to enhance our interview process. After a lot of brainstorming and research, we came up with some ideas to try. Today we got to use our new process for the first time when we interviewed a tester candidate.
We’ve always done group interviews, where all team members are in the room, and we all ask questions, focusing on open-ended, behavioral-style questions (see my article about hiring agile testers for examples). We do this for about 45 minutes, also allowing time for the candidate to ask us questions. Then we hook up a laptop to the projector, navigate to an interesting page in our UI, and ask the candidate to do manual exploratory testing on it, explaining to us what she’s doing and why as she tests. We also ask the candidate to come up with a simple SQL query that involves joining two tables, and to answer simple Unix questions (“How do you navigate to a directory? How do you see what files are in the directory?”).
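For the SQL question, an answer as simple as `SELECT o.id, c.name FROM orders o JOIN customers c ON o.customer_id = c.id` (with hypothetical orders/customers tables) is enough. For the Unix question, we're listening for basic navigation along these lines — the directory name below is a made-up placeholder, not from our environment:

```shell
# Basic navigation commands we hope to hear; "workdir" is a placeholder.
mkdir -p /tmp/workdir   # create a scratch directory for this demo
cd /tmp/workdir         # navigate to a directory
pwd                     # confirm the current directory
ls -la                  # see what files are in the directory (including hidden ones)
```

We aren't looking for anything fancy here — just evidence that the candidate is comfortable at a command prompt.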
When we discussed what’s important to us in a tester candidate, some criteria emerged that we couldn’t evaluate well with our existing process. We need a tester who has some programming experience (though she doesn’t have to be a programmer), who can maintain her own test environments, and who knows how to work with customers in an agile setting to flesh out a user story with examples and requirements. Here are the activities we added with those goals in mind.
We had someone play the customer role and describe a user story in the “As a… I want… so that…” format, then asked the candidate to talk with the customer and gather requirements for the story. The candidate today asked some great questions, covering not only functional requirements, but other considerations such as response time and load. In fact, I was rather awestruck at how quickly the candidate fired off questions and thought of many different aspects of quality. We stopped him after about 5 minutes because we could tell he knew how to ask good questions.
Then we asked the candidate what tests he thought he might automate for that story. He started with the happy path, then came up with good boundary and edge cases for regression. Again, we stopped him after 5 minutes.
The next part of the interview was the one we were most nervous about. We had prepared a straightforward automated GUI test script in the tool we use, Canoo WebTest. We walked the candidate through the actual GUI to show what we were testing, and walked through the test. We had included in the test one module and one variable, to show that the tool supports these concepts. The rest of the script was all hard coded and included a lot of duplication, which we hoped was obvious. We asked the candidate how he might refactor the test to make it easier to maintain. He immediately grasped that the tool allowed extracting duplication into modules, and had good insight into where variables would work better than hard-coded values. He even came up with an idea to drive the test with different input values. We felt comfortable that he has a good understanding of basic design patterns and principles, such as Don’t Repeat Yourself.
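For readers unfamiliar with Canoo WebTest: it builds on Ant, so the refactorings the candidate suggested map onto standard Ant facilities — properties to replace hard-coded values, and macrodef to extract repeated step sequences into reusable modules. The sketch below is a hedged illustration with made-up page and field names, not our actual test (the WebTest `<config>` element is omitted for brevity):

```
<project name="webtest-demo" default="login-test">
  <!-- hard-coded value extracted into a property -->
  <property name="user" value="testuser"/>

  <!-- duplicated steps extracted into a reusable module -->
  <macrodef name="login">
    <attribute name="username"/>
    <sequential>
      <invoke url="/login" description="open the login page"/>
      <setInputField name="username" value="@{username}"/>
      <clickButton label="Log in"/>
    </sequential>
  </macrodef>

  <target name="login-test">
    <webtest name="login works">
      <!-- <config host="..." port="..."/> would go here -->
      <login username="${user}"/>
      <verifyText text="Welcome"/>
    </webtest>
  </target>
</project>
```

The candidate's data-driven idea extends the same pattern: loop the modular steps over a table of input values instead of copying the steps for each case.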
Finally, I paired with the candidate to deploy a test script. Our plan was to have him read the instructions on our wiki and try to do it himself, but we were short of time, so I talked him through it. First we showed him our Jenkins build UI, and the terminal window for the server where we would deploy. We asked how he might get the file to deploy, and he suggested FTP, which was a good answer. We actually copy the link for the file in the browser and use wget, so we talked him through that. It was obvious that he was well-versed in Unix commands, as he easily navigated to the right directory, renamed the file as we requested, and even switched his shell’s command-line editing to vi mode so he could edit previous commands. We had no doubt that he is capable of learning fast, and can maintain his own test environments.
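The deploy flow can be sketched roughly as follows. The Jenkins URL and file names are invented placeholders, and the wget call appears only in a comment because it needs a real build server — this is a sketch of the steps, not our actual procedure:

```shell
# Hedged sketch of the deploy flow; all names and URLs are placeholders.
deploy_dir=$(mktemp -d)          # stand-in for the server's deploy directory
cd "$deploy_dir"

# On the real server, copy the artifact link from the Jenkins build page
# and fetch it with wget, e.g.:
#   wget "http://jenkins.example.com/job/tests/lastSuccessfulBuild/artifact/smoke-test.xml"
# Here we create a stand-in file so the rest of the sketch runs:
echo "<webtest/>" > download.tmp

mv download.tmp smoke-test.xml   # rename the file as requested
ls -l smoke-test.xml             # confirm it landed where we expect

# In an interactive bash session, previous commands can be recalled and
# edited with vi keystrokes after running:
#   set -o vi
```

The vi-mode trick (`set -o vi`) is a small thing, but it signaled real day-to-day fluency on the command line.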
At the end of the two-hour interview, our candidate commented (unprompted!) on how much he liked our interview process. He liked the group setting, and he enjoyed the various activities we had him do. It was awesome to get this unexpected feedback.
We did our own little retrospective later, and felt we should have prepared better – we wasted some interview time setting up the laptop and projector. We also need to watch the time better, and stick more closely to our planned agenda for how much time to spend on each part of the interview. We each rated the candidate on various aspects, such as his technical skills, his requirements-gathering skills, and whether he seemed like a good fit. He got a big thumbs-up from everyone, so I hope we’ll be able to hire him!
I got long-winded on this, sorry. If you have questions on any of this, feel free to email me. I have a lot of resources that we gathered in our effort to develop a better interview process.
Appendix: Here is the job ad we ended up with, thanks to Elisabeth Hendrickson’s help.
Established Software Company with Startup Energy Seeks Software Tester
We provide internet-based 401(k) services to small companies. We’re part of Paychex, an established, solid, stable company, but we still have the close-knit feel of a small startup.
We’re looking for a few good people to round out our team.
What’s in it for you: