Hiring Developers - Why "Smart and Gets Things Done" is not Enough

By Ed Willis
September 3, 2009 | Comments: 6

Back in 2000, Joel Spolsky published the first version of his "Guerrilla Guide to Interviewing" about hiring developers. Since then, he's published revisions to that article and included it in a book on hiring developers. I don't know when I first read it, but it certainly stayed with me. Given how frequently people around me reference these sources - especially the guidance about the people to target ("smart and gets things done") - it seems to have resonated with many others out there as well. That said, over the last few years I've managed a group that's done a fair bit of hiring and, while I love the confidence of that article, it's not enough for us.

I'd like to thank Mark Chatterley and Marc Lepage for their very helpful review of this article.

The example interview plan from the article calls for the candidate to tell the story of a recent project they worked on, write some code that solves a simple problem, and explain a solution requiring pointers or recursion. The objective of his interview process is to identify candidates who are smart, have a history of getting things done, and have demonstrable programming skill.

What we look for in a candidate is really four things. The first three are knowledge, experience and native ability. The last is a genuine interest in the field - people who have this typically have side projects, new favorite technologies, a book or two on the go, etc. It's not just a job to them. Regarding native ability, it's not just raw intelligence we're after - we value other characteristics as well: being likable, being a good listener and conversationalist, determination, and working well in group settings (like working meetings). Those are in addition to Spolsky's second required attribute - that they get things done.

Spolsky is looking for candidates to be smart and knowledgeable within the domains of programming and design (limited to indirection and recursion). Those are good starting points, but there is a lot more knowledge and experience we want to draw out in the interview. When I talk about knowledge, I don't mean knowledge of technologies or of specific problem domains but rather a solid understanding of engineering best practices. For developers, we tend to ask lots of questions that go to design, refactoring, lifecycles, and verification and validation strategies, and a lot fewer related to pure programming.

So I might describe a project which, close to its inception, has about one third of its requirements well understood and clearly needed for the release, one third understood but of arguable necessity, and one third wholly unknown at that point in the project - and then ask the candidate how they'd approach development on a project like that. Hopefully, they come back with something on lifecycles or methodologies - at a minimum they need to come up with iterative development of some form. At that point I can ask them how they'd plan to build a quality product in that environment. Poor candidates will focus on locking down requirements and testing. Good candidates will talk about leveraging the iterative lifecycle to define a similarly iterative approach to testing, review and validation, as well as using demonstrations of the iteratively developed product to help elicit new requirements. From there I can ask how they'd deal with new, unanticipated requirements that overturn assumptions they'd made in design. Good candidates will talk about refactoring here and about how testing frameworks (for example, automated xUnit-style tests and acceptance tests) provide a safety net that lets them change those assumptions without destabilizing the application. Poor candidates will talk about locking down requirements early in the project to avoid this situation coming up in the first place.
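
To make that "safety net" point concrete, here is the kind of minimal xUnit-style test a good candidate might have in mind - a sketch only, in Java/JUnit, where the ShippingCalculator and its flat-rate rule are invented for illustration. If a late-arriving requirement overturns the flat-rate assumption, tests like these are what tell you whether the refactored code still behaves as intended.

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    // Hypothetical example: a shipping-cost rule whose internals we expect to
    // change as new requirements arrive. The tests pin down the externally
    // visible behaviour so the implementation can be refactored with confidence.
    public class ShippingTest {

        // Deliberately simple implementation; the design assumption
        // (flat-rate shipping) is exactly the kind of thing a late
        // requirement might overturn.
        static class ShippingCalculator {
            double costFor(int itemCount) {
                return itemCount == 0 ? 0.0 : 5.0;
            }
        }

        @Test
        public void standardOrderPaysTheFlatRate() {
            assertEquals(5.0, new ShippingCalculator().costFor(3), 0.001);
        }

        @Test
        public void emptyOrderPaysNothing() {
            assertEquals(0.0, new ShippingCalculator().costFor(0), 0.001);
        }
    }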

Really the only questions I commonly ask that go to programming per se are "compare and contrast C++ and Java - what do you love and hate about each of them?" and "what are you looking for in a code review?"

You wouldn't think you'd have to make a case for development knowledge, but plenty of hiring processes in my experience focus entirely on a candidate's raw horsepower and their job experience track record. A focus on knowledge broader than pure programming is important in practice for at least a couple of reasons.

Firstly, a person who has to reason everything out from first principles or "common sense" will take a lot longer to decide things (or will make poorer decisions) than someone who has a good background. For example, a project with pretty firm requirements up front and an equally firm release date should call to mind a lifecycle something along the lines of "Staged Delivery" (from Rapid Development, by Steve McConnell). You shouldn't have to mentally construct the project's organization a little at a time - ideally, you go right to a decent solution that's worked many times before and use it as a starting point for the current problem. So knowledge helps accelerate the pace of decision-making. That's really important to us.

Secondly, the candidate's knowledge has to lend itself well to their integration into the team. I once hired someone who seemed smart enough and had a track record of getting things done. That said, he lacked knowledge of object-oriented design, design patterns, refactoring, and unit testing, each of which was something everyone else on the team knew intimately. This was, as you can imagine, a big mistake. The gap in knowledge led to team discussions that either went over his head, which frustrated him, or to everyone explaining their points at a very fine granularity, which frustrated them. So paying attention to team norms in knowledge when hiring new team members helps the team elevate the level of discussion, which is also really important to us. In the Construx Professional Ladder, expectations of knowledge acquisition form part of employees' advancement plans. In hiring, Construx is free to take someone who doesn't measure up in these areas, but the candidate is given a timebox within which to shore up any gaps in their knowledge - that's how important maintaining norms in staff knowledge is to them.

That last point - the candidate's need to address gaps in their domain knowledge or team norms in solution space knowledge - brings us back to that fourth thing we're looking for: a genuine interest. Someone who has it will inevitably close these gaps while someone who doesn't will struggle to do so. Identifying candidates who have this characteristic isn't actually that hard - you ask them questions that go to their self-directed post-university learning. Some examples are:

  • What book or article have you read since university that has really changed the way you look at development?
  • Have you read any books or articles that you really disagreed with? Tell me about one such case.
  • What are you reading right now?

So we're looking for readers - courses and training they've taken are relevant here also but we're definitely after readers.

One interesting thing that comes out in reading Spolsky's earlier and later versions of the "Guerrilla Guide to Interviewing" is the differing treatments of the "Impossible Question" - these are the famous estimation questions popularized (I believe) by Microsoft. Things like "how many gas stations are there in New York City?" I've always hated these, for a couple of reasons. The first is that it's really easy for someone to just practice these questions and improve their performance on them, so it's not at all clear to me whether you end up identifying good candidates or just well-prepared ones. Probably the bigger reason I dislike them, though, is that they favor people who can think quickly on their feet and don't get flustered in difficult social situations - useful characteristics, to be sure, but not indicative of great development potential per se. In real life I don't want to force people to always think on their feet, and I want to manage group dynamics to help draw people out rather than just hoping they make a big effort to get themselves heard. The interesting thing about the two versions of the Spolsky article is that the earliest one expects these impossible questions to be a part of any interview while the latest just suggests them. Backing away from them is fine by me, for what it's worth.

Instead of the impossible question, I tend to favor asking candidates to explain either a good or a poor design from their past. I'm looking for a good, reasonably concise description of the design as well as a well-thought-out analysis of its strengths and weaknesses. In answering these questions, the candidate still needs to think on their feet to present the information in a coherent manner, but the actual content is stuff they have spent a fair amount of think time on. I think that models what they're going to need to be able to do on the job much more closely than the impossible question does. I once read an ex-Google employee's remembrance of being interviewed by Sergey Brin. At one point in the interview, Sergey told the candidate, "I'm going to give you five minutes. When I come back, I want you to explain to me something complicated that I don't already know." That's similar to what I'm describing - the candidate isn't being asked to invent out of whole cloth in the middle of an interview but rather to organize their thoughts and present something complex. That exercises the skills that form a big chunk of the work in development, in my opinion. Interestingly, although the most current version of the Spolsky article does not dive deeply into design with candidates, earlier versions did.

The Spolsky articles spend a lot of time on how you, as an interviewer, make up your mind about a candidate - and I absolutely love the confident prescription of this stuff. For example, "The trick is telling the difference between the superstars and the maybes, because the secret is that you don't want to hire any of the maybes. Ever." One thing that most hiring processes I've been involved with fail to do, though, is make clear in advance how the overall hiring decision will be made. Spolsky offers these two pieces of advice for how the team will arrive at its decision:

  • "If even two of the six interviewers thinks that a person is not worth hiring, don't hire them."
  • "I would probably allow any senior person to reject a candidate but would not reject someone just because one junior person didn't like them. "

Getting the decision rule right is tricky, but I'm not a huge fan of either of these. I want everyone to take the interview process really seriously, and I don't want to override anyone's opinion. If I'm not willing to let someone have a deciding say in the process, then I don't want to involve them in the process to begin with.

In my experience, most organizations have one of the following two decision rules in practice: manager decides (most common) or consensus (a bit less common). Regarding letting the manager unilaterally make the call, all I can say is that, as a former manager, I've been very keen on candidates after interviewing them and had my eyes opened by the other interviewers in discussion afterward - I'm not comfortable that I'll do as good a job as the team is capable of.

Earlier in my career, the groups I managed used consensus decision-making - and, to be honest, we did OK with it, but there were moments where it really made me nervous. The problem with consensus decision-making in groups made up of people who know and respect each other is that people will tend to support the emerging consensus immediately - or what they think the emerging consensus is. No one wants to be the stick in the mud preventing the team from arriving at a decision. It can be almost like a Ouija board - everyone's got their fingers on the planchette, and once it starts moving towards the "yes" or the "no", there's no stopping it. Consensus decision-making, especially in medium or large groups, allows the possibility that everyone leaves the table thinking the decision was a poor one, because everyone was reluctantly supporting a position that it turned out no one actually wanted. After talking this over with the team, what we arrived at was something we felt was a much better decision rule:

Someone has to love the candidate. No one can hate them.

This empowers the interviewers in spades. At least one of them has to be willing to throw their weight behind the candidate. If no one's willing to say they love the candidate, then we keep looking. It also makes for some amusing conversations to be sure:

Interviewer: "I really like this candidate."
Me: "Sure, but do you love him?"
Interviewer: "Well, my heart does go pitter pat when I think of him ..."
Me: "That's a reason to invite him to the prom. I'm asking if you want to build a life with him."
And so on. Great fun.

And any one of the interviewers can turf the candidate unilaterally. This simultaneously demonstrates our respect for their opinion and encourages them to take the interview process very seriously.
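
For the programmers in the audience, the rule reduces to a one-line predicate. Here's a toy sketch in Java - the Vote enum and the hire method are invented purely for illustration, not something we actually run:

    import java.util.Arrays;
    import java.util.List;

    // Toy illustration of the decision rule: hire only if at least one
    // interviewer loves the candidate and nobody hates them.
    public class HiringRule {

        enum Vote { LOVE, LIKE, NEUTRAL, HATE }

        static boolean hire(List<Vote> votes) {
            boolean someoneLoves = votes.contains(Vote.LOVE);
            boolean nobodyHates = !votes.contains(Vote.HATE);
            return someoneLoves && nobodyHates;
        }

        public static void main(String[] args) {
            // "I really like him" isn't enough on its own...
            System.out.println(hire(Arrays.asList(Vote.LIKE, Vote.LIKE)));   // false
            // ...someone has to love him, and no one can hate him.
            System.out.println(hire(Arrays.asList(Vote.LOVE, Vote.LIKE)));   // true
            System.out.println(hire(Arrays.asList(Vote.LOVE, Vote.HATE)));   // false
        }
    }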

Along the way, it tunes the process to make it more likely to say yes to what has been the absolute best source of candidates out there for us - referrals from people on our team. I think many would agree with this (although Spolsky, interestingly, does not). We've very, very rarely gone wrong going this route - if someone you respect tells you to take someone, you should do so almost all the time. That's been our experience. Going back to the decision rule, candidates like these pass the first part of it (someone has to love them) and only need to avoid the second part (no one can hate them) to get hired.

So, to summarize, there's a lot to like in Spolsky's approach but I'd drop the programming question and the pointer or recursion question. Beyond that I'd do the following:

  • Dig into the candidate's development knowledge - especially knowledge that forms a part of team norms
  • Don't use impossible questions but rather make them explain and analyze a design from their past experiences
  • Focus on finding people who have a genuine love for their field - especially readers
  • Be explicit about the decision rule for the hiring process - "someone has to love them, no one can hate them" is the best one I know
  • Bend over backwards to hire people recommended by people already on your team

6 Comments

Academics matter significantly as well. Someone who has a degree or two in Computer Science has gone through a rigorous evaluation process and has done significant programming. They should have a deep understanding of complex data structures, algorithm analysis, software engineering, numerical methods, compilation, operating systems, optimization, and large-scale software systems. They should also have significant exposure to mathematics and physics. And they can also, hopefully, write a coherent English sentence. A candidate with a Computer Science degree should also be able to discuss the current research issues in the field. Membership in the ACM, IEEE, etc. should also be something you look for in a Computer Scientist. There is no equivalence between this background and a two-year MIS or IS degree, where they have maybe been exposed to a few programming languages and other technologies du jour. No offense, but there needs to be some academic depth, especially for a senior position.

Academics, yes - I absolutely agree. I held off writing about requiring a degree per se because one of the best developers I've worked with stopped just short of his degree - so it's more the knowledge and less the degree I personally would be after. But the heart of what you're saying there I completely agree with.

Regarding your point about candidates needing effective English skills - you're absolutely right (well, for whatever language is the norm for the team). It's a pure oversight on my part that that didn't get into the article. It's certainly something we look for and likely the most common reason why resumes end up in the "don't call" pile.

Great article. A tactic I've often used in interviews, especially for senior engineers, is to provide the job requirements and position description (which they should presumably already know, since they are at the interview). Then I give the interviewee a few minutes and ask them to run a mock interview in which we swap roles: they become the interviewer, pretend they're hiring me, and ask the questions. The speed at which they come up with questions and the type of questions they ask tell me a great deal about their experience.

Thanks much for your comment.

That's a very novel approach to interviewing - I've certainly never heard of it before. I can see how it might expose their real opinions and working knowledge rather than just what they think you want to hear.

I don't entirely agree with interview questions that probe what kind of reading the candidate does. They can do all their information look-ups on the internet, and they probably don't have, or need not have, the attention span necessary to read books from start to finish. The ability to Google and find out how other developers have handled similar problems would be more important. A search on snipplr or koders or another source code database would probably bring up code that already exists. I would think they are more comfortable with questions like 'tell me about something you have done in common across all your projects' or 'what are your most often used code snippets, and have you built up a collection of them?'

Fair enough. I suspect we are targeting different people.
