Saturday, February 7, 2015

I Redesigned Our Hiring Process

Why?

I've always disliked the software hiring process, from both sides of the equation. The things that tend to be asked - the Microsoft-style brain teasers, the algorithmic brain picking, the whiteboard coding, and all the rest - are optimized to select people who are good at those things, but for the vast majority of software jobs they aren't the right things to select for. In the interest of full disclosure I'll admit to some bias: for a variety of reasons the typical tech interview is not something at which I excel, yet I've managed to do pretty well for myself. With an n of 1, at least, something isn't right here.

The software interviews at my company are standard fare: a candidate will talk to 8-12 people spread over 4-6 one-hour sessions. Most of the panels follow the usual progression - the candidate describes their current work, answers some questions meant to assess their technical knowledge, and does some sort of whiteboard coding. The whiteboard questions are fairly typical; chances are you've seen at least one of them before. Candidates will also have taken a canned Codility test before they're brought in, solving a handful of small problems which are scored automatically. When a candidate completes the process I don't see how anyone could really have a great feel for them - there simply isn't enough time to cover the bases that matter: will they fit in? Are they smart? Can they get things done?

Another problem is that I know via second- and third-hand tales that our process has turned people off. The key point is that people talk, and in our case what they're saying is negative - all of the tales I heard crossed my ears only by coincidence, so clearly there's a bad image of us out there.

A few months ago my company formed a new department, merging together a few similar groups of software engineers and data scientists. We've been given a lot of leeway to run things how we'd like, and apparently people have noticed that I constantly gripe about our hiring process, because I was asked to come up with a new scheme. My goals were:

  • Determine if they're smart and if they can get things done - really the only two things that matter in the long run.
  • Provide situations more similar to our actual work life instead of artificial constructs
  • The household names can optimize for near-zero false positives. They get more resumes in a day than we get in a year (I made that up, but it's probably close). We can't do that. A false positive sucks, but going understaffed for a year because we passed on 10 good candidates (false negatives) sucks more.
  • Understand that accomplished devs often have a prior body of work; let them show it to us instead of reducing it to resume bullet points
  • Spread the screening duties around, both to keep people sane and to serve as professional development for our own engineers
  • Remember that candidates are not supplicants; they are at least as important as we are (if not more so!). We need to be selling ourselves as much as they need to sell themselves.
  • Foster an environment that helps present our company as a place that software developers would love coming to

The latter two are more abstract and were just points of emphasis for our group. Specifically, there were two issues I wanted to address. The first was that I felt we (and almost every company I've ever interviewed with or for) put up a wall that said "Tell us why we should deign to hire you". It's no secret that there are many bad stereotypes of tech interviewers - e.g. the guy who asks the same stupid "gotcha" question without realizing there's more than one correct answer, or the alpha geek who asks some esoterica in an attempt to crush your soul if you don't know the answer. Instead, I wanted our approach to be warm and welcoming; being thorough and being a dick don't have to go together. Candidates should feel that they're getting an opportunity to demonstrate their talents in a fair environment instead of being put through some crucible of artificial pressure.

The other is that the company I work for is extremely prestigious in its field, but that field is not software engineering. Worse, we're a non-profit (read: low salaries) and household names of the software industry are literally across the street. It's a tough hill to climb when you're neither a sought-after destination nor in the ballpark of the salaries of companies within 100 feet of you. Upper management asks how we can become notable for software prowess; my stance is that this can only happen organically over time, and one path is to make sure that when candidates come in we seem like a place where a software developer would want to be. It's hard to say exactly how one could do this, but we could start by not going with the same trite hiring process every other mediocre software shop is using. Present a positive atmosphere, set ourselves apart from the pack a bit, and hopefully even if things don't work out with a candidate they'll have positive things to say about us.

Ok, great. So what are we doing now? 

What are we doing that's so different? I'll admit that nothing here is novel; only one small piece of this was something I came up with on my own. Instead I spent a lot of time reading about how other companies do things and what folks thought of those ideas, then strung together a path that I felt would work best. The reality is that nothing is perfect - if a perfect process existed we'd all be using it and there'd be no false positives or negatives. There's also no approach that won't piss someone off, but we can try to minimize that number.

We start with a pretty typical screening process. We asked HR not to filter incoming resumes (let us handle the buzzword bingo, please!), and each resume is screened by two people with the bar being "could this person possibly make it through the process?". This is followed by a phone screen which lasts about half an hour and is roughly a 50/50 sales job on both sides, with the screener trying to glean enough technical information to answer that same question. In our previous process the candidate would now be handed an online Codility test, and here is where we start to deviate. We ask the candidate to provide code samples: an online portfolio, a GitHub account, or anything else they might have. We recognize that this isn't always possible (and skews towards younger folks!), but when it is we feel it's superior to Codility. First, senior developers are often put off by such tests when they have a body of work they can show instead. Second, real code is always more informative than an artificial test. If they don't have samples, we fall back to Codility, where we've chosen the most program-y of the questions and make it clear that this is simply a convenient way to generate material, not a graded test. The sample is then reviewed by two people with the bar being "would it scare me if this person ever touched my code repository?". All of the screening roles rotate among our engineers - partly as a load balancer but also as professional development.

If all goes well, the candidate is brought on site where we have four sessions (plus one with an HR rep):

1) A technical discussion with a group of 4-6 team members. 

This is your standard interview session except that the brain teasers, whiteboard coding and similar things are verboten. Interviewers are instructed to guide an organic discussion about technology to assess both breadth and depth of knowledge, using something from the candidate's recent work history as a seed. Instead of asking the 90th person in a row what a pure virtual function in C++ is, perhaps you can discern that they know their stuff by simply talking to them. I contend that when this works you glean just as much (if not more) information and manage not to have the candidate sweating bullets and worried about impressing you. Also, because it's an actual conversation, it's easier to get a grasp on the candidate's communication skills - they're not doing mental gymnastics trying to figure out what trick you're currently playing on them.
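
For anyone who hasn't sat through that particular ritual, the trivia being fished for boils down to something like the following C++ sketch - purely illustrative, and emphatically not a question we ask:

    #include <iostream>

    // A pure virtual function ("= 0") has no body in the base class, which makes
    // Shape abstract: it can't be instantiated, and concrete derived classes must
    // override area() before they can be.
    struct Shape {
        virtual double area() const = 0;
        virtual ~Shape() = default;
    };

    struct Square : Shape {
        explicit Square(double side) : side_(side) {}
        double area() const override { return side_ * side_; }
        double side_;
    };

    int main() {
        Square sq(2.0);
        std::cout << sq.area() << "\n";  // prints 4
    }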

Why 4-6 team members? Partly to help keep a lively discussion going: more people means more opportunity to participate. Another positive is that everyone sees the same thing, so it's easier to compare notes afterwards. An interviewer having a bad day would be recognized by the rest as being harsh, instead of everyone having to take their word for it that the candidate was awful. Lastly, we have a social environment and people need to operate mixed in with several people at once - this is a more realistic situation than being locked in a room with a rotating series of pairs.

2) A code review

Remember those code samples they sent us? You might ask, "How do you know they actually wrote it?" Well, we don't. However, after this session we can be sure that either they wrote it or they understand it well enough that they could have.

Three engineers will sit with the candidate and talk through the code samples the candidate supplied in the screening. This isn't an opportunity to pick on their choice of where to put their curly braces, but rather to discuss why they made the design choices they did, why they opted to do X instead of Y, how they would improve what they've done if they had the opportunity, what trade-offs they made, etc.

Why this instead of whiteboarding a cycle detection algorithm for linked lists? This review lets us probe their technical knowledge, assess their self-awareness (do they understand what they've done well and poorly?), test their communication skills (can they explain what they've done to a fresh audience?), and suss out whether they actually authored the code in the first place.
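
For contrast, the canonical whiteboard answer to that cycle detection question is Floyd's tortoise-and-hare, which fits in about a dozen lines - a generic sketch of the kind of exercise we're skipping, not something drawn from our process:

    struct Node {
        int value;
        Node* next;
    };

    // Floyd's tortoise-and-hare: advance one pointer a node at a time and another
    // two nodes at a time; if the list loops back on itself the fast pointer
    // eventually catches the slow one, otherwise it runs off the end.
    bool hasCycle(const Node* head) {
        const Node* slow = head;
        const Node* fast = head;
        while (fast != nullptr && fast->next != nullptr) {
            slow = slow->next;
            fast = fast->next->next;
            if (slow == fast) {
                return true;
            }
        }
        return false;
    }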

3) A friendly lunch

The candidate will have lunch with 1-3 people, largely non-software folks from our department. This isn't graded (except in extreme cases) and is intended to give the candidate a chance to recharge while getting to know more of the people they'd be interacting with regularly.

4) A coding session

The candidate can either bring in a laptop or use one we keep for this purpose, loaded with standard development tools. They'll be told that they have two hours to complete an assignment, which they should treat as if they'd been handed the task in an actual work environment. It's up to them what that means: unit tests, documentation, whether it actually works, whatever. Two team members will be in the room with them the whole time, and the candidate is told to treat them as they would coworkers. If the candidate wants to sit in silence they can, but they can also use the folks in the room to bounce ideas off of, ask questions, or discuss the most recent episode of Game of Thrones - it's all up to them. When they've finished, or the two hours are up, two more team members will be brought in and the four engineers will run a code review in the same format as the earlier session.

Our hope was to find a completely real-world problem for candidates to solve, although finding a good one that could be solved in two hours proved elusive. Sadly, the one we're going with for now is meatier than your typical whiteboard problem but still ends up being fairly artificial. If we required a specific tech stack we could do more, but since we're fairly language agnostic - not to mention agnostic about frameworks, ORMs, etc. - there are too many variables to allow a Real Application. Hopefully we can improve this over time.

Ok, you've blabbed for a long time now. How is it working out?

Uh, I'll admit that I can't say for sure. We've yet to have a candidate come in for a live interview, although I've already found a lot more value in the code samples than in the Codility tests we used to get. So far the screening has been going well; we'll get a better picture in the coming months as people come in and things work or don't.

To make matters worse, the first person scheduled to come in for a live interview backed out when he heard about the coding session. I'm ok with this: we make it clear that we're trying to simulate the job they're applying for, and it doesn't seem unreasonable to ask that they actually do that job for us. We'll never know for sure, but this seems like a situation that worked out for the best. When I was researching the process I found overwhelming support for this type of coding session over whiteboarding, so I'm hoping it's just bad luck that our first candidate found it off-putting.

Perhaps this will prove to have been a giant waste of time, but I have faith that it'll go a long way towards meeting the goals I stated above. Time will tell.

2 comments:

  1. Great initiative, thanks for sharing the ideas!

    Seems like a promising direction; it would be interesting to read about the results after a while.

    The only thing which is a bit strange to me is the number of participants. In my experience 3+ interviewers put a bit too much pressure on the candidate, and I prefer to avoid those situations. On the other hand, the candidate might get used to it quickly, especially when they know the rules of the game in advance. Simulating real-life scenarios is a great idea.

    Good luck with the improved process!

  2. Thanks for the comments!

    This whole thing was an exercise in tradeoffs as I quickly realized that no interview technique was universally loved. At that point the min/max function was to maximize our success rate (both hiring *and* not turning away good people) and minimize pissing people off in the process :)

    The large-group angle came and went throughout the process, with the 'nay' argument being the one that you cite; I've come across people with similar sentiments. The compromise we're trying is that at the start of the session the candidate is told what the goals & objectives are and how things will work, so hopefully that eases the tension a bit. Granted, some people are *never* going to be comfortable in a group setting, and it's not just the interview pressure, but the reality is that if they're that introverted they're going to struggle in our office environment.

    We'll have to see how things go - not just for that angle but for the whole thing. This is really just meant to be a starting point; it'll evolve over time, I'm sure. And due to some scheduling snafus (not to mention a historic level of snow over the last few weeks) we *still* haven't had a live candidate come in, so there's still no data.
