
Software, technology, sysadmin war stories, and more.
Wednesday, February 6, 2013

Inverting the polarity of interview coding

It seems like almost everyone in this industry has a story about being asked to write some code on a whiteboard when interviewing somewhere. They probably also have a story about asking someone to do that same thing when the tables are turned and they're the one conducting an interview. This seems to be the way things are, and even though some people balk at being asked to write code, the status quo seems intact.

There is plenty of research and ordinary writing online about what to do before going in for an interview and also what to do if you are giving one. There are puzzles, trivia questions, and challenges which seem simple but really aren't. Both sides of the table are trying to maximize their return on investment, and there's probably a good bit of "gaming the system" happening too.

Still, it seems like all of the code flows one way: the candidate writes it and the interviewer reviews it right there in the room. If the interviewer takes notes, then that person's hiring committee overlords probably also review the code at some later point. That code may get a bunch of reviews, but it's still just flowing one way.

I wonder what would happen if it went the other way. The interviewer would come into the room loaded with a couple of code examples and would proceed to hand them to the candidate. The candidate would then have to review the code and give their honest assessment of it. It seems like this could open up a bunch of new routes for candidate analysis which have previously been used only by coincidence.

Just think of the possibilities! You could lead off with something simple, like a basic class which is supposed to take a bunch of values and remember which ones it's seen. Instead of using a really good implementation, it could be something embarrassing which has at least one giant stinker in it. It might fail on certain types of input, or maybe it's not thread-safe, or perhaps it does things which are a Bad Idea in your language of choice.
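As a concrete sketch of what I mean (this is my own illustration, in Python, and the class name is made up), here's a "remember which values you've seen" class with a couple of deliberate stinkers baked in for the candidate to find:

```python
class SeenTracker:
    """Remembers which values it has seen. Deliberately flawed."""

    def __init__(self, initial=[]):
        # Stinker #1: mutable default argument. Every instance built
        # without an explicit list shares the SAME list object, a
        # classic Bad Idea in Python.
        self.seen = initial

    def add(self, value):
        # Stinker #2: O(n) membership test on a list; a set would be O(1).
        # Stinker #3: check-then-act with no locking, so two threads
        # can race here and both append the same value.
        if value not in self.seen:
            self.seen.append(value)

    def has_seen(self, value):
        return value in self.seen
```

A candidate who has been burned before will spot the shared default list almost immediately: add a value through one instance and a freshly constructed second instance "remembers" it too.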

A candidate who spots stuff like this has probably dealt with this kind of code before. It's likely they've written something like it, but it's also possible they merely reviewed it for someone else. Either way, they should wind up sharing some domain knowledge, which will give you an idea of how well they know the topic.

Then there's the invitation to fix it. What do they do when asked to give their two cents on the code? Do they just rip it apart and leave it at that? Do they recognize what's going on and just say "you're doing it wrong" without bothering to share an alternative? Maybe they'll say a few nasty things about the author. All of this is valuable data. If they do that in an interview, I'd guess they probably do that when presented with "real" work. Do you want to work with someone like that? This is your chance to find out.

It might also be interesting to drop in a few more things to collect some data about candidates in general. This would be far more controversial, but imagine a subtle bit of metadata at the top of the code. It might say "Author: rachel@example.com" or "Author: chuck@example.com" or "Author: jbrown@example.com". After a while, there might be some interesting patterns in the data.

There are some definite possibilities here. I'm sure I'll get a bunch of feedback which tells me about some forward-thinking place which has been doing this for years but which is off my radar. It's a big industry and that's entirely possible. If you know about this and have a story to share, please write in and let me know!


February 7, 2013: This post has an update.