Aja dives into what is wrong with many current tech hiring practices and what you can do instead. You’ll learn ways to effectively assess technical ability, what makes a great interview question, how to be more inclusive, the pros and cons of giving homework assignments, and much more.
Aja Hammerly: Hi, I am Aja and I work at Google and my talk today is called Hiring is Not Hazing. I'm specifically going to be talking about interviewing. And just a heads up, I've got 47 slides in 15 minutes, so this is going to be a little bit fast, but first we need to address the elephant seal in the room. I work at Google. Google has a reputation for a long and difficult interview process. I know I've failed the Google interview process several times. It wasn't actually particularly pleasant. I'm a little bit salty about it still.
And so I want to say upfront, I haven't fixed anything about hiring at Google. I think I've made the interviews with me better. I think I've positively influenced my coworkers and my peers, and a bunch of us are trying to fix the hiring process, especially for DevRel at Google.
But any big company changing things is hard, and it's been slow going. So let's get on with the rest of the talk and the ways that I've been able to make myself a better interviewer and make hiring a lot less like hazing. First, we have to start with the obvious: interviews suck. I've read so many blog posts, Twitter threads, Facebook rants, everything about how much interviews suck, and I'm just going to posit right here that tech interviews kind of suck more. I've thrown this out to Twitter a couple times, asking what people don't like about interviews, or what I could do to make interviews more pleasant, or how I could make the interview process less intimidating, and people just come back with anger and vitriol.
These are fictionalised, but kind of representative of what I hear: "I hate whiteboard interviews." "CS 201 stuff isn't relevant to my job."
"I hate homework problems." "I hate pairing interviews." "I don't know anything about the company's domain." "Whiteboard interviews are so unfair. So what if I miss a semicolon?" "Interviews all suck."
"It feels like the interviewers just want to show off." It's super duper negative, but I took a step back and tried to figure out what there was to learn, because this negativity is coming from somewhere useful. And when I looked a bit deeper, I found that there are some actual, kind of global, interview anti-patterns hidden here, and they aren't things like "whiteboard interviews just suck", because for a lot of people, whiteboard interviews do suck. There's something deeper. The first thing you notice is we kind of ask bad questions. The CS 201 stuff isn't relevant to my job? Yeah, that's fair. Homework problems taking eight hours?
Nope, not cool. Expecting folks to know about the domain your company's in when you're doing a pairing interview?
Yep, that's another example of a bad question. We also do a bad job of evaluating interview candidates, and/or they are afraid we're going to do a bad job of evaluating them. So again, worried about domain knowledge or worried about forgetting a semicolon. And finally, quite frankly, a lot of interviewers just show up with a bad attitude. They feel like they need to be there to show off, or they feel like they are there to fail everyone. So to summarise: tech interviews suck because we ask bad questions.
We do a bad job of evaluating, and a lot of us just plain old show up with bad attitudes. Turns out, though, we can totally do better, and doing better is actually pretty straightforward. We can ask better questions, we can do a better job of evaluating, and we can definitely show up with better attitudes. And if we fix these things, we can fix whatever interview process you have, whether it's whiteboard or behavioural or take-home.
All of these things can improve any of these processes, and most of these changes are things you can do today, even if you have an interview later this afternoon. You don't have to go through a huge HR process, and in many cases you don't even have to talk to your boss. So let's just dive straight in. How can we ask better questions?
The biggest thing I can tell you is: just keep it simple. So many interview questions, and quite frankly the ones that I started asking when I started interviewing, are too long and wordy and complex. This is especially true of homework problems that we give to people to solve in their own time. People aren't at their best in an interview situation, so please cut them some slack. Your actual problem statement ideally should be a couple of sentences; all but one of my questions is a single sentence.
The one that's a little bit longer than that is a coding question; it's like five sentences. I'll talk about that in a second, because the other thing that's really important to know about your questions is they can be easy.
Folks aren't at their best during their interviews. Something that seems easy to you, that you could probably solve in five minutes, is probably going to take someone in an interview 15 or more. And that longer question I mentioned is actually one of my favourites right now, and it's actually the first homework question from the first chapter of an introduction-to-computing-for-non-computer-scientists book. Literally the easiest question in an introductory book, and I get plenty of signal from it. I get signal about design approaches. I get signal about their knowledge of data structures and code clarity. I get signal about their ability to explain algorithms, their ability to ask a customer what the edge cases are, making sure they truly understand a customer's problem statement.
All the stuff I need to know when I'm interviewing advocates and DevRel engineers,
I get from this super duper simple question. Another pattern I've seen in bad questions is a lot of jargon. What do I mean by jargon? Well, jargon is kind of a problem in tech circles. It's actually a problem in STEM circles generally. There's obvious stuff like slang or terms internal to your team, and Google is pretty infamous for this, but there's also some interesting jargon that goes deeper than just internal terminology. This is actually one of my favourite examples of a sneaky place for jargon to weasel in, and it's actually a place that I ran into interviewing some candidates several years ago. So this right here is a JSON structure describing my cat.
Her name's Emma. She's a cat. She weighs 22 pounds, actually only 18 right now. We've been working on that.
But what do you call this data structure? That's the interesting part. Is it a dictionary? Is it a hash?
If you're in Ruby, it's a hash. Is it an array? There are a couple of languages that call this an array. Is it a hash map? Is it a map? The Gophers like to use map, and I think the Java people use map. If you're a Lisper, it's an associative array, and that's actually the data type as listed in most data structures books.
But what you call this depends on the language. So if you say your input is an array or a hash, the candidate might have a different understanding of what you're asking about, which is why it's super important to give examples as a way to combat jargon. Don't just say the words; show what you mean in something as clear and concrete as you possibly can.
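For instance, here's a minimal sketch of what showing the structure concretely might look like; the exact field names are my guess, since the slide isn't reproduced here:

```python
# The structure from the slide, written out concretely instead of named.
# Ruby calls this a Hash, Python a dict, Go and Java a map, and many
# data structures books call it an associative array. Same idea, many names.
cat = {
    "name": "Emma",
    "species": "cat",
    "weight_lbs": 22,  # well, 18 now; we've been working on it
}

# Showing the literal removes the ambiguity: whatever the candidate's home
# language calls the type, they can see it maps keys to values.
print(cat["name"])  # => Emma
```

No matter what the candidate calls this at home, the literal makes the shape of the input unambiguous.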
Another way to avoid a lot of jargon is to keep the context of the problem familiar. I know a lot of people like to ask questions about sports or games, and they will say, "Well, it's not a big deal that it's about sports or games. I supply the rules, so it's okay if folks don't know the sport or the game."
I've also seen folks ask about a specific industry: finance, retail, et cetera. And some people really like to ask math-y questions. The problem with all of these is that while we think of them as okay if we supply the rules, or okay if we give an explanation, by requiring that explanation a lot of people are going to assume that they don't know something that was required. They may assume that everyone else knows the rules of, say, soccer, and that the fact that you had to explain the rules to them is going to count against them.
It also means that they're working with less familiarity with the context than another candidate. So as much as possible, I try to keep the context of my problems very familiar, and/or I try to keep my problems context-free. If I do have to use a context, I try to keep it to things that almost everyone would've experienced. Questions should also be relevant to the job description. Don't ask folks interviewing for a community manager position to reverse a linked list, unless your community manager position requires reimplementing CS fundamentals. If so, I want to talk.
Don't ask someone interviewing for a DevEx API design position to give a presentation unless giving presentations is going to be a regular part of their job. And goddammit, no brain teasers and riddles. Please, no tricks. We don't need to know why manhole covers are round. We don't need to know how many ping pong balls fit in a 747. There are so many other ways to get information about how your candidates solve problems. Ask them to solve a problem that you would solve on a regular basis, or a simplified version of it. Ask them how they would answer a question that you got the last time you did booth duty (I know it's been a while for a lot of us), or ask them to answer the kinds of questions that you get on Stack Overflow or via your product Slack or something.
Also, don't over-specify, and this one's actually super subtle.
This is a failure case I see in a lot of more experienced interviewers, and the big thing here is to not lead the candidate to a specific answer. If you want to find out how a candidate would handle deprecating an API and migrating your user base to a new one, ask, "How would you handle deprecating an API and migrating the users to a new one?" Don't ask, "What kind of outreach campaign would you do via our mailing list if we needed to deprecate an API?", because you're leading that candidate to a mailing-list-only answer. It may be that the candidate has never used a mailing list for this purpose and has nine million other awesome ideas that are better than anything you could have come up with, but you prematurely bounded their answer, and you're expecting them to have the exact same experience and the exact same understanding of the world that you do.
So you're limiting the chances for them to get it right. And not expecting people to have the same world experience you do leads us to the second category that I identified earlier: better evaluation. Because, quite frankly, better questions don't matter if you aren't evaluating the answers well, and a lot of candidates are worried that their interviewer is going to be a harsh grader, or that their interviewer is not evaluating consistently. And quite frankly, sometimes I worry that I'm not evaluating consistently.
So how did I become a better evaluator, and how can we as interviewers get better at evaluating? Primarily: have a rubric. If you're unfamiliar with the term, a rubric is a rating guide that includes examples for each rating. Most big companies use rubrics at this point, but I've actually written rubrics for a couple of my more complex problems so that I have a general idea of what an outstanding answer, a solid answer, and a weak answer look like for, say, the question "How would you handle deprecating an API?", to use an example from earlier.
You can also use rubrics for specific job skills, like communication skills versus technical skills versus teamwork and project management skills. Those are also ways that we can use rubrics to help make sure that we're holding a consistent bar across candidates. And I'm going to call out explicitly right now that a rubric is qualitative, not quantitative. It describes what a candidate's answers would look like at strong or medium or weak for a given category or a given problem.
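To make that concrete, here's a hedged sketch of what a qualitative rubric for the API deprecation question might look like. The bucket descriptions are invented for illustration; this is not Google's rubric format:

```python
# A hypothetical qualitative rubric for "How would you handle deprecating
# an API and migrating users to a new one?" Each bucket is a description
# with examples of behaviours, not a point total.
DEPRECATION_RUBRIC = {
    "outstanding": (
        "Asks who the users are and what migrating costs them; proposes a "
        "timeline with announcements, a dual-running period, and a rollback "
        "plan; considers users they can't reach."
    ),
    "solid": (
        "Proposes a reasonable timeline and at least one outreach channel; "
        "covers the obvious edge cases when prompted."
    ),
    "weak": (
        "Jumps straight to turning the old API off, or can't explain why "
        "users would need advance notice."
    ),
}

# While writing feedback, re-read the buckets and pick the one the answer
# actually resembles, rather than tallying up deductions.
for rating, description in DEPRECATION_RUBRIC.items():
    print(f"{rating}: {description}")
```

The point is that each bucket is prose describing behaviours, not a number to add up.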
A lot of people try to treat rubrics like points, like this is basketball. So they'll say things like, "Well, they had three syntax errors in their whiteboard code, so I'm going to take one off for each of those, which leaves them a two out of five, which means they are borderline for this particular skill." No. This isn't basketball; we're not doing points. It's a qualitative evaluation. If those three syntax errors were minor, maybe the answer was still solid.
This isn't points; please don't do points. Another thing that I've learned at Google especially is to question yourself while you're writing up your feedback. I find this really helps me clarify my thinking and be more confident that I'm being consistent, and it helps counteract a huge number of cognitive biases that we all have.
Things like recency bias or anchoring bias. There are two questions that I learned specifically at Google that I really, really like, and they work really well when paired with a rubric. The questions are: why not higher, and why not lower? Say your rubric has three buckets, exceptional, solid, and borderline, and you've picked solid. You would ask yourself, "Why didn't I rate the candidate exceptional on this trait or on this question?" And then you would also ask yourself, "Okay, why didn't I rate them borderline on this question?"
I actually think about this as checking my work, and in cases where I'm not confident, I will type up my answers to these questions and include them in my feedback, so that people understand how I got to the conclusion I got to and can see that I did check my work. But there are other questions you can ask yourself. Here are some of the ones that I frequently ask myself.
When interviewing DevRel candidates: Can the candidate clearly explain their technical decisions? Did the candidate consider common errors and race conditions when we were talking about a design or technical problem? How did the candidate handle hints or feedback? How did the candidate handle it when I disagreed with them? If I disagreed with their approach, were they respectful? Were they able to understand my disagreement and help me come around to see their point of view?
These are all important things. And this is the other question I ask myself all the time: does it actually matter? If the candidate missed a semicolon in their whiteboard code, before you ding them for it,
ask yourself: does it matter? I'd argue no. In real life, an IDE, a compiler, or an interpreter would catch it. I remember one candidate, the first time I really started using this question. We were doing a coding question, and they had reversed the indices of a 2D array: they had the X where the Y goes and the Y where the X goes. I verified that they got it wrong, and I was about to ding them for it, and I'm like, actually, no, they would've figured that out within 10 seconds. Everything else about the conversation was amazing. They were fantastic at explaining what they were doing.
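To make the slip concrete, here's roughly what a transposed 2D index looks like; this is a reconstruction for illustration, not the candidate's actual code:

```python
# A 2D grid stored rows-first: grid[y][x] is the intended access pattern.
grid = [
    ["a", "b", "c"],  # row 0 (y = 0)
    ["d", "e", "f"],  # row 1 (y = 1)
]
y, x = 1, 2

# The slip: indices transposed. grid[2] doesn't exist, so this would raise
# an IndexError the moment the code actually ran.
# value = grid[x][y]

value = grid[y][x]  # intended access: "f"
print(value)
```

The very first test run surfaces a mistake like this, which is exactly why it wasn't worth a ding.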
They taught me some cool things about the language they were using. They understood that this was the best answer they could give, and then we talked about why this problem couldn't be optimised. No, that one error didn't matter: not worth dinging them for, not worth even including in my evaluation. And related to "does it matter", I always like to make sure that we assume some ramp-up time, because everyone gets some ramp-up time. If the candidate makes a mistake, think about whether it's the type of thing you'll be able to teach during ramp-up or onboarding. If a candidate gets super nervous and muddles their presentation, or maybe their slides aren't particularly pretty (mine aren't ever pretty), maybe that's a thing that doesn't matter. Maybe that's a thing that you can help them with.
You can teach them during their ramp-up time.
Maybe that's a thing that your company has resources available to help them with. Maybe you have a slide template that they can use and everything will be gorgeous. Assume that they're going to have ramp-up time and access to all the resources that you have if you hire them, and take that into account in your evaluation. And now, completely unrelated to the previous one: a lot of folks I know have started using gender-neutral pronouns when they write up their feedback. By gender-neutral pronouns I mean, in English, they/them. Originally I thought this was to prevent bias for others reading my feedback: they wouldn't know what gender the candidate was, so they wouldn't bring their biases in.
But I actually think it's also so that I don't bring in some of those nasty unconscious biases that I have, and some of those biased pieces of language, like the word "bossy", that we use in English sometimes negatively with one gender or the other, or positively with one gender or the other.
This is also just a fantastic way to get used to using gender-neutral pronouns. I'm way more comfortable with them seven years after starting this than I was when I began. If using gender-neutral pronouns feels too hard, there are Chrome plugins and other browser plugins that can identify gendered language like "bossy" and suggest alternatives. I really like having those, because even when I'm feeling lazy, they will remind me: hey, you need to stop being lazy and be serious about this. This is important. And we sometimes miss some of those little things because they're just so embedded in our language and our culture.
So that's better evaluation. The last category is better attitudes, and better attitudes are the most important. This is where the hazing part of the title comes in. People come to interviews with bad attitudes all the time:
"I went through this, so you have to too." "The candidate must know more than me about my domain." "My job is to ensure that only the best get through our process."
Yeah, no. I don't know how to stress this enough: come to the interview with a positive attitude. And I don't mean friendly, although yes, please do be friendly. I mean, come to the interview expecting it will go well, not looking for reasons for the candidate to fail. I refer to this as being on "team the candidate". If everyone in the interview process wants the candidate to succeed, the candidate can tell, and they will feel better. And I personally believe that they will probably do better.
I tell most of the candidates that I interview: "I'm here to figure out what you're good at. If I touch on something that you don't know or struggle with, let me know and we'll move on."
This is what I think of as a strength-focused interview. It means I don't dwell on mistakes, even major ones; I move on. Yes, that means I have to have extra questions, because sometimes we have to skip some of the ones I was hoping to ask. It's fine. I always show up with a couple of extra questions in my question bank; it's not a big deal. I truly do just move on. I also accept "I don't know". If the candidate says "I don't know" to a critical skills question, where I absolutely need them to know a particular language well, or I need deep knowledge about containers, I can always follow up with: Well, how would you learn? How would you find out? Or, what do you know that's similar to this?
Or, how have you solved problems related to this in the past?
I can find out information to figure out if they can solve that problem, and get an accurate assessment of their skill in that area, without just harping on questions they don't know the answer to, because that's not pleasant for either of us. And the last thing I'll say about strength focus: people go, "Well, but what if they just don't have the skills needed for the position?" In that case, you have two options. You can end the interview early. It hurts, and it's kind of uncomfortable for both parties, but sometimes that is the best choice. Or, what I like to do is think about whether there's another position we have open that they would be successful in.
Maybe they're not doing so well for this API engineering position, but they'd be a fantastic community manager.
And that's one of the reasons I focus on strengths and try to figure out what they are good at: it gives me a good perspective on where they might fit in, if not in the particular opening that I have right now. That's easier at a bigger company, of course, but if they can fit in elsewhere, let's see if we can find them a place where they will be successful. So right before the end, a quick lightning round. First of all, biology happens. People need to eat, drink, and pee, and some people need to pump. Make sure that you make time in your interview process for biology. This is especially true of onsite interviews, but even in the time of virtual interviews, make time for a break so people can go take a pee.
There are likely legal requirements about how many and what kinds of accommodations you need to supply for people with disabilities.
Accommodating people with disabilities is the ethical thing to do, but please talk to your HR team or a lawyer to make sure that you are doing it correctly. If you can, offer folks some coaching. And by coaching I don't mean teaching them the skills they need to be successful at the job. By coaching, I just mean giving them a chance to try the interview format and get some feedback on how they interviewed and what they could practise, or just getting them used to the format. So much of people's nerves doesn't have to do with not knowing the material or not studying enough; it has to do with not knowing what to expect. So if you can give them a low-stakes coaching interview, where they just do the process once and get a little feedback (longer answers here, shorter answers there), it can help make candidates who would've struggled really, really successful.
This is also a fantastic place to use less experienced interviewers who need to practise interviewing. Also, be human. Offer a bit of yourself, though not too much; the interview is about them, not about you, but don't be a robot. And finally, don't ask personal questions. This seems super duper obvious, but a lot of us are kind of gregarious, even if we're introverts; a lot of us work hard to make people feel at ease.
And one of the ways we do that is we ask people about themselves. We ask personal questions to be nice and friendly. In an interview situation, this can backfire, and it can backfire in subtle ways that you don't necessarily notice, especially virtually. Even seemingly innocent questions like "Read any good books lately?" can stress someone out or put someone in an awkward situation if they haven't had a chance to read lately. Or maybe they're reading a controversial book, or a book that they don't know if you're going to be okay with. It's just adding a whole bunch of stress to a situation that doesn't need to be stressful. So keep friendly questions focused on the job. Why are you interested in this job?
Ask questions about stuff that's on their resume: "I saw you worked at this company. What team did you work on while you were there?" Or, if you know someone there, "Did you work with this person?" Try to keep the questions as focused on the job skills as possible.
Thank you. The slides for this talk are shared on my website. There's also a recording of a longer version of this talk that is specific to software engineering, not DevRel, if that is more your speed. But thank you very much.