Summarized using AI

Interview Them Where They Are

Eric Weinstein • May 31, 2019 • Minneapolis, MN • Talk

In "Interview Them Where They Are," presented at RailsConf 2019, Eric Weinstein critically examines the technical interview process in the software industry. He argues that interviews often fail to assess candidates' skills and potential effectively, leading to a poor experience for interviewers and candidates alike. The core idea is a framework for evaluating candidates more inclusively and effectively, focused on identifying the talent and competencies that align with a team's needs.

Key Points Discussed:

- Inefficiencies in Current Interview Processes:
  - Many interviews consist of trivia questions that do not reflect real-world skills or knowledge.
  - Interviews can create anxiety and discomfort, which may prevent candidates from accurately showcasing their abilities.
- Anecdotes from Personal Experience:
  - Weinstein shares his own experiences as an interviewee, highlighting instances of unproductive questioning and missed opportunities to demonstrate competencies.
- Deficiencies in Job Descriptions (JDs):
  - Common pitfalls in JDs include unnecessary degree requirements and arbitrary years of experience, which can deter qualified candidates, particularly those from underrepresented backgrounds.
- Framework for Effective Interviews:
  - Weinstein suggests a flexible interview process that meets candidates where they are, customizing interviews based on candidates' backgrounds, whether they hold traditional CS degrees, graduated from boot camps, or are self-taught.
  - He emphasizes evaluating strengths rather than hunting for weaknesses, and letting candidates demonstrate their skills through real-world tasks or problem-solving scenarios.
- Inclusive Interview Practices:
  - Addressing biases within the interview process and crafting questions relevant to each candidate's specific experience.
  - Using made-to-measure interviews that adapt to each candidate's strengths, encouraging a more inclusive hiring atmosphere.

Conclusions and Takeaways:

- It's crucial to clearly define what a company is looking for before interviews and develop success metrics that tie back to the job description.

- Candidates should be assessed based on their ability to demonstrate their competencies in ways that reflect real work scenarios, moving away from traditional whiteboard interviews.

- By restructuring the interview process to be more inclusive, organizations can attract and retain a broader range of talent while also improving the candidate experience.

Weinstein's final message encourages engineers and hiring managers to reflect on their interview practices and commit to continuous improvement for a better hiring process.


RailsConf 2019 - Interview Them Where They Are by Eric Weinstein
As engineers, we've spent years mastering the art of conducting technical interviews—or have we? Despite being on both sides of the table dozens of times, how often have we come away feeling that the interview didn't work as well as it could have? How many of our interviews have been just plain bad? How much time do we spend designing and improving our own interview processes, and what signals should we be looking for when it comes to making those improvements? In this talk, we'll examine the technical interview in depth, developing a framework for interviewing candidates "where they are" by focusing on answering two major questions: how can we ensure our interview process identifies the people and skillsets we need to grow our teams, and how can we interview candidates in an inclusive way that maximizes their ability to demonstrate their competencies? By the end, we'll have built out a rich new set of tools you can immediately apply to the hiring process in your own organization.


00:00:20.689 So, I think we'll get started.
00:00:22.939 I'd like to start by telling you a story.
00:00:26.609 This story is anonymized; I've left out some names to protect companies that have interesting interview practices.
00:00:30.650 I want to let you know ahead of time that everything that happened in this story is true, but I've stitched together multiple interview loops into one for the sake of the narrative.
00:00:39.710 So, don't feel bad for me; I did not have one super awful day where everything went terribly wrong. However, these are all things that happened to me during my interviews with small companies, large companies, companies whose names you know and read about in the news, and companies that you may not.
00:00:46.350 I drove to the company, feeling very excited, and started my interview loop by talking to the coordinator who was running it. Everyone was super nice: friendly greetings, directions to the restroom, water, coffee, all that.
00:00:51.329 I went into the interview room, and my first interviewer came in and introduced himself. He pulled out a piece of paper and started asking me things like, "What is a pipe?" Not how pipes work, not why they're interesting, not which UNIX command-line utilities I might use them with; literally just, "What is that character right there, and what does it do?"
00:00:57.260 So, I said, "Okay," and gave them the Wikipedia definition. They said, "Great," and then moved on to the next question.
00:01:01.530 They asked, "Does Ajax return things other than XML?" I replied, "Sometimes, I think," and talked about JSON, web standards, and ECMAScript (or some misspelling of it on my part).
00:01:06.180 They said my answer sounded good, and we continued for about 35 minutes, with questions ranging all over the place, from front-end topics to command-line utilities to various bits of technical trivia.
00:01:09.770 At the end, they thanked me for my time, and that was it. That was interesting.
00:01:14.250 Then another person came in, introduced themselves, and said, "Let me ask you a problem." They asked me a question that I always felt there was a secret follow-up question to, like, "How would you validate that this data structure is correct?"
00:01:17.070 I knew there was another question underlying this one, but I decided to play along and started doing all the things you're supposed to do: writing on the whiteboard, saying I needed to ensure I understood the question correctly.
00:01:23.410 But then they interrupted me and said, "This isn't hard. Just write the code." I don’t know what that would do to you, but it completely threw me off my game.
00:01:29.290 For the next 35 minutes, I ended up mumbling, sweating, and freaking out, providing a poor answer to what was not a very hard question. While I tried to confirm if my answer seemed reasonable, they kept saying no, and after a lot of probing, I discovered I had just missed a semicolon or swapped two indices.
00:01:38.920 We never got to what I assume was their favorite part of the question, and it became clear they were very irritated.
00:01:40.810 But the nice thing about interviews is that they eventually end; they thanked me for my time, took a picture of the whiteboard, and left. I felt pretty bad, but that was fine, because it was time for the lunch interview.
00:01:44.730 This is when you go to the company cafeteria, and someone tells you what it’s like to work there, the culture, and what other people think. This interviewer was very nice, a bit older, and had been at the company for some time. They told me about the culture, what they liked working on, and what they didn't.
00:01:53.580 I’m not sure if HR instructed them to say this or if it’s something they felt was important to share, but their comment was: "We heard all about the sexual bathrooms." I think they meant non-gender specific bathrooms. Nonetheless, it was well-meaning.
00:01:58.730 I finished my meal, took my non-gender specific bathroom break, and returned.
00:02:03.290 Then this next interviewer threw me for a loop. They came in, very friendly and warm, and gave me a reasonably scoped interview question. They asked, "Does that make sense?" and I replied, "Yes."
00:02:06.150 They sat down, opened their laptop, and I thought maybe they would be taking notes or that there might be some interactive component.
00:02:09.000 But instead, they proceeded to ignore me entirely as though I were not there for the next 40 minutes. I attempted to engage, saying things like, "Does this make sense? Is this reasonable? Is this the right approach?" Sometimes they’d acknowledge me, and sometimes I got nothing.
00:02:16.000 Finally, they thanked me for my time, took a picture of the whiteboard, and left. This was repeated throughout the day.
00:02:19.000 Now, I was getting to the end of the interview loop, feeling very tired, but knowing I only had one interview left. The final interviewer came in, also very nice and friendly, and asked me a question that turned out to be unanswerable unless I knew what a De Bruijn sequence was.
00:02:25.870 I did not know what that was, and it frustrated me to the core because I found the topic very interesting. I left the interview, thinking I had failed, but then they asked me to come in for more interviews.
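For context (this example is not from the talk): a De Bruijn sequence B(k, n) is a cyclic sequence over an alphabet of size k in which every possible string of length n appears exactly once as a substring. A quick Ruby sketch of the binary case B(2, 3):

```ruby
# B(2, 3): every 3-bit string appears exactly once when the sequence is read cyclically.
sequence = "00010111"
# Doubling the string simulates wrapping around the cycle.
windows = (0...sequence.length).map { |i| (sequence * 2)[i, 3] }

puts windows.sort.join(", ")  # => 000, 001, 010, 011, 100, 101, 110, 111
puts windows.uniq.length == 8 # => true
```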
00:02:30.000 But I said no. Stepping back from the story for a moment: I am not a fan of comparing software developers to doctors or other medical professionals. Ours is knowledge work, like medicine, and we are sometimes on call, but pretending that keeping a website operational is equivalent to a life-or-death situation is incorrect.
00:02:35.200 I have friends who are doctors, and imagining a doctor's job interview in which, beyond having their references checked, they had to present a case on the spot to prove they know how to perform surgery highlights the absurdity of the expectations placed on developers.
00:02:44.079 Some interviewing processes feel broken, and I don't throw that term around lightly. When someone says something is broken, it invites the immediate follow-up of how it's broken. And that's precisely what I want to dive into during this talk.
00:02:48.160 This talk is titled 'Interview Them Where They Are.' Hello, RailsConf, and hello, Minneapolis!
00:02:50.880 I am delighted to be here; I've never been to Minnesota or Minneapolis before, so this has been fantastic.
00:02:54.080 My name is Eric. I am a software consultant with a company called Test Double. If you're not familiar with Test Double, we are a distributed remote consultancy that partners with client teams to not only deliver great software but ensure that the teams themselves are better as a result of collaborating with us.
00:02:59.880 If you are thinking about your current projects or environment and believe there's something we might help with, please don't hesitate to reach out. I'm ericqweinstein on most platforms, including Twitter and GitHub. You can also contact me at [email protected].
00:03:07.160 Finally, if you want to talk about interviewing, diversity, and inclusion, these are very important subjects to me, and I'm more than happy to chat. Feel free to come find me after the show.
00:03:12.380 A few years ago, I wrote a book to teach Ruby to 8-, 9-, and 10-year-olds, published by No Starch Press. It's called 'Ruby Wizardry.' If you're interested, let me know. If, for some reason, you'd like a copy and can't afford it, let me know, and we’ll work something out.
00:03:24.480 In my mind, interviewing has two key goals. The first is to find the people we need to grow our teams, which is fairly uncontroversial. The second, and often overlooked, is to optimize for letting candidates demonstrate their competencies. Too often, an interviewer, intentionally or not, treats the interview like a challenge, trying to find out what the candidate doesn't know.
00:03:31.909 That results in feedback like, "They really know JavaScript and React, but they don't understand ActiveRecord or databases," without anyone asking whether those are actually the skills the role needs.
00:03:45.190 I really enjoy writing tests; I write them frequently, and I typically follow test-driven development. But something like the following has happened to me more than once in my career: I sit down, decide I'm going to write the tests first, and get to work.
00:03:52.630 I remember getting that first test written and watching it fail red, which is the right first step in red-green-refactor. Then I wrote the production code, but the test stayed red. I started digging, trying to figure out where the issue was, tracing execution paths on paper and staring at my code, until, after an unnecessarily long time, I finally looked back at the test and realized the test itself was incorrect.
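To make the point concrete (this snippet is mine, not from the talk), here's a minimal Minitest sketch of that situation: the production code is fine, but the test itself encodes the wrong expectation, so it stays red no matter what.

```ruby
require "minitest/autorun"

# Hypothetical production code: formats a full name.
def full_name(first, last)
  "#{first} #{last}"
end

class FullNameTest < Minitest::Test
  def test_formats_first_and_last_name
    # The bug is in the test, not the code: the expected value has the
    # names reversed, so this assertion fails (stays red) even though
    # full_name behaves correctly.
    assert_equal "Lovelace Ada", full_name("Ada", "Lovelace")
  end
end
```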
00:04:05.720 This highlights a principle we often overlook outside the context of writing code. If you're interviewing someone and they have years of experience, numerous open-source contributions, are intelligent, sharp, and empathetic, and they struggle with a trivia question or perform poorly on a whiteboard, it might indicate the test—or its design—is what’s incorrect.
00:04:11.560 One thing I think about constantly is how we construct interviews. Many people believe an interview starts when the candidate walks in, or maybe a week or two earlier when the calendar invite arrives with the resume attached, but in reality it starts much earlier, months in advance.
00:04:21.440 It starts with writing a job description, or JD, as we call them in the profession. Having been an engineering manager, I've written several over the years, including some that were poorly crafted. Most job descriptions look familiar: we typically ask for some kind of credential or "equivalent experience."
00:04:31.080 We also ask for a certain number of years of experience, particular technologies, and some list of "bonus points." I don't believe most of these are good things to put in a job description at all.
00:04:39.040 Taking these in turn: I don't think a bachelor's degree in computer science is a valuable requirement. There are roles where a solid grounding in computer science is essential, but for most jobs you simply don't need the degree.
00:04:46.540 And "equivalent experience" is ambiguous; it's a content-free phrase without any real substance. My recommendation is to eliminate it. Similarly, requiring one to three years of experience may seem to signal the level of seniority you're after, but it's actually very vague.
00:04:56.840 In my experience, what someone's professional skill set looks like after one to three years varies tremendously depending on where they spent that time. Someone at a fast-moving startup will have drastically different skills from someone who spent the same years at a large bank.
00:05:05.720 What I suggest instead is describing what you need the candidate to actually be able to accomplish.
00:05:11.980 When it comes to the technology stack, it's reasonable to want candidates who know JavaScript and React, but instead of making it a hard requirement, I propose we tell candidates that we prefer they know JavaScript and React.
00:05:20.919 This also opens the door for experienced candidates who haven't worked with that particular stack but are perfectly capable of learning it on the job.
00:05:30.700 "Bonus points" for certain skills or experience is similarly vague. What I suggest here is simply being specific about what our stack really is: if we're using JavaScript and React on the front end and Go on the backend, say so. The same goes for how we organize our services and the rest of our technologies.
00:05:41.109 Overall, these changes can significantly improve job descriptions and help refine our search. They keep the essential information while encouraging candidates to talk about their relevant experience in these areas.
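As a purely hypothetical sketch of what this kind of rewrite might look like (these bullets are illustrative, not taken from the talk):

```text
Before:
- BS in Computer Science or equivalent experience
- 1-3 years of professional software development
- JavaScript and React required; bonus points for backend experience

After:
- We prefer experience with JavaScript and React; our backend services are written in Go
- You can take a small feature from design through tested, deployed code
- You're comfortable discussing trade-offs in code review and while pairing
```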
00:05:51.060 I sometimes hear the concern that loosening requirements like this will attract unqualified candidates, but the opposite is true: gatekeeping language such as degree requirements and years of experience mostly drives qualified candidates, especially those from underrepresented groups, to self-select out.
00:06:01.370 I believe that this approach will set us up to find suitable candidates. The key takeaway from this talk is to know what you’re looking for before the interview. Additionally, knowing how you will measure success is crucial when writing the job description.
00:06:13.640 Historically, we've evaluated candidates with whiteboard interviews, an approach I believe has not held up, because it misrepresents how real-world software development works. Those interviews may have served a purpose in the past, but our practices should evolve with time.
00:06:24.370 Yes, they can give you an impression of how a candidate writes code, but they rarely mirror the collaborative environments we ultimately need to measure for. We build software in teams, communicating and working collectively, while whiteboarding reduces all of that to an isolated exercise.
00:06:36.990 Additionally, books like 'Programming Interviews Exposed' tell candidates outright that they won't be asked about real-world problems, yet solving precisely those problems is the core of software development.
00:06:48.540 So what are practical, valuable ways to conduct interviews that are more inclusive while gathering a higher-fidelity signal about what candidates can actually do?
00:06:58.320 We might start by considering a candidate, let's say Ada, a fresh computer science graduate, possibly with a few internships, who has not yet written much production code. What would we assume her strengths are? Most likely algorithms: sorting, searching, and the other staples of a traditional computer science curriculum.
00:07:07.150 At this stage, she is also likely comfortable with Stack Overflow and with the kind of well-defined, tightly scoped problems that have a single correct answer.
00:07:18.149 How might we evaluate her skills? I would recommend a pairing session where she can demonstrate that knowledge collaboratively, which also lets us gauge her eye for code quality and design.
00:07:29.660 More broadly, we should let candidates show the abilities they have actually developed, matching the interview format to what they are familiar with, whether that is more traditional or more modern. That kind of flexibility creates a fairer environment for demonstrating capability.
00:07:37.450 Now let's talk about another candidate, Ben, who graduated from a boot camp three or four years ago. For a candidate like him, we might lean on pair programming to observe his attention to code quality and his ability to collaborate.
00:07:48.649 He has likely spent real time thinking about testing, modularity, and refactoring. So during an evaluation I might say, "Hey Ben, let's pair on a scaled-down version of a real production task, or do a take-home assignment that we'll then work through together."
00:07:59.160 We use these methods in our interview process at Test Double, where we often give candidates a choice of how they would prefer to demonstrate their skills.
00:08:08.250 This level of respect for candidates' time—especially considering commitments such as parenting or other obligations—can lead to a more inclusive atmosphere for demonstrating one's experience.
00:08:17.989 Next, I want to talk about a candidate named Charlie. Charlie has a traditional computer science degree but graduated 10 to 12 years ago, so they may not be fresh on algorithms or data structures, but they excel at practical work: shipping, putting out fires, and working with product teams and stakeholders.
00:08:30.360 In an interview like this, Charlie is better served by practical programming tasks that call for situational awareness and complex problem-solving. I would present them with challenging, realistic tasks that give them an opportunity to demonstrate those abilities.
00:08:39.650 The focus here should be on systems and architecture, to gauge their capacity for systems-level thinking. I also recommend creating room for them to discuss the trade-offs and decisions they made when building systems in past roles.
00:08:52.509 Lastly, I want to take a moment to address bias. People have asked me whether a more flexible interview style introduces bias, since not every candidate goes through the same process. I believe varied methods of assessment can actually reduce bias overall.
00:09:01.790 Firstly, people aren’t receiving identical interviews even today; every interviewer has different standards and varying experiences that inform their process. This has a direct impact on the result of the interview.
00:09:09.750 The real question is distinguishing uniformity from unbiased assessment. Putting every candidate through the same process does not mean each of them is being evaluated fairly, and it often strips the interview of real insight into what a candidate can do.
00:09:19.210 Moreover, phrases like "equivalent experience" and required years of experience act as proxies for candidate quality, much like standardized tests such as the SAT, which measure socio-economic privilege more than real professional capability.
00:09:29.750 Having a firm understanding of what you're looking for and how to evaluate candidates can facilitate a more productive hiring process. The most crucial aspect is assessing candidates in a manner that values their unique skills.
00:09:38.020 In conclusion, interviews fail the interviewee when the process does not account for people like them. If an interview loop is designed only for those with certain privileges or specific experiences, you risk excluding excellent candidates.
00:09:47.360 Ultimately, the goal should be to allow candidates to showcase what they do best—leading to insights that highlight their actual competencies instead of arbitrary standardized evaluations.
00:09:55.000 In essence, remember that flexibility, inclusivity, and understanding context are key drivers in constructing productive and positive candidate experiences when interviewing.