Seeing less can help us make fairer decisions
Years of research suggest that even if your organization doesn't have a formal "blinding" policy for hiring and other people evaluations, you should practice blinding anyway.
We are living through a moment in which the question of bias is front and center in many business discussions. There are good reasons for the focus on this persistent problem. We know, for example, that identical resumes receive less attention when they carry Black-sounding names than when they carry white-sounding names. We also know that female entrepreneurs face a harder path to getting startups funded than their male peers.
Because so much research has shed light on the potentially negative impacts of bias, an increasingly important question is what can be done to minimize the effect of bias in critical people-centric decisions. One of the more established response strategies is known as blinding, which refers to withholding specific information (e.g., an applicant's gender or age) from a decision-maker until after a decision is made. The technique has a long history; a famous case is the adoption of blind auditions by some symphony orchestras, a practice that started in the 1960s. The impact of blind auditions was such that by the 1990s 25% of symphony musicians were women, up from about 5% in the 1950s.
As with the symphony example, research has demonstrated that if job applicant demographic information is withheld from hiring managers, job applicants from underrepresented groups are more likely to get interviewed and, in some cases, to receive job offers.
Even though the literature on blinding is clear about its benefits, the practice remains the exception in business, which led Sean Fath (Cornell), Richard P. Larrick (Duke), Jack B. Soll (Duke), and Susan Zhu (Kentucky) to seek to understand why a technique with a long track record of reducing bias in decisions has not gained wider acceptance in the corporate world. Specifically, the researchers explored the factors "that might influence whether evaluators will choose on their own to use a strategy of blinding in their evaluations" and how effective interventions targeting those factors are once deployed.
As noted above, despite the documented positive impact of blinding, very few Human Resources organizations have formal blinding methods in place. The authors confirmed this observation in a survey of over 800 HR managers. Overall, almost 60% of the managers said they were familiar with blinding methods in hiring, but only 19% said their company had any blinding policy in place and only 18% had ever worked for an organization that used blinding in HR decisions. Moreover, only 20% had ever been trained on how to use blinding in decision-making, though that number was slightly higher for respondents who worked for public sector organizations.
From years of research into the issue, the authors identify several factors that stop managers who know the value of blinding from using it in their own decisions. One such factor is simple curiosity. The authors describe a study in which they asked participants to view a video of someone completing a specific task and then to gauge the person's performance. Participants in one group had the option to see the person's personal profile before watching and grading; on average, half of them chose to view the profile. A second group, however, was first asked whether they should see the profile (before being asked whether they wanted to). In this second group, approximately 90% of the participants agreed that seeing the profile could bias their judgment and chose not to view it. Thus, note the authors, "people have the insight that it is better to avoid certain information, but they need to trigger this insight by reflecting on what they should do."
In a similar study, participants were asked to interview someone for a job. As in the study noted above, one group was given a simple choice: they could see the applicant's name and photograph before the interview. A second group was first asked whether they should see them. A third group, however, was given a different option: "to first judge the job candidate based on credentials alone and then see that person's name and photo (with the option to revise the initial judgment)." Participants in this third group were more inclined to make a blind judgment first and see the name and photo only after making their initial decision. Notably, fewer than 20% of these participants revised their initial, blind judgment. In other words, "with their curiosity satisfied, most subjects chose not to adjust their assessments based on the potentially biasing information."
The authors note that in addition to curiosity, people also choose not to self-blind because "they honestly, but incorrectly, believe biasing information to be useful or helpful." In another (unpublished) experiment, the authors gave one group of hiring managers the option to see a candidate's professional headshot and credentials, while another group had the option to learn a candidate's race and gender. The authors note that "even though a person's photograph is very likely to reveal that information, we reasoned that the explicit option to learn a job candidate's 'race and gender' would be more likely to cue reflection about potential decision bias than the option to see a candidate's 'professional headshot'." In this study, 45% of the managers chose to see the headshot, while only 20% chose to see the race and gender information. The authors believe that "certain information, like a candidate's name, headshot, or college graduation year, may fail to cue a desire to self-blind because the underlying, potentially biasing content — race, gender, age, and so on — is not immediately brought to mind."
After considering the results of various experiments across multiple studies, the authors conclude that "evaluators who can overcome or delay a curiosity-driven impulse to receive potentially biasing information about a target — and who understand that having such information tends to hurt rather than help decision-making — are more likely to choose to blind their own evaluations."
Given that so few organizations have adopted specific blinding policies, the authors end their analysis by suggesting two blinding-related techniques that can help reduce bias. The first approach is to "nudge deliberative thinking." Because curiosity is often the force pushing back against adopting blinding techniques, evaluators should be encouraged to reflect on their decisions and to consider how bias may be affecting the outcome. Just asking decision-makers to consider the question "What should you see?" can be enough to encourage self-blinding. Thus, managers should be trained to question what information they are using to make decisions — and when it is used — because this self-reflection stage can often be enough to make someone adopt blinding proactively.
A second technique is to "change the order of information." Thinking about when certain information is seen can also be a catalyst for self-blinding. For example, "managers can be asked to first perform a blind evaluation (such as evaluating an anonymized resume) and then receive the information that was hidden from view (the job candidate's name, college graduation year, hobbies, and so on), with the option to revise their initial blind evaluation." In the unpublished study noted above, the authors found that changing the order of information reduced initial decision bias and that fewer than 20% of participants elected to change a decision once they were shown a candidate's full profile. This finding suggests that a key to reducing bias is simply delaying access to the kind of information that triggers it until after an initial judgment has been made.
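For readers who build evaluation tooling, the "fair order" workflow can be expressed as a simple procedure. The sketch below is purely illustrative and is not the authors' code: the field names, the toy scoring rubric, and the function names are all assumptions. The only point it demonstrates is the ordering the authors describe, where a blind score is committed before the withheld fields are revealed and revision is optional.

```python
# Illustrative sketch (not the authors' method) of a "fair order" evaluation:
# score the anonymized credentials first, then reveal the withheld fields
# with the option, but not the obligation, to revise the initial score.

# Fields treated as potentially biasing, echoing the article's examples
# (name, graduation year, hobbies); "photo_url" stands in for a headshot.
BLINDED_FIELDS = {"name", "photo_url", "graduation_year", "hobbies"}

def split_profile(profile):
    """Separate a candidate profile into blind credentials and withheld info."""
    blind = {k: v for k, v in profile.items() if k not in BLINDED_FIELDS}
    withheld = {k: v for k, v in profile.items() if k in BLINDED_FIELDS}
    return blind, withheld

def fair_order_evaluation(profile, blind_scorer, reviser=None):
    """Return (initial_blind_score, final_score).

    blind_scorer sees only the anonymized credentials; reviser, if given,
    also sees the withheld fields and may adjust the initial score.
    """
    blind, withheld = split_profile(profile)
    initial = blind_scorer(blind)
    final = reviser(initial, withheld) if reviser is not None else initial
    return initial, final

candidate = {
    "name": "A. Example",
    "graduation_year": 1998,
    "skills": ["SQL", "statistics"],
    "years_experience": 7,
}

# Toy rubric: score years of experience out of 10, judged blind.
initial, final = fair_order_evaluation(
    candidate, blind_scorer=lambda c: min(10, c["years_experience"])
)
```

A real system would record both scores; consistent with the research summarized above, the final score should diverge from the blind one only rarely.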
The authors call this approach a "fair order" strategy, and it can be useful in many settings. For instance, venture pitches could be presented in two parts: first, a written description of the pitch (with information about the entrepreneur not included), and then a live presentation with the actual founder. As the authors note, "investors who first read and evaluate the blind version of the idea — the written description — and then see the pitch with the option to update their evaluation may be less likely to be swayed by the gender or the attractiveness of the entrepreneur in their final evaluation than those who learn about the idea from the pitch alone." While many venture funding processes look like this at first glance, the reality is that many VCs explicitly seek to be swayed by the founder profile right from the start of a pitch evaluation.
The good news is that researchers continue to make a compelling argument for the increased adoption of blinding in hiring, promotions, innovation leader selection, venture funding, etc. Adoption may be increasing slowly, but it is increasing. Until the practice gains wider institutional support, managers should consider self-blinding options, keeping in mind the possible negative effects of their own natural curiosity and the order in which candidate information is viewed. This evolution is a challenge for many managers because research has also shown that people sometimes choose to see potentially biasing information, due to the mistaken belief that such information may bias others but not them. Notwithstanding this uniquely human challenge, the authors make a solid case for at least considering how self-blinding can make our deliberations fairer and our decisions more equitable.
Sean Fath, Richard P. Larrick, Jack B. Soll, and Susan Zhu. "Why Putting On Blinders Can Help Us See More Clearly." MIT Sloan Management Review, June 8, 2021.