5 Biases organizations are guilty of when making application decisions

Many organizations have hidden biases in their application selection processes, whether they are distributing grants, evaluating résumés, or awarding scholarships. In one well-known résumé study, applications with white-sounding names were roughly 50% more likely to receive a callback. These biases shape which applicants you fund, which candidates you interview, and who gets a scholarship.

We should all strive to make our decisions as equitable and bias-free as possible. Here are the top 5 recommendations to improve your decision-making process.

1. Remove Identifying Information

People have biases, whether they are consciously aware of them or not. Information about an applicant, such as their name, age, race, gender, or religion, can affect a reviewer's decision. Hiding this information during the review process helps eliminate these biases.

More importantly, removing identifying information reduces affiliation and conflict-of-interest bias. If an applicant works for a prestigious institution, for example, research suggests reviewers favour them by up to 15%. When a reviewer knows an applicant or has a conflict of interest, they are four times more likely to give a high evaluation.

Most likely, you'll need to collect identifying information as part of the application, but you should remove or hide it from reviewers, either manually or using software.
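If you're doing this in software, the core idea is simple: keep the full application on file, but give reviewers a copy with identifying fields stripped out. Here's a minimal sketch in Python; the field names are illustrative assumptions, not a real schema.

```python
# Hypothetical sketch: hide identifying fields before handing
# applications to reviewers. Field names are illustrative only.
IDENTIFYING_FIELDS = {"name", "age", "race", "gender", "religion", "employer"}

def redact(application: dict) -> dict:
    """Return a reviewer-facing copy with identifying fields removed."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

application = {
    "name": "Jordan Lee",        # hidden from reviewers
    "age": 34,                   # hidden from reviewers
    "essay": "My project plan",  # visible to reviewers
    "gpa": 3.8,                  # visible to reviewers
}
reviewer_view = redact(application)
# reviewer_view contains only "essay" and "gpa"
```

The original stays intact for administrators; only the redacted copy ever reaches the review queue.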

2. Prevent reviewers from seeing each other's evaluations (Conformity Bias)

When you have a group of decision-makers, there is often a group consensus that can suppress an individual's desire to share alternative thoughts, critiques, or other unpopular opinions. This concept is called "Groupthink" or conformity bias.

When a review process falls victim to groupthink, individuals tend to score, evaluate, or comment similarly to what others have already done instead of giving an authentic review.

By preventing reviewers from seeing others' evaluations, you force them to provide an authentic and well-reasoned evaluation rather than bandwagoning on the group consensus. This leads to a more equitable result and can spark essential conversations.

3. Randomize the application order

If you hand a reviewer a stack of 30 applications to evaluate, the first application they read becomes an anchor point, and every application after it is subject to an "anchoring bias".

Anchoring bias is a cognitive bias that causes us to rely too heavily on previously seen information: we interpret each new application from a reference point (the anchor) set by the first or earlier applications. This distorts our decision-making, because each application ends up being evaluated relative to the anchor rather than on its own merits.

By randomizing applications, the advantage or disadvantage that might be received from being first or last is negated. This becomes more significant the larger the review team and number of applications.
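Randomization is a one-liner in most languages. A minimal sketch, assuming each reviewer sees the same pool of applications in their own independent random order:

```python
import random

def assign_shuffled(applications: list, reviewers: list) -> dict:
    """Give each reviewer the full application pool in an
    independent random order, so no single application is
    consistently first or last across the review team."""
    assignments = {}
    for reviewer in reviewers:
        order = applications[:]   # copy, so each reviewer gets their own order
        random.shuffle(order)
        assignments[reviewer] = order
    return assignments
```

Because each reviewer's order is shuffled independently, whatever anchoring effect the first application has is spread randomly across the pool instead of landing on the same applicant every time.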

4. Provide a Rubric

Everyone has their own preferences and ways of thinking, and these differences influence how much weight individual reviewers give to the same information. Providing a rubric helps mitigate this.

A rubric also reduces anchoring bias, because the rubric itself becomes the "anchor" instead of the first application.

Rubrics should have distinct, specific criteria along with a scale for each (e.g. good, okay, bad). This helps reviewers assess and score each application fairly. Aggregating the scores at the end with equal weighting yields results you can be confident in when making decisions.
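The equal-weight aggregation described above can be sketched in a few lines. The criteria names here are illustrative assumptions, not a recommended rubric:

```python
# Hypothetical rubric criteria, each scored on the same numeric scale
# (e.g. 1 = bad, 2 = okay, 3 = good).
CRITERIA = ["impact", "feasibility", "clarity"]

def rubric_score(scores: dict) -> float:
    """Average per-criterion scores with equal weighting.

    Looking up each criterion explicitly (rather than averaging
    whatever keys are present) raises a KeyError if one is missing,
    so every application is judged on the same dimensions."""
    return sum(scores[c] for c in CRITERIA) / len(CRITERIA)

# e.g. rubric_score({"impact": 3, "feasibility": 2, "clarity": 3})
# averages to roughly 2.67 on a 1-3 scale
```

With a fixed criteria list and equal weights, two reviewers who disagree can trace the disagreement to a specific criterion instead of arguing over gut feel.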

5. Educate your reviewers

One of the best ways to prevent biases in general is through education. Most of the biases discussed in this article are unconscious, and different reviewers may hold different biases.

Making your reviewers aware of unconscious biases through training and mentorship can help them detect and mitigate any adverse effects.

Conclusion

It’s important to be aware of these biases when making decisions, especially those with financial or life-changing implications for the people on the other end of the decisions.

DecisionHub has been working to remove bias from the application review process for 2.5 years. Our software is designed with features baked in that target the biases mentioned above. Here are a few links you may find interesting:

  • All you need to know about DecisionHub can be found here
  • DecisionHub’s features and services are explained here
  • Our website can be found here
  • If you want to book a 15–30 minute consultation meeting, book here

If you learned something new from this article, please drop us a comment and some claps below!
