Diversity & Inclusion Guide

Reducing Bias in Hiring Processes

A practical, evidence-based guide to reducing unconscious bias in hiring with structured interviews, diverse panels, and skills-first selection.

March 12, 2024 • 7 min read

Unconscious bias creeps into recruitment at four choke-points: sourcing, screening, interviewing, and decision-making. Below is a concise, research-backed playbook you can implement end-to-end—complete with plain-English definitions, sample practices, and legal notes for UK employers.

1. Write Inclusive Job Ads

Why it matters

Seemingly neutral words can change who applies. Classic experiments show that "masculine-coded" language (e.g., driven, assertive, dominant) reduces women's sense of belonging and interest; neutralising wording widens the pool without lowering the bar.

What to do

  • Strip out gender-coded terms and replace with precise, skills-based criteria (use a checker like Gender Decoder or similar).
  • Focus on capabilities required to do the job (skills-first). LinkedIn and SHRM report skills-first practices expand and diversify talent pools.

2. Anonymise Early Screening

What "anonymised screening" is

Removing personally identifying info (name, address, school, photo) before initial review so you judge fit on skills and evidence, not proxies.

What the evidence says

Name signals can drive large callback gaps: in classic U.S. field experiments, résumés with "white-sounding" names received roughly 50% more callbacks than otherwise identical résumés. Blind processes have improved fairness in other settings too (e.g., "blind" orchestra auditions increased women's success rates).

Field trials in Europe are mixed: many show higher interview rates for disadvantaged groups; some find blinding can also remove opportunities for positive action at early stages. Treat as one tool among several, not a silver bullet.

How to implement well

  • Blind CVs and work samples at first sift; reveal identities only once candidates reach structured assessments.
  • Combine with structured, skills-based testing to maintain signal while reducing noise.
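As a rough illustration of the first bullet, blinding can be done in software before reviewers ever see a CV. The sketch below assumes hypothetical field names (`name`, `email`, `address`, `school`); a real applicant-tracking export will differ, and free-text redaction here only masks the known identifiers, not every proxy for background.

```python
import re
from dataclasses import dataclass

@dataclass
class Application:
    """Hypothetical shape of an ATS export record."""
    name: str
    email: str
    address: str
    school: str
    cv_text: str

def blind(app: Application, candidate_id: str) -> dict:
    """Return a review copy with direct identifiers removed.

    The address and school fields are simply not copied over; the
    candidate's own name and email are masked inside the CV body.
    """
    text = app.cv_text
    for token in filter(None, [app.name, app.email]):
        text = re.sub(re.escape(token), "[REDACTED]", text, flags=re.IGNORECASE)
    return {
        "candidate_id": candidate_id,  # stable pseudonym, so identities can be re-attached later
        "cv_text": text,
    }
```

Keeping a stable pseudonym (rather than deleting identity outright) lets you reveal candidates once they reach the structured-assessment stage, as recommended above.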

3. Structured Interviews with Scoring Rubrics

Plain-English definition

A structured interview uses the same questions, in the same order, scored with a pre-defined rubric (often "BARS": behaviourally-anchored rating scales that spell out what a great / good / not-so-good answer looks like).

Why it works

Decades of meta-analyses show structured methods predict performance better and reduce subjective bias compared with unstructured chats.

How to do it (step-by-step)

  • Define success from the job description (JD): 4–6 core competencies (e.g., stakeholder management, problem-solving).
  • Write questions (situational/behavioural) that elicit evidence for each competency. Google's re:Work has clear templates.
  • Build the rubric (BARS): for each question, pre-agree "5 = outstanding … 1 = insufficient," with concrete behavioural anchors.
  • Train interviewers and enforce independent scoring before discussion to avoid anchoring to the most senior voice.
  • Calibrate after the round (compare score distributions, discuss discrepancies).
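The last two steps (independent scoring, then calibration) can be sketched in a few lines. This is a minimal illustration with invented candidate and rater names, assuming each interviewer records 1–5 rubric scores per competency before any discussion; it averages scores per candidate and summarises each rater's distribution so a calibration meeting can spot systematically harsh or lenient scorers.

```python
from statistics import mean, stdev

# Hypothetical data: independent 1-5 rubric scores, recorded before discussion.
scores = {
    "candidate_A": {"rater_1": {"problem_solving": 4, "stakeholders": 5},
                    "rater_2": {"problem_solving": 3, "stakeholders": 4}},
    "candidate_B": {"rater_1": {"problem_solving": 2, "stakeholders": 3},
                    "rater_2": {"problem_solving": 4, "stakeholders": 3}},
}

def candidate_totals(scores):
    """Average each candidate's competency scores across raters."""
    totals = {}
    for cand, raters in scores.items():
        comps = {}
        for rating in raters.values():
            for comp, val in rating.items():
                comps.setdefault(comp, []).append(val)
        totals[cand] = {c: mean(v) for c, v in comps.items()}
    return totals

def rater_distributions(scores):
    """Per-rater mean and spread, for the post-round calibration discussion."""
    by_rater = {}
    for raters in scores.values():
        for rater, rating in raters.items():
            by_rater.setdefault(rater, []).extend(rating.values())
    return {r: (mean(v), stdev(v) if len(v) > 1 else 0.0)
            for r, v in by_rater.items()}
```

A large gap between rater means is a prompt for discussion against the rubric anchors, not an automatic correction.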

4. Diverse Panels & Diverse Slates

Panels

Having more than one trained interviewer reduces single-rater subjectivity; many UK public bodies now expect diversity on panels.

Slates

When shortlists include only one woman or one candidate from an under-represented group (URG), the odds of hiring them are near zero; ensure at least two under-represented finalists where possible to normalise the choice.

Run the meeting well

Independent scoring first, then discussion; record rationales against the rubric (not impressions).

5. Skills-First Selection

What it is

Prioritising demonstrable skills, evidence and outcomes over proxies (schools, tenure, last job title).

Why it helps

Skills-first expands reachable talent (LinkedIn estimates up to 10× in some roles) and can reduce demographic skews created by degree filters. Use structured assessments to keep it fair.

Tools to support this

Skills taxonomies / frameworks (e.g., WEF) and structured question banks make apples-to-apples comparisons easier.

Quick Templates You Can Copy

Rubric snippet (Problem-solving)

  • 5 – proactively frames the problem, weighs trade-offs, quantifies impact, pilots a low-risk test
  • 3 – identifies options but with limited data
  • 1 – jumps to a solution with no diagnosis

Structured question

"Tell us about a time you inherited an ambiguous project. How did you clarify scope, decide priorities, and measure impact?"

TL;DR Checklist

  • Inclusive, skills-first JD
  • Blind first sift (where feasible) + structured skills tests
  • Structured interviews with BARS
  • Diverse, trained panel; independent scoring; calibration
  • Reasonable adjustments offered and recorded
  • Decisions documented against rubric evidence
  • Data practices aligned to ICO guidance