As humans, we’re biologically programmed to like people who are most like us. It’s an unconscious shortcut our brains make.
And it’s exactly the same in the hiring process. All too often, organisations unconsciously make hiring decisions that favour the person who fits in best with their existing environment.
This is known as cultural fit. Over time, hiring this way makes teams increasingly homogeneous, harming representation, inclusion, and belonging.
Relying on cultural fit as a hiring benchmark means your organisation can’t reach its full potential. Here’s why.
What is cultural fit?
Cultural fit is a term that’s often used in the context of the recruitment process. It’s a way of evaluating the cultural impact a candidate’s personality and values will have on an organisation, as well as their commitment to it.
“Cultural fit has never been a clearly defined term,” explains Ansa Mahmood, Research Specialist at Develop Diverse. “When organisations use it, they often mean that they’re looking for someone who can adapt to the culture within their company. And at first glance, that seems like a normal hiring criterion — everyone wants someone who can ‘fit in’ seamlessly with their team.
“But what a lot of organisations don’t realise is that cultural fit risks unconsciously communicating that they’re looking for someone who looks and behaves like them.”
The risks of hiring for cultural fit are well-studied. Organisations that hire for cultural fit often have lower levels of representation from marginalised or underrepresented groups. This, in turn, is linked to lower innovation, lower job satisfaction, and ultimately weaker business performance.
But over the long term, cultural fit can also harm your organisation’s inclusion and belonging.
“Because culture fit is so poorly defined, it can mean a lot of different things to different individuals,” says Ansa. “It could communicate to an ethnically marginalised candidate that they need to look like you to fit in — or it could communicate that it’s okay to look different as long as they hold the same world view or behaviours.”
Cultural fit often places a higher value on a candidate’s personality or individual characteristics, rather than their skills and potential at your organisation. And because everyone interprets it differently, it’s an unreliable way to evaluate candidates.
Because really, cultural fit is a form of bias.
“Cultural fit is affinity bias,” says Ansa. “Affinity bias is how we unconsciously make favourable choices towards people who are most like us. This can include people who look like us, those who come from the same cultural and social background, or people who share our interests.
“In the recruiting context, we often unconsciously imagine a specific kind of person in a role — that’s affinity bias in action.”
Affinity bias can show up in a few different ways:
- Personal qualities: How someone behaves, their individual personality, what they value, and their interests.
- Individual characteristics: How someone looks or speaks.
- Education and background: Attending the same university or studying a similar course can lead to affinity bias.
- In-group cultural references: Your Friday bar, the Christmas party, and the foosball table can all communicate exclusion to different groups of people.
“Increasing your value and competitiveness as an organisation depends on hiring the best people,” says Ansa. “But you can’t hire the best people if you’re evaluating them based on a biased process. And if you aren’t creating an inclusive culture, then you can’t retain your best people either. This is why you have to be aware of how your biases interact with your recruitment processes.”
We all have biases, and ultimately, eliminating affinity bias in your recruitment process requires awareness and exposure — as well as a willingness to commit to long-term behaviour change.
In the Develop Diverse platform, we help organisations identify unconscious bias in their written communications with our inclusive communication tool. When you import text, our platform uses a blend of natural language processing and machine learning to instantly identify instances where biased language has been used, helping you make small tweaks that add up to big changes over time.
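The platform’s actual models aren’t public, but the simplest form of this idea — scanning text against a list of known biased terms and suggesting alternatives — can be sketched in a few lines. The terms and suggested replacements below are invented examples for illustration, not Develop Diverse’s wordlist:

```python
# Toy sketch: flag potentially biased wording with a simple wordlist.
# The terms and suggested alternatives below are illustrative examples
# only, not the Develop Diverse platform's actual dictionary or method.
BIASED_TERMS = {
    "culture fit": "values alignment",
    "rockstar": "skilled specialist",
    "ninja": "expert",
}

def flag_biased_language(text: str) -> list[tuple[str, str]]:
    """Return (flagged term, suggested alternative) pairs found in the text."""
    lowered = text.lower()
    return [(term, alt) for term, alt in BIASED_TERMS.items() if term in lowered]

# Example: a job-ad sentence containing two flagged terms.
print(flag_biased_language("We want a rockstar who is a great culture fit."))
```

A real tool goes far beyond exact matching — context, grammar, and subtler patterns all matter, which is where natural language processing earns its keep — but the sketch shows the basic flag-and-suggest loop.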
Book a free demo today to learn more.