As you may have realised from my blogs, I’m passionate about making hiring an equitable process that offers a level playing field to all candidates. As next week is Transgender Awareness Week, I thought now would be a good time to discuss some of the biases preventing marginalised groups from getting a fair chance at employment.
I recently came across an article shared by a connection on LinkedIn titled, ‘Here’s why you didn’t get that job. Your name.’ The article references a study from the University of Toronto where researchers distributed nearly 13,000 fake resumes to over 3,000 job listings. The study found that Chinese, Indian or Pakistani-sounding candidates were 28% less likely to be offered an interview than fictional candidates with English names, even with identical qualifications.
These results chime with other studies, such as one from the French government last year which found employers were less likely to interview candidates with North African-sounding names, and a 2012 UK parliamentary study which found that women who ‘whitened’ their names had to send half as many applications to get an interview as those with ‘foreign’ names. Clearly, change is needed. But what’s the reason for these biases?
Why do biases exist?
The incentives for companies to increase diversity are huge: McKinsey research shows ethnically diverse companies are 35% more likely to outperform their non-diverse counterparts, and the American Sociological Association has found that every 1% rise in racial diversity brings a 9% rise in sales revenue. Yet companies are still not drawing on the 13% of the UK population from BAME backgrounds. Furthermore, a shocking new report from Crossland Employment Solicitors showed that one in three UK employers admitted they are ‘less likely’ to hire a transgender person, and nearly half were unsure whether they would recruit a transgender person at all.
There are a number of biases causing this. One of these is homophily, the tendency for people to trust those similar to themselves. This is so ingrained that it has even seeped into our ‘sophisticated’ artificial intelligence bots. With AI algorithms trained on older company data, and programmed by historically homogenous workforces, there’s every chance that algorithms pick up on human biases. In fact, a recent study of facial recognition software from Microsoft, IBM and Face++ reflected this. The systems were shown 1,000 faces and told to identify each as male or female. All three did well at discerning white faces, and men’s faces in particular. For darker-skinned women, however, error rates climbed to 34%.
So, if biases are influencing how we hire, how can platforms like AnyGood? help? A 2018 Stanford study of Airbnb may point to the answer. For the study, researchers created two groups of profiles: Group One, demographically similar to the participants, and Group Two, demographically different but with better reputations. Participants were then asked to rate the profiles on trust. The study resoundingly showed that participants were more trusting of users with characteristics different from their own but with better reputations.
The evidence, therefore, shows that reputation can offset social bias. Dr Bruno Abrahao, who led the study, stated: ‘The fundamental question we wanted to answer was whether technology can be used to influence people’s perception of trust.’ He concluded that platforms such as Airbnb can ‘engineer tools that influence how people perceive trust and make markets fairer to users from underrepresented minorities.’
How AnyGood? can help
Just as Airbnb is a community that relies on recommendations and holds individuals accountable for their reputations, so is AnyGood?
Our network consists of members known for their expertise. These members select candidates they believe are a perfect fit for roles and are rated by clients on the quality of their recommendations. Putting forward good candidates earns trust. From this, more diverse workforces can be built through members using their reputation to give candidates from underrepresented groups a better chance of a bias-free hiring process.
However, this alone is not enough. Members may have their own biases, which is why we’ve focused our efforts on curating as diverse a network as possible from day one. This not only helps candidates from underrepresented groups into roles, but also helps clients tap into previously inaccessible pools of talent, and enjoy the economic benefits of doing so.
As a company that believes that reputation matters, and that there’s strength in community, we’re dedicated to building a system where hiring is based purely on the candidate being the best fit for the job, and to removing the barriers that prevent organisations from finding this talent.
Interested in becoming a member of AnyGood? Sign up today