One reason for the widespread interest in artificial intelligence (AI) is that it has the potential to reduce the degree of bias underpinning human decisions. For example, meta-analytic studies have long highlighted the pervasive nature of bias in hiring and recruitment.

Even in the rich and liberal world, there are many biases at play in the workplace (for example, sexism, racism and ageism) which account for the unmeritocratic or unfair advantage that some groups have over others, irrespective of their actual talent or potential.

One of the most prominent yet rarely discussed or acknowledged biases is the beauty bias, also known as “lookism”. The existence of a beauty premium in the labour market is well-documented.

As an academic review summarised: “Physically attractive individuals are more likely to be interviewed for jobs and hired, they are more likely to advance rapidly in their careers through frequent promotions and they earn higher wages than unattractive individuals”. 

Common manifestations of appearance-based discrimination may include bias against candidates who are obese, oddly dressed or tattooed, or against anyone who doesn’t fit a society’s dominant aesthetic criteria.

Broadly speaking, the beauty bias concerns the favourable treatment that individuals receive when they are deemed more attractive, whether this happens consciously or unconsciously. Of course, few individuals, let alone employers, admit to preferring to work with others on the basis of their attractiveness. 

Naturally, there are some exceptions. For instance, “good looks” are an official requirement to join the Chinese Navy, apparently so that recruits can represent the nation with the best image.

Here’s the good news: identifying this bias is surprisingly simple. This means that any employer interested in eliminating handicaps against less attractive people should be able to detect this bias. 

Now the bad news: you are unlikely to achieve this unless you replace human intuition with data. This is where AI can potentially help, if approached responsibly. So how can we tackle the attractiveness bias?

First, you can measure attractiveness, which is typically a function of consensual ratings of physical appearance. Imagine you ask 10 people to rate 100 people on physical appearance or attractiveness. Although attractiveness is not objective, which is why there are always disagreements between people rating the same person, it is also not entirely subjective. Most people will tend to agree on whether someone is more or less attractive, for instance when using a 1-10 point scale, even when they don’t belong to the same culture.
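
As a rough illustration, here is a minimal sketch in Python of how such consensual scores and a simple check of rater agreement could be computed. The ratings are simulated, so every number and variable name is an assumption rather than real data.

```python
import numpy as np

# Simulated ratings matrix: 10 raters (rows) x 100 candidates (columns),
# each rating on a 1-10 attractiveness scale. In practice these would come
# from a survey; here they are generated purely for illustration.
rng = np.random.default_rng(42)
latent_attractiveness = rng.uniform(1, 10, size=100)   # hypothetical "consensus"
noise = rng.normal(0, 1.5, size=(10, 100))             # rater disagreement
ratings = np.clip(latent_attractiveness + noise, 1, 10)

# Consensual score: each candidate's mean rating across the 10 raters.
consensus_score = ratings.mean(axis=0)

# A simple check of agreement: the average pairwise correlation between raters.
# A clearly positive value means raters broadly agree on who is rated higher.
rater_corr = np.corrcoef(ratings)                      # 10 x 10 matrix
pairwise = rater_corr[np.triu_indices_from(rater_corr, k=1)]
print(f"Mean inter-rater correlation: {pairwise.mean():.2f}")
```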

Next, you can correlate this score with a range of success indicators, from interview ratings to job performance ratings and promotion or salary data. Given that attractiveness is rarely a formal criterion for picking one person over another (except, of course, in the dating world), there are obvious reasons for evaluating whether and why people’s attractiveness scores correlate with any objective indicator of career success. 
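
A minimal sketch of that correlation step, again with simulated data and assumed column names, might look like this:

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical dataset linking each candidate's consensual attractiveness score
# (such as the mean rating from the previous sketch) to success indicators.
# All values are simulated, with a beauty premium deliberately built in.
rng = np.random.default_rng(0)
n = 100
attractiveness = rng.uniform(1, 10, n)
df = pd.DataFrame({
    "attractiveness": attractiveness,
    "interview_rating": 2.0 + 0.15 * attractiveness + rng.normal(0, 0.6, n),
    "performance_rating": rng.normal(3.2, 0.9, n),
    "salary": 45000 + 2000 * attractiveness + rng.normal(0, 8000, n),
})

# Correlate attractiveness with each success indicator. Sizeable positive
# correlations where looks are not a formal job requirement are a red flag.
for outcome in ["interview_rating", "performance_rating", "salary"]:
    r, p = stats.pearsonr(df["attractiveness"], df[outcome])
    print(f"{outcome}: r = {r:.2f} (p = {p:.3f})")
```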

Here’s where AI can help: as a diagnostic tool to predict someone’s likelihood of being deemed more effective in the business based on their perceived attractiveness. A significant body of research suggests that a person’s attractiveness level is far more predictive of a range of success outcomes than one would hope if we want to live in a fair and unbiased world.
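
One way to build such a diagnostic, sketched below with simulated data (the labels and coefficients are assumptions, not a real organisation’s records), is to test how well perceived attractiveness alone predicts who gets deemed effective. A result close to chance (an AUC near 0.5) would suggest little bias; anything well above it warrants scrutiny.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical audit: how well does perceived attractiveness alone predict
# being deemed "effective"? All data are simulated; in a real audit the labels
# would come from the organisation's own performance or promotion records.
rng = np.random.default_rng(1)
n = 1000
attractiveness = rng.uniform(1, 10, n).reshape(-1, 1)
# Simulate biased labels: more attractive people are more often deemed effective.
p_effective = 1 / (1 + np.exp(-(0.4 * attractiveness[:, 0] - 2.5)))
deemed_effective = rng.binomial(1, p_effective)

model = LogisticRegression().fit(attractiveness, deemed_effective)
auc = roc_auc_score(deemed_effective, model.predict_proba(attractiveness)[:, 1])
print(f"AUC from attractiveness alone: {auc:.2f}")   # ~0.5 would suggest no bias
```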

So, what does the science actually tell us?

Studies show that less attractive individuals are more likely to get fired, although they are also less likely to be hired in the first place. For example, in an experimental study, researchers sent 11,000 CVs to various job openings, including identical CVs accompanied by candidate photographs of different levels of attractiveness. 

Attractive women and men were much more likely to get a callback for an interview than unattractive (or no-photograph) candidates were.

At times it is hard to determine whether appearance should be treated as a bias factor or a job-relevant trait, especially when employees’ performance depends on the perceptions customers or clients have of them. 

As a Glassdoor report noted: “There are many industries and businesses that would suffer immeasurably if we were to legislate our beauty bias.” In support of this idea, evolutionary scientists report positive correlations between attractiveness ratings, on the one hand, and scores on socially desirable personality traits such as emotional stability, extraversion and ambition, on the other.

For example, physical attractiveness – just like psychological attractiveness (likeability) – contributes to better sales and fundraising potential, so is it sensible to stop employers from hiring more attractive salespeople or fundraisers?

Perhaps, because the alternative is to discriminate against less attractive individuals, which will include people from minority groups who don’t fit the dominant “beauty norms”. But when employers simply pretend to ignore attractiveness, focusing on candidates’ past performance or interview performance and interpreting these data as objective or bias-free, there is no guarantee that less attractive candidates won’t be handicapped. It is no different from pretending to ignore race or social class while selecting on academic credentials, which are themselves confounded with race and social class.

Clearly, there’s an unfair advantage to being deemed more attractive and a corresponding unfair handicap to being deemed less attractive. Although employers can mitigate this bias by eliminating appearance data from their hiring practices, not only by using AI but also by focusing on science-based assessments, past performance and CV data, such measures will not be sufficient to eliminate bias. 

This is because these data are also influenced by historical bias: if attractive people were evaluated more favourably in the past, they will show up as high performers on their CVs and in their performance records. Still, that is no reason to avoid the issue or perpetuate the beauty bias at work.

Importantly, AI can be a powerful tool to detect and expose the degree of bias underlying human ratings of potential and performance. If programmed correctly, AI could become an objective way to measure what we don’t always see ourselves. For example, if you’re trying to lose weight, a scale can help keep you honest. If you’re trying to exercise more, a fitness tracker can help monitor your progress. Given the right inputs, AI can help us overcome our conscious and unconscious biases in hiring.

* This article first appeared in Harvard Business Review.
