
This article was originally published in ERE SourceCon

I was watching a rerun the other day (OK, it was The Office!), and Dwight was quizzing Ryan on those old-school brain teasers (e.g., a man builds a house with all four sides facing south; a bear walks past the house; what color is the bear?). If you watch The Office, you won't be surprised that Ryan knew all the answers, and that Dwight was, of course, furious.

And if you were ever a kid, you likely know the answers to most of these types of teasers, but one jarred my memory … and ticked me off a little.

A father and his son are in a car accident. The father dies instantly, and the son is taken to the nearest hospital. The doctor comes in and exclaims, “I can’t operate on this boy.”

“Why not?” the nurse asks.

“Because he’s my son,” the doctor responds. How is this possible?

How is this possible??!! Well, the doctor is in fact his mother – a woman.

Why is this even a brain teaser? Today, it’s shocking that people wouldn’t immediately guess this (I’m hoping they do, and if this one ever stumped me as a child, I’m completely embarrassed). But, the truth is, people tend to associate doctors with men, despite the fact that 35% of physicians in the U.S. are women (and that percentage is growing). That this brain teaser even exists is one clear example of how bias is ingrained deep within us, particularly when it comes to gender and diversity.


Can We Eliminate Unconscious Bias?

Every day, we make tons of decisions, assumptions and judgments about people without even realizing it. Based on our own unique world view, our unconscious bias fills in the blanks of what we don’t know about others. This kind of bias isn’t hateful or wrong; we’re often not even aware of it. But it is difficult to rewire.

I recently dove into this topic with Shon Burton, the co-founder and CEO of HiringSolved, a company that applies AI-based technology to automate the process of matching candidates to jobs. HiringSolved helps companies solve a lot of problems, but one of its most common use cases is creating more diverse pipelines by removing human bias from the sourcing process.

Removing bias isn’t a new concept, either.

Burton brought up the example of the Boston Symphony Orchestra. Long story short: In 1970, women made up only 6% of the musicians in the five highest-ranked orchestras in the nation. To try to overcome gender-biased hiring, a majority of symphony orchestras revised their traditional hiring practices, opening auditions to a wider range of candidates instead of hiring only musicians handpicked by the conductor. Some also adopted “blind” auditions, where musicians played behind a curtain, concealing their gender. After these practices were implemented, the share of female musicians in the top five orchestras rose to 21% by 1993. “This works well because you can actually put musicians behind a curtain and judge purely on their talent,” says Burton.

I’m going to go out on a limb and say next year’s recruiting trend won’t be interviewing people from behind a curtain. But these orchestras were able to level the playing field, ensuring every musician who auditioned had an equal chance based on the quality of their playing.


Are We Introducing Bias by Removing Unconscious Bias?

The one caveat to the story above is this: To address their gender imbalance, the orchestras eliminated the unconscious bias that was limiting their exposure to women.

And that’s where the comparison deviates from how many organizations currently approach diversity hiring initiatives.

As Burton puts it: “When we say we want to hire more women for a role and we actively seek them out, that is inherently a conscious bias. We’re actually handpicking and pinpointing them for a specific job, and most people don’t seem bothered by that. But if we said we wanted to hire more white men for the same role, it would be really distasteful.”

Hold off on throwing your tomatoes.

Burton isn’t arguing that white men are discriminated against. Think about it: Are diversity hiring initiatives leveling the playing field, or are they targeting specific groups of people over others — an inherently biased thing to do? Maybe they’re doing both. But the question we need to ask as a society — and as employers — is whether that’s really the best thing for companies and the people they hire. 

“It’s not to say we shouldn’t strive for change: we should be optimizing hiring to understand and potentially remove biases — ones we’re aware of and ones we aren’t,” Burton says. “And it’s certainly well within the capability of technology to remove bias all the way through someone’s first day of hire. But the question is: Does that make sense? Is it going to help you hire the best people who work well on your current team? And is it going to strip out the value of human intuition and cognition?”


The Tech Factor: Man With Machine

I recently read this sentence in a Science Magazine article and I think it sums this issue up pretty well: “AI (artificial intelligence) is an extension of our existing culture.”

Burton agrees, and it’s an argument we stand by at SmashFly. That is: Technology is the product of humans. And AI is simply technology that’s been created and trained by humans, so that it can go off and analyze data in a way — and at a rate — that humans aren’t capable of. Yes, it can do so without some of the bias humans show. But, at the end of the day, that technology is still influenced by humans. If those humans start with bias, the technology will reflect it.

“It’s tricky with technology because AI learns from our choices to some degree,” Burton says. “If you have a lot of bias in your recruiting and hiring processes, and you push that data through a neural network or AI-based system, then you’re using data that’s inherently biased. As a result, you’ll simply be training AI to be biased.”
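To make that concrete, here is a minimal sketch of the failure mode Burton describes. It uses toy, invented data and scikit-learn; it is not HiringSolved's system or any vendor's pipeline. A classifier trained on historically skewed hiring decisions simply learns the skew:

```python
# Toy illustration: a model trained on biased hiring decisions learns the bias.
# All data and column meanings here are invented for the example.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

skill = rng.normal(size=n)              # the signal we actually care about
group = rng.integers(0, 2, size=n)      # a protected attribute (toy encoding)

# Historical decisions: mostly skill, plus an unfair bonus for group 1.
hired = (skill + 0.8 * group + rng.normal(scale=0.5, size=n)) > 0.5

X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# At identical skill, the trained model still predicts different hire
# probabilities for the two groups, i.e. it has reproduced the old bias.
same_skill = np.array([[0.0, 0], [0.0, 1]])
print(model.predict_proba(same_skill)[:, 1])
```

In other words, the cure isn't more model training; it's paying attention to the data you feed the system, which is exactly Burton's point.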

To illustrate this, Burton offers an analogy for our relationship with AI: “People can do a lot of good with a hammer. But they can also wreak havoc and destroy. It’s the same thing with technology, specifically in sourcing and AI. It’s up to us how we use it.”


Increasing Accuracy at the Top of the Funnel

So how can we leverage AI and other technologies for good — removing bias in the earliest stages of the screening process to ensure our companies have the greatest access to the most diverse talent pools?

It starts with applying these technologies in the right ways, at the right time.

Let’s look at the top of the recruiting funnel. While AI can help throughout the hiring process — like reference checking or predicting employee turnover — it has clear value in sourcing and vetting candidates to create an equal playing field.

“The beauty of this technology is that it can pick up on social signals and other data that’s often missed or misconstrued on resumes,” Burton says. “This allows us to use machines to populate the top of the funnel with the right people. And because this process is inherently unbiased, you’re more likely to end up with a more diverse group of candidates who may not have made it through the screening process before.”

Case in point: Say a talent acquisition team gets an initiative from the C-suite that they need to hire 2,000 people in ‘X’ role by ‘X’ date, with a focus on hiring women. Technologies like HiringSolved and SmashFly could work together to make intelligent guesses about which candidates might be women, stack rank them into a pipeline, and enable your team to filter those candidates based on skills, experience, and other criteria. This won’t eliminate profiles or resumes of qualified people, but it can bring more women into the top of the funnel with higher accuracy and without bias or compliance issues.
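As a rough illustration of the stack-ranking step (not the HiringSolved or SmashFly API; the field names, weights, and data below are invented), the ranking itself can be driven entirely by role requirements:

```python
# Hypothetical sketch of stack-ranking candidates on role requirements only.
# Field names, weights, and data are invented; this is not any vendor's API.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    skills: set
    years_experience: float

def match_score(c: Candidate, required_skills: set, min_years: float) -> float:
    """Score based on skills overlap and experience only, no demographic inputs."""
    skill_overlap = len(c.skills & required_skills) / max(len(required_skills), 1)
    experience = min(c.years_experience / min_years, 1.0) if min_years else 1.0
    return 0.7 * skill_overlap + 0.3 * experience

def stack_rank(candidates, required_skills, min_years):
    """Return candidates ordered best match first."""
    return sorted(
        candidates,
        key=lambda c: match_score(c, required_skills, min_years),
        reverse=True,
    )

pipeline = stack_rank(
    [Candidate("A", {"python", "sql"}, 4), Candidate("B", {"sql"}, 7)],
    required_skills={"python", "sql", "spark"},
    min_years=3,
)
print([c.name for c in pipeline])  # best match first
```

In a sketch like this, the match score itself never sees gender; any targeting of a specific group would be a separate, explicit step your team can see and audit, which is one way to keep the ranking portion of the pipeline unbiased.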

Now, it’s important to point out that this process isn’t consciously removing qualified men from the role. Instead, it’s giving more qualified women — or anyone else, for that matter — a chance to be seen when bias might have eliminated them from contention before.

As Burton so eloquently puts it: “Technology enables transparency in a way that can’t easily be undone. Everything gets laid bare.” 


Bigger Than Buzz: What Are You Trying to Solve?

Unconscious bias is not going to end with the emergence of AI. AI will help you equalize candidates and get more intelligent about your top-of-funnel sourcing methods, but we can't expect technology to remove all of our inherent biases. They're in the channels we use, the way we talk about our employer brand, the employee advocates we target, the way we write job descriptions, the way we think about job roles and gender roles, and the way we seek to work with people who are like us. Removing them is a systemic change, one that technology can only get us a step closer to.

This is where Burton strongly believes that talent acquisition teams need more than just AI to drive the quality and diversity of their talent pipelines. While AI can improve the diversity of candidate pools, intelligent automation (IA) is the key to nurturing candidate relationships over time, helping you identify further patterns: what content and messaging candidates interact with, which channels they act on, which career site pages they stay on the longest, and which job titles convert more of a certain type of candidate audience. This type of behavioral data is integral to continuing to make objective, unbiased inferences about the talent you want to hire – and the talent you might not yet know you want to hire.
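To show what those behavioral signals might look like in practice, here is a tiny, hypothetical sketch of rolling event data up into a per-candidate engagement score. The event names and weights are invented for illustration, not SmashFly's actual model:

```python
# Hypothetical sketch: aggregate behavioral events into engagement scores.
# Signal names and weights are invented for illustration only.
from collections import defaultdict

SIGNAL_WEIGHTS = {
    "email_click": 1.0,
    "career_site_visit": 0.5,
    "long_page_dwell": 2.0,      # stayed on a career-site page a long time
    "job_view": 1.5,
    "application_start": 3.0,
}

def engagement_scores(events):
    """events: iterable of {'candidate_id': str, 'signal': str} dicts."""
    scores = defaultdict(float)
    for event in events:
        scores[event["candidate_id"]] += SIGNAL_WEIGHTS.get(event["signal"], 0.0)
    return dict(scores)

events = [
    {"candidate_id": "c1", "signal": "email_click"},
    {"candidate_id": "c1", "signal": "long_page_dwell"},
    {"candidate_id": "c2", "signal": "career_site_visit"},
]
print(engagement_scores(events))  # {'c1': 3.0, 'c2': 0.5}
```

Scores like these are only a starting point, but they illustrate how aggregated behavior, rather than a resume alone, can feed the kinds of inferences Burton describes below.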

“We believe social and behavioral signals — the things we put on the web every day, where we are, who we know, what we like, what we don’t know, what we interact with – are, in aggregate, more predictive of an ability to do a certain job than a resume,” Burton says. “That’s the real power of AI and analytics in recruiting. When you can paint this very complete picture of someone, it’s a far better predictor of whether they’re going to be the right fit for your needs.”
