Where Are the Bias-Busting Innovators?

Innovation is not immune to unconscious gender, ethnic, or other biases. With the dawn of AI, the need to make new products or processes inclusive is more urgent than ever

While pursuing her PhD at Western University, Sarah Saska came across research from the automotive industry revealing that car accidents are the leading cause of fetal death related to maternal trauma. Puzzled, she dug deeper to find out why. She discovered that automakers have long used male-bodied dummies in crash testing. As a result, manufacturers did not account for how seatbelt design would affect a pregnant person involved in a car crash.

As she pursued her research, Saska, now CEO and co-founder of the diversity, inclusion, and belonging consulting firm Feminuity, was struck by the hidden power of unconscious biases to shape innovation in unintended ways. Unconscious bias is a mix of attitudes, stereotypes, and cultural norms that we hold about different types of people and situations. Unconscious biases allow us to make quick, automatic decisions about our environment based on our background, cultural affiliations, and experiences.

Saska says these biases lurk in the design of many products. Studies have shown, for example, that voice recognition software has difficulty recognizing female voices. Saska says this is because the R&D teams testing the software tend to be dominated by men. Google's translation app also has a gender bias: input a block of text with gender-ambiguous pronouns into Google Translate and it will default to male pronouns in the translation.

Saska says the field of innovation tends to self-identify as gender-blind and gender-neutral. “Not cool,” she told delegates at the Social Innovation Bootcamp organized by the Centre for Social Impact of Smith School of Business. “This doesn’t work.”

Out of Sight, Out of Mind

The lack of acknowledgement of gender in innovation – itself an unconscious bias – means many new products and processes are highly gendered and typically designed with men as the default. The result is products or processes that are potentially dangerous or demeaning toward women. “In innovation, we’ve been paying more attention to the innovation processes and systems than to the human side of the actual process,” she says. “When people are not visible in the discourse, gender easily becomes invisible.”

Saska found that unconscious bias in innovation does not relate only to gender. Racial bias is also present in many innovations. Studies of facial recognition software, for example, have shown that some systems struggle to identify a range of skin tones, facial features, and aesthetics. Snapchat caused a commotion when it created a filter that allowed users to turn their selfies into Asian caricatures, prompting accusations that the feature was an example of “yellowface.” Just a few months later, the firm released a Bob Marley filter, which turned users’ selfies into the equivalent of “digital blackface.”

“Understanding our unconscious biases is more important than ever before,” Saska says. Organizations are made up of individuals each carrying their own biases. These biases are manifested in an organization’s structures, systems, and processes. “Even the smallest expression of bias can result in the exclusion of individuals, groups of people, information, knowledge, product ideas, and can affect businesses from a 360-degree perspective.”


The issue of hidden biases in innovation has greater urgency these days because of the dawn of artificial intelligence. In a recent essay in The Financial Times, Kriti Sharma, vice-president of artificial intelligence at Sage, pleaded with AI technologists to “prevent the algorithms and machines we create from replicating the human mistakes and casual prejudices the world has come to accept as normal.”

Sharma argues that AI technology is fraught with biases that can have harmful consequences on people’s lives, especially as it is beginning to be applied in professions that have traditionally relied on human judgment. Algorithms, for example, are now being used in criminal justice: Palantir, a security company, provided the New Orleans Police Department with technology designed to predict criminal activity that was found to contain racial bias.

Why should business leaders care about biases? Saska says that companies that understand their unconscious biases have been shown to solve problems more effectively, better understand the needs of a diverse customer base, and be more innovative.

As demographics change in unprecedented ways, the world is becoming more complex than ever before. As a result, Saska challenges business leaders to show greater care for different people and perspectives, and to ensure their innovations are designed with empathy for others. “We as leaders need to do better,” she says. “Good intentions are not enough in this world anymore.”

Michèle Whitcombe


Take Action

Are you interested in eradicating unconscious biases from your innovations and organization? Sarah Saska recommends these resources:
