Jacob – you have been involved in BCG’s work with UN Women on the Gender Diversity Roundtable for a couple of years. Can you firstly tell us what that is all about?
“BCG and UN Women established the “Gender Diversity Roundtable” in 2016 to put more focus on gender diversity and accelerate its advancement in Denmark. We are unfortunately still not where we want to be. There is progress, but we are falling behind as our peers outpace us.
Since 2016 a lot has happened – we have held regular meetings with the roundtable members, who are leaders from the Danish private sector, educational institutions, and civil society organizations; published a number of reports on the state of gender diversity in Denmark; and created a manifesto with “Three Recommendations for Better Gender Balance” in Denmark.”
The newest report focuses on AI from a gender perspective – why is this important?
“AI is on the rise, and Denmark is uniquely positioned to leverage the opportunities it brings. However, we know from research that if AI is not managed correctly, it may pose a threat to diversity by reinforcing existing biases in society – and, as such, hinder value creation.
On the other hand, AI may be a tool to strengthen diversity, and thereby value creation. It is all a matter of implementing the right tools and strategies, and with this report we wanted to make companies and organizations aware of this as they start their journeys with AI.”
What are the main challenges related to diversity in AI?
“Our report highlights four challenges, which can be summed up as follows:
- Bias in data: Bias in people is reflected in both what data we collect and how we collect it
- Lack of diversity in the AI industry: Women make up a mere 10-16% of AI-focused staff at top tech giants
- Functional bias: Algorithm design can lead to a disconnect between the issues solved for and the desired outcome
- Ethics of discrimination: AI by nature enables discrimination on the basis of sensitive variables such as gender”
Can you give an example of how AI can hinder diversity?
“Amazon experienced an issue with AI-related bias some years ago when its recruitment tool began to discriminate against women for technical jobs. It screened resumes based on those of previously successful employees, the majority of whom were male. As a consequence, the algorithm penalized resumes containing the word “women” and favored language more frequently used by men. This ultimately led to the tool being scrapped.
When even tech giants struggle with diversity within AI, there can be no question that it is a key topic that companies generally need to resolve in order to fully leverage the potential of the technology.”
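To make the mechanism concrete, here is a minimal, hypothetical sketch in Python – not Amazon's actual system, and built on invented toy data – of how a resume screener trained on historically skewed hiring outcomes can end up assigning a negative weight to the word "women":

```python
# Illustrative sketch only: invented toy data, not any real company's system.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Past hiring outcomes in which most hired candidates were men, so gendered
# terms correlate with the label even though they say nothing about skill.
resumes = [
    "software engineer, captain of the men's chess club",    # hired
    "backend developer, led the men's robotics team",        # hired
    "data scientist, chair of the women's coding society",   # not hired
    "machine learning engineer, women's hackathon winner",   # not hired
]
hired = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# On data like this the learned weight for the token "women" comes out
# negative: simply mentioning "women" lowers the predicted hiring score.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print("weight for 'women':", round(weights["women"], 3))
print("weight for 'men':  ", round(weights["men"], 3))
```

The point is not the specific model but the pattern: the algorithm is never told about gender as such, yet it reconstructs the historical skew from proxy language in the training data.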
What are your recommendations for companies and organizations implementing AI solutions that want to ensure AI fosters – and does not hinder – diversity?
“In the report, we highlight four steps for building AI solutions, but the specific path forward naturally depends on the company, the industry, and the type of AI solution.
Overall, the key recommendation is that companies and organizations continuously evaluate ethics and diversity in their AI solutions, from strategy to implementation. If this is done, we believe that AI can be a force for increasing diversity, and thereby value creation, for Danish companies and organizations.
The four steps to developing a successful AI solution can be found below:
- Develop a strategy for AI: Study the potential and effect of AI use cases to understand value pools
- Ideate and prioritize use cases: Ensure the right number of cases, neither spreading the effort too thin nor focusing on a single case
- Build and deploy use cases: Ensure the right expertise to build optimal models and algorithms
- Transform the operating model: Structure an ecosystem to develop the people, skills, and processes”
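As one hypothetical illustration of what continuously evaluating ethics and diversity could look like once a solution is deployed, the sketch below compares a model's selection rates across gender groups. The data, function names, and the 0.2 tolerance are placeholders for illustration, not figures from the report:

```python
# Illustrative sketch: placeholder data and threshold, not from the report.
from collections import defaultdict

def selection_rates(predictions, genders):
    """Share of positive decisions (e.g., shortlisted candidates) per gender group."""
    totals, positives = defaultdict(int), defaultdict(int)
    for pred, gender in zip(predictions, genders):
        totals[gender] += 1
        positives[gender] += int(pred)
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(predictions, genders):
    """Largest difference in selection rate between any two groups."""
    rates = selection_rates(predictions, genders)
    return max(rates.values()) - min(rates.values())

# Hypothetical batch of model decisions and candidates' self-reported gender.
preds   = [1, 1, 0, 1, 0, 0, 1, 0]
genders = ["male", "male", "male", "male", "female", "female", "female", "female"]

print(selection_rates(preds, genders))   # {'male': 0.75, 'female': 0.25}
gap = demographic_parity_gap(preds, genders)
if gap > 0.2:  # placeholder tolerance; set per use case and legal context
    print(f"Selection-rate gap of {gap:.2f} across groups - review the model and data.")
```

Running a check like this on every model release, rather than once at launch, is one simple way to turn the recommendation of ongoing evaluation into a routine step in the operating model.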
Download the report:
“Gender Diversity in AI: Why companies and organizations need to consider bias and ethics in AI”