Slush

Algorithms are not impartial

The world's leading start-up convention, Slush, took place in Helsinki in late November. With the tagline "Nothing normal ever changed a thing", this year the fair sought to highlight the social impacts of technology.

I attended Slush with colleagues from Plan International and people from the SmartUp Factory innovation hubs in Ethiopia and Uganda.

The technology gender gap

The world's first computer algorithm, written for an early computing machine, was the work of a woman. Her name was Ada Lovelace, and she published it in 1843. At the time, a woman was a rarity in the world of science.

And despite her and many other women's contributions to the sector, the fact remains that, more than 200 years after Lovelace was born, a significant gender gap persists in the world of science and technology. This is a problem.

The barriers surrounding girls' and women's access to and use of digital tools and technologies are well-known. Less attention is given to the challenges facing women as creators of the same tools and technologies, and the impacts this might have.

Without girls' and women's perspectives, we risk creating tools and solutions that reproduce and perpetuate existing gender inequalities – and that fail to address the issues and challenges girls and women face.

This is not merely a hypothetical risk. Already, women looking for a job online are less likely than their male counterparts to see targeted ads for high-paying roles. Why? Because the algorithm was designed that way.

Artificial intelligence learning gender stereotypes

A 2016 study by the University of Virginia and the University of Washington found that artificial intelligence (AI) systems are more likely to label people who are cooking, shopping and cleaning as women, and people who are playing sports, coaching and shooting as men.

Does AI want to keep women by the stove? In an interview for The Guardian, Joanna Bryson, a computer scientist at the University of Bath, said: "A lot of people are saying this is showing that AI is prejudiced. No. This is showing we're prejudiced and that AI is learning it." 

And while humans have the potential to counteract learned biases, the algorithms AI is based on may be unable to do so, and may instead continue to reinforce and even amplify existing biases, including gender-stereotypical social norms.
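Bryson's point – that the prejudice lives in the data, and the model simply learns it – can be sketched with a toy frequency model. The corpus and function names below are entirely hypothetical, invented for illustration; real systems use far richer models, but the mechanism is the same:

```python
from collections import Counter

# A tiny, deliberately skewed "training corpus" (hypothetical data):
# the texts a model learns from carry the bias.
corpus = [
    "she was cooking dinner",
    "she was cooking lunch",
    "she went shopping today",
    "she was cleaning the house",
    "he was coaching the team",
    "he was playing sports",
    "he went shooting today",
    "he was cooking dinner",  # a lone counter-example, drowned out by the rest
]

def pronoun_counts(activity, texts):
    """Count which pronoun (the first word) co-occurs with an activity word."""
    counts = Counter()
    for text in texts:
        words = text.split()
        if activity in words:
            counts[words[0]] += 1
    return counts

def predict_pronoun(activity, texts):
    """A naive 'AI': pick the pronoun seen most often with the activity."""
    counts = pronoun_counts(activity, texts)
    return counts.most_common(1)[0][0]

print(predict_pronoun("cooking", corpus))   # "she" wins 2 to 1
print(predict_pronoun("coaching", corpus))  # "he"
```

Nothing in the code mentions gender roles; the skew comes entirely from the frequencies in the training text, which is exactly how a statistical model ends up labelling cooking as "female" and coaching as "male".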

In fact, they may be specifically designed to be that way: fighter robots are designed as 'male', while robots intended for the care and service industries are given 'female' characteristics.

And who makes these algorithms and AI? Predominantly white men. There are many implications of machines learning predominant biases and stereotypes, from labelling black defendants as more likely to reoffend than white defendants, to providing gender-biased translation across languages. 

The image below shows how Google Translate 'translates' the gender-neutral Finnish pronoun "hän" as either "he" or "she" depending on context.

[Image: Google Translate rendering the Finnish pronoun "hän" as "he" or "she"]

The marginalised and vulnerable are usually at the receiving end of prejudiced AI. Without more women – and people from diverse backgrounds in general – as creators of technology, we risk perpetuating existing inequalities.

What does this have to do with Slush?

At the convention, Plan International launched the keyboard app Sheboard, which harnesses the power of predictive text to boost girls' self-confidence and remind us to talk to and about girls in a more empowering way. We brought young women from Finland, Uganda and Ethiopia not only to experience Slush itself, but also to bring their own views and experiences to the convention.

The group also had the opportunity to meet established women in tech, including the impressive Rumman Chowdhury, who leads Accenture's work on ethics and AI.

Technology, algorithms and AI are, in themselves, neither good nor bad. It's how and what we use them for that matters. And by working to increase the number of female tech creators, we can improve the chances of machines working with – as opposed to against – us to achieve gender equality.