Why the Internet Must Become Feminist

Photo credit: Ehsan Kabir, Plan International


This text was originally published on Plan International’s website on Apr. 25, 2019 to mark International Girls in ICT Day.

“You’re simply the best”, “Hero!!!”, “I’m in awe of you”, “You Are a Great Leader!”

So read some of the thousands of comments on Greta Thunberg’s Twitter feed. Yet despite the 16-year-old climate change activist galvanising over 1.6 million people to act through her school strike for climate action, you don’t have to do a lot of digging online to find the backlash. There are claims of Greta spreading “propaganda” and calls for her to return to school and stop inciting other children to strike.

This is not surprising. While the internet and social media have been huge enablers for Greta’s message to reach millions, her activism has also made her a target for the trolls, cyberbullies, and fake accounts well-known to many activists online. 

The World Wide Web, 30 years old this year, is not a friendly place for girls and women. And the more vocal they are, the worse the abuse. Research reveals that female politicians on social media are over 3 times more likely to experience derogatory comments* related to their gender than their male counterparts. Younger women are disproportionately targeted. 

The Role of Bots

Increasingly, this violence is perpetrated not only by humans, but by bots too. Around half of all web traffic today is created by bots*. Some are eminently useful, performing tasks such as repairing links, removing vandalism and tagging articles on Wikipedia. 

However, bots, like any technology, are not neutral. They do what they were programmed to do and some actively uphold inequalities and crowd out alternative views online. 

Social bots, essentially fake accounts that imitate real humans, are creating a growing amount of content on social media. Some 15% of all active Twitter accounts are presumed bots*, but they punch above their weight; unlike humans, bots don’t need to eat or sleep – they can post content 24/7. This makes it possible for bot-created content to flood social media streams, skewing public debate and amplifying hateful rhetoric, violence, and abuse. 

For instance, both Donald Trump and Hillary Clinton received supportive messages from social bots* during the 2016 US election. However, Trump had more bots producing positive messages about him, while half of the bot-produced messages about Clinton criticised her. Bots are also contributing to Instagram’s massive harassment problem, as well as spreading anti-Muslim rhetoric and fake news across social media.

We Need More Women in Tech

With the web increasingly created by bots, who creates them is an important question. As the tech sector remains dominated by men, it’s fair to assume they create most bots. This has consequences in terms of what bots are designed to do and what problems they solve – or create.

The founder of the World Wide Web, Sir Tim Berners-Lee, is rightfully worried about the future of the web. In his annual letter this year he highlights harassment as one of the central problems affecting the internet today, contributing to making “many people feel afraid and unsure if the web is really a force for good.” He calls for us to step up “to make sure [the web] is recognised as a human right and built for the public good.”

As we step up, the “we” must include girls and women. For 3 decades, the World Wide Web has been a playground where the rules – or rather the lack of rules – have been determined by far too few. It’s been a place where hatred and violence have been allowed to thrive, where success has been defined by the number of engagements, not whether those engagements are useful, safe, or even made by a real human.

No more. Being female online should not be a synonym for being abused. We need a web that is created by a diverse group of people, putting equality at the centre of its structures and processes. A web where girls, women, and other marginalised groups can exercise their freedom of expression without harassment. A web that allows the Gretas of the world to thrive and that amplifies the voices of those otherwise not heard. We need a feminist web. 

Change is Vital so Girls Get Equal

Concrete action is needed to make that happen. The lack of diversity in tech is keeping the internet from reaching its potential for good. We need to create opportunities in the technology sector, so girls and women can be involved in determining how the web operates, and what type of bots are allowed to operate and how. 

We also need social media platforms to improve their processes for reporting and dealing with abuse so that girls can safely create content that represents their views and needs. Facebook, for instance, currently does not differentiate abuse relating to gender, causing much of the abuse suffered by girls and women to go unidentified. Significantly, social media platforms, including Twitter and Instagram, must put user rights and safety before profits and growth. Bots masquerading as real humans must be banned. 

Meanwhile, governments must ensure legal frameworks stay up-to-date with technological developments, so perpetrators of online abuse, including bots, are stopped and held to account.

Through our youth-led, global campaign, Girls Get Equal, Plan International is making sure girls and young women have power over their lives and can shape the world around them – online and off. As we celebrate Girls in ICT Day today, we are encouraging girls all over the world to get into tech and help us make sure the web is a safe place for us all to exercise our rights – to help us make the World Wide Web feminist. 

Girls' freedom online is under attack

Photo credit: Tian Bo, Plan International


As part of Plan International’s efforts to mark the 16 days of activism against gender-based violence, my fantastic colleague, Policy and Advocacy Advisor Leila Asrari, and I wrote the piece below, originally published on the Plan International website.

Compared to their male peers, girls online are facing more threats of sexual violence, more comments about their appearance and behaviour, and are more often told not to speak out and have an opinion. We need to reclaim the internet for girls, assert our experts, Leila Asrari and Nora Lindstrom.

Violence against girls online is a growing issue. As an increasing share of our lives is spent online, we’re also seeing harassment and abuse take new forms in the online space. Plan International’s research shows that, just as in the offline world, harassment and bullying online are gendered. While many young people struggle with the pressures of social media, compared to their male peers, girls online are facing more threats of sexual violence, more comments about their appearance and behaviour, and are more often told not to speak out and have an opinion.

Violence and harassment are being used, both incidentally and strategically, to silence the voices of girls and women, and to limit their engagement in political debates online. This activity mirrors concerning behaviours towards women negotiating political spaces. A recent global survey found that almost half of women in politics have faced serious abuse, including threats of murder, rape and assault. One fifth said they had been subjected to sexual violence.

In addition, in 2016, FRIDA reported that over half of the 1,500 young women-, girl- and trans-led organisations they interviewed regularly felt unsafe because of the work they do. So, we are seeing that for those girls and women who take on political leadership responsibility, or who challenge the status quo, there are significant threats – different in nature, and higher in prevalence, than those faced by men.

GIRLS’ FREEDOM, VOICE, AND AGENCY ONLINE ARE UNDER ATTACK

For girls, navigating the online world brings with it these threats and more. The statistics are alarming. In Sweden, one of the most gender equal countries in the world, over half of all girls online report having been contacted for a sexual purpose by someone they think is an adult. In neighbouring Norway, 16-year-old girls are most at risk: 40% report unwanted sexual attention online over the past year. Only 13% of boys report the same. While global data on girls’ experiences online is scarce, the experiences of women suggest the problem is global: 45% of women in Kampala and 21% of women in Nairobi have been harassed or threatened online; seven out of ten 18–24-year-old women who use the internet daily have been subject to online abuse.

The threats that girls and young women face when navigating the online space are real. However, our response must not be to limit girls’ online voice, agency and freedom – protection should not mean exclusion. We must approach the question of digital safety, access and voice not simply from an individual, one-to-one perspective, but also in light of the internet being a core social structure, and a crucial platform for active citizenship and voice. If we do not support young people to exercise their voice and agency online, we risk weakening the civil society of future generations.

Already, of children interviewed across 60 countries, only 34% felt safe expressing their views in public and only 38% felt safe attending public protests and demonstrations. Responses to ensure girls’ safety and freedom online as well as their access and use of digital platforms must address the question of girls’ right to have a voice, both online and off.

Plan International’s new global campaign, Girls Get Equal, is about ensuring every girl and young woman has power over her own life and can shape the world around her. 

Girls also need to get equal online. The online space is not subject to the same scrutiny as ‘real-life’ public spaces, nor are legislative frameworks as strong. This leaves children and young people vulnerable to violence and harassment, in a world from which they should not be expected simply to disengage. Much like the response to violence against girls and women in public spaces should not be to restrict their freedom of movement, responding to gender-based violence online needs to be about making the internet a safe space – we need to reclaim the internet for girls.

EVERYONE HAS A ROLE TO PLAY

Governments need to strengthen legislation and increase cooperation to ensure perpetrators of violence online are held to account. The tech industry needs to take clear actions to ensure that social media is safe for children and young people, implementing strong reporting mechanisms and responding to reports of violence or abuse sensitively and efficiently.

Educational institutions all over need to ensure children know their rights and responsibilities online, and understand how to stay safe, and how to report violence and abuse. Children and youth also need better education on human rights and gender-based violence – for instance through citizenship education, or comprehensive sexuality education. 

SUPPORT GIRLS: SIGN THE PLEDGE

We can be positive citizens online, speaking out against violence and abuse, reporting it where we see it, and standing up for victims. We can encourage others, especially children and young people, to use the internet to explore their voice, and to speak out on issues that they care about or that affect their lives. And we can all ask more of those in positions to make online spaces safer for others. 

To start with, we can all sign on to Plan International’s pledge for girls’ freedom. This 16 days of activism, we can all do our part to stand up for the right of everyone to feel safe navigating online spaces. We’ve signed – we hope you do too!





How digital principles can help tackle gender inequality


Wrote this post for Plan International (my employer) on the occasion of Plan endorsing the Principles for Digital Development. 

The fictional kingdom of Wakanda, in the box-office hit Black Panther, is a highly technologically advanced, affluent, closed-off kingdom. To outsiders, it presents itself as poor, partly because its rulers don’t want the kingdom’s powerful technology to end up in the wrong hands. As the story unfolds, it becomes evident that the Wakandans' concerns regarding misuse of their technology are valid. 

Isolation, however, becomes increasingly untenable. At the end of the film, Black Panther – now rightfully the kingdom’s ruler – decides to open up and share their technological advances with the world. 

In ending their isolation and embarking on a new road to use their tech for good, one can only hope the Wakandans are following the Principles for Digital Development.

The power of tech for good

Designed to help development practitioners successfully integrate technology into programming, the Principles are a set of nine best practice guidelines written for and by international development actors. To date, they have been endorsed by over 100 UN agencies, INGOs, tech companies, and civil society groups. And today, Plan International has joined that community by formally endorsing the Principles.

Much like the Wakandans, we at Plan International believe in the power of tech for good and want to use it to advance children’s rights and equality for girls all over the world. We know it can work: we have seen first hand how an app is preventing child marriages in Bangladesh, how a predictive text keyboard is breaking gender stereotypes, and how virtual learning environments are making education accessible to girls who would otherwise be left without. 

But we are also aware of the risks. Risks related to dealing with data on vulnerable groups, the harassment and bullying that girls face online, and the potential of technology, and particularly artificial intelligence, to further entrench gender inequality, by reproducing current and historical biases.  

The potential benefits outweigh the risks, but only when technology is used thoughtfully and responsibly. This is where the Principles for Digital Development come in: best practice guidelines that will help us use digital technologies ever more effectively and responsibly so that 100 million girls can learn, lead, decide and thrive.

Using the Principles to steer our work

Already, the Principles have guided our extensive work on digital birth registration, and, more recently, our ground-breaking work on OpenCRVS, a software platform for rights-based civil registration and vital statistics.

Central to the latter is the principle of openness. Recognising the need to create a global good that can be re-used and improved over time, OpenCRVS will be built on open source technology. The system will also be built on the principles of open standards and open architecture so that it can work with and complement existing systems of registration.

How we practice the Principles is further evident in our Free To Be crowdsourced city safety maps, which recently launched in 5 cities around the world following a successful pilot in Melbourne. At the heart of the initiative has been designing with the user, i.e. girls and young women.

“I was proud to be part of developing Free to Be because it’s designed by young women like me, for young women, to help make our streets safer,” said Alice Rummery, a university student who helped co-design the Sydney city safety map for Plan International Australia. “I don’t want to have to change my behaviour so that I’m not harassed. I want decision makers, authorities and men to act.”

Digital Principles with a gender lens

Our endorsement of the Principles for Digital Development is a statement of how we intend to use tech for good. But we also intend to give back to the Principles community by looking at the Principles through the lens of gender.

Given the digital gender gap, a key question for us is not only how we can use tech for good, but how we can and must use tech to further gender equality and bridge the digital divide. This involves not just designing with the user, but designing with girls and women; not just understanding the ecosystem, but also its gendered dimensions; not just being data-driven, but recognising that there are significant gaps when it comes to the availability of gender-disaggregated data.

Gender inequalities in the real world are reflected in the digital. So while Plan International doesn’t have the revolutionary high tech of Wakanda with which to make the world a better place, we do have expertise and insights on patriarchal structures and how to break these, both online and off. And that’s something the kingdom of Wakanda could learn from too.

Don't let tech leave girls behind

Photo by G. Van Buggenhout for Plan International. 


Wrote this piece for Plan International (my employer) on the occasion of Girls in ICT Day 2018. 

If Alexa or Siri could, they’d probably be saying “Me too”. But they haven’t been programmed that way. Instead, these personal assistant bots are more likely to be evasive or even respond positively when sexually harassed. While officially genderless, both Siri and Alexa have feminine names and default female voices; it’s hard not to see their evasion as condoning the sexual harassment of women. 

Neither Siri nor Alexa of course have a mind of their own. They have been programmed to respond to prompts in one way or another. Last year, digital news outlet Quartz tested how they respond to sexual harassment: in response to “You’re a slut”, Siri said “I’d blush if I could”. Someone had programmed it that way. 

I bet that person was a man: some three-quarters of staff in tech firms are.
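To make concrete what “programmed that way” means, here is a minimal, hypothetical sketch in Python of how canned assistant replies are typically wired up. The prompts and responses are illustrative assumptions only, not Apple’s or Amazon’s actual code; the point is that the deflection and a firm refusal are a single design decision apart.

```python
# Hypothetical sketch of a rule-based assistant: the reply to any prompt
# is just a lookup chosen by whoever wrote the table.

CANNED_REPLIES = {
    "you're a slut": "I'd blush if I could.",   # the deflecting reply Quartz reported
    "what's the weather": "It looks sunny today.",
}

REFUSAL = "That language is not okay."  # an equally easy, very different choice


def respond(prompt: str, tolerate_abuse: bool = True) -> str:
    """Return the scripted reply for a prompt; there is no understanding, only lookup."""
    key = prompt.strip().lower()
    if key not in CANNED_REPLIES:
        return "Sorry, I didn't catch that."
    if key == "you're a slut" and not tolerate_abuse:
        return REFUSAL
    return CANNED_REPLIES[key]


if __name__ == "__main__":
    print(respond("You're a slut"))                        # "I'd blush if I could."
    print(respond("You're a slut", tolerate_abuse=False))  # "That language is not okay."
```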

Girls must be encouraged to create

The digital gender divide is particularly large when it comes to girls and women as creators of technology. As AI becomes ubiquitous, this is increasingly a problem: without girls' and women's perspectives, we risk creating tools, solutions, and systems that reproduce and perpetuate existing gender inequalities – as well as fail to address the unique issues and challenges girls and women face.

This is not merely a hypothetical risk. Already, we’ve seen “comprehensive” health apps that come without period trackers because the developers didn’t see menstruation as a core bodily function worth tracking. Research has shown that AI-powered facial recognition systems are particularly poor at recognising darker skinned women’s faces. And machines currently provide gender-biased translation across languages, assuming someone who is a nurse, for example, is always a woman.

This is a problem. Women and girls constitute half of the world’s inhabitants, and if we’re not involved in creating our common digital future, it will be created for us. 

Getting tech into the hands of girls

As a girls’ rights organisation, Plan International is working to get technology and technical skills into the hands of girls themselves. We believe it is vital to provide girls in developing countries, including those without access to formal education, with opportunities to create technology and digital solutions that address their own needs. A "brogrammer" in Silicon Valley is unlikely to understand what benefits a teenage girl in Ecuador could gain from technology.

We walk the talk too. In Uganda and Ethiopia, we have set up SmartUp Factory innovation hubs, where marginalised youth – including girls – can access and try out digital tools and technologies. In an environment that is safe for and encouraging of girls, they are supported to develop their own solutions for communal problems using methodologies such as human-centred design. 

In Timor-Leste, Plan International has worked with girls and young women to develop the country’s first sexual and reproductive health app, designed to provide youth with easy access to reliable information on topics they often have no one to ask about. And in China, we have worked with teachers to influence their views on which gender is more “suitable” for careers in ICT. 

This work is important. If we don’t want women looking for a job online to be less likely to be shown targeted ads for high-paying roles than their male counterparts, and if we don’t want AI to be more likely to label people who are cooking and cleaning as women, but people who are playing sports and shooting as men, we need to increase the number of women engaged in creating technology.

Digital equality will help create an equal society

We also need to make technology our ally. Beyond creating digital tools and solutions that address the needs of girls and women, like apps to improve street safety or ones that connect mothers and mothers-to-be, we need to explore the potential of creating gender transformative technology, i.e. tech that seeks to transform unequal gender power relations and actively challenges the prevailing status quo. 

An example of this type of tech is Sheboard, a predictive text app developed by Plan International in Finland together with girls and young women. The app works just like a regular keyboard, but challenges prevailing gender stereotypes of girls being primarily pretty or beautiful, by suggesting empowering words such as “strong”, “smart”, and “clever”, following phrases like “I am” or “my daughter is”.
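As a rough illustration of the mechanism (the trigger phrases and word lists below are assumptions made for the example, not Sheboard’s actual dictionary), a phrase-triggered suggestion function might look something like this:

```python
# Hypothetical sketch of phrase-triggered keyboard suggestions.

EMPOWERING = ["strong", "smart", "clever", "brave"]
STOCK_DEFAULTS = ["pretty", "beautiful", "sweet"]   # what a stock keyboard might offer

TRIGGER_PHRASES = ("i am", "my daughter is", "she is")


def suggestions(typed_so_far: str, limit: int = 3) -> list:
    """Return next-word suggestions for the text typed so far."""
    text = typed_so_far.strip().lower()
    if text.endswith(TRIGGER_PHRASES):
        return EMPOWERING[:limit]        # nudge towards empowering words
    return STOCK_DEFAULTS[:limit]        # otherwise fall back to the defaults


if __name__ == "__main__":
    print(suggestions("My daughter is"))   # ['strong', 'smart', 'clever']
    print(suggestions("The weather is"))   # ['pretty', 'beautiful', 'sweet']
```

Technically the change is trivial; what matters is who gets to decide which words the keyboard puts in front of girls.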

The future is digital, and if the majority of humankind is not involved in creating that future, we’re in trouble. Instead of allowing tech to perpetuate gender inequality, let’s harness its power for the opposite and create a gender equal society where no one loses out. 

 

 

3 Steps to Improve Information and Communication in Early Childhood Education in Helsinki


Access to Swedish-language daycare in a suitable location is a recurring topic of discussion among Helsinki’s families with children. Unfortunately, it is also one that often causes a great deal of stress and worry. Inadequate information and limited communication contribute to a lack of trust in the city’s Education Division, even though its officials work hard to make a complicated puzzle fit together.

It is clear that Helsinki needs more Swedish-language daycare centres so that every child can get a suitable daycare place. That work is progressing, for example with the new daycare centre in Tölö and the upcoming centres in Drumsö and Månsas.

In the meantime, however, the information provided to, and communication with, families applying for daycare can be improved. Below I propose three concrete steps to improve the situation:

1. A better process description

A clear description of the process of applying for a daycare place is needed. At present there is none, even though some information is scattered across the city’s websites. As a result, rumours and sometimes outright misinformation circulate among families.

A clear process description would cover every step from the family applying for a daycare place to a suitable place being accepted, including who is responsible for each stage and whom the family can contact if needed. It would also list all the calls the family can expect from the division (for example, a call after the application has been submitted and a call before the decision is issued), as well as what the family should do, and what the process is, if a suitable place cannot be found.

In the long term, the division should move towards making all communication available digitally as well, i.e. letting clients both apply for and accept daycare places electronically through their own digital case folder, make changes to their application, and contact the relevant daycare director.


2. FAQ: Questions and answers

An FAQ on the city’s website, answering the most common questions families have about the daycare application, would further contribute to more equal access to information for all families and help prevent unnecessary rumours and misinformation. An FAQ could also reduce the workload of officials, who could refer to it instead of answering the same question individually for many parents.

For example, the following questions could be included:

  • Does the time at which the application is submitted affect the decision? That is, does it make any difference whether the application is sent a year or only 4 months in advance?
  • Do you have to list five choices in the application? What do you do if only three daycare centres are suitably located for the family?
  • How should you proceed if you want to change your application after submitting it? How does this affect the application? (For example, do you go to the back of the queue?)
  • On what criteria are daycare places allocated? How is it ensured that families are treated equally?

3. A statistical overview of the situation

Daycare places are applied for and needed all year round, which makes the puzzle difficult both to solve and to keep track of. Even so, it should be possible to publish a basic overview of the situation every 6 months. Statistics on, for example, how many families received a place at their first or second choice, how many received no desired place at all, and how many first- and second-choice applications each daycare centre received versus the open places at that centre would provide at least some insight into the situation (even if it is understood that the figures change regularly). Public statistics like these would give clients, i.e. families with children, insight, and give the division an opportunity to show how well it is managing to put the puzzle together.

 

I have no doubt that the staff of the Education Division are doing their best to give Helsinki’s children access to high-quality early childhood education in as suitable a location as possible. I hope to discuss the proposals above with the relevant officials so that together we can make their work more visible and reduce the stress and worry many families experience before the start of daycare.


 

Getting Past the Hype & Practical with New Technologies


This morning I spoke at the USAID Digital Development Forum 2018 in Washington, DC. Below is my speech in full:

Hi Nora, writes one of our senior execs.

I’ll be talking about our use of digital technology at an upcoming meeting. Can you tell me how we are using blockchain in our work?

Of late, these types of queries have popped up in my inbox relatively frequently. The blockchain hype has entered the mainstream, and everyone wants to be seen using it, or at least testing out how it could be used. And why not? Last December an iced tea company changed its name to Long Blockchain, and its stock price skyrocketed. A lot of people don’t know what blockchain is, but they’re pretty sure they want to be in on it.

The hype is on in international development too: Perhaps if we allowed people to donate in bitcoin, we’d be raking it in. Or we could bank the unbanked. Provide transparency in land registration and agricultural supply chains. Provide identities to those without.

In an op-ed for Devex last year, blockchain was described as “the single most disruptive technology for the international development sector as we understand it today.”

My answer to the executive who wrote to me is, however, no. No, we are not using blockchain in our work on girls’ rights. Anywhere.

It’s not that we’re not using blockchain because I and others don’t see its potential. I do. And I’m aware of the work UN agencies are doing in spearheading its use to improve food security and reduce remittance costs.

However, there are many organisations – including mine – that are not using it because we’re still struggling to see the tangible use cases that would bring added value beyond the initial experimental hype. It’s not yet clear how heavily we should be investing in it right now: there are genuine concerns about piloting new technology on extremely vulnerable populations, in situations where we don’t have the right to fail. Also, not all new technologies fulfil their disruptive promise: how many of you still regard 3D printing as the next big thing in international development?

While we may be interested, even excited, by the dots on the Gartner hype cycle, many practical gains and returns on investment still come from “older” technologies.

There is a slight tendency to talk about using mobile phones in development as something that’s already been done, as if they were DVDs when everyone has moved to streaming. Yet many of our organisations still have so much to gain and so much to learn from using mobile phones.

Last December, Plan Finland launched Sheboard, a predictive text app that empowers girls by suggesting words such as “clever”, “brave” and “powerful” after phrases such as “I am” or “my daughter is”. This was described as revolutionary.

Data collection is still not universally digital, and not everyone even has access to a mobile phone. In fact, women globally are 14% less likely to own a mobile phone than men.

We also have a lot to learn about making use of data we collect, as well as ensuring the data is stored securely and privately.   

Anyone who has studied development theory will be familiar with the phrase “hand over the stick”, coined by Robert Chambers. As we think about using technology – new or otherwise – in our work, we also need to be mindful of handing over the tech. Nowhere is this more true than when it comes to artificial intelligence, or machine learning. We all know Silicon Valley has a diversity problem, but what also needs to be recognized is the impact this has on how, and what, machines learn.

AI has several potential applications in international development – from chatbots to natural language processing.

But if machines are programmed to learn from the current status quo, then we risk them reproducing existing power relations, from gendered stereotypes through neo-colonialism to inequality.

That’s why we in international development increasingly need to work on getting knowledge into the hands of the people we work with, and empower them to develop their own tech. The future is digital, and if the majority of humankind is not engaged in creating that digital future, we’re in real trouble.

The three things I want you to take away from this talk are these:

  • watch blockchain, experiment with it where it makes sense, but don’t think that it’ll be the sole source of digital development, added value and competitive edge in the future

  • expand your use of mobile phones and other familiar technologies and do it ever better and more responsibly

  • create opportunities for the people you work with to access, use, and create technology: HAND OVER THE TECH.

Algorithms are not impartial

The world's leading start-up convention, Slush, took place in Helsinki in late November. With the tagline "Nothing normal ever changed a thing", this year the fair sought to highlight the social impacts of technology.

I attended Slush with colleagues from Plan International and people from the SmartUp Factory innovation hubs in Ethiopia and Uganda.

The technology gender gap

The world's first algorithm written for an early computing machine was the work of a woman. Her name was Ada Lovelace, and she wrote it during the first half of the 1840s. At the time, she was, as a woman, a rarity in the world of science.

And despite her and many other women's contributions to the sector, the fact remains that 200 years since Lovelace was born, a significant gender gap persists in the world of science and technology. This is a problem.

The barriers surrounding girls' and women's access to and use of digital tools and technologies are well-known. Less attention is given to the challenges facing women as creators of the same tools and technologies, and the impacts this might have.

Without girls' and women's perspectives, we risk creating tools and solutions that reproduce and perpetuate existing gender inequalities – as well as fail to address issues and challenges girls and women face.

This is not merely a hypothetical risk. Already, women looking for a job online are less likely to see targeted ads for high-paying roles than male counterparts. Why? Because the algorithm was designed that way. 

Artificial intelligence learning gender stereotypes

A 2016 study by the University of Virginia and University of Washington found that artificial intelligence (AI) systems are more likely to label people who are cooking, shopping and cleaning as women, and people who are playing sports, coaching and shooting as men.

Does AI want to keep women by the stove? In an interview for The Guardian, Joanna Bryson, a computer scientist at the University of Bath, said: "A lot of people are saying this is showing that AI is prejudiced. No. This is showing we're prejudiced and that AI is learning it." 

And while humans have the potential to counteract learned biases, the algorithms AI systems are based on may be unable to do so and instead continue to reinforce and even amplify existing biases, including gender-stereotypical social norms.
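A toy sketch, using invented numbers, shows how that amplification can happen: if a model simply predicts the majority label it saw for each activity during training, a 66/34 skew in the data becomes a 100/0 rule in its output. The corpus below is made up for illustration; it stands in for a real image-caption dataset in which "cooking" appears with women far more often than with men.

```python
from collections import Counter

# Invented training data: (activity, gender of the person pictured).
# The 66/34 and 70/30 splits are made-up stand-ins for a skewed corpus.
training_data = (
    [("cooking", "woman")] * 66 + [("cooking", "man")] * 34 +
    [("coaching", "man")] * 70 + [("coaching", "woman")] * 30
)

# "Training": count how often each activity co-occurs with each gender.
counts = {}
for activity, gender in training_data:
    counts.setdefault(activity, Counter())[gender] += 1


def predict(activity: str) -> str:
    """Label new images with the majority gender seen for this activity."""
    return counts[activity].most_common(1)[0][0]


# The data showed women cooking 66% of the time; the predictions say 100%:
# a skew in the input becomes an absolute rule in the output.
print(predict("cooking"))    # woman
print(predict("coaching"))   # man
```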

In fact, they may be specifically designed to be that way: fighter robots are designed as 'male', while robots intended for the care and service industries are given 'female' characteristics.

And who makes these algorithms and AI? Predominantly white men. There are many implications of machines learning predominant biases and stereotypes, from labelling black defendants as more likely to reoffend than white defendants, to providing gender-biased translation across languages. 

Google Translate, for example, 'translates' the gender-neutral Finnish pronoun "hän" into either he or she depending on context.


The marginalised and vulnerable are usually at the receiving end of prejudiced AI.  Without more women – and people from diverse backgrounds in general – as creators of technology, we risk perpetuating existing inequalities.

What does this have to do with Slush?

At the convention, Plan International launched the keyboard app Sheboard, which harnesses the power of predictive text to boost girls' self-confidence and remind us to talk to and about girls in a more empowering way. We brought young women from Finland, Uganda and Ethiopia to not only experience Slush itself, but also bring their own views and experiences to the convention. 

The group also had the opportunity to meet with established women in tech, including the impressive Rumman Chowdhury who leads Accenture's work on ethics and AI.  

Technology, algorithms and AI are, in themselves, neither good nor bad. It's how and what we use them for that matters. And by working to increase the number of female tech creators, we can increase the chances of machines working with – as opposed to against – us to achieve gender equality.