Plan International

Why the Internet Must Become Feminist

Photo credit: Ehsan Kabir, Plan International

This text was originally published on Plan International’s website on 25 April 2019 to mark International Girls in ICT Day.

“You’re simply the best”, “Hero!!!”, “I’m in awe of you”, “You Are a Great Leader!”

So read some of the thousands of comments on Greta Thunberg’s Twitter feed. Yet despite the 16-year-old climate change activist galvanising over 1.6 million people to act through her school strike for climate action, you don’t have to do a lot of digging online to find the backlash. There are claims that Greta is spreading “propaganda” and calls for her to return to school and stop inciting other children to strike.

This is not surprising. While the internet and social media have been huge enablers for Greta’s message to reach millions, her activism has also made her a target for the trolls, cyberbullies, and fake accounts well-known to many activists online. 

The World Wide Web, 30 years old this year, is not a friendly place for girls and women. And the more vocal they are, the worse the abuse. Research reveals that female politicians on social media are over 3 times more likely to experience derogatory comments* related to their gender than their male counterparts. Younger women are disproportionately targeted. 

The Role of Bots

Increasingly, this violence is perpetrated not only by humans, but by bots too. Around half of all web traffic today is created by bots*. Some are eminently useful, performing tasks such as repairing links, removing vandalism and tagging articles on Wikipedia. 

However, bots, like any technology, are not neutral. They do what they were programmed to do and some actively uphold inequalities and crowd out alternative views online. 

Social bots, essentially fake accounts that imitate real humans, are creating a growing amount of content on social media. Some 15% of all active Twitter accounts are presumed bots*, but they punch above their weight; unlike humans, bots don’t need to eat or sleep – they can post content 24/7. This makes it possible for bot-created content to flood social media streams, skewing public debate and amplifying hateful rhetoric, violence, and abuse. 

For instance, both Donald Trump and Hillary Clinton received supportive messages from social bots* during the 2016 US election. However, Trump had more bots producing positive messages about him, while half of the bot-produced messages about Clinton criticised her. Bots are also contributing to Instagram’s massive harassment problem as well as spreading anti-Muslim rhetoric and fake news across social media.

We Need More Women in Tech

With the web increasingly created by bots, who creates them is an important question. As the tech sector remains dominated by men, it’s fair to assume they create most bots. This has consequences in terms of what bots are designed to do and what problems they solve – or create.

The founder of the World Wide Web, Sir Tim Berners-Lee, is rightfully worried about the future of the web. In his annual letter this year he highlights harassment as one of the central problems affecting the internet today, contributing to making “many people feel afraid and unsure if the web is really a force for good.” He calls for us to step up “to make sure [the web] is recognised as a human right and built for the public good.”

As we step up, the “we” must include girls and women. For 3 decades, the World Wide Web has been a playground where the rules – or rather the lack of rules – have been determined by far too few. It’s been a place where hatred and violence have been allowed to thrive, where success has been defined by the number of engagements, not whether those engagements are useful, safe, or even made by a real human.

No more. Being female online should not be a synonym for being abused. We need a web that is created by a diverse group of people, putting equality at the centre of its structures and processes. A web where girls, women, and other marginalised groups can exercise their freedom of expression without harassment. A web that allows the Gretas of the world to thrive and that amplifies the voices of those otherwise not heard. We need a feminist web. 

Change is Vital so Girls Get Equal

Concrete action is needed to make that happen. The lack of diversity in tech is keeping the internet from reaching its potential for good. We need to create opportunities in the technology sector so that girls and women can be involved in determining how the web operates, which bots are allowed to operate, and under what conditions.

We also need social media platforms to improve their processes for reporting and dealing with abuse so that girls can safely create content that represents their views and needs. Facebook, for instance, currently does not differentiate abuse related to gender, meaning much of the abuse suffered by girls and women goes unidentified. Significantly, social media platforms, including Twitter and Instagram, must put user rights and safety before profits and growth. Bots masquerading as real humans must be banned.

Meanwhile, governments must ensure legal frameworks keep pace with technological developments, so that perpetrators of online abuse – including those deploying bots – are stopped and held to account.

Through our youth-led, global campaign, Girls Get Equal, Plan International is making sure girls and young women have power over their lives and can shape the world around them – online and off. As we celebrate Girls in ICT Day today, we are encouraging girls all over the world to get into tech and help us make sure the web is a safe place for us all to exercise our rights – to help us make the World Wide Web feminist. 

How digital principles can help tackle gender inequality

I wrote this post for Plan International (my employer) on the occasion of Plan endorsing the Principles for Digital Development.

The fictional kingdom of Wakanda, in the box-office hit Black Panther, is a highly technologically advanced, affluent, closed-off kingdom. To outsiders, it presents itself as poor, partly because its rulers don’t want the kingdom’s powerful technology to end up in the wrong hands. As the story unfolds, it becomes evident that the Wakandans' concerns regarding misuse of their technology are valid. 

Isolation, however, becomes increasingly untenable. At the end of the film, Black Panther – now rightfully the kingdom’s ruler – decides to open up and share their technological advances with the world. 

In ending their isolation and embarking on a new road to use their tech for good, one can only hope the Wakandans are following the Principles for Digital Development.

The power of tech for good

The Principles for Digital Development are a set of nine best practice guidelines, written for and by international development actors, designed to help development practitioners successfully integrate technology into their programming. To date, they have been endorsed by over 100 UN agencies, INGOs, tech companies, and civil society groups. And today, Plan International has joined that community by formally endorsing the Principles.

Much like the Wakandans, we at Plan International believe in the power of tech for good and want to use it to advance children’s rights and equality for girls all over the world. We know it can work: we have seen first-hand how an app is preventing child marriages in Bangladesh, how a predictive text keyboard is breaking gender stereotypes, and how virtual learning environments are making education accessible to girls who would otherwise go without.

But we are also aware of the risks: risks related to handling data on vulnerable groups, to the harassment and bullying that girls face online, and to the potential of technology – particularly artificial intelligence – to further entrench gender inequality by reproducing current and historical biases.

The potential benefits outweigh the risks, but only when technology is used thoughtfully and responsibly. This is where the Principles for Digital Development come in: best practice guidelines that will help us use digital technologies ever more effectively and responsibly, so that 100 million girls can learn, lead, decide and thrive.

Using the Principles to steer our work

Already, the Principles have guided our extensive work on digital birth registration, and, more recently, our ground-breaking work on OpenCRVS, a software platform for rights-based civil registration and vital statistics.

Central to the latter is the principle of openness. Recognising the need to create a global good that can be re-used and improved over time, OpenCRVS will be built on open source technology. The system will also follow open standards and an open architecture so that it can work with and complement existing registration systems.

How we practice the Principles is further evident in our Free To Be crowdsourced city safety maps, which recently launched in 5 cities around the world following a successful pilot in Melbourne. At the heart of the initiative has been designing with the user, i.e. girls and young women.

“I was proud to be part of developing Free to Be because it’s designed by young women like me, for young women, to help make our streets safer,” said Alice Rummery, a university student who helped co-design the Sydney city safety map for Plan International Australia. “I don’t want to have to change my behaviour so that I’m not harassed. I want decision makers, authorities and men to act.”

Digital Principles with a gender lens

Our endorsement of the Principles for Digital Development is a statement of how we intend to use tech for good. But we also intend to give back to the Principles community by looking at the Principles through the lens of gender.

Given the digital gender gap, a key question for us is not only how we can use tech for good, but how we can and must use tech to further gender equality and bridge the digital divide. This involves not just designing with the user, but designing with girls and women; not just understanding the ecosystem, but also its gendered dimensions; not just being data-driven, but recognising that there are significant gaps in the availability of gender-disaggregated data.

Gender inequalities in the real world are reflected in the digital. So while Plan International doesn’t have the revolutionary high tech of Wakanda with which to make the world a better place, we do have expertise and insights on patriarchal structures and how to break these, both online and off. And that’s something the kingdom of Wakanda could learn from too.

Algorithms are not impartial

The world's leading start-up convention, Slush, took place in Helsinki in late November. With the tagline "Nothing normal ever changed a thing", this year the event sought to highlight the social impacts of technology.

I attended Slush with colleagues from Plan International and people from the SmartUp Factory innovation hubs in Ethiopia and Uganda.

The technology gender gap

The world's first machine algorithm, written for an early computing machine, was the work of a woman. Her name was Ada Lovelace, and she wrote it in the first half of the 1840s. At the time, she (as a woman) was a rarity in the world of science.

And despite her and many other women's contributions to the sector, the fact remains that, 200 years after Lovelace was born, a significant gender gap persists in the world of science and technology. This is a problem.

The barriers surrounding girls' and women's access to and use of digital tools and technologies are well-known. Less attention is given to the challenges facing women as creators of the same tools and technologies, and the impacts this might have.

Without girls' and women's perspectives, we risk creating tools and solutions that reproduce and perpetuate existing gender inequalities – as well as fail to address issues and challenges girls and women face.

This is not merely a hypothetical risk. Already, women looking for a job online are less likely than their male counterparts to see targeted ads for high-paying roles. Why? Because the algorithm was designed that way.

Artificial intelligence learning gender stereotypes

A 2016 study by the University of Virginia and University of Washington found that artificial intelligence (AI) systems are more likely to label people who are cooking, shopping and cleaning as women, and people who are playing sports, coaching and shooting as men.

Does AI want to keep women by the stove? In an interview for The Guardian, Joanna Bryson, a computer scientist at the University of Bath, said: "A lot of people are saying this is showing that AI is prejudiced. No. This is showing we're prejudiced and that AI is learning it." 
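Bryson's point – that the prejudice lives in the data and the AI simply absorbs it – can be illustrated with word embeddings, the numerical word representations many AI systems learn from large amounts of ordinary text. The snippet below is only a minimal sketch, not the method used in the studies mentioned above: it assumes the open-source gensim Python library and its downloadable pre-trained GloVe vectors, and simply checks whether a few occupation words sit closer to "she" or to "he" in the learned vector space.

```python
# A minimal sketch (an illustration, not the method of the studies cited
# above). Assumes the gensim library; the pre-trained GloVe vectors are
# downloaded on first run. Exact numbers vary by model and training data.
import gensim.downloader as api

# Word vectors learned from ordinary English text (Wikipedia + newswire)
model = api.load("glove-wiki-gigaword-50")

for occupation in ["nurse", "homemaker", "engineer", "programmer"]:
    sim_she = model.similarity("she", occupation)  # cosine similarity
    sim_he = model.similarity("he", occupation)
    closer = "she" if sim_she > sim_he else "he"
    print(f"{occupation:>10}: she={sim_she:.2f}  he={sim_he:.2f}  -> closer to '{closer}'")
```

Typically, "nurse" and "homemaker" come out closer to "she" while "engineer" and "programmer" come out closer to "he" – not because anyone wrote that rule, but because that is the pattern in the text the vectors were learned from.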

And while humans have the potential to counteract learned biases, the algorithms AI is built on may be unable to do so, and instead continue to reinforce and even amplify existing biases, including gender-stereotypical social norms.

In fact, they may be specifically designed to be that way: fighter robots are designed as 'male', while robots intended for the care and service industries are given 'female' characteristics.

And who makes these algorithms and AI? Predominantly white men. There are many implications of machines learning prevailing biases and stereotypes, from labelling black defendants as more likely to reoffend than white defendants to providing gender-biased translations across languages.

The image below shows how Google Translate 'translates' the gender-neutral Finnish pronoun “hän” as either “he” or “she”, depending on context.

[Image: screenshot of Google Translate rendering Finnish sentences with “hän” as “he” or “she” depending on context]

 

The marginalised and vulnerable are usually at the receiving end of prejudiced AI.  Without more women – and people from diverse backgrounds in general – as creators of technology, we risk perpetuating existing inequalities.
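The Google Translate example above is also easy to try for yourself. The snippet below is a minimal sketch that assumes the third-party deep_translator Python library and an internet connection; the output comes from the live translation service, so the exact wording – and the pronoun it picks – may change over time.

```python
# Minimal sketch using the third-party deep_translator library (an
# assumption; requires internet access). Results come from the live
# Google Translate service and may change over time.
from deep_translator import GoogleTranslator

translator = GoogleTranslator(source="fi", target="en")

# Finnish sentences that all use the gender-neutral pronoun "hän"
sentences = [
    "Hän on sairaanhoitaja.",  # "hän" is a nurse
    "Hän on insinööri.",       # "hän" is an engineer
    "Hän hoitaa lapsia.",      # "hän" looks after the children
    "Hän johtaa yritystä.",    # "hän" runs a company
]

for sentence in sentences:
    print(sentence, "->", translator.translate(sentence))
```

Because English forces a choice between "he" and "she", the service has to commit to one pronoun for each sentence, and that choice tends to follow the gender stereotypes present in its training data.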

What does this have to do with Slush?

At the convention, Plan International launched the keyboard app Sheboard, which harnesses the power of predictive text to boost girls' self-confidence and remind us to talk to and about girls in a more empowering way. We brought young women from Finland, Uganda and Ethiopia to not only experience Slush itself, but also bring their own views and experiences to the convention. 

The group also had the opportunity to meet with established women in tech, including the impressive Rumman Chowdhury, who leads Accenture's work on ethics and AI.

Technology, algorithms and AI are, in themselves, neither good nor bad. It's how and what we use them for that matters. And by working to increase the number of female tech creators, we can increase the chances of machines working with us – rather than against us – to achieve gender equality.