How digital principles can help tackle gender inequality


I wrote this post for Plan International (my employer) to mark Plan's endorsement of the Principles for Digital Development.

The fictional kingdom of Wakanda, in the box-office hit Black Panther, is a highly technologically advanced, affluent, closed-off kingdom. To outsiders, it presents itself as poor, partly because its rulers don’t want the kingdom’s powerful technology to end up in the wrong hands. As the story unfolds, it becomes evident that the Wakandans' concerns regarding misuse of their technology are valid. 

Isolation, however, becomes increasingly untenable. At the end of the film, Black Panther – now rightfully the kingdom’s ruler – decides to open up and share their technological advances with the world. 

In ending their isolation and embarking on a new road to use their tech for good, one can only hope the Wakandans are following the Principles for Digital Development.

The power of tech for good

The Principles for Digital Development are a set of nine best-practice guidelines, written for and by international development actors, designed to help practitioners successfully integrate technology into their programming. To date, they have been endorsed by over 100 UN agencies, INGOs, tech companies and civil society groups. And today, Plan International has joined that community by formally endorsing the Principles.

Much like the Wakandans, we at Plan International believe in the power of tech for good and want to use it to advance children’s rights and equality for girls all over the world. We know it can work: we have seen first-hand how an app is preventing child marriages in Bangladesh, how a predictive text keyboard is breaking gender stereotypes, and how virtual learning environments are making education accessible to girls who would otherwise be left without.

But we are also aware of the risks: the risks of handling data on vulnerable groups, the harassment and bullying that girls face online, and the potential of technology – particularly artificial intelligence – to further entrench gender inequality by reproducing current and historical biases.

The potential benefits outweigh the risks, but only when technology is used thoughtfully and responsibly. This is where the Principles for Digital Development come in: best-practice guidelines that will help us use digital technologies ever more effectively and responsibly, so that 100 million girls can learn, lead, decide and thrive.

Using the Principles to steer our work

Already, the Principles have guided our extensive work on digital birth registration, and, more recently, our ground-breaking work on OpenCRVS, a software platform for rights-based civil registration and vital statistics.

Central to the latter is the principle of openness. Recognising the need to create a global good that can be re-used and improved over time, OpenCRVS will be built on open source technology. The system will also be built on the principles of open standards and open architecture so that it can work with and complement existing systems of registration.

How we practise the Principles is further evident in our Free To Be crowdsourced city safety maps, which recently launched in five cities around the world following a successful pilot in Melbourne. At the heart of the initiative has been designing with the user, i.e. girls and young women.

“I was proud to be part of developing Free to Be because it’s designed by young women like me, for young women, to help make our streets safer,” said Alice Rummery, a university student who helped co-design the Sydney city safety map for Plan International Australia. “I don’t want to have to change my behaviour so that I’m not harassed. I want decision makers, authorities and men to act.”

Digital Principles with a gender lens

Our endorsement of the Principles for Digital Development is a statement of how we intend to use tech for good. But we also intend to give back to the Principles community by looking at the Principles through the lens of gender.

Given the digital gender gap, a key question for us is not only how we can use tech for good, but how we can and must use tech to further gender equality and bridge the digital divide. This involves not just designing with the user, but designing with girls and women; not just understanding the ecosystem, but also its gendered dimensions; not just being data-driven, but recognising that there are significant gaps when it comes to the availability of gender-disaggregated data.

Gender inequalities in the real world are reflected in the digital. So while Plan International doesn’t have the revolutionary high tech of Wakanda with which to make the world a better place, we do have expertise and insights on patriarchal structures and how to break these, both online and off. And that’s something the kingdom of Wakanda could learn from too.

Getting Past the Hype & Practical with New Technologies


This morning I spoke at the USAID Digital Development Forum 2018 in Washington, DC. Below is my speech in full:

Hi Nora, writes one of our senior execs.

I’ll be talking about our use of digital technology at an upcoming meeting. Can you tell me how we are using blockchain in our work?

Of late, these types of queries have popped up in my inbox relatively frequently. The blockchain hype has entered the mainstream, and everyone wants to be seen using it, or at least testing out how it could be used. And why not? Last December an iced tea company changed its name to Long Blockchain, and its stock price skyrocketed. A lot of people don’t know what blockchain is, but they’re pretty sure they want to be in on it.

The hype is on in international development too: Perhaps if we allowed people to donate in bitcoin, we’d be raking it in. Or we could bank the unbanked. Provide transparency in land registration and agricultural supply chains. Provide identities to those without.

In an op-ed for Devex last year, blockchain was described as “the single most disruptive technology for the international development sector as we understand it today.”

My answer to the executive who wrote to me, however, is no. No, we are not using blockchain in our work on girls’ rights. Anywhere.

It’s not that we’re not using blockchain because I and others don’t see its potential. I do. And I’m aware of the work UN agencies are doing in spearheading its use to improve food security and reduce remittance costs.

However, there are many organisations – including mine – who are not using it because we’re still struggling to see the tangible use cases that would bring added value beyond the initial experimental hype. It’s not yet clear how heavily we should be investing in it right now: there are genuine concerns about piloting new technology on extremely vulnerable populations, in situations where we don’t have the right to fail. Also, not all new technologies fulfil their disruptive promise: how many of you still regard 3D printing as the next big thing in international development?

While we may be interested in, even excited by, the dots on the Gartner hype cycle, many practical gains and returns on investment still come from “older” technologies.

There is a slight tendency to talk about using mobile phones in development as something that’s already been done, like it’s DVDs when everyone’s moved to streaming. Yet many of our organisations still have so much to gain and so much to learn from using mobile phones.

Last December, Plan Finland launched Sheboard, a predictive text app that empowers girls by suggesting words such as “clever”, “brave” and “powerful” after phrases such as “I am” or “my daughter is”. This was described as revolutionary.

Data collection is still not universally digital, and not everyone even has access to a mobile phone. In fact, women globally are 14% less likely to own a mobile phone than men.

We also have a lot to learn about making use of data we collect, as well as ensuring the data is stored securely and privately.   

Anyone who has studied development theory will be familiar with the phrase “hand over the stick”, coined by Robert Chambers. As we think about using technology – new or otherwise – in our work, we also need to be mindful of handing over the tech. Nowhere is this more true than when it comes to artificial intelligence, or machine learning. We all know Silicon Valley has a diversity problem, but what also needs to be recognized is the impact this has on how, and what, machines learn.

AI has several potential applications in international development – from chatbots to natural language processing.

But if machines are programmed to learn from the current status quo, then we risk them reproducing existing power relations, from gendered stereotypes through neo-colonialism to inequality.
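This dynamic can be made concrete with a minimal sketch: a model that simply learns the most frequent label in its training data will faithfully parrot whatever skew that data contains. The corpus and the 9:1 ratios below are invented for illustration, not drawn from any real dataset.

```python
from collections import Counter

# Hypothetical "historical" training data: (role, pronoun) pairs with an
# invented gender skew, standing in for a biased real-world corpus.
corpus = (
    [("engineer", "he")] * 9 + [("engineer", "she")]
    + [("nurse", "she")] * 9 + [("nurse", "he")]
)

def predict_pronoun(role: str) -> str:
    """Return the pronoun most often seen with this role in the corpus."""
    counts = Counter(pronoun for r, pronoun in corpus if r == role)
    return counts.most_common(1)[0][0]

print(predict_pronoun("engineer"))  # the model reproduces the skew: "he"
print(predict_pronoun("nurse"))     # "she"
```

Nothing in the model is “prejudiced”; it is simply optimising against data that encodes the status quo, which is exactly why the composition of training data – and of the teams who curate it – matters.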

That’s why we in international development increasingly need to work on getting knowledge  into the hands of the people we work with, and empower them to develop their own tech. The future is digital, and if the majority of mankind is not engaged in creating that digital future, we’re in real trouble.  

The three things I want you to take away from this talk are these:

  • watch blockchain, experiment with it where it makes sense, but don’t think that it’ll be the sole source of digital development, added value and competitive edge in the future.

  • expand your use of mobile phones and other familiar technologies and do it ever better and more responsibly

  • create opportunities for the people you work with to access, use, and create technology: HAND OVER THE TECH.

Algorithms are not impartial

The world's leading start-up convention, Slush, took place in Helsinki in late November. With the tagline "Nothing normal ever changed a thing", this year the fair sought to highlight the social impacts of technology.

I attended Slush with colleagues from Plan International and people from the SmartUp Factory innovation hubs in Ethiopia and Uganda.

The technology gender gap

The world's first algorithm intended for execution by a machine was written by a woman. Her name was Ada Lovelace, and she published it in the 1840s. At the time, she (as a woman) was a rarity in the world of science.

And despite her and many other women's contributions to the sector, the fact remains that 200 years since Lovelace was born, a significant gender gap persists in the world of science and technology. This is a problem.

The barriers surrounding girls' and women's access to and use of digital tools and technologies are well-known. Less attention is given to the challenges facing women as creators of the same tools and technologies, and the impacts this might have.

Without girls' and women's perspectives, we risk creating tools and solutions that reproduce and perpetuate existing gender inequalities – as well as fail to address issues and challenges girls and women face.

This is not merely a hypothetical risk. Already, women looking for a job online are less likely to see targeted ads for high-paying roles than male counterparts. Why? Because the algorithm was designed that way. 

Artificial intelligence learning gender stereotypes

A 2016 study by the University of Virginia and University of Washington found that artificial intelligence (AI) systems are more likely to label people who are cooking, shopping and cleaning as women, and people who are playing sports, coaching and shooting as men.

Does AI want to keep women by the stove? In an interview for The Guardian, Joanna Bryson, a computer scientist at the University of Bath, said: "A lot of people are saying this is showing that AI is prejudiced. No. This is showing we're prejudiced and that AI is learning it." 

And while humans have the potential to counteract learned biases, the algorithms AI is based on may be unable to do so, and may instead continue to reinforce and even amplify existing biases, including gender-stereotypical social norms.

In fact, they may be specifically designed to be that way: fighter robots are designed as 'male', while robots intended for the care and service industries are given 'female' characteristics.

And who makes these algorithms and AI? Predominantly white men. There are many implications of machines learning predominant biases and stereotypes, from labelling black defendants as more likely to reoffend than white defendants, to providing gender-biased translation across languages. 

The below image shows how Google Translate 'translates' the gender-neutral Finnish pronoun "hän" into either he or she depending on context.

[Image: Google Translate rendering the gender-neutral Finnish pronoun "hän" as either "he" or "she" depending on context]

The marginalised and vulnerable are usually at the receiving end of prejudiced AI.  Without more women – and people from diverse backgrounds in general – as creators of technology, we risk perpetuating existing inequalities.
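The translation behaviour described above can be sketched with a toy "statistical translator" for "hän" that picks whichever English pronoun co-occurs most often with the surrounding words. Real machine translation is vastly more complex, and the co-occurrence counts below are invented purely to illustrate the mechanism.

```python
# Invented co-occurrence counts: context word -> (seen with "he", seen with "she").
# These numbers are assumptions for illustration, not measured from any corpus.
cooccurrence = {
    "doctor": (90, 10),
    "nurse": (5, 95),
}

def translate_han(context_word: str) -> str:
    """Pick the English pronoun most often seen near the context word."""
    he_count, she_count = cooccurrence[context_word]
    return "he" if he_count >= she_count else "she"

print(translate_han("doctor"))  # "he"  – the corpus skew decides
print(translate_han("nurse"))   # "she"
```

A system built this way is not making a judgement about who can be a doctor; it is echoing the frequencies in its training text, which is precisely how historical bias slips into seemingly neutral output.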

What does this have to do with Slush?

At the convention, Plan International launched the keyboard app Sheboard, which harnesses the power of predictive text to boost girls' self-confidence and remind us to talk to and about girls in a more empowering way. We brought young women from Finland, Uganda and Ethiopia to not only experience Slush itself, but also bring their own views and experiences to the convention. 

The group also had the opportunity to meet with established women in tech, including the impressive Rumman Chowdhury who leads Accenture's work on ethics and AI.  

Technology, algorithms and AI are, in themselves, neither good nor bad. It's how and what we use them for that matters. And by working to increase the number of female tech creators, we can increase the chances of machines working with – as opposed to against – us to achieve gender equality.

If a woman had been to the moon


This post was originally published on the Vihreät Naiset blog on 11 February 2018.

As a child, I had two goals in life: to be the first woman on the moon and to win an Olympic gold medal. Young me cannot be accused of lacking ambition. Fast-forward some three decades, and I have achieved neither. Olympic gold now feels out of reach, but my first goal is still technically possible: no woman has yet been to the moon. In fact, nearly 90 per cent of all the people who have been to space have been men.

According to the latest PISA results, Finnish girls outperform boys in science – they are even the second best in the world. After the results were published, Helsingin Uutiset ran the headline: "Girls now steamroll boys even in a masculine field". Yet the same PISA results show that Finnish girls are not interested in pursuing the technology and science fields perceived as masculine: only just over a fifth of the employees of Finnish technology companies are women. This is a problem.

In October I took part in the Women in Tech event, which aims to promote women's participation in technology. While the event itself is important, I was left annoyed by some participants' insinuation that women are better than men and that women's share in technology should therefore be increased. To me, this argument falls flat: women are no better than men, any more than men are better than women. It is not about "superiority". It is about the fact that gender does not determine competence, and that different perspectives bring richness.

"So what?" a teenage boy asked me a couple of weeks ago, when I was running a workshop on the subject at a school. What does it matter that there are not as many women as men in technical fields? There are many answers. One relates to the career opportunities of girls – that is, future women – in technical fields. Research shows that entrenched gender stereotypes continue to undermine girls' confidence in their own abilities in, for example, physics, chemistry and computing. Preconceptions about girls' and women's competence in technical fields also affect their career progression; in the IT sector, for example, women typically occupy lower-level administrative and office roles.

Gender imbalance is also reflected in how we see and experience the world, and in what we know about it. According to a 2011 survey, around 90 per cent of the editors of English-language Wikipedia are men. Wikipedia also has more, and longer, articles about men, and there are differences between the biographies of women and men in the database: in biographies of women, a large share of the content relates to the woman's family, gender and relationships. Positive expressions recur more often in biographies of men, while negative expressions are more common in biographies of women. Since Wikipedia has over 32 million users, what is – and is not – said there matters.

I want to believe that my daughter, not yet a year old, can grow up to be exactly who she wants to be. Research shows, however, that by preschool age she will probably have absorbed many harmful gender stereotypes, including the idea that astronauts are men. We must consciously dismantle these stereotypes, and work to ensure that girls and women have equal access both to using and to developing technology and digital solutions, now and in the future.