Debugging the tech industry: a talk about you and me

I was very happy to be invited by the lovely organisers of JSConf Australia 2016 to close this conference with a first version of this talk (you’ll find the video here).

More than seven months later, in June 2017, I got to go back to Melbourne to give a similar talk – this time 60 minutes long though (😱) at the Software Art Thou meetup. This gave me a chance to revisit the topic after some time had passed, and after I’d had more time to think about it. The talk you find below is the result – it’s a talk about the reason why we’re in tech. It’s also my most personal talk so far, and the sum of everything I’ve talked about over the past few years.


Trigger warning: mention of harassment, abuse, rape, death.


Why are you here? —

Why am I here? —

I’m here because I got here, to Melbourne, for the first time in my life almost 7 months ago, and I was on that plane because I’d submitted a talk to a conference named JSConf Australia, and the organisers invited me to come. And I’d submitted this talk because I deeply care about the status of the tech industry. And the reason why I care so much about this – that’s something that I’ll tell you more about later. — In the end: a talk abstract that I submitted on August 1st, 2016, at about 11am CEST: that’s the reason why I’m here today.

— Sometimes, when I’m in the middle of things, I like to pause for a moment, look around me, and ask myself: why am I here? Especially when things are busy, when I’m stressed, things are intense, I have a million things to do, my mind is racing, and everything feels hazy. — Why am I here?

I’ve found this question incredibly useful. It helps me think back to what I set out to do and remember where I came from, it helps me see clearly again and focus on what really matters – and it helps me act accordingly. Many times, where we are right now is the result of decisions we made many, many years ago. But we change, our needs change, our perspectives change. Other times, we just forget what we set out to do.
Thinking back to why we started can help us find our way out of being lost in detail; help us see the big picture again, and can help us understand what really matters.

This is a talk about the tech industry. And today, I want to ask us as an industry this very same question: why are we here?

This is why we are here. — This is an abacus, a calculating tool that first appeared about 4,300-4,700 years ago. Abaci, also called counting frames, and similar tools were used for calculations – addition, multiplication, and even more complex operations like calculating square roots.
There were many different versions of abaci, including Mesopotamian, Egyptian, Persian, Chinese, Japanese, Korean, and Native American versions, and many more. Even though they’re thousands of years old, abaci are still used in many countries today. And we’re also here because of this:

This is the Antikythera mechanism. It was recovered over 116 years ago from a shipwreck off the Greek island of Antikythera. It was used to predict astronomical positions and eclipses of heavenly bodies for calendrical and astrological purposes. It’s a complex clockwork mechanism composed of at least 30 meshing bronze gears. It could also track the four-year cycle of athletic games which was similar to the cycle of the ancient Olympic Games. And: it’s an analogue computer.

Why is any of this relevant? Why look at what are now only rusty devices from ancient history? – In the rush, in the haze of our daily work, we easily forget about this. But this is where we came from. Our information technology today has its origins thousands of years ago. –

Ever since, a few things have changed a little – the information technology industry has become one of the largest industries worldwide. This is why, over the next hour, we’ll talk about some things that are applicable to software specifically, and about others that are about technology in a broader sense. —

Worldwide revenues for information technology / IT products and services are forecast to reach between 2 and 3.5 trillion US$ in 2017. About 2 billion personal computers are used worldwide. About 2.3 billion people use smartphones. Around 3.6 billion people or about 40% of the world population have internet access now, and humans globally spend an average of almost two hours per day on social networking sites like Facebook. Software is running on everything from smartphones to computers, to washing machines, TVs, coffee makers, juicers, alarm clocks, thermostats, smoke detectors. About 20 billion devices are part of the so-called internet of things at the moment, and their number is expected to double over the next five years. And software systems lie at the heart of modern decision making – they answer whether we get credit or a phone contract, determine how much our insurance will cost, and decide whether we’re treated as potential criminals.

Technology has become an important part of our lives – even if we’re unaware of it, or don’t want it. And therein lies the answer to the question of why we’re here: Even though our devices today are shinier and faster than they were only a few years ago, and even though they will be even shinier and faster tomorrow: Information technology has always been about information: storing, retrieving, manipulating, communicating, and using it – to make decisions, and solve problems. But at its very core, information technology has always been about people: helping people, supporting people, enabling and empowering people. Ultimately: people are the reason why we’re here today.

And our technology today can have a positive impact on people and their lives. It can be a great tool to help us understand and satisfy our human needs, and there are many people doing great things with it. And there are some things about it that are just outright cute – and I had to show you this one from just a week ago:

The Portland Guinea Pig Rescue often gets a bunch of new guinea pigs in at once, so they need to generate large numbers of guinea pig names quickly. This is why they asked research scientist Janelle Shane to build a neural network – to name guinea pigs. Shane trained the neural network on a list of more than 600 guinea pig names, tweaking it along the way.

And the neural network created a few names that were probably not so fitting for guinea pigs, like Moonyhen, Me, Fleshy, or Boooy (yes, with 3 o’s). But ultimately, it delivered – some really guinea-pig-like names like Popchop, Pugger P, Fuzzable, Snifket, Hanger Dan or, one of my favourites, Princess Pow.
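If you’re curious how a machine can learn to invent names at all, here’s the core idea in miniature. Shane used a neural network; the sketch below swaps in a much simpler character-level Markov chain, which illustrates the same principle – learn which characters tend to follow which in the training names, then sample new sequences. The tiny name list and all parameters here are made up for illustration:

```python
import random
from collections import defaultdict

# Toy training data -- Shane's real list had over 600 names.
NAMES = ["Popchop", "Fuzzable", "Snifket", "Princess Pow", "Hanger Dan",
         "Peanut", "Biscuit", "Nibbles", "Waffles", "Pumpkin"]

ORDER = 2  # how many previous characters the model looks at

def train(names, order=ORDER):
    """Count which character tends to follow each n-character context."""
    model = defaultdict(list)
    for name in names:
        padded = "^" * order + name.lower() + "$"  # start/end markers
        for i in range(len(padded) - order):
            context = padded[i:i + order]
            model[context].append(padded[i + order])
    return model

def generate(model, order=ORDER, max_len=12):
    """Sample one new name, character by character."""
    context, out = "^" * order, []
    while len(out) < max_len:
        ch = random.choice(model[context])
        if ch == "$":  # end-of-name marker: the name is complete
            break
        out.append(ch)
        context = context[1:] + ch
    return "".join(out).capitalize()

model = train(NAMES)
for _ in range(5):
    print(generate(model))
```

A neural network does the same thing with far more context and nuance, which is how it gets from “Boooy” to “Princess Pow” – but even this toy version will happily produce new, vaguely guinea-pig-shaped names.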

We all know that technology can make great things possible, and can really help people – and even help name guinea pigs. But we need to be careful. – Hypes easily distract us from issues. Lack of critical distance makes it harder to see problematic aspects. And the ubiquity of technology increases its impact. This is why we need to critically examine technology. We’re often hesitant to do this, but it’s important – because while we’re pretty good at celebrating the positive sides of technology, its negative side is one we rarely think about. Over the next minutes, I’ll show you some examples of the negative impact of technology, particularly software, on people.

When Apple first introduced its HealthKit app, you could track everything there: your body measurements, fitness, nutrition, sleep, vitals – but not your period. It took Apple one year to introduce period tracking for HealthKit. As a study from last year showed, virtual assistants like Apple’s Siri, Google Now, and Microsoft Cortana redirected users to help when they said they had a heart attack. But at this point, none of these assistants were able to recognise what it meant when a user said that they were being abused. Abuse disproportionately affects women – 1 in 3 women worldwide is abused at some point in her life.

Over the past years, social networks have been enforcing so-called “real name” policies – “real names” usually meaning “legal names”. Originally, “real name” policies were introduced to help reduce harassment and stalking – but they didn’t achieve that. Rather to the contrary: these policies actively harmed marginalised and endangered groups who most heavily rely on pseudonyms: like women, non-binary people, LGBTQ people, people affected by abuse and harassment, human rights activists, Native Americans and many more. These policies have even exposed domestic violence survivors to their abusers again.


You may have heard of Pokémon Go, a game that became super popular last year. It’s an augmented reality game that uses phone cameras and GPS to show Pokémon in real environments – and the point of the game is to walk around and catch these Pokémon.
The original locations for these Pokémon were crowdsourced. Pokémon could initially be found in areas where walking around to catch them is not necessarily appropriate: like cemeteries and graveyards, the 9/11 memorial, the Holocaust memorial museum, or the Auschwitz memorial. For many people, the game was also pretty pointless to begin with: in areas where most residents are minorities, as well as in rural and low-income areas, Pokémon were hard to find. Even worse though: We live in times of police brutality and racial profiling – and multiple black players have worried that they will face racial profiling while wandering around playing the game.

Most apps and web technologies nowadays have built-in features that enable humans to interact with each other. For social networks like Twitter and Facebook, this is even their core use case. Most apps with such built-in features for human interactions have become greenhouses for harassment. 40% of adults on the internet have experienced online harassment. And this widespread harassment and abuse have made many important technologies unusable for many people: numerous people of color, queer and trans people, activists, and many more had to leave social networks and other platforms because the harassment there literally made them sick.

Technology can be a contributing factor to societal issues: Businesses like Airbnb are contributing to a growing housing crisis in cities like New York, London, and Berlin, by removing rental stock from already tight housing markets.

At the end of last year, around 6,000 people from more than 100 countries submitted photos of themselves for an international beauty contest – which was judged by artificial intelligence. The submissions were pretty diverse – but out of the 44 winners of this beauty contest, almost all were white, a few were Asian, and only one person had dark skin.


Face detection software in cameras and webcams can warn people when they blink in portrait photos – and it also asked Asian people if they were blinking when they actually had their eyes open. Other camera face recognition software just failed to even recognise that it was presented with a face – when it was a black person’s face. Flickr’s photo-tagging algorithm auto-tagged the photo of a black man with the words “animal” and “ape”. Google’s photo app classified images of black people as gorillas.

When Microsoft launched its artificial intelligence chatbot Tay a year ago, the bot was supposed to learn human speech patterns. Within only a few hours, the bot started posting racist, sexist, homophobic, fascist tweets, and harassing other Twitter users – and it had to be shut down less than 24 hours after its launch.

Software is also used to assess the risk of recidivism in criminals. A recent investigation found that one of these tools was twice as likely to mistakenly flag black defendants as being at a higher risk of committing future crimes. It was also twice as likely to incorrectly flag white defendants as low risk. Other software is used to assess whether people who are in jail can get parole or not. Many of these tools are closed source – which gives defendants and their attorneys no chance to understand why decisions were made, and no chance to challenge the results.

These are the examples I showed you over the last minutes. – They’re examples of software that enables harassment or perpetuates societal issues, as well as software that’s insufficient, or even racist, sexist, or dangerous. All these examples are about very different applications with very different use cases. And they have to do with different aspects of software and technology in a broader sense: some are related to artificial intelligence (AI), others to security or policy aspects, others are more about game design and the effects of crowdsourcing, or the design of human interactions online. – But these examples are just a few of many, many more. These examples aren’t isolated incidents – they point to bigger problems. And they have one thing in common:

while intended to be great tools for many people, they ended up only serving *some* people – like the HealthKit app that allowed people to track important vitals, except for people who get their periods. Or like the social networks that enable people to stay connected to others and stay up-to-date – as long as they do all of this under their “legal” name, and as long as they’re willing to put up with harassment. – And these are only a very few examples of what our software does these days. Yes, there’s software that does great things for people, empowers and enables them. But: there are far too many, far too bad examples of the harm that we do as an industry.

That’s why these harmful examples are only a symptom: a symptom of the status of the tech industry. – The tech industry is a system producing broken output. It’s a system that’s failing people, instead of serving them. And we urgently need to do better than this. We urgently need to do right by people.

In order to understand how we can do this, I would like to dissect with you where this systemic failure of the tech industry is coming from.

The tech industry as a system doesn’t exist in isolation: We live in patriarchy, a social construct that enforces gender roles and is oppressive to humans. Our societies are also capitalist systems that have led to wealth and income inequalities. And we live in racist societies that centre upon the false belief that white people are superior to people of other racial backgrounds. – Together, patriarchy, capitalism, racism, as well as sexism, ableism, classism, transphobia, homophobia, and many more, AND their intersections make our societies oppressive systems. And these oppressive systems influence each of us, they influence our beliefs, our view of the world – and they influence this industry as well.

Software development is an act of representation. And another reason why we as an industry are failing people has to do with this representation, and the makeup of the tech industry. And there’s a great example from the automotive industry to illustrate that point. –
Historically, automotive product design and development was largely defined by men. In the 1960s, dummies for crash tests were modelled after the average man – his height, weight and stature. – So seat belts were designed to be safe for men. Not for women, and even less so for pregnant women. As a study from 2011 found, women drivers are 47 percent more likely to be seriously injured in a car crash. – How does this translate to the tech industry?

Around 80% of engineering staff at major tech firms like Twitter, Google and Facebook are men, and between 60 and 95% of all their staff are white. In Europe and Australia, 70% of people working in tech are men. The numbers of people of colour, LGBTQ* people or non-binary people in tech are incredibly low.

The numbers in Open Source are even worse – 90 to 96% of all contributors are men. The numbers of women of colour, black women, queer people, non-binary folks and other groups in tech are still incredibly low across the board. The latest Open Source Survey that GitHub published a few weeks ago speaks of 1% non-binary people and 3% women in OSS.

The diversity of our societies is not reflected in the makeup of the tech industry. The tech industry is a homogeneous system. This misrepresentation is one of our most fundamental problems as an industry. It means we’re really good at building solutions – but only for people who are very much like us. And it means that, the more diverse our user base is, the more detached we become from them, and the worse the effects of this misrepresentation get. This representation gap is showing in our work, and the problems it causes are becoming more and more severe. –

Our algorithms learn by being fed data, often chosen by engineers, and our systems build a model of the world based on this data. This has strong effects: If a system is trained on photos of people who are overwhelmingly white, it will have a harder time recognising nonwhite faces. But at the same time, we still believe that our algorithms are unbiased machines. As long as we hold on to this belief, we risk reinforcing the status quo. — And Leigh Alexander described this effect as follows:

“When bias appears in data, it even seems to suggest that historically disadvantaged groups actually deserve the less favourable treatment they receive. Unless the data itself can be truly said to be ‘fair’, an algorithm can’t do much more than perpetuate an illusion of fairness in a world that still scores some people higher than others – no matter how ‘unbiased’ we believe a machine to be.” –

This way, sexism, racism, and other forms of discrimination are being built into these worlds we create, worlds which then shape how humans are categorised and advertised to. This way, through our software, histories of discrimination can live on in our digital platforms.
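To make the mechanics of this concrete, here’s a deliberately tiny sketch of how skewed training data alone produces skewed outcomes – no malicious code required. Everything in it (the two groups, the one-dimensional stand-in for “photos”, the 95/5 split) is made up for illustration, and a simple nearest-neighbour classifier stands in for a real recognition model:

```python
import random

random.seed(0)

# Toy stand-in for a biased training set: each "photo" is a single
# feature drawn from a group-specific distribution. Group A gets 95%
# of the training examples, group B only 5% -- mirroring a photo set
# that is overwhelmingly of one group of people.
def sample(group):
    centre = 0.0 if group == "A" else 1.0
    return random.gauss(centre, 0.6)

train = [(sample("A"), "A") for _ in range(950)] + \
        [(sample("B"), "B") for _ in range(50)]

# "Training": a 1-nearest-neighbour classifier over the biased data.
def classify(x):
    return min(train, key=lambda ex: abs(ex[0] - x))[1]

# Evaluate on *balanced* test sets and compare error rates per group.
for group in ("A", "B"):
    test = [sample(group) for _ in range(1000)]
    errors = sum(classify(x) != group for x in test)
    print(f"group {group}: {errors / 10:.1f}% misclassified")
```

Run this and group B – the group the system barely saw during training – is misclassified far more often than group A, even though the classifier itself contains no explicit bias. The bias lives entirely in the data we chose to feed it.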

Right now, many threat scenarios around technology that are discussed by industry heroes and media focus on topics like the future of artificial intelligence, and machines becoming smarter than humans. – And such scenarios are worth discussing, BUT: for many people, software is ALREADY discriminating against them. For these people, threats through technology are not some distant future scenario – they are already impacting their lives, and they already have very real consequences.

– And not only have we built software that systematically ignores people and their needs. We have built software that actively endangers marginalised persons. – We have normalised racism, sexism, homophobia, harassment, and many more issues. We have created new realities in which all these things are the new status quo. And that’s a serious problem.

As the relatively young industry that we are, we have accumulated a massive amount of influence within a very short time. Our work has global impact, while, at the same time, we lack answers to fundamental questions about the implications of our work. – We have become too powerful and influential, and we need to start living up to the responsibilities that come with this power. And we have to start doing right by people.
This takes me to another big reason for why this industry is broken. And there’s a story to this one.

Whenever we hear stories, it’s not only the language-processing parts of our brain that are active: stories also activate the areas of our brain that we would use to actually experience the events of the story ourselves. Stories can plant ideas, thoughts, and emotions into our brains. For thousands of years, telling stories has been one of our most fundamental communication methods as humans.
— And here’s my story. It’s the story of why I am here today.

I got into the tech industry rather coincidentally. After working in finance, and while starting to work as a writer, I got my first writing job in a tech company, which soon turned into a tech marketing & account management role. Later, I started contributing to Open Source projects. I co-founded my first tech company, became a CEO, joined a different company, and this is how I got into the engineering management role I have today.

This is one version of my career in tech so far, the version that would fit well onto a résumé.

But there’s another side to this. — This other side began when I started meeting other women in tech, we started talking, and at some point, they started telling me their stories. These were stories of having lots of fun exploring technology, stories of building super cool things, stories of awesome side projects, stories of learning, and stories of finding community. But each of these women I talked to also had other stories – the kinds of stories that often only came up a few hours into conversations. These were stories of horrible hiring processes, and stories of not finding jobs. Stories of not being taken seriously by coworkers, of less qualified men being promoted over them. Stories of being paid less than equally qualified men. They were stories of microaggressions, sexism, and stories of harassment at work. Stories of constantly being asked to do unpaid diversity work for the company on the side, and stories of being pushed to do emotional labour. They were stories of online harassment, stories of being driven out of Open Source communities. And many of them were also stories of doubt, of wondering whether it was their own fault that they didn’t succeed.

When I started hearing these stories, I was only a short while into my first job in the industry. And I had no idea. — I’m a white, able-bodied woman, with what’s socially considered acceptable height and weight. I have a ton of privilege. And as privilege does, my privilege had, so far, shielded me from such experiences, *and* my privilege had allowed me not to care about issues other women experienced.

Fast forward seven years. Now I, too, have stories like these women I met back then. Some of my stories are small, others big, and they still also speak of my privilege, which shields me from many issues in this industry. – But together, all of them make for a collection of stories.

Almost every person who’s a member of an underrepresented group in tech has such stories – every woman of colour, every non-binary person, every black woman, every queer person, everyone else who’s not like the majority of people in tech. All these stories are stories of having to prove yourself, your worth, your abilities, over and over and over again. They’re stories of being in an industry that keeps on telling people like us that we don’t belong here.

Many people have risked their careers to tell their stories – because they wanted to make things better for other folks in this industry. And because they wanted to make this industry better.

And for every story that we hear, there are 100 stories that aren’t told: because people can’t afford to risk their careers for telling their stories, or because telling the story could make things even worse. – Or because the people who were involved are too powerful, are industry heroes, and could harm them. Right now, people in tech have to carefully weigh the ramifications of speaking up. And then, for every story that’s out there, and for every story that wasn’t told, there are another 100 stories that we’ll never hear either, because the people who could tell them have left the industry.

The quit rate of women in the tech industry is 41% – more than twice the rate at which men leave this industry.
For many members of underrepresented groups, the question “so, what will you do after tech?” has become a sort of sad running joke – we laugh about it, because it’s funny, but also because it’s too true, and if we weren’t laughing about it, we’d probably have to cry.

In spring 2015, tableflip.club was released, and this manifesto is still online. It gave words to what many women and other members of underrepresented groups in tech have experienced for decades, and I’d like to quote a few excerpts from it:

“For years, we thought it was us. That we were failures. We thought that if we just did twice as well as the pasty hoodie-wearers around us we’d move up through the ranks too. Instead you got twice as much work out of us than you did out of our male peers, and tossed us a few scraps of ‘women’s networks’ and ‘Lean In Circles’ instead of promotions and raises. … We’ve watched mediocre men whiz by us on a glass escalator, including in the part of tech companies which include a disproportionate number of women – roles that get dismissed as ‘pink collar’ such as marketing, HR, and QA. We’ve had our work torn down in code reviews and performance reviews, while our male peers back-pat each others’ bad work onwards to the next production incident. … We’re following in the footsteps of brave women who’ve flipped tables out of our way, clearing the path we’re now walking down. … 2015 is the year of the tableflip.”

And this takes me to the third part of this story, which has become my story: it’s the story of how I became a feminist. I wasn’t born as a baby feminist who then turned into a full-grown feminist one day. There was a point in my life when I decided I wanted to become one, and that I wanted to work on being an ally to others who experience different kinds of oppression. I had experiences, I heard stories, and I learned: from the women, the black women, the women of colour, the non-binary people, the queer folks. I learned from those who are still here today, and from those who were here long before me, and who made things easier for us who are here now. I learned from many people in this very industry, some of whom have already been around for 20, 30 years. And I will always keep on learning from all of them.

All of this: this is the story of my time in the tech industry. And this is the reason why I’m here today.

Underrepresented people in tech have been telling their stories for many, many years. And all these stories are incredibly important, they’re powerful, and we should be grateful for them. And here’s the thing: each of these stories by itself is one piece. But the difficulties each of us faces in tech aren’t just the problems of one of us as individuals, or one team, or one company. These problems are systemic. We get punished by this system. And the sum of all our stories makes for a picture of the tech industry that’s much less shiny than our polished hardware and the pretty interfaces of our apps. — It’s a picture of the tech industry that shows its systemic issues. It’s the picture of an industry that tells us to “lean in”, and pushes back when we try. It’s an industry that desperately wants to get more of us into its “pipeline” – a pipeline that leads into a toxic environment.

Even worse: For a long time, we have not only had these stories – we’ve also had the data. We’ve had all the numbers showing us the low, in some countries even declining, numbers of women graduating in computer science over the last *decades*. We’ve had the low numbers of women working in tech companies. We’ve had the numbers from Open Source projects telling us that barely anyone except men is contributing to these projects. But things have only gotten worse over time.

And right now, all these stories are not more than, well, stories, for the majority of people in the tech industry. Right now, underrepresented groups in the tech industry are still, — underrepresented. Right now, white, able-bodied, cis, heterosexual men still have a near monopoly on the power and money that keeps the tech industry up and running.

I have given talks like this one about the state of the tech industry for a couple of years now. After every one of these talks, some people would approach me, and I got to have many great conversations with folks. But there were also other reactions – particularly (quote):
1) “I don’t think that what you’re saying is true. I just don’t think things are like this.”, and:
2) “I’ve been in the tech industry for 10 years and I have never seen anything like this happen.”

I understand these reactions. First of all, because I can empathise with them – for many of the people who approached me with such comments, that was the first time they’d heard about the bad sides of this industry. Many of them were genuinely surprised. And I understand their reactions because seven or eight years ago, I might have reacted similarly. Thing is: Almost everyone who was so surprised to hear about the bad side of the tech industry was a cis white man.

Many of us members of this industry have huge privilege. This privilege shields us: our privilege means there will always be experiences we’ll just never have. And our privilege means we have the luxury not to listen, and the luxury not to care about the experiences of people who are different from us – no matter if these people are in the tech industry and members of our communities, or if they’re users of our technology. With our privilege, we’re a living example of the classic “but it works on my machine” or “it works in my life” – our privilege allows many of us to never hear about, let alone experience, the bad sides and the harmful impact of this industry. – Do you remember the examples from earlier, the examples of the harm that technology does to its users?

All these examples disproportionately affect groups who are also underrepresented in the tech industry. The majority of people in tech are not affected by the majority of problems in this industry. And the majority of people in tech are not affected by the problems that this industry is causing for our users. — We don’t know because we don’t have to know. Because we don’t experience these issues ourselves, and we may never. So we don’t care, because we don’t have to care.

From its beginnings, information technology has always been there to serve people, support people, help people. Now, I would like for us, as an industry, to make this the core of our work again. I would like for us to center our work on people. Because, in the end, people matter more than anything else.

We need to care about people to do right by them. I want for us, as an industry, to care. And I want for us as an industry to change. For this change to be possible, every one of us needs to choose to care. Every one of us has an impact on our companies, our communities, and on this industry. No matter who you are in tech, or what you’re working on: you can make change happen. And I want to ask you to help debug this industry. This is why debugging the tech industry is about you and me: debugging this industry comes down to you and me. And debugging the tech industry is about people – like you, like me, but also about people who are very different from you and me.

So how do we change this industry? And how do we make people the center of our work?

Two main things: We need to change this industry and its inner workings – and the way we treat people in tech. And we need to change the way we treat our users. These two can’t be separated. — We can only build meaningful technology if we care about the inner workings of this industry and about the users of our technology. And there are many things we can do to make people the center of our work.

Any meaningful change needs to start with understanding ourselves, and our position in the world. We need to examine ourselves and our privileges. We need to understand what it is that we don’t know, and don’t see.

We need to stop assuming that everyone has what we have, and that everyone can go through life as relatively carefree as many of us can. – The default person in the world is not a cis white heterosexual able-bodied man in his 20s who has received an education, doesn’t have financial worries or physical or mental health issues, and doesn’t have to be concerned about his safety online. This is a person whose identity and existence have never been questioned, a person who’s never been attacked for who he is or for who he wants to be with. — This is not the default person in the world. But we still build much of our software as if that were the case.

And our privilege is not only about identity and our realities and life experiences, it’s also about access: Not everyone has the money, electricity, high-speed internet access, expensive devices, the same education, knowledge of technology, and so many more things that many of us are used to. We need to understand that what’s safe and easy for us is likely not safe and easy for others.

This is why we need to work on understanding others better, and the best way we can do that is by shutting up and listening. — And: We need to listen to members of underrepresented groups in tech. “Listening” means that we need to actively search for the voices of those we haven’t heard before. And it’s our responsibility to educate ourselves on topics that have shaped our past and that will shape our future, and the future of technology – like social injustice, inequality and systemic oppression. Educating ourselves is our responsibility; it’s not the responsibility of marginalised people to educate us.

We need to end the homogeneity of this industry and work on becoming more diverse. — The main idea of diversity work is to give access to resources, power and representation to groups that don’t have them. Diversity efforts are initiatives to correct systemic inequalities. Diversity is a political issue. And our diversity work needs to be intersectional: it needs to be aware that different people face different forms of oppression and discrimination on more than one level, and that different layers of oppression are intertwined and compound each other – and we need to act accordingly. Our diversity must mean we include a broad range of people: like women of colour, non-binary people, black women, lesbian, bisexual and gay people, trans people, older people, disabled people, people with physical or mental illnesses, people who do care work, and many more. THIS is the diversity that we need.

But – right now, this industry’s culture is still toxic, particularly for members of underrepresented groups. This is why, in order to change this industry in meaningful, long-lasting ways, we need to fix our systemic issues as an industry – and work hard on making our organisations, companies, and communities more inclusive. This is a topic that’s very near and dear to me. There are many more things to be said about inclusion than I can say today. So here are a few pointers to what you can do in your company or community. —

Inclusion is a multifaceted topic, and these 12 points I mentioned are not more than a few pointers. Inclusion is *never* “done”, or “ready”. People change, our realities change, our lives and circumstances change, and as communities and companies, we need to constantly adapt to these changes. Inclusion is work, and it’s very, very important work. Inclusion is what makes all the difference. It will help us make this industry a better place in the long term – for the people in this industry, and for the users of our software. – But we also need to start changing the way we treat our users right now. –

Our users are not single data points or test scenarios – they’re humans, and as such, they carry their entire life experiences with them when they use what we’ve built – and so do we. The environments we create in our software will affect the way our users feel and the interactions we’ll have with them. So we need to view our users’ interactions with our software as a complete personal experience. This is why empathy is our responsibility, and it’s a skill we need to practice every day. Empathy helps us feel and understand the emotions, circumstances, thoughts and needs of the humans around us and our interactions with them. Learning empathy and practicing it in our work will fundamentally change our interactions with our users, and with other people in this industry.

We need to build consent into our technology. Consent is an “active process of willingly and freely choosing” the activity consented to. It requires that the person giving consent is informed, so they can make meaningful choices. We need to enable our users to enthusiastically consent. This means we need to stop with the pre-checked boxes that trick users into subscribing to our newsletters; it means stopping deceptive linking practices, and stopping the software rollouts that opt users into new features automatically, with only complicated ways for them to opt out. It means we need to stop making it difficult for people to protect their personal information, it means we need to stop disregarding privacy concerns – and it means we need to stop the other ways we currently trick users.

Consent is an equality issue. We need to emphasise consent in what we build, and we need to build tools which restore our users’ ability to set and enforce their own boundaries.
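To make “enthusiastic consent” tangible in code: here’s a minimal sketch of consent-respecting defaults, assuming a hypothetical signup flow – every optional data use starts switched off, and is only switched on by an explicit, informed choice. All names and fields here are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentSettings:
    # Every optional data use defaults to OFF -- the opposite of a
    # pre-checked box. Nothing happens unless the user chooses it.
    newsletter: bool = False
    analytics_tracking: bool = False
    share_data_with_partners: bool = False
    choices: dict = field(default_factory=dict)  # audit trail of consent

    def opt_in(self, setting: str, informed: bool) -> None:
        """Record consent only when the user made an informed choice."""
        if not informed:
            raise ValueError("Consent requires an informed, freely made choice.")
        setattr(self, setting, True)
        self.choices[setting] = "explicitly granted"

    def opt_out(self, setting: str) -> None:
        """Opting out is always exactly as easy as opting in."""
        setattr(self, setting, False)
        self.choices[setting] = "withdrawn"

settings = ConsentSettings()
settings.opt_in("newsletter", informed=True)  # user ticked the box themselves
settings.opt_out("newsletter")                # and can withdraw just as easily
```

The point isn’t this particular data structure – it’s the design stance it encodes: defaults that protect people, consent that is recorded rather than assumed, and withdrawal that is as easy as agreement.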

And we need to make ethics part of our approach to our work on technology. Ethics are moral principles that govern our actions. Considering ethical aspects in our work means thinking past current trends and anticipating every future utilisation of what we build. Ethics in practice are a very complex topic, and that makes it even more important for us to make room for thinking about them. — As a starting point, I’d like to ask a bunch of questions over the next minutes. I don’t and can’t have the answers for these questions – I want to encourage you to think about them, and to discuss them with your coworkers and fellow community members – and I want to encourage you to find answers together with them.

All of these are important questions, and they’re only starting points to spark conversation. So I want to encourage you to ask them – ask yourself, ask your coworkers, and think through what your work does to people, and what risks you may be exposing people to.

Working on software in the year 2017 is more than making some designs or typing code into an editor. We are enablers – enablers for empowerment of people, and enablers for harassment and abuse. We’re not neutral, our code is not neutral, our technology is not neutral. Our work touches on the fundamentals of people’s identities and their lives. Our work is highly political. Each of us, with every button we design, with every line of code, with every word, with every action: each of us takes a stand.

But, most importantly, this is a talk about people – people like you and me, and about people who are not like you and me. This is why I want to encourage you to challenge and change the way you approach your work. I want to encourage you to critically examine this industry and your position in it.

And this is why I want to encourage you to be someone who cares. Care about your actions & their impact. – And: care about people, because people matter above all else. Care about people who are like you, and about people who are different from you. Care about the people inside, and the people outside of this industry. Because people are the reason why we’re here.

In an essay about the impact of activism and resistance, Rebecca Solnit wrote the following about hope:

“Optimism assumes that all will go well without our effort; pessimism assumes it’s all irredeemable; both let us stay home and do nothing. Hope for me has meant a sense that the future is unpredictable, and that we don’t actually know what will happen, but know we may be able to write it ourselves. Hope is a belief that what we do might matter, an understanding that the future is not yet written. It’s informed, astute open-mindedness about what can happen and what role we may play in it.” –

Earlier in this talk, I talked about the stories many of us have collected from our time in this industry. If you are such a person with a story: I wish you specifically that you can have this kind of hope – this hope that knows we may be able to write what will happen ourselves – this hope for change. And I want to encourage all of you to work on making this change happen with us. And whenever you go back to your work in tech, may it be tomorrow or next week – I want to encourage you to keep on asking yourself: why am I here?

Thank you for having me, and good night.