Safety, Justice, Compassion: Shifting the Tech Paradigm

Hi, I am so excited to be here at Web Directions Code.

The talk I'm about to give is Safety, Justice, Compassion: Contributing to the Ethical Tech Paradigm Shift.

My name is Eva PenzeyMoog.

I'm the founder of the Inclusive Safety Project.

I'm also a Principal Designer at 8th Light, which is a software consultancy and my pronouns are she and her.

So before getting into how we can contribute to the ethical tech paradigm shift, I want everyone to think for a second about what we even mean when we say the term "ethical tech".

What does that actually mean for us as technologists in a practical way?

What does it even mean to be ethical?

Because there are so many different things going on right now.

So to understand how to contribute meaningfully to the ethical tech paradigm shift, I want to first talk a little bit about how paradigm shifts actually happen, using a real-life example.

And then I'll talk a little bit more about what it means to be ethical.

For that real-life example, I'm going to take us back to 1956, to a time when cars looked a lot like this. They looked super cool.

There's one big thing missing that all cars have now, which is seat belts.

We all know that seat belts are really important, but the actual statistic is that they reduce serious injuries in car crashes by half.

So a really big deal.

But back when they were introduced in the 1950s, no one really wanted them.

And this was despite the fact that automobile injuries and deaths were at an all-time high.

And this was a really big problem.

Part of the reason that people didn't want them was because of the cost.

They were an add-on feature that you had to pay extra for; in today's money, that was hundreds of extra dollars.

Also, fun fact: this is what a child's seat looked like in the 1950s. Or you had the option to sort of tie your child down in the seat while they stood, or you could let them ride right up front with you on a little booster seat, again without a seatbelt.

There were basically no rules.

Auto safety has come a long way since 1956, when only 2% of customers were purchasing seatbelts for their cars.

And we don't actually know how many of that 2% were even using them regularly.

Let's skip ahead to 1965, when along comes a young man by the name of Ralph Nader.

And some people in the audience might think this name is familiar, especially if you're in the US. And yes, this is the same Ralph Nader who has run for president of the United States multiple times through the Green Party.

I didn't know this, but he's actually a really amazing activist, and he basically laid the groundwork in the United States for modern consumer protection.

He wrote this exposé called Unsafe at Any Speed.

And it was all about how car manufacturers were prioritizing profits over the safety of their users.

This might sound familiar because tech is doing the exact same thing, right now.

The book covered not just the absurdity of not including seatbelts in cars as standard, but also things like how tire pressure was actually calibrated to prioritize comfort over safety.

He talked about how some cars had tires that could not even properly bear the weight of a fully loaded vehicle.

He talked about dashboards with bright chrome surfaces that would reflect the sunlight directly into the driver's eyes.

There was also a lack of a standard shifting gear pattern.

So this meant that if you borrowed someone's car, or maybe you let someone else drive yours, and it was from a different manufacturer, you might shift into what you thought was Reverse, but because it had a different shifting pattern, you could actually be in Drive and shoot forward instead of back.

Ralph Nader also wrote about the fact that car designers ignored existing crash science, which did exist back then, but manufacturers weren't required to do anything with it.

He also wrote about the negative impact that cars were already starting to have on the environment.

If there's anyone here from Los Angeles, that's the city he focuses on, because smog from car pollution was already a big problem in the 1960s.

Nader also criticized automotive company leaders for refusing to prioritize safety because of the fear of making cars more expensive and "alienating users".

That was the term they used.

He also pointed out that the industry was running marketing campaigns to shift the responsibility for safety away from themselves as the automakers and onto the drivers, as well as the designers of roads.

Insisting that they bore no responsibility for the enormous amount of totally preventable car-related harm and death happening, they were basically saying, "Hey, this is not our problem.

This is actually just a problem of user education. Users need to understand things better, people just need to be better drivers, and then this won't happen. But it's not our fault."

Lastly, he had this call to action, which was that the federal government regulate the automotive industry and basically force them to do a better job at preventing all of this harm.

So this book became a bestseller in the United States.

And what followed was one of the most successful public health campaigns of all time in this country.

Nader's investigation contributed to the growing public outrage on this issue, and it led to Congress creating the National Highway Traffic Safety Administration, which is a part of government that we still have today.

It's responsible for things like reducing automobile death and injury; promoting the use of seatbelts, child seats, and airbags; helping states reduce drunk driving; setting and enforcing safety standards; and investigating safety defects in automobiles.

So let's move on to 1968.

It is three years since Ralph Nader's exposé came out and 12 years since seat belts first became available.

In this year, the National Highway Traffic Safety Administration created a law that all vehicles, except for buses, needed to be fitted with seatbelts as a standard feature, not something that came at an extra cost.

This was a really huge victory.

And you might think that this is where the story ends, but it isn't.

People were still just against the idea of seatbelts because of the supposed inconvenience.

They thought that seatbelts would trap you in your car if you drove it into a lake, and that they might do internal damage to your organs when they prevented you from flying out of the car during a crash.

Some people even thought that it was safer to be launched at high speed away from the vehicle during a crash than to have the seatbelt keep you in your car.

This was despite the fact that scientists had done research, basically disproving all of these worries.

But even back then, a lot of people were choosing not to follow the science and medicine.

And it's important to note that car companies were required to have seat belts in the cars as a standard, but there was no law saying that people had to actually use them.

In fact, a lot of people responded to seatbelts back then the way some people are responding to masks today: claiming that they're this incredibly awful thing whose mild inconveniences far outweigh their life-saving benefits.

By 1983, not much had changed.

Just under 15% of Americans reported consistently buckling up.

New York state decided that they wanted to do something about this and in 1984, they passed the country's first law that required that everyone in the car wear seatbelts.

Auto fatalities decreased drastically in New York and the rest of the states decided that they were going to pass their own laws and did so throughout the 1980s and 1990s.

Almost every state that is.

There is still one state in the US that does not require adults to wear seatbelts, only children, and that state is New Hampshire.

And it's really sad, because a lot of people there choose not to wear seatbelts, and a higher percentage of people there die in traffic accidents.

But even with New Hampshire aside, every state drastically increased the percentage of people wearing seatbelts after the seatbelt laws passed, with the result that by 2019, the national seatbelt use rate in the US was just over 90%.

This is at the point where I think we can say that the paradigm shift is complete.

From car manufacturers in 1956 charging extra for seatbelts, being allowed to self-regulate when it came to safety, and only 2% of people using seatbelts; to now, when seatbelts are a standard feature, the government forces the auto industry to comply with all sorts of safety standards, and the majority of people wear seatbelts without even thinking about it.

And just to recap: seatbelts were introduced in 1956, activism around them started in 1965, and 1997 is when the last state passed its seatbelt law, which marks the end of the legal paradigm shift.

Although the cultural paradigm shift, I think, continued for much longer, as kids who grew up in the eighties and nineties with seatbelts as the norm became today's young adults who wear seatbelts without even thinking about it.

All told though, that is three decades of activists, academics, regular people, and politicians working together to make this paradigm shift happen.

So what does this have to do with us and the tech industry?

I'm sure that you've already drawn a lot of conclusions and made a lot of connections between the auto industry before 1968 and the tech industry today.

But I want to address some specific parallels in our quest to understand more about the ethical tech paradigm shift so that we can then understand how to plug into it.

In both instances, we have massively powerful industries choosing to prioritize profits over the safety of their users.

With the automotive industry, eventually public outrage and activism led the government to create new regulations and force them to comply.

With tech, we also have company leaders prioritizing profits over the safety of their users.

And there is growing public outrage and a healthy amount of activism, but the government has yet to pass meaningful laws.

Second, in both cases, there is an enormous amount of totally preventable harm going on.

And just as back then, company leaders now are choosing not to take action.

Third, like I mentioned earlier, the auto industry put a lot of work into marketing campaigns to shift the responsibility of safety onto the user.

And I can't tell you how often this exact thing comes up in my own work, which is focused on designing for safety and against interpersonal harm, especially domestic violence.

People will say, well, it's the user's responsibility to understand how the product works and how someone might use it against them.

If you're going to use an app or a piece of tech, you should know what's going on.

If you don't want to use it, you shouldn't, as if that's always a realistic option.

And if someone wants to use it against you, it's your responsibility to understand how that might happen and how to regain control.

It is a lot to put on people, especially those who may not have a high level of tech literacy.

This is a screenshot of a list of ways that people surviving domestic violence can stay safe online.

And it includes things like how to delete your browser history.

And checklists like this are all over the place.

There's a lot out there to help domestic violence survivors stay safe when it comes to their tech.

And I'm not saying that this is a bad thing, resources like this are absolutely essential for people in domestic violence contexts.

Especially when they're making a plan to leave and need to understand how their abusers might be digitally surveilling them.

And after they leave, when it's paramount that their abuser isn't able to track them down.

But my question is, why is this the only solution that we have?

Why do we put this responsibility on the victims to do this work?

Shouldn't we be tackling the problem at the source instead?

And that's what designing for safety is all about: fixing the digital side of these problems at the digital source.

But going back to this idea of safety being the user's responsibility, I want to point out that this is a very specific tactic that is being used.

This is something that company leaders within tech are doing to intentionally shift blame and responsibility away from themselves, and onto the end user.

This is a tactic that powerful industries have always used in an attempt to not take responsibility for the harm that they're causing and to avoid regulation that will hurt their bottom line.

Fourth, both the pre-1968 auto industry and today's tech industry have little to no government oversight and are failing miserably at self-regulation.

Tech basically gets to regulate itself now in most ways, and we all know how that is going.

Lastly, there's organized activism and growing discontent from the general public.

Public opinion is quickly shifting away from Big Tech.

People are becoming very negative towards it, which is a good thing.

And activists are working really hard to hold company leaders accountable and to educate other people about the awful things that are going on.

This combination led politicians to eventually take notice of the auto industry in the 1960s and to regulate it.

And there are encouraging signs that the government is beginning to take action on certain aspects of the tech industry today, both in the US and in other countries.

It's because of all this that I say that tech is in a pre-seatbelt phase. The seatbelts are there, and a lot of us as individuals and as teams are working really, really hard to make sure that our products are ethical.

Which I think means safe, just, and compassionate, and which I'm going to talk more about in a minute. But the reality is that a lot of products out there are still incredibly harmful.

We don't yet have laws mandating that we make the seatbelts of tech a regular part of the process and the product, and those who lead the industry's companies are continuing to choose to prioritize profits over the tech equivalent of seatbelts that would keep users safe.

So with all this in mind, let's take one more look at the timeline of the paradigm shift of seatbelts.

This time with the additional framing of what's going on in the general public, which is what you see at the very bottom.

I'm going to give you a second to just look at this slide before I talk about it.

Okay, so I've broken this into three sections.

The first one on the left is when there is a lot of harm, there's some activism, but the greater public is overall pretty indifferent.

Second, the phase in the middle when activism increases and public outrage happens and public opinion begins to shift, which causes politicians to take notice and to draft new laws.

And then the final stage on the right is when the combination of activism, laws, and culture surrounding the topic coalesces into actual behavior change.

The key insight from this is that paradigm shifts are totally possible and they do happen, but they require a sustained effort over a long period of time.

The other insight is that it's very important to focus on specific goals.

So Ralph Nader was not the only person working hard as an activist around auto safety back in the 1960s; there were a lot of different people doing important work on this, but his book is a really useful example of the very specific demands that activists had for auto safety.

It wasn't just saying cars need to be safer; it laid out a whole bunch of specific issues as well as specific solutions.

So going back to this question of what we mean when we say "ethical tech": now that we understand paradigm shifts a little bit, let's spend some time understanding what we're actually talking about when we use that term.

Usually we mean something to do with tech products, but it can also have to do with the tech industry itself, which is why I think that there are these sort of two main areas of focus.

Within each of these areas of focus, there are three different issues, safety, justice, and compassion.

I think that usually when we talk about "ethical tech", we're talking about one of these three things.

And because, like I said, it's very important that we are specific with our goals instead of just saying that, you know, tech needs to be more ethical, I want to spend a few minutes breaking these down and talking about what actually constitutes each of these six areas.

I'm going to go through every single ethical tech issue that I've been able to identify.

And if you have ones that I don't mention, definitely contact me to let me know, because I'm sure there are things that I've not covered in these lists.

So the first is the safety of tech products.

This includes things like using tech for stalking and technology-facilitated domestic violence, which is where I focus my own work.

Image abuse, which is sometimes called revenge porn, although that's not quite the best term because revenge isn't always part of it, and porn implies some level of consent.

Invasive surveillance, which usually involves surveilling domestic partners, children, elders, and workers.

Cyber-bullying through text, social media, and other digital platforms, as well as anonymous harassment, like what we see on Twitter and other social media, and threatening and doxxing people's identities and personal information.

Next is justice.

Issues of justice within tech products include tech that harms the planet, like Bitcoin mining, to name one, and data harms, which include things like what sort of data gets collected and about whom.

So for example, in Chicago, where I live, there's this really harmful thing called a Gang Database and there's zero transparency about it.

It includes a lot of people who are actually in no way affiliated with gangs, but they have no recourse to get themselves off of this list.

Then there are racist, sexist, and otherwise oppressive algorithms, such as predictive policing that weights a Black man as more likely to commit a crime than a white man with the same history.

Exploitative design practices, such as bringing end users in to create solutions to problems and then packaging up those solutions to sell back to them, without that group or community receiving any power or any of the funds that came from the project; they're simply exploited for their knowledge.

And this is especially harmful when it's some type of vulnerable community.

There are "social good" projects that do more harm than good, which we see a lot in terms of designers or different groups with good intentions going into a poor community or a poor country, and attempting to understand a problem and design a solution for it.

And then leaving before the project outcomes are fully understood.

And a lot of times this isn't actually helpful and can be harmful.

And then finally there's harmful disruption, such as Airbnb, which has contributed to many cities becoming unlivably expensive for the people who actually live there.

And then lastly, in this area, there's compassion and tech products.

So the first is cruelty in advertising and promotion, such as showing a woman who has just had a miscarriage ads for diapers.

Hurtful copy. An example of this is Twitter's original messaging when a user was over the character limit: it said something like, "try to think of something more clever and make it shorter."

And this might be funny if you were in the right mood, but if not, it would just be hurtful.

Failing to design for stress cases.

So for example, Eric Meyer and Sara Wachter-Boettcher's book Design for Real Life talks about this and defines a stress case.

A good example that they give is Home Depot.

They might be thinking about users who are there because they're excited to be doing a home renovation, but actually some users are going through things like "my refrigerator is broken, I'm about to lose a ton of food, and I need to be able to find an affordable refrigerator ASAP."

Then there's re-traumatizing users by doing things like showing unwanted content, such as related articles at the bottom of an article you're reading that show something really graphic, something that might re-trigger a past trauma.

Next is disallowing control over what is seen.

So an example of this also comes from Design for Real Life, where Eric Meyer talks about how he tragically lost his young daughter.

And then Facebook continued to show reminders that it was her birthday long after she had died, and there was no way that he was able to stop it.

And then lastly, there's secretly experimenting with users' emotions, which is another issue that comes from Facebook.

Some people might remember that in 2014, Facebook came under fire for this, because they did experiments where, without letting users know it was happening, they would take all of the happy things out of a person's feed and then study what that person posted to see if it got sadder, which it definitely did.

So then we get into the Tech Industry.

The first issue here is tech safety.

This includes things like workplace harassment, assault and abuse.

Next is HR teams who protect the company over the people, especially victims of the things in that first issue.

And then there's unsafe working conditions, such as what we see at Amazon in their warehouses, where there's been regular documentation of all sorts of really horrible safety issues, from overheated spaces that don't have air conditioning, to the story of the woman who was denied lighter duties while pregnant, still had to carry heavy things, and had a really tragic late-term miscarriage.

Then there are issues of justice in the tech industry.

These include issues with inequitable hiring, retention, and promotion, in which certain powerful groups get an easier time being hired, retained, and promoted.

Toxic and exploitative work cultures.

Unjust worker compensation, where again we can look at Amazon for an example: some employees get a much higher share of the enormous profits than others. Poor or no healthcare. Failing employees with accessibility needs, which we're really seeing right now as a lot of companies, at least in the US, force their employees back to the office even though COVID is still raging, after showing for a while now that they can totally accommodate people's need to work from home.

And then there's education that is reproducing existing oppressions in the industry.

So for example, things like science departments in universities, where male professors are allowed to continually harass female students without repercussion, which is pushing women out of the STEM industries before they can even enter them.

And then there's the issue of compassion within the tech industry.

There are probably more things that could be on this list, but the issues that I've identified include a lack of agency and worker burnout, which we're seeing in really huge numbers right now.

And also just simply failing to see human beings before seeing employees.

So here's a list of all of those topics that I just went through.

I'm betting that a lot of people at the conference are already very interested in one of these, are learning a lot about them or are possibly even already contributing to meaningful work.

So just take a few seconds to think about the topic that you're interested in or that you're already working on, and think about where it falls in the general timeline of a paradigm shift.

Is it still in the research phase, where we're just learning about it and identifying that it's a real problem?

Is it more in the education phase, where we're trying to get the word out and let people know that this is happening? Are laws starting to be passed, but now we need more laws that will clarify them? Or is behavior actually starting to change?

So I think at this point it might help to look at a specific example from that list of problems and see where it's at within the paradigm of shifting tech to be ethical.

So let's look at the issue of racist algorithms.

So this traces its roots back to 1986, when a doctor at a medical school called St. George's, which is in the UK, created an algorithm to help with admissions.

And his goal was actually to make the admissions process fair and to weed out human bias.

So he wrote this algorithm.

A few years later, a bunch of staff members were very concerned about how little diversity there was in successful applicants, and there was an inquiry into the algorithm.

They found all sorts of issues.

The process of giving and taking away points from a candidate weighed things like the applicant's name.

And if they had a non-European name, they were docked 15 points.

The algorithm also designated either Caucasian or non-Caucasian based on the applicant's name and their place of birth.
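
To make this concrete for a developer audience, here's a minimal, hypothetical sketch in Python of how a rule-based score like that can encode discrimination directly in its logic. The field names, structure, and numbers other than the 15-point penalty are invented for illustration; this is not the actual St. George's program, which was never published in this form.

    # Hypothetical illustration only, not the real St. George's code.
    # It shows how a "neutral-looking" rule-based admissions score can
    # bake discrimination directly into its weights.
    def admissions_score(applicant: dict) -> int:
        score = applicant.get("academic_score", 0)  # looks objective
        if not applicant.get("european_surname", True):
            score -= 15  # the non-European-name penalty described above
        return score

    # Two applicants with identical academic records get different
    # scores purely because of their names.
    print(admissions_score({"academic_score": 80, "european_surname": True}))   # 80
    print(admissions_score({"academic_score": 80, "european_surname": False}))  # 65

The point of the sketch is that nothing in the code announces itself as discriminatory; the bias lives in which inputs get weighted and by how much, which is exactly why it took an inquiry to surface it.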

So the school was found guilty of discrimination, but they didn't really face any actual consequences.

And it wasn't until 2016, a whole 30 years later, that activism around the issue had a sort of breakthrough moment that started to reach the general public, which was ProPublica's piece demonstrating that criminal prediction algorithms were racist and had a bias against Black people.

There had been articles and warnings about this for decades, as well as some meaningful studies in the early 2000s.

But it's around this time that I think the average person who doesn't work in tech started to get exposed to the problem of biased algorithms.

It was two years later that Joy Buolamwini and Timnit Gebru published their paper demonstrating that Amazon's facial recognition has a far higher error rate for dark-skinned women than for light-skinned women.

And this is where we're at now in 2021.

In the US something called The Algorithmic Justice and Online Platform Transparency Act has been introduced into Congress.

It hasn't yet passed, it's in committee, but it is exciting because this is a pretty legit law and activists are, overall, really happy with it.

So next up in the timeline, what has to come next, or hopefully will come next, is the first laws being passed, then additional laws that help clarify those laws and bring about better change.

And then behavior is actually changing.

But once again, I want people to take note that it's been about 30 years from when the issue was first identified to the present day, when there's some real momentum going on.

And I also want to point out that at this point where there's some momentum starting to really happen, this is when opposition starts to really ratchet up.

So first off, Timnit Gebru was actually fired from Google, as many people probably already know, for raising awareness about this issue in a way that Google didn't like.

There's also a lot of pushback from Big Tech companies.

We've seen that there's an industry group right now, which includes companies like Amazon, Facebook, Google, and Twitter, that is trying to make sure that this law doesn't pass.

They're saying that making their algorithms more transparent will provide a roadmap to hackers, Russian trolls and conspiracy theorists.

They say that actually what we should be doing to combat these problems that the algorithms are perpetuating is to expand our civil rights and discrimination laws in housing, employment, and credit.

Yes, we should absolutely do all those things, but this is a pretty transparent attempt to deflect attention away from the ways that their own companies are contributing to these problems.

And I think that we can definitely expect more pushback from company leaders who don't want to change their very lucrative way of working even a little bit, even if it means that their products will cause less harm.

A lot of people want to start their own thing, and I'm not saying that you shouldn't, but it's important not to center yourself as you work to help others.

You shouldn't start your own thing, just to start your own thing.

And if there's already a solid organization to get behind, you should join them.

Also always follow the lead of the people who are actually being impacted by the problem.

So if you want to work on the issue of biased algorithms, follow the lead of the Black women who are already doing that work, especially because they're the ones who are the most impacted and whose voices need to be centered.

I talk about this in my book, but there is a limit to empathy.

It's very important that we empathize, but it can't be a stand-in for lived experience. Always follow the lead of the people who have the lived experience of being impacted by the problem that you're trying to help solve.

And then think about where your issue falls on the timeline of the paradigm shift and use that to help inform the actions that you should take.

Maybe your issue is still in the education phase, where the public needs to be made aware of it.

And there just needs to be more content out there.

Or maybe it's even earlier than that and there need to be more studies that are actually proving that this is a problem and exploring its actual impact.

It's important to remember that laws are never ahead of the curve.

They're usually the cap on many years of activism.

So it's important to recognize that, yes, ideally laws would just come in and take care of the problem at the beginning, and we should be agitating for lawmakers to be more proactive.

But it's important to know that until public sentiment begins to really turn against a certain issue, the likelihood of politicians acting on it is very low.

And then I have a few pieces of advice.

The first is to be hopeful, but be realistic.

We know that change is possible, but we also know that it can take years and sometimes decades.

I don't think that a lot of these things are going to take 30 years like they did with seatbelts, because we're able to get information out a lot quicker now and educate people more quickly, but it is very real that it's going to take years and that change doesn't happen overnight.

So with that in mind, find your team, find an organization or a group to work on the issue with, or find others at your workplace who want to enact some type of change at your company.

Find a slack group, whatever it is, just find your people.

Nothing great happens alone.

And without a team for solidarity and support, when the going gets tough, it's going to be much harder to stay focused and continue on the work.

And then lastly, set a sustainable pace.

Take breaks.

This work will take years and we need to all be in it for the long haul.

That means being realistic about how much you can do.

Maybe you can rally your team at work to implement some big changes at your company.

Maybe you can volunteer a few hours a week with an organization, or maybe you truly don't have time for any of these things, but you can donate $10 a month to an organization of your choice that is doing important work.

And remember the Mark Zuckerbergs and the Jeff Bezoses of the world.

They are counting on us to not do this.

They're counting on us to get overwhelmed and to give up.

They're counting on us to not have the staying power that their highly paid attorneys and lobbyists do.

So don't let them win.

And then the last thing is to resist empty rhetoric around "ethical tech".

So if people are using this term, push them to be specific and define what they're actually talking about. Support initiatives that are taking specific action on issues and have specific goals, and push your company to do the same.

The issue with seeing "ethical tech" in this very general way is that there's no accountability, because there's no clearly defined problem and no clearly defined goals.

So help people around you understand this, help your leadership understand this, and if you see companies or people doing this in a way that is blatantly just an empty marketing tactic, call them out.

So I want to close with this quote from Arthur Ashe, who was a groundbreaking tennis star and social activist.

He said: "Start where you are. Use what you have. Do what you can."

Maybe your thing right now is just changing one aspect about how your team works.

Like I said, maybe you can organize with your coworkers.

Maybe you can plug into a group.

Maybe you can donate some money, but whatever it is that you can do, it is going to help.

Remember that it's a tactic of people opposed to meaningful change to make us feel overwhelmed, like the problem is just too big for us to solve.

But history has shown us that that is absolutely not true.

We totally can change tech for the better and contribute towards the paradigm shift to making it more ethical.

Thank you so much for listening to my talk.

Here is my contact information.

If you want to get in touch, [email protected], I'm on Twitter @epenzeymoog, and you can learn more about my work at inclusivesafety.com.

And of course, my book is Design for Safety, and it focuses specifically on issues of interpersonal harm and technology-facilitated domestic violence.

Thank you so much.

Safety, Justice, Compassion:
Contributing to the
Ethical Tech Paradigm Shift

Web Directions Code ‘21

Eva PenzeyMoog

Author of Design for Safety

The Inclusive Safety Project

inclusivesafety.com

Principal Designer @ 8th Light

[email protected]

Pronouns: she/her

Photo of Eva

What do we mean when we say “ethical tech”?

Screenshot of a New York Times article titled: "Thermostats, Locks and Lights: Digital Tools of Domestic Abuse"

Screenshot of the NYT article from the prior slide with an additional screenshot overlaid of an article from The Guardian titled: "Rise of the racist robots - how AI is learning all our worst impulses"

A third article screenshot overlaid from the Pew Research Center titled: "Young women often face sexual harassment online - including on dating sites and apps"

A fourth article screenshot overlaid from The Washington Post titled: "Racial bias in a medical algorithm favors white patients over sicker black patients"

A fifth article screenshot overlaid from a tech blog post titled: "The futures of many prison inmates depend on racially biased algorithms"

A sixth article screenshot overlaid from a CBS News article titled: "Microsoft shuts down AI chatbot after it turned into a Nazi"

A seventh article screenshot overlaid from a Wired magazine article titled: "More Facebook Privacy Woes: Gay Users Outed to Advertisers"

An eighth article screenshot overlaid from a BBC News article titled: "Covid misinformation on Facebook is killing people - Biden"

1956

Image of a vintage convertible car interior with no seatbelts fitted

Seat belts reduce serious injuries in car crashes by half

Source: www.cdc.gov/Motorvehiclesafety/seatbelts/facts.html

Image of a vintage 1950s print advertisement for the Ford v8 Customline automobile

Image of woman in a vintage 1950s print advertisement modelling automobile seat belts

Image of a vintage 1950s child's car seat adapter fitted to the interior of a vehicle. The 'seat' consists of a plastic and metal sling fitted to a passenger seat with a toy steering wheel attached and appears designed to boost the child so they can see out the window, but does not have a lot in the way of safety restraints

Image of a vintage 1950s Sears Catalogue print advertisement for an "Auto Strap for front-seat tots" featuring a toddler standing tied to a car seat via a strap attached to the child's clothing at one end and tied to the seat on the other

Image sketch of a vintage automobile interior with a mother in fine clothing in the driver's seat and a young child beside her propped on a booster seat. Neither occupant has a seatbelt and the car is not fitted with them

Black and white photograph of a man in a 1950s car driver's seat with a donkey behind him in the back seat

In 1956, only 2% of customers were purchasing seat belts for their cars.

Source: www.cdc.gov/Motorvehiclesafety/seatbelts/facts.html

1965

Black and white photograph of a young Ralph Nader standing on a freeway overpass which is busy with traffic below

Colour photograph of a middle aged Ralph Nader in front of a "Vote Green" campaign sign during one of his many Presidential campaign bids

Cover image from Nader's influential book "Unsafe at Any Speed - The Designed-In Dangers of the American Automobile"

Specific issues:

  • The safety feature of seat belts as a costly add-on rather than a standard
  • Tire pressure calibrated for comfort rather than safety
  • Bright, chrome dashboards reflected the sun into the driver’s eyes
  • Lack of a standard shifting gear pattern

Inset image of the Unsafe at Any Speed book cover

Specific issues:

  • Car designers ignoring existing crash science
  • The negative impact cars were already having on the environment

Inset image of the Unsafe at Any Speed book cover

Nader’s criticisms

  • Automotive company leaders refusing to prioritize safety over fears of more expensive cars and “alienating users”
  • The negative impact cars were already having on the environment
  • Marketing campaigns shifting the responsibility of safety onto the users

Inset image of the Unsafe at Any Speed book cover

Nader’s call to action: for the federal government to regulate the automotive industry to force them to prevent harm

Inset image of the Unsafe at Any Speed book cover

What followed was one of the most successful public health campaigns of all time.

Image of the National Highway Traffic Safety Administration (NHTSA) logo which features a square grid containing icons of a steering wheel, a pedestrian, a star, and a road

The NHTSA is responsible for:

  • reducing automobile death and injury
  • promoting the use of seat belts, child seats, and air bags
  • helping states reduce drunk driving
  • setting and enforcing safety standards
  • investigating safety defects in automobiles

1968

Image of 1968 safety signage indicating that Seat Belts are Required which was implemented after NHTSA passed legislation mandating that all vehicles except buses were required to be fitted with seatbelts

Black and white photograph of a man wearing a "Nader Was Wrong!" tee-shirt in protest of the seatbelt laws

Contemporary photograph of a sign with a facemask drawn on it above text reading: "The New Symbol of Tyranny", from a recent USA anti-mask rally in response to government mask mandates designed to curb the spread of Covid-19

1983

In 1983, less than 15% of Americans reported consistently buckling up.

Vintage postcard sending greetings from New York City

Image of 1980s safety signage promoting seatbelt use with text reading: "Seat Belts Save Lives: Buckle Up Every Time" alongside a graphic of a buckled seatbelt

Image of the grimacing face emoji

Image of a graphic promoting the US State of New Hampshire's motto: "Live free or die" with an image of a grove of trees. New Hampshire is the only US state that does not mandate adult seatbelt usage.

The national use rate for seat belts was 90.7% in 2019.

Source: https://www.nhtsa.gov/risky-driving/seat-belts

Paradigm shift: complete

The seat belt paradigm shift:

  • 1956: seat belts introduced
  • 1965: Ralph Nader’s exposé; public outrage
  • 1966: NHTSA created
  • 1968: Law required all cars to come standard with seat belts
  • 1984: New York passes the first law requiring people to wear seat belts
  • 1984 - 1997: all states except New Hampshire pass a seat belt law

Repeat image of the list from the prior slide with a black outline box enclosing the bullets from 1965 to 1984 delineating the cultural shift (as opposed to just the legal shift) in the way seat belts were viewed

It took over 3 decades of activists, academics, regular people, and politicians working towards a paradigm shift of seat belts being the norm.

What does this have to do with us?

Profits over user safety

An enormous amount of preventable harm

“Safety is the user’s responsibility.”

Screenshot of an online guide called: "Using the Internet More Safely" from domesticshelters.org (an organization who offer resources, guides, and information for people suffering domestic violence and abuse). The guide offers a list of ways in which users can safeguard their online activity regarding being digitally surveilled, specifically within domestic abuse contexts

“Safety is the user’s responsibility.”

Little to no government oversight & failing miserably at self-regulation

organized activism & growing discontent from the general public

Tech is in a pre-seat belt phase

Timeline delineating the three phases of the paradigm shift around how seatbelt safety protocols were adopted and enacted. The first phase from 1940 to 1965 is classified as: Some activism; the general public is indifferent.
This phase is marked by the following events:

  • 1940s-1950s: Research supports that using seat belts saves lives, but opposition remained fierce
  • 1956: 2% seat belt use
  • 1965: Ralph Nader releases Unsafe At Any Speed, exposing how auto companies were putting profits before safety. This propels forward existing activism on the issue.

Phase two from 1968 to 1984 is classified as: Public opinion shifts to outrage and activism increases, leading to new laws.
This phase is marked by the following events:

  • 1968: Law passed requiring all cars to be outfitted with seatbelts as a standard instead of at an additional cost.
  • 1983: 14% seat belt use
  • 1984: New York passes a law requiring seat belts be used; all states except New Hampshire follow

Phase three is classified as: Behaviors change
This phase is marked by the following events:

  • 1985: 21% seat belt use
  • 1990: 49% seat belt use
  • 2000: 71% seat belt use
  • 2010: 85% seat belt use
  • 2020: 90% seat belt use

Insight:
Paradigm shifts are possible, but they require sustained effort over a long period of time.

Insight:
A focus on specific goals is important.

What do we mean when we say “ethical tech”?

Two areas of focus:

  • Tech products

    • Safety
    • Justice
    • Compassion
  • Tech industry

    • Safety
    • Justice
    • Compassion
  • Stalking
  • Tech-facilitated domestic violence
  • Image abuse (“revenge porn”)
  • Invasive surveillance (of domestic partners, children, elders, workers)
  • Cyber-bullying
  • Anonymous harassment, threatening, doxxing

Two areas of focus:

  • Tech products

    • Safety
    • Justice
    • Compassion
  • Tech industry

    • Safety
    • Justice
    • Compassion
  • Tech that harms the planet
  • Data harms
  • Racist, sexist (etc) algorithms
  • Exploitative design practices
  • “social good” projects that do harm
  • Harmful “disruption” with a cost (Airbnb driving up rental costs)

Two areas of focus:

  • Tech products

    • Safety
    • Justice
    • Compassion
  • Tech industry

    • Safety
    • Justice
    • Compassion
  • Cruelty in advertising/promotion
  • Hurtful copy
  • Failing to design for stress cases
  • Re-traumatizing users
  • Disallowing control over what is seen
  • Secretly experimenting with users’ emotions

Two areas of focus:

  • Tech products

    • Safety
    • Justice
    • Compassion
  • Tech industry

    • Safety
    • Justice
    • Compassion
  • Workplace harassment, assault, and abuse
  • HR teams who protect the company over the people
  • Unsafe working conditions

Two areas of focus:

  • Tech products

    • Safety
    • Justice
    • Compassion
  • Tech industry

    • Safety
    • Justice
    • Compassion
  • Inequitable hiring, retention, and promotion
  • Toxic and exploitative cultures
  • Unjust worker compensation
  • Poor or no healthcare
  • Failing employees with accessibility needs
  • Education that reproduces existing oppression in the industry

Two areas of focus:

  • Tech products

    • Safety
    • Justice
    • Compassion
  • Tech industry

    • Safety
    • Justice
    • Compassion
  • Lack of agency, burnout
  • Failing to see humans before employees
  • Stalking
  • Tech-facilitated domestic violence
  • Image abuse (“revenge porn”)
  • Invasive surveillance (domestic partners, children, elders, workers)
  • Cyber-bullying
  • Anonymous harassment, threatening, doxxing
  • Inequitable hiring, retention, and promotion
  • Toxic and exploitative cultures
  • Unjust worker compensation
  • Poor or no healthcare
  • Failing employees with accessibility needs
  • Education that reproduces existing oppression in the industry
  • Cruelty in advertising/promotion
  • Hurtful copy
  • Failing to design for stress cases
  • Re-traumatizing users
  • Disallowing control over what is seen
  • Secretly experimenting with users’ emotions
  • Workplace harassment, assault, and abuse
  • HR teams who protect the company over the people
  • Unsafe working conditions
  • Inequitable hiring, retention, and promotion
  • Toxic and exploitative cultures
  • Unjust worker compensation
  • Poor or no healthcare
  • Failing employees with accessibility needs
  • Education that reproduces existing oppression in the industry
  • Lack of agency, burnout
  • Failing to see humans before employees

Timeline based on the 'seatbelt paradigm' with the first phase marked by Research and Education and correlating with some activism; but the general public is indifferent. The second phase is marked by first laws and additional laws being passed and correlates with public opinion shifting to outrage and activism increases, leading to new laws. The final phase is marked by Real change beginning to happen and the paradigm shift nearing completion and correlates with Behavioural change.

Specific ethical issue: Racist algorithms

Biased algorithms

Timeline tracing the history of biased algorithms through the three phases of the paradigm shift citing representative examples of bias and the attendant shifts in legal and moral consequences over a period of decades

The timeline tracing the history of biased algorithms is extrapolated out to the future in anticipation of laws being passed, correlating with the current ongoing phase marked by public opinion shifting to outrage and activism increasing, and through to the final phase of behavioural change. The timeline notes that we are currently only partway through phase two

Biased algorithms

Some activism; the general public is indifferent.
  • 1986: Bias is identified in the St. George’s admissions algorithm, leading to an inquiry. The school is found guilty of discrimination.
  • 2016: ProPublica demonstrates that criminal prediction algorithms are biased against Black people
  • 2018: Joy Buolamwini and Timnit Gebru publish paper demonstrating Amazon’s facial recognition has a far higher error rate for dark-skinned women

We are here:

  • 2021: The Algorithmic Justice and Online Platform Transparency Act introduced in Congress
  • First laws passed
  • Additional laws passed

Behaviors change

  • Real change begins to happen
  • The paradigm shift is nearly complete

Takeaways

  • Choose one of the many issues to focus on and lend support to the others.
  • Plug into existing activism
    • there’s almost always something already happening; don’t center yourself by starting a new thing
    • Remember to follow the voices of the people being impacted by the problem
  • Think about where the issue is in the timeline of paradigm shifts to inform your next move

Takeaways

  • Stay hopeful, but be realistic.
  • We know that change is possible, but that it can take decades for meaningful change to happen.
  • Find your team
  • Set a sustainable pace and take breaks; changing tech for the better is a marathon, not a sprint

Resist empty rhetoric around “ethical tech”

Start where you are.
Use what you have.
Do what you can.
-Arthur Ashe

Thank you

Eva PenzeyMoog

Book: Design for Safety

Learn about designing for safety services at inclusivesafety.com

Contact me:
[email protected]
twitter: @epenzeymoog

Inset photo of Eva's book