Inducing Ethics: Why the Internet Needs to be Regulated and How We Can Do It

(audience applauds) - Thank you, thank you, thank you, thank you. Yeah take, go ahead.

First, I just wanna start off, do a little classification, how many of you tried the brownies? Yeah, you're my people, okay.

That's classification.

Yeah, so thanks for the intro.

That was a great intro.

For what he didn't cover, my name is Joe Toscano.

I write for InVision, Adweek, Smashing Magazine.

I have also put this book together.

Formerly a consultant for Google, and now I have been on the road for 16 months giving talks like these, helping people understand why I believe we do need some regulation in the industry. Most of it comes down to, yes, I believe ethics are great, and I think we're headed on the right path, but there are also financial reasons that do not necessarily tie in with the ethics at this point, and I think that's why we discuss regulation as well as ethics.

So that's what we're gonna kinda cover today. I also started my own non-profit foundation, called designgood.tech.

The purpose of this, it's three parts.

It's one, to help the general public understand what's going on, because as we all know, we are people in the AI space, and even for us it can be tough. So for the general public, talking about these issues, demanding the right things from your politicians, demanding more from these companies is difficult when you don't even know what to ask, so it's helping them understand.

Helping technologists have the language, the research, the know-how to implement better tactics, and to sell that to their bosses and move it up faster.

And then lastly, kinda what I just said, how we can help policy makers understand this. Because I know for sure in the United States there's a lotta confusion, if you saw any of the Zuckerberg hearings or any of that. It's kinda over their heads, right? So I think we need to help them understand, otherwise we could end up in a position where we're regulated into really bad places. And yeah, that's my book, Automating Humanity. What I tried to do is take a really, really tough topic and illustrate it in a way that maybe hasn't been thought of before, bring it to life in a visual manner, and take all these invisible, intangible concepts and put them into something we can grasp.

So first we're gonna start off with, like, what the fuck is happening? I think we're all here for a reason. We all kind of have an idea of what is happening, but I'm just gonna skim some of the more negative aspects first, and then we'll talk about what the good parts are, because we need to talk about that too.

So up front, we have digital addictions.

This is a problem.

We can't run away from it.

It's the truth of the current moment in time. But it's getting addressed.

But to understand it, we have to see and realise that it has scientifically been proven to be changing our postures.

This can have impacts on our kids and future generations.

It also has immediate impacts on us and our emotional states.

We're seeing that screens are pushing us towards less sleep, because the blue light they emit reduces our melatonin and throws our circadian rhythms off. And we're seeing people losing purpose and at increased risk of depression at a large scale. Just so you know, I guess I didn't state it, but the grey line here is when the iPhone came out, 2007, when the mobile computing era began, something attached to our hips 24/7 that was pinging us and keeping us going. (coughs) Next is job loss.

This is just something that's gonna happen. It's undeniable at this point.

Automation is an incredible tool, it's an incredible economic tool too.

It's going to be a matter of economics that we move forward and jobs transform, not just disappear but transform. There will still be jobs, we'll figure it out, but we'll also have to deal with job loss in between. Why is that? Well, here's an example: how many of you have seen the warehouse robots at Amazon? Yeah, a good amount of you.

The big key here is that robots can work more than us, that's one, but two, they think differently.

So, for example, with the warehouse workers that we replace with robots here: a human thinks categorically.

So we might think, oh, the peanut butter is over in the food section and the brooms are over in the home section. But if Amazon sees with their algorithm that, oh, when people buy peanut butter they also buy brooms, you can just put them next to each other, because the robots think logistically.

So it saves them hundreds of hours over the year of walking across a warehouse.

It's just simple little things that add up to big pennies over time.
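To make that concrete, here's a minimal sketch of the co-purchase idea in Python, with made-up order data; the item names and the zoning logic are purely illustrative, not Amazon's actual system:

    from collections import Counter
    from itertools import combinations

    # Hypothetical order history: each order is a set of items bought together.
    orders = [
        {"peanut butter", "broom", "bread"},
        {"peanut butter", "broom"},
        {"broom", "dustpan"},
        {"peanut butter", "jelly"},
    ]

    # Count how often each pair of items shows up in the same order.
    co_purchase = Counter()
    for order in orders:
        for a, b in combinations(sorted(order), 2):
            co_purchase[(a, b)] += 1

    # Slot frequently co-purchased items into the same zone,
    # ignoring human categories like "food" vs. "home".
    zones, next_zone = {}, 0
    for (a, b), _ in co_purchase.most_common():
        zone = zones.get(a, zones.get(b))
        if zone is None:
            zone, next_zone = next_zone, next_zone + 1
        zones.setdefault(a, zone)
        zones.setdefault(b, zone)

    print(zones)  # peanut butter and the broom land in the same zone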

Then lastly, social engineering, surveillance, privacy concerns, all of that.

How many of you've seen the social credit score in China? We've kinda seen headlines already.

We all have our own opinions of that.

But consider that even if you're not under that system we rate each other on apps like Uber, Lyft, Yelp, Google, everywhere.

We already do it actively, and that's our own freedom we're determining. But what are the benefits? Because for all the fear mongering that is out there, there's also a huge amount of benefit to what we're creating, and there's a huge opportunity to create even more benefits long term.

So for one, amplified intelligence.

For example, how many of you remember maps before Google? (laughs) Right, ouch, right, that's terrible. But how incredible is it that... who remembers Google Earth in 2001? It crashed your computer, didn't it? Maybe? Now I have a Pixel 2, it's charging, because, you know, I didn't, right, but I have a real-time rendering of Earth as a wallpaper on my phone and it doesn't affect the performance at all. That's insane, how far we've come.

There's no better time in history to get around from point A to point B.

Safer than ever, more direct.

Now can I find the scenic route? Maybe not yet, there's no button for that, but maybe in the future.

Also consider how we make things.

You used to have to carve from stone or from clay or paint on canvas.

Not that we don't and not that it's not good too, I am a designer by trade.

I love arts.

But consider some of these, like Bishop from Autodesk, which is capable of taking in parameters and helping us design things that we never would have thought of before. What artificial intelligence is gonna do is amplify our minds, the same way steam amplified our physical abilities. Steam allowed us to go farther, to lift heavier weights, to do things at a faster rate.

Artificial intelligence will do this for our brains. This is an example of an algorithm that's given parameters and told to engineer a drone chassis.

We couldn't think of all of these maybe in months, and it can do it in a matter of seconds and that's pretty incredible.

Next point here, inclusive and accessible technologies. So we've had conversation design come up throughout the day. How many of you have ever heard of Soul Machines? They're based not too far from here, actually, over in New Zealand; they're now global at this point. I think they just recently partnered with IBM. It's really, really interesting technology. What they've done is render 3D avatars using a mapping of the human face, and it looks, acts, and reacts just like a human being. This is used with a New Zealand insurance scheme to assist people with their problems and their services, versus calling someone up and having to deal with dial tones or stay on the phone, holding it up here.

They've created a system, a chat bot, where, as you can see up here, the camera faces down and watches them to see if maybe they fell asleep in the middle of it, and it pauses the interaction.

It waits for them to wake up, versus a phone call that would hang up on them. Maybe they're feeling agitated, because they don't understand the system.

It will course correct and help them feel better about the experience. Also, if they can't hear, it's printed on the screen.
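As a rough sketch of that kind of adaptive behaviour, here's what the decision logic might look like; the signal names and actions are assumptions for illustration, not Soul Machines' actual implementation:

    from dataclasses import dataclass

    # Hypothetical signals the avatar reads from its camera and microphone.
    @dataclass
    class Signals:
        eyes_closed: bool
        sounds_agitated: bool
        spoken_text: str

    def next_actions(signals):
        """Adapt to the user instead of behaving like a phone line that hangs up."""
        actions = []
        if signals.eyes_closed:
            actions.append("pause and wait for the user to wake up")
        elif signals.sounds_agitated:
            actions.append("slow down and rephrase in simpler terms")
        else:
            actions.append("continue the conversation")
        # Accessibility: everything spoken is also printed on screen.
        actions.append("show caption: " + signals.spoken_text)
        return actions

    print(next_actions(Signals(eyes_closed=True, sounds_agitated=False,
                               spoken_text="Let's review your claim.")))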

There's a lot of things that we can do with technology to make it more inclusive and accessible to many, many people.

Also, think of self-driving cars.

How many of you, and this is gonna be skewed data, but how many of you would get in a self-driving car? Probably a good chunk, right? Okay, we're all in tech.

I give talks around the globe, I'm talking to non-technical audiences and probably not surprising, but there are many people who are scared outta their minds of getting in a self-driving car.

But something I show these people is this right here, this dot, which thank God we have a big screen, 'cause normally you can't see it. This dot right here represents your entire lifetime of driving experience, 60-plus years on the road; that's about 800,000 miles on average.

You can trust me because my parents own a body shop and I grew up around cars.

But this right here represents Tesla's machine-crowdsourced driving experience, gathered in a matter of 15 years.

So what I try to tell people is, it's just a matter of public safety that we're gonna have this happen.

Fewer people are gonna die, and cars are gonna flow through the streets more efficiently. And they're safer, right? Watch, this is an example for you right here. So I guess I don't have the audio for this, which is fine.

But basically if you listen to it, this is a Tesla that you're riding in.

And I'm gonna replay it one more time.

But what happens is, as you see the cars brake just ahead of this red car, the Tesla recognises an accident coming before the other car can even see anything. These cars have sensors that allow them to see things we can't possibly see. Once you have GPS connected to things and all the cars are connected to the internet, you're gonna be able to do this based on traffic patterns and things around you that you have no opportunity to actually see as a human with two eyeballs looking forward. Also, it's a generational thing.

Kids don't drive as much anymore.

They grow up with ride-sharing apps.

There's no need to get a licence.

And what we're going to see over time, is that smart cars, automated cars, they will phase out automatic transmissions just the same way that automatic transmissions phased out manual transmissions.

It's just a natural progression.

And lastly, ambient technologies are another huge opportunity here.

This right here is the Bullitt Center in Seattle. If you haven't heard of this, what it is is a fully sustainable building, net zero energy. There are solar panels built on top that allow it to supply energy to the building. The blinds and everything inside adjust based on the sunlight that's coming in, in order to heat and cool the building properly. It captures rainwater.

That allows them to have water flowing throughout the building that doesn't come from any other source.

Next is Big Belly Trash Cans.

How many of you have ever heard of Big Belly trash cans? It's like the plumbing job of the future, right, building a trash can.

But this is brilliant.

This team built a trash can that has a solar panel on top, wireless internet connection and a trash compactor inside of it.

And what this does is it monitors the level of trash in the can and alerts the waste management services to let them know when one is full and when it is not.

What this does is it allows them to change their traffic patterns, so they don't have to go pick up every single trash can on the route.

They can then change their route to just the ones that are full.

You save CO2 emissions, you save work and time with the business, you save a lot of different things, and it's a trash can.
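A minimal sketch of that routing idea, with made-up can locations and fill levels; the threshold and names here are assumptions, not Big Belly's real system:

    # Hypothetical fill-level reports from connected trash cans (percent full).
    cans = {
        "5th & Main": 92,
        "Harbor Park": 35,
        "City Library": 78,
        "Transit Center": 15,
    }

    FULL_THRESHOLD = 75  # alert waste management once a can passes this level

    # Only the cans above the threshold go on today's pickup route.
    pickup_route = [spot for spot, fill in cans.items() if fill >= FULL_THRESHOLD]
    print(pickup_route)  # ['5th & Main', 'City Library']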

And lastly, with ambient technologies like conversational interfaces and IoT, I mean, a lot of these things are gonna allow us to get away from screens, to break the addiction that we have installed at a global scale for profit, and that's awesome. So there's a lot of opportunity here.

But what I tried to cover in the book and what I kinda opened up with is we do need to talk about regulation, because regulation's coming.

It's not something we can hide from.

People are scared of it at this point and you're either gonna get regulation that comes from people who are terrified and don't know what they're talkin' about, or we're gonna kinda come out as a tech community and say yeah, we understand.

And these are the things that we need to start talking about and bring to life.

So what I'm gonna go through is a high level overview of the book.

If you have questions, please raise your hand. I would much rather this be a conversation than me speaking at you the whole time.

As enjoyable as I hope it is, I'd rather have you talking to me.

So, (coughs) let's do it.

First step, recognise data monopolies.

This is a really elusive thing to many politicians, and probably to many people in the world really. But the first step is to recognise that these are the largest empires in the history of the world, many of them, right? And now I have a lotta people who criticise me when they see this slide, 'cause they're like, Joe, you forgot the grey for Android and for Facebook.

No, no, no, no, just look closer.

That's what 20 years to two billion looks like compared to 2,000 years to one billion.

That's the disparity.

What we have are some of the largest organisations in the history of the world, that have popped up faster than anything we've ever seen before.

This is a maturing market, that simply just needs to be tamed down.

I don't believe that anything is done intentionally out of evil pursuit, but we have just moved faster than ever before, and it's just, we need to fix some things.

So how many of you know antitrust law? I don't assume any of you do, that's fine.

You shouldn't and that's okay.

I read up a lot about it in order to make this, but there's three main pillars you need to know about. One is keeping prices low.

One is allowing free trade.

And third is not destroying competition.

So first step, first step is lowering prices, and this is where regulators are really struggling right now, because traditional antitrust law will allow monopolies to continue to dominate and thrash competition if they're lowering prices, because low prices are considered best for the consumer. Now what we have in this industry is a lot of products that are free.

So how do you lower price any more? It must be good for the people.

But as we've seen and as we know, it is damaging us psychologically, so we're paying a new price in this attention economy. Attention has become the new currency, and how is that impacting us? Well, we know, and science has proven it.

The other thing we need to think about is that the way this market place works, you get more attention, which then turns into more data, which then turns into money.

This concept is fairly similar to money laundering in the way it's working right now.

Now that's not to accuse these companies of a crime, it is to say that we have an illegitimate currency in attention.

It's not bought and sold on the marketplace for $2.75 a gallon like gasoline.

It's not like a $20 bill that I can hand you and pass around that's worth $20 to everyone else. The companies who have set the marketplace values are defining it on their backend and it's a black box to us. That's a problem.

The last time this happened was the early 2000s in the financial markets, where things, again, got overcomplexified so the public wouldn't look into them, were manipulated by the companies that owned the data, and then we know what happened.

So we need to talk about this.

And then when it comes to price gouging, you have to consider that when they create addictive systems that are meant to drive attentional cost, AKA engagement, that is price gouging in the attention marketplace.

So if you think about the fact that they took the fiscal price to zero and they now have this attentional cost that's skyrocketing through the roof in this new attention marketplace, they are price gouging. And that's something to consider when you're building products too.

Engagement is the new cost, so how are you building that into your system? Next, elimination of free trade.

So eliminating free trade is a little bit easier to see, and we all definitely see it, but you just might not think of it this way. The elimination of free trade here is with data. Whenever we create a seamless digital ecosystem, something that keeps data within our own ecosystem, we are eliminating free trade.

Now, if you have any background in experience design, it is a better experience when you have that. It's seamless, things connect and they sync and it moves fluidly, but there's no doubt that that's what it is.

Like, you can't share data from Facebook to Twitter very easily, because Facebook wants that data. Same with Google telling Amazon earlier this year, hey, you can't run YouTube on your Fire TV devices. Because why? Well, consumers, you should just buy Google devices. Or Amazon firing back and saying, well, we're not gonna sell Nest thermostats on amazon.com then.

This seems petty to consumers, but it's what it is, we're eliminating free trade of data.

And then lastly is destruction of competition. So this one is pretty blatantly obvious and regulators can't get around this.

But in terms of data monopoly, the example I bring up is a study run last year by 360i where they tested Google Home versus Amazon Alexa, with 3,000-plus questions to see how they performed. And what they found is that Google Home dominated by six times.

And now you might be thinking in your mind, well yeah, duh, it's Google versus Amazon.

But Amazon owned the marketplace for two years longer and they had 70% saturation.

So they dominated in the smart speaker category, but they got dominated by a company who had just created one a year or so ago.

Now the question they asked is, if that's happening with the giants, how is any small business supposed to keep up? That's where we're lacking innovation.

In the United States, since 1996 when the internet started to blossom and take off, we've seen the number of IPOs, of publicly traded corporations, drop almost 50%. Because nowadays you can literally copy and paste code. You can build the business, but to compete is a whole different thing.

So what we see in Silicon Valley is that 53% of these companies now believe their goal is to get acquired, and only 16% think they stand a chance at an IPO. So that's the point I bring up.

We are at the point where we have data monopolies. These companies aren't the only ones, but we need to start recognising this if we wanna spur innovation, continue to do good, and create products that are going to make a better future. But the problem we're running into with regulators is that they wanna break this up in a traditional sense.

But could you imagine, for example, if all of Google's ecosystem was broken up into individual products?

I think it would be a nightmare.

I think most of us would probably agree with that, right? There's a convenience that has been created that we need to figure out how to keep around, while allowing innovation to still occur.

And that's what we'll take next steps for.

So how do we make news trustworthy?

Again, this is very high level, but one example I bring up is, who saw Zuckerberg's Congress hearings earlier this year? Yeah. One thing he stated was that they're gonna hire more people to help with flagged posts, and he said they're gonna hire 20,000 more people. How many of you think that's satisfactory, 20,000 people? Or how many of you have the context to make that determination? Yeah, that's the bigger issue, right? So what I bring up as an example is that these people are like the emergency workers of the world, if you can think of it like that, right?

In New York City, there are 15,000 emergency responders for nine million people.

Now what Zuckerberg says is, we're gonna hire 20,000 to respond to 2.1 billion people. That's roughly one responder for every 600 people in New York, versus one moderator for every 100,000-plus users on Facebook.

And so their inboxes look like this and there's just no way they can keep up. I think one of the biggest things we need moving forward is to talk to these companies and figure out some regulation that states, hey, it's awesome what you've done, you've changed the world and you deserve to make billions of dollars, but you need more humans in the system. You need more people to help out.

The algorithms are great at being a big fishnet, and then we need humans to do the details.

Even if you look to DARPA, which is the United States military's research arm that pushes innovation and technology, the people who spurred the internet: they recently announced, in April, a programme called CHESS, which is Computers and Humans Exploring Software Security. And basically they said, hey look, our algorithms just aren't ready to do this. So I believe there needs to be regulation to push these companies into this, because there's no fiscal reason to hire more people. And if we wait for consumer demand, it could be very troubling.

Because look at some of the things that are happening around the globe right now with elections.

Some of the most developed nations in the world are kinda imploding on each other and what do they have in common? Not the same language, not the same currency, culture, just about anything, except for the internet and the way that it distributes information and shapes our reality.

So I think that's one little step that could do big things for us.

It would also help us bridge the gap by hiring more people and creating jobs.

Next is, demand transparency.

So as a consultant for Google, I saw about 20 projects a year on average, between me and the juniors I managed, a lot of their big-money projects.

And there are lots of things that I can't talk about, but I scoped the industry for a lot of different things, and what I did find is something we can talk about publicly, which is here.

This right here is the feedback rating system that Facebook gives you when you hang up a call on Messenger.

Do any of you see the flaws here? Can you raise a hand? Yeah.

(audience member talking quietly) There you go.

Like, very few people say that.

Obviously I'm in a room with data scientists. A lot of people are like, oh all the stars look the same. No, no.

That's absolutely right. (laughs) It goes poor, fair, very good... or rather, poor, fair, good, very good, and excellent. So nice job, right? For example, if Zuckerberg had to go to Congress and they said, well, do people like Facebook Messenger, he could legally say, well, 80% or more believe it's fair or better. And that's right, statistically.
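To see why that statistic is baked into the scale itself, here's a tiny simulation; the point is simply that four of the five options sit at "fair" or above, so even completely random answers come out "fair or better" about 80% of the time:

    import random

    # The Messenger-style scale: only one option sits below "Fair".
    scale = ["Poor", "Fair", "Good", "Very good", "Excellent"]

    # Respondents answering completely at random...
    responses = [random.choice(scale) for _ in range(100_000)]

    # ...still come out "Fair or better" roughly 80% of the time, by construction.
    fair_or_better = sum(r != "Poor" for r in responses) / len(responses)
    print(f"{fair_or_better:.0%}")  # ~80%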

Now the thing I bring up is, if this is in a publicly facing survey in a product, we need to begin to ask ourselves what questions were asked internally that allowed this to get launched publicly. That is why I believe we need auditors for these systems, for algorithms and for manipulative design patterns. It's part of what I'm hoping to do with Design Good Tech, to start to lay the foundations for those and for more regulations.

But I do think it's important that we start to ask more questions of these companies because, there are things happening.

Next, define privacy.

So privacy is a very fluid topic and most people don't know how to address it, because it's the internet and it's invisible, and like, how do we talk about it? But I thought about this for a long time.

This took me like a lotta months, just like kinda simmering on a thought, and I went back to really my roots.

I'm from Nebraska in the United States originally, which is, if you don't know, right in the middle of the country, a buncha hillbillies. And my parents were blue-collar workers who worked at the post office.

You can think of privacy this way.

In the United States and in many nations across the world, it is illegal to open someone's mail.

In the United States it's a federal offence that can land you up to five years in prison, for opening someone's mail.

You can send a video through the mail.

You can send a photo through the mail.

You can send text, you can send whatever you send through the internet through the mail.

Now what's happening on the internet is that we have companies that are collecting data, buying and selling it, and telling us if we want access to our data that it's their property.

It's their property.

Could you imagine if the post office held your mail for ransom and said, we're gonna sell this to whoever we want? We're gonna make copies, we're gonna do whatever we want.

Oh you want it? It's our data, sorry.

That'd be crazy, right? So I think this is an issue we need to address. And I think something that we could really talk about in terms of moving the needle forward, giving companies financial incentive as opposed to cracking down with fines all the time, is what if we gave them tax breaks for improving their privacy and security standards? Say you wanna allocate 20,000 hours of engineering to improving your engineering practices.

That's better for society.

Why not give them a tax break? Say you wanna bring workshops to make it a priority in your company culture, or you wanna hire white hat hackers to proactively hack at this stuff and make sure that it's safe.

Why not get tax breaks? Because if we continue on a retroactive plan, where we're punishing companies after they do something bad, we're not really getting anywhere.

For example, Google had a $2.7 billion fine in the EU in 2017.

I got ahold of the family who started that lawsuit. They gave me a 64-page document of every single thing that happened in that court case, and it lasted over 10 years.

That means over that time Google made almost a trillion dollars, and said, oh, that was a $2.7 billion speed bump.

That's why I think we need to give, or at least consider, some financial incentives to help move the industry forward faster.

I think it's a matter of public safety.

Next, update consumer protection standards. Right now we don't have a lot, but for example, this right here is a wireframe of my Facebook timeline. Now, how many of you can spot the sponsored posts like that? And how many of you noticed the timestamps on this? When we're in Facebook, we don't really notice these things, but when it's blown up like this, it kinda induces a little anxiety.

I believe some of these patterns, we need to have protections against.

Consumers have hit a point where it's a new literacy, it's a new world.

We understand this stuff because we're in the space, but there are millions of people around the world who have no idea what exactly is being done to them. And it's not much different from what happened in the 1450s when the Gutenberg press came out and people could then begin to learn how to read. It's new tech, it's new literacy.

And I think we need to protect consumers from the manipulative and coercive things that companies may do.

Another example I bring up is this, a visual evolution of Google ads over time. Can you tell me where they begin and where they end? And it should be a little difficult, because we are at a point where it is difficult to tell. The only thing separating a result from an ad is a little tiny box that is transparent, has a line around it, and says "ad".

At that point, it's paid influence over our minds. I think those are, again, things that we need to protect against, simple consumer protection standards.

Doesn't stop a business from operating, but makes it so that they have to have some kind of ethical standards.

And for those who make the transition willingly, there's no problem with it.

But it protects them from businesses who decide, we're not going to, and we're gonna continue manipulating people because it's better for our business. So how do we incentivise that? Create regulation around dark patterns.

Force companies to fund research that is against their best interest.

It's easy for them to fund research.

That doesn't mean it's going into the right hands. We need to force them to fund research that's digging into the problems with big tech and the damage that's being done.

Force them to supply PSAs to communities the same way big tobacco had to supply PSAs to the communities.

You know what's happened since that really started happening? In the United States, tobacco usage has dropped from the mid-40-percent range to the low teens over the decades since.

These are small things that could greatly change the industry and the world that we're living in, without doing catastrophic damage to our economy or having to break up the conveniences that we have created with the internet.

Next, fight for representation.

So this is a story from the States, so I'm assuming most of you probably haven't heard this one, but in the United States we have Yellowstone Park. Yellowstone is a very famous national park. And a little story that gets passed around the States is that in the late 1800s there was a group of scientists, very well-meaning people, who went in and studied the ecosystem.

They found what kinds of animals were in there, what kind of plant life was in there, trying to figure out, what's the best way to keep this going long-term? How do we sustain this? And their best estimation was that they should get rid of the grey wolves. They thought the wolves were doing damage, they were killing off animals.

And it would probably be better long-term to get rid of as many as possible.

And what happened is that by the early 1900s, between them and the Natives around the area, they killed off all the grey wolves, complete extinction. Now what happened, despite what they had predicted, was that by killing off the grey wolves they allowed different animals to move into areas of the park they'd never had access to before. Deer and elk were grazing different plants in different areas, which cut out different plant life. The beavers were moving to different places and changing dams, which completely altered the flow of rivers. What they did is they ended up almost destroying the park. And then in the early 1990s, scientists said, what happens if we just, like, reintroduce some grey wolves and see? And what has happened over the past 20, 30 years is that they've seen the park regenerate.

Streams have changed, animals have herded into different areas, the park has regrown. Now this is an example of what happens when you take out what we call a foundational element of an ecosystem.

The wolf played a huge part in this ecosystem that we never could have predicted before, because it's a massive ecosystem.

And I bring this up because what we're doing in tech is automating humanity, right, the cover of my book. We are automating every part of life that we can. And while it's very interesting and it's exciting, we also have to understand that what we're trying to replicate in logic is the greatest, most complex puzzle in history, which is life itself.

Despite our best intentions, we have to realise that we may end up accidentally doing damage we didn't mean to do. And that's why we need to fight for representation in not only the workforce, but in the datasets that are training these systems and the models that are replicating different areas of society.

Because if we don't, we may inadvertently start to eliminate some cultures and different walks of life, just because we accidentally did something we thought was actually gonna be really good for people. So in the United States we fought the American Revolution under "no taxation without representation", and I think the modern fight, in this day and age, should be against implementation without representation, and we should push hard for that.

Next is reintroduce stability.

We were talking about this in the back earlier but, these companies have created brilliant business models. It is, from a business perspective, it is absolutely brilliant.

Almost the perfect efficiency of capitalism. And they've created it in a way that doesn't require physical proximity.

They just need to have a WiFi connection.

It can be almost anywhere in the world.

And so you have companies that are moving to, like, for example, wherever the best tax breaks are. In the United States you have Amazon having bidding wars over where their next headquarters should be, getting huge tax breaks, corporate incentives, all these things, and then playing their workers. And what they've found is that Amazon actually has a net negative impact on local economies. On top of that you have the gig economy, which is turning everyone into a contractor. And what this compiles down to is that we're at a point in time where there are a lotta jobs, but they're not very stable. And how do we bring that back, give some stability to the population? One of the biggest things I talk about, although again there are many things in the book I won't cover here, is taxation of automation.

And I know, we're all in this room, like no, let's not talk about that.

But I think we need to a little bit, right? Bill Gates has brought up how we could tax robot workers the same way we tax human workers.

What Bill Gates didn't bring up, probably because he's building the robot workers, is that even if you tax robot workers at the same rate, they can do more than four times the amount of work per year, so you're gonna lose at least three-quarters of the income tax revenue.
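As a back-of-the-envelope illustration of that gap (the wage, tax rate, and replacement ratio here are made-up numbers, purely for the arithmetic):

    # Hypothetical numbers, purely to illustrate the revenue shortfall.
    human_annual_wage = 40_000
    income_tax_rate = 0.25
    humans_replaced_per_robot = 4

    # Before: four humans doing the work, each paying income tax.
    revenue_before = humans_replaced_per_robot * human_annual_wage * income_tax_rate

    # After: one robot taxed "the same way" as a single human worker.
    revenue_after = 1 * human_annual_wage * income_tax_rate

    print(revenue_before, revenue_after)        # 40000.0 vs 10000.0
    print(1 - revenue_after / revenue_before)   # 0.75 -> roughly three-quarters lost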

Then you have to consider, what about software solutions? These are infinitely scalable, they work 24/7, anywhere in the world.

They don't have a physical entity which you can tax. So something I bring up is the taxation of data collection and processing.

And I just heard a (sighs) in the crowd.

I get it.

Now I don't think this should go for everyone. I do address this point.

I think we need a sandbox area, where companies can play around with data and not get taxed.

But once you reach a certain amount of data collection and processing, or you're impacting a certain number of lives, then, contrary to the GDPR, which bases its derogations on your number of employees, which is totally irrelevant in today's day and age, base it on your data collection and processing levels. What this does is it incentivises anti-monopoly behaviour. It says, hey monopolies, you wanna go and acquire all the companies in the world? Go ahead, but you're gonna pay us a lot of money in taxes.

It financially incentivises ethical levels of data collection. It says, collect all you want, hold it as long as you want, just pay us some taxes, put it back into the economy. It also forces companies to figure out a legitimate revenue model instead of existing purely on investor capital and going into the negative perpetually.

And the last point here is that it also creates a safety net that can help us create new jobs, fund education, and retrain workers who are going to lose jobs. So you force the companies at the top of this to fund at least part of the cleanup of the mess they're making. The way I propose this is that it could be an amount relative to the whole data processing of the ecosystem, and you could run it on an exponential scale rather than a linear one, similar to the trajectory of these businesses. Create brackets, similar to how we have different tax brackets, but run it more like the way you pay your water metre.
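Here's a minimal sketch of what a metered, bracketed version of that could look like; the thresholds, rates, and the idea of counting "records processed" are all assumptions for illustration, not a worked-out policy:

    # Hypothetical brackets: (upper bound of records processed per year, rate per record).
    # Everything under the first bound is the untaxed "sandbox" for startups.
    BRACKETS = [
        (1_000_000, 0.0),          # sandbox: no tax
        (100_000_000, 0.0001),
        (10_000_000_000, 0.001),
        (float("inf"), 0.01),
    ]

    def data_tax(records_processed):
        """Metered, bracketed tax on data collection and processing volume."""
        tax, lower = 0.0, 0
        for upper, rate in BRACKETS:
            if records_processed <= lower:
                break
            taxable = min(records_processed, upper) - lower
            tax += taxable * rate
            lower = upper
        return tax

    print(data_tax(500_000))        # startup inside the sandbox: 0.0
    print(data_tax(5_000_000_000))  # large platform pays on the upper brackets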

And like I said, after you hit a certain point of mass collection, that's when it kicks in. So we allow innovation, we allow a sandbox area for startups and new companies, but we have the ones who are really impacting the world and sucking a lot of opportunity out of economies help fund the transition and help people bridge this gap that's coming. So the last thing I talk about here, and we're getting very close to the end, is a challenge I give everyone at my talks.

We're all in this space, you're all smart people. You're here, you signed up, you've sat through the whole day.

Let's create a good future, because we are actively creating it everyday, whether we think about it or not.

So me for example, I started designgood.tech. This is a place where consumers can report some of the things they notice.

If they see manipulative design patterns, if they see bad patterns on the internet that they don't like, they can come report them, which I can then use in my articles to help inform people of what people don't like. I can put that research to educational use, help us move forward, and supply you with the research to bring to your boss and make those changes an easier sell.

I also have resources for non-technical people, technical people, and policy makers.

These are everything from like tools and apps on how to protect your data and how to keep safe on the internet, to just recent news and keeping up to date with things. Now this is a big step and we can't all do this, we don't all want to do this, and that's totally fine.

But day to day, you're building a product that you can help make changes in, and there are little simple things, like even just talking to your neighbour if you know someone who's confused about this stuff. If you know someone who has questions and you have the technical knowledge, why not help explain it to them? That's the end of my presentation.

This is my book.

If you are supportive of what I'm doing I would really love your support in helping fund this mission.

And that's it, thank you and keep in touch. (audience applauds).

(upbeat music)