- We're gonna close the conference now with someone who's become something of our philosopher in residence. So he came last year.
Maciej Cegłowski's presentation on The Website Obesity Crisis has become a phenomenon.
Our video alone has been watched hundreds of thousands of times.
People have watched that video from last year hundreds of thousands of times, and rightly so.
He's spoken around the world, but the very first time was here last year. Literally, I think, days, if not hours, after the conference last year, I basically said, "Maciej, can we get you back next year?" "How 'bout you be our kind of philosopher in residence?" And so I'm looking forward to something absolutely brand new.
If you don't know Maciej's style, well, it's a mixture of cynicism, optimism, humour, and all dressed up with incredible intelligence and insightful observations.
This is the third time he's keynoted the conference, and I hope it is the third of many, many more.
I'm not gonna say anything more about him.
I just want to listen to what he's got to say. So, would you please welcome Maciej Cegłowski? (audience applauds) - Thank you.
Thank you so much for a terrifying introduction. I think if there have been a hundred thousand views, 95 or 98 thousand of those will have been my mom, who likes to goose the stats for me. So, I was thrilled when John invited me back last year. And I told him how excited I was to give my talk on Accountability in Automated Systems.
And then John explained to me that in order for the logistics to work and for me to actually be able to fly over to Australia and stay in a hotel, there needed to be an audience in attendance. So it was preferable to have the same talk dressed up nicely as "Who Will Command the Robot Armies?" Well, we can't talk about robot armies without unfortunately starting with robot armies. We have them.
We're building them.
And by we, I mean the United States, but strangely Australia tags right along.
I've never really understood this, but there are plans now to get these unmanned flying vehicles into the Australian Air Force.
There's excited Australian pilots who are already test-flying them in Afghanistan or wherever. So this is kind of a grim part of our world. The photo shows kind of the chocolate and the peanut butter of modern unmanned warfare.
Someone figured out that instead of just looking at people through these drones, you could put missiles on them and fire them away. So we've had several generations of these kind of robot warriors now.
They're getting smaller.
There's these ones that soldiers can carry with them into the field and launch them like a paper aeroplane. And they perform these kamikaze missions steered by people. So whether your target's a group of enemy soldiers, or an orphanage, or a car, whatever it is, it hits it and blows up and destroys it.
And then there's like land versions of these too, these little tanks with a gun on top.
They're like the cheap toys you used to get as a kid with a wire that can only go a certain distance. But it protects you.
It can also be used to drag wounded people out of the field.
And the Russians like to leapfrog us, so they've developed this little mean tank that roves around the base and finds intruders and hopefully speaks to them in a dark Russian accent before firing its many weapons.
(audience laughter) And then there's these terrifying kind of bits of nightmare fuel that are not meant to kill anybody. They're just there to kind of carry your stuff. So you'll be seeing these at airports soon. But right now they're in the military.
They're kind of these headless mule-horse-type things. And, God, I'm sorry, it's mesmerising.
It's just so weird.
And if you have a chance, look at the YouTube video of these things, because they move in a very creepy, uncanny-valley sort of way.
So the question on everybody's mind is, with these weapons, is there gonna come a point when there's not gonna be a person in the loop, and they're gonna be authorised to actually kill people? And I spent so much time working out a subtle argument before the election to convince you that this would happen. And now I have like 10 minutes of new material I had to put in, because all of that subtlety was easy to get rid of. So here's a picture of (audience laughter) the space shuttle, the American space shuttle. And this is a fascinating spacecraft for so many reasons. But one of my favourites is that it has a button in it which extends the landing gear.
And the landing gear extends literally seconds before it hits the runway. And the astronauts made sure that NASA did not automate the landing gear. NASA could not launch the space shuttle and have it land without astronauts inside, solely because there was no one to press that button. And that was an intentional design element. And of course the Russians, when they built their suspiciously identical space shuttle, had none of it.
They made it fully automatic, and in fact, it flew without a crew and landed safely and everything. So, what I think is going to happen is with weapons in the battlefield, we're going to see the same thing.
There's going to be a point of control that is kind of pro forma, and it's there more for moral and legal reasons. And then at some point eventually and probably very soon, given what's happened, someone's just gonna decide to remove that point of control and be done with it. This is a notice from Barack Obama.
I think it was published in August.
And it basically says that the present state of emergency, which was in its 15th year, is now going to be extended for its 16th year because of the real and immediate threats to the United States homeland and so forth and so on. So we live in this kind of situation of permanent emergency. So this part, this is the darkest part of the talk, and then it will kind of come back up.
So, instead of showing you pictures of war, these are pictures of Yemeni kids.
I visited Yemen three years ago before their war started and met some really lovely young people there. So I'm gonna show them instead of terrible things, but I'm gonna talk about terrible things.
There's a dynamic that's very harmful, where these technologies that protect Western soldiers also make it much more difficult for the people in the places where the wars are being fought to successfully attack our armies.
So instead, they decide to attack civilian targets, and it kind of creates this terrible dynamic where wars become more brutal.
Every once in a while, they succeed in attacking a civilian target in the West. And that raises our level of terror, perpetuates the cycle. We develop new technologies to protect our soldiers further. And it goes on and on.
And unfortunately, even though there's nobody, I think, that actually thinks this consciously, there is a lot of benefit to armies and navies to actually be in the field all the time.
They get to try tactics.
They get to try their personnel.
Institutionally, it's good to have permanent warfare, however horrible it is on a human level.
So, we have this motor.
And unfortunately it seems to be accelerating. And I just want to highlight how weird it is that it even exists.
Like, imagine if Indonesia was flying unmanned aircraft over Queensland, listening to hear if anybody was saying bad things about Muslims.
So, first off, I think like half of northern Queensland would be in flames, and then you all would be in ships heading to Jakarta to conquer it.
And the fact that we have this global policeman role that both America and Australia play together is odd. But where it really touches our lives is that these things kind of, oops.
So I talked about one group of robot masters being the army generals.
I'm gonna talk about the police.
(audience laughter) These technologies come back.
They come back in weird ways to our civilian world. So the most obvious one is that all this military stuff gets donated to police departments.
And so you see random tanks and things where you least expect them.
This is a photo from the Boston bombings, where some jackass blew up an improvised bomb. And then the police appeared, having stuffed themselves into body armour.
They had tanks and everything.
We all wondered where it came from.
One of my favourite facts about this Boston response was that the police requested that the doughnut shops remain open so they could serve as a point of logistics while they shut down the rest of the city.
And you don't just see the equipment.
You also see the tactics.
So a lot of police are ex-military, and their attitude towards their own fellow citizens changes based on their experience abroad.
So that's very troubling.
I know John said I was funny.
It's gonna get funny, but it's gonna take me a little while to get there. There's other technologies too that come back that are less visible, but maybe more important. So we have these gigapixel cameras now, for example, that can hover over a city.
And you know how in TV shows you have an enhance button that can zoom in on anywhere? That actually exists.
There's something that kind of hovers around and has a super camera, can track multiple people at a time. Police departments love it, because people like to play with toys.
You know there's techies in police departments as well. These unmanned flying vehicles are being used by the Border Patrol.
You guys are also using them, because you have vast amounts of territory to police. You also have, like, automated ships, which I can't find a picture of, but it seems kind of fascinating to me.
And then there's devices like the StingRay, which was also developed for military use.
It's a fake cell phone tower, so if you want to eavesdrop on people using cell phones, you set this up, and it all kind of routes through there. And police departments are using this to great effect. And they're being very secretive about it, because, again, it's an extremely powerful tool. So a lot of people criticise the NSA and the CIA for being jackbooted thugs and kind of this terrifying force, which I, to an extent, agree with.
But I also feel that national intelligence agencies that we're afraid of, they have legal norms and things within them. They're accustomed to following rules, even if those rules are secret, even if we don't know them. And there's sort of a system there.
What I'm afraid of is like, you know, Sheriff Roscoe P. Buford of East Dillville, Arizona having access to all these technologies and all of these very powerful tools and using them with impunity and kind of thoughtlessly. And this is happening more and more across our country and unfortunately in yours. In San Diego, for example, there's a fad among police for photographing everybody they come into contact with, so they can put 'em into a facial recognition database. And they're also swabbing teenagers for DNA so they can have their DNA on file for no reason except, because, you know, it might come in useful.
So there's fewer and fewer checks on this sort of behaviour. And, yeah, I get into these reveries where I remember the election.
And I'm trying to keep it together, so all right. Oops.
So #NotAllRobots, okay.
Not all robots are trying to kill us or eavesdrop on us. Not all robots are part of the surveillance state. There's a whole set of these things that are just trying to help.
So, for example, one of my favourites is the Juicero. It is a $700 juicer.
It takes seven-dollar packets of pre-chopped goodness that make one glass of juice. Each packet has a QR code on the back, and the juicer has to be Internet-connected, of course, so if the packet is not fresh, it will not juice.
Get it at your local store today.
This is the Flatev.
It's the Keurig of tortilla makers.
So you get a dollar cup of dough, and you get a little tortilla on the shelf down there. The Vessel is one of my favourite Kickstarters. This is still phantom-ware unfortunately.
It is a container that tells you what's inside it. If you've ever dreamed of knowing what you're drinking, the Vessel is coming up soon.
Here it's telling you you're having a beer. But unfortunately, they haven't gotten to this point yet. They're still at the point where they can only detect water. But they can measure how much of it you drink. So the Vessel is on its way.
Kuvee.
Kuvee has the $200 smart wine bottle.
It has a little touchscreen so you can order more of it. Remember, if you forget to charge your smart wine bottle, unfortunately you're not able to pour it then. It has to be turned on in order to pour.
(audience laughter) The Wilson smart football.
It can count how many times it's rotating.
It gives you stats on how good you are and performance over time.
This is one of the best ones.
What is this thing called? The Molecule.
The man- or woman-portable air purifier.
So you could just see this.
You're barefoot, walking from room to room with your 40-kilogram air purifier, making sure that not a single molecule of unpurified air hits your delicate nostrils.
(audience laughter) And then the WiFi-enabled kettle.
I don't know if any of you followed this wonderful saga on Twitter a few weeks ago.
There's a data scientist in the UK who tried to set this up with his crazy home network of other stuff, and he tweeted the experience.
It took him 11 hours.
So a lot of the time was consumed trying to find the IP address of this thing. So, initially he says "three hours later and still no tea. "Mandatory recalibration caused WiFi base station reset. "Now port-scanning network to find where kettle is now." (audience laughter) And then there was a weird postmodern thing where the amount of attention he was getting on Twitter was actually interfering with his attempts to set up the kettle.
(audience laughter) So, "Now the Hadoop cluster in the garage," like we all have, "is going nuts due to retweets from @internetofshit "saturating the network "and blocking MQTT integration with Amazon Echo." And then finally, the epiphany arrived.
"Well the kettle is back online "and responding to voice control, "but now we're eating dinner in the dark "while lights download a firmware update." (audience laughter) These machines are trying, but it's not easy. Here is Peggy, the Internet-enabled clothes pin, which has a moisture detector to tell if your clothes are dry or not.
I don't know how many of these you're supposed to buy, or if you just use the same one over and over and take a week to dry your clothes.
(audience laughter) I forget what this beautiful device is called, but it's a smart mirror and a smart scale.
So, what better way to start your day than just jump on the scale, look in the mirror? It will tell you all your skin flaws, how much weight you've put on overnight, and log it permanently forever.
(audience laughter) This is a little device that dispenses dental floss. My favourite thing about it is that it flashes to remind you that you haven't flossed.
They solved the problem of multiple users by having a switch on the back.
So when you're done with it, simply unmount it from the mirror, flip the switch to the other position, put it back, and back away slowly so it doesn't detect you and think it's the other person.
(audience laughter) I forget what this thing is called, but it clips on your belt and reminds you to breathe, if you're one of the people who-- (audience laughter) It will send you a notification.
That would be the last thing the paramedics find on your body, is like a flashing red notification. You forgot to breathe.
If you're sick, like me, of waiting more than 10 minutes for cookies, you can buy the custom WiFi-enabled Internet smart cookie maker that takes 10 minutes instead of 12 to bake you four cookies.
(audience laughter) And then this is the smart tampon.
The tampon itself isn't, but it's connected to this device that clips to your belt and then it will notify you when it's time to change it.
Nothing feels safer than having something clipped directly to the outside of your clothes that goes inside you.
(audience laughter) And then Huggies Tweet Pee, which is exactly what you're most terrified that it is. It is this little sensor that goes on the diaper, and it will tweet at you when it detects moisture. They tried to make one that detects when the diaper is full of shit.
Unfortunately, it turned out to be impossible to distinguish from regular Twitter content.
(audience laughter) Kisha is the umbrella that tells you when it's raining. (audience laughter) It's good to know, because you don't know.
You're under the umbrella.
And there's not a window.
The other thing that's nice is it's very needy. So if you leave it in the restaurant, it will start to text you.
Like, "Call me, hey." (audience laughter) I don't know how many more of these I can take. (audience laughter) Yeah, so this is the world of like these helpful robots. They're all out there.
But they all want to talk to you through the phone. And they don't really coordinate with each other well. And they're like these little needy chicks that are just constantly sometimes, even literally, tweeting and wanting your attention.
So, the question becomes, who is going to manage all of this? And who's going to get it under control and kind of into a coherent system? And the answer of course is hackers.
Hackers are going to log into all this stuff and make it work together, except they're gonna make it do terrible things. So your smart tampon is going to post racist, anti-Semitic slurs on Twitter (audience laughter) while your juice maker mines Ethereum or whatever else it's doing.
(audience laughter) One thing I really love about any image of hackers is that I've spent many years now in programming environments.
I have yet to have a green binary number projection onto my body, or even onto a flat surface or a monitor. It's just not a useful display most of the time. And you notice the ergonomics of this hacker are terrible. (audience laughter) The hoodie's at the wrong angle.
He's looking at his hands.
A big faux pas among the hacking set.
This guy's even worse, 'cause he doesn't even have a standing desk. He's just kind of standing in this projection of ones and zeros and trying to log into something. This map could mean so many things right now. (audience laughter) But two weeks ago, it meant just that there was a denial-of-service attack against DNS by, I think it ended up being, Internet-enabled webcams or something. Something with hard-coded passwords that are impossible to change.
And it kind of gives you an idea of how well-designed the Internet of things is.
There's another paper out just quite recently. This is a still from a video.
It turns out there's a smart light.
I think it might be the Samsung one, which surprisingly doesn't just catch fire. It lights things up.
But it speaks on a WiFi protocol, so if you fly a drone near a building within, I think, 50 or 80 metres you can actually hack all of these.
You can cause them to permanently brick themselves. You can cause them to permanently turn on a signal that will jam WiFi across the building.
In the video demo, they have them flash S-O-S over and over again. So this is the kind of care and attention that we've put into making the Internet of things. So the solution to this stuff is kind of obvious and kind of scary at the same time.
You need, like, a butler, some responsible individual robot that will monitor and shepherd all these devices, protect them from the big bad Internet.
And who better to do that than Google? They've come up with this thing called Google Home. It looks like George Orwell designed an air freshener. (audience laughter) It kind of sits there looming.
It has speakers in the base.
It responds to voice commands.
And it's a router and a speaker, and it talks to all your things.
Part of an interesting ecosystem, right? Like, Google at this point has just started making a phone.
So they control your hardware.
They run the search engine that you use.
They probably run the DNS that you use.
They have your browser.
They have all of your web-surfing data, because they run analytics across the Internet. Probably stuff that I'm forgetting.
So they're really running a big portion of the infrastructure of the Internet. And the same day that Google Home was released, there was a news item about Yahoo, everybody's favourite clown: it was revealed that Yahoo had been monitoring email for the U.S. government.
The U.S. government had requested that they check all email going through Yahoo, so all 50 or 80 messages a day (audience laughter) and look for specific keywords and then give them that information.
And the scariest thing from my perspective, because I know that the Yahoo security team is actually one of the best in the world, and certainly at other companies they have the best people in the world, is that they didn't know about it.
They thought they'd been hacked.
Weeks or months later, they found out that there was this secret programme running. So, when I see Google like trying to sell this device now, I don't think it's gonna be a very popular Christmas gift. We've kind of been spoiled.
And I've been a real fierce critic of surveillance and all that goes around it, by the government, by private companies.
But I have to say that Barack Obama was fairly circumspect in what he did.
And unfortunately that's kind of a tragedy, because he kept accruing greater and greater power, partly because we gave it to him, partly because the technology improved so much on his watch. And he didn't use it very much.
He used it sparingly.
And now we're in a position where this immensely powerful tool is being handed over to a piece of human garbage, a horrible, horrible individual that I can't believe we picked.
So we kind of heard this abrupt screeching sound as the entire Internet of things and this whole idea of AI voice-enabled assistants stopped and realised what it was getting us into.
Except for Amazon, which of course is going to continue on. This is Amazon, what is it? Is it Echo? Yeah, Amazon Echo.
Amazon is kind of like more trustworthy than Google a little bit, 'cause they just aggressively try to sell you things over and over again.
And so they have this Amazon Echo, which kind of blinks in a friendly way.
And then it has these little hockey pucks that you're supposed to distribute around your house to work with it.
And Amazon is another good candidate to command the robot armies, 'cause they have a lot of robots.
They have a tonne of servers.
Sorry, I'm gonna have a little drink.
This is unfortunately just water.
So they run the Cloud.
They run the Cloud, which is full of beautiful blinking lights. And they're also into, like, actual robot stuff. But you kind of run into the same problem, oh sorry.
And then they have this weird and kind of charming vision of, oh fuck, no pun intended, (audience laughter) of what the world is supposed to be like.
So you have these buttons that are connected by WiFi to actually order stuff.
And I'm trying to think, when were you ever gonna be in the situation where this is helpful? (audience laughter) Like, if you have a week off, I don't know, you're very, very, ill.
You don't mind waiting.
You press the button and then what happens? I don't really understand.
But unfortunately, just today we saw that Amazon is having the same problem as Google. Jeff Bezos tweeted, didn't even wait for the new year to start, "Congratulations to @realdonaldtrump.
"I for one give him my most open mind "and wish him great success in his service to the country." So the guy who runs the Washington Post and the entire Cloud that everything runs on, and Amazon, and wants to run this other system of invasive home surveillance is also being super friendly right now to Trump. It's grim.
But I want to talk about other Amazon robots that interest me a lot.
Has anybody here ever read the play R.U.R.
that the word robot comes from? Yeah, a couple.
So I just read this as part of research for the talk, and I was surprised.
I knew that the word robot had come from this 1920s Czech play.
I didn't realise that in the play the robots are not mechanical.
The robots are people, or they're made of people parts. But the parts are kind of made in a factory and assembled. So they're I guess what we'd call androids. They're completely biological, but they don't feel pain. And they can work very hard.
They're not afraid of death.
So, in other words, they're kind of the ideal Amazon workforce. (audience laughter) Amazon has lots and lots of these people, as do many companies that perform labour that has not yet been automated.
My favourite thing about Amazon and their workforce is that they're run through something called Integrity. In an American business context, whenever an entity is called something like Integrity, you know it's probably the most evil thing in the corporate world.
So, Amazon Integrity hires and fires seasonal workers who try very hard to get a more permanent job at Amazon. And they're part of this, it's kind of a less popularised facet of the gig economy, people working irregular schedules for jobs that give them no job security and no promise of a future job, doing hard work that we don't really get to see as the consumers.
So a lot of startups are basically just secretly repackaging hard labour by low-paid workers.
Blue Apron is a good example.
There was just a scandal at their Richmond plant in the San Francisco Bay Area, because people were fighting, and the police were getting regularly called. People working very long hours.
But the whole idea is that if you have a short attention span but like to cook, Blue Apron will deliver this box of kind of pre-prepped recipe food that you can make into a meal.
But what you're really buying is somebody's labour at low wages, subsidised by venture capital. So this happens over and over again.
And I find it really interesting, because we as consumers in our social class are really into artisanal stuff.
We want to hear the story of how the bread was baked, but we don't really want to hear the story of how the shipping centre works.
We don't want the little biographical card of who exactly packed our Amazon box.
And I find this disturbing.
It's weird enough when you know that all your stuff is manufactured overseas by people who are working at low wages.
But when it's in your own country, and it's kind of hidden from you, and we pretend that it's not there, it's an odd feeling.
And then kind of the ultimate expression of this is Amazon's Mechanical Turk.
The Mechanical Turk is this thing in the photo, which was kind of a scam, where it was supposed to be a machine that played chess. And then you see there's a little seat in there, and someone squeezed in and did all the playing; there was a system of mirrors so they could see and so forth.
But it's an amazing thing to name your service. Like, first you've got the ethnic card covered right away and the colonialist angle.
And then the whole idea that a human being is squeezing themselves into a machine shape so they can act like a machine.
Mechanical Turk is a service that lets you farm out labour to people and pay them for piecework and get it back. And it pays very, very little.
One interesting thing about it is that a lot of social science research is now done through it. One social scientist who had used Mechanical Turk for surveys actually went over to the other side and worked as a Turk worker for a few days and discovered that the surveys he'd been sending out were all worthless, 'cause people were doing 80 or 90 of them a day. Some of them involved priming tasks, where you're supposed to read something that puts you in a frame of mind so that you make different decisions.
But he found the same priming task was repeated 15 times in a row, so he was way over-primed.
So there's a strange feedback effect where we package this labour, and then it infects the quality of our research.
A lot of the research is about low-income people and low-income labour.
Kind of a bizarre world that we've built by pretending that people are robots.
I keep skipping two slides.
I have to discipline my clicking here.
My favourite commentary on all this is by Simone Rebaudengo, an Italian designer, kind of a philosopher of the Internet of things. This is called Ethical Turks.
It's an ethical fan.
And it actually exploits labour for the reason that it's human, to do things that only human beings can do. So, whenever the ethical fan has a dilemma as to who it should cool, it submits it to Mechanical Turk and then it gets an answer. It has a little camera in it.
So say that two people are sitting in front of it. It will look at both of them, snap the photo, upload it, and then it has these dials on the side.
One of them is the religion dial, so you can choose whether an atheist or a Hindu or a Christian or a Muslim is answering your question. And then there's an education dial as well. So you can pick the education level of the person. And they will look at the photo and make the ethical choice and tell the fan who to cool.
And it will display on the screen the rationale. And the person gets paid a couple of pennies for this task. But I really like the idea of taking this exploitation of people and really pushing it to the edge. Finally, I want to talk about the robot within. I know that you are more engaged in this talk than you've ever been in your life.
And I feel alive as well.
We're in the moment.
We're fully human.
We're both on this train of thought speeding away to an unknown station.
But most of the time we kind of go through the motions and do things automatically.
We have a daily routine, and in that time we're kind of susceptible to this brand-new form of advertising that we haven't experienced before as people, but is native to the Internet, which is one that knows everything about our behaviour and has tracked us for a long time and is looking for opportunities to just give us a little nudge.
So here's a blog by Cathy Carleton, who is a marketing executive.
And she noticed at one point, she flies US Airways all the time.
Sometimes you get early boarding, sometimes you get late boarding.
But she was consistently being asked to board last, to the point where she couldn't carry her normal bag with her.
And finally after months of this, she realised that they were pushing her to get the credit card.
If you got the credit card associated with the airline, you automatically boarded in one of the early zones. So some algorithm somewhere had figured out that this was the nudge that she needed.
And it just kind of made her life annoying for months until she broke down and got the thing.
I find it fascinating that these long con sort of advertising plays are even possible now in our world.
This is the Robot Devil.
I got into a weird Twitter thing, like I often do, a month or two ago, and just started talking about Computer Hell, trying to imagine.
And people were riffing back and forth with me what Computer Hell would look like.
It would have the Apple hockey puck mouse, that sort of thing.
And at one point I said, we have a very hated cable company in the United States. So I tweeted, "Computer Hell is proudly served by Comcast." And the Comcast chat bot woke up and replied to me and said "Good afternoon.
"I'd be happy to look into "any connection problems you're having." (audience laughter) Confirming that Comcast is the Internet provider to hell, or at least Computer Hell.
Chat bots are wonderful because of just the complete insincerity.
There's the first-person "I", which of course doesn't refer to anything.
And then there's the doubt.
You never know whether this is a really low-paid human or if it's a script or some combination of the two working together. My other one, unfortunately the response got deleted, but I said at one point it's "Sobering to think that the ad-funded company "that runs your phone, DNS, browser, search engine, "and email might not cherish your privacy," and then immediately afterwards said, "Google Home looks pretty great though!" And Google Home saw this and thanked me vociferously for my support until much later some human being went through and had to weed out the obvious irony.
Or maybe, I'm not giving them credit, maybe they have an irony detector built into their scripts, and they found this.
This is my ex-cat.
I miss her dearly.
But at one point I lived with a really smart fellow. And he called me over one time and said, "Look, I've trained your cat to fetch.
"She's doing it consistently." So he showed me.
The cat would come and she'd bring her little toy in her mouth. And he would pick it up and throw it, and the cat would bring it back and bring it back. And then get bored and fall asleep.
And we put our minds together, and with our triple-digit IQ actually realised that he hadn't trained the cat at all.
The cat had trained him to be a cat toy.
She would do this twice or three times a day. And even if he was in the middle of a programming task, he would kind of stop and throw the thing.
He'd been completely automated.
So I've been thinking about this situation a lot whenever I interact with Facebook, 'cause Facebook really knows how to get into our inner robot.
Like, we think that, and we're told by Facebook that we train it. We react to stuff, we engage, and it kind of improves the algorithm so it only shows things we want to see.
But what's really happening is that Facebook is training us to click, to share, to hover, whatever it is they're measuring, to do it more and more. They change things up until they maximise the amount of activity they can get out of us. This is a still image of, I think it's fighting in Syria. And this is live video, so it streams by and people can react in this really bizarre way with Facebook emojis about how they feel about what they're seeing. And the thing that is heartbreaking right now, as an American, about this kind of training is that Facebook has been a great source of disinformation. They show you really what you want to see.
And there was a group of people in Macedonia that ended up making a lot of fake Hillary Clinton stories. And they would get amplified by the algorithm and spread all around.
And they would make kind of ad revenue from it. So I thought it was nuts that there was a whole nest of hackers in Macedonia figuring out what American people would think was a plausible conspiracy theory.
And Facebook kind of insists that it is not a media company, it is a tech company.
So it's training us to click, and then it's kind of having these very, very widespread effects on our culture. Finally, I want to talk about Chad and Brad. I don't know these people.
I apologise if either of them is you.
This is my mental shorthand for programmers who are in a hurry and just can't get it, can't be bothered to get it right.
Chad and Brad.
So, Chad and Brad designed Pokemon Go.
They made it so that it had every permission in the world. It could access your, you signed in with your Gmail account.
It could read your entire Gmail account and do God knows what.
And if you want to make a conspiracy theory that somebody wanted a photo of the inside of everybody's house and full access to their email account, Pokemon Go would be an excellent example of it. But it turned out to be just Chad and Brad. They screwed up and apologised.
Here's another one.
This is Facebook Ads, where they allow you to target people based on their so-called ethnic affinities: African-American, Asian-American.
I can't even explain how illegal this is in the United States with housing or employment. And my hair is standing on end thinking about it. And the only way I can think that Facebook actually got this into production is that every Facebook lawyer who saw it had a heart attack and dropped dead of shock. (audience laughter) And since there were no black marks on it, they just put it out there.
So again, Chad and Brad just ploughing through 80 years of civil rights because they couldn't be bothered.
Here's a great one from Uber.
Uber has these flat fare zones, where you climb in and pay the same fare anywhere in the zone, which is basically the white part of town.
So it's a demographic map of Los Angeles on the right-hand side, where you see people of many colours living in many places but only the rich ZIP codes are represented, which is of course where the white people live. Even worse in Chicago.
I've shown you the shape of Chicago on the right-hand side.
And then you see the flat fare zone, which is just half the city.
So this line, which is the 55 Expressway, where black people start to live south of it, that's where the free ride ends for Uber.
Maybe it's malicious.
Maybe it's racism.
It's probably just Chad and Brad.
Yesterday, Facebook said, you know, there are so many elections in San Francisco at the local and state level.
So they gave me a button that said, "Hey, find out who got elected." And then they asked me for my address.
The day after Trump got elected, the company that knows everything about me, who my friends are, what I do and my interests are, the only thing they don't know about me is where I physically live.
That's what they want to know from me so they can tell me, "Oh yeah, that guy won." "Now he can subpoena us to find out exactly where you are." Nice.
Thanks Chad and Brad.
So let me recap.
Who are the suspects here? Who controls the robot armies? We have the military.
We have the police.
Evil hackers.
Google.
Facebook.
Amazon.
Poor, tired programmer who just wants to go home and get the task done.
And then brands in their incarnations as chat bots and things that interact with us.
I have to find out what my grand point is.
Oh no, it's a sad one, yeah.
(audience laughter) Damn.
I wanted to come here and like finish on this upbeat tone, saying, you know, we, we get to control the robot armies. And not we as programmers and designers, but as a society, we can do it.
But it's not true.
You know, it's not true.
The answer is whoever wants it the most.
Whoever wants it the most gets to run the show. And we've been terribly remiss about that.
We've had eight fat years in the tech industry, where we can build anything we want and see what happens. And now we've kind of run into the wall on it. I'm gutted.
I really don't know what to do.
The standard response is kind of denial of responsibility.
The whole phenomenon here is, we want control without accountability for it. We want employees without having to call them employees.
We want to show you exactly what you'll see to click the most ads without saying that we're doing anything editorial. We want to run things but not be called to task for how they change the world around us.
And I think that's got to stop.
It's got to end.
The standard response of my industry is, well, we'll just make more technology.
We're gonna make a next level of technology. We're gonna create self-driving cars.
We're gonna have better AI that understands you, that can talk to you by voice.
And that's gonna solve all the problems that we've created with this set of technologies. But of course there's never been a technology that doesn't bring additional problems of its own in. And so there's a delusional mentality that we're just gonna tech our way out of this problem until we build a super intelligence and ask it nicely to fix all the robots for us. And it can't work.
This is Futurama rendered by somebody with an enormous amount of time, and I'm super impressed by what they did.
So, when I flew here I kind of intentionally made these two stops, because I'm a wuss about jet lag. I didn't want to be jet lagged.
I was in Zagreb.
I had a talk.
And I decided to stop in Dubai and Singapore on my way to Sydney.
I didn't think about the implications of it, but when I was in Dubai I saw around me this brand-new city that had been built from nothing in the desert in the last 20 years or so and that was kind of built on the ultimate expression of the idea of the gig economy.
Basically on labourers who have no legal status, who have no path to legal status, whose relationship with their employer is completely defined by a contract that they can't even actually enforce in a court, because no court will rule for them, and who kind of form an underclass.
There's 80% expats in Dubai.
There's a layer of Westerners who are kind of well off and happy.
And then there's just the great dark matter, literally speaking, of people from the Indian subcontinent, the Philippines, who do the work.
And this is kind of one future that we have, where we kind of hide the labour but we benefit from it. And then I stopped in Singapore, a very, very different kind of society, kind of the world's best-tasting police state is how I describe it.
There's an interesting thing.
When you get into Singapore, they give you this card that says, "Warning.
"Death for drug traffickers under Singapore law." But the user experience is odd, 'cause they give that to you after you've already passed the border.
You don't know how hard it is to keep a kilo of heroin hidden away for three days while you're enjoying delicious Singapore snacks. But Singaporeans are happy.
It kind of messed with my mind, 'cause I'm tempted to always think that surveillance is terrible.
But they've made a trade-off that they're gonna have a surveillance state. They're gonna have a nanny state that keeps an eye on everything, charts the direction for the country, and, in return, they have prosperity and harmony of a sort and a very narrowed form of political life. So I came over here scratching my head.
And I think Dubai is horrible.
I think the modern slavery system is terrible. Singapore I don't know what to think about. But as Walter Sobchak says, "At least it's an ethos." At least they thought about what they were building and kind of aimed in that direction and went there. And we haven't done that at all.
(audience laughter) We just build the technology and hope that it fits together, hope that it makes us money, hope that it doesn't have too many terrible side effects. And we can't.
The party is over.
We can't do it anymore.
We built a panopticon, the most advanced surveillance society in the world. It's kind of lying there dormant, but it's about to be used by the American government. And because most of the Internet is in the United States, and it's been centralised, it's gonna affect you just as much as it does me. And we built it for no good reason other than it seemed like a good idea, and it got us some easy revenue.
So, we have to think of the robots.
We're responsible for these robots.
We've built all these systems.
They're depending on us.
They're not that bad.
They just need a little structure and guidance. And I think it's really on us to provide it, to think about what kind of a world we want and how we can use the tools that we have and that we built to make it that kind of a world.
And most importantly right now, how to protect the people who have come to rely on these technologies from the effects that we all anticipated but just didn't think would happen so soon. We have a lot of work to do.
And we have a lot of work to do keeping each other's spirits up, which brings me to my stirring conclusion, which is, thank you.
(audience laughter) It's been such a horrible week.
It has been an awful week.
And I'm so grateful that I was here in Australia surrounded by very kind people in a lovely place to receive bad news.
Nobody likes bad news, but it's best to get it, I found out, at this conference.
So, crossing my fingers for next year.
And that reminds me.
Next year, please come to my talk, which will be titled, Who Will Command the Robot Navies? (audience laughter) And if there's a little bit of time left, and if there are any questions, I'd be happy to talk to human people or robots in the audience.
Thank you.
(audience applauds)