Securing JavaScript

(electronic music) (audience clapping) - Hello everybody, thank you for inviting me to talk. And happy Pride Month to those who observe it. (audience member whoo'ing) (sporadic clapping) My people.

I hope you've all had a great first day.

This conference is extraordinarily well organised and well curated and I think the organisers deserve a round of applause for that.

(audience clapping) This is my first visit to Australia, and my first visit to Melbourne.

I'm enjoying it a lot even though it's raining. The coffee is truly excellent.

Everybody said that it would be.

Every time I said I was going to Melbourne, I just got a flood of @ replies, which were just lists of coffee shop names with no other context.

And I have gone to all of them and I'm extremely wired now.

But before we get started, it's very important that I take a speaker selfie because if it wasn't on Instagram, it didn't even happen, and there's lots of you, so two of them.

Excellent, great.

And now we can just like wing it for the rest of it. Hi everybody.

I'm Laurie, I helped to co-found npm, Inc.

I'm not the guy who invented npm, I'm the guy who like did all the other shit. But what I really am is a web developer.

That is how I identify.

Making the web bigger and better and getting more people involved in making websites is the thing that really drives me.

And people like yourselves, web developers, you're my people.

You're the people that I care about.

And today I'm here to talk about securing JavaScript, which is a ridiculously huge topic to attempt even in this ridiculously long time slot.

So let's break it up.

First, I'm going to give you some broader context about JavaScript and where we sit in the world today. And then I'm going to talk about threat models. Which is, I'm going to talk about how to decide where to spend your time, how to decide which of the many many things that are broken, is the one that you should fix. And then I'm gonna talk about some major security incidents at npm and what we learned from them.

And then at the end, I'm gonna tie all that stuff together to talk about what npm is doing to make JavaScript more secure and what you can do to help that effort.

So, first a couple of quick disclaimers.

First, this talk is kind of a downer.

There's really no way to talk about security that's gonna leave you feeling like jazzed and uplifted at the end of it.

No, it's gonna be kind of sad.

So sorry in advance.

And the second disclaimer, which I've already violated is that I use profanity a lot.

Profanity is particularly necessary when talking about security problems.

But let's be honest.

Even if I was talking about puppies and rainbows, I would curse like a sailor and I have no intention of changing that behaviour. So, let's get started.

If you didn't know it already, JavaScript is the most popular programming language that has ever existed in the history of time. Which is a weird thing to say but it's true. Here are some stats to bake into your brain. If you were at Code Leaders yesterday, you've heard some of these stats already, as well as a lot of tedious explanation of how I arrived at these numbers.

Which I'm not going to do again.

So the rest of you are just gonna have to trust me that I didn't just make these numbers up.

There are 11 million JavaScript developers in the world today.

We work in every industry and we work in every country in the world and 99% of the JavaScript developers in the world use npm. That is new.

In 2014 when npm, Inc. was founded, only about 50% of JavaScript developers used npm. But more and more developers have picked up npm and 2019 is the year that we hit 99% adoption. npm is also the largest collection of Open Source software that there has ever been.

Much bigger than any other package repository and much bigger than any of the apt-get type stuff. We hit one million packages in the registry a couple of weeks ago.

And so npm is bigger than all of the other registries combined.

And it's the biggest by however you measure, by number of modules, by lines of code, by number of users, whatever measure you pick. And it's not just that we have the most modules. On GitHub, JavaScript is the biggest language by number of repositories, number of issues, lines of code, and it has been for seven years now. Stack Overflow did a survey of 80,000 developers, and JavaScript was the most popular language, with 68% of developers saying that they use it. And the result of all of this JavaScript and all of these modules is that npm serves nearly 12 billion downloads every seven days.

And that number has grown by 26,000% since we started npm, Inc in 2014.

And not only are you writing more apps in JavaScript, you are writing bigger apps in JavaScript than you used to. The average modern web application, so a React App, has 2,000 packages in it.

Which means that 97% of the code in your application is downloaded from npm.

And you write only the last 3%.

It's the important 3%, it's the 3% that makes your app different from all of the other apps.

But it's only 3%.

And across a big company, it is not uncommon for a company to be using 25,000 unique packages out of the one million that are available.

And all of this code we use is great.

Using Open Source JavaScript is helping everybody write better apps, it's helping us write apps faster, it's finding more bugs.

It's unprecedented and it is part of what has propelled JavaScript to its position as the most popular language ever, but it is not free.

Every day you ship code into production written by people that you've never met, and that has resulted in a strange change at npm, which is that we started out thinking of ourselves as the package manager, and we have become a security company.

By default, we have become a security company, because there was just so much insecure shit going on that we had to stop it somehow.

So in 2018, we acquired a security auditing company called Lift Security, and they became part of npm. And we've been pouring resources and time into security since then.

npm runs an annual survey of our users, the first one got 16,000 responses, the last one got 33,000 responses.

And in last year's survey, 82% of people said that they are concerned about the security of the Open Source modules that they use. That number is up from 77% the year before that. 82% is good but the number should really be 100%. There should be nobody who is not concerned about the security of Open Source because it's worth being concerned about.

So what are we doing with all of this concern? Well 76% of us do code reviews, which is a good start. 46% of us perform automated scans, which is good. 23% of us get 3rd parties to audit our code, which is an excellent practice.

And 23% of us do nothing.

(audience chuckling) Not none of the above, not other.

Nothing.

I'm not gonna ask the 23% of you in this room who are doing nothing about security to raise your hands. But I am going to ask you to look into your souls. (audience laughing) But let's back up.

Because I dived straight into the stats about what we do and what we don't do to secure JavaScript but I didn't really define what I was talking about. What is a security failure? Is it something that steals data? Is it something that takes your site down? Not everybody uses the same definition.

So the one that I'm using today is that a security breach is any malicious act that ends up hurting your users, and by extension, anything that allows that to happen or could have allowed that to happen, is a security failure. So specifically, it's a security failure even if nothing bad happened, just that something bad could have happened.

Lots of companies say that it wasn't a security breach unless it was actually exploited, and I do not subscribe to that view.

So here's some bad news.

I cannot teach you how to write secure JavaScript. I mean I could try but it would take forever and who wants that kind of liability? It's much easier to make it your fault for writing insecure code.

And in any case, there's very little about security that is specific to JavaScript code.

Writing secure code is the same no matter what programming language you're using. You have to have defence in depth, you have to validate the hell out of your input sources, and if you use 3rd party code, you need to know where it comes from and whether you can trust it, and that's true of every language.
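
To make the second of those concrete, here is a minimal sketch of what "validate the hell out of your input" can look like in plain JavaScript; the field names and limits are made up for illustration, they are not anything from npm.

```js
// Minimal input-validation sketch; the field names and limits here are
// made up for illustration.
function parseSignupRequest(body) {
  if (typeof body !== 'object' || body === null) {
    throw new Error('request body must be a JSON object');
  }
  const { username, age } = body;
  if (typeof username !== 'string' || !/^[a-z0-9_-]{3,30}$/i.test(username)) {
    throw new Error('username must be 3-30 characters: letters, digits, _ or -');
  }
  const parsedAge = Number(age);
  if (!Number.isInteger(parsedAge) || parsedAge < 0 || parsedAge > 150) {
    throw new Error('age must be an integer between 0 and 150');
  }
  // Return only the fields we validated; silently drop everything else.
  return { username, age: parsedAge };
}
```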

But the third part, trusting 3rd party code, is where npm comes in.

It's especially important in JavaScript to validate your 3rd party code because so much of your code is 3rd party code in JavaScript.

And basically 100% of your 3rd party code in JavaScript is coming from npm.

And there are a lot of things that can go wrong with that. So to summarise that, JavaScript, it's big, it's important.

But let's talk about the threat models, which is basically a list of all of the ways that using a 3rd party code can go wrong and how seriously you should take each one. I'm going to cover six types of threat.

I'm going to cover Angry bears, I'm going to cover Denial of Service, Malicious packages, Accidentally vulnerable packages, Social engineering and Compliance failures. So our first threat model is an angry bear, literally an angry bear.

Like an actual animal that bursts into your office and starts eating people.

A bear that bursts into your office and eats people is going to be a huge security problem.

That is absolutely going to fuck up your day. You definitely do not want to have to deal with an angry bear breach.

But here's the thing, you probably don't have a recovery plan for angry bears. And the reason that you don't have a recovery plan for angry bears is because the chances of getting killed by an angry bear are vanishingly slim, especially in the software industry. But my point here is that threat models are not just about severity, threat models are about likelihood.

Which brings us to our next threat, which is Denial of Service.

It is obviously a big threat if the npm registry is down. Because you won't be able to get any of your code, you won't be able to do any deploys.

It will completely fuck up your day.

People are very worried about the uptime and reliability of the npm registry, and we are constantly getting polite suggestions about how we might improve that.

And that is strange because the npm registry has uptime like you wouldn't believe.

The npm registry has 99.98% uptime.

So in the last three years, we've had two hours of downtime.

And our last outage was nine months ago.

But left-pad, you say.

(audience laughing) We're going to talk about left-pad later.

But my point here is that in general the registry being down is really quite rare. But the amount of effort and time and thought that people put into what happens if the registry's down, is out of proportion to the actual threat that that poses. The next most obvious threat is a malicious package. This can either be a package that was malicious to start with or it can be a package that was benign to begin with and then got hijacked and turned malicious. At npm, we detect and intercept malicious packages. As they get published to the registry, teams of robots get spun up and look at what's going on in each package. And as you can see in blue, the absolute amount of malware that is published to the registry is going up every single year.

And in red, the percentage of all publishes that are malware is also going up.

But it's also worth noting that these numbers are very small.

They're not even breaking triple digits.

And we have millions and millions of packages. So malicious packages are a real threat, and we're going to talk more about them and people really do try to sneak bad code into the registry and they use all sorts of weird ways to do it. But less than 0.1% of publishes to the registry are malicious, and because we destroy malicious packages before they get out to the world, almost none of them are actually available. I'm not going to say none, because if I ever said there were no malicious packages on the registry, I absolutely know that somebody would go, but I published one.

(audience laughing) So I would never make that claim.

So it is worth thinking about malicious code and how to mitigate that threat.

But what's worth far more of your time and far more of your effort is accidental vulnerabilities. These are packages maintained by good actors who were just trying to write code and have accidentally created a security vulnerability that has gone out to the registry.

Every year we discover hundreds of these, much much more than anything else.

And sometimes they're discovered immediately, sometimes they lie around for years before people discover them.

But once they're known, that is when the real trouble begins.

Because you folks never update your deps. I'm going to talk later about npm audit but npm has stats on how often people install packages that have known security vulnerabilities and that is 33% of the time.

A third of the time you are downloading packages that we know, that we're telling you, are insecure, and you're installing them and publishing them and putting them into production anyway.

So vulnerable packages are less bad than malicious packages but they're also orders of magnitude.

Multiple orders of magnitude more common.

So this is where we should be spending a lot more of our time, but we're not.

Because the situation is really really bad. How bad exactly? Think of an industry that you think probably takes security really seriously, like say, the finance industry.

We did an analysis of eight of the world's largest banks, partly because, you know, it's important that they be secure and also partly because it's very easy to analyse them. Because banks are very old fashioned.

They tend to run a lot of their own hardware, and they tend to register their IPs to their company name. So we can just look at the IPs that are hitting the registry and do a reverse lookup of what company owns them and then yell at the banks.

They also use more JavaScript than any other industry, according to our survey.

And the results of the combined analysis of eight of the world's largest banks is, this is what they did in a single month in 2019. They downloaded 22,500 individual unique packages and they downloaded them over 23 million times. You would think banks, they have a lot of money, they should probably have a caching server somewhere, why would they need to download any individual package more than once? We also have that question.

(audience laughing) But 22 million downloads it is, from the eight largest banks.

3% of those packages had known vulnerabilities, 55% of those vulnerabilities were critical vulnerabilities, and the banks absolutely downloaded thousands of copies of critical vulnerabilities and put them into production. These mother fuckers are banks.

(audience laughing) They have money.

They have real money.

Not like fake Bitcoin money, actual money.

My money.

And they should really be taking it a lot more seriously. The next threat to consider is social engineering. Which as we're going to see in a bit, it's very closely related to malicious packages. Social engineering is not a threat that's unique to JavaScript but it is a much bigger threat in JavaScript because JavaScript is uniquely social.

Every single day, like I said, you ship code into production from thousands of strangers.

And it is amazing that that works at all.

It's a testament to humanity and our ability to cooperate, it's genuinely heartwarming.

But as we're going to see that is a system that is currently straining at the seams.

And it's worth putting real money and real time into fixing it.

And the final threat model to think about is Compliance failures.

Because while denial of service can make you unavailable, and getting your data stolen can hurt your users, and getting hacked can hurt your users, getting sued can make you permanently unavailable. 'Cause you just go out of business.

And there are a bunch of ways to get sued.

And we have some survey data that's relevant here. In our last survey, we asked people if a licence on a package affects whether or not they use an Open Source package and 58% of people said that it did.

Which was kind of surprising, I didn't know that we were taking it that seriously. And within that group, 55% of people said that their company specifically prohibits them from using certain software licences, which is taking it more seriously than I thought companies were.

It means overall 29% of JavaScript developers cannot use software packages with certain licences. Which licences? Well the GPL and the AGPL are predictably unpopular because of the restrictions that they place on commercial software.

But much bigger than that was unrecognised licences or unlicensed stuff. Basically anybody who cares about software licences has hired a lawyer to tell them which software licences are okay.

So if you use a licence that they've never heard of, they have to go back to the lawyer and hire the lawyer again, and nobody wants to spend more time interacting with the lawyer than they absolutely have to. So put some licence on your code, for the love of God, but stick to one of the big popular licences. If Facebook cannot get away with making up its own version of the MIT licence, then you absolutely can't. So apparently companies really care about licences. Do you wanna guess what happened when we looked at the licences used by the big eight banks in our study? Without exception, the banks are using licences that they are not supposed to.

They're using licences that they explicitly said they are not using, in like published documents. A lot of them are using unlicensed packages, which, depending on which country you're in, is just illegal on its face.

If there's no licence on the software, you're not supposed to be able to use it in various parts of the world and they're using it.

So we know, like just off the face of it, you've broken the law.

Which is a great way to start a sales conversation. So a question worth asking is if banks have unlimited money and they're fucking up security and compliance, what hope do we have? How are they getting it so wrong? And the problem here is that JavaScript snuck up on these companies.

26,000% growth in five years is very hard to get your head around even when it's happening to you. And the banks, it's not their primary concern. Banks aren't used to reacting to a massive change in how they're supposed to do things that happens in less than 5 years.

Because just like five years ago, the people who wrote JavaScript in your company might have just been a handful of people, who were probably still just using jQuery. But now there are companies who employ thousands of people who do nothing but write JavaScript all day. And they use thousands of packages to do it. Suddenly these companies are major users of JavaScript, but they don't have any of the same rules or processes or guidelines or training around how they do that, that they do around their other languages.

People are just sort of winging it and hoping that nobody notices quite how much JavaScript they're using. And the other reason is that the way security works in JavaScript is necessarily different because of the scale at which JavaScript works. The root of that difference is the number of packages. A JavaScript app that uses as many as 100 individual Open Source modules is still fairly rare, but a JavaScript application that uses 2,000 modules is the median.

And lots of them are using way more than that. So when your company is using 25,000 individual packages, you can't possibly inspect all of the code yourself. You're never going to get around to it, you can't possibly read every single licence yourself and have a conversation with your lawyer about it. But there are still many companies where doing that is technically, officially, the policy. There are some companies where, if you want to use a new Open Source library, technically what you're supposed to do is fill in a form and submit it by email and hope somebody gets back to you about whether that particular library is okay. And they have not done that for all 25,000 of the libraries that they're using.

There's no possible way that they could.

So when building npm enterprise, one of our most common feature requests is blacklists and whitelists, which is exactly that kind of feature.

People wanna be able to specifically include and exclude certain packages from their stack but that's not gonna work either.

You use 25,000 packages.

You're not even gonna get done listing them before the list is out of date.

And even if you did, your security team, and obviously you all have dedicated security teams, right, right? I see a lot of nodding in the room.

They would have to spend their entire lives reading security bulletins to keep on top of the problem. So you have to throw out the processes used for other languages and adapt to how JavaScript works because, like I said at the beginning, it's big and it's important and it's a huge part of your stack.

So npm really does have a security team and they really do spend their whole lives reading security bulletins to keep on top of the problem. And then we put those security notices into a feed and we give it to every single user of npm. For enterprise customers, we do even more, and we also include reporting for compliance stuff. So the threat model is different in JavaScript. Bears, not a big problem.

Uptime, not really a big problem.

Malicious code, a small but very scary problem. Accidental vulnerabilities, a massive problem. Social engineering, a huge problem.

And Compliance failures, an epidemic.

And all of this is happening at a scale where only automation will work.

Like I said, this talk is kind of a downer.

So now let's look at four major security incidents that affected the registry, and what kind of failures they were, and what we learned from those failures.

And let's start with the oldest one, which is left-pad.

Hands up if you've heard of left-pad.

Hands up if it directly affected you.

Right.

This comes as a surprise to lots of people because people who were around at the time of left-pad think of it as this seismic event that we just kept hearing about for years afterwards. But JavaScript has grown so much since left-pad, that the number of people who were actually around when it happened is less than 20% of all JavaScript users. Most people haven't even heard of it, and most people don't remember what it was about. So left-pad was a module that was 10 lines long, and what it did was it aligned the text to the right. And the way that it aligned text to the right was it added spaces to the left.

Very useful for command-line apps, and those command-line apps were in tooling, and the tooling was in other packages, and eventually, every goddamn package on the registry in 2016 was somehow depending on left-pad, which did almost nothing.
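
For anyone who has never seen it, this is roughly the shape of a left-pad style function; a sketch for illustration, not the exact source of the published package.

```js
// Roughly what a left-pad style function does; a sketch, not the exact
// source of the published package.
function leftPad(str, len, ch) {
  str = String(str);
  ch = ch === undefined ? ' ' : String(ch);
  while (str.length < len) {
    str = ch + str;
  }
  return str;
}

leftPad('42', 5);      // '   42'
leftPad('7', 3, '0');  // '007'
```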

So what went wrong? What went wrong was a totally different package called Kik. You may have heard of Kik because they are a messaging app that then did a huge ICO and then turned into a massive legal tyre fire because it turned out that their ICO was illegal. But at the time they were just a messaging app and not yet a huge legal problem.

And they wanted to publish a JavaScript package to npm, something to do with their messaging app and it was going to be called Kik.

But it turned out that somebody else had already published an npm package with the name Kik and his name was Azer.

Kik emailed Azer asking if they could have the package name and this usually works.

This is how 99% of package name disputes on the registry are resolved, somebody just emails somebody else, and says, hey you don't seem to be using that package for anything important, could I have it? And the other person goes, cool no problem. Open Source! That did not happen this time.

Azer told Kik to fuck off, in exactly those words. And Kik still wanted the name.

So they, very unwisely, escalated with some vague legal threats about trademark and copyright, which absolutely did not apply.

And Azer got even more mad and told them to fuck off again. So then they emailed npm support team and said, hey can we have this package name? We have like 30 million users, and this other package, it has literally zero users. Which is true, nobody was using it but Azer. And at the time we thought that was a pretty reasonable argument.

Like if you're looking for a package called Kik on the registry, you're probably looking for something to do with this messaging app, and not, you know, Azer's pet project that no one but Azer is using. And that was absolutely the wrong call.

That was a dumb lack of policy that we had, because we were young and foolish and it was 2016. It was a gigantic mistake.

I will regret it forever, but we did it.

And we took the package name away from Azer and we gave it to Kik.

Azer was so pissed off about this. Azer was not a super big fan of capitalism to begin with, but he was even more pissed off with us now, and he vowed that he would never use the registry again. And Azer had published hundreds of packages to the registry and he unpublished them all.

He took them all away.

Which at the time, was still a thing that you could do. So left-pad just randomly disappeared at like 2:00 p.m. on a Tuesday and everything on npm broke simultaneously. So we put it back.

And there was a certain amount of controversy about putting the package back up.

So I should clear up that left-pad was released under the WTFPL licence.

The WTFPL licence, that's it on the screen. There is no more text to the licence.

That's the full thing.

Lawyers hate the WTFPL because it turns out "what the fuck you want to" does not have a super clear legal definition. (audience laughing) And what happened was a new maintainer showed up in the middle of, you know, like 2:15 on this Tuesday and said, hey I would like to publish a new package. It's called left-pad, it's exactly like the old left-pad. And it will incidentally fix this huge problem that you're having right now.

And we were like, cool.

Thank you, random citizen.

And so we let him do that and he did.

So a bunch of people accused us of stealing Azer's code. So first, we didn't do it.

That other guy did it.

(audience laughing) And second, he didn't steal shit.

The WTFPL allows you to do whatever the fuck you want. I don't know how much clearer you could be. So the WTFPL allowed us to publish a new version of left-pad, and we did.

Incidentally in 2017, a function called padStart got added to the JavaScript language as a direct result of this incident.

It does exactly what left-pad does but like a million times faster.
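
If you want to see what that looks like, padStart has been built into the language since ES2017.

```js
// Built into JavaScript since ES2017; no dependency needed.
'42'.padStart(5);      // '   42'
'7'.padStart(3, '0');  // '007'
```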

And there's now absolutely no reason that you should be using left-pad for anything but it still gets three million downloads per week, despite being like the byword for a package you don't need for anything.

It has three maintainers, one of whom is a relatively famous Australian. It's MIT licensed now, and my favourite part is it has TypeScript definitions in it. (audience laughing) They updated it to add TypeScript definitions but they did not update it to just turn it into the word, padStart, which is all it needs to be.

And Open Source is often kind of silly but I find that whole situation delightful. What we learned here, other than do not give package names away at random to large corporations, is that reliability is the same as security. People expect npm packages to be there and if they're not there, that is our fault. If they're not there that is a security breach, because our users are being hurt.

They're not there if people can delete them whenever they want. So we changed the rules.

Now if you publish a package, you have 24 hours to change your mind about that, and after 24 hours, it is there forever.

You cannot take the package down, you cannot unpublish a package except in very extreme circumstances.
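
Concretely, the commands involved look something like this; the package name and versions are made up:

```sh
# Inside the window after publishing, you can still back out:
npm unpublish my-package@1.0.1

# After that, the version stays up; the usual alternative is to deprecate it
# so installers at least get a warning:
npm deprecate my-package@1.0.1 "accidentally published, use 1.0.2 instead"
```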

And usually that's great.

It has made the registry a lot more reliable, and it only inconveniences a very small number of people. A bank took three years to notice that they'd accidentally published some of their IP to the registry, and they got really pissed off. I'm like, why are banks so bad at this? (audience laughing) So left-pad was an uptime problem.

The servers didn't fail, our policies failed. But the result was the same.

The package wasn't there, and people experienced downtime. So the second case is ESLint, which happened in July of 2018.

ESLint is much simpler to explain.

ESLint is a linter that checks your code for errors and style.

It's Open Source, and like many prominent Open Source packages, it's maintained by a team of strangers who've never met each other, who collaborate over the internet.

Every single one of them had two-factor auth enabled, apart from one.

And unfortunately, that same one guy had used his username and password on another website and then reused the same username and password on npm. The other website got hacked, their credentials all got stolen, a hacker took those credentials and reused them on npm, and suddenly had access to publish ESLint. To all of its millions and millions of users. So the attacker impersonated that member of the team and published a new version of ESLint, and this new version of ESLint had code in it that attempted to steal the npm credentials of whoever was installing it.

That is a credentials harvester, and it could have been a huge problem.

'Cause it could have meant that they could then publish a package as any of the other people who use ESLint, which is almost everybody.

ESLint has like 80% market share.

It's an absurdly popular package.
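
For context on how a package gets to run code on your machine at install time at all: npm runs lifecycle scripts like preinstall and postinstall automatically when a package is installed. The sketch below is purely hypothetical, with made-up names; it is not the actual ESLint payload.

```js
// Purely hypothetical illustration of the install-time attack surface;
// the names are made up and this is not the real ESLint payload.
//
// In the hijacked package's package.json:
//
//   "scripts": {
//     "postinstall": "node ./collect-credentials.js"
//   }
//
// npm runs that script automatically on install, which is why a single
// hijacked publish can execute code on every machine that installs it.
```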

But the attack failed.

The attack failed because the attacking code had a bug, and it was not a particularly subtle bug.

Like it was not like you know some weird edge case bug. It broke nearly all of the time.

And so our model of this attacker is that they were a very experienced hacker, because they did all sorts of shit like, you know, hide their IP address and stuff and they came in through Tor, and it was all very sophisticated, but their JavaScript was awful.

(audience laughing) So I have to assume that it was somebody who had never written JavaScript before.

So lots of people noticed this weird broken ass JavaScript that was suddenly in ESLint, and they all reported it simultaneously, and so something like 20 minutes after the new version of ESLint was published, we had already taken it down. At the time of the ESLint attack, we'd already introduced two-factor auth, like I said. Everybody except that one guy had enabled two-factor auth. So we learned two things.

First, supply chain attacks are real.

Until this point, they'd just been a theoretical vulnerability. People will attempt to attack a company by attacking a package that they know that they use, or a package that is depended on by a package, depended on by a package that they know that they use. And the second is that two-factor auth isn't enough, if you can't enforce it.

So just one weak link in that team was enough to break the chain of security.

So now we have two-factor auth enforcement. Which means you can say that for any individual package, it cannot be published unless that user has two-factor auth enabled.

That would have helped the ESLint team, and most people who publish major packages use it now.
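
In practical terms, and with the caveat that this is npm 6 era syntax which may have changed in newer clients (check npm access --help), the two relevant commands look something like this:

```sh
# Turn on two-factor auth for your own account, for both login and publishing:
npm profile enable-2fa auth-and-writes

# Require two-factor auth from anyone who publishes a given package
# (npm 6-era syntax; newer clients expose the same setting differently):
npm access 2fa-required <package-name>
```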

So ESLint was a design failure on npm's part. We built security systems like two-factor auth but they weren't robust enough.

They weren't convenient enough that people were using them, and there was no way to enforce them.

So our third case is event-stream.

Event-stream was another supply chain attack and it was executed in November of 2018 and it was considerably more sophisticated. Instead of trying to do something at install time, it tried to do something at run time.

And it was incredibly sneaky about it.

It would only do something at run time if it was running inside of the particular application that it was trying to attack.

At no other time would it execute.

Everybody else who got it, who downloaded it, didn't get attacked at all.

Only this one specific application, a cryptocurrency wallet.

And if it was running inside of that cryptocurrency wallet, it would attempt to steal your money.

"Money".

(audience laughing) "Your".

(audience laughing) So the attacker had succeeded by appearing helpful. What they did was they wrote an entirely new module called flatmap-stream that added a new feature to event-stream, and they submitted a PR to event-stream saying, here, take this new feature.

And the maintainer, who was a prolific Open Source author who maintains hundreds and hundreds of useful packages, gratefully accepted their help.

It was a very useful new feature.

And then, like a couple of weeks later, the attacker offered to just take over maintainership of all of event-stream.

And the author gave them control of event-stream and that was when they injected the malicious code. So once again, the package was detected before anybody's money got stolen.

As with ESLint, the community was the first to notice that something was weird with this package. Just because there's so many npm users and they're all developers, that anything that is strange at all about a package immediately hits millions of eyeballs and these eyeballs all go, this is weird.

And then they hit the report vulnerability button. So lesson one of event-stream was that our community is an important part of our threat detection process. And lesson two was also about community but this time it was about a failure of our community. Social engineering is often the most effective kind of hacking.

And if you looked at the attacker's profile, it looked relatively legitimate.

They'd done a bunch of useful Open Source, they'd done it all in the last three weeks. That's a really subtle thing to notice.

That if they'd done all of this stuff in the last three weeks, they were new. And even then, what if they just happened to be new? Not everybody who's new is automatically evil. And the original author was just too tired to do deeper diligence on this person by looking at their last three weeks of activity. Because why look a gift horse in the mouth? Everybody loved the packages that this author wrote and, like I said, he maintains literally hundreds of packages to this day, but not enough people had stepped up to help him out maintaining all of this Open Source.

And as a result, he was completely exhausted. Being completely exhausted didn't create a security problem. Being completely exhausted is the security problem. The fact that Open Source all depended on this one dude not getting tired is the flaw in our system. So the final case I want to cover happened just two weeks ago.

Have you noticed that these are getting closer together? And they also always seem to involve cryptocurrency somehow.

It's almost as if turning math into money was not the best idea.

This was another supply chain attack on a different cryptocurrency.

This one's called Agama.

And this time npm's security team detected the problem. Again they detected it before it did any real damage, this time using the internal tooling that I mentioned, which runs a bunch of robots against every single new package published to see if anything is suspicious.

Once again the attacker used social engineering. Once again, the attacker created a genuinely useful new package, this time called electron-native-notify.

And they submitted a PR to EasyDEX-GUI, which was being used by the Agama wallet.

And at the time that it was added, even if somebody had inspected every single line of code of electron-native-notify, it would have been completely benign, there was nothing wrong with it when it got added to the package.

As soon as it was part of Agama wallet though, they published a security patch which added the malicious code.

Which is technically what they called it.

Security patches usually make things more secure; this one made things significantly less secure.

I suppose it's technically also a security patch. The lesson here is a hard one which is that Open Source contributors can no longer be assumed to have benevolent intentions.

That is an attack on the very foundations of how Open Source is supposed to work.

And it's a huge problem so I'm gonna discuss it more later but one thing I will say is that this is the attacker's actual profile picture on GitHub. (audience laughing) So if you're gonna be a black hat hacker, maybe be a little less obviously a black hat hacker, I feel. So now we've covered the things that we think that you should be worried about. And we've covered some real world situations about how those things go wrong.

You'll notice I didn't talk about any accidental vulnerabilities and that is because even though they are overwhelmingly the most common problem, they are also the most solved problem.

So let's talk about how npm solves that problem and also what we're doing about the other problems. Some of these problems are policy changes, some of them are code.

But before I do that, I need to split the room up into two teams. And if you were here yesterday, you can't give away how this works.

Everybody on this side of the room is Team A, everybody on this side of the room is Team B. Let me hear it from Team A.

(Team A whooping) Team B.

(Team B whooping) Team A again.

(Team A whooping) Team B again.

(Team B whooping) I'm not using this for anything, it's just a wake-you-up, after 30 minutes of talk. (audience laughing) So I'd already mentioned the unpublish policy. Which was our response to left-- (chuckles) People never think that I'm not gonna use it. They're always like surely he will? No. (audience laughing) Our response to left-pad was to change the unpublish policy. People rely on the registry to be up, they rely on packages to remain where they are. And we do our best to make that happen.

Mostly you can't delete packages.

Sometimes we still have to delete packages, we delete packages if they turn out to be malicious and sometimes we have to delete packages as a result of DMCA takedowns and other kinds of, like, copyright nastiness. But we employ a lawyer whose full-time job is to tell other lawyers to fuck off and read documentation about how software works. Because most of the legal threats that we receive, and we receive legal threats every single day, are completely bogus.

It's one of those sort of like lesser known things that npm provides to the world.

It's like pro bono legal services for all of Open Source. What we don't do is we don't delete a package if it has a known vulnerability.

If it's just an accidental vulnerability, we do not delete the package.

And the reason we don't do that is because people get really mad when we do.

Because people are like, it's my responsibility to decide if that vulnerability is bad, it's my responsibility to decide when I address that threat. Don't break my builds.

And that's a reasonable position and so we only mark them as insecure.

Which brings us to the next feature which is npm audit. This was introduced in npm 6, and it is the single biggest improvement we have ever made to the security of JavaScript. Every time an npm install runs, we do a quick audit that tells you whether or not you have installed any insecure packages.

In the last 30 days, we have run 335 million quick audits. And like I said, about a third of the time they find that you have. But a third of the time is actually better than it used to be.

It used to be half of the time, so people are slowly learning that npm audit exists and that it can help them.

If you run the npm audit command you get a detailed list of vulnerabilities including instructions on how to fix the vulnerability. And usually how you fix the vulnerability is you just upgrade to a newer, more secure version of that package.

Which means that you just run npm again and if npm is giving you this command which just tells you to run npm, why doesn't npm just run npm? That seems like a reasonable request.

So that's what we do, npm audit fix follows its own advice and it runs npm for you.

It upgrades all of your insecure packages to secure packages.

I wish we could say that when we invented npm audit we thought of this feature from the beginning, but actually we released npm audit and for the first week people were like, why does it not just run npm? Why does it keep telling me to run npm? And we were like, you're right it should.

And so we added audit fix.
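
For reference, the two commands in question; nothing here beyond what's described above:

```sh
# Print a detailed report of known vulnerabilities in your dependency tree,
# including which versions fix them:
npm audit

# Apply the recommended upgrades automatically:
npm audit fix
```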

Who here knows what I mean by SemVer? Okay, so for those who didn't raise their hands, very very quickly, semantic versioning or SemVer is core to npm.

It is a contract between developers about the magnitude of a change to a package. So the first number is for breaking changes. If that number changes, then you must expect that your code will be broken. If the second number has changed, that means that a new feature has been added. And if the third number changes, it's supposed to be just a security patch.

If the second or third number changes, it's supposed to be safe to just bring in the new version of the package.

But obviously you should be running your tests every single time.
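
As a concrete reading of the contract, with made-up version numbers:

```js
// Reading a SemVer version like 2.4.1 (numbers are made up for illustration):
//   2 -> major: breaking changes; expect that your code may need updating
//   4 -> minor: new features, backwards compatible
//   1 -> patch: bug and security fixes only
//
// Under the contract, moving from 2.4.1 to 2.4.2 or 2.5.0 should be safe to
// pull in (still run your tests); moving to 3.0.0 may break you.
```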

So audit fix by default follows SemVer, which is to say it'll only bring in safe upgrades. But if you run it with --force, you can also bring in unsafe upgrades, and then just, you know, run your tests again. Speaking of tests, a really great way to prevent insecure code getting into your package is to automatically fail your tests if npm audit detects that there's a problem with your security.

This is relatively straightforward to do because npm audit exits with a nonzero code if it detects any vulnerabilities.

So you can add audit to your test command and your tests will fail if it detects any vulnerabilities. Now you might not want to fail every single time it detects even a minor vulnerability, so you can configure it with the audit level config option to, you know, fail only if it's a critical vulnerability. If only if the high vulnerability, something like that. We recommend that everybody does this.

There's no reason not to.
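
One way to wire that up in CI, as a sketch; the threshold is up to you, and the test runner is whatever you already use:

```sh
# npm audit exits with a non-zero code when it finds vulnerabilities, so it
# can gate a build. Here it fails only for "high" or "critical" findings
# before running the normal test suite:
npm audit --audit-level=high && npm test
```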

I mentioned already that we introduced two-factor auth. Two-factor auth is important for anybody who has an npm account but especially if you publish popular packages.

Only 7% of npm authors at the moment have two-factor auth enabled, but that number has grown by 300% in the last six months. And it matters, because the 7% who have it enabled tend to be the most prolific software authors: 50% of all downloads from npm are covered by that 7% of authors.

Another way to get your credentials stolen is to accidentally leak them.

People publish them in packages all the time, even though we tell them not to.

And they also publish them to GitHub repos all of the time, usually by accident.

As of a month ago, we have a partnership with GitHub that automatically detects whenever you have published your npm credentials to GitHub.

And automatically revokes those credentials. It has been saving the asses of 100s of people for the last couple of months and we've only just publicly started talking about it yesterday.

So you're welcome.

We also get a lot of requests for package signing, author package signing.

But it's not clear what threat this addresses. I've been talking about all of the threat models and the various ways that this goes wrong.

Package signing proves that the official author published a package.

Accidental vulnerabilities are always published by the official author of the package.

Package signing will just be like, yep that's fine. ESLint would have appeared to have been published by the official author.

So would event-stream.

So would have electron-native-notify.

The registry itself uses HTTPS, so it's kind of hard to man-in-the-middle the npm registry.

But the npm registry does sign packages with our key. So you can always verify that the package that you've downloaded is the package that was supposed to be on the registry.
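
Separate from the registry's own signing key, the piece of this that most people actually touch is the integrity metadata npm records and checks; for example:

```sh
# The registry records checksum metadata for every published tarball, and the
# npm client verifies it on install. You can inspect it yourself:
npm view left-pad dist.shasum
npm view left-pad dist.tarball

# Your package-lock.json also pins an "integrity" hash per dependency, so a
# tarball that doesn't match what was locked will fail to install.
```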

And we might still introduce author signing, there are some use cases that it helps with, but the case for it is not super strong.

I already mentioned that our biggest defence in security is our 11 million users.

And the report vulnerability button is how they do that. It lives on every single package page and anybody can hit the button, and whenever you do, an actual human reads your report and decides whether or not the vulnerability that you think you found is a real problem.

Most of them are read by this guy called Andre, he's delightful.

Because npm has a growing team of actual expert security humans whose whole job is to keep npm secure and by extension keep JavaScript secure.

They are fantastic folks who work incredibly hard. If you ever bump into any of them, it is worth congratulating them because they have a thankless job that is incredibly hard. In addition to the manual review, as I mentioned, we have a growing system of automated review. We have a bunch of robots living in containers that spin up every time a new package is published, that run all sorts of weird automated tests to find out if the package is doing anything strange. Because automation is easy to game once you know what the automation does, we tend not to talk about exactly what it is that we are doing with those robots. So all I'll say is, they're robots, they do stuff. And then in the case of Agama, it worked.

If you have a big team, there's another form of security issue that can happen, which is that people will leave your team and you forget to revoke their credentials. A common solution to that is single sign-on, or SSO. SSO is not part of the public registry but it is part of npm Enterprise, which is one of npm's paid products.

I am not going to do a big sales pitch for npm's paid products, because I don't really need to.

We have built a product that serves the needs of big companies using JavaScript.

I feel like this whole talk is an excellent demonstration of why big companies need to be taking JavaScript security more seriously.

It solves a bunch of problems around security, authentication, reporting, compliance, all of the stuff that I've been talking about. And I've already mentioned how bad big companies are at this stuff.

So buying it is kind of a no-brainer, and if you're not a big company, we have a product called Orgs that does many of the same things and is much cheaper. But obviously we are not yet done.

The number of vulnerable packages should be zero. The number on the registry should be zero. So we continue to grow our security team, we continue to improve our automatic analysis. Everybody likes to throw around machine learning. It's a buzzword, it's super fun.

You know, there've been some talks about fun stuff that you can do with machine learning.

Usually machine learning is kind of bullshit. But when you have 10 million versions of packages, and you have a whole bunch of known packages that are insecure, you can actually use machine learning to do useful things. So we are going to be doing those things.

One thing we will not be doing much of is static analysis. I do not know why there are so many security companies who loudly tout their ability to do static analysis, reading code to find out whether it's good or bad.

Because that is provably impossible.

Alan Turing himself, in 1936, proved that you can't; it's the halting problem.

You cannot read code and figure out whether it is going to do something good or bad. So I do not know why nearly 100 years later, we still have companies pretending like that is a thing that you can do.

Static analysis can help find code quality mistakes and therefore catch mistakes that lead to vulnerabilities. But most bugs are much much more subtle than anything that static analysis could find. This slide is here specifically to say that I've thought about the blockchain and that I have nothing to say about it except that it is not a way to help with the security of anything.

Indeed it appears to be quite the opposite. One thing we have to do a lot more work on is addressing social engineering.

The npm community and the Open Source community in general have worked until now by everybody knowing each other. That has two problems.

First, it excludes people who are not part of the physical, social network of people who already know each other.

And that decreases the diversity of the people involved, and decreasing the diversity, studies have shown, reduces the quality of the work produced. So the network effect that keeps Open Source going is decreasing the quality of the code that Open Source could produce.

And second, Open Source is so big now that even just knowing each other is not scaling anymore. And that is what the event-stream attack showed. So to fight social engineering, we have to improve social signals delivered by npm. That is going to involve reputation scores and that is a minefield because as soon as you give humans a score, you've turned it into a game.

And humans love winning games, which tends to destroy the value of that signal. There are a bunch of shitty projects with a tonne of GitHub stars written by people who are much much better at getting GitHub stars than they are at writing code. So we have to think very carefully about how we implement those social signals. But even trickier than that problem is the maintainer burnout problem that I already talked about.

This is more than just a security problem.

This is a fundamental problem with the nature of Open Source in 2019.

We, as a community, have to figure this out. We have to figure out how to make sure that Open Source maintainers receive rewards appropriate to the scale of the value that they provide to the world.

This is a problem you should be spending time on, this is a problem that you should be spending money on. I don't have a solution right now but we absolutely have to solve it or this whole edifice is going to collapse. Because security is hard.

At npm, our position in the ecosystem gives us a lot of leverage to solve certain kinds of problems and we are going to continue to do those things and improve them all the time.

But ultimately it is a big scary internet out there, and it gets bigger and scarier all of the time. So your paranoia is justified.

Ultimately the security of JavaScript is up to you. But it's also up to you as a member of the JavaScript community: you can help by stepping up.

You can report a vulnerability, you can ease the burden of an Open Source maintainer. We have a whole lot of work to do about Open Source security.

But we can fix this stuff together.

So be safe out there, and thank you all so much for your time and attention. (audience clapping) (electronic music)