WebAssembly at the Edge

In short order, WebAssembly has gone from an experimental project to a fully fledged, high-performance compilation target and runtime in today’s browsers, allowing many of the most widely used programming languages to natively target the Web platform.

In this presentation Aaron Turner brings us up to speed with what’s now, and next, for WebAssembly.

WebAssembly at the Edge

Aaron Turner, Software Engineer, Fastly

Aaron has made or is involved in lots of cool projects around WASM (WebAssembly) and WASI (WebAssembly System Interface). He’ll be talking about WASM, WASI and the Edge.

So what is WASM? It’s a universal, low-level bytecode for the web. It’s great for computationally heavy work that doesn’t fare well in JavaScript. It runs in the major browsers, in Node.js and in standalone runtimes.
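
Whatever language a module came from, loading it looks much the same everywhere. A minimal sketch in TypeScript, assuming a hypothetical add.wasm module that exports a simple add function:

    // Hypothetical module: "add.wasm" exporting an add(a, b) function.

    // In the browser: stream, compile and instantiate straight from the network.
    async function runInBrowser(): Promise<number> {
      const { instance } = await WebAssembly.instantiateStreaming(fetch("add.wasm"));
      const add = instance.exports.add as (a: number, b: number) => number;
      return add(2, 3); // 5
    }

    // In Node.js: read the bytes from disk, then compile and instantiate.
    async function runInNode(): Promise<number> {
      const { readFile } = await import("node:fs/promises");
      const bytes = await readFile("add.wasm");
      const { instance } = await WebAssembly.instantiate(bytes, {});
      const add = instance.exports.add as (a: number, b: number) => number;
      return add(2, 3); // 5
    }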

There are lots of languages that compile to WASM, but the most mature are:

  • AS – AssemblyScript (if you can read TypeScript you’ll be able to read AssemblyScript)
  • Rust has good tooling for WASM
  • Emscripten – a very mature C/C++ toolchain that now compiles to WASM

WASM uses linear memory – it’s like one big array of bytes that you can share between WASM and JS. This makes it easy to partition – great for security. WASM also relies on a capability-based security model, which provides some further control.
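
As a rough sketch of what that sharing looks like from the JS side (in TypeScript, assuming a hypothetical module that exports its linear memory under the usual name, memory):

    async function pokeMemory(): Promise<void> {
      const { instance } = await WebAssembly.instantiateStreaming(fetch("module.wasm"));
      // Most toolchains export the module's linear memory as `memory`.
      const memory = instance.exports.memory as WebAssembly.Memory;

      // Linear memory is one contiguous ArrayBuffer: JS and the module read and
      // write the same bytes, and the module can't reach anything outside it.
      const bytes = new Uint8Array(memory.buffer);
      console.log(`linear memory is ${bytes.length / 65536} pages of 64 KiB`);

      // Writing from JS is just writing into the array; the WASM side sees it.
      bytes[0] = 42;
    }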

WASI is a system interface for WebAssembly, a standardised set of system calls for interacting with system resources like file systems, randomness and time. There are proposals for more.

You can use WASI through standalone runtimes like Wasmtime and Lucet. There are tight requirements around the permissions you give WASM modules to do things like modify files on your system: you have to be very specific.
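
To get a feel for how explicit those permissions are, here is roughly what it looks like with Node.js’s experimental node:wasi API rather than Wasmtime or Lucet (option names vary between Node versions, and app.wasm plus the /data mapping are made up for the sketch):

    import { readFile } from "node:fs/promises";
    import { WASI } from "node:wasi";

    async function runWasiModule(): Promise<void> {
      // The module only gets what we explicitly grant: the host directory
      // ./sandbox appears to the guest as /data, and nothing else is visible.
      const wasi = new WASI({
        version: "preview1",
        args: ["app"],
        env: {},
        preopens: { "/data": "./sandbox" },
      });

      const bytes = await readFile("app.wasm");
      const module = await WebAssembly.compile(bytes);
      const instance = await WebAssembly.instantiate(module, wasi.getImportObject());

      // start() runs the module's _start entry point with those capabilities.
      wasi.start(instance);
    }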

This gives performant modules with really powerful capabilities.

The Edge means putting your servers closer to your users – e.g. with Fastly’s edge cloud platform. The idea is to serve from locations that are optimised to deliver content to users, wherever they are. CDNs are the best-known face of this.

You can also use this model for compute – commonly called “serverless” – where it’s the compute that is closer to the users, not just storage and transfer.

To make this useful, Fastly looked at what users consider important for compute: language portability, security, runtime performance, memory usage, and fast start-up/minimal cold starts. A cold start is where a user’s request has to wait while code is transferred to their nearest node, parsed and served.

So what are the options? We could ask people to containerise their code. That would give good portability and security, but cold starts would be really slow and resource consumption high.

You could also put JavaScript at the edge – V8 can run WASM, but it requires JS, so there are multiple execution layers. Cold starts still wouldn’t be great, and as an interpreted language, execution speed could suffer.

But WASM is great for the edge:

  • portability is high – almost any language that compiles to WASM could be supported with a slim SDK
  • security is great – sandboxed, contiguous heap, capability-based security
  • runtime performance is generally good (there are always caveats with performance)
  • memory usage is reasonable
  • cold start times are much quicker with WASM than with containers or JS

So those are great technical reasons, but let’s look at this another way. Imagine a large group of users with cheap, low-powered devices and reasonable but not great internet speeds – for example, lower-income earners in a big city.

WASM is kind to profiles like this – it takes less CPU, and edge hosting reduces latency, so those devices can still use powerful features.

WASM is really new, with lots of work still to be done. Things that are second nature elsewhere are still missing, simply because they haven’t been built yet. But it has an amazing community driving it – people who are not just smart but see the good it can achieve. WASI is also growing, with a lot of cool innovation – e.g. ideas coming forward for machine learning and cryptography interfaces.

Standalone runtimes:

  • Wasmtime is a popular choice for a general standalone runtime for WebAssembly. It’s designed to be light and fast.
  • Lucet is built by Fastly and it’s what they’re using for their edge solution. It is notable for compiling ahead of time rather than at runtime. Lucet is fast to instantiate – as little as 50 microseconds (not milli, micro!), which is very good news for cold starts.

Languages that compile to WebAssembly:

  • Rust has been a strong contender in the WASM space for a long time, with good WASI support and an enthusiastic community.
  • AssemblyScript is a very young language, but it has some big backers and a good JS/WASM story; there is a lot of opportunity to get involved if you are interested.
  • Fastly have been experimenting with Go for WASM

Even if a language doesn’t compile to WASM… maybe its runtime will…!

Communities:

There are lots of really exciting projects too:

  • Wasm3 runtime (small, good for IoT)
  • Wasm itself is also getting better
  • Lots of new projects around games – game engines, physics engines, etc

The tech is cool, but the reason Aaron stays so interested is that the community is great to be part of. If you (and your company!) are interested, you should get involved.

Together we can build an awesome WebAssembly for the browser, edge and beyond!

@torch2424