Shipping features without deploying them

Hey everyone.

If you haven't heard of Fastly before, we provide an edge cloud platform that is used by websites across the world to accelerate their delivery and, more recently, to move their compute workloads closer to their users.

Traditionally content delivery networks have been built upon a proprietary core product, which is then supported by equally proprietary add-ons such as image optimization and content filtering.

Fastly has always done a bit better than this.

From the beginning, building our network on top of the Varnish cache gave our customers the ability to fully program how their requests were served at the edge.

This changed the game for response time and cache hit ratio.

However, the constraints of VCL, the domain-specific language used to configure the Varnish cache, mean that customers can't always do what they want.

And they're limited to the features that we choose to offer.

The demands of modern Web applications call for much more.

With the advent of Edge Computing, it's now possible to choose the solution that works best for you, or even build your own.

Developers can build and share libraries to perform specialized tasks in the way that suits them best.

And when a solution doesn't quite cut it, they can look at the code and modify it to suit their needs.

This is a benefit of what Andrew described in his earlier talk as 'functions as a service', and we've already started taking advantage of it at Fastly to offer our customers more options in the way that they serve content.

I've been working on a Rust crate that allows you to process edge side includes with more configurability than we or our competitors have ever been able to offer.

Edge side includes is a tried and tested technique that has been used to serve some of the most complex Web applications for decades.

It works by including special HTML tags in your source that tell the edge server to fetch another document and inline its contents into the original document.

This allows you to consistently compose complex Web applications, powered by services built by multiple different teams.
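To make that concrete, here is a toy sketch in plain Rust of what the edge is conceptually doing when it encounters an include tag. This is not Fastly's implementation, and the fragment path and contents are made up; a real processor streams its output and handles attributes, fallbacks and nesting.

// Stand-in for a backend fetch; at the edge this would be an HTTP request.
fn fetch_fragment(path: &str) -> String {
    match path {
        "/fragments/header.html" => "<header>Shared site header</header>".to_string(),
        other => format!("<!-- missing fragment: {other} -->"),
    }
}

// Replace each <esi:include src="..."/> tag with the fetched fragment body.
fn process_esi(document: &str) -> String {
    const OPEN: &str = "<esi:include src=\"";
    let mut output = String::new();
    let mut rest = document;
    while let Some(start) = rest.find(OPEN) {
        // Copy everything before the tag verbatim.
        output.push_str(&rest[..start]);
        let after_src = &rest[start + OPEN.len()..];
        // Take the src attribute value up to the closing quote.
        let src_end = after_src.find('"').unwrap_or(after_src.len());
        output.push_str(&fetch_fragment(&after_src[..src_end]));
        // Skip past the end of the tag and keep scanning.
        rest = match after_src[src_end..].find("/>") {
            Some(i) => &after_src[src_end + i + 2..],
            None => "",
        };
    }
    output.push_str(rest);
    output
}

fn main() {
    let page = "<body><esi:include src=\"/fragments/header.html\"/><main>Product detail</main></body>";
    println!("{}", process_esi(page));
}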

Fastly has supported a subset of ESI in VCL configurations for a long time, but it lacked the flexibility that developers need to build performant and maintainable applications.

For example, if you only wanted to parse XML tags with a given namespace, you wouldn't be able to process your documents within Fastly.

If you were faced with these limitations, it would've been a huge investment to build and deploy your own templating solution across the globe.

Not only due to the engineering hours required, but because with millions of requests going through your service every day, a small performance regression could scale out to thousands of wasted compute hours per month. As a rough illustration, at ten million requests a day, an extra 25 milliseconds per request adds up to around 2,000 hours of compute each month.

Back in 2014, when we first wrote about edge side includes on our blog, you had to contact our support team to get the feature activated on your service.

We've come a long way since then, thanks to the ability to ship features using package managers that developers are already familiar with.

This example of how we would process ESI nowadays shows how our Rust crate can be used to stitch together a page with content from multiple different backends.

Because the library gives us full control over the outgoing requests, we can introduce advanced routing and content manipulation, which takes place as the document is streamed to the user.
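As a rough sketch of what that routing can look like (the backend names, fragment paths and TTLs here are illustrative, and the crate calls follow the example reproduced at the end of this transcript rather than a definitive API reference):

use fastly::{mime, Error, Request};

fn handle_request(req: Request) -> Result<(), Error> {
    // Fetch the page shell from the main origin.
    let beresp = req.clone_without_body().send("origin_0")?;

    // Only run the ESI processor over HTML responses.
    if beresp
        .get_content_type()
        .map(|c| c.subtype() == mime::HTML)
        .unwrap_or(false)
    {
        let processor = esi::Processor::new(esi::Configuration::default());

        // The dispatch closure decides how each fragment request is served.
        processor.execute_esi(req, beresp, &|fragment_req| {
            // Copy the path out before the request is consumed by `send`.
            let path = fragment_req.get_path().to_string();
            if path.starts_with("/_fragments/recommendations/") {
                // Hypothetical separate service for recommendation fragments,
                // cached briefly because they change often.
                Ok(fragment_req.with_ttl(30).send("recommendations_service")?)
            } else {
                // Everything else goes to the main origin with a longer TTL.
                Ok(fragment_req.with_ttl(120).send("origin_0")?)
            }
        })?;
        Ok(())
    } else {
        // Not HTML: stream the origin response straight to the client.
        beresp.send_to_client();
        Ok(())
    }
}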

To accomplish this in VCL would require advanced logic and state management.

But as you can see here, it's just a few lines of Rust that anyone with some development experience could understand.

Of course, with a language like Rust or JavaScript at your disposal, you could implement ESI yourself.

But the high-level features that CDNs brought to the market were built for a reason.

Lots of people have the same problem that they need to solve.

When we at Fastly write implementations of these features, we are giving you ready-to-go solutions that you can drop into your program quickly, but we're not locking you in.

You could use someone else's ESI library, if you'd like, or even replace it with your own.

Our customers are already using this to solve real world problems.

One retailer is constructing product pages with over 100 individually cached fragments.

This is only possible thanks to the performance of Rust at the edge.

With the help of our local testing server, their teams can ship independently and rapidly while still maintaining one cohesive Web front end.

If you'd like to try it out for yourself, scan the QR code on the screen or enter the URL shown; it will take you to a page where you can experiment without needing to create a Fastly account.

When you're ready to publish a full service, you can sign up for our free trial and get started with the ESI crate straight away.

Thank you for listening and enjoy the rest of the conference.

Shipping features without deploying them

© 2022 Fastly

Kailan Blanks

Fastly Point of Presence

Extra Features

  • Video packaging
  • TLS management
  • Image optimization
  • Real-time logging

VCL on Varnish

  • Request-based routing
  • Edge Side Includes
  • Response caching
  • Load balancing

VCL is used to program the Varnish cache.

Screenshot of example VCL code

WebAssembly on Compute@Edge

  • Rust
  • JavaScript
  • Go

Fastly Point of Presence

<html>
  <head>
    <title>My Ecommerce Site</title>
  </head>
  <body>
    <div id="content">
      <div id="header"><img src="/images/logo.jpg"></div>
      <div id="items">
        <span id="item_1">...</span>
        <span id="item_2">...</span>
        ...
        <span id="item_n">...</span>
      </div>
    </div>
  </body>
</html>
sub vcl_fetch {
	# Flag the response as requiring ESI processing if it is a webpage
	if (beresp.http.Content-Type ~ "^text/html") {
		esi;
	}
}
use fastly::{mime, Error, Request, Response};

fn handle_request(req: Request) -> Result<(), Error> {
	// Fetch response from backend.
	let mut beresp = req.clone_without_body().send("origin_0")?;

	// Remove unwanted headers from response.
	filter_headers(&mut beresp);

	// If the response is HTML, we can parse it for ESI tags.
	if beresp
		.get_content_type()
		.map(|c| c.subtype() == mime::HTML)
		.unwrap_or(false)
	{
		let processor = esi::Processor::new(
			esi::Configuration::default().with_recursion()
		);

		processor.execute_esi(req, beresp, &|req| {
			match req.get_path() {
				// Override requests for the content fragment
				"/_fragments/content.html" => Ok(Response::from_body(
					"Content injected at the edge."
				)),
				// Send all other fragment requests to the main origin
				// with a cache TTL of two minutes (120s)
				_ => Ok(req.with_ttl(120).send("origin_0")?),
			}
		})?;

		Ok(())
	} else {
		// Otherwise, we can just return the response.
		beresp.send_to_client();
		Ok(())
	}
}

https://fiddle.fastlydemo.net/fiddle/98097b0a