Improving time to first byte: Q&A with Dana Lawson of Netlify
We recently published an article exploring what sort of infrastructure is needed to run edge functions. For that article, we talked to several industry experts in wide-ranging conversations. While we weren’t able to use all of the conversation in the article, we wanted to share the full interview so you could get all the interesting bits that we couldn’t include in the article. Below is Ryan’s conversation with Dana Lawson, Senior Vice President of Engineering at Netlify.
This conversation has been edited for clarity and content.
Ryan Donovan: I’ve heard of edge computing talked about as serverless functions at the edge. Are there ways you can get around it and make it less serverless? Or do you have to connect to some other database somewhere?
Dana Lawson: I mean, there’s so many cool ways to do it now. Deno’s just one way, you know? You can go faster. You can partner with a known entity that already has a runtime and you’re already utilizing that. You can just pop it in there. You can just write lambdas. You can do it old school, like five years ago—I guess that’s old school in tech. You could spin up a whole stack in your favorite cloud provider.
I believe AWS and Google both have lambdas, I’m more familiar with AWS, but you can utilize it that way to get around it. You can host it yourself.
My question is: why would you wanna do that? It’s a lot of work. We’re gonna have to have a database, right? You’re using key value pairs just to make sure that the data is the data. There’s still a lift to do it yourself, but you can do anything yourself if you really want to.
RD: I think it’s really cool that you all have edge functions as part of basically the code that people can write. How do you write a function and then have it distributed worldwide?
DL: Well, it’s complicated! I guess if it was easier, everybody would be building their own edge network.
But with tech, everything’s getting easier. The barrier to entry for writing a function has really gone down, especially for web developers that have been front-end focused, in the space that I’m in: composable architecture, where you can have a very lightweight way of manifesting websites, statically-generated websites.
But composable architecture is so much more. We hear the rise of edge computing and edge function. People are like, what are you freaking talking about? What is this edge thing? Is it like, you know, flat earth? Is it the edge of the internet? No, but it is kind of.
Content delivery network systems have been out for a moment. We see big players like Akamai, Cloudflare, and Equinix that have been around for a while. There’s these other traditional network companies that help connect the world’s data transmission. But there’s this new abstraction where the ability to execute code is right there where you’re delivering up that content.
Technologists have found a way to have content on the network layer, which is utilizing an origin server and then setting a cache, a really small piece of data out there. You can call up and get a really fast experience.
Now, what edge functions essentially do is, in addition to having that static content cached, you can call this little bitty program of bytes. Where before you may have had a nested page, now you’re calling this executable code that’s sitting on that content delivery network system.
Once again, if we’re talking about web developers and empowering them to make even more dynamic websites, we’re meeting them in their language. Deno, which has a series of nodes around the world, is executing that code to say back to the origin, okay, deliver this.
The easiest way to think of this is maybe you’re going to netlify.com in Germany. Instead of the web developer having to have a clunky page in their URI, you can call a function and execute a different language.
And it’s gonna have it happen when it’s detected on a browser. Because what it really improves is that time to first byte. With a lot of functions now, if you’re very sophisticated, you can create an entire visual experience that’s been separated and globally distributed.
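As a concrete illustration of the Germany example, here’s a minimal sketch of the locale-routing logic an edge function might run. The country-to-locale table and the helper name are hypothetical, invented for this example rather than taken from any platform’s API.

```typescript
// Sketch of geolocation-based routing at the edge. The locale table and
// function name are illustrative, not a real platform API.

// Minimal table mapping the CDN's detected country to a content locale.
const LOCALES: Record<string, string> = {
  DE: "de", // a visitor in Germany gets the German pages
  FR: "fr",
  US: "en",
};

// Rewrite a request path to its localized version, falling back to English.
// An edge function would do this before the request ever reaches the origin.
function localizedPath(path: string, countryCode?: string): string {
  const locale = (countryCode && LOCALES[countryCode]) || "en";
  return `/${locale}${path}`;
}
```

Because this decision runs at the CDN node nearest the visitor, picking the locale adds essentially no latency, and the origin only ever has to serve the already-chosen page.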
I give it to Node.js. Node made this possible.
RD: I mean, Deno’s the same creator, isn’t it?
DL: Yes, it’s the same creator. So you get these smartypants that are like serverless will rule. Here we are ten years later, and I feel like it’s happening. It’s finally happening.
You asked that initial question: how do you even write edge functions?
You can write a function to say, oh, A/B test this. And once again, instead of managing a bunch of pages, a bunch of files, a bunch of code, and a git repository, you’re really limiting your landscape and having the heavy lifting done by these serverless functions that are kinda acting like those old school content delivery network systems, but for different types of content.
RD: So the edge network itself, did y’all build out a bunch of servers around the world? Are you piggybacking on someone else? It seems like that’s the hardest part. How do you get close to the user with your actual hardware?
DL: The world is still very vast and big. Some of these networks are very bespoke, especially when you are talking about particular locations that don’t have modernized infrastructure. What we do is we understand that, and so depending upon where you are, we leverage major networks and we build an abstraction off it. We have our servers that unify and create our edge network and runtime across multiple cloud providers and different systems. That’s why we have the supply chain so that we can go and really manage the whole experience of where the bytes flow and limit that latency.
But it just depends. With the global traffic, because we’ve load-balanced it and we have this global presence, the experience is typically pretty good almost anywhere in the world. You get to those unique locations like Siberia and it’s a little bit slower, but never say never. We’ll be talking about edge computing on different planets soon, I’m sure, with all the satellite technology coming.
RD: Just set up a couple boxes on the moon.
DL: We’re too soon, Ryan. I need to save that for my 2030 idea.
RD: When you’re setting up this sort of network of edge locations across providers, what’s the trade off calculus you make when determining where to build?
DL: What it comes down to is time to the first byte. You want that experience. You’re putting yourself out there. With some of these applications, they’re completely being manifested on the served assets and origins, and whole websites are being created right there on the edge.
That would be pretty slow. You would have to manage and load balance that traffic. You know, Apache Traffic Server has been around for a darn minute now. You would have to write all those iRules. You’d have to do all that proxying.
A lot of software developers don’t spend a lot of time in layer three. When you start talking about that to a front-end developer, they’re like, what is this? Is that like the third layer in my nachos? No! Go ask a network engineer, they’re gonna be like, that’s where all the problems last.
It’s difficult; a lot of that code is still C++. I think you can do it, but in today’s world, there’s so many cool open-source tools and packages, and the availability to do this through an abstraction with a third party or service, it doesn’t make sense to me personally, unless you’re really a full layer nerd that loves ingress to egress and wants to do it all themselves. Those people have racks in their garage. They’re gonna do it.
I think that you have to factor in so much when it comes to latency and having to manage traffic rules. A lot of it, you can set it in there and it just does it in this day and age, but this type of information, it’s just a ton. And then that replication, right?
Here’s the thing that you have to consider. We happen to be somewhat of a cloud provider. We’re always thinking about scalability and the footprint and climate change and how we serve the world and be good citizens of the world. I think that if you’re trying to do it yourself, you’re gonna miss out on some of those important details. You’re not gonna think about all the nuances and you’re gonna spend a lot of time, energy, and effort on stuff that’s already been done.
I’d rather spend time on innovation, like our edge functions on the moon. That’s a cooler problem than trying to figure out how to set up a point of presence in a specific location and finding a rack in a provider and a telco that’s connected and has the right governmental entities to get connected.
It’s a challenge. It’s a slog. You know, that’s why you do kind of piggyback on these behemoths that have already done that hard work. Choice is everything. Just don’t do it yourself. Somebody’s done it better than you. As an engineering leader, I’m like, come on, somebody’s smarter than you.
RD: That’s somebody else’s bread and butter problem.
DL: It is, and it’s gonna become a commodity. I think edge functions and edge runtime are something that modern website developers just expect to be there now. You’re not gonna manage language files. You’re not gonna wanna sit there and set up all this infrastructure and write environmental variables just to have prod and staging environments. No, you’re gonna have an edge function that kicks it over to a subdomain that maybe shows another picture of what you’ve done, and it’s all right there in that same code base.
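A sketch of that “kick it over to a subdomain” idea: an edge rule inspects the request and decides which build of the same code base should answer. The cookie name and hostnames below are made up for the example.

```typescript
// Illustrative only: route opted-in visitors to a staging subdomain.
// The "preview" cookie and the hostnames are hypothetical.

function targetHost(host: string, cookies: Record<string, string>): string {
  // Visitors carrying the preview cookie see the staging build; everyone
  // else stays on production. No separate infrastructure, no environment
  // variable juggling: one edge rule in the same code base.
  return cookies["preview"] === "on" ? `staging.${host}` : host;
}
```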
You get to go a little bit faster and use some of those repeatable principles from what design systems have taught us too.
RD: Alright, no DIY. So there’s a lot of proxying going on here. How do you make it so the proxy server contact doesn’t add latency?
DL: That is what you watch all day. Are we adding a layer that’s gonna make it slower? Because it defeats the purpose, right? But that’s what you have to think about: where you deploy, where you scale, where you replicate. And then you’re measuring—we’re constantly measuring and watching that, and having smart rules behind it.
The world slows down sometimes, you know. Dude, I’ve seen it all. There could be some huge campaign, I don’t know, maybe Beyoncé’s launching a summer tour. That could slow the whole internet down in some countries.
What you have to do is consistently say, where am I scaled? How’s my caching? Handling caching, I think, is the biggest part of it. Because with runtime in these edge functions, you are caching that function so that it can be served up again when it gets called. That’s why it’s so cool. You’re not gonna call it every time. Why would you do that?
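The caching behavior described here can be sketched as a tiny TTL cache: run the function once, store the result, and serve the stored copy until it expires. Real CDN caches key and evict far more cleverly; the class and names below are invented for illustration.

```typescript
// Minimal sketch of a time-to-live cache in front of an "origin" call.

interface CacheEntry {
  body: string;
  expiresAt: number; // epoch milliseconds
}

class EdgeCache {
  private store = new Map<string, CacheEntry>();

  // `now` is injectable so the behavior is easy to test deterministically.
  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  // Return the cached body, or run `render` (the trip to the origin)
  // and cache its result for the next caller.
  get(key: string, render: () => string): { body: string; hit: boolean } {
    const entry = this.store.get(key);
    if (entry && entry.expiresAt > this.now()) {
      return { body: entry.body, hit: true }; // served from the edge
    }
    const body = render();
    this.store.set(key, { body, expiresAt: this.now() + this.ttlMs });
    return { body, hit: false };
  }
}
```

The first request misses and pays for the origin call; every request inside the TTL after that is served straight from the node, which is exactly why the function isn’t called every time.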
It’s just constant. Like, how can I scale? As more nodes are spun up, you’re gonna spin up more nodes. I’m always watching where they’re gonna land and expand and saying, oh, let’s go put something there. Or some of these other places that you can go have a point of presence either virtually or stacked.
The world’s network’s pretty good, except for that one line across the Atlantic.
RD: That’s the one the narwhals keep attacking.
DL: That’s one limitation. You think you solved it all and then a narwhal comes.
RD: So is there like a network of proxies or is it like one server that sets everything off? You still have to go to one server to get there, right?
DL: We still have to go to an origin. Origin still exists. But we’ve been smart about that. Once again, you think about that ability to replicate, if you’re using major cloud providers, you have availability zones, and you’re replicating that backend where that data’s served and those functions. And Deno’s doing it too, right? So we’re partnering with them where they’re doing the same kind of thing: replicating, filling, and caching.
But that’s on that first load, right? If you’re getting a response to that browser and the browser’s like, ain’t never seen this, that’s where you watch the optimization, because a lot of the time that happens once and you never see it again.
And so really, how many origin servers you wanna place globally, that’s gonna influence it. Or where Deno goes. I’m not gonna speak to where they’ve deployed, but where they’re gonna go and deploy their functions and their runtimes.
So you still do have to go back to some backend systems somewhere and call that data, but it’s so fast and it’s just amazing how quickly we’re transmitting. It’s almost like a no-op in this day and age, unless something is really big. Then you’re gonna do a reload and refresh that cache.
Because at least for us, and I’m gonna speak through the Netlify lens, most composable websites are small. When you’re calling functions, you’re not loading a whole bunch of log data. You’re not holding a bunch of assets. So like, it’s super, super duper fast, right? You’re gonna make this change. It’s pretty quick.
RD: That sort of gets at my next question. Is there a particular use case that edge functions are really suited for?
DL: I would say it’s best suited for A/B testing where in the past, you’ve had to really go the old school way of like full stack. Here’s this one and we’re gonna go route the traffic and see which one hits. You can do that all through a function now.
You can just say, I want 15% of my people to see this, and I’m gonna A/B test that and it’s gonna bring you the stats out in real time. It’s a game changer.
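That 15% split can be sketched as deterministic bucketing: hash a stable visitor id into the range [0, 100) and send the low buckets to the variant, so the same visitor always sees the same version. The hash choice and names below are illustrative, not how any particular platform implements it.

```typescript
// Small stable string hash (FNV-1a), good enough for bucketing visitors.
function fnv1a(s: string): number {
  let h = 0x811c9dc5;
  for (let i = 0; i < s.length; i++) {
    h ^= s.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h;
}

// Assign a visitor to "variant" or "control" for a given experiment.
// Hashing the experiment name in with the id keeps assignments
// independent across experiments.
function assignBucket(
  visitorId: string,
  experiment: string,
  variantPercent: number,
): "variant" | "control" {
  const bucket = fnv1a(`${experiment}:${visitorId}`) % 100;
  return bucket < variantPercent ? "variant" : "control";
}
```

An edge function running this per request can rewrite the 15% to the variant page and let the stats accumulate in real time, with no second full stack to route traffic between.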
The more common one I see is geolocation. Manifest this page in this language. I remember back in the day, we would have to translate all those pages. Now you can just call a function. That’s amazing. Just blows my mind.
I see a lot of use in edge functions for e-commerce. Like, there’s a bank holiday in Ireland, stores are open, we’re gonna go send this campaign.
RD: Do you have a theoretical use case that you haven’t seen yet?
DL: I’m gonna be a nerd and everybody’s freaking talking about it, and I don’t even wanna talk about it. We’re gonna see more of AI building these websites and generating them and calling functions. And doing it itself. It’s not if, it’s when. Maybe you’ve identified some functions that say, on this condition, run.
It’d be really cool to see it on traffic patterns too, for it just to be smart. Where you’re coming in and you’re saying, okay, we wanna make sure this campaign hits this amount of audience. It hits a threshold, hits a metric, maybe it cascades it. It’s calling a function to keep it going for two weeks, because I used to be a human that sat there and hit that button.
You know, you’re still doing really advanced deploys. I think there’s gonna be more of that. And consumers are about to get more consumed because it’s gonna happen by our activities. It’s just gonna happen with these functions that are gonna just be called upon. So it’ll be interesting, I think for me, because I’ve been building tools for tools.
Before we get to that step, we need to really kind of figure out the use cases of functions that can be developed that are automatic. It’s like automatic detection. Do I think AI can write the functions outta the box? That’s an advanced case, but I think if you have these certain conditions, I think that’s the future. And just personalized experiences are going to not be as much driven by humans doing research and analytics, but by analytics calling code and doing it.