The evolution of the Jamstack is becoming clear: serverless isn’t just for APIs anymore. Scaling the Jamstack to large websites with frequent changes relies on serverless functions. Developers using Incremental Static Regeneration (ISR), Distributed Persistent Rendering (DPR), or something in between need easier ways to observe and debug their serverless code, especially in production.
Layer0 already offers a streaming log of the console output from your serverless code running in our cloud. Unfortunately, the output of an application’s console.log() statements is rarely informative: it requires developers to anticipate which API calls might be problematic before an issue ever arises and to write extensive logging code around them in advance. That rarely happens.
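For context, the manual alternative looks something like this: a wrapper that logs every upstream call, which only helps if you wrote it before the incident. A minimal sketch (the `logged` helper and the fake inventory client are illustrative, not Layer0 APIs):

```javascript
// Sketch of the manual logging a developer would otherwise need
// around every upstream call. `logged` and `fetchInventory` are
// hypothetical stand-ins, not part of any Layer0 SDK.
async function logged(label, fn) {
  const start = Date.now();
  console.log(`[${label}] request started`);
  try {
    const result = await fn();
    console.log(`[${label}] ok in ${Date.now() - start}ms`);
    return result;
  } catch (err) {
    console.log(`[${label}] failed in ${Date.now() - start}ms: ${err.message}`);
    throw err; // re-throw so callers still see the failure
  }
}

// Usage with a stand-in for a real upstream HTTP call:
async function fetchInventory(sku) {
  return { sku, inStock: true };
}

logged('inventory', () => fetchInventory('ABC-123')).then((r) =>
  console.log('in stock:', r.inStock)
);
```

Multiply that wrapper by every upstream call in an app and it’s clear why this approach rarely happens in practice.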
So earlier this month we released Deep Request Inspection (DRI). Request inspection is like the Chrome DevTools Network inspector, but for your serverless code. You can view the method, headers, and body of every incoming request to your serverless code, as well as the final response it generates. Even better, you can also view the same detailed network information for every upstream API request your serverless code makes. And we’ve added a helpful “Copy as curl” feature that makes it easy to go from inspecting an API call to reproducing it locally. It’s a dream! Watch the tutorial below.
Request inspection is now generally available on all Layer0 plans (including our free tier) and is compatible with any full-stack framework that supports serverless, including Next.js, Nuxt.js, Angular, and SvelteKit/Sapper.
Our beta customers have already found request inspection useful in several scenarios:
- Resolving production issues: Resolve issues that happen “only in production” and can’t be reproduced locally
- Visualizing asynchronous code: Understand the exact API flow of complex code with lots of asynchronous routines
- Isolating slow APIs: Examine precisely when API calls are made and how long they take
Deep request inspection is one of the ways we’re helping developers get the most out of serverless. We already give developers the ability to measure the cache hit rate of their serverless code and to purge the cache by route or by individual page, regardless of whether pages are generated by Incremental Static Regeneration or one of our other caching methods. In fact, one of our ecommerce customers has hooked Layer0 up to their order management system and dynamically purges over 20,000 individual products per hour as their inventory changes. This gives them an instant-loading ecommerce site despite a large, frequently changing catalog.
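The purge-on-change pattern that customer uses can be sketched roughly as follows. The webhook shape and the `purgePaths` function below are hypothetical stand-ins for whatever purge mechanism your CDN exposes, not Layer0’s actual API:

```javascript
// Hypothetical sketch: translate inventory-change events into cache
// purges for the affected product pages. `purgePaths` is injected as
// a stand-in for a real purge API call.
function pathsToPurge(event) {
  // One product page per changed SKU, mirroring a typical
  // /products/[sku] route in a Next.js-style app.
  return event.changedSkus.map((sku) => `/products/${sku}`);
}

async function handleInventoryWebhook(event, purgePaths) {
  const paths = pathsToPurge(event);
  // Batch the purge so 20,000 updates per hour don't become
  // 20,000 separate purge requests.
  await purgePaths(paths);
  return paths.length;
}

// Usage with a fake purge function:
handleInventoryWebhook(
  { changedSkus: ['A-100', 'B-200'] },
  async (paths) => console.log('purging', paths)
).then((n) => console.log(`purged ${n} paths`));
```

The key design choice is purging only the pages whose data actually changed, so the rest of the catalog stays cached and instant-loading.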
Looking ahead, we plan to build on request inspection and give you even more observability into your serverless code, including memory usage, concurrency, and API performance. We’ll even help you isolate the upstream APIs that are causing performance problems.
We're excited about how deep request inspection and these upcoming features will make you more productive with serverless on the Layer0 platform. If you're not already a Layer0 user, sign up for a free account, give it a spin, and let us know what you think in the forums.