Simplifying internal AWS Lambda APIs

Wakeem's World
9 min read · Jan 22, 2023


Video version of this article for those allergic to reading 🤓

Lambda APIs are a really awesome feature of AWS Lambda that fly quietly under the radar. If I am being honest, that is probably because the execution model is really confusing, combined with the fact that the APIs are not marketed that hard. 💀

So I am here to break down how the Lambda APIs work, to make them seem less intimidating than they appear, and to provide some simple real-world examples of how they could be useful to you today.

📋 The Breakdown

Cartoon figures of the Lambda API, Extensions API, Telemetry API, and the Runtime API.

AWS Lambda exposes three APIs that run inside your lambda functions alongside your code:

  • Extensions API
  • Telemetry API
  • Runtime API

This article is going to focus mostly on the Extensions and Telemetry APIs, but I will briefly talk about the Runtime API.

Extensions API

Cartoon showing how the Extensions API and Lambda API work together.

You can think of the Extensions API as the extension registry. You communicate with this API to let the Lambda runtime know that you intend to run some kind of extension code, and to signal when that extension code has finished running.

You heard that right: when your extension has finished running. Extension code can still be running even after your runtime function code has completed its execution. 😳 You are still bound to whatever timeout you have configured on your function, but say your lambda function is configured with a timeout of 10 seconds and your runtime code only executes for 100ms; your extension can then run for roughly another 9 seconds after your lambda function has sent its response before things time out. This opens up some interesting post-processing possibilities that I will go over in the real-world examples section.

Telemetry API

Cartoon showing how the Extensions API, Telemetry API, and Lambda API all work together.

This is by far the most confusing of all of the APIs, and that is because it is less of an API and more of a webhook. Calling this API a webhook is probably also a bit confusing, at least it was to me, but let me try and explain.

In practice, you set up a locally running web server inside your lambda execution environment with a single POST endpoint. The Lambda runtime then sends events to that POST endpoint so your extension code can receive information about the runtime of your app, or even from your own code.

The Lambda runtime is made aware of your telemetry webhook endpoint when you register the endpoint via the Telemetry API. IMO, they should just say that this registration is part of the Extensions API; that way they would be free to rename this “API” the “Telemetry Hook” or something more appropriate.

This telemetry API will send you some limited telemetry about how long initialization, invocation, and shutdown took. It can also send you all of the logs that Lambda receives on standard out. This can be useful if you, say, want to avoid CloudWatch costs and forward your logs to another platform like S3 or something more external. Of course, sending your logs to an external source comes with its own set of costs, but it is still an interesting option.

Runtime API

Cartoon of how the Runtime API and the Lambda API work together.

I will admit I have spent the least amount of time with this API. But the gist is that this API is meant to be used when you want to build a custom runtime for lambda.

I was somewhat confused by the word runtime the first time I read about this API. Basically, runtime just refers to the programming-language environment that executes your lambda function's code.

For example, if you wanted to build a lambda runtime that ran Chicken, you could in theory do this by polling the next endpoint and publishing to the response and/or invocation error endpoints of the internal Runtime API.
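For the curious, the core loop a custom runtime has to implement might be sketched like this in Node. The endpoint paths are the documented Runtime API paths; the trivial echo handler is just a stand-in for a real language runtime doing actual work.

```javascript
// Minimal sketch of a custom runtime's event loop against the Runtime API.
const RUNTIME_API = process.env.AWS_LAMBDA_RUNTIME_API;

function nextInvocationUrl(api) {
  return `http://${api}/2018-06-01/runtime/invocation/next`;
}

function responseUrl(api, requestId) {
  return `http://${api}/2018-06-01/runtime/invocation/${requestId}/response`;
}

async function runtimeLoop(handler) {
  while (true) {
    // 1. Block until Lambda hands us the next invocation event.
    const res = await fetch(nextInvocationUrl(RUNTIME_API));
    const requestId = res.headers.get('lambda-runtime-aws-request-id');
    const event = await res.json();
    // 2. Run the function code, then POST the result back.
    const result = await handler(event);
    await fetch(responseUrl(RUNTIME_API, requestId), {
      method: 'POST',
      body: JSON.stringify(result),
    });
  }
}

// Only run the loop inside a real Lambda environment.
if (RUNTIME_API) {
  runtimeLoop(async (event) => ({ echoed: event }));
}
```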

This is probably my lack of experience with the Runtime API talking, but TBH I am not sure why you would use the Runtime API directly over building a custom Docker image that runs on lambda. I am assuming maybe the Runtime API is faster or something. Maybe I will do a comparison in my next article so I can speak to this better. 😛

Given I don’t know much about this one beyond what the docs say I won’t spend more time discussing this one.

🛠️ How Extension and Telemetry APIs work

It is good to understand how the Extensions and Telemetry APIs work independently of each other, so that once you start using them together it is easier to understand the role each API plays during the lambda life cycle.

Before going into how each API works here is a quick refresher on the life cycle of a Lambda function.

Lambda Function born
Lambda Function lives
Lambda Function dies

Lambda function born represents initialization on the first request. This is commonly referred to as a cold start.

Lambda function lives is when a lambda that has already been initialized is reused for subsequent invocations. This is commonly referred to as a warm start.

Lambda function dies is triggered when the lambda function has been idle for some unspecified amount of time and AWS reclaims the container.

Now that we have those three phases of the lambda life cycle fresh in our mind we can go over the extension and telemetry API.

Extensions API

Remember, this is how we register our Extension code with Lambda. So we will need to do one extra step during initialization.

🐣 Lambda Function Born (Initialization/Cold Start)

The interaction between Lambda and the Extensions API during initialization/cold start

Your extension will need to do the following 👇

Sample extension code

This whole script will be run when our lambda first boots up. You will notice there are 4 main steps here.

  1. Our extension is registered
  2. Our extension makes a call to next
  3. Our extension does some work
  4. Finally we go back to the top of the while loop and make a .next call and our code immediately freezes

The code freezes as soon as it makes that second call to next at the top of the while loop, and waits there for our next invocation.
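If you want a feel for those 4 steps without squinting at the image, here is a hedged sketch that talks to the documented Extensions API endpoints directly. The extension name `my-extension` and `doSomeWork` are placeholders, not the code from the screenshot.

```javascript
// Sketch of the 4 steps: register, call next, do work, loop and freeze.
const baseUrl = `http://${process.env.AWS_LAMBDA_RUNTIME_API}/2020-01-01/extension`;

function shouldKeepRunning(event) {
  // SHUTDOWN is the signal to break out of the loop; INVOKE means go again.
  return event.eventType !== 'SHUTDOWN';
}

async function doSomeWork(event) {
  // Placeholder for whatever your extension actually does.
}

async function main() {
  // 1. Register the extension (the name must match the entry-point filename).
  const registerRes = await fetch(`${baseUrl}/register`, {
    method: 'POST',
    headers: { 'Lambda-Extension-Name': 'my-extension' },
    body: JSON.stringify({ events: ['INVOKE', 'SHUTDOWN'] }),
  });
  const extensionId = registerRes.headers.get('lambda-extension-identifier');

  while (true) {
    // 2. Call next. This is where Lambda freezes us between invocations.
    const res = await fetch(`${baseUrl}/event/next`, {
      headers: { 'Lambda-Extension-Identifier': extensionId },
    });
    const event = await res.json();
    if (!shouldKeepRunning(event)) break;
    // 3. Do some work, then 4. loop back to the top and freeze again.
    await doSomeWork(event);
  }
}

// Only run inside a real Lambda environment.
if (process.env.AWS_LAMBDA_RUNTIME_API) {
  main();
}
```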

Before we keep going it is probably also worth exploring the code behind the extensionAPIService 👇

Extension Service API

One of the important things I want to point out from this file is the header Lambda-Extension-Name in the request function. This header value must be the same as your extension's entry-point filename.

🐥 Lambda Function Lives (Warm Start)

The interaction between Lambda and the Extensions API during a Warm start

This is where your extension will pick up where it last left off. The extension continues using the same extension ID, so the steps here are identical to what happened during initialization, just without calling the register endpoint.

Focused on the while loop from our extension code

This while loop will repeat until your lambda container is finally instructed to shut down by AWS.

🍗 Lambda Function Dies (Shutdown)

The interaction between Lambda and the Extensions API during the Shutdown phase

Shut down is slightly different but still very similar to the warm start execution.

Focused on the while loop from our extension code

Since we don’t have anything to clean up, our code picks up exactly where it left off during the warm start, runs through our code, and then shuts down once our call to next returns the SHUTDOWN event.

Telemetry API

As I mentioned before, the Telemetry API is really more of an internal webhook than a traditional API. This “API” also relies on the extension ID that is issued via the Extensions API, so you cannot use the Telemetry API without first registering an extension.

🐣 Lambda Function Born (Initialization/Cold Start)

During initialization your extension registers via the Extensions API as before, then starts its local telemetry web server and subscribes that server's endpoint via the Telemetry API before making its first call to next.

It is probably also important to include the telemetry service code here so you can see the details of how the telemetryAPIService works.
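In lieu of the image, here is a sketch of the subscribe side of a telemetryAPIService. The request body shape comes from the Telemetry API docs; the helper names and the buffering numbers are assumptions of mine, not necessarily what the screenshot contains.

```javascript
// Build the Telemetry API subscription request body. Lambda will POST
// batches of events to the local destination URI once subscribed.
function buildSubscription(port) {
  return {
    schemaVersion: '2022-07-01',
    types: ['platform', 'function', 'extension'],
    // Buffering knobs are assumptions; tune for your workload.
    buffering: { maxItems: 1000, maxBytes: 262144, timeoutMs: 100 },
    destination: { protocol: 'HTTP', URI: `http://sandbox.localdomain:${port}/` },
  };
}

// Subscribing requires the extension ID issued by the Extensions API,
// which is why you cannot use the Telemetry API without registering first.
async function subscribe(extensionId, port, api = process.env.AWS_LAMBDA_RUNTIME_API) {
  return fetch(`http://${api}/2022-07-01/telemetry`, {
    method: 'PUT',
    headers: {
      'Lambda-Extension-Identifier': extensionId,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(buildSubscription(port)),
  });
}
```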

🐥 Lambda Function Lives (Warm Start)

This is where both your extension and your telemetry web server will pick up right where they left off.

This will repeat until your lambda container is finally instructed to shut down by AWS.

🍗 Lambda Function Dies (Shutdown)

Shut down is again slightly different but still very similar to the warm start.

This is the end of the line for the lambda container and for your extension code, including your telemetry webserver. You will notice on line 28 we shut down our telemetry server once we receive the shutdown event from the Extensions API.

😍 Real World Example Time

Okay finally time for real world examples.

🤓 Simple templates to get started

If you want to build a lambda extension yourself it can be helpful to start off with some examples.

AWS provides a GitHub repo with a lot of example code in a variety of languages. The only drawback is that it is a little difficult to read and figure out what is going on in some of the examples. Hopefully this article can help you as you look through their code.

You can also check out this repo I have created that has a single extension and telemetry API running as a lambda layer. You will recognize the code from the example images above. 😎

🙊 Fetching Secrets From SSM/Secrets Manager/etc

It is pretty common to need secret values for database connections, external APIs, private keys, etc. So typically people use a service like SSM, for example, to store and fetch any secrets they need during runtime.

When using SSM with lambda you really only had two options before.

  1. Get the secrets when your app is initialized
  2. Store the secrets as an environment variable

These two solutions share an issue though: your secrets are never updated after they are set. Fetching secrets during each invocation is generally avoided because it is inefficient; you would be holding up your lambda response while the secrets were fetched.

But now, with the Extensions API, we are able to run code after our lambda runtime has finished, so we could easily build an extension that keeps our secrets up to date on our running lambda functions.
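As a sketch of the idea, the extension could refresh a cached copy of the secrets in the post-invocation window whenever the cache is older than some TTL. Everything here (`refreshIfStale`, the 5-minute TTL, the `fetchSecrets` callback) is a made-up illustration of the pattern, not AWS code.

```javascript
// Keep a cached copy of secrets and refresh it between invocations.
const TTL_MS = 5 * 60 * 1000; // assumed refresh interval: five minutes

function isStale(lastFetchedMs, nowMs, ttlMs = TTL_MS) {
  return nowMs - lastFetchedMs >= ttlMs;
}

let cache = { values: null, lastFetched: 0 };

// Called from the extension's while loop after each INVOKE event, i.e.
// in the window after the function response has already been sent, so
// the refresh never holds up the response.
async function refreshIfStale(fetchSecrets, now = Date.now()) {
  if (isStale(cache.lastFetched, now)) {
    cache = { values: await fetchSecrets(), lastFetched: now };
  }
  return cache.values;
}
```

In the extension loop this would be something like `await refreshIfStale(fetchSecretsFromSSM)`, where `fetchSecretsFromSSM` is your real SSM/Secrets Manager call.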

To make things even easier, AWS provides a public lambda layer that leverages the Extensions API to manage secrets from SSM. No need to write our own lambda extension. 😉

🔄 Fire and Forget Lambda API

This last example is something I am currently using in production that I think is a very interesting use case I haven't seen many people mention.

Since extensions can run code after your lambda function has sent a response, you could create a lambda function that responds immediately to an API request and then continues processing the request after the client receives the response.

This opens up more possibilities than just fire-and-forget APIs. Here is a list of use cases I can think of for this extension.

  • DynamoDB-stream-like processing for other databases. You could collect the actions that you performed on your database and then send those actions to an async handler.
  • Send events to SNS, SQS, EventBridge, etc. after you have sent a response to your customer.
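A minimal sketch of the pattern: the handler enqueues work and returns immediately, and the extension drains the queue after the INVOKE event, i.e. after the response has already gone back to the client. `enqueue` and `drain` are names I made up for this illustration, not part of any AWS API.

```javascript
// Work recorded by the handler, processed later by the extension.
const pending = [];

// Called from your handler code: record the work, don't do it yet,
// so the response goes out immediately.
function enqueue(task) {
  pending.push(task);
}

// Called from the extension after the INVOKE event. The worker here
// would be a real call like publishing to SNS/SQS/EventBridge.
async function drain(worker) {
  const tasks = pending.splice(0, pending.length);
  for (const task of tasks) {
    await worker(task);
  }
  return tasks.length;
}
```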

🌯 Wrap up

If you haven’t already you should try out the lambda APIs. They are all pretty great albeit a little confusing.
