AWS Lambda - A Virtual Podcast

Posted on April 19, 2020  •  12 minutes  • 2384 words

AWS Lambda is a serverless compute service, and after having worked with it for some time, I felt it was a good time to share my learnings and experiences. I had been thinking of writing an article in a “Virtual Podcast” format and felt this could be the one.

Welcome to this article, AWS Lambda - A Virtual Podcast, and let me introduce our guests, Mr. Hernandez and Ms. Jessica, who will walk us through their experiences of using AWS Lambda.

Welcome, Hernandez and Jessica, and thank you for participating in this Virtual Podcast. Let’s get started.

What is AWS Lambda

Me> My first question to you Jessica is “What is AWS Lambda?”

Jessica> AWS Lambda is a serverless compute service which allows you to execute a function in response to various events without provisioning or managing servers. What this means is your function will execute ONLY when there is a request for it.

Me> So what I gather is, a function is an entry point which gets invoked by the AWS Lambda service. Is that right?

Jessica> That’s nearly right. When you create a lambda function, you need to specify a handler, which is simply the filename.exportedFunctionName that acts as the entry point for your application.

Let’s say you have a file named “handler.js” and it exports a function named “processOrders”; your handler then becomes handler.processOrders, which will be invoked by AWS Lambda in response to events.
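As a rough illustration (the event shape and return value here are made up for the example), a handler.ts that gets transpiled to handler.js could look like this:

```typescript
// handler.ts - transpiled to handler.js before deployment.
// The Lambda handler setting "handler.processOrders" points at this export.
interface OrderEvent {
  orders?: { id: string }[];
}

export const processOrders = async (event: OrderEvent) => {
  const orders = event.orders ?? [];

  // Do the actual work here; the return value goes back to the invoker.
  return {
    statusCode: 200,
    body: JSON.stringify({ processed: orders.length }),
  };
};
```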

Me> Thank you Jessica.

AWS Lambda is a serverless compute service which allows you to execute a function in response to various events without provisioning or managing servers. When you create a lambda function, you need to specify a handler which acts as an entry point for your lambda function.

. . .

How does a lambda function execute?

Me> Jessica, you mentioned that a lambda function runs in response to an event, but where does it run?

Jessica> When you create a lambda function, you need to specify a runtime, say node12.x, python3.7, or something else. When there is a request for your lambda function, AWS will provision a container with the selected runtime and then run your function.

Me> So it is actually a container within which a lambda function runs. Does that also mean your lambda function gets some storage on the file system?

Jessica> Yes, your lambda function gets 512MB of storage in the /tmp directory, but that is ephemeral. It goes away when the container goes away.
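For instance, a handler could use /tmp as scratch space; a minimal sketch (the file name is arbitrary):

```typescript
import { promises as fs } from "fs";

// /tmp is the only writable path inside the Lambda container. Whatever is
// written there survives only as long as this particular container instance.
export const handler = async () => {
  const scratchFile = "/tmp/last-invocation.txt";
  await fs.writeFile(scratchFile, new Date().toISOString());
  return { lastInvocation: await fs.readFile(scratchFile, "utf-8") };
};
```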

AWS will provision a container to run your function when there is a request for your lambda function. This container will be discarded after some inactive time.

. . .

What is AWS Lambda Cold Start

Me> Hernandez, since a lambda function is not always running, does it increase the response time of a request?

Hernandez> Like Jessica mentioned, a lambda function runs inside a container, which stays active as long as your function keeps getting requests. This container will be discarded by AWS after some inactive time, making your function inactive, and this is called the cold state.

Whenever there is a request for a cold function, AWS needs to provision a container for running your function, and this is called a Cold Start. So, to answer your question, yes, cold starts can add to the response time of a request.

Me> Is there a way to avoid cold start?

Hernandez> Yes. AWS has now introduced Provisioned Concurrency, which is designed to keep your functions initialized and ready to respond in double-digit milliseconds at the scale you need. Provisioned concurrency adds a pricing dimension, though.

You can turn it ON/OFF from the AWS console or through a CloudFormation template.
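Besides the console and CloudFormation, it can also be toggled through the API; a minimal sketch using the AWS SDK for JavaScript (the function name and alias are hypothetical):

```typescript
import { Lambda } from "aws-sdk";

const client = new Lambda({ region: "us-east-1" });

// Keep 5 execution environments initialized for the "live" alias of a
// hypothetical "process-orders" function; deleting the config turns it off.
async function enableProvisionedConcurrency() {
  await client
    .putProvisionedConcurrencyConfig({
      FunctionName: "process-orders",
      Qualifier: "live",
      ProvisionedConcurrentExecutions: 5,
    })
    .promise();
}
```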

If you are using the Serverless Framework, plugins like WarmUp can help keep your functions warm.

Me> Thank you Hernandez.

AWS needs to provision a container for running your cold function, and this is called a Cold Start. You should check out Provisioned Concurrency (or even the Serverless WarmUp plugin) for keeping your functions initialized.

. . .

AWS Lambda Configuration

Me> Jessica, what are the different configuration options one can specify while creating a lambda function?

Jessica> You can specify a lot of options, including -

- the runtime and the handler
- an IAM role that the function assumes while executing
- the memory allocated to the function
- the timeout after which an execution is terminated
- a VPC to run the function in
- concurrency limits

and quite a few more.

Me> Wow, that is quite a list. Jessica, you mentioned memory, but no mention of CPU?

Jessica> Right, you cannot control the amount of CPU that gets allocated to your lambda function; it is proportional to the amount of memory allocated.

Me> I see. Jessica, what do you mean by Concurrency of a lambda function?

Jessica> I like the example given in Managing AWS Lambda Function Concurrency. Imagine each slice of a pizza is an execution unit of a lambda function, and the entire pizza represents the shared concurrency pool for all lambda functions in an AWS account.

Let’s say we set a concurrency limit of 100 for a lambda function; all we are saying is that the lambda function gets a total of 100 pizza slices, which means you can have 100 concurrent executions of that function. The concurrency limit set for a lambda function is deducted from the shared concurrency pool - the entire pizza - which is 1000 for all lambda functions per AWS account.
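To make that concrete, here is a small sketch (again using the AWS SDK for JavaScript and a hypothetical function name) of reserving 100 of those slices for one function:

```typescript
import { Lambda } from "aws-sdk";

const client = new Lambda({ region: "us-east-1" });

// Carve 100 concurrent executions (100 pizza slices) out of the account
// pool for the hypothetical "process-orders" function.
async function reserveConcurrency() {
  await client
    .putFunctionConcurrency({
      FunctionName: "process-orders",
      ReservedConcurrentExecutions: 100,
    })
    .promise();
}
```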

Me> Jessica, I also see an option of Unreserved Concurrency in lambda configuration. What is that?

Jessica> AWS keeps aside 100 units of concurrency for all functions that don’t have a specific concurrency limit set. This makes sure that functions without reserved concurrency always have some capacity to consume.

Me> Thank you Jessica. I am starting to wonder what happens when a lambda function’s concurrency limit is reached and there are more requests?

Jessica> Lambda function gets throttled.

Me> Does that mean a client of your lambda function, say API Gateway, will get an error?

Jessica> It actually depends on the type of invocation. If it is a synchronous request, the caller gets a throttling error (HTTP 429).

Whereas in the case of an asynchronous request, say an event from SQS, AWS Lambda will retry your lambda function before sending the event to a Dead Letter Queue, assuming one is configured.

Various configuration options can be specified while creating a lambda function, including IAM role, memory, timeout, VPC, concurrency, etc.

. . .

AWS Lambda Debugging

Me> Hernandez, what AWS services can help us with debugging an issue with a lambda function?

Hernandez> AWS Lambda function logs are sent to CloudWatch, and the lambda function needs an IAM role with permissions to do that. Other than CloudWatch, you can also use AWS X-Ray for tracing and debugging performance issues.

Me> Nice. How do you set up AWS X-Ray with a lambda function? Do you need to set up an X-Ray agent or something like that?

Hernandez> No, with a lambda function you need to do very few things in order to set up tracing -

- enable Active Tracing on the function
- make sure the function’s IAM role is allowed to write trace data to X-Ray (xray:PutTraceSegments and xray:PutTelemetryRecords)
- optionally, instrument your code with the X-Ray SDK so downstream calls are traced as well

Everything else is taken care of by AWS Lambda.
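The last, optional step could look roughly like this - a sketch assuming the aws-xray-sdk-core package and a made-up table name:

```typescript
import * as AWSXRay from "aws-xray-sdk-core";

// Wrap the AWS SDK so every downstream AWS call made by this function is
// recorded as a subsegment of the trace.
const AWS = AWSXRay.captureAWS(require("aws-sdk"));
const dynamo = new AWS.DynamoDB.DocumentClient();

export const processOrders = async (event: { id: string }) => {
  // This read shows up as its own node in the X-Ray service map.
  return dynamo.get({ TableName: "orders", Key: { id: event.id } }).promise();
};
```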

Me> Ok. Once this is done, AWS will be able to build a service map showing which services were invoked by the lambda function and highlight the problems, if any. Is that right?

Hernandez> Yes, that is right.

AWS Lambda function logs are sent to CloudWatch, and the lambda function needs an IAM role with permissions to do that. Other than CloudWatch, you can also use AWS X-Ray for tracing and debugging performance issues.

. . .

Restrictions with AWS Lambda

Me> Jessica, any restrictions around AWS Lambda that we should be aware of?

Jessica> I think there are a few restrictions -

- a maximum execution timeout of 15 minutes
- memory from 128MB up to 3008MB (at the time of writing)
- a deployment package size of 50MB zipped and 250MB unzipped
- 512MB of ephemeral storage in /tmp
- an invocation payload of 6MB for synchronous and 256KB for asynchronous requests
- a maximum of 5 layers per function

With that said, I feel you might not hit all of these limitations. To elaborate, if your unzipped code size is going beyond 250MB, it is good to understand why the lambda function is getting so huge. Have we packed too many dependencies, have we mixed too many responsibilities into a single lambda function, or is it something else?

Me> Jessica, what is a lambda layer?

Jessica> A layer is a ZIP archive that contains libraries, a custom runtime, or other dependencies needed by your application. With layers, you can use libraries in your function without needing to include them in your deployment package. Layers let you keep your deployment package small.
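For instance, with the CDK (which comes up again later in this conversation), a layer could be declared roughly like this; the directory layout and names are assumptions, and fn stands for an existing lambda function in the same stack:

```typescript
import * as lambda from "@aws-cdk/aws-lambda";

// Inside an existing CDK stack: a layer holding shared node_modules.
// For Node.js runtimes the layer content must live under nodejs/node_modules.
const depsLayer = new lambda.LayerVersion(this, "DepsLayer", {
  code: lambda.Code.fromAsset("layers/deps"),
  compatibleRuntimes: [lambda.Runtime.NODEJS_12_X],
});

// The function references the layer instead of bundling the libraries
// into its own deployment package.
fn.addLayers(depsLayer);
```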

Me> Ok, then a limit of 5 layers per function looks quite sensible.

Jessica> True. I think these constraints are very sensible and if we are hitting some of them, it is worth looking back and seeing if there is a problem somewhere else.

AWS Lambda has some restrictions and our panel feels these are sensible restrictions. It is good to know them.

. . .

Unit and Integration Testing with AWS Lambda

Me> Coming to my favorite topic. How has your experience been with testing AWS Lambda functions?

Jessica> Well, unit testing is not difficult. If you are coding your lambda function in TypeScript, you can very well use sinon to mock all the dependencies and just validate that a single unit is working fine.
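A unit test could look roughly like this - a sketch assuming a mocha/jest-style test runner and a hypothetical processOrders handler that takes its order store as an argument so it can be stubbed:

```typescript
import * as sinon from "sinon";
import * as assert from "assert";
import { processOrders } from "./handler"; // hypothetical module under test

it("saves every order it receives", async () => {
  // Stub the dependency instead of talking to a real database.
  const orderStore = { save: sinon.stub().resolves() };

  await processOrders({ orders: [{ id: "42" }] }, orderStore);

  assert.ok(orderStore.save.calledOnceWith({ id: "42" }));
});
```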

Hernandez> True. I think the challenge comes when you want to assert that the integration of your lambda function with external systems, say DynamoDB or S3, works properly. In order to test this, we have used LocalStack in our project.

Me> LocalStack? Do you want to talk a bit about this?

Hernandez> Sure. LocalStack provides an easy-to-use test/mocking framework for developing Cloud applications. At this stage, their focus is primarily on supporting the AWS cloud stack.

LocalStack spins up various Cloud APIs on your local machine, including S3, Lambda, DynamoDB and API Gateway. All you need to do is spin up the LocalStack docker container, deploy your infra, say a DynamoDB table or a lambda function, within LocalStack, and connect to these locally running services from within your code.
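A small sketch of that last part, assuming LocalStack's default edge port 4566 and a table that the test setup has already created:

```typescript
import { DynamoDB } from "aws-sdk";

// Point the SDK at LocalStack running on the local machine instead of AWS.
// LocalStack accepts any dummy credentials.
const dynamo = new DynamoDB.DocumentClient({
  region: "us-east-1",
  endpoint: "http://localhost:4566",
  accessKeyId: "test",
  secretAccessKey: "test",
});

// An integration test can now exercise the real DynamoDB API locally.
export async function saveOrder(id: string) {
  await dynamo.put({ TableName: "orders", Item: { id } }).promise();
}
```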

Me> Interesting. Does LocalStack support all AWS services?

Hernandez> No, it supports quite a few but definitely not all.

Unit testing of AWS Lambda function code is well understood by all of us, but what is good to know is that LocalStack can be used for integration testing.

. . .

Packaging and deploying an AWS Lambda application

Me> Jessica, you talked about unzipped code. Does that mean you have to create a zip file and upload it somewhere?

Jessica> Well, you have to package your lambda function along with its dependencies as an archive and upload it either through the AWS Lambda console or to an S3 bucket, which will then be referenced from your CloudFormation template.

Me> How do you folks package your application? It appears to me as if we need to create a “fat jar” kind of a thing.

Hernandez> We use TypeScript for coding our lambda application and webpack for packaging it. Webpack does not create a zip file, just an out directory containing the transpiled code (js) - a handler.js file with all the required code pulled in from the different node_modules, plus its source map.

Me> How do you deploy your code then? You only seem to have created an output directory with a few JavaScript files.

Hernandez> We use the CDK for deploying our code, which allows you to define your infra in code.

Me> Wow, the list of tools doesn’t seem to come to an end.

Hernandez> It’s simple. Just look at it this way: we have created a directory which is ready to be deployed. When you say cdk bootstrap, the CDK provisions an S3 bucket in your account for staging deployment assets.

And when you say cdk deploy, the contents of this out directory are copied into a staging directory, archived, uploaded to that S3 bucket, and you will see all the required AWS components getting deployed. Simple.

Me> Simple? You said the contents of this out directory will be copied into another directory. Does that mean the CDK already knows about the out directory?

Hernandez> That’s true. When you code your infra, you specify where your compiled (or transpiled), ready-to-be-shipped code is located, and that’s how the CDK knows about this directory.
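Putting it together, a minimal sketch of what that infra code could look like (the stack and function names are made up; the out directory is the webpack output mentioned above):

```typescript
import * as cdk from "@aws-cdk/core";
import * as lambda from "@aws-cdk/aws-lambda";

class OrdersStack extends cdk.Stack {
  constructor(scope: cdk.App, id: string) {
    super(scope, id);

    // Code.fromAsset("out") is how the CDK learns about the shippable
    // directory; cdk deploy stages, archives and uploads its contents.
    new lambda.Function(this, "ProcessOrders", {
      runtime: lambda.Runtime.NODEJS_12_X,
      handler: "handler.processOrders",
      code: lambda.Code.fromAsset("out"),
      memorySize: 256,
      timeout: cdk.Duration.seconds(30),
    });
  }
}

const app = new cdk.App();
new OrdersStack(app, "OrdersStack");
```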

Me> Great, now I am able to connect the dots. Build your code -> get a shippable directory -> archive it -> upload it to an S3 bucket -> deploy it, and CDK is one way to get all these steps done. Is that right?

Hernandez> Absolutely.

In order to deploy your lambda function, it needs to be packaged along with its dependencies as an archive. You could use webpack if you are using TypeScript as the programming language. You can use CDK, CloudFormation or SAM for packaging and deploying your lambda function.

. . .

Applications built using AWS Lambda

Me> Jessica, Hernandez, what are the different types of applications that you folks have built using AWS Lambda?

Jessica> We have actually built serverless microservices using AWS Lambda and we also process web clicks on our application which is a stream of events flowing from user interface to AWS Pinpoint to AWS Kinesis to AWS Lambda.

Hernandez> We use AWS Lambda for scaling down images that are uploaded to our S3 buckets and for processing DynamoDB streams, which are streams of changes to a DynamoDB table.

Me> Thanks Jessica and Hernandez.

Our panel highlighted different types of applications they have built using AWS Lambda including microservices, event processing (images on S3 buckets) and stream processing (web clicks and handling changes in DynamoDB).

. . .

With this, we come to the end of our “Virtual Podcast”, and a big thank you to Jessica and Hernandez for being a part of it. This was wonderful, and I hope our readers (yes, it is still virtual) find it the same way. Thank you again.
