In this tutorial, we'll learn how to deploy a FastAPI app to AWS Lambda using a Docker container. We'll start by creating a simple FastAPI app, then containerize it with Docker, deploy the container to AWS Lambda, and test it out.

There are several benefits to deploying FastAPI to AWS Lambda using a Docker container. First, it lets you package your application and all of its dependencies in a single image, which makes deployment and scaling much simpler. Second, a container ensures that your application always runs in a consistent environment, which reduces compatibility and configuration issues. Finally, it can also help improve performance.
Assuming you have a FastAPI application called app.py, the following steps will show you how to deploy it to AWS Lambda using a Docker container.
Before starting, install Docker and the AWS CLI. Installation instructions for the AWS CLI are here:
https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html
In the AWS console, go to IAM and create a user that has access to ECR.
On the next screen, select "Attach existing policies directly" and choose AmazonEC2ContainerRegistryFullAccess.
At the end, it will give you an access key ID and a secret access key. Save both.
In the AWS console, go to Amazon ECR and create a repository. After creating it, open the repository and click "View push commands". You will see the commands needed to build, tag, and push your image.
Now open a terminal on your machine and run aws configure. Enter your access keys and region; you can leave the output format blank or set it to json.
This step is very important; otherwise you will get the following error:
Error: Cannot perform an interactive login from a non TTY device
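The configuration and registry login steps look roughly like this (the region, account ID, and registry URL below are placeholders; the exact login command for your repository is shown in the console's "View push commands" dialog):

```shell
# Configure the AWS CLI with the access keys from the IAM user;
# it prompts for access key ID, secret access key, region, and output format
aws configure

# Log Docker in to your ECR registry (replace region and account ID with yours)
aws ecr get-login-password --region us-east-1 | \
    docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
```

Piping the password into docker login with --password-stdin is what avoids the non-TTY error above.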
First, I am going to use a simple FastAPI app that just takes a string input and returns it.
This is our repo structure
app.py
from fastapi import FastAPI
from mangum import Mangum
from fastapi.responses import JSONResponse
import uvicorn
app = FastAPI()
handler = Mangum(app)
@app.get("/{text}")
def read_item(text: str):
return JSONResponse({"result": text})
if __name__ == "__main__":
uvicorn.run(app, host="0.0.0.0", port=9000)
requirements.txt
numpy
fastapi
mangum
uvicorn
Dockerfile
FROM public.ecr.aws/lambda/python:3.8
# Copy function code
COPY app.py ${LAMBDA_TASK_ROOT}
# Install the function's dependencies using file requirements.txt
# from your project folder.
COPY requirements.txt .
RUN pip3 install -r requirements.txt --target "${LAMBDA_TASK_ROOT}" -U --no-cache-dir
# Set the CMD to your handler (could also be done as a parameter override outside of the Dockerfile)
CMD [ "app.handler" ]
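With the Dockerfile in place, you can build the image and try it locally. The AWS Lambda base image includes the Runtime Interface Emulator, which listens on port 8080 inside the container and accepts invocation payloads at a fixed path (the image name here is a placeholder):

```shell
# Build the image from the Dockerfile above
docker build -t fastapi-lambda .

# Run it locally; the Lambda runtime API listens on 8080 inside the container
docker run -p 9000:8080 fastapi-lambda

# In another terminal, invoke the function with an API Gateway-style event
curl -XPOST "http://localhost:9000/2015-03-31/functions/function/invocations" \
    -d '{"resource": "/{proxy+}", "path": "/hello", "httpMethod": "GET", "headers": {}, "requestContext": {}, "queryStringParameters": null, "body": null, "isBase64Encoded": false}'
```

Note that the container only understands Lambda invocation payloads, not plain HTTP requests.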
Now run the commands shown under "View push commands" to build the image and upload it to ECR.
At this point, if you curl the container directly you will get a payload error; this is expected, because the Lambda runtime expects an invocation payload rather than a plain HTTP request. Ignore it and push the image to ECR.
Next, create a new Lambda function from the container image: in the Lambda console, click Create function, choose Container image, and select the image you pushed to ECR.
Then click on Test and choose the apigateway-aws-proxy template. An Event JSON payload is created; you need to modify a few of its lines:
Replace "path": "/path/to/resource" with "path": "/" (it appears both at the top level and inside requestContext).
Replace "httpMethod": "POST" with "httpMethod": "GET" (again, in both places).
Replace "proxy": "/path/to/resource" with "proxy": "/".
Now click Save and then Test; you should get a success message.
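The edits above can be sketched programmatically. This is an abbreviated version of the apigateway-aws-proxy event containing only the fields the tutorial touches, not the full template:

```python
import json

# Abbreviated apigateway-aws-proxy test event: only the fields we edit
event = {
    "path": "/path/to/resource",
    "httpMethod": "POST",
    "pathParameters": {"proxy": "/path/to/resource"},
    "requestContext": {
        "path": "/path/to/resource",
        "httpMethod": "POST",
    },
}

# Apply the replacements described above, in both places
event["path"] = "/"
event["httpMethod"] = "GET"
event["pathParameters"]["proxy"] = "/"
event["requestContext"]["path"] = "/"
event["requestContext"]["httpMethod"] = "GET"

print(json.dumps(event, indent=2))
```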
Now go to the Configuration tab, open Function URL, and click Create.
Change the settings as shown in the following figure and leave the rest as defaults.
Your API is created. Enjoy!
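Once the function URL exists, you can call the API directly. The URL below is a placeholder; yours will have a different subdomain and region:

```shell
# Hit the deployed endpoint through the Lambda function URL
curl https://abc123.lambda-url.us-east-1.on.aws/hello
# should return: {"result":"hello"}
```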