AWS Tutorials: Build a Python App with Lambda, DynamoDB, API Gateway, and SAM

Tutorial with a complete working cloud-native and serverless To-Do List application.

Pedro Cabido
AWS in Plain English


API Gateway, Lambda and DynamoDB logos

These AWS Tutorials articles serve as my personal learning notes and, at the same time, as a way to share information with everyone interested in learning AWS.

In my last posts, we learned a bit of theory and practice for Lambda (basics, advanced 1 and advanced 2), API Gateway (basics, REST API and HTTP API) and also SAM (basics, advanced).

Today we’ll build a small To-Do list Python application that covers many of the concepts we’ve seen before, but with a hands-on approach. We’ll also include an introduction to DynamoDB.

I’ll try to provide all the context you need to build your own application from the ground up.

Without further ado, let’s review what we will build.

To-Do List Application

We’ll build a CRUD application exposed over an HTTP API with the information persisted on a NoSQL database.

A To-Do list’s goal is to keep track of all your pending actions, so our main resource will be exposed on the /actions API endpoint with a few HTTP verbs available:

  • GET /actions — list all the actions
  • GET /actions/{id} — get one specific action
  • POST /actions — create a new action
  • PUT /actions/{id}/{created_dt} — edit/update an existing action
  • DELETE /actions/{id}/{created_dt} — delete an existing action
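To make the contract concrete before we write any code, here is a Python sketch of the shape of an action item. The field values are placeholders; the field names (id, created_dt, summary, description, priority) are the ones we'll persist later:

```python
import json

# Shape of a single action item, matching the fields we will store in DynamoDB:
# a generated id, a creation timestamp, and the user-supplied details.
# All values below are placeholders for illustration.
action = {
    "id": "3f2c1a9e-0000-0000-0000-000000000000",  # placeholder UUID
    "created_dt": "2022-01-01 12:00:00",           # placeholder timestamp
    "summary": "Buy groceries",
    "description": "Milk, eggs and bread",
    "priority": "High",
}

# A POST /actions request carries only the user-supplied fields;
# the Lambda will generate id and created_dt server-side.
body = json.dumps({k: action[k] for k in ("summary", "description", "priority")})
print(body)
```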

This version of the app will not cover different profiles/users. Maybe we’ll work on that after we deep dive into other services like Amazon Cognito.

Development Environment

Let’s start by preparing our local development environment, which will let us code, test and iterate quickly and without AWS charges.

The first thing we need to do is install Docker. This is the container environment where we will run our Lambda functions, but also an instance of DynamoDB. After the installation, we’ll create a virtual network where the containers will be able to communicate:

> docker network create lambda-local

Then we need to spin up the DynamoDB instance with the following command:

> docker run \
-p 8000:8000 \
-d \
--rm \
--network lambda-local \
--name dynamodb \
-v {your-user-folder}/.docker/dynamodb:/data/ \
amazon/dynamodb-local \
-jar DynamoDBLocal.jar -sharedDb -dbPath /data

Let’s break down this docker command to understand what each part does:

  • docker run — initiates the container (docs here)
  • -p 8000:8000 — port 8000 on the container is exposed as port 8000 on the host
  • -d and --rm — run the container detached and remove it after it stops
  • --network lambda-local — the network created before, to which we’ll also attach the Lambda local environment
  • --name dynamodb — names the container and its hostname
  • -v {your-user-folder}/.docker/dynamodb:/data/ — mounts a volume so the DynamoDB data inside the container (folder /data) is also persisted on the host ({your-user-folder}/.docker/dynamodb)
  • amazon/dynamodb-local — the container image
  • -jar DynamoDBLocal.jar -sharedDb -dbPath /data — the command to run when the container starts

Our local environment is now ready. The rest will be managed by AWS SAM.

Bootstrap the Lambda application

We’ll work with Python 3, so, as a best practice, let’s begin by creating and activating a virtual environment:

> python3 -m venv venv
> source venv/bin/activate

Now with the Python environment ready, we’ll initiate the SAM application that will create the root folder of our project and all the files inside:

> sam init -r python3.9 -d pip -n todo_list_api

This command will present us a prompt:

Which template source would you like to use?
1 - AWS Quick Start Templates
2 - Custom Template Location
Choice: 1
Choose an AWS Quick Start application template
1 - Hello World Example
2 - Infrastructure event management
3 - Multi-step workflow
4 - Lambda EFS example
Template: 1

We’ll go with option 1 in each case. This will clone the project skeleton and create a ready-to-use “Hello World” application for us.

> cd todo_list_api
> ls -l

We can see an empty __init__.py file (Python boilerplate), a template.yaml file with the SAM template, an events folder (containing a JSON file with a mock event), a tests folder, and a hello_world folder (with our serverless app).

We can start by testing the Hello World application to make sure our environment is ready to work:

> sam local start-api
Mounting HelloWorldFunction at http://127.0.0.1:3000/hello [GET]
You can now browse to the above endpoints to invoke your functions. You do not need to restart/reload SAM CLI while working on your functions, changes will be reflected instantly/automatically. You only need to restart SAM CLI if you update your AWS SAM template
* Running on http://127.0.0.1:3000/ (Press CTRL+C to quit)

We can use any HTTP client (including our browsers) to make this simple request. I’ll use Postman which will be useful for the /actions API.

Postman making a GET request to the /hello endpoint

As we can see, we receive a 200 OK response from the Lambda running on the container managed by SAM.

DynamoDB basic details

We now need to prepare our local instance of DynamoDB to be able to receive the data from our To-Do List application.

With DynamoDB running on a Docker container, we can now use the NoSQL Workbench to interact with it. After the installation:

  • Go to Operation Builder
  • Press + Add connection
  • Select the DynamoDB local tab, write a name and press Connect
NoSQL Workbench with Local DynamoDB
  • Go to Data Modeler
  • Press + to add a new model
  • Choose a name and fill the optional fields if you want and press Create
NoSQL Workbench creating a new data model
  • Then on Tables press + to add a new Table
  • Fill all the mandatory fields and press Add Table definition
NoSQL Workbench creating a new table
  • We select id as Partition key and created_dt as Sort key
  • Finally go to Visualizer and press Commit to Amazon DynamoDB, select your Saved connections and press Commit
NoSQL Workbench committing table to local DynamoDB
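If you prefer scripting this step instead of clicking through the Workbench, the same table can be created with boto3. This is a sketch, assuming DynamoDB Local is reachable on localhost:8000; the function and variable names are my own, but the table spec mirrors what we configured above (id as partition key, created_dt as sort key):

```python
# Table specification mirroring the Workbench setup above:
# "id" as the partition (HASH) key and "created_dt" as the sort (RANGE) key.
TABLE_SPEC = {
    "TableName": "Actions",
    "KeySchema": [
        {"AttributeName": "id", "KeyType": "HASH"},
        {"AttributeName": "created_dt", "KeyType": "RANGE"},
    ],
    "AttributeDefinitions": [
        {"AttributeName": "id", "AttributeType": "S"},
        {"AttributeName": "created_dt", "AttributeType": "S"},
    ],
    "ProvisionedThroughput": {"ReadCapacityUnits": 1, "WriteCapacityUnits": 1},
}


def create_actions_table(endpoint_url="http://localhost:8000"):
    # boto3 is imported lazily so the spec above can be inspected or reused
    # without requiring boto3 to be installed.
    import boto3

    client = boto3.client(
        "dynamodb", endpoint_url=endpoint_url, region_name="eu-west-1"
    )
    return client.create_table(**TABLE_SPEC)
```

Running create_actions_table() once against the local endpoint should leave you in the same state as the Commit step in the Workbench.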

Nice job. We are good to go to start working on our amazing app.

To-Do List API

As already said in the beginning, we will create a CRUD API with one Lambda function for each endpoint of the API. This will give us a separation of responsibilities, where each Lambda function has a single, well-defined job.

Project Organisation

We’ll replace the hello_world folder at the root of the project with a /src folder containing each of the Lambda functions. Also, under the /tests folder we’ll remove the integration tests (we won’t cover those now) and add a new /test_events folder containing the mock events that we’ll use for the unit tests.

Our folder structure will be something like this:

todo_list_api/
|-- src/
|   |-- create_action/
|   |-- delete_action/
|   |-- get_action/
|   |-- list_actions/
|   |-- update_action/
|-- tests/
|   |-- test_events/
|   |-- unit/
|-- __init__.py
|-- .gitignore
|-- README.md
|-- template.yaml
venv/

SAM Template

Let’s review some of the most important characteristics of the SAM Template for this app.

AWS::Serverless::Function

Resources:
  CreateActionsFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/create_action/
      Handler: app.lambda_handler
      Runtime: python3.9
      Architectures:
        - x86_64
      Environment:
        Variables:
          TABLE: !Ref TABLE
          REGION: !Ref REGION
          AWSENV: !Ref AWSENV
      Events:
        CreateAction:
          Type: HttpApi
          Properties:
            Path: /actions
            Method: post
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref ActionsTable

The above code demonstrates how we describe our serverless function. As we’ve already mentioned in previous posts about SAM (here and here), this Type is a superset of the AWS CloudFormation types. More details below:

  • CodeUri: src/create_action/ — declares where the function code lives
  • Handler: app.lambda_handler — the main function inside our code that will process the event
  • Environment: — where we declare environment variables to be used in our function
  • Events: — declaration of the API endpoint
  • Type: HttpApi — we’ll use an HTTP API instead of a REST API
  • Path: /actions — the path that will be used to invoke this function
  • Method: post — the method allowed
  • Policies: — declares a policy allowing reads and writes to a specific DynamoDB table

The other functions/endpoints follow the same logic, just replacing the action.
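As an example, here is a sketch of how the GetAction function resource might look (the resource and event names are assumptions; essentially only the CodeUri, Path and Method change):

```yaml
  GetActionFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/get_action/
      Handler: app.lambda_handler
      Runtime: python3.9
      Events:
        GetAction:
          Type: HttpApi
          Properties:
            Path: /actions/{id}
            Method: get
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref ActionsTable
```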

AWS::DynamoDB::Table

  ActionsTable:
    Type: AWS::DynamoDB::Table
    Properties:
      AttributeDefinitions:
        - AttributeName: "id"
          AttributeType: "S"
        - AttributeName: "created_dt"
          AttributeType: "S"
      KeySchema:
        - AttributeName: "id"
          KeyType: "HASH"
        - AttributeName: "created_dt"
          KeyType: "RANGE"
      ProvisionedThroughput:
        ReadCapacityUnits: 1
        WriteCapacityUnits: 1
      TableName: "Actions"

Here we describe how the DynamoDB table is declared. AWS CloudFormation will use this to create the table for us when we deploy the entire application to the AWS Cloud.
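One detail worth calling out: the function’s Environment block references TABLE, REGION and AWSENV with !Ref, and we’ll later pass --parameter-overrides to sam local. Both assume a Parameters section at the top of the template. A sketch of what it could look like (the defaults mirror the fallbacks used in the Python code):

```yaml
Parameters:
  TABLE:
    Type: String
    Default: "Actions"
  REGION:
    Type: String
    Default: "eu-west-1"
  AWSENV:
    Type: String
    Default: "AWS"
    AllowedValues:
      - "AWS"
      - "AWS_SAM_LOCAL"
```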

Lambda Functions

We’ll cover the code for one of the Lambda functions, create_action. The remaining functions use the same principles, with slight differences in the business logic.

import datetime
import json
import os
import uuid

import boto3


def lambda_handler(event, context):  # [1]
    """Body expected:
    {
        "summary": "Action Name",
        "description": "Action Description",
        "priority": "High"
    }
    """
    print(event)
    if not event["body"] or event["body"] == "":  # [2]
        return {"statusCode": 400, "headers": {}, "body": "Bad request"}
    action: dict[str, str] = json.loads(event["body"])  # [3]
    params = {  # [4]
        "id": str(uuid.uuid4()),
        "created_dt": str(datetime.datetime.now()),
        "summary": action["summary"],
        "description": action["description"],
        "priority": action["priority"],
    }
    try:
        db_response = dynamo_table().put_item(Item=params)  # [5]
        print(db_response)
        return {"statusCode": 201, "headers": {}, "body": json.dumps(params)}
    except Exception as e:
        print(e)
        return {"statusCode": 500, "headers": {}, "body": "Internal Server Error"}


def dynamo_table():  # [5a]
    table_name = os.environ.get("TABLE", "Actions")
    region = os.environ.get("REGION", "eu-west-1")
    aws_environment = os.environ.get("AWSENV", "AWS")
    if aws_environment == "AWS_SAM_LOCAL":
        actions_table = boto3.resource(  # [5b]
            "dynamodb", endpoint_url="http://dynamodb:8000"
        )
    else:
        actions_table = boto3.resource(  # [5c]
            "dynamodb", region_name=region
        )
    return actions_table.Table(table_name)

We begin the function with the imports; the most relevant is boto3, AWS’s official SDK for working with their services (we’ll use it to handle DynamoDB requests). Next, let’s walk through the important bits that I’ve highlighted in the code:

  • [1] — the main function, which receives the event from API Gateway as its first parameter and the Lambda function context as its second.
  • [2] — since this function creates a new action based on the body of the POST request, we first check that the body of the event exists (no further validation is done for the sake of simplicity). We return early with a 400 response code if the request is missing a body.
  • [3] — extract the action details from the JSON body.
  • [4] — define the action parameters that we will insert into DynamoDB.
  • [5] — use boto3’s put_item function to send the request to the database.
  • [5a] — a dedicated function that handles the boto3 boilerplate needed to communicate with DynamoDB.
  • [5b] — if the environment variables indicate the local environment, we connect to the DynamoDB table on the dynamodb:8000 host.
  • [5c] — if we’re configured for the cloud, we just connect as usual.

The call to DynamoDB sits inside a try…except block, so if an exception is thrown while putting the item, we can handle it.

On success, we reply with a 201 Created status code and the details of the created action in the body. On an exception, we reply with a 500 Internal Server Error (again, for the sake of simplicity in this tutorial).

How to test in the local environment?

Run the following command:

> sam local start-api \
--docker-network lambda-local \
--parameter-overrides AWSENV=AWS_SAM_LOCAL

Then we can use Postman to make the request to the local environment:

Using Postman to call the local environment

As expected, we receive a response with a 201 Created status code.

Unit Tests

Again, we’ll just cover one of the unit test cases, test_create_action. We use Python’s official unittest library. To mock the DynamoDB interaction we use the moto library (more details here). To invoke the function we use the respective test event from the test_events folder.

import json
import os
from unittest import main, TestCase, mock

import boto3
from moto import mock_dynamodb


@mock.patch.dict(  # [1]
    os.environ, {"TABLE": "Mock_Actions", "REGION": "eu-west-1", "AWSENV": "MOCK"}
)
@mock_dynamodb  # [2]
class TestCreateAction(TestCase):
    def setUp(self):  # [3]
        self.dynamodb = boto3.client("dynamodb", region_name="eu-west-1")
        self.dynamodb.create_table(
            TableName="Mock_Actions",
            KeySchema=[
                {"AttributeName": "id", "KeyType": "HASH"},
                {"AttributeName": "created_dt", "KeyType": "RANGE"},
            ],
            AttributeDefinitions=[
                {"AttributeName": "id", "AttributeType": "S"},
                {"AttributeName": "created_dt", "AttributeType": "S"},
            ],
            ProvisionedThroughput={"ReadCapacityUnits": 1, "WriteCapacityUnits": 1},
        )

    def tearDown(self):  # [4]
        self.dynamodb.delete_table(TableName="Mock_Actions")

    def test_create_action_201(self):  # [5]
        from src.create_action import app

        event_data = "tests/test_events/create_action.json"
        with open(event_data, "r") as f:
            event = json.load(f)  # [5a]
        response = app.lambda_handler(event, "")  # [5b]
        body = json.loads(response["body"])  # [5c]
        self.assertEqual(response["statusCode"], 201)  # [5d]
        self.assertEqual(body["summary"], "Mock Action")  # [5e]
        self.assertEqual(body["description"], "Mock Description")
        self.assertEqual(body["priority"], "Mock Priority")

    def test_create_action_400(self):  # [6]
        from src.create_action import app

        event_data = "tests/test_events/create_action.json"
        with open(event_data, "r") as f:
            event = json.load(f)
        event["body"] = ""  # [6a]
        response = app.lambda_handler(event, "")
        self.assertEqual(response["statusCode"], 400)  # [6b]
        self.assertEqual(response["body"], "Bad request")


if __name__ == "__main__":
    main()

Again, after importing the necessary libs, let’s see the most important points:

  • [1] — mocking the environment variables
  • [2] — using moto to mock DynamoDB
  • [3] — set up function where we initialise the DynamoDB environment. This function will run before each test.
  • [4] — tear down function where we delete the DynamoDB environment. This function will run after each test to make sure we have a clean environment.
  • [5] — testing the success scenario
  • [5a] — load the JSON event from the file
  • [5b] — calling the function with the dummy event
  • [5c] — get the body of the response from the function
  • [5d] — assert that the status code is 201 as expected
  • [5e] — three assertions to validate the body
  • [6] — testing the 400 scenario
  • [6a] — deleting the body from the event file to simulate the failure scenario
  • [6b] — asserting the status code and the response body
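If the environment-variable mocking in [1] is new to you, it can be tried in isolation. A minimal stdlib-only sketch (the variable names here are just for illustration):

```python
import os
from unittest import mock

# Remember the current value (if any) so we can verify it is restored.
original = os.environ.get("TABLE")

# Inside the context, the patched values are visible via os.environ...
with mock.patch.dict(os.environ, {"TABLE": "Mock_Actions", "AWSENV": "MOCK"}):
    inside = os.environ["TABLE"]

# ...and on exit, os.environ is restored to its previous state.
restored = os.environ.get("TABLE")
```

Used as a class decorator, as in the test above, the same patch applies for the duration of every test method.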

How to run the unit tests?

On the root of the project run the following command:

> python3 -m unittest discover tests/unit/ -bv

Deploy to AWS Cloud

When everything is done, we can build the project and deploy it to the AWS Cloud. For that we’ll have the help of SAM and CloudFormation to push, configure and deploy our application.

> sam build
> sam deploy --guided

Follow the instructions in the CLI, then go watch CloudFormation create all the necessary resources. At the end, we can check the API Gateway, DynamoDB and Lambda consoles to make sure everything looks as expected.

Test in the Cloud

After CloudFormation finishes its processes, we can go to the API Gateway console and note the Invoke URL. We’ll use it to call our API, which should now be publicly available on the internet.

Success! We have a fully working CRUD application exposed over an HTTP API, built cloud-native on a serverless environment.

How fancy is that? 🙂

To complete this tutorial I read a lot of AWS documentation pages, did a lot of googling, and owe a special thanks to this inspiration project.

Since I’m still learning, I will be very happy to get feedback on the code or anything related to this small project. Constructive feedback is always encouraged.

Thanks for reading. Hope this helps you as much as it helped me. ❤️
If you want to follow my AWS learning journey, follow me on Medium and Twitter.

