Run AWS DynamoDB Locally

DynamoDB is a fully managed NoSQL database service offered by Amazon. This tutorial will explain how to set up DynamoDB locally on your PC in 2 different ways.

Ivan Polovyi
AWS in Plain English


I assume you have Docker and Docker Compose installed on your PC. The complete source code used in this tutorial can be found in the GitHub repository here.

There are numerous ways to spin up a container from a Docker image. This tutorial will explain how to run a container with DynamoDB using Docker Compose. As soon as the DynamoDB instance is ready, AWS CLI commands will be executed against it. The commands will create a table and populate it with data. Personally, I prefer this way of creating a local DynamoDB instance because it creates everything needed for local development with one single command. To run a container using Docker Compose, we have to create a docker-compose.yaml file and then execute the following command in the directory where this file is located:

$docker-compose up

1. Using the DynamoDB Docker image

The first method is to use the official DynamoDB Docker image. The docker-compose.yaml below has an instruction to create one container with a DynamoDB instance. Alongside this container, one more container will be created from the amazon/aws-cli image. The aws-cli container has a volume mapped to the directory on the PC that contains the bash scripts. The command section of the docker-compose.yaml file has a command to loop over the files in the mapped volume and execute each bash script against the DynamoDB instance in the first container. After execution is done, the aws-cli container stops and only the DynamoDB container continues to run. The environment section has 2 variables: the first is for the default profile, which will be created by the first bash script, and the second is to simplify the commands in the second bash script. I will explain this part later.

The docker-compose.yaml for DynamoDB Docker image
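The original file is embedded in the article as a gist; below is a minimal sketch of what it might look like. The container names, the ./scripts path, and the sleep-based wait are my assumptions, not the author's exact file.

```yaml
version: "3.8"
services:
  dynamo-db-local:
    image: amazon/dynamodb-local
    container_name: dynamo-db-local
    ports:
      - "8000:8000"

  aws-cli:
    image: amazon/aws-cli
    container_name: aws-cli
    depends_on:
      - dynamo-db-local
    volumes:
      # Directory on the PC that holds the numbered bash scripts.
      - ./scripts:/scripts
    environment:
      # Created by the first bash script; used by all AWS CLI commands.
      AWS_DEFAULT_PROFILE: dynamo-db-local
      # Shortens the commands in the second bash script.
      AWS_ENDPOINT: "--endpoint-url=http://dynamo-db-local:8000"
    # The image's default entrypoint is the aws binary, so it is
    # overridden with a shell that loops over the mounted volume and
    # runs every script in order; a short sleep gives DynamoDB time
    # to start accepting connections. The container then stops.
    entrypoint: /bin/sh
    command: -c "sleep 5; for script in /scripts/*.sh; do sh $$script; done"
```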

In this example, I’m using 2 bash files. It could easily be done with a single script, but personally I prefer to keep the creation of different objects separate, and I also wanted to show you that it is possible. To guarantee the order of execution, the file names have to start with numbers, e.g. 01, 02…

The script below creates a profile with the name that was set as an environment variable in the docker-compose.yaml.

The bash script to create a profile
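The original script is embedded as a gist; a rough sketch is shown below. DynamoDB Local does not validate credentials, so the dummy values and the region are my assumptions — only the profile name has to match AWS_DEFAULT_PROFILE from the docker-compose.yaml.

```shell
#!/bin/sh
# 01-create-profile.sh
# Create the AWS CLI profile whose name was set as AWS_DEFAULT_PROFILE
# in docker-compose.yaml. DynamoDB Local accepts any credentials,
# so dummy values are enough.
aws configure set aws_access_key_id dummy --profile dynamo-db-local
aws configure set aws_secret_access_key dummy --profile dynamo-db-local
aws configure set region us-east-1 --profile dynamo-db-local
```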

The second script creates a table, populates it with one item, and then lists this item.

Here you can see why I've set two environment variables. AWS_DEFAULT_PROFILE sets the default profile for all AWS CLI commands, so I don't have to specify it with every command like this:

--profile=dynamo-db-local

And the second, AWS_ENDPOINT, replaces

--endpoint-url=http://dynamo-db-local:8000

This makes all the commands shorter, and if for some reason I have to change, for example, a port mapping, I won't need to change it in every command in every script; I just need to change it in the docker-compose.yaml.

The bash script to create a table, put an item into the table, and scan the table
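Again, a sketch rather than the author's exact gist: the customer-loc table name comes from the scan command shown later, while the id hash key, the billing mode, and the sample item are my assumptions.

```shell
#!/bin/sh
# 02-create-table.sh
# $AWS_ENDPOINT expands to "--endpoint-url=http://dynamo-db-local:8000"
# (set in docker-compose.yaml), which keeps each command short.
aws dynamodb create-table \
    --table-name customer-loc \
    --attribute-definitions AttributeName=id,AttributeType=S \
    --key-schema AttributeName=id,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST \
    $AWS_ENDPOINT

aws dynamodb put-item \
    --table-name customer-loc \
    --item '{"id": {"S": "1"}, "name": {"S": "John Doe"}}' \
    $AWS_ENDPOINT

aws dynamodb scan --table-name customer-loc $AWS_ENDPOINT
```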

An example command to execute against DynamoDB:

$aws dynamodb scan --table-name customer-loc --endpoint-url=http://localhost:8000

You need to have the AWS CLI installed on your PC to run this command.

2. Using LocalStack

The second option is a little bit simpler. The docker-compose.yaml file has an instruction to spin up a container with LocalStack, which contains an instance of DynamoDB. This container has a volume mapped to a directory on the PC with bash scripts. These scripts will be executed by LocalStack as soon as DynamoDB is ready.

The docker-compose.yaml for the LocalStack image
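A sketch of such a file, not the author's exact gist. Note that the init-script path is version-dependent: newer LocalStack releases run scripts from /etc/localstack/init/ready.d, while older releases use /docker-entrypoint-initaws.d.

```yaml
version: "3.8"
services:
  localstack:
    image: localstack/localstack
    container_name: localstack
    ports:
      - "4566:4566"
    environment:
      # Start only the DynamoDB service.
      SERVICES: dynamodb
    volumes:
      # Scripts in this directory are executed by LocalStack once the
      # services are ready; older releases use /docker-entrypoint-initaws.d.
      - ./scripts:/etc/localstack/init/ready.d
```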

Here I also have 2 bash scripts. It could be done with only one script, but as I explained before, I prefer to keep them separate.

The script below creates a profile and then sets it as the default. I'm setting it as the default here because then I don't have to specify it in every command.

The bash script to create a profile
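A sketch of such a script. The profile name localstack matches the --profile flag in the commands shown later; the dummy credentials, the region, and the way the default is set are my assumptions — the original gist may do this differently.

```shell
#!/bin/sh
# 01-create-profile.sh
# Create a "localstack" profile; LocalStack does not validate credentials.
aws configure set aws_access_key_id test --profile localstack
aws configure set aws_secret_access_key test --profile localstack
aws configure set region us-east-1 --profile localstack
# Copy the same values into the default profile so the remaining
# commands do not need an explicit --profile flag.
aws configure set default.aws_access_key_id test
aws configure set default.aws_secret_access_key test
aws configure set default.region us-east-1
```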

As in the first example, the second script creates a table, populates it with one item, and then lists this item. As you can see, here I'm using the endpoint-url in every command.

The bash script to create a table, put an item into the table, and scan the table
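A sketch mirroring the first example, with the same assumed table schema and sample item; the difference is that the endpoint URL is spelled out with every command, since inside the container every LocalStack service listens on port 4566.

```shell
#!/bin/sh
# 02-create-table.sh
# Inside the LocalStack container all services listen on port 4566,
# so the endpoint URL is repeated in every command.
aws dynamodb create-table \
    --table-name customer-loc \
    --attribute-definitions AttributeName=id,AttributeType=S \
    --key-schema AttributeName=id,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST \
    --endpoint-url=http://localhost:4566

aws dynamodb put-item \
    --table-name customer-loc \
    --item '{"id": {"S": "1"}, "name": {"S": "John Doe"}}' \
    --endpoint-url=http://localhost:4566

aws dynamodb scan --table-name customer-loc --endpoint-url=http://localhost:4566
```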

Example commands to execute against DynamoDB inside LocalStack.

1. From within the container. The AWS CLI doesn't need to be installed on your PC.
$docker container exec -it localstack sh
$aws dynamodb scan --table-name customer-loc --endpoint-url=http://localhost:4566 --profile=localstack

2. Set up a profile for the AWS CLI on your PC using the commands from the first script and then execute:

$aws dynamodb scan --table-name customer-loc --endpoint-url=http://localhost:4566 --profile=localstack

Summary:

This tutorial explained how to create a local DynamoDB instance in 2 different ways. You may be wondering which way is better. In my opinion, there are cases for both. If only DynamoDB is needed, the first way is better because it has a much faster startup time. But if you use other AWS services (e.g. S3, SQS, etc.) together with DynamoDB, or you don't have the AWS CLI installed on your PC, then you may choose the second one. The second way provides a wide variety of AWS services and an embedded AWS CLI. It is up to you to decide which one is better for your case.

Thank you for reading! If you have any questions or suggestions, please feel free to write me on my LinkedIn account.
