Docker for Developers Part 2

The Saga continues

This is a continuation of my previous post, which was an introduction to Docker geared toward developers. In the previous post we got somewhat familiar with the Docker ecosystem and built a very simple hello world application. In this part we are going to get DynamoDB running locally, run a one-off script to preload data, build a sample API service that reads from Dynamo, and finally a sample website that reads from that API service. As with the first part, you can retrieve a copy of the completed code base off GitHub or directly download the assets for part 2 here.

Running DynamoDB

Before we can really start building our API server, we need a place to store data. As we learned in the previous part, we can use docker-compose files to orchestrate the various services that make up our application. This could be a Postgres or MySQL instance; in this case, however, the application is going to leverage DynamoDB for storage. When developing the application it doesn't make sense to create an actual DynamoDB table on AWS, as you would incur costs for something that is only going to be used for development. Instead we can run DynamoDB locally in a container, though there are some caveats and limitations to this which you can read about here. Since we would never want a local DynamoDB container running in any environment other than development, go ahead and edit docker-compose.override.yml with the following:
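Here is a minimal sketch of that addition, assuming the version 2 compose format from part 1; keep whatever you already have for the movie-api service and add a dynamodb service alongside it. The amazon/dynamodb-local image and host port 8000 are assumptions (any local DynamoDB image will do, and it listens on port 8000 by default):

    version: "2"
    services:
      # ...your existing movie-api service from part 1 stays as-is...
      dynamodb:
        image: amazon/dynamodb-local
        ports:
          - "8000:8000"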

Save that file, run docker-compose up -d, and you should see output showing that the dynamodb container is running alongside our movie-api instance.
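For example (docker-compose ps just confirms that both containers are up):

    docker-compose up -d
    docker-compose ps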

Now that DynamoDB is running, we need to link our API container to it so the two can communicate. Our Compose file needs one small change:
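A sketch of that change, again assuming the version 2 format: since the dynamodb service only exists in development, the links entry goes on the movie-api service in docker-compose.override.yml, which lets the API container resolve the dynamodb hostname.

    # docker-compose.override.yml (fragment)
    services:
      movie-api:
        # ...existing dev settings from part 1...
        links:
          - dynamodb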

Rerun docker-compose up -d, which will recreate the movie-api-1 container.

Loading data into DynamoDB

Now that we have DynamoDB running, we can show how to run a one-off command to invoke a script that will seed data into Dynamo. First, we create the directory structure for storing our sample data and a place for our seed scripts. Within the movie-api directory create two new directories: one called scripts and the other data. We can use an Amazon-published sample data set for building this application. You can retrieve it here and extract it into the data directory.
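Something along these lines works; the archive and download path below (moviedata.zip in ~/Downloads) are assumptions, so use whatever the download actually gives you:

    cd movie-api
    mkdir scripts data
    # extract the downloaded sample data into the data directory
    unzip ~/Downloads/moviedata.zip -d data/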

The application directory structure should now look like this:
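Roughly like this; your tree may differ slightly depending on how you laid things out in part 1 (including any __init__.py package files), and the moviedata.json name comes from the extracted sample archive:

    movie-api
    ├── Dockerfile
    ├── app.py
    ├── requirements.txt
    ├── data
    │   └── moviedata.json
    ├── demo
    │   └── services
    │       └── api.py
    └── scripts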

Now, within the scripts directory, let's write a simple script for seeding this data into our local DynamoDB instance. Within the movie-api/scripts directory create a file called seed_movie_data.py and open it for editing.

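Here is a minimal sketch of such a script using boto3. The movies table name, the year/title key schema (which matches the Amazon sample data), the dynamodb hostname from the compose link, and the data file path inside the container are all assumptions; adjust them to your setup.

    #!/usr/bin/env python
    """Create the movies table in the local DynamoDB instance and seed it
    with the sample movie data."""
    import json
    from decimal import Decimal

    import boto3
    from botocore.exceptions import ClientError

    TABLE_NAME = 'movies'
    # Path to the extracted sample data; adjust to wherever the data
    # directory ends up inside your container.
    DATA_FILE = 'data/moviedata.json'

    # 'dynamodb' resolves via the compose link; the dummy credentials come
    # from docker-compose.override.yml and the region value is arbitrary.
    dynamodb = boto3.resource('dynamodb',
                              endpoint_url='http://dynamodb:8000',
                              region_name='us-east-1')


    def create_table():
        """Create the movies table if it does not already exist."""
        try:
            table = dynamodb.create_table(
                TableName=TABLE_NAME,
                KeySchema=[
                    {'AttributeName': 'year', 'KeyType': 'HASH'},
                    {'AttributeName': 'title', 'KeyType': 'RANGE'},
                ],
                AttributeDefinitions=[
                    {'AttributeName': 'year', 'AttributeType': 'N'},
                    {'AttributeName': 'title', 'AttributeType': 'S'},
                ],
                ProvisionedThroughput={'ReadCapacityUnits': 5,
                                       'WriteCapacityUnits': 5},
            )
            table.wait_until_exists()
        except ClientError as err:
            # Table already exists from a previous run; just reuse it.
            if err.response['Error']['Code'] != 'ResourceInUseException':
                raise
            table = dynamodb.Table(TABLE_NAME)
        return table


    def seed_data(table):
        """Batch-write every movie from the sample JSON file into the table."""
        with open(DATA_FILE) as handle:
            # DynamoDB cannot store floats, so parse them as Decimals.
            movies = json.load(handle, parse_float=Decimal)
        with table.batch_writer() as batch:
            for movie in movies:
                batch.put_item(Item={
                    'year': int(movie['year']),
                    'title': movie['title'],
                    'info': movie.get('info', {}),
                })
        return len(movies)


    if __name__ == '__main__':
        table = create_table()
        count = seed_data(table)
        print('Seeded %d movies into %s' % (count, TABLE_NAME))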

This script is pretty straightforward: it will create the table within DynamoDB if it doesn't exist, and then seed it with a little over 4,000 movies from the JSON file in our data directory.

Before we can run our script, we need to set a few environment variables on our movie-api instance. Go ahead and open up docker-compose.override.yml and adjust it to reflect the following:
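A sketch of the updated override file; the foo/bar values are intentional dummies, the region variable is an assumption (any value will satisfy boto), and your existing ports/volumes settings for movie-api from part 1 stay in place:

    version: "2"
    services:
      movie-api:
        # ...existing dev settings from part 1 (ports, volumes, etc.)...
        links:
          - dynamodb
        environment:
          - AWS_ACCESS_KEY_ID=foo
          - AWS_SECRET_ACCESS_KEY=bar
          - AWS_DEFAULT_REGION=us-east-1
      dynamodb:
        image: amazon/dynamodb-local
        ports:
          - "8000:8000"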

The AWS credentials do not need to be anything real; in fact, leave them as above (foo and bar). They just need to be set to prevent boto from barfing when connecting to the local DynamoDB instance. In a production setting we would leverage IAM roles on the instance to connect, which is far more secure than setting credentials via environment variables.

Once you have created the script and set the environment variables, we can run the following to recreate the container with the new environment variables and then run the script.
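For example (docker-compose exec assumes a reasonably recent Compose, and the script path assumes the project root is the container's working directory; docker-compose run --rm movie-api ... is an alternative):

    docker-compose up -d
    docker-compose exec movie-api python scripts/seed_movie_data.py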

As you can see, we ran the script that we created within the container; we do not need to have our Python environment set up locally. This is great, because the container's environment is isolated from every other application we may be developing (and even from other services within this application). This provides a high degree of certainty that if we deploy this image, it will function as we expect. Furthermore, we also know that if another developer pulls this code down and runs the project, it will work.

Building our API

Now that we have the data in our datastore, we can build a simple, lightweight API to expose this information to a client. To keep things simple, we are going to create a single endpoint which returns all the movies released in a specified year. Let's go ahead and open up movie-api/demo/services/api.py in our IDE and add a bit of code to make this happen.
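A minimal sketch of what that code could look like, assuming api.py already defines the Flask app from part 1, and reusing the table name, key schema, and endpoint assumptions from the seed script; the clean helper just converts DynamoDB Decimal values so jsonify can serialize the items.

    from decimal import Decimal

    import boto3
    from boto3.dynamodb.conditions import Key
    from flask import Flask, jsonify

    app = Flask(__name__)

    # Same assumptions as the seed script: linked hostname, dummy region.
    dynamodb = boto3.resource('dynamodb',
                              endpoint_url='http://dynamodb:8000',
                              region_name='us-east-1')
    table = dynamodb.Table('movies')


    def clean(value):
        """Recursively convert DynamoDB Decimal values into ints/floats."""
        if isinstance(value, list):
            return [clean(item) for item in value]
        if isinstance(value, dict):
            return {key: clean(item) for key, item in value.items()}
        if isinstance(value, Decimal):
            return int(value) if value % 1 == 0 else float(value)
        return value


    @app.route('/movies/<int:year>')
    def movies_by_year(year):
        """Return every movie released in the given year."""
        response = table.query(KeyConditionExpression=Key('year').eq(year))
        return jsonify(movies=clean(response.get('Items', [])))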

Save this and then we can try querying our service with a simple curl command:
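Something like the following, substituting your docker-machine IP and whichever host port your movie-api publishes (8080 here is an assumption):

    curl http://192.168.99.100:8080/movies/1993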

Building our Website

Now that we have an API that returns the data we need, we can create a simple web interface with a form that will present this data in a nice fashion. There are a few ways to make this happen; normally I'd recommend using something like Angular, however to further demonstrate container linking I will use a separate Flask app.

Within the movie-web directory we need to create our app skeleton. To simplify things, copy the Dockerfile, app.py and requirements.txt files from movie-api to movie-web. After copying those three files, go ahead and create the following directory structure and empty files so that the output of tree matches the below.
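Roughly like this (the __init__.py files are assumptions, mirroring whatever package layout you used in movie-api):

    movie-web
    ├── Dockerfile
    ├── app.py
    ├── requirements.txt
    └── demo
        ├── __init__.py
        └── services
            ├── __init__.py
            ├── site.py
            └── templates
                ├── index.html
                └── results.html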

In requirements.txt, remove the boto reference and add requests.
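So requirements.txt ends up looking something like this (pin versions however you did in part 1):

    flask
    requests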

Open up app.py and edit the third line to reflect the below:
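Assuming app.py from part 1 imports the Flask app from demo.services.api on that line, the movie-web copy just points at the site module instead:

    from demo.services.site import app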

Go ahead and create demo/services/site.py and open it up within your IDE.
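For now a tiny placeholder is enough, just so we have something to hit with curl once the container is running; we will flesh it out shortly:

    from flask import Flask

    app = Flask(__name__)


    @app.route('/')
    def index():
        return 'movie-web is alive'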

Now to get things running we just need to edit our docker-compose.yml and docker-compose.override.yml files.

docker-compose.yml:
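A sketch of the base file with the new movie-web service added; the movie-api entry stays as it was in part 1, and the build paths assume the two service directories sit next to the compose file:

    version: "2"
    services:
      movie-api:
        build: ./movie-api
      movie-web:
        build: ./movie-web
        links:
          - movie-api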

docker-compose.override.yml:
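And a sketch of the override file with movie-web's development settings added; host port 8081 is where we will browse to, while the container-side port (5000 here) and the /usr/src/app volume target are assumptions that need to match your Dockerfile:

    version: "2"
    services:
      movie-api:
        # ...existing dev settings from the earlier steps...
        links:
          - dynamodb
        environment:
          - AWS_ACCESS_KEY_ID=foo
          - AWS_SECRET_ACCESS_KEY=bar
          - AWS_DEFAULT_REGION=us-east-1
      movie-web:
        ports:
          - "8081:5000"
        volumes:
          - ./movie-web:/usr/src/app
      dynamodb:
        image: amazon/dynamodb-local
        ports:
          - "8000:8000"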

With those changes saved, we can run docker-compose up -d, which should launch our container. We can verify connectivity with curl.
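For example:

    curl http://192.168.99.100:8081/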

Now let's create a couple of templates and build out the endpoints on our webapp so that we can perform a simple search.

Open up demo/services/templates/index.html:
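A minimal sketch of the search form; the /search action and the year field name are assumptions that just need to match the site.py endpoints below:

    <!DOCTYPE html>
    <html>
      <head>
        <title>Movie Search</title>
      </head>
      <body>
        <h1>Search for movies by year</h1>
        <form action="/search" method="post">
          <input type="text" name="year" placeholder="e.g. 1993">
          <input type="submit" value="Search">
        </form>
      </body>
    </html>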

Open up demo/services/templates/results.html:
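And a minimal sketch of the results page, which simply lists the titles returned by the API:

    <!DOCTYPE html>
    <html>
      <head>
        <title>Results for {{ year }}</title>
      </head>
      <body>
        <h1>Movies released in {{ year }}</h1>
        <ul>
          {% for movie in movies %}
            <li>{{ movie.title }}</li>
          {% endfor %}
        </ul>
        <a href="/">Search again</a>
      </body>
    </html>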

And finally edit demo/services/site.py:
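A sketch of the finished module; the movie-api hostname comes from the compose link, while the container-side port (5000) and the /movies/<year> path must match what the API service actually exposes:

    import requests
    from flask import Flask, render_template, request

    app = Flask(__name__)

    # Reach the API over the compose link; the port is an assumption.
    MOVIE_API_URL = 'http://movie-api:5000'


    @app.route('/')
    def index():
        """Render the search form."""
        return render_template('index.html')


    @app.route('/search', methods=['POST'])
    def search():
        """Look up movies for the submitted year via the movie-api service."""
        year = request.form['year']
        response = requests.get('%s/movies/%s' % (MOVIE_API_URL, year))
        movies = response.json().get('movies', [])
        return render_template('results.html', year=year, movies=movies)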

Visit http://192.168.99.100:8081/ in your web browser and you should see the following:

[Screenshot: movie-db-index]

Enter 1993 and you should see the following results:

[Screenshot: movie-db-results]

At this point, that completes our application and this tutorial on Docker for developers. To recap: we installed and set up the Docker Toolbox on our machine. We then demonstrated how to use docker-machine, docker, and docker-compose to build a sample application that uses DynamoDB, an API service, and finally a web application to view a sample dataset. You should now be familiar with creating a Dockerfile to build an image, and with using Compose to orchestrate and run your application. You have defined environment variables, container links, and ports, and even leveraged the ability to map a volume, coupled with Flask's ability to reload on file changes, to rapidly speed up development.

One thing you may have noticed is that we spent most of the time dealing with application code, and not a whole lot of time working with Docker itself. That's kind of the point. One of the greatest strengths of Docker is that it simply gets out of your way. It makes it incredibly easy to rapidly iterate and start working on your application code.

While this wraps up my two-part series on Docker for developers, I'll be writing additional posts centered around Docker for QA and Operations Engineers. Those will focus on testing, CI/CD, production deployments, service discovery, and more.
