Docker Compose: Simplifying Multi-Container Deployments

Updated on Sep 28, 2023 17:41 IST

Docker Compose is a tool that simplifies the management of multi-container applications. Read on to learn about the key benefits of using Docker Compose.


Over the past several years, Docker has steadily gained popularity. One reason is that portable containers are easy to build and quick to deploy. According to the Docker website, a container bundles your code together with all of its dependencies so that the application runs consistently across many platforms.

You can run these containers natively on your Linux, Windows, and Mac devices, and big cloud platforms like AWS or Azure already support them. Docker can also be configured and used in almost any hosting environment. In this article, we will go further into one of the more advanced aspects of this topic: how to run multiple containers together.



What is Docker Compose?

It is a tool for defining and running multi-container Docker applications. It enables developers to quickly build, manage, and deploy complex applications made up of several processes, containers, and connections. With Docker Compose, developers specify every component of the application in a single "Compose file"; this file contains all the settings and prerequisites required to run the program.

Docker Compose uses a single YAML file to set up and run numerous containers. This is incredibly beneficial when you are working with a technology stack that includes several different technologies.

Say, for illustration’s sake, that the project you are working on uses .NET, NodeJS for real-time processing, a MySQL database, and Python for providing APIs.
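A stack like that can be captured in one Compose file. The sketch below is purely illustrative; the service names and image tags are hypothetical, not taken from a real project:

```yaml
# Hypothetical docker-compose.yml for the mixed stack described above
version: "3.9"
services:
  frontend:
    image: mcr.microsoft.com/dotnet/aspnet:6.0   # .NET component (assumed tag)
  realtime:
    image: node:18-alpine                        # NodeJS real-time processing
  db:
    image: mysql:8.0                             # MySQL database
    environment:
      MYSQL_ROOT_PASSWORD: example               # placeholder value
  api:
    image: python:3.11-slim                      # Python API service
```

A single `docker-compose up` would then start all four services together.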

Advantages of using Docker Compose for multi-container deployments

For developers and DevOps teams, using Docker Compose has several advantages, including:

  1. Easy management

Docker Compose lets you define many processes, containers, and connections in a single file, which simplifies the management of multi-container applications. This makes it easier to understand the relationships between an application's components and reduces the complexity of deploying and administering large apps.

  2. Increased collaboration

Collaborating on complicated applications is simpler when team members can view and edit Docker Compose files. This enables developers to work on different components of an application concurrently and makes it simpler to merge changes from several team members into a unified configuration.

  3. Easier deployment

Whether the target is a remote testing environment, a staging setup, or a local development process, Docker Compose makes it simple to deploy complicated apps. As a result, launching apps takes less time and effort, and teams find it simpler to scale their applications as needed.

  4. Replicable builds

Docker Compose makes it simple for developers to replicate the same environment across the development, staging, and production stages. This helps issues get found and fixed as early as possible and makes it easier to maintain consistency throughout all phases of the development process.
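One common way to get such replicable builds (a sketch of Compose's standard override mechanism; the values shown are hypothetical) is to keep a shared base docker-compose.yml and a docker-compose.override.yml that docker-compose up merges in automatically for local development:

```yaml
# docker-compose.override.yml (merged over the base file automatically)
services:
  web:
    environment:
      FLASK_ENV: development   # hypothetical dev-only setting
    volumes:
      - .:/code                # mount local source for live editing
```

Production deployments can skip the override by passing only the base file with -f docker-compose.yml.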


Basic Terminology

YAML files

In modern web applications, services and deployments are frequently described and configured using YAML (officially "YAML Ain't Markup Language"), a human-readable data serialization language. YAML files provide a systematic and comprehensible framework for defining a system's parts and configurations.
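For instance, a minimal YAML fragment (illustrative only, not tied to any particular tool) shows the two core structures, mappings and lists, expressed through indentation and dashes:

```yaml
# A mapping containing a nested mapping and a list
app:
  name: catalogue      # key: value pair
  ports:               # a list of values
    - 8000
    - 5000
```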


Containers

Containers are a compact form of virtualization that lets programs and the resources they require run in isolation. They give applications a stable and predictable environment, making software simpler to install and maintain. Containers are usually generated from images: pre-built snapshots of system components and settings. Because containers let you bundle your programs and resources into a single, self-contained unit, migrating and deploying your apps from development to production becomes simpler.


Volumes

Volumes store data created by a container so that it persists even if the container is removed or regenerated. They provide storage outside the container's own filesystem, enabling data to survive the deletion of the container. A volume can be shared across several containers and can live on the host or in a managed location. Using volumes to separate the data your instances create from the containers themselves makes that data easier to handle and preserve over time.
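In a Compose file, such persistent storage might look like the following sketch (a hypothetical fragment, not part of this tutorial's project):

```yaml
# Named volume keeping Redis data across container re-creation
services:
  redis:
    image: "redis:alpine"
    volumes:
      - redis-data:/data   # /data is where Redis writes its snapshots
volumes:
  redis-data:              # declared at the top level so it outlives the container
```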


Networks

Networks give the containers in a containerized program a way to communicate with one another and with the outside world. They offer a mechanism to restrict access to and from containers and a means to isolate containers from one another. By using networks to specify the connections and lines of communication between the parts of your containerized program, you make it simpler to maintain and expand the program over time.
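As a sketch (hypothetical service and network names), a Compose file can declare custom networks to keep a database reachable only from the back end:

```yaml
# Only services on the "backend" network can reach the db service
services:
  web:
    networks:
      - frontend
      - backend
  db:
    networks:
      - backend
networks:
  frontend:
  backend:
```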


Services

A service in a containerized application is a standalone piece of software that performs a specific job. Services typically run in containers and are managed by a tool such as Docker Compose or Kubernetes. Because each service can be deployed, configured, and updated individually, services enable the creation of scalable and adaptable applications.
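In Compose terms, each service is an entry under the top-level services key. A minimal sketch (hypothetical, with one service declared to start after another via depends_on):

```yaml
# Two services; Compose starts redis before web because of depends_on
services:
  web:
    build: .
    depends_on:
      - redis
  redis:
    image: "redis:alpine"
```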


Creating a Docker Compose Environment

Before we can begin using Docker Compose to build multi-container apps, you must have the following requirements in place:

  • Install Docker Engine.
  • Download Docker Desktop, following the specific steps for Mac, Windows, or Linux.
  • If you are unsure how to proceed with the Windows installation, choose the WSL 2 back-end rather than the Hyper-V back-end.
  • Install Docker Compose.
  • Docker Compose should already be included with Docker Desktop.
  • Check with the command docker-compose --version.
  • On Windows, make sure Docker Desktop has WSL integration turned on.

Although building and deploying a single Docker image can appear straightforward, managing several containers in one project soon becomes tiresome when you must repeatedly type the same terminal commands.

Example: Multi-container Flask application using Redis and Docker Compose

We will use a basic Python web application created with the Flask library, with Redis as its data store, to show how to deploy numerous containers using Docker Compose.

The Flask application will add some sample data to the Redis database on startup. This list of items will then be retrieved from Redis to complete a product catalogue.

After finishing, you will have successfully used Docker Compose to deploy a multi-container Flask application with Redis as the database. Note that this tutorial is not intended to produce all the assets or wire up all the services required for a fully functional product catalogue.

1. Set up a project directory

$ mkdir docker-flask-redis
$ cd docker-flask-redis

2. Create the Flask app.

In your project directory, create a file for the application code (app.py is the conventional name, since Flask's flask run command looks for it by default) and put the following into it:

# Create a simple Flask app
from flask import Flask, render_template
import redis
import json

app = Flask(__name__)

# Set up a Redis client to host our data
redis_client = redis.Redis(host='redis', port=6379)

# json.dumps converts our test data from a Python dictionary of products
# into a JSON string to store in our Redis database
redis_client.set('product', json.dumps([
    {'id': 5, 'name': 'Drilling Machine', 'barcode': '406780655784', 'price': 800},
    {'id': 7, 'name': 'Press Machine', 'barcode': '522687161043', 'price': 600},
    {'id': 6, 'name': 'Desktop', 'barcode': '757543429691', 'price': 350}
]))

# Retrieve the product JSON from Redis to use in our application
items = json.loads(redis_client.get('product'))

# Set up routing as normal
@app.route('/')
def index():
    return render_template('index.html', items=items)
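The json.dumps/json.loads round trip used above can be tried on its own, without Redis; this is plain standard-library behaviour:

```python
import json

# Same shape of test data as the catalogue above
products = [
    {'id': 5, 'name': 'Drilling Machine', 'barcode': '406780655784', 'price': 800},
    {'id': 7, 'name': 'Press Machine', 'barcode': '522687161043', 'price': 600},
]

payload = json.dumps(products)    # a JSON string, the form stored as the Redis value
restored = json.loads(payload)    # back to a list of Python dicts

assert isinstance(payload, str)
assert restored == products
```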

In your project directory, make a new file named prerequesites.txt listing the Python packages the app needs, so the Dockerfile can install them:

flask
redis

Next, create a templates folder and inside it a file named index.html for the product table, then put the following into it:

<!DOCTYPE html>
<html lang="en">
<head>
<link href="" rel="stylesheet" integrity="sha384-EVSTQN3/azprG1Anm3QDgpJLIm9Nao0Yz1ztcQTwFspd3yD65VohhpuuCOmLASjC" crossorigin="anonymous">
</head>
<body>
<h1>Product Catalog</h1>
<table class="table table-hover">
<tr>
<th scope="col">ID</th>
<th scope="col">Name</th>
<th scope="col">Barcode</th>
<th scope="col">Price</th>
<th scope="col">Preferences</th>
</tr>
{% for item in items %}
<tr>
<td>{{ item.id }}</td>
<td>{{ item.name }}</td>
<td>{{ item.barcode }}</td>
<td>${{ item.price }}</td>
<td> <button class="btn btn-outline btn-info">More Info</button> <button class="btn btn-outline btn-success">Buy this Product</button> </td>
</tr>
{% endfor %}
</table>
</body>
</html>

Although a full Flask walkthrough is outside the purview of this tutorial, I have added comments throughout the code to illustrate how to set up a Python Flask app and connect it to Redis.

3. Make a Dockerfile.

If you were to run your Flask application by hand, you would have to set the working directory in your terminal, provide your configuration settings, and enter everything else required to deploy the app.

All of those steps can instead be described in a single document using the Dockerfile's handy shorthand. Docker uses this file to build your web app image.

To launch your application, create a Dockerfile:

FROM python:3.7-alpine
WORKDIR /code
ENV FLASK_RUN_HOST=0.0.0.0
COPY prerequesites.txt prerequesites.txt
RUN pip install -r prerequesites.txt
COPY . .
CMD ["flask", "run"]

4. Services definition in a Compose file

Make a docker-compose.yml file as follows:

version: "3.9"
services:
  web:
    build: .
    ports:
      - "8000:5000"
    volumes:
      - .:/code
    environment:
      FLASK_ENV: development
  redis:
    image: "redis:alpine"

The Compose file tells Docker how our two services fit together. Each service uses its own container image.

Our docker-compose.yml file’s two services are as follows:

  • A web service that builds its Docker image from the Dockerfile in the current project directory and runs the Flask web server on port 5000, published as port 8000 on the host.
  • A redis service that runs a Redis server from the public redis:alpine image.

We are prepared to launch our multi-container application after we have described how our two containers will work together.

5. Create and utilize the app using Compose

With Docker Desktop running in the background, type docker-compose up in the console to deploy your Python program.

If the command keeps running without exiting, the deployment was successful. The output will look similar to this:

Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
docker-flask-redis-web-1 | * Debugger PIN: 141-969-678
docker-flask-redis-web-1 | * Detected change in '/code/', reloading
docker-flask-redis-web-1 | * Restarting with stat
docker-flask-redis-web-1 | * Debugger is active!
docker-flask-redis-web-1 | * Debugger PIN: 141-969-678
docker-flask-redis-web-1 | - - [13/Mar/2023 20:14:44] "GET / HTTP/1.1" 200 -
docker-flask-redis-redis-1 | 1:M 13 Mar 2023 21:09:23.032 * 1 changes in 3600 seconds. Saving...
docker-flask-redis-redis-1 | 1:M 13 Mar 2023 21:09:23.033 * Background saving started by pid 15
docker-flask-redis-redis-1 | 15:C 13 Mar 2023 21:09:23.116 * DB saved on disk
docker-flask-redis-redis-1 | 15:C 13 Mar 2023 21:09:23.116 * Fork CoW for RDB: current 0 MB, peak 0 MB, average 0 MB
docker-flask-redis-redis-1 | 1:M 13 Mar 2023 21:09:23.135 * Background saving terminated with success

To view a list of all the containers currently running on the Docker engine, open a new terminal and execute docker-compose ps. You should see two services: one for the Redis image and one for the newly built Flask web application.

Lastly, to view the containers currently operating on your system, you can launch the Docker Desktop GUI:


6. Check out your app in your browser.

It’s time to check whether our multi-container application works as intended, the moment you have been preparing for.

Using your browser, enter http://localhost:8000/ to check whether the app is running.

Based on the log above, the multi-container application consisting of Flask and Redis was built and deployed successfully. A recent change to the application file triggered a restart of the Flask web application, and the Redis container performed a background save without any issues.

The Flask web application is likely the main component of this multi-container setup. It’s a Python-based web framework that enables the rapid development of web applications. Flask provides various features like URL routing, template rendering, and database integration, making it an ideal choice for building web applications.

On the other hand, Redis is a high-performance in-memory data structure store that serves as the backend for this Flask application. It is a popular choice among developers due to its ability to cache frequently requested data, allowing faster access and reduced response times. Redis provides various data structures such as strings, hashes, lists, and sets, making it an ideal choice for storing and retrieving complex data.

As shown in the screenshot below, the Redis container is up and running without any issues:

![Redis Container Screenshot](redis_container_screenshot.png)



Together, Flask and Redis give the application efficient data retrieval and processing: Flask supplies the routing and template rendering, while Redis caches frequently requested data for faster access and reduced response times. By leveraging both in a multi-container setup, developers can build robust, scalable web applications that stay responsive under the demands of modern users.

Using Flask App For Creating Web Applications

The Flask app is a popular web development framework for creating web applications in Python. It is lightweight and easy to use, making it a favorite among developers. With Docker Compose, deploying a Flask app becomes even simpler, as you can easily define the different services required for the app, such as the web server and database, in the docker-compose.yml file. This allows for easy containerization and deployment of the entire application stack. 

Whether developing a small personal project or a large enterprise application, Docker Compose with Flask can greatly simplify your development process. Redis is an in-memory data structure store used as a database, cache, and message broker. Its simplicity and efficiency make it a popular choice for backend development.

Multi-container systems can be challenging to deploy and manage, but Docker Compose simplifies this process significantly. This tool makes creating, installing, and operating applications with multiple containers easy. 

For instance, if you are developing a Flask app that relies on a Redis backend, Docker Compose can be incredibly useful. Without it, you would have to manually install and configure Flask and Redis on your machine or server, a tedious task that takes up much of your time.

Now let me guide you through a demo of how a multi-container setup is useful in this scenario.

The Flask and Redis multi-container application allows efficient data retrieval and processing, resulting in a smooth user experience. Let’s follow these steps to see it in action:

1. Open up your terminal and navigate to the project directory that contains the docker-compose.yml file.

2. Execute the following command to start the multi-container setup:

docker-compose up

3. Once all the containers are running smoothly, open your web browser and visit http://localhost:8000/ to access the web application.

4. You can now browse the product catalogue on the website. As you do so, you will notice how Redis efficiently serves the cached data, resulting in fast retrieval and processing times.

5. This experience is made possible by the multi-container setup, which allows Flask and Redis to work together smoothly and efficiently.

Here’s a screenshot of http://localhost:8000/ after successfully following the above steps:


Using Docker Compose, you have successfully deployed your first multi-container application. If you have experience with Flask web programming, you’re in a terrific position to create your own microservices application. 

The next step is to rework and expand your Python application to incorporate the data models and distinct user and product services required to make the newly generated product catalogue interactive and useful.


Key Takeaways

This post covered Docker Compose’s fundamentals, including its ideas and lingo, setup and installation procedures, and numerous commands and methods. We discussed the advantages of utilizing Docker Compose, such as its support for networking, volumes, scalability, and load balancing, as well as its potential to simplify the deployment and maintenance of multi-container applications.

Put your Docker knowledge to use in web development. Once you get the hang of streamlining container deployment, it won't take long for you to appreciate Docker Compose's potential for optimizing your developer workflow.

If you frequently find yourself entering the same instructions into a console, look over the Docker manual to see if there are any procedures you can automate using a docker-compose.yml file.

Docker Compose keeps growing and evolving, with new features and improvements added frequently. Faster performance, stronger security features, and better support for multi-node deployments are just a few of the improvements on the horizon.

I urge you to use Docker Compose for your own projects since it is a powerful resource for installing and administering multi-container systems. Whether you are a novice or a seasoned user, Docker Compose will make it simple to create, install, and operate your applications effectively and efficiently. Why not test it and see if it can help your projects?

Contributed by – Furkan Khan
