The Software Design School (SDS) Toolbox is a collection of guides and resources to help you get started with the various tools and technologies used in software engineering.
This guide aims to enable you to use Docker as an integral part of the development process for a Node.js web application. The focus is on providing the skills necessary to compile, run, and manage code with Docker containers, highlighting Docker’s role as a versatile development toolbox that enhances workflow consistency and efficiency.
Docker is an open platform for developing, shipping, and running applications.
Docker allows you to separate your applications from your infrastructure so you can deliver software quickly. With Docker, you can manage your infrastructure in the same ways you manage your applications.
By taking advantage of Docker’s methodologies for shipping, testing, and deploying code quickly, you can significantly reduce the delay between writing code and running it in production.
Reference Docker Docs
Node.js is a powerful, open-source JavaScript runtime environment that enables developers to execute JavaScript code server-side. Renowned for its efficiency and scalability, Node.js operates on the V8 engine, allowing for high-speed execution of JavaScript outside the web browser.
It adopts an event-driven, non-blocking I/O model, making it particularly well-suited for building scalable network applications like web servers, real-time communication systems, and API services.
Node.js ships with npm, a package manager that enriches its ecosystem with a vast array of libraries and tools. This cross-platform environment is favored for its ability to handle concurrent requests efficiently, making it a popular choice for modern web development, especially in applications requiring real-time capabilities.
With a robust community, Node.js has become a staple in the technology stacks of many companies and developers worldwide.
Reference Node.js Website
The objective of this tutorial is to empower you with the ability to integrate Docker effectively into the development process of a Node.js web application. Here’s an overview of why Docker and Node.js are pivotal in this hands-on session:
Enhancing Development and Workflow Consistency: Docker’s role as a containerization platform is critical in establishing a consistent, efficient workflow. By focusing on Docker, you learn to compile, run, and manage Node.js code with containers, ensuring that the development environment is replicable and consistent across any platform.
Practical Application in Course Projects: While not mandatory, the skills acquired in this session can greatly benefit you in the course-related projects and assignments, especially if you choose to utilize Node.js and Docker.
Leveraging Seamless Integration with Development Tools: Docker’s compatibility with a range of development tools, like Nodemon for Node.js, exemplifies its role in streamlining the development process. These tools automate and simplify tasks, enhancing the overall efficiency of developing, testing, and debugging Node.js applications.
Benefiting from Vast Community Support: Both Docker and Node.js are supported by robust online communities. This vast network offers an abundance of resources, guidance, and shared knowledge, which you can leverage for troubleshooting, learning best practices, and keeping up-to-date with the latest advancements in web development.
Generated with the help of ChatGPT
Ensure that you have Docker set up and running in your system.
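Before continuing, it helps to confirm the installation from a terminal. The snippet below is a minimal sketch that checks whether the Docker CLI is on your PATH and whether the daemon is reachable:

```shell
# Check whether the Docker CLI is installed and the daemon is reachable
if command -v docker >/dev/null 2>&1; then
  docker --version
  if docker info >/dev/null 2>&1; then
    echo "Docker daemon is running"
  else
    echo "Docker CLI found, but the daemon is not running - start Docker Desktop"
  fi
else
  echo "Docker CLI not found - install Docker first"
fi
```

If the daemon is not running, start Docker Desktop (or the `docker` service on Linux) before proceeding.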
Here are some common terminologies used in Docker that you should familiarise yourselves with:
| Term | Description |
|---|---|
| Docker Daemon | Listens to Docker API requests. Manages Docker objects - images, containers, networks and volumes. It can also communicate with other daemons. |
| Docker Image | Read-only templates used to create Docker containers. You can create your own image or use pre-existing ones. |
| Docker Container | A runnable instance of an image. You can create, start, stop, move, or delete a container using the Docker API or CLI. |
| Docker Registry | A repository for Docker images. Docker Hub is the default registry. The docker pull and docker run commands fetch the required images from the configured registry. |
| Docker Client | The primary way users interact with Docker. It sends commands to the Docker Daemon and can communicate with more than one daemon. |
| Docker Desktop | A GUI tool that includes the Docker Daemon, Client, Docker Compose, Content Trust, Kubernetes, etc. |
| Docker Objects | Images, containers, networks, volumes, plugins, etc. |
Referenced from SDS SE Toolbox - Containerization
| Instruction | Usage |
|---|---|
| FROM | Specifies the base image to start building your image. For example, FROM ubuntu:18.04 starts with the Ubuntu 18.04 image. |
| RUN | Executes a command and commits the results. Used for installing software packages, for example. |
| COPY and ADD | Both are used to copy files from the host filesystem to the container. COPY is straightforward, while ADD has some extra features like remote URL support and tar extraction. |
| CMD | Provides a command and its default arguments that will be executed when the container starts. Only the last CMD instruction is effective. |
| ENTRYPOINT | Similar to CMD, but meant to define the container’s main executable; arguments from CMD (or the command line) are appended to the entrypoint. |
| ENV | Sets environment variables. |
| EXPOSE | Indicates which ports the container listens on. |
| WORKDIR | Sets the working directory for any RUN, CMD, ENTRYPOINT, COPY, and ADD instructions. |
A part of this table was generated with the help of ChatGPT
Common Dockerfile instructions include FROM, RUN, COPY, ENTRYPOINT, etc. Here is an example Dockerfile:

# Use an official Node.js runtime as a parent image
FROM node:24-alpine
# Set the working directory in the container
WORKDIR /app
# Copy package.json and package-lock.json
COPY package*.json ./
# Install any dependencies
RUN npm install
# Bundle app source inside the Docker image
COPY . .
# Make port 3000 available to the world outside this container
EXPOSE 3000
# Define the command to run your app
CMD ["node", "app.js"]
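Since COPY . . copies the entire build context into the image, it is worth pairing a Dockerfile like this with a .dockerignore file. A minimal sketch for a Node.js project (the exact entries depend on your repository layout):

```
node_modules
npm-debug.log
.git
Dockerfile
.dockerignore
```

Excluding node_modules is especially important: dependencies are installed inside the image by RUN npm install, so copying the host's copy only bloats the build context.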
Every Dockerfile must begin with a FROM statement. Use a .dockerignore to avoid copying unnecessary files into your Docker image.

docker run

- -d: Run container in detached mode (in the background).
- --name [name]: Assign a name to the container.
- -p [host port]:[container port]: Map a host port to a container port.
- -v [host directory]:[container directory]: Mount a volume.

Example: docker run -d --name my-app -p 3000:3000 my-node-app

This runs the my-node-app image in detached mode, names it my-app, and maps port 3000 on the host to port 3000 on the container.

docker ps

- -a: Show all containers (default shows just running).

Example: docker ps -a

docker images

- -a: Show all images (default hides intermediate images).
- --format "{{.ID}}: {{.Repository}}": Custom format for listing.

Example: docker images --format "{{.ID}}: {{.Repository}}"

docker pull

Example: docker pull node

docker build

- -t [name]:[tag]: Name and optionally a tag in the ‘name:tag’ format.
- --file [Dockerfile path]: Specify the location of the Dockerfile.

Example: docker build -t my-node-app:1.0 .

docker exec

- -it: Interactive terminal.
- --user [username]: Specify the username to use inside the container.

Example: docker exec -it mynginx /bin/bash

docker stop

Example: docker stop my-app

docker rm

- -f: Force the removal of a running container.

Example: docker rm my-app

docker rmi

- -f: Force removal of the image.

Example: docker rmi my-node-app:1.0

docker logs

- --follow: Follow log output.
- --tail [number]: Number of lines to show from the end of the logs.

Example: docker logs --tail 100 my-app

docker network create

- --driver: Specify network driver (e.g., bridge, overlay).

Example: docker network create --driver bridge my_bridge_network

docker volume create

Example: docker volume create my_volume

Download or clone the tutorial repository and open the demo-app directory (DockerHandsOnTutorial/demo-app). The Dockerfile there is currently empty. The aim of this hands-on is to teach you how to write a Dockerfile.
Since we are building a Node.js application using React, we need a runtime environment that has our desired version of Node.js installed. We therefore use the official Node image as the parent image to achieve this.
Add the following line to the Dockerfile:
FROM node:24-alpine
Next, we specify the working directory in the container.
Add the following line to the Dockerfile:
WORKDIR /app
In the case of Node.js applications, we first copy the package.json and package-lock.json (or yarn.lock) files, as these list our app’s dependencies. npm relies on these files to look up and install the dependencies.
Add the following line to the Dockerfile:
COPY package*.json ./
After the files have been copied, the dependencies have to be installed; running the command npm install installs the required node modules (dependencies).
Add the following line to the Dockerfile:
RUN npm install
After all dependencies have been installed, we copy the entire source directory into the working directory of the container; this includes all of the relevant code.
Add the following line to the Dockerfile:
COPY . .
Next, we want to be able to interact with our app. React-based apps start on port 3000 by default, so we expose that port. This allows the outside world to reach the created container, and our app, on port 3000.
Add the following line to the Dockerfile:
EXPOSE 3000
Now that everything is set up, we want to start our app. React-based apps are started with the command npm start, so that is the command we add next.
Add the following line to the Dockerfile:
CMD ["npm", "start"]
Finally, the created Dockerfile should look like this:
FROM node:24-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
Now that the Dockerfile is set up, we have a skeleton for the image of our app.
Using the docker build command, we will now create an image of our app.
Open a command line/terminal window and navigate to the downloaded/cloned repo, and then into the demo-app directory where the Dockerfile is located.
Run the following command:
docker build -t docker-demo-app .
The -t flag tags the created image with the name docker-demo-app. The trailing . tells Docker to look for the Dockerfile in the current working directory (CWD).

With the image ready, we can run our React app using Docker.
In the same command line/terminal window, run the following command:
docker run --name my-app -p 3000:3000 -d docker-demo-app
- -p binds port 3000 on our system to port 3000 of the container.
- -d runs the Docker container in detached mode.

Open your browser of choice and go to http://localhost:3000/ to interact with the React App.
The running container can be stopped and removed using the command:
docker stop my-app
docker rm my-app
Binding the current working directory to the Docker container allows you to edit your code on your local system and see the changes in real time in the app running in the container.
Note: Ensure you have the folder node_modules with all the necessary dependencies in your local system. If not, run npm install locally.
In the same command line/terminal window, run the following command:
macOS users:
docker run --name my-app -p 3000:3000 -v "$(pwd):/app" -d docker-demo-app
Windows Command Line users:
docker run --name my-app -p 3000:3000 -v "%cd%:/app" -d docker-demo-app
Windows Powershell users:
docker run --name my-app -p 3000:3000 -v ${PWD}:/app -d docker-demo-app
- -v mounts the current working directory as a volume in the Docker container.

Go to src/App.js and add the following code inside the function App():
function toggleAnimation() {
var logo = document.querySelector(".App-logo-clockwise");
var isLogoRotatingClockwise = logo !== null;
if (isLogoRotatingClockwise) {
logo.classList.remove("App-logo-clockwise");
logo.classList.add("App-logo-anti-clockwise");
} else {
logo = document.querySelector(".App-logo-anti-clockwise");
logo.classList.remove("App-logo-anti-clockwise");
logo.classList.add("App-logo-clockwise");
}
}
In the same file, add the following code inside the <header> tag, right after the <p> tag:
<button className="spin-btn" onClick={() => toggleAnimation()}>
Toggle Spin Direction
</button>
Press CTRL/CMD + S to save the file.
You should now be able to see a button that says “Toggle Spin Direction”, which on clicking will change the spin direction of the React logo.

The running container can be stopped and removed using the command:
docker stop my-app
docker rm my-app
:warning: Note: If you do not bind volumes, such changes cannot be seen in real time; the containers have to be stopped and run again, making the process tedious. Please refer to the section on Binding the Current Working Directory.
Create a Dockerfile for demo-service based on the following inputs:
- Base image: node:24-alpine
- Working directory: /app
- Exposed port: 3001
- Start command: npm start

Build and run the container.
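If you get stuck, the Dockerfile can mirror the one built for demo-app. Here is a sketch matching the inputs above (assuming demo-service is also started with npm start, as stated):

```
FROM node:24-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3001
CMD ["npm", "start"]
```

Try writing it yourself first; the only differences from the demo-app Dockerfile are the exposed port and (potentially) the start command.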
Go to http://localhost:3001/ and you should see the following output:

Once completed, stop and remove the container.
Docker Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application’s services, which simplifies the process of managing and deploying multi-container applications.
Generated with the help of ChatGPT
docker-compose.yml

services:
web: # the name of the first service
image: nginx:latest # uses the latest Nginx image
ports:
- "80:80" # maps port 80 of the container to port 80 on the host
volumes:
- ./html:/usr/share/nginx/html # mounts the 'html' directory from the host to the container
networks:
- webnet # links this service to the network named 'webnet'
db: # the name of the second service
image: postgres:latest # uses the latest PostgreSQL image
environment:
POSTGRES_DB: mydatabase # sets the PostgreSQL database name
POSTGRES_USER: user # sets the PostgreSQL user
POSTGRES_PASSWORD: password # sets the PostgreSQL password
volumes:
- db-data:/var/lib/postgresql/data # mounts the 'db-data' volume to the container
networks:
- webnet # links this service to the network named 'webnet'
volumes:
db-data: # declares a volume named 'db-data' for persistent data storage
networks:
webnet: # declares a user-defined network named 'webnet'
docker-compose up

- -d: Detached mode: Run containers in the background.
- --build: Build images before starting containers.
- --no-deps: Don’t start linked services.
- --force-recreate: Recreate containers even if their configuration and image haven’t changed.

Example: docker-compose up -d

docker-compose down

- --volumes or -v: Remove named volumes declared in the volumes section of the Compose file and anonymous volumes attached to containers.
- --rmi type: Remove images, type may be all or local.

Example: docker-compose down --volumes

docker-compose build

- --no-cache: Do not use cache when building the image.

Example: docker-compose build --no-cache

docker-compose logs

- --follow or -f: Follow log output.
- --tail [number]: Number of lines to show from the end of the logs for each container.

Example: docker-compose logs -f

docker-compose ps

- --services: Display services.
- --all or -a: Show all stopped containers.

Example: docker-compose ps

docker-compose restart

- -t or --timeout: Specify a shutdown timeout in seconds (default is 10).

Example: docker-compose restart

docker-compose stop

- -t or --timeout: Specify a shutdown timeout in seconds.

Example: docker-compose stop

docker-compose start

Example: docker-compose start

docker-compose exec

- -d: Detached mode: Run command in the background.
- --user [USER]: Run the command as this user.

Example: docker-compose exec -d myservice /bin/bash

docker-compose rm

- -f or --force: Don’t ask to confirm removal.
- -v: Remove any anonymous volumes attached to containers.

Example: docker-compose rm -f

Open a command line/terminal window and navigate to the root directory of the DockerHandsOnTutorial repository that you’ve downloaded or cloned. This directory contains the docker-compose.yml file.
Execute the command below to build all the services, networks, and volumes defined in your docker-compose.yml file. This process is efficient as it doesn’t necessitate running individual build commands for each service.
docker-compose build --no-cache
--no-cache forces Docker to rebuild all images from scratch, ignoring any cached layers from previous builds.

Executing the following command runs all the services defined in the docker-compose.yml file.
docker-compose up -d
- -d runs all containers in detached mode.

Open your browser of choice and go to http://localhost:3000/ to interact with the React App.
The running containers can be stopped and removed using the command:
docker-compose down
Edit the docker-compose.yml file and add demo-service as a service in the file.
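As a hint, a service entry for demo-service might look like the sketch below. The build context path ./demo-service and the 3001:3001 port mapping are assumptions based on the earlier exercise; adjust them to match the repository layout:

```
  demo-service:
    build: ./demo-service
    ports:
      - "3001:3001"
```

This entry goes under the services: key of docker-compose.yml, alongside the existing services.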
Build and Run the containers using docker-compose commands.
Go to http://localhost:3000/jokes and you should see the following output:

Feel free to add and edit code to see changes happen in real-time.
As we conclude this tutorial, it’s clear that integrating Docker into the Node.js development process brings substantial benefits, streamlining and refining how we build, run, and manage applications. Key highlights include:
Resolving “Runs on My Machine” Dilemma: Docker stands out in its ability to eliminate the all-too-common “runs on my machine” issue. By containerizing the Node.js environment, Docker ensures consistent behavior across different development and production settings, thereby promoting reliability and reducing environment-specific bugs.
Streamlined Dependency Management: A major advantage of using Docker is the abstraction of dependency management. Containers come pre-packed with all necessary dependencies, obviating the need for individual installations. This not only simplifies the setup but also avoids conflicts arising from mismatched dependency versions.
Immediate Project Ramp-Up: Docker’s containerization allows developers to bypass the time-consuming setup process traditionally associated with development environments. With everything encapsulated in a container, starting a new project or joining an existing one becomes a quick and straightforward process.
Simplified Collaboration and Distribution: Sharing a project is as simple as sharing its Dockerfile and docker-compose file. This ease of distribution ensures that team members can replicate the exact development environment effortlessly, paving the way for seamless collaboration and consistency across teams.
Generated with the help of ChatGPT
You may learn more about these topics; plenty of resources are available all over the internet.
Some parts of this guide were structured, formatted, and refined with the assistance of ChatGPT. The model was used to draft technical explanations and generate code snippets. All code snippets used in the guide and command sequences were reviewed, implemented, and tested by the teaching team to ensure accuracy and functionality.