What is Docker? A Brief Introduction

If you are a developer or work in tech, you have probably heard the buzzword Docker and wondered what it is. Docker is revolutionizing the software development and delivery process, and the buzz keeps growing because more and more companies are adopting Docker at a remarkable rate.

The company behind Docker has always described it as a solution to the classic problem: "it works on my machine."

What is Docker?

According to Wikipedia:

Docker is a set of platform-as-a-service (PaaS) products that use OS-level virtualization to deliver software in packages called containers. Containers are isolated from one another and bundle their own software, libraries and configuration files; they can communicate with each other through well-defined channels. All containers are run by a single operating-system kernel and are thus more lightweight than virtual machines.

Let me put this in simple words for you: Docker is an open-source engine that automates the deployment of applications using containers.

Docker was introduced in 2013 by Solomon Hykes, founder and CEO of the company then called dotCloud.

Docker is a container management system that makes it easier to work with Linux containers. Docker lets you create images on your desktop computer or laptop and then run commands against them. The actions you perform on containers running in your local environment are the same commands or actions you will run against those images when they are running in a production environment.

This means you don't have to do things differently when you go from a development environment to a production environment.
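
As a tiny sketch of that idea, assuming Docker is already installed, the very same command works on a laptop and on a production server (hello-world is Docker's official test image):

```sh
# Pull and run Docker's test image; the command is identical
# on a developer laptop and on a production host.
docker run --rm hello-world
```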

Understand Docker using Problem Cases

Before we explore what Docker is and how it can benefit you as a developer or a development company, let's review the problems that developers and organizations faced every day before adopting Docker.

Even today, with all the power of cloud computing, it is still very easy for a developer's working environment to differ from the final production environment.

Developer’s Problem

Personally, I found myself in this situation every time I had to deploy an application, ending up fixing production environment issues.

For example, if we are using a Windows or macOS based development environment, PHP will probably not be running the same version as on the Linux server that hosts our production code. Even if the versions match, you might have to deal with differences in the configuration and dependencies under which PHP runs.

There are also inevitable differences in the way file permissions are handled on different operating systems, to name just one potential problem.

All the frustration starts when a developer deploys the code to the production server and it doesn't work. So, should the production environment be configured to match the developer's environment, or should developers only do their work in an environment that matches production?

In an ideal world, everything should be consistent, from the developer's laptop all the way through to the production servers. In reality, every developer has their own way of working and their own preferences, and enforcing consistency across all platforms is difficult to achieve.

Docker Solution for Developers

Whether you are using macOS, Windows, or a Linux based operating system, you can wrap your code in a container with predefined configurations, or in one created by your team using a Dockerfile. You can then continue to use your favorite IDE and maintain your workflow while working on your codebase.
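
As a minimal sketch of that workflow, assuming a PHP application whose sources live in the current directory (the image tag my-app and the base image php:8.2-apache are illustrative choices):

```sh
# Write a minimal Dockerfile (base image and paths are illustrative).
cat > Dockerfile <<'EOF'
FROM php:8.2-apache
COPY . /var/www/html/
EOF

# Build an image from it, then run a container on local port 8080.
docker build -t my-app .
docker run --rm -p 8080:80 my-app
```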

Organization’s Problem

Large organizations suffer from problems that are critical to their business and that occur on a much larger scale, which involves a different kind of risk as well. Because of the potential impact on sales and reputation, companies need to test and verify every deployment before its release.

This means that new features and bug fixes get stuck in a holding pattern while the following takes place:

  • Applications are deployed across the newly configured environments
  • Test plans are executed and the application configurations are tweaked until the tests pass
  • Requests for changes are written, submitted and discussed to get the application deployed to the production server

This process can take days, weeks or even months depending on the complexity of the application. While this process is meant to safeguard the continuity and availability of the enterprise at a technological level, it potentially introduces risk at the business level.

Imagine you have a feature stuck in this holding pattern and your competitor releases a similar or identical feature ahead of you. Your company's sales and reputation will be at stake because of the delay.

Docker Solution for Organizations

Docker provides an extra shield against such holding patterns and makes things a lot easier, because your company's processes work consistently. This means your developers are working with the same configuration that is running on the production server.

For example, in a real-world scenario, when a developer checks in code that works in their local environment, your testing tool can launch the same container to run the automated tests. When all tests pass, the testing container can be removed to free up resources for the next batch of testing.
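
A hedged sketch of that step, assuming the my-app image from earlier and a PHPUnit test suite vendored into the image (the test command is illustrative):

```sh
# Run the test suite inside the same image used in production.
# --rm removes the container afterwards, freeing resources.
docker run --rm my-app vendor/bin/phpunit
```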

This makes your entire testing process a lot more flexible and consistent: you can reuse the same environment instead of redeploying servers for the next batch of testing.

The quicker you can complete this process, the quicker you can confidently launch new features and keep ahead of your competitors.

Dedicated Host vs Virtual Machine vs Docker Containers

So, by now you should have an idea of what Docker is. Before we look at Docker in more detail, let's see how a Docker container differs from a dedicated host and a virtual machine.

Dedicated Host

A dedicated host is a dedicated server with dedicated resources to a single client and not shared among other clients.

[Diagram: Dedicated Host]

As you can see in the diagram above, we have hardware infrastructure running a single operating system. On this operating system we have a fixed software stack, on top of which our three applications run. If we want to run one application on Windows and two applications on a Linux based operating system, we can't do that on this dedicated host, as it has just one operating system; we are locked in.

Virtual Machines

A virtual machine is an emulation of a real computer that runs on top of a physical machine using a hypervisor.

[Diagram: Virtual Machines]

A hypervisor is software that runs on a physical computer called the host machine. The host machine provides resources to the VMs (virtual machines), such as RAM, CPU, and storage. These resources can be divided among the VMs, so if one VM is running a heavy application you can reallocate resources to it from another VM that is not running a memory-consuming application.

The VM running on the host machine using the hypervisor is called a guest machine. This guest machine contains both the application and whatever needed to run this application (OS, libs, etc.).

In the virtual machine diagram above, we have our hardware infrastructure running a host operating system, with a hypervisor running on that operating system. On top of this we can run three different virtual machines, each using a different OS, with everything needed to run our applications.

Docker Containers

Docker containers provide operating system-level virtualization, and they are not the same as VMs. Containers have their own private compute resources, can execute commands at the root level, have their own network interfaces, and so on. Docker adds an application deployment engine on top of a virtualized container execution environment.

[Diagram: Docker Containers]

If you compare the Docker container diagram above with the dedicated host and virtual machine diagrams, you will find that the main difference is the operating system. The biggest benefit of using Docker is that there is no need for a complete operating system every time we want to fire up a new container, which cuts down the overall size of a container.

Since almost all versions of Linux share the standard kernel model, Docker containers rely on the host operating system's Linux kernel, regardless of the distribution the image was built upon, such as Red Hat, CentOS, or Ubuntu.

In the diagram above, the applications marked in orange run on Ubuntu while the application marked in green runs on CentOS, but there is no need to install a full Ubuntu or CentOS. We only install the binaries, such as a package manager and, for example, Apache and PHP, plus the libraries required to get just enough of an operating system for our applications to run.
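
You can see this kernel sharing for yourself with a quick sketch, assuming Docker is installed and the ubuntu and centos images are available on Docker Hub: both containers report the host's kernel version.

```sh
# All three commands print the *host* kernel version,
# because containers share the host's Linux kernel.
uname -r
docker run --rm ubuntu uname -r
docker run --rm centos uname -r
```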

Another benefit of Docker is that images are very small when they are created, because they are built without a full operating system. This makes them compact and easy to ship.

Docker Architecture

Docker is a very powerful technology that comes with a high level of complexity under the hood. However, its fundamental user-facing structure is simply based on the client/server model.

We will discuss the complex structure of Docker in a later post but for now, we will look at the simplest form of a Docker Architecture.

Docker on our host machine consists of two parts:

  1. A daemon with a RESTful API
  2. A client which talks to the daemon

A RESTful API is one that uses standard HTTP request types such as GET, POST, DELETE, and others to perform functions that usually correspond to those intended by HTTP’s designers.
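
To see that API directly, here is a hedged sketch assuming a Linux host where the daemon listens on its default Unix socket (paths and endpoints may differ on your installation):

```sh
# Ask the daemon for its version and its running containers
# over plain HTTP on the Unix socket.
curl --unix-socket /var/run/docker.sock http://localhost/version
curl --unix-socket /var/run/docker.sock http://localhost/containers/json
```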

[Diagram: Docker Components Flow]

We invoke the Docker client to get information from, or give instructions to, the daemon.

The daemon is a server which receives requests from the client and returns responses using the HTTP protocol.

The daemon also makes requests to other services, such as image registries, to send and receive images, again using the HTTP protocol. It accepts these requests from the command-line interface.

The daemon also looks after images and containers behind the scenes, whereas the client acts as the intermediary between you and the RESTful API.
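
A simple way to observe this client/server split is docker version, which reports both sides (a sketch assuming a standard installation):

```sh
# The output has a "Client" section and a "Server" (daemon) section;
# the client can even talk to a daemon on a remote machine.
docker version
```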

Docker uses three main components:

[Diagram: Docker Architecture Overview]

Docker Containers

Docker containers are isolated user-space environments that run the same or different applications while sharing the same host OS kernel.
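
As a small sketch (the nginx image and container names are illustrative), two containers from the same image run side by side in isolation:

```sh
# Start two isolated containers from the same image.
docker run -d --name web1 nginx
docker run -d --name web2 nginx

# List the running containers, then clean up.
docker ps
docker rm -f web1 web2
```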

Docker Images

Docker images include the application itself and its libraries. These images are used to create containers. You can create or update your own images, or you can download them from Docker's public registry, Docker Hub.
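
For example, a quick sketch of pulling an official image from Docker Hub and listing the images stored locally:

```sh
# Download the official nginx image from Docker Hub
# and list the images now available on this machine.
docker pull nginx
docker images
```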

Docker Registries

A Docker registry is a place where you can host and distribute images. Docker registries can be public or private.
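
As a hedged sketch of publishing to a private registry (registry.example.com is a placeholder for your registry's address, and my-app is the illustrative image from earlier):

```sh
# Tag a local image for a private registry and push it there.
docker tag my-app registry.example.com/team/my-app:1.0
docker push registry.example.com/team/my-app:1.0
```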

Why is Docker Important?

The first reason Docker is important is that it provides abstraction, which allows you to work with complicated things in a simplified manner. Instead of focusing on all the complexities associated with installing an application, all we have to consider is what software we would like to install.

The process of installing software using Docker is similar to loading a shipping container onto a ship using a crane at the dock. The size and type of things inside the shipping container may vary, but the way the crane picks up the container is always the same.

Docker is important as it makes containers available to everyone which saves time, money and energy.

The second main reason for Docker's importance is the significant demand in the industry to adopt containers and Docker. This demand is so high that even big companies like Amazon, Google, and Microsoft offer container-based solutions in their cloud computing stacks.

Third, and finally, we are starting to see better adoption of the operating system's advanced isolation features. It's great that Docker helps us utilize these advanced isolation features without any of the complexity.

Where and When to Use Docker

You can use Docker on any computer, whether at home or at work. Keep in mind that standard Docker containers can run only applications that run on a Linux operating system; if you want to run a Windows-native application, you cannot do so in a Linux container.

Using Docker for your day-to-day tasks will help you keep your computer clean, and it helps you avoid the shared-resource issues you face with VMs.
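
As one hedged example of keeping your machine clean, you can run a throwaway interpreter without installing it (python:3 is Docker Hub's official Python image):

```sh
# Start an interactive Python shell in a throwaway container;
# --rm deletes the container when you exit, leaving no trace.
docker run --rm -it python:3 python
```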

The most important thing is to recognize when containers are inappropriate. Containers won't help much with the security of programs that have to run with full access to the machine.

Containers are not a total solution for security issues, but they can help you prevent many attacks.

Why Docker is So Popular

Docker has become very popular because of the possibilities it opens up for software delivery and deployment. To me, there are five main reasons for Docker's popularity.

Fast Scaling

By using containers, we can get more work done without expanding our hardware infrastructure. In the past, the only way to scale up an application was to throw more hardware at it, which ultimately increases the cost of the project. Docker containers allow data center operators to cram far more workloads into less hardware.
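
A toy sketch of packing several instances of one workload onto a single host (the nginx image and port numbers are illustrative):

```sh
# Launch three instances of the same service on one host,
# each mapped to its own port (8081, 8082, 8083).
for i in 1 2 3; do
  docker run -d --name "web$i" -p "808$i:80" nginx
done
docker ps
```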

Ease of Use

One of the main reasons for Docker's popularity is how easy it is to use. Docker can be learned quickly thanks to the many resources available online, from creating an image to managing a large number of containers.

Docker is an open-source engine that runs natively on Linux and can be used on other operating systems through Docker for Mac and Docker for Windows, or via VirtualBox on older setups.

Better Software Delivery

Docker container images are lightweight and portable, which makes the delivery of software robust and efficient. A container can include an isolated disk volume that travels with it as it is developed and deployed to various environments, along with software dependencies such as libraries and configuration. If a container works on your local machine, it will work the same on dev, staging, or production machines.
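
A hedged sketch of such a volume (the volume name app-data and the paths are illustrative): data written by one container is still there for the next.

```sh
# Create a named volume and write to it from one container...
docker volume create app-data
docker run --rm -v app-data:/data alpine sh -c 'echo hello > /data/greeting'

# ...then read the same data back from a brand-new container.
docker run --rm -v app-data:/data alpine cat /data/greeting
```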

Flexibility

Containerized applications are more flexible than non-containerized applications. Container orchestrators handle the running of hundreds of Docker containers while monitoring them at the same time. Orchestrators are very popular tools for managing large deployments and complex applications. There are many orchestration tools on the market, but Kubernetes currently rules it.

Networking

The Docker Engine and CLI allow us to define isolated networks for containers without touching any network infrastructure. Developers and operators can design systems with complex network topologies and define the networks in configuration files.
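
A minimal sketch of such an isolated network (the network and container names are illustrative): containers on the same user-defined network can reach each other by name.

```sh
# Create an isolated bridge network and attach two containers to it.
docker network create backend
docker run -d --name api --network backend nginx

# A second container on the same network can resolve "api" by name.
docker run --rm --network backend alpine ping -c 1 api
```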

Final Thoughts

Fully understanding Docker can be challenging if you are coming from a non-technical background, but I have tried my best to give you an overview of the technology. I hope you have found your answer to the question "What is Docker?" and learned how Docker helps solve problems for developers, administrators, and software users.
