Kubernetes Microservices with Docker

Here we will examine Kubernetes, microservices and Docker: what are these technologies? When, how and why should they be used, and how can they be used in conjunction with one another?

Containers – What are they?

‘Containers’ allow an application to be packaged together with everything needed to run it – code, runtime, system tools and so on – and separated from its original running environment.

This ‘containerisation’ then allows these applications to be deployed in a range of environments – from public clouds and private data centres to bare metal.

This separation and ability to run anywhere lets users create predictable, isolated and consistent environments in which to deploy their applications, and removes the application’s dependence on specific host configurations, libraries and files. This allows the application to move through the delivery pipeline more easily.

Because containers virtualise at the operating-system level and share the host OS kernel, they take up far less memory than booting an entire guest OS and start much more quickly.

Kubernetes Deployment Model. Image credit: https://thenewstack.io/

Why and When to use Kubernetes

As containers have become more popular, so have methods to manage and orchestrate them – with one of the most popular options being Kubernetes.

Originally designed by Google, Kubernetes (sometimes referred to as ‘K8s’) is a portable, open-source platform for managing containerised workloads and services. Kubernetes automates many of the manual processes involved in deploying, scaling and operating containerised applications – saving developers time and complication.

Use Cases and Benefits of Kubernetes

Some of the primary use cases and benefits for using Kubernetes include:

  • Multi-cloud Adoption
  • Speeding up the development and deployment of applications
  • Applications using Microservice Architecture
  • Lowering the cost of infrastructure

Microservices (Microservice Architecture)

Microservice Structure. Image credit: microservices.io/

Microservices (also known as Microservice Architecture) is a style of application design in which an application is structured as a collection of services. Microservices development is based on breaking an application up into small, independent units that communicate with one another to execute the application’s logic.

This design method allows developers to build highly flexible and scalable systems, enabling the fast, frequent and reliable delivery of large, complex applications.

Microservice collections offer simplicity for their consumers and are highly maintainable, loosely coupled and independently deployable, as well as being organised around business capabilities. Whilst microservice architecture does not necessarily require the cloud, it functions well within, and complements, cloud environments.
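
To make this concrete, here is a minimal, hypothetical sketch of two microservices – an ‘inventory’ service and an ‘orders’ service – running as independent units and communicating over HTTP, using only the Python standard library. The service names, port and routes are illustrative assumptions, not part of any real system.

```python
# A minimal sketch of two "microservices" communicating over HTTP.
# All names, ports and routes are illustrative assumptions.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# --- "inventory" service: owns the stock data and exposes it over HTTP ---
STOCK = {"widget": 12, "gadget": 0}

class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        item = self.path.strip("/")
        body = json.dumps({"item": item, "in_stock": STOCK.get(item, 0)})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

    def log_message(self, *args):  # keep the demo quiet
        pass

def start_inventory(port=8001):
    server = HTTPServer(("127.0.0.1", port), InventoryHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

# --- "orders" service: an independent unit that calls inventory over the network ---
def place_order(item, inventory_url="http://127.0.0.1:8001"):
    with urlopen(f"{inventory_url}/{item}") as resp:
        stock = json.loads(resp.read())["in_stock"]
    return {"item": item, "accepted": stock > 0}

if __name__ == "__main__":
    server = start_inventory()
    print(place_order("widget"))  # accepted: stock available
    print(place_order("gadget"))  # rejected: out of stock
    server.shutdown()
    server.server_close()
```

Because each service owns its own data and exposes only a network interface, either side could be rewritten, scaled or redeployed independently – which is the core promise of the architecture.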

Microservices Use Cases

Some of the primary use cases of microservices include:

  • Legacy Applications Refactoring
  • Applications that deal with Big Data
  • Real-Time Processing Applications
  • Large Scale, Complex logic systems
  • Third-Party Applications that serve multiple users

When to use Microservices

There are several instances in which microservice architecture can or should be used. These include:

  • When building agile applications that need rapid delivery
  • When legacy applications need to be rewritten in a new language or tech stack
  • When restructuring a monolithic application to improve scalability, agility and manageability
  • When there is a standalone business application which needs to be reused across multiple channels

Microservice Architecture Challenges/Issues

As with any product or service, whilst microservices offer developers many benefits, there are also some downsides to be aware of before committing to them. Many of these issues relate to the complex architecture of microservices.

These include:

  • Need for increased collaboration between teams
  • Increased difficulty in testing and monitoring
  • More work needed to maintain the network, owing to reduced fault tolerance and an increased need for load balancing
  • Increased security issues

Migrating to Microservices

Migrating applications to microservices carries a range of benefits, including removing duplicated manual effort, reducing programmatic development risks, improving system control and providing a single, unified view of the data.

When developers choose to migrate their applications to microservices there are certain criteria selections that will ensure the migration is successful and as beneficial to the application as possible.

When converting monolithic applications to microservices, users must identify the elements of the application that can be split off and moved into separate microservices. Some of the criteria and methods for identifying these parts include: looking for business logic along which the application can be separated, finding naturally isolated code, and looking for logic that will benefit most from independent scaling or from its own configuration settings or memory requirements.

Google Cloud identifies the following steps for successful app migration:

  • Leaving the existing code in place and operational in the legacy application to facilitate rollback
  • Creating a new code repository, or at least a sub-directory in your existing repository
  • Copying the classes into the new location
  • Writing a view layer that provides the HTTP API hooks and formats the response documents in the correct manner
  • Formulating the new code as a separate application (create an app.yaml)
  • Deploying your new microservice as a service or separate project
  • Testing the code to ensure that it is functioning correctly
  • Migrating the data from the legacy app to the new microservice
  • Altering your existing legacy application to use the new microservices application
  • Deploying the altered legacy application
  • Verifying that everything works as expected and that you don’t need to roll back to the legacy application
  • Removing any dead code from the legacy application
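
The rollback-friendly spirit of these steps can be sketched in miniature. The function names and feature flag below are hypothetical, not part of Google Cloud's guidance: the legacy code stays in place while a small routing shim directs calls to the extracted service, so rolling back is a one-line change.

```python
# A miniature, hypothetical sketch of the migration pattern above: the
# legacy code stays operational for rollback while a routing shim sends
# traffic to the extracted "microservice". All names are illustrative.

def legacy_format_report(data):
    # Original logic, left in place in the legacy application.
    return "REPORT: " + ", ".join(f"{k}={v}" for k, v in sorted(data.items()))

def microservice_format_report(data):
    # The same logic, copied into the new code repository. In a real
    # migration this would sit behind an HTTP API in a separate deployment.
    return "REPORT: " + ", ".join(f"{k}={v}" for k, v in sorted(data.items()))

USE_MICROSERVICE = True  # feature flag: flip to False to roll back

def format_report(data):
    # The altered legacy application calls through this shim, so switching
    # between old and new implementations is a single configuration change.
    if USE_MICROSERVICE:
        return microservice_format_report(data)
    return legacy_format_report(data)

if __name__ == "__main__":
    sample = {"units": 3, "region": "EU"}
    # Verify the new service matches the legacy output before deleting dead code.
    assert format_report(sample) == legacy_format_report(sample)
    print(format_report(sample))
```

Only once the new path has been verified in production would the legacy implementation be removed, mirroring the final step in the list above.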


Microservices and Containers

Whilst containers and microservices are related, they are not the same and do not require one another to function. Microservices can run within containers, but they are also capable of running independently on a fully provisioned VM where a container is not needed.

The two do, however, regularly work in conjunction with one another. When developing and deploying microservices, containers are beneficial as container tools can be used to manage microservice-based applications.


Service-Oriented Architecture (SOA)

Another software design approach worth noting is Service-Oriented Architecture (SOA).

Service-Oriented Architecture describes a method of making software components reusable via service interfaces. In the SOA design style, application components provide services to other components over a network communication protocol. This communication involves either simple data passing between services, or two or more services coordinating an activity.

Typically implemented with web services, SOA principles are independent of vendors and specific technologies, and can work both with and without cloud computing. However, businesses are increasingly choosing to move file storage to the cloud, which makes it convenient to use cloud services and SOA in conjunction.
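
As a rough illustration of the service-interface idea, the sketch below (with hypothetical names) defines a storage service contract that components program against, so a local implementation could later be swapped for a cloud-backed one without its callers changing.

```python
# A hedged sketch of the SOA idea: components depend on a service
# *interface*, so implementations can be swapped (local, remote, cloud)
# without callers changing. All names here are illustrative assumptions.
from abc import ABC, abstractmethod

class FileStorageService(ABC):
    """The service interface: the contract other components program against."""

    @abstractmethod
    def save(self, name: str, content: bytes) -> None: ...

    @abstractmethod
    def load(self, name: str) -> bytes: ...

class InMemoryStorage(FileStorageService):
    # One concrete provider; a cloud-backed provider would implement the
    # same interface and could replace this one transparently.
    def __init__(self):
        self._files = {}

    def save(self, name, content):
        self._files[name] = content

    def load(self, name):
        return self._files[name]

def archive_document(storage: FileStorageService, name: str, text: str) -> bytes:
    # This component only knows the interface, never the implementation.
    storage.save(name, text.encode())
    return storage.load(name)

if __name__ == "__main__":
    print(archive_document(InMemoryStorage(), "note.txt", "hello"))
```

The reusability benefit listed below follows directly from this shape: any component that honours the interface can serve every existing caller.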

SOA Characteristics and Benefits

As a concept, the defining characteristics of SOA for business are: Business Value, Strategic Goals, Intrinsic Interoperability, Shared Services, Flexibility and Evolutionary Refinement. Some of the key benefits and uses of Service-Oriented Architecture are:

  • Creating Reusable Code and using multiple coding languages
  • Promoting Interaction by putting standard communication forms in place. This allows different systems to function independently
  • Greater Scalability through reducing client-service interaction
  • Cost Reduction via limiting analysis requirements


Docker and Docker Container Images

Docker is a tool designed to make it easier to create, deploy and run applications using containers. Because the container carries everything the application needs, the application will run on any machine, regardless of any customised settings that might differ from the machine used to write and test the code. The container approach also allows developers to run Docker-packaged applications on serverless platforms such as AWS Lambda.

Docker Hub Container Images

Launched in 2013, Docker Hub offers container images for use – boasting the world’s largest collection of container images.

A Docker container image is a lightweight, executable software package encompassing everything needed to run an application. Container images become containers at runtime when run on Docker Engine, which is available for both Linux and Windows. This ensures that, despite differences between development and staging environments, the containers isolate the software from its environment.
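
As a toy model of this packaging (not Docker's actual implementation), a container image can be thought of as an ordered stack of layers whose union forms the container's filesystem, with later layers shadowing earlier ones:

```python
# A toy, illustrative model of layered container images: an image is an
# ordered stack of layers, and the container's filesystem is the union
# of those layers, with later layers overriding earlier ones.

def flatten_image(layers):
    """Union an ordered list of layers (dicts of path -> content)."""
    filesystem = {}
    for layer in layers:          # base layer first, top layer last
        filesystem.update(layer)  # later layers shadow earlier files
    return filesystem

# Hypothetical layers: base OS, language runtime, then the application.
base_os = {"/bin/sh": "shell", "/etc/issue": "debian"}
runtime = {"/usr/bin/python": "interpreter"}
app = {"/app/main.py": "print('hi')", "/etc/issue": "custom"}

container_fs = flatten_image([base_os, runtime, app])
# The app layer's /etc/issue shadows the base layer's copy.
```

Because lower layers are shared between images, many containers built on the same base cost little extra storage – one reason containers are so lightweight.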



Kubernetes Microservices with Docker

Now that we know more about Kubernetes, microservices and Docker, we should examine how to use them together. Kubernetes relies on containerisation technology to orchestrate and schedule applications. There are several runtime options, such as rkt and, more recently, CRI-O – a CNCF-sponsored incubating project providing an Open Container Initiative-based implementation of the Kubernetes Container Runtime Interface. However, the default option is Docker, as it is currently the most popular tool in this space and much effort has gone into ensuring it works efficiently with Kubernetes.

If you imagine an organisation with hundreds of applications, each application consisting of tens to hundreds of microservices running in containers – it’s easy to see how managing all of these microservices can become near impossible.

As the leading orchestration platform for containers, Kubernetes is the ideal platform on which to run microservice applications. Kubernetes takes care of much of the heavy lifting your infrastructure teams would otherwise have to do without an orchestration platform, such as:

  • Ensuring services are always running
  • Cost-effectively utilising infrastructure
  • Providing resiliency and high availability for microservices
  • Easing the networking burden between applications
  • Providing a common platform for all applications to share things like secrets, DNS and configurations
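
Much of this heavy lifting comes down to reconciliation: continuously comparing desired state with observed state and correcting the difference. The sketch below is a deliberately tiny, hypothetical illustration of that idea, not the Kubernetes implementation:

```python
# A toy sketch of the reconciliation idea behind "ensuring services are
# always running": compare desired state with observed state and emit the
# actions needed to close the gap. Illustrative only, not Kubernetes code.

def reconcile(desired, observed):
    """Return actions moving `observed` toward `desired`.

    Both arguments map service name -> number of running replicas.
    """
    actions = []
    for service, want in desired.items():
        have = observed.get(service, 0)
        if have < want:
            actions.append(("start", service, want - have))  # replace crashed replicas
        elif have > want:
            actions.append(("stop", service, have - want))   # scale down
    return actions

desired = {"orders": 3, "inventory": 2}
observed = {"orders": 1, "inventory": 2}
actions = reconcile(desired, observed)
# → [("start", "orders", 2)]: the orchestrator restarts the missing replicas
```

Kubernetes controllers run loops of this shape continuously, which is why crashed containers reappear without any human intervention.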

As the largest project to come out of the Cloud Native Computing Foundation (CNCF), Kubernetes is one of the most popular projects on GitHub, with a massive community of contributors from across the globe. To get the most out of Kubernetes, consider using a managed service offered by cloud providers to take care of running and managing Kubernetes.


Running Clusters

Kubernetes clusters can be difficult to manage on your own, and difficult to configure for a strong security posture. Traditionally, Kubernetes would be installed by a platform team onto virtual machines in the cloud. However, as previously mentioned, cloud providers such as AWS, Google and Microsoft offer managed Kubernetes services.

These managed services take care of setting up, managing and upgrading your Kubernetes cluster, as well as ensuring it is highly available and resilient. For a monthly fee, managed clusters provide a solid, enterprise-grade foundation for innovation.

That being said, managed Kubernetes services don’t do everything; the following tasks are still required to establish a quality Kubernetes platform:

  • CI/CD pipelines to automate builds and deployments to the cluster
  • Monitoring and alerting for both infrastructure and applications
  • Configuration management of the cluster
  • Further hardening the cluster with additional security policies
  • Creating cost-saving measures – such as spot/preemptible instances


Tools to be mindful of

Many of the tools that support the platforms and concepts mentioned above can be found on the CNCF website.