
Docker for dummies


Over the last couple of years, Docker and, more generally, containers and container orchestration systems (like Kubernetes) have become a very hot topic in the infrastructure and technology world. While most technical people, like software engineers and software architects, are already well acquainted with these terms, for managers and business people the importance of these technologies is still hard to grasp.
This blog introduces some basic concepts about Docker containers for non-technical people, and explains how the technology impacts your organisation.
Before diving into the details, some figures illustrate the explosive growth of the "container" trend in the IT landscape:
  • More than 8 billion Docker containers have already been downloaded
  • Docker Inc., the company behind Docker, was valued at $1 billion in 2015, when the company reported less than $10 million in revenues. In early 2017, the company’s revenues were estimated between $25 million and $50 million.
  • Across 7 funding rounds, Docker Inc. has raised $180 million from investors
  • Docker is used at leading technology companies such as eBay, Spotify and Yelp
  • Over 3,000 developers worldwide contribute to the evolution of the Docker open-source platform
  • Forrester estimates that about 10 percent of enterprises currently use containers in production, but up to a third are testing them.
  • A yearly study by Datadog (a provider of monitoring software) shows similar trends:
    • In 2016, 13.6 percent of Datadog’s customers had adopted Docker. One year later, that number had grown to 18.8 percent.
    • Large companies are often faster adopters than small companies (contrary to most other trends). Nearly 60 percent of organizations running 500 or more hosts have started using Docker.
An impressive track record for sure, but what are the drivers behind this success?
In most organisations, Virtual Machines (VMs) have been commonplace for years. VMs allow a single physical server to run multiple virtual machines (a sort of tiny computer inside the server), each with a fully independent operating system that runs isolated inside the server’s main operating system (i.e. the host operating system).
Although VMs have been embedded in almost every organisation for many years, they face some important issues:
  • Around 2 GB of disk space is required for each VM (so if you run 5 VMs on 1 server, you already lose 10 GB of disk space)
  • Each VM takes quite some time to start up
  • The VM itself consumes considerable memory and other resources (even when no application is running on it)
Containers aim to provide an answer to these issues. While containers provide isolation similar to Virtual Machines, they do so in a much more lightweight way. Instead of deploying a full operating system per isolated environment, containers allow multiple isolated systems (containers) to run on a single operating system. The different containers share the host operating system, while still guaranteeing isolation comparable to that of a Virtual Machine. As a result, containers have far less overhead and therefore use system resources more efficiently than Virtual Machines.
Such lightweight virtualization offers an organisation many possibilities and advantages:
  • Application portability: containers package just about any application with all its dependencies into a standardized unit (like a shipping container system). This makes an application very portable, as users don’t need to spend time installing the application and all its dependencies: deploying the container performs a full installation, including all configuration settings.
  • Cost optimization: the lightweight nature of containers allows machine resources to be used more efficiently than with traditional Virtual Machines. Typically, a server can host four to six times more application instances in containers than in VMs. An organisation can therefore generate considerable savings in power and hardware costs by switching from VMs to containers.
  • Self-sufficient systems: as containers package applications with all their configuration and dependencies into a standardized unit, it becomes very easy to create development and test environments that are identical to the production environment. Companies can thus easily create sandboxes to test specific behaviour in a production-like environment.
  • Security: containers come with their own security layer, meaning that a breach in one application (container) cannot compromise other applications running on the same server. Specific container features can further increase security, e.g. a container can be configured as read-only, which strongly secures applications that only read data (like data reporting & visualization applications).
  • Resilience: the usage of containers also increases resilience, as failures can be isolated inside a single container.
  • Monitoring: resource usage can be monitored per container, which matches the scope of the application and thus allows more fine-grained monitoring.
While the principle of containers has existed for many years, the breakthrough came with the rise of Docker.
Docker is an open-source project that has simplified the usage of containers and defined a standard for them (backed by several large players in the industry). A very simple set of commands creates and manages containers, which shifts the creation, configuration and deployment of containers to developers (fully in line with the DevOps philosophy) instead of operations specialists (as is still the case with Virtual Machines).
Docker images are the basis of containers. An image is a description of the environment you want to run as a container: you specify which operating system to start from, which additional tools and libraries to install, which files from your computer to copy into the image, and so on.
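To make this concrete, a minimal Dockerfile (the text file in which such an image is described) could look like the sketch below. The application name and files are purely illustrative:

```dockerfile
# Start from an official minimal Linux base image with Python pre-installed
FROM python:3.11-slim

# Install the application's dependencies inside the image
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the (hypothetical) application code from your computer into the image
COPY app.py .

# Command executed when a container is started from this image
CMD ["python", "app.py"]
```

Building this image (`docker build -t myapp .`) and running it (`docker run myapp`) behaves identically on a developer’s laptop and on a production server, which is exactly the portability advantage described above.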
The success of Docker containers has created a full ecosystem of technologies and companies providing services and products that build further on this technology. For example, 95% of the code developed by Docker Inc. is not directly related to the containers themselves, but to the surrounding tools that accelerate and simplify their usage.
Some of the most important tools developed on top of Docker containers are:
  • Docker Hub: a way to share container images with the rest of the world. Docker Hub makes containers reusable and shareable (just like GitHub did for source code). Over 500,000 Dockerized applications are already available on Docker Hub.
  • Docker Compose: defines which containers to run and how they are linked together. The tool starts all the linked containers with one single command.
  • Cluster Management Systems (also called "Container Orchestration Tools"): automatically manage a distributed system by spinning containers up and down based on resource needs. The best-known container orchestration tools are Kubernetes, Mesos and Docker Swarm, with the market clearly consolidating on Kubernetes.
  • Service Meshes: control service-to-service communication over a network. Common features provided by a service mesh include service discovery, load balancing, encryption and failure recovery. The best-known service meshes are Istio, Linkerd, Consul Connect and Kong Mesh, with a first tendency towards consolidation on Istio.
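As an illustration of Docker Compose, a hypothetical `docker-compose.yml` linking a web application to a database could look like this (image names and ports are illustrative, not from any real project):

```yaml
services:
  web:
    image: mycompany/webapp:1.0   # hypothetical application image
    ports:
      - "8080:80"                 # make the web app reachable on port 8080
    depends_on:
      - db                        # start the database container first
  db:
    image: postgres:15            # official PostgreSQL database image
    environment:
      POSTGRES_PASSWORD: example  # illustrative only; never hard-code real secrets
```

A single `docker compose up` then starts both linked containers together.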
The explosive success of Docker is even surpassed by the exponential rise of Kubernetes, for which an impressive ecosystem has been built in a matter of a few years. Kubernetes is increasingly becoming a sort of operating system for distributed systems, abstracting away the complexity of managing distributed servers.
Important to note is that these trends go hand-in-hand with (and reinforce) other evolutions in the market:
  • The rise of cloud players like AWS (Amazon Web Services), GCP (Google Cloud Platform) and Azure (Microsoft). The use of containers and container orchestration tools is perfectly in line with the philosophy of the cloud, where resources scale elastically with demand.
  • DevOps practices, such as Continuous Integration and Continuous Deployment, are greatly facilitated by containers.
  • Micro-service architectures are an ideal fit for containers, as each micro-service can be bundled in a container that can easily be spun up or down based on business needs.
The result is that the typical lifespan of a virtualized environment is decreasing considerably. According to a study by Datadog, VMs have an average lifespan of 23 days, Docker containers without orchestration 5.5 days, and Docker containers with orchestration less than a day.
This trend is likely to continue, ultimately resulting in "Functions as a Service" (also called serverless functions or lambda functions). This technology allows companies to pay per function call: each time a function is called, a container is spun up to run the function and spun down afterwards. This is the ultimate abstraction of infrastructure, where companies no longer have any notion of the underlying servers.
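From a developer’s perspective, such a serverless function is typically just a small handler that the platform invokes once per call. The sketch below is illustrative Python, not any specific provider’s API; the function name, event fields and business logic are all hypothetical:

```python
# Hypothetical FaaS handler: the platform spins up a container, calls this
# function once per incoming request, and tears the container down (or
# reuses it) afterwards. The company pays per invocation, not per server.
def handler(event):
    # 'event' carries the request payload, e.g. {"amount": 100, "rate": 0.21}
    amount = event["amount"]
    rate = event["rate"]
    # Illustrative business logic: compute VAT for the given amount
    return {"vat": round(amount * rate, 2)}

if __name__ == "__main__":
    # Local simulation of a single function invocation
    print(handler({"amount": 100, "rate": 0.21}))
```

The developer writes only this function; everything below it (container, operating system, server) is managed by the platform.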
The above shows the importance of these new technologies, but also the speed (faster than any IT technology before it) at which this space is evolving and maturing. Banks should therefore invest in specialists in these new technologies, but also increasingly consider a switch to public clouds, where the cloud providers’ specialists follow these evolutions and abstract them away in managed (container and Kubernetes) offerings.
