Will container technology become as widespread as virtualisation? Although the concept of the container has existed since the beginning of this century (in particular on Linux), the success since 2013 of Docker, the most popular and most widely used open-source containerisation platform based on Linux, has reinvigorated the use of containers.
The advantages
In fact, a container is a complete execution environment integrating the application and all of its dependencies, libraries, binaries and runtime files within one single 'package'. It is, in other words, a virtual envelope containing everything an application needs in order to run. Remark: unlike server virtualisation and virtual machines, containers do not embed their own operating system; they rely on the OS kernel of the server on which they are deployed.
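As an illustration, this 'package' is typically described in a short build file. The sketch below assumes a hypothetical Python application (app.py and requirements.txt are placeholders, not taken from this text):

```dockerfile
# Minimal, illustrative Dockerfile for a hypothetical Python application.
# The image bundles the application, its dependencies and the runtime,
# but no full operating system: it reuses the kernel of the host it runs on.
FROM python:3.12-slim

WORKDIR /app

# Install the application's dependencies inside the image.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Add the application code itself.
COPY app.py .

# Command executed when the container starts.
CMD ["python", "app.py"]
```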
Another advantage of containers is their size: they 'weigh' only a few tens of megabytes, versus several gigabytes for virtual machines and their complete operating systems, which greatly increases portability. Moreover, a virtual machine may need several minutes to boot and start its applications, while a container starts almost instantly.
Containerisation is also more flexible and modular, since a complex application can be split into modules (database, interface…) rather than packed into one single container. This is known as the microservices approach (please refer to a previous blog post). Since these modules are lightweight, they are easy to manage and can be activated 'on the fly' whenever the need arises. Remark: there are many container management systems on the market, for both Windows and Linux, as well as from the main cloud providers.
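To make this modularity concrete, here is a hedged sketch of a Docker Compose file splitting a hypothetical application into two containers, a database and a web interface (service names, images and values are illustrative assumptions):

```yaml
# Hypothetical docker-compose.yml: one lightweight container per module.
services:
  database:
    image: postgres:16             # off-the-shelf database module
    environment:
      POSTGRES_PASSWORD: example   # illustrative value only
    volumes:
      - db-data:/var/lib/postgresql/data

  interface:
    build: ./interface             # web front end, built from its own Dockerfile
    ports:
      - "8080:8080"
    depends_on:
      - database

volumes:
  db-data:
```

Each module can then be started, stopped, scaled or replaced independently of the others.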
Finally, a technology such as Docker makes it possible to take an application tested locally and deploy it in production on almost any cloud, whereas the same operation can be complicated with traditional virtualisation.
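In practice, that portability comes down to a handful of standard Docker commands; the image name and registry address below are placeholders:

```bash
# Build and test the image locally.
docker build -t myapp:1.0 .
docker run --rm -p 8080:8080 myapp:1.0

# Push the very same image to a registry reachable from the cloud provider.
docker tag myapp:1.0 registry.example.com/myapp:1.0
docker push registry.example.com/myapp:1.0

# On the cloud host or managed container service, run the identical image.
docker run -d -p 80:8080 registry.example.com/myapp:1.0
```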
To make the case for containerisation, IBM published a performance comparison between Docker and KVM in 2014, concluding that containers equal or exceed the performance of virtualisation. In 2017, the University of Lund in Sweden reached the same conclusion when comparing containers with VMware virtualisation. The one constraint: containers created for Linux are not compatible with Windows and vice versa, a limitation that traditional virtualisation does not have.
Deployment
As you can see, containerisation appears to be a way to increase the elasticity of an application and to improve its performance, since each module is optimised for its specific use. Furthermore, application development is faster and continuous deployment is simpler: when a change is made, only the module concerned needs to be modified and redeployed, not the entire application. For the same reason, operational maintenance becomes easier.
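Continuing the Compose sketch above, updating a single module could look like this (the 'interface' service name is the hypothetical one used earlier):

```bash
# Rebuild and restart only the modified module; the database container is untouched.
docker compose build interface
docker compose up -d interface
```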
Additional remark: with containers, developers enjoy more autonomy and freedom of action, because they can work inside the container without having to request the creation of a virtual machine. Developers also benefit from an application stack that is closer to the production environment, which, in principle, leads to a smoother move into production.
Container security, sometimes considered weaker than that of virtual machines (whose isolation is intrinsic to the virtualisation technology), has improved significantly in recent years, notably at Docker, which now integrates an image-signing platform.
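The signature platform referred to here is presumably Docker Content Trust; as a hedged sketch, enabling it is a matter of one environment variable (image name and registry are placeholders):

```bash
# With content trust enabled, pushed images are signed
# and pulls fail if no valid signature is found.
export DOCKER_CONTENT_TRUST=1
docker push registry.example.com/myapp:1.0
docker pull registry.example.com/myapp:1.0
```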
Assistance
However, we should not conclude too hastily that virtualisation has had its day and that containers are the magic solution. Developers will still need adequate support, for instance when their containers have to be deployed on a more traditional production infrastructure (a Software-as-a-Service platform, a virtual machine…).
In practice, the IT service provider can assist the development team with a whole range of value-added services: advice on the choice of architecture and deployment methods, provisioning of processing or storage capacity, partitioning, provision of monitoring tools, configuration according to specific needs…
In short, the development team will have to work closer to the infrastructure (this is precisely the challenge of the DevOps approach) as well as to its IT service provider in order to gain efficiency and agility, and thus be even more attuned to business needs.
Partner
To help the internal IT department deploy its IT platforms, the Aprico Enterprise Architecture entity relies on proven methodologies as well as on tools and reference frameworks. Moreover, our specialists strive to facilitate the dialogue between the IT department and the business entities in order to implement the solutions best suited to the business needs. Overall, Aprico's experts rely on reference frameworks and best practices in enterprise architecture. They aim to identify and measure the real added value of any new project in order to formulate relevant, implementable recommendations for the organisation. Aprico's architecture specialists also contribute to the group's overall strategy in terms of integrity, privileged contact with customers, operational excellence and transparency. More information: marketing@aprico-consult.com