Written by Mike Shields
Lightweight containers are low-overhead, isolated mechanisms for running applications, a lighter alternative to a virtual machine. They are all the rage these days, with Docker in particular getting a lot of attention. While the concept may seem innovative to some, lightweight containers have actually been around for a long time: Solaris Containers appeared in 2005's Solaris 10 release, and FreeBSD jails became available in 2000, both building on ideas pioneered by chroot, which dates back to 1979.
Some of the goals behind lightweight containers originated from a desire to let applications run side by side on a single server while isolating them from one another for security purposes. The rise of Docker, however, has built a community around containers that unlocks an additional benefit: it has created the ultimate 'try before you buy' mechanism.
Docker’s philosophy focuses on housing a single logical application inside a container – a container with a single purpose, rather than just being an arbitrary slice of a server. Docker’s Dockerfile facility provides a way to create and distribute a reusable description of an application housed in a container. Because of the popularity of Docker, a proliferation of Dockerfiles is turning Docker into a sort of package system – the ultimate ‘test drive’ for developers to try out applications, platforms, tools, frameworks, and other toys.
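As a rough illustration, a Dockerfile is just a short text file of build instructions. The sketch below packages a hypothetical Java web application on top of the official Tomcat image; the application file name is an assumption, a stand-in for whatever you actually deploy.

FROM tomcat:latest
# copy a (hypothetical) application into Tomcat's webapps directory
COPY myapp.war /usr/local/tomcat/webapps/
# document the port the application server listens on
EXPOSE 8080

Anyone with Docker installed can then build and run it, and the same file can be checked into version control and shared:

docker build -t myapp .
docker run -d -p 8080:8080 myapp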
A development server devoted to hosting Docker containers is easy to set up, and is often a VM on a virtual infrastructure. "Provisioning" containers is done by the developers themselves, is effectively instantaneous, and doesn't require manually allocating resources such as disk space, memory, or CPU.
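Getting such a host running is itself a short exercise. On a typical Linux VM, something like the following is usually enough; this sketch assumes you are comfortable using Docker's convenience install script and that your account can use sudo.

curl -fsSL https://get.docker.com | sh
# optional: allow your user to run docker without sudo (takes effect on next login)
sudo usermod -aG docker $USER
# verify the daemon is up
docker info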
Want to try RStudio out and see if R is applicable to your data issues?
docker run -d -p 8787:8787 rocker/rstudio
How about a quick Tomcat application server to see if your latest Java web application works?
docker run -d -p 8080:8080 tomcat:latest
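Once either container is up, the service is reachable on the mapped port of the Docker host (for example, http://localhost:8787 for RStudio when running on your own machine), and a quick check shows what is running:

docker ps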
This enables teams of developers to set up, connect, and run application servers with just 'docker run', instead of a platform-specific exercise of acquiring packages, resolving conflicts, satisfying dependencies, and configuring and starting services. A development team familiar with Docker can quickly iterate and create innovative solutions to complex problems, leveraging diverse combinations of tools, applications, servers, and platforms that could take far longer to assemble using traditional techniques, saving frustration, time, and cost.
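As a concrete sketch of what 'connect' can look like, the commands below wire a Tomcat container to a PostgreSQL container over a user-defined Docker network; the network name, container names, and password are illustrative assumptions rather than anything a particular application requires.

docker network create demo-net
# a database the application can reach by the hostname 'db' (password is illustrative)
docker run -d --name db --network demo-net -e POSTGRES_PASSWORD=changeme postgres:latest
# the application server, published on port 8080 of the host
docker run -d --name web --network demo-net -p 8080:8080 tomcat:latest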