Docker and containers could significantly improve software operations in HIT, but is it that simple? Logicworks’ Jason McKay explains.
The next big buzzword in HIT: containers. Unlike "big data" and "the cloud," the term generating so much excitement among techies conjures up visions of Tupperware rather than IT advancement. While these containers won't do much for leftovers, they promise to save considerable time and money when updating software, thanks largely to Docker.
Containers let software run reliably when moved from one computing environment to another, whether from a developer's computer to a test environment or from a physical machine in a data center to a virtual machine (VM) in a private or public cloud. Docker is a newer container technology that allows software to run without a full VM. A Docker container bundles an application's code, libraries, configurations, and internal dependencies into a single package, and is allocated its own share of CPU, memory, and network resources.
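As a rough sketch of what such a package looks like in practice, here is a minimal Dockerfile (the build recipe for a Docker image) for a hypothetical Python service; the base image, file names, and entry point are assumptions chosen for illustration:

```dockerfile
# Start from a known base image so the environment is identical everywhere
FROM python:3.11-slim

# Install the application's dependencies inside the image
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code itself into the package
COPY . .

# The same startup command now runs on a laptop, a test server, or a cloud VM
CMD ["python", "app.py"]
```

Because everything the application needs is declared in this one file, the resulting image behaves the same wherever it runs, which is the portability the article describes.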
In addition to making containers easier and safer to deploy, Docker, a San Francisco-based startup formed in 2013, has brought standardization to containers by partnering with other container providers such as Canonical, Google, Red Hat, and Parallels. The technology has become so popular that Docker now dominates most conversations about these nifty containers.
“If there ever was a battle among vendors, Docker has won…and yes, it’s now nearly synonymous with container technology,” said Jason McKay, senior vice president and CTO at Logicworks, a cloud computing and managed hosting company based in New York City. “Microsoft tried to create their own called Drawbridge, which they ended in 2011, and CoreOS created an open source project called Rocket, which has not had nearly the same uptake as Docker.”
Docker containers share the kernel of the host operating system, unlike virtual machines, each of which requires its own guest operating system. One benefit of this design is that Docker containers are lightweight; you can probably fit two or three times as many containers as VMs on the same hardware. The real benefit of Docker, however, is its place in the larger movement in software development and infrastructure automation.
“The vast majority of enterprises have hundreds of applications, many of which are dependent on a common set of app infrastructure, packages, or bridge applications,” said McKay. “When the applications work smoothly, this isn’t an issue. But whenever something fails, discovering the origin of the failure is exponentially more challenging when application boundaries are not discrete, and enterprise teams have no map of application dependencies; this leads to significant time spent combing through logs to discover the basic architecture of the program. Any change, even a small one, requires rebuilding and retesting of the entire application, because a small change may have cascading effects.”
Through Docker, developers can create smaller, discrete services that can be updated more easily. Developers no longer have to work on a single test version of an entire application, which creates all sorts of process delays and potential failures; instead, they can make updates to individual services and test those changes independently. This approach is commonly called microservices.
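To illustrate the idea, a Docker Compose file can declare each microservice as its own independently versioned image, so one service can be updated and redeployed without touching the others. The service names and registry below are invented for illustration:

```yaml
# Hypothetical compose file: each service is packaged and versioned separately
services:
  patient-api:
    image: registry.example.org/patient-api:2.4.1   # bump only this tag to update this service
  scheduling:
    image: registry.example.org/scheduling:1.9.0
  billing:
    image: registry.example.org/billing:3.0.2
```

Releasing a fix to the hypothetical patient-api means building and testing one small image, not rebuilding and retesting the whole application.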
To be clear, you don’t need microservices to use Docker, and you don’t need Docker to create a microservices process, but they work well together, said McKay. Updating a service is much easier when code can be pushed to a single container that already has all the necessary packages and configurations; for example, Docker can substantially reduce the days of work, and the hiccups along the way, involved in updating an operating system (OS).
“It’s much easier to update to the latest version of your OS when you can update it in a single module, or in a single Docker base image,” said McKay. “You know it will not have ramifications beyond the boundaries of your module/container, and you can then deploy that base image to a single Docker cluster, perhaps 10 percent of your computing power, to maintain availability while testing the new OS.”
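The OS update McKay describes can be sketched as a one-line change to an application's Dockerfile, assuming the team maintains its own internally published base images (the registry names and tags below are hypothetical):

```dockerfile
# Before the OS upgrade, the application was built on the old base image:
#   FROM registry.example.org/base:ubuntu-22.04
# Upgrading the OS means changing only this one line and rebuilding:
FROM registry.example.org/base:ubuntu-24.04

# Application code and startup command are untouched by the OS change
COPY . /app
CMD ["/app/run.sh"]
```

The rebuilt image can then be deployed to a small slice of the cluster, say 10 percent of hosts, to verify the new OS under real traffic before a full rollout.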
Docker, Developers, and Doctors: Where's the Connection?
Having this kind of capability could be a game changer for organizations that are trying to move forward with technological advancements while running legacy software. So, how will the use of Docker help healthcare organizations achieve their accountable care goals?
Well, it won’t, at least not directly. What Docker can do, however, is help software development teams develop and iterate applications more quickly. It can also help promising startups deliver their services faster and more cost-effectively. In that sense, it can help HIT develop great applications more quickly, but it doesn’t necessarily make HIT applications themselves better. As development moves away from traditional software models toward software as a service (SaaS), developers gain speed and flexibility. In turn, providers and organizations will seek out these SaaS solutions to reform their processes.
“The healthcare startups that are shaking up the healthcare software market are almost all adopting a more agile software delivery model; just look at companies like Grand Rounds, CareCloud, Oscar Health, Orion Health, and Zenefits,” said McKay. “These are companies that can meet the new demands from healthcare providers for software that facilitates an accountable care model, precisely because they can get better software to market more quickly. Developing a consistent, repeatable deployment model is a crucial part of that.”
Docker is a powerful tool, but it does not automatically create microservices or interoperability. McKay said that before health IT teams can use Docker, they first need to address a more fundamental issue and stop building and managing monolithic applications. “Software development teams need to invest in creating new applications with fewer interdependencies; this is by far the most labor-intensive component of the process. These application teams may then find that Docker is a good solution for getting those services into production, but the decision to use Docker will not in and of itself create an agile application,” he explained.
Docker does raise concerns about security, but opinions on the topic are mixed. Some argue that containers, used correctly, can offer more security than virtual machines. Others counter that because containers share the same kernel, a problem inside one container could potentially take down the whole host. As security breaches and ransomware incidents continue to rise, so do concerns about the potential vulnerabilities of containers.
“It’s best to think of Docker as a deployment mechanism, not a security model,” said McKay. “Docker shouldn’t do anything to change or improve the requirements of an underlying security model. The same tools, monitoring, and IDS are needed to protect the servers or virtual instances where Docker containers sit.”
Despite the speculation, Docker and containers are worthy of the attention they are getting because they represent another key advancement in how business expects IT to function: as a provider of easy-to-access, easy-to-spin-up, repeatable “chunks” of computing, memory, and storage rather than a roadblock to software development teams.
“The takeaway is that Docker reduces the complexity of regular software updates. In healthcare, this could mean the difference between releasing software once a quarter versus once a day. That can ultimately have a large impact on the success of your software,” McKay concluded.