Moving to containers

11 June 2021

Chris Bujis, EMEA field CTO, NS1

Network teams have a lot to do to achieve a modern, agile and resilient enterprise. They must apply automation practices to speed up network management and keep pace with their DevOps counterparts. That means solving the IP assignment and internal DNS management issues that crop up as workloads migrate to multi- and hybrid-cloud infrastructures, and finding more economical, agile ways to scale the infrastructure that delivers these network services, whilst remaining good partners to the IT and business teams.

Increasingly, reaching these goals means shifting away from a traditional appliance-based approach to delivering network services and towards containers, the natural next step in an enterprise transformation strategy.

From our dealings with customers, we see that the adoption of containers is moving quite slowly and that both complexity and cultural changes are often cited as the most challenging issues in using and deploying containers. It seems that businesses are happy to put a toe in the containerisation water, but hesitant to plunge the entire enterprise in.

Containers have been widely adopted by DevOps teams to deploy and operate applications quickly. In the networking community, however, there is a greater degree of perceived complexity and often a lack of knowledge, and this threatens to hold enterprises back.

So, what are the benefits of containerisation for network teams? First, by delivering DDI (DNS, DHCP and IP address management) infrastructure as containerised software and automating processes with infrastructure as code, the network team gains more deployment flexibility and additional opportunities for automation and integration.
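The core of the infrastructure-as-code idea mentioned above can be sketched in a few lines: desired network state is declared as data kept in version control, and a reconciler computes what must change. This is an illustrative sketch only; the hostnames and addresses are hypothetical, not tied to any particular DDI product.

```python
# Minimal infrastructure-as-code sketch: DNS state declared as data,
# with a reconciler that diffs desired against current state.
# All hostnames and IP addresses below are hypothetical examples.

def plan_changes(desired: dict, current: dict) -> dict:
    """Diff desired vs. current A records and return the actions needed."""
    to_add = {h: ip for h, ip in desired.items() if current.get(h) != ip}
    to_remove = {h: ip for h, ip in current.items() if h not in desired}
    return {"add": to_add, "remove": to_remove}

# Desired state lives in version control; current state is what the
# DDI service reports. Re-running the plan is idempotent.
desired = {"app.internal": "10.0.0.10", "db.internal": "10.0.0.20"}
current = {"app.internal": "10.0.0.10", "old.internal": "10.0.0.99"}

print(plan_changes(desired, current))
# {'add': {'db.internal': '10.0.0.20'}, 'remove': {'old.internal': '10.0.0.99'}}
```

Because the plan is computed rather than applied by hand, the same change can be reviewed, tested and rolled out through the pipelines DevOps teams already use.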

This is important to today’s enterprise, which is looking to make its legacy and owned environments operate in a more cloud-like manner by introducing private cloud and similar technologies, so it can fully embrace operational velocity and automation.

The advantage of containers is that they can be deployed on any platform: public cloud, private cloud, physical devices, and even virtual machines. Whether an enterprise uses cloud-based data centres, colocation facilities, or private-cloud networks, the team gets more flexibility from a container-based delivery approach when choosing where and how containerised software runs, a major bonus if network topologies vary widely.

Containers also offer a lightweight footprint. Unlike virtual machines, containers share the host system’s kernel and require minimal resources to operate. They can be spun up quickly on devices with a far smaller footprint than traditional hardware or virtual appliances. This also helps to reduce provisioning and certification times, and makes it possible to automate the provisioning of sandbox environments quickly enough to deliver critical updates to users on time.

Since container deployment is programmable and quick, it is easy to spin containers up and down in response to demand, so rapid auto-scaling becomes an intrinsic part of business applications. Organisations that regularly run live events or conferences, for example, can use containers to support temporary locations. In fact, many test or sandbox environments are so short-lived that they last less than 24 hours.

For network teams, it can be frustrating when deploying and updating a legacy appliance takes longer than the time the appliance is actually in operation. The increasing use of microservices architectures for digital transformation also drives the ephemeral nature of enterprise production environments. Containerised microservices simplify and accelerate deployment: individual services can be programmatically replicated or decommissioned to adjust capacity within minutes. This, in turn, demands IP and DNS updates at much higher rates.
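The scaling decision behind this kind of programmatic capacity adjustment is simple to sketch: pick a replica count from observed demand and clamp it to safe bounds. The per-replica capacity figure and the bounds below are illustrative assumptions, not recommendations.

```python
# A minimal sketch of an autoscaling decision: replicas needed to serve
# the observed load, clamped to min/max bounds. Capacity and bounds are
# illustrative assumptions.
import math

def desired_replicas(requests_per_sec: float,
                     capacity_per_replica: float = 500.0,
                     min_replicas: int = 2,
                     max_replicas: int = 20) -> int:
    """Replica count needed for the current load, within safe bounds."""
    needed = math.ceil(requests_per_sec / capacity_per_replica)
    return max(min_replicas, min(max_replicas, needed))

print(desired_replicas(1800))  # 4 replicas for 1,800 req/s at 500 req/s each
print(desired_replicas(50))    # floor of 2 replicas keeps the service resilient
```

In practice an orchestrator such as Kubernetes evaluates a loop like this continuously, which is exactly why IP assignment and DNS registration must keep up automatically as replicas appear and disappear.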

Containers can also help enterprises redefine the economics of DDI software licensing. They make it easier to scale up network services for short time frames or temporary locations without worrying about appliance resource constraints or limits imposed artificially by licensing. And there is no need to incur the cost and delay of provisioning and deploying appliances to scale up for high-volume periods, such as Black Friday or year-end reporting.

One final benefit that should not be forgotten: because zero trust eliminates the traditional perimeter, containers lend themselves naturally to building zero-trust-compliant environments. They can also mitigate the impact of a DDoS attack, since an organisation can spin up new resources to absorb the increased workload.

As enterprises modernise their applications, network teams also need modern infrastructure to deliver network services for those applications. To make a real difference, modern application delivery requires containerisation supported by a DDI solution that is platform-agnostic, software-defined and scalable.

The best approach for network teams is to work with a provider that enables the transition from static network infrastructure with ‘big bang’ upgrades to continuous delivery of software-defined network services. That way they can use the same automation tools and infrastructure constructs as their DevOps counterparts. With cumbersome appliance upgrade paths replaced by deployment processes automated through APIs, enterprises will accelerate deployment velocity and improve efficiency, without the overhead of managing traditional appliances.
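To make the API-driven alternative concrete, a record change can be expressed as plain data that a CI pipeline sends to a DDI service's REST API instead of a manual appliance workflow. The route and field names below are hypothetical, not any specific vendor's API.

```python
# Hedged sketch of API-driven DNS management: a record update built as a
# request that a pipeline could send to a DDI REST API. The endpoint path
# and payload fields are hypothetical examples, not a real vendor API.
import json

def record_update(zone: str, host: str, record_type: str, answers: list) -> dict:
    """Build an update request for one DNS record as plain data."""
    return {
        "method": "PUT",
        "path": f"/v1/zones/{zone}/{host}/{record_type}",  # hypothetical route
        "body": json.dumps({"answers": answers, "ttl": 300}),
    }

req = record_update("internal.example", "api", "A", ["10.0.0.42"])
print(req["path"])  # /v1/zones/internal.example/api/A
```

Because the change is just data, it can be linted, reviewed and replayed like any other code artefact, which is what makes the continuous-delivery model described above workable for network services.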