
By Cathy Gadecki, Senior Director Data Centre, Juniper Networks

Enterprise IT strategies have travelled a long way in the past decade. Where once every function was managed within an on-premises data centre, IT has evolved towards an increasingly distributed and far more nuanced approach.

On the one hand, there is public cloud computing, a model that has matured to the point where enterprises now spread workloads across multiple public cloud providers according to suitability. This migration has been truly transformational, delivering advantages to IT managers as well as an improved experience for users of applications and services. Workforces these days are as likely to be operating remotely as from a branch or head office, and they expect consistent, seamless access to essential tools regardless of location. The right public cloud mix can enable this.

But that’s by no means the whole story. Many enterprises are seeing fit to retain certain essential functions in privately owned and operated data centres. This so-called private cloud model often runs concurrently with use of public cloud services. So why run both models? It’s largely a matter of practicality and economics. Findings from independent consulting firm ACG Research, commissioned by Juniper Networks, indicate that applications with high data transfer requirements and a demanding compute overhead are much more expensive to run in a public cloud versus a private cloud. Conversely, applications with less intensive requirements are more cost-effective in a public cloud.
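The shape of that finding can be illustrated with a simple break-even sketch. Every figure below is a hypothetical assumption for illustration only, not a real cloud price list and not a number from the ACG Research study: public cloud is modelled as low fixed cost but metered compute and data egress, while private cloud is modelled as a high amortised fixed cost with a low marginal cost and no egress fees.

```python
# Illustrative break-even sketch of public vs private cloud economics.
# All rates and workloads are hypothetical assumptions, chosen only to
# show the crossover, not actual provider pricing.

def monthly_public_cost(compute_hours, gb_transferred,
                        rate_per_hour=0.40, egress_per_gb=0.09):
    """Public cloud: pay-as-you-go compute plus per-GB data egress."""
    return compute_hours * rate_per_hour + gb_transferred * egress_per_gb

def monthly_private_cost(compute_hours,
                         amortised_fixed=3000.0, marginal_per_hour=0.05):
    """Private cloud: large fixed cost (amortised capex plus operations),
    small marginal cost per compute hour, no egress charge in-house."""
    return amortised_fixed + compute_hours * marginal_per_hour

# A light workload: modest compute, little data movement.
light_public = monthly_public_cost(500, 200)       # 500*0.40 + 200*0.09
light_private = monthly_private_cost(500)          # 3000 + 500*0.05

# A heavy workload: sustained compute and large data transfer.
heavy_public = monthly_public_cost(20_000, 50_000)  # 8000 + 4500
heavy_private = monthly_private_cost(20_000)        # 3000 + 1000

print(light_public < light_private)   # light workload cheaper in public cloud
print(heavy_public > heavy_private)   # heavy workload cheaper in private cloud
```

Under these assumed rates the light workload costs far less in the public cloud, while the data- and compute-intensive workload crosses the break-even point and becomes cheaper to run privately, which is the pattern the ACG findings describe.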

IT Strategies: An Ongoing Philosophical Tussle

There is also the matter of changing trends. Visualise the story of the last several decades as a tussle between competing philosophies. There has been an uneasy balancing act between building an organisation’s own resources and outsourcing, between in-house and off-site, between data centre and cloud, between centralised and distributed, between open and proprietary, between mobile and fixed, between head office and remote office, and so on. It has been an evolutionary journey, from mainframes and timesharing to client-server and Big Data clusters, with data and compute swinging back and forth between centralised and distributed models.

Enterprise IT can seem to be moving with the tide in one direction, only to be pulled back by altered market realities the opposite way. Who, for example, foresaw that a pandemic would come from nowhere and light a fire under cloud migration strategies? Who anticipated the trend towards edge cloud buildouts or the rise in AI and ML workloads that trigger yet another rethink of best practices?

It is no surprise, then, that enterprises are learning to rely on a mesh of approaches as they strive to get the best out of their IT funds. They don’t do this to make their lives easier: organisations have complex IT because they have complex needs. A hybrid or multicloud approach can involve difficult decisions and dilemmas, and it demands experts capable of sorting through them. If organisations are paying for multiple public cloud services, how can they be sure which function is best suited to which cloud? What is best retained in-house? And what are the economics of the whole thing? How do they determine whether value is being derived? In the interplay between private and public models, are they achieving all the efficiencies they hoped for? And in the final analysis, to what extent are they in control, fully secured, legally compliant and future-proofed? These are all questions that need to be carefully considered and effectively addressed as a cloud strategy is formulated.

Balancing Cost, Experience and Performance

There are a few warnings to bear in mind here. For one thing, the most cost-effective and expedient option is unlikely to prove the best overall. With cloud migration comes efficiency, but efficiency has a price. Then there is the matter of security. It has long been seen as the weak point of any public cloud strategy, and it remains difficult to provision and manage cloud-based security that meets the same levels of assurance as a data centre. In running a hybrid cloud policy, one also needs to draw a line between what is core and what is peripheral. Even with major multinational organisations, such as banks, starting to move essential applications over to the cloud, there is still the perceived risk in many minds of outsourcing the ‘family silver’. Does cloud come with the right kind of SLAs, and what do those add to the cost?

Additionally, there is the issue of user experience and performance. Cloud generally moves compute power closer to the user, but the obvious benefit this brings needs to be weighed against other considerations, such as cost and security. What do these changes mean for the IT skills needed to run the resulting solutions mix? Smarter technology, such as AI and automation, is perhaps the antidote to the overload of repetitive IT work and a shortage of IT experts.

The future of IT is about managing all these nuances and deploying the latest technologies to enable true visibility, availability, security and agility. This will mean thinking beyond the traditional data centre model for the network and finding the right mix among the options for the organisation. It will entail rethinking data centre operations and moving towards the automation of everything, from the level of design and deployment right through to everyday operations and assurance. It is about managing all the complexity as a single entity, giving the enterprise a complete, simplified view of what’s happening every day in its digital business.