The New Age of Virtualisation - How the Cloud is helping it fulfil its potential

By Andrew Carr, CEO Bull UK & Ireland.


Virtualisation first emerged as a means of curbing server sprawl - a scenario in which multiple under-used servers take up more space and consume more resources in the data centre than can be justified by their workload.


One of the most common causes of server sprawl was the practice of dedicating servers to single applications. Some organisations even reached the point where, every time they needed to introduce a new application, service or system into the business, they had to buy new hardware on which to run it. As the number of servers proliferated, the business’s expenditure on power and energy escalated while, at the same time, the organisation’s data centres became less and less environmentally friendly.


When a new system was brought into the data centre on a new piece of hardware, the business would typically use only 10-20% of the capability of that hardware to run it. In other words, around 80-90% of the capacity of each server went unused - a highly inefficient approach.


Virtualisation solved this problem by allowing businesses to run multiple applications securely, each in a ring-fenced virtual environment, on shared infrastructure. Organisations rapidly went from poorly utilised systems to running multiple services on a single piece of hardware, using 80-85% of the available capacity - a much more efficient result.
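To make the consolidation arithmetic concrete, here is a minimal sketch using purely illustrative figures; the server counts, utilisation levels and headroom target are assumptions, not numbers from any particular estate.

```python
import math

# Illustrative consolidation arithmetic - hypothetical figures only.

# Ten physical servers, each dedicated to a single application and running
# at roughly 15% of capacity (the 10-20% range described above).
dedicated_servers = 10
utilisation_per_server = 0.15

# Total useful work, expressed in "server equivalents".
total_workload = dedicated_servers * utilisation_per_server  # 1.5 servers' worth

# Consolidate onto virtualised hosts run at around 80% utilisation,
# leaving some headroom for peaks.
target_utilisation = 0.80
hosts_needed = math.ceil(total_workload / target_utilisation)

print(f"Workload: {total_workload:.1f} server-equivalents")
print(f"Hosts needed at {target_utilisation:.0%} utilisation: {hosts_needed}")
# With these assumed figures, two well-utilised hosts replace ten
# under-used dedicated servers.
```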


Organisations quickly found three additional benefits after virtualising their estate: reduced time to provision an operating system, easier disaster recovery and reduced licensing costs for certain applications.


IT departments could build virtual servers in minutes, rather than following the old process of acquiring hardware, racking and stacking it and then building the environment. Disaster recovery was also made easier: if a virtual server, or the hardware platform it was running on, failed, the virtual server could be moved onto a different piece of hardware and be back up and running very quickly.
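As a rough illustration of how a virtual server can be defined in minutes and later moved to another host, here is a sketch using the libvirt Python bindings; the domain XML, host names and connection URIs are placeholders, and the details a real deployment would need (full disk and network definitions, shared storage for live migration) are omitted.

```python
import libvirt

# Connect to the local KVM/QEMU hypervisor (example URI).
conn = libvirt.open("qemu:///system")

# A deliberately minimal domain definition; a real one would also
# describe disks, networks and more precise CPU/memory sizing.
domain_xml = """
<domain type='kvm'>
  <name>app-server-01</name>
  <memory unit='GiB'>4</memory>
  <vcpu>2</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
</domain>
"""

# Define and start the virtual server - minutes of work compared with
# procuring, racking and building a physical machine.
dom = conn.defineXML(domain_xml)
dom.create()

# If the underlying hardware fails or is being refreshed, the same
# virtual server can be live-migrated to another host rather than
# rebuilt from scratch (assumes shared storage between the hosts).
dest = libvirt.open("qemu+ssh://standby-host/system")
dom.migrate(dest, libvirt.VIR_MIGRATE_LIVE, None, None, 0)
```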


Without virtualisation, the business would probably have to rebuild the whole system from scratch, an immensely time-consuming and costly process. Virtualisation also offered greater flexibility to businesses by enabling them to quickly and easily move workloads from legacy equipment to next generation hardware.


Some software vendors have a licence model under which they charge customers by the number of processors they run their software on. This can be a significant problem for organisations not using virtualisation: as they add more hardware to their data centre, they typically add more processors, and licence and maintenance charges increase accordingly. Virtualisation not only reduces the amount of hardware in the data centre, which in turn reduces the licence cost of the software itself, but it can also ring-fence the amount of processing power an application can use and thus restrict software licence costs.
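A small sketch of that licensing arithmetic follows, with entirely hypothetical prices and server counts, and assuming the favourable case where the vendor licenses only the processors (or vCPUs) the application is ring-fenced to.

```python
# Hypothetical per-processor licensing arithmetic - illustrative only.

licence_per_processor = 5_000  # assumed annual cost per licensed processor

# Before virtualisation: the application is spread across dedicated
# physical servers, and every processor in them attracts a licence.
physical_servers = 8
processors_per_server = 4
licensed_before = physical_servers * processors_per_server

# After virtualisation: the application runs in a virtual machine that is
# ring-fenced to a fixed number of virtual CPUs, so only those are
# licensed (licensing terms vary by vendor - this is the favourable case).
vcpus_allocated = 8
licensed_after = vcpus_allocated

print(f"Licence cost before consolidation: {licensed_before * licence_per_processor:,}")
print(f"Licence cost after ring-fencing:   {licensed_after * licence_per_processor:,}")
```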


Moving to the Cloud
Businesses are now looking to virtualise more aggressively. They want to move their mission-critical applications from legacy to more modern infrastructure, but they are nervous about taking applications from a traditional mainframe or UNIX environment and running them on the latest generation of microprocessor architectures. Businesses have not felt confident that the right combination of infrastructure and virtualisation has been in place to give the scalability, stability and availability required to run mission-critical applications.


So while businesses have got smarter with the way they manage their virtual infrastructure and have brought in better tools to do the job, the market has been quite stagnant in terms of organisations adopting completely new approaches.


However, we are now on the cusp of the next wave of virtualisation, which I foresee lasting some three years or so, in which organisations will migrate their mission-critical systems into a virtual environment. This in turn will create a virtualisation model that can be more closely aligned with cloud models - a case of virtualisation effectively acting as a key landmark on an ongoing journey to the cloud.


So what needs to happen to kick-start this process and enable businesses to make this move? The answer comes in the shape of next-generation dedicated x86 servers built specifically to run mission-critical applications in a virtual environment.
These solutions will give customers the confidence to move from legacy to modern infrastructure that delivers the right level of performance and accessibility without compromising the availability of their mission-critical applications.


Confidence is the key here. Up to now, organisations have focused on virtualising storage or environments like SharePoint or Exchange - the areas they perceive to be relatively simple and risk-free to move to a virtual environment.


If, in contrast, you are asking a large global company that has run its enterprise resource planning (ERP) system on a mainframe for 30 years to virtualise that application, it needs to have total confidence in the underlying infrastructure. So the key to getting businesses to migrate is not so much the technical capability of virtualisation; it is more a matter of confidence that the hardware itself can support mission-critical applications to the level people expect from proprietary operating systems.


Today, we increasingly see virtualisation as a vital staging point, in many ways the cornerstone of the journey to full cloud computing. So, for vendors looking to guide businesses on a roadmap to the cloud, it is vital that they are able to reassure their clients that virtualisation is a safe, secure and beneficial approach for them.


Delivering on its Promise?
So is virtualisation today delivering on the promise it offered businesses when it first emerged many years ago? In my view, while it has not fully done so yet, it will do so in the future. We have seen the initial wave of adoption, targeted at reducing server sprawl and virtualising storage and some other key applications. The virtualisation market has since stagnated to some extent as businesses focused on discussing the benefits of moving to full cloud computing.


Yet there is a natural step in the middle: moving into that mission-critical virtualisation space, and then into how you orchestrate, track and monitor usage in the virtualised environment to create a cloud, consumption-based model. Over the next three to five years, we will see a significant upsurge in the uptake of virtualisation as businesses move through the virtualisation phase and then ‘onwards and upwards’ to a full cloud model.
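As a simple illustration of the ‘track and monitor usage’ step, the sketch below shows how per-VM metering records might be rolled up into a consumption-based charge; the record format, unit rates and figures are all invented for the example.

```python
from dataclasses import dataclass

# Hypothetical metering records and unit rates - illustration only.

@dataclass
class UsageRecord:
    vm_name: str
    vcpu_hours: float
    gb_storage_hours: float

RATES = {"vcpu_hour": 0.05, "gb_storage_hour": 0.001}  # assumed unit prices

usage = [
    UsageRecord("erp-prod", vcpu_hours=2880.0, gb_storage_hours=720000.0),
    UsageRecord("crm-prod", vcpu_hours=1440.0, gb_storage_hours=360000.0),
]

def monthly_charge(record: UsageRecord) -> float:
    """Turn raw metered usage into a consumption-based charge."""
    return (record.vcpu_hours * RATES["vcpu_hour"]
            + record.gb_storage_hours * RATES["gb_storage_hour"])

for record in usage:
    print(f"{record.vm_name}: {monthly_charge(record):,.2f}")
```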


Blurring the Boundaries
Indeed, the terms virtualisation and cloud are becoming increasingly synonymous. In the future, we are unlikely to talk so much about virtualisation as a discrete IT solution or offering; it will all be ‘wrapped up’ in discussions around different types of delivery model.


Today, thanks to the perceived cost, efficiency and environmental benefits, virtualisation remains a major talking point in the IT world. In three to five years this will have changed, because virtualisation will simply have become the de facto standard, and customers will no longer worry about what the underlying infrastructure is because they will be buying the functionality as a service rather than migrating to it themselves. As long as the customer can be confident that the CRM or ERP system is available and up and running 99.999% of the time, backed by service-level guarantees, they won’t care what sits beneath it.
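For context, a quick calculation of the downtime permitted per year at a few common availability levels shows why ‘five nines’ is such a demanding promise.

```python
# Downtime permitted per year at common availability levels.
MINUTES_PER_YEAR = 365.25 * 24 * 60

for availability in (0.99, 0.999, 0.9999, 0.99999):
    downtime = MINUTES_PER_YEAR * (1 - availability)
    print(f"{availability:.3%} availability -> "
          f"{downtime:,.1f} minutes of downtime per year")

# 99.999% ("five nines") leaves only around five minutes a year,
# which is why service-level guarantees matter so much here.
```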


So the onus will shift to the service provider, and it is this group that will take the benefit of virtualising to deliver services to customers. The service provider will effectively act like a travel agent, giving their customer the benefit of their knowledge and access to services and taking them on a journey. The customer won’t care, as long as the journey is within their budget, they get the service they are expecting and they are happy with the end result.


Once this approach becomes the norm, we will no longer talk about virtualisation projects. Paradoxically, it is virtualisation’s success as an IT model that will lead to it disappearing as a hot topic in the IT world. As businesses increasingly move to a virtual environment and the lines between virtualisation and the cloud become increasingly blurred, the term ‘virtualisation’ may disappear but the benefits delivered by a virtualised infrastructure will become ever more compelling.
 
