By Nelson Hsu, Principal
In today’s market we are seeing broad adoption of “virtualization,” and vendors are moving to meet the demand. Recently Microsoft acquired Softricity and EMC/VMWare acquired Akimbi, while VMWare itself is growing at 77% annually. These moves by industry leaders are driven by specific application demands such as real-time information, real-time systems provisioning, compliance and the compounding growth of data stores. We will explore these market dynamics further, but first we need to ask: what is virtualization?
Virtualization can take many forms but foremost, it is the ability to provide a shared pool of resources – CPU, disk, data, and applications – that can be used by any application on demand. According to Wikipedia, a basic definition of virtualization is:
“The process of presenting a logical grouping or subset of computing resources so that they can be accessed in ways that give benefits over the original configuration. This new virtual view of the resources is not restricted by the implementation, geographic location or the physical configuration of underlying resources. Commonly virtualized resources include computing power and data storage.”
Let us first take a pragmatic approach to virtualization for enterprise IT. Industry averages put server utilization at roughly 15-20%. Obviously we would like to raise that utilization to 80-90%, but how can we do that? While an enterprise may have a number of servers sitting idle, there are applications (e.g. business intelligence reporting, margin calls, interest rate analytics) starving for those unused compute cycles.
One would like to think it is the virtues of virtualization that have finally captured the enterprise’s attention, but a simple case of ROI is the initial draw. The raw ROI equates to what enterprises can save through server consolidation, turning unused computing cycles into virtual resources that meet increasing real-time application requirements. This demand spans a multitude of applications, including today’s business intelligence and data warehouse initiatives. A Fortune 1000 retailer recently realized a 27:1 server consolidation; the power consumption savings alone are estimated at over six figures. Compound that with reduced network administration, space savings and licensing costs as a few additional ROI factors. Even in this simplistic form, the ROI is fairly compelling.
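The raw consolidation ROI above reduces to simple arithmetic. A minimal sketch follows; the per-server cost figures and server counts are illustrative assumptions, not data from the retailer mentioned:

```python
# Hypothetical server-consolidation ROI sketch; all dollar figures are
# illustrative assumptions, not vendor or customer data.
def consolidation_savings(physical_servers, consolidation_ratio,
                          power_cost_per_server=1200,   # assumed $/yr per server
                          admin_cost_per_server=2500):  # assumed $/yr per server
    """Estimate annual savings from consolidating physical servers
    onto a smaller number of virtualized hosts."""
    hosts_after = -(-physical_servers // consolidation_ratio)  # ceiling division
    retired = physical_servers - hosts_after
    return retired * (power_cost_per_server + admin_cost_per_server)

# e.g. 270 servers at a 27:1 ratio leaves 10 hosts, retiring 260
print(consolidation_savings(270, 27))
```

At these assumed rates the retired servers alone save well into six figures annually, before counting space or license savings.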
By leveraging a virtualization platform, IT can also gain ground on meeting the mandates of Sarbanes-Oxley, Basel II, Gramm-Leach-Bliley, HIPAA and other regulatory requirements. For instance, assume you have a complete trading environment for tracking interest rates, bonds, stocks and fluctuating dollar valuations. As you upgrade to a new system, you need to be sure that the prior system remains completely auditable in the future. By utilizing a virtualized architecture, you can encapsulate the entire system into a VMWare file for complete access and reporting. This keeps the system completely compatible while minimizing (if not eliminating) any software, operating system or application versioning and management issues.
In another example, if you happen to be the largest online retail or auction site in the world, you need to track and store all of the transactions that have occurred. This requires the ability to replicate entire systems, applications and data. With a virtualized architecture you can have access to this information in a matter of minutes, rather than building an entire systems environment from scratch, which could take hours if not days.
Consider growth and demand for compute cycles. Instead of upgrading to the next class of mainframe server, or adding a dedicated server for each new or growing application that demands greater processing power, why not leverage the unused compute cycles within your worldwide IT infrastructure? Further, if you are running IBM mainframes with logical partitioning (LPARs), software such as databases is licensed per LPAR. If you create multiple virtual server instances within an LPAR, you pay for a single database license for that LPAR rather than one per virtual machine.
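The per-LPAR licensing point can be put in numbers. The license price and instance count below are hypothetical, chosen only to show the shape of the comparison:

```python
# Illustrative comparison of per-LPAR vs. per-virtual-machine database
# licensing. The license price is an assumption, not a vendor quote.
LICENSE_PER_LPAR = 40_000  # assumed cost of one database license per LPAR
vms_per_lpar = 8           # virtual server instances hosted in one LPAR

cost_per_lpar_model = LICENSE_PER_LPAR               # one license covers the LPAR
cost_per_vm_model = LICENSE_PER_LPAR * vms_per_lpar  # if each VM needed its own

print(cost_per_lpar_model, cost_per_vm_model)  # 40000 320000
```

Under these assumptions, packing eight virtual instances into one LPAR avoids paying eight times over for the same database software.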
Not only is the ROI case for server consolidation clear, but one easily recognizes the ROI benefit for business continuity and disaster recovery planning. The historic methods of data replication, mirroring, and backup and recovery are proven but complex and costly. They require redundant servers, disk farms, administration and a separate infrastructure – none of which is inexpensive. Here the cost of the parts exceeds what the sum would suggest: the license cost alone for replication software can be staggering.
With virtualization, if one server is not available, another is dynamically provisioned to take over the processing requested by an application. As part of a virtualization technology stack, you also implement storage virtualization: the data is accessible from all servers within the architecture, so data availability is consistent. New storage capabilities for SAN environments are forthcoming, and companies such as Incipient hold storage virtualization patents to deliver greater storage efficiencies. Virtualization also allows one to encapsulate an entire computing environment – applications, data and tools – within one backup file and have it readily available in case of a disaster or even an unplanned audit. Instead of the tremendous cost of recovering unsupported software versions and each individual component within the enterprise, you can recover the entire environment in a matter of hours or a day, compared to potentially weeks.
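The failover behavior just described can be sketched in a few lines. The host names and health check here are hypothetical stand-ins, not calls into any real virtualization management API:

```python
# Minimal failover sketch: if the host running a workload fails, pick
# another host from the shared pool. Pool names, the workload name and
# the health check are all invented for illustration.
def place_workload(workload, hosts, healthy):
    """Return the first healthy host for the workload, simulating
    dynamic re-provisioning when the current host is unavailable."""
    for host in hosts:
        if healthy(host):
            return host
    raise RuntimeError(f"no healthy host available for {workload}")

pool = ["esx-01", "esx-02", "esx-03"]
down = {"esx-01"}  # simulate a hardware failure on the first host
chosen = place_workload("risk-engine", pool, lambda h: h not in down)
print(chosen)  # esx-02
```

A real implementation would also re-attach the workload’s virtualized storage, which is why storage virtualization belongs in the same stack: the data must be reachable from whichever host takes over.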
Everything comes in cycles
Do you remember the centralized versus distributed computing dilemma? The mainframe versus mini debate? The era of client/server? One could go on contemplating the number of computing paradigm shifts. Perhaps what we are really witnessing is a convergence of applications, middleware, data and hardware, and the desire to create a pool of services (perhaps the evolution of SOA as the paradigm) that can be utilized by all applications and users regardless of their physical location or programmatic make-up. What has not been fully realized, however, is the opportunity for virtualization. It is not merely a technology but a computing platform. Today you can run on a virtual platform such as VMWare’s ESX, but you need to look beyond the platform and up the technology stack to the remaining technologies that should take advantage of virtualization: middleware components, the data, the way data is stored and accessed and, of course, the applications. One can argue that client applications such as a dashboard will not change from a technical perspective, but the underlying application architecture will.
The way applications are delivered is changing, and the full magnitude of what this means remains to be seen, but we continue to see positive indicators. Case in point: VMWare (EMC) previously announced the acquisition of Akimbi, a small California start-up that delivers virtual environments for testing applications and software. Think about the impact virtualization could have on the software-as-a-service (SaaS) model. If you built a business using Akimbi products to deliver test and QA platforms as a service, customers would no longer need to wait for IT approvals, server allocation or acquisition, or server configuration and updates. SaaS could deliver this outsourced model in a day, letting you test the latest patches on earlier versions, test the latest builds and reduce time to deployment. A similar offering is delivered by Surgient of Austin, Texas.
Dynamic provisioning for applications is proving to be a significant driver. An associated shift in data analysis and computing is stream processing, where data is analyzed and acted upon immediately as it is fed. This is a significant change from rules-based engines, where data is stored and then acted upon based on pre-defined rules. With stream processing the data is never stored but is analyzed in-stream and immediately acted upon. The compute cycles required can be intensive, and companies like StreamBase, which delivers a stream processing engine, could change the paradigm of analytics, especially for financial services.
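The store-then-act versus act-in-stream distinction can be sketched as follows. This is a toy generator, not StreamBase’s API; the tick values and alert threshold are invented for illustration:

```python
# Sketch of the in-stream model described above: act on each tick as it
# arrives, without ever storing the stream. Feed and threshold are
# illustrative, not real market data.
def watch_rates(ticks, threshold):
    """Yield an alert the moment a rate move exceeds the threshold."""
    prev = None
    for rate in ticks:
        if prev is not None and abs(rate - prev) >= threshold:
            yield f"alert: rate moved {rate - prev:+.2f} to {rate}"
        prev = rate

feed = [5.00, 5.01, 5.26, 5.25]  # simulated interest-rate ticks
for alert in watch_rates(feed, 0.20):
    print(alert)
```

Nothing is written to a data store: each tick is consumed, compared and discarded, which is exactly why the model trades storage cost for compute cycles – and why a pool of virtual CPU capacity suits it well.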
When you have an environment that hinges on real-time information such as changing interest rates, you want to be event-driven in real time, which could save you millions of dollars. With virtualization you can leverage virtual CPU cycles to analyze the stream of data. It would be hard to characterize data as being more real-time than this. Further, as business intelligence for the masses takes hold, more end users will be analyzing real-time data in pursuit of operational BI and business performance management, putting further demand on the IT infrastructure.
Today, Microsoft is pushing hard as a platform provider with its Windows Virtual Server 2005 R2 and with its pending acquisition of Softricity, which provides SoftGrid for application virtualization. We can only imagine the potential effect this will have on the market in the coming years. One should also note that Microsoft has enhanced its licensing to cover four virtual server instances on each licensed Microsoft server: at any one time applications can access a pool of four virtual servers and their resources for a 4:1 savings on server licensing. It’s not new math.
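The licensing arithmetic really is simple. A quick sketch, with illustrative VM counts rather than any particular deployment:

```python
# Back-of-the-envelope math for a 4-instances-per-license model like the
# one described above. VM counts are illustrative.
import math

def licenses_needed(vm_count, instances_per_license=4):
    """Licenses required when each license covers a fixed number of VMs."""
    return math.ceil(vm_count / instances_per_license)

print(licenses_needed(16), licenses_needed(18))  # 4 5
```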
Given the dynamics of Microsoft’s model, a competing model to consider is, of course, open source. Today the prominent player is XenSource, which is leading the charge on Linux virtualization. CEO Peter Levine stated that his company would launch a virtualization product, XenEnterprise, to ease the process of building virtual machines based on the Xen 3.0 open source code. Of course, when we speak of open source we have to include Red Hat, which has announced support for XenSource by embedding its hypervisor into its next release – another data point making the case for virtualization coming of age. A hypervisor is the virtualization software layer that provides an efficient way of running virtual machines on a server.
One challenge that remains is the old nemesis: interoperability between virtual platforms. Both XenSource and VMWare products are built on hypervisors. To fully take advantage of virtualization and its ability to deliver system failover and business continuity, data center managers would like to run different vendors’ virtual machines on one piece of hardware. Unfortunately, today a virtual machine built for one vendor’s hypervisor cannot be moved to a system running another’s. This somewhat marginalizes the case for server consolidation, a top priority for data center managers looking to implement virtualization. Work is under way to allow hypervisors from Microsoft (Viridian – not due to release until 2007/2008), VMWare (ESX) and XenSource (Xen) to work in conjunction within the same datacenter. This would make it possible for a Viridian virtual machine whose underlying hardware fails to be moved automatically to a machine running VMWare and remain fully operational, thus maintaining business continuity.
Major enterprises from retail to insurance and financial services are implementing virtualization architectures for consolidation and disaster recovery. Hurwitz & Associates believes the proliferation of applications in business intelligence, text analytics, data mining and compliance are all key drivers. It is when elements such as middleware, data stores, storage and applications converge that enterprises will reap the business advantages of leveraging the entire virtualization technology stack. Middleware and databases are not sexy, but the demands on applications and on how enterprises will conduct business going forward make this a truly significant paradigm shift for enterprise IT.