Welcome to TechnologyProfessional.Org

Introduction to Virtualization

November 5, 2008

Virtualization: More Important Than You Think

By: Phil Stevens

The following is the first in a series of articles on infrastructure virtualization. A new article will be added to the series each month.

The traditional approach to enterprise computing has placed each new application on dedicated hardware for each business unit or function.  The result has been rapid growth in the size and complexity of the enterprise technology environment despite low utilization.  Change has become slower and increasingly difficult.  Continuing on that course would require ever-growing staff and new layers of infrastructure, an approach that is not sustainable.

Technology is seldom a “silver bullet”; however, it can provide a foundation on which a better solution can be built.  Virtualization provides a set of tools that were developed over many years and have recently matured into an efficient, effective, integrated whole.  This article previews the series to come and outlines how virtualization can be leveraged to reduce cost, improve responsiveness, and improve operational performance.

What Is Virtualization?

Virtualization introduces logical devices, which are typically software based, to replace physical devices.  For example, multiple virtual computers can run on one physical computer.  Virtualization can be applied at various levels: operating system, application, server, management, network, and storage.  Until recently, the technology was limited by a lack of integration in the management infrastructure and the absence of a holistic architecture appropriate for the enterprise.  Fortunately, important developments have made it possible to implement virtualization holistically, that is, to virtualize technology and business services rather than just components.
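The logical-to-physical mapping described above can be sketched as a toy model.  Everything here (the class names, the capacity figures, the placement check) is hypothetical and illustrative, not a real hypervisor API:

```python
# Toy sketch: several logical (virtual) machines sharing one physical host.
# All names and capacity numbers are made up for illustration.

class VirtualMachine:
    def __init__(self, name, cpu_cores, ram_gb):
        self.name = name
        self.cpu_cores = cpu_cores
        self.ram_gb = ram_gb

class PhysicalHost:
    def __init__(self, cpu_cores, ram_gb):
        self.cpu_cores = cpu_cores
        self.ram_gb = ram_gb
        self.guests = []

    def can_place(self, vm):
        # A guest fits only if the host still has spare CPU and memory.
        used_cpu = sum(g.cpu_cores for g in self.guests)
        used_ram = sum(g.ram_gb for g in self.guests)
        return (used_cpu + vm.cpu_cores <= self.cpu_cores and
                used_ram + vm.ram_gb <= self.ram_gb)

    def place(self, vm):
        if not self.can_place(vm):
            raise ValueError(f"host cannot fit {vm.name}")
        self.guests.append(vm)

# One physical box carrying three logical machines' worth of work.
host = PhysicalHost(cpu_cores=8, ram_gb=32)
for name in ("web", "mail", "db"):
    host.place(VirtualMachine(name, cpu_cores=2, ram_gb=8))

print([g.name for g in host.guests])
```

A real hypervisor does far more (scheduling, memory overcommit, isolation), but the core idea is the same: logical machines are entries in a data structure, placed against the capacity of physical hardware.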

What Drives Virtualization and Why Now?

Virtualization is not a new concept; it has been written about and debated for years.  In Good to Great, Jim Collins explains that exponential change appears to start out slowly and then rocket upward suddenly.  Collins states that a great deal of groundwork must be completed, but it goes unnoticed until an inflection point is reached.  The actual impact of virtualization will overshadow what has been seen to date, and it will have major implications for the enterprise environment for years to come.

We have to ask: Why now?  To answer that question, several key factors have to be considered:

  • Declining Cost and Advances in Processor Technology: Moore’s law predicts that the power of computers will continue to grow while the cost of a unit of computing power declines.  At the same time, major advances in multi-core and multi-threading technology have put tremendous power inside a single desktop or server.  The extra power makes it possible to run many computers’ worth of work on a single server at a fraction of the cost of a mainframe.
  • Internet and Increasing Network Bandwidth: Not long ago, a branch location might have operated with a 56Kbps frame relay connection to the data center, or a company might have had a 1.5Mbps point-to-point T1 to a business partner.  Today, many organizations have 1.5Mbps or faster connections to the branch and a 45Mbps or faster Internet connection for business-to-business data exchange. Declining prices, improving availability, and the any-to-any nature of the Internet have provided exciting new options.
  • Open Source Software: With traditional software licensing, running 10 instances of an application on 10 virtual servers would require 10 licenses at 10 times the cost, thus limiting the practicality of virtualization.  Open source software is “free” for many uses, and the availability of the source code provides an unprecedented level of flexibility.  There are significant costs involved in bringing any software package into an enterprise, even free software.  However, open source software enables the full potential of virtualization by eliminating cost-per-license as a constraint.
  • Management Tools: Early tools that managed virtual components were stand-alone and required human oversight.  These tools are now being integrated with enterprise management suites and are becoming more autonomic, a term IBM uses for a self-managing environment.  These developments are critical because they help solve the scaling problem of the current environment.
  • New Architectures: New architectures are being developed to take advantage of virtualization.  One example is Software as a Service (SaaS), which makes software functions accessible independent of the details of how or where they are performed.  These architectures and capabilities are being built into products hitting the market now.
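The licensing factor above can be made concrete with simple arithmetic.  The dollar figures below are invented purely for illustration:

```python
# Hypothetical cost comparison: per-instance licensing vs. open source,
# for 10 application instances on 10 virtual servers.  All figures are made up.
instances = 10
per_instance_license = 5000      # illustrative per-copy license fee
adoption_cost = 8000             # illustrative one-time integration/support cost,
                                 # incurred either way (even "free" software has it)

proprietary_total = instances * per_instance_license + adoption_cost
open_source_total = 0 * instances + adoption_cost  # no per-copy fee

print(proprietary_total)  # 58000
print(open_source_total)  # 8000
```

The point is not the specific numbers but the shape of the curve: under per-instance licensing, cost grows linearly with the number of virtual servers, while open source removes that per-copy term entirely.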

Each of these factors is important, but together they create a powerful confluence that enables a dramatic, long-term change in enterprise computing.

Benefits of Virtualization

Properly and holistically implemented, this new generation of virtualization will provide important benefits:

  • Increased Efficiency: A traditional IT environment must be sized to handle peak loads, such as the Christmas selling season or month-end processing.  Since peak loads tend to occur for a small portion of the year, the hardware is underutilized most of the time.  Virtualization allows multiple systems with different usage characteristics to share the same infrastructure, resulting in more efficient utilization.
  • Improved Scalability and Flexibility: Virtualization enables an organization to scale up or scale down, allocating resources as needed.  This translates to more flexibility for IT and quicker time to market for the business.
  • Enhanced Manageability: Virtualization tools make it straightforward to create a new virtual system from a well-defined system image.  Updating that image helps ensure all virtual environments stay current and secure.  Accomplishing tasks more efficiently also reduces the pressure to expedite processes, which is a common cause of drift from standards.
  • Improved Availability: More consistent images and the ability to easily deploy updates improve both stability and security.  Furthermore, the ability to quickly spin up a new system lets the operations team bring up a replacement before taking a system down for maintenance, further increasing uptime.
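The efficiency benefit above can be sketched numerically.  The monthly load profiles below are hypothetical, chosen only so that the three applications peak in different months:

```python
# Hypothetical monthly load (fraction of one server's capacity) for three
# applications whose peaks do not coincide.  All figures are illustrative.
months = 12
retail    = [0.2] * 10 + [0.9, 0.9]              # peaks in the holiday season
reporting = [0.9, 0.9] + [0.2] * 10              # peaks at fiscal year end
batch     = [0.2] * 5 + [0.9, 0.9] + [0.2] * 5   # mid-year peak

# Dedicated hardware: each application gets a server sized for its own peak.
dedicated_capacity = 3  # three full servers
dedicated_avg_util = (sum(sum(app) for app in (retail, reporting, batch))
                      / (dedicated_capacity * months))

# Shared (virtualized) hardware: size for the combined peak instead.
combined = [r + p + b for r, p, b in zip(retail, reporting, batch)]
shared_capacity = max(combined)  # capacity (in server units) for the worst month
shared_avg_util = sum(combined) / (shared_capacity * months)

print(round(dedicated_avg_util, 2))  # 0.32
print(round(shared_avg_util, 2))     # 0.73
```

Because the peaks do not coincide, the shared pool needs far less total capacity than three dedicated servers, and average utilization more than doubles, which is exactly the efficiency argument made above.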

In short, virtualization can reduce long-term cost (or at least flatten the curve of cost increase) and maximize the potential of IT to support the business strategy.

Obstacles Lie Ahead

Virtualization presents challenges to overcome and areas where further development is required:

  • Component vs. Service Virtualization: Despite recent advances, most of the industry’s experience is in implementing virtualization on a component-by-component basis.  It will take time before a successful, holistic implementation is a foregone conclusion; in the meantime, anticipate a learning curve.
  • Initial Investment: Virtualization will mitigate the ever-increasing costs of technology sprawl in a traditional environment.  Depending on a company’s starting point, virtualization can reduce costs; however, a significant up-front capital investment will be necessary.  Short-term benefits are likely to be related to stability, flexibility, and responsiveness rather than lower cost.
  • Software Issues: Not all software vendors have adapted their license models to accommodate virtualization.  Until vendors do so, estimates for virtualization project costs will be variable.


A confluence of events will reshape enterprise IT in the coming years.  The remainder of this series will explore the role virtualization plays in the key areas of storage, network, servers, and desktops, and how the landscape of the business world’s computing infrastructure could change when virtualization is holistically implemented.
