Virtualization vs Cloud Computing

Cloud computing and virtualization are both technologies developed to maximize the use of computing resources while reducing their cost. They are also mentioned frequently in discussions of high availability and redundancy. Although the two terms are often used interchangeably, they are very different approaches to the problem of maximizing the use of available resources. They differ in many ways, and those differences lead to some important considerations when choosing between them.

Virtualization: More Servers on the Same Hardware

It used to be that if you needed more computing power for an application, you had to purchase additional hardware. Redundancy systems were based on having duplicate hardware sitting in standby mode in case something failed. The problem was that as CPUs grew more powerful and gained multiple cores, a lot of computing resources went unused, which cost companies a great deal of money. Enter virtualization.

Simply stated, virtualization is a technique that allows you to run more than one server on the same hardware. Typically, one server is the host and controls access to the physical server's resources. One or more virtual servers then run within containers provided by the host. The container is transparent to the virtual server, so the operating system does not need to be aware of the virtual environment. This allows servers to be consolidated, which reduces hardware costs. Fewer physical servers also mean less power consumption, which further reduces cost.

Most virtualization systems allow virtual servers to be moved easily from one physical host to another. This makes it very simple for system administrators to reconfigure servers based on resource demand or to move a virtual server off a failing physical node. Virtualization reduces complexity by reducing the number of physical hosts, but it still involves purchasing servers and software and maintaining your own infrastructure. Its greatest benefit is reducing infrastructure costs by maximizing the use of the physical resources.
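To make the migration idea concrete, here is a minimal sketch using the libvirt Python bindings, assuming a KVM/QEMU setup with two hosts reachable over SSH. The host names and the VM name are hypothetical placeholders:

```python
import libvirt

# Connect to the source and destination hypervisors
# (hypothetical host names; assumes SSH access and libvirt on both).
src = libvirt.open("qemu+ssh://host1.example.com/system")
dst = libvirt.open("qemu+ssh://host2.example.com/system")

# List the virtual servers currently defined on the source host.
for dom in src.listAllDomains():
    print(dom.name(), "running" if dom.isActive() else "stopped")

# Live-migrate one VM off the (perhaps failing) source host.
# VIR_MIGRATE_LIVE keeps the guest running during the move.
vm = src.lookupByName("web-server-01")
vm.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

src.close()
dst.close()
```

From the guest's point of view nothing changes: it keeps its operating system and configuration while the hypervisors shuffle it between physical machines.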

Cloud Computing: Measured Resources, Pay for What You Use

While virtualization may be used to provide cloud computing, cloud computing is quite different from virtualization. Cloud computing may look like virtualization because it appears that your application is running on a virtual server, detached from any reliance on a single physical host, and in that respect they are similar. However, cloud computing is better described as a service, whereas virtualization is part of a physical infrastructure.

Cloud computing grew out of the concept of utility computing. Essentially, utility computing was the belief that computing resources and hardware would become a commodity to the point that companies would purchase computing resources from a central pool and pay only for the CPU cycles, RAM, storage, and bandwidth they used. These resources would be metered to allow a pay-for-what-you-use model, much as you buy electricity from the electric company; this is how it became known as utility computing. It is common for cloud computing workloads to be distributed across many servers, which provides redundancy, high availability, and even geographic redundancy. This also makes cloud computing very flexible.
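As a rough illustration of metered billing, a monthly charge under this model is just usage multiplied by unit rates. The rates below are made-up numbers for the sketch, not any provider's actual pricing:

```python
# Hypothetical unit rates -- illustrative only, not real pricing.
RATE_PER_CPU_HOUR = 0.05      # dollars per CPU-hour
RATE_PER_GB_RAM_HOUR = 0.01   # dollars per GB-hour of RAM
RATE_PER_GB_STORED = 0.02     # dollars per GB-month of storage
RATE_PER_GB_TRANSFER = 0.09   # dollars per GB of bandwidth

def monthly_bill(cpu_hours, ram_gb_hours, storage_gb, transfer_gb):
    """Metered, pay-for-what-you-use billing: usage times unit rate."""
    return (cpu_hours * RATE_PER_CPU_HOUR
            + ram_gb_hours * RATE_PER_GB_RAM_HOUR
            + storage_gb * RATE_PER_GB_STORED
            + transfer_gb * RATE_PER_GB_TRANSFER)

# A small app: 2 vCPUs and 4 GB RAM for a 720-hour month,
# 50 GB stored, 100 GB transferred.
print(f"${monthly_bill(2 * 720, 4 * 720, 50, 100):.2f}")  # -> $110.80
```

The point of the model is in the function signature: the bill depends only on what was consumed, not on what hardware was bought.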

It is easy to add resources to your application: you simply use them, just as you use electricity when you need it. Cloud computing has been designed with scalability in mind. The biggest drawback of cloud computing is, of course, that you do not control the servers. Your data is out there in the cloud, and you have to trust the provider to keep it safe. Many cloud computing services offer SLAs that promise to deliver a level of service and safety, but it is critical to read the fine print. A failure of the cloud service could result in the loss of your data.
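For example, with a provider SDK such as AWS's boto3, adding a server is a single on-demand API call rather than a hardware purchase. A minimal sketch, assuming configured AWS credentials; the AMI ID is a placeholder:

```python
import boto3

# Assumes AWS credentials are configured (e.g. via ~/.aws/credentials).
ec2 = boto3.resource("ec2", region_name="us-east-1")

# Launch one small instance on demand; billing starts when it starts
# and stops when you terminate it -- pay for what you use.
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
print("Launched:", instances[0].id)

# When the extra capacity is no longer needed, release it.
instances[0].terminate()
```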

A Practical Comparison: Virtualization vs Cloud Computing

Virtualization

Virtualization is a technology that allows you to create multiple simulated environments or dedicated resources from a single physical hardware system. Software called a hypervisor connects directly to that hardware and allows you to split one system into separate, distinct, and secure environments known as virtual machines (VMs). These VMs rely on the hypervisor's ability to separate the machine's resources from the hardware and distribute them appropriately.
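Hypervisors such as KVM rely on hardware virtualization extensions (Intel VT-x or AMD-V). On Linux these show up as the vmx or svm CPU flags, so a quick check is possible; a small sketch, Linux-only since it reads /proc/cpuinfo:

```python
# Linux-only sketch: hardware virtualization support appears as the
# "vmx" (Intel VT-x) or "svm" (AMD-V) flag in /proc/cpuinfo.
with open("/proc/cpuinfo") as f:
    flags = set()
    for line in f:
        if line.startswith("flags"):
            flags.update(line.split(":", 1)[1].split())

if "vmx" in flags or "svm" in flags:
    print("CPU supports hardware virtualization")
else:
    print("No vmx/svm flag found; a hypervisor would fall back to emulation")
```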

Cloud Computing

Cloud computing is a set of principles and approaches for delivering compute, network, and storage infrastructure resources, services, platforms, and applications to users on demand across any network. These resources, services, and applications are sourced from clouds: pools of virtual resources orchestrated by management and automation software so that users can access them on demand through self-service portals, supported by automatic scaling and dynamic resource allocation.
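To illustrate what "automatic scaling and dynamic resource allocation" can mean in practice, here is a minimal threshold-based scaling loop. The thresholds and the simulated load samples are hypothetical; a real system would wire the decision to a provider's monitoring and provisioning APIs:

```python
# Minimal threshold-based autoscaler sketch. Thresholds are illustrative.
SCALE_UP_AT = 0.80    # add capacity above 80% average utilization
SCALE_DOWN_AT = 0.30  # release capacity below 30%
MIN_INSTANCES = 1

def autoscale(utilization: float, count: int) -> int:
    """Return the new instance count for one scaling decision."""
    if utilization > SCALE_UP_AT:
        return count + 1                      # dynamic allocation: grow
    if utilization < SCALE_DOWN_AT and count > MIN_INSTANCES:
        return count - 1                      # release unused capacity
    return count

# Simulated load samples drive the instance count up and back down.
count = 1
for load in [0.45, 0.85, 0.90, 0.60, 0.20, 0.15]:
    count = autoscale(load, count)
    print(f"load={load:.2f} -> {count} instance(s)")
```

The loop grows the pool to three instances while load is high and shrinks it back to one as load falls, which is the pay-for-what-you-use behavior described above.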