Why is virtualization needed for cloud computing?

cloud, virtualization

Can anyone explain to me why virtualization is needed for cloud computing? A single instance of IIS on Windows Server can host multiple web applications, so why do we need to run multiple instances of an OS on a single machine? How can this lead to more efficient utilization of resources, and how can the virtualization overhead be worth it? Is it strictly a matter of economics: I only have money to buy 100 machines, so I run virtualization to pretend I have 1,000 machines?

Best Answer

Virtualization is convenient for cloud computing for a variety of reasons:

  1. Cloud computing is much more than a web app running in IIS. Active Directory isn't a web app. SQL Server isn't a web app. To get the full benefit of running code in the cloud, you need the option to install a wide variety of services on the cloud nodes, just as you would in your own IT data center, and many of those services are not web apps governed by IIS. If you only look at the cloud as a web app host, you'll have difficulty building anything that isn't a web app.
  2. The folks running and administering the cloud hardware underneath the covers need ultimate authority to shut down, suspend, and occasionally relocate your cloud code to a different physical machine. If some bit of code in your cloud app goes rogue and runs out of control, it's much harder to shut down that service or that machine when the code is running directly on the physical hardware than when it's running in a VM managed by a hypervisor (see the sketch after this list).
  3. Resource utilization: multiple tenants (VMs) execute on the same physical hardware, but with much stronger isolation from each other than IIS's process walls provide. The result is a lower cost per tenant and higher income per unit of hardware; the back-of-the-envelope math below puts rough numbers on that.
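
To make point 2 concrete, here is a minimal sketch of the out-of-band control a hypervisor gives an operator. It assumes a libvirt-managed KVM/QEMU host (just one hypervisor stack among several; the answer doesn't name a specific one), and the guest name `rogue-tenant-vm` is invented for the example:

```python
# Sketch: controlling a runaway guest from the host side, with no
# cooperation from the code running inside the VM. Assumes the
# libvirt-python bindings and a local KVM/QEMU hypervisor; the guest
# name "rogue-tenant-vm" is made up for illustration.
import libvirt

conn = libvirt.open("qemu:///system")       # read-write connection to the local hypervisor
dom = conn.lookupByName("rogue-tenant-vm")  # look up the misbehaving guest

dom.suspend()      # freeze the guest's vCPUs at once; the rogue code stops running
print(dom.info())  # [state, max memory, used memory, vCPU count, CPU time]

dom.resume()       # let it run again after inspection...
dom.destroy()      # ...or hard power it off; the guest gets no vote

conn.close()
```

The same operations exist as shell commands (`virsh suspend`, `virsh shutdown`, `virsh migrate`), and none of them depend on the guest cooperating, which is exactly what you want when the code inside has gone rogue.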
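To put rough numbers on point 3, and on the economics question in the original post, here's a back-of-the-envelope sketch; every figure in it (server cost, per-tenant load, tenants per host) is an assumption chosen for illustration, not a measurement:

```python
# Back-of-the-envelope consolidation math for point 3. All the numbers
# below are invented for illustration, not benchmarks.
server_cost = 5_000   # dollars for one physical server
tenant_load = 0.10    # a lone tenant keeps a dedicated box ~10% busy
vms_per_host = 8      # tenants packed onto one host as VMs

# One tenant per physical box: the whole server is billed to one tenant.
bare_metal_cost = server_cost
# Virtualized: the same box is shared by vms_per_host isolated tenants.
virtualized_cost = server_cost / vms_per_host

print(f"bare metal:  ${bare_metal_cost:,.0f} per tenant")
print(f"virtualized: ${virtualized_cost:,.0f} per tenant")
print(f"host utilization: {tenant_load:.0%} -> "
      f"{min(vms_per_host * tenant_load, 1.0):.0%}")
```

Under those assumptions the answer to the 100-machines question is basically yes: virtualization lets you sell 100 machines as 800 isolated servers, because each tenant only needs a sliver of one, and that gain dwarfs the hypervisor overhead.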