As a technical CIO / CTO / consultant, I can tell you that cloud computing has definitely come to the forefront of technology today. Yet many people have tried VMware Player on their desktops, or even VirtualBox or Parallels, assumed that this is cloud computing, and written it off because they didn't get the performance they expected.
The reality is that this is virtual computing, not cloud computing, and people mix the two up. Virtual computing lets you run multiple virtual machines on a single desktop or server, but that alone is far from cloud computing. It doesn't pool and manage resources the way a cloud does.
The two can't be compared directly. Desktop virtualization is used for running a virtual machine locally. It's good for testing and tech-support purposes, but beyond that it isn't really meant for much more, whereas a cloud computing infrastructure manages resources in a fundamentally different way.
Cloud computing maximizes hardware resources across many servers, pooling capacity and provisioning it on demand rather than tying everything to one box the way a desktop virtualization setup does. So please don't confuse the two.
It's like comparing a cheap screwdriver to a $500 DeWalt power drill. There just is no comparison.