Before virtualization, the best practice was to run one application per physical server; as a result, servers typically ran at only 5-15% utilization. This gross underutilization translated into massive energy waste, with both financial and environmental costs. Virtualization enables higher server utilization, which in turn enables consolidation onto fewer machines, substantially reducing global datacenter electricity consumption. However, because many servers today still run at only 20-25% utilization, there remains significant room for improvement.
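To make the consolidation arithmetic concrete, here is a small sketch of how raising utilization shrinks the server fleet needed for the same workload. The specific numbers (100 servers, 10% and 60% utilization) are illustrative assumptions, not figures from the text:

```python
import math

def consolidated_servers(n_servers: int, current_util: float, target_util: float) -> int:
    """Servers needed to host the same total work at a higher per-server utilization."""
    total_work = n_servers * current_util  # aggregate demand in "server-equivalents"
    return math.ceil(total_work / target_util)

# Assumed example: 100 servers at 10% utilization, consolidated onto hosts run at 60%.
# Total demand is 100 * 0.10 = 10 server-equivalents; 10 / 0.60 rounds up to 17 hosts.
print(100, "->", consolidated_servers(100, 0.10, 0.60))  # prints "100 -> 17"
```

Under these assumed numbers, roughly five of every six machines (and their idle power draw) could be retired, which is the energy argument for consolidation in a nutshell.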
Key opportunities for innovation include:
Enabling “cloud-sharing” that puts spare capacity to productive use by transient and non-time-sensitive workloads.
Recouping stranded capacity from oversized virtual machines and containers, and from servers that no longer do useful work (sometimes called “zombies”).
Leveraging hybrid public-cloud bursting to provide on-demand peak and backup capacity, enabling customers to shrink on-premises infrastructure and run what remains at higher utilization.
These innovations would improve both productivity and sustainability while still meeting performance and availability requirements.