Virtualization Everywhere to Be Found

SAN FRANCISCO — Last year, grid computing basked in the buzzword glow of major technology conferences.

This year, virtualization surged ahead of “Web 2.0” and “user-generated content” as the buzziest of buzzwords at the Oracle OpenWorld conference.

The term, broad as it is, floated out of keynote presentations, into discussions about grid computing, and through hardware-utilization chats across the Moscone Center’s cavernous exhibition hall.

One of the busiest booths at the conference: VMware’s spot at the back of the exhibition hall.

Nicholas (Neela) Jacques, senior product manager for the EMC subsidiary, scribbled away at a large sketchbook as attendees shuffled by the booth, wanting to hear more about virtualization in all its forms.

In a nutshell, he explained, virtualization is a broad term that describes the separation of a resource or request for a service from the underlying physical delivery of that service.

Take virtual memory. With that approach, software can access more memory than is physically installed, thanks to the background swapping of data to disk storage, he explained to one attendee.
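To make the trick concrete, here is a rough sketch in Python (a toy with invented names and sizes; real virtual memory lives in the operating system and the processor’s memory-management hardware). A program touches more pages than fit in “RAM,” and a translation layer quietly swaps data to “disk” and back:

```python
# Toy model of virtual memory: the program addresses 10 pages
# while only 4 fit in "RAM"; the pager swaps the rest to "disk".
# All names and sizes here are invented for illustration.

RAM_PAGES = 4  # pretend only 4 physical pages exist


class ToyPager:
    def __init__(self):
        self.ram = {}   # virtual page number -> data held in "RAM"
        self.disk = {}  # virtual page number -> data swapped to "disk"

    def read(self, vpage):
        if vpage not in self.ram:                  # page fault
            if len(self.ram) >= RAM_PAGES:
                oldest = next(iter(self.ram))      # naive FIFO eviction
                self.disk[oldest] = self.ram.pop(oldest)   # swap out
            self.ram[vpage] = self.disk.pop(vpage, 0)      # swap in
        return self.ram[vpage]

    def write(self, vpage, data):
        self.read(vpage)         # fault the page in first
        self.ram[vpage] = data


pager = ToyPager()
for page in range(10):           # touch 10 virtual pages with only 4 in RAM
    pager.write(page, page * 100)
print(pager.read(2))             # 200: page 2 was swapped out, then paged back in
```

The program never sees the swapping; it simply reads and writes a flat address space that is larger than the physical one, which is the separation of resource from physical delivery that Jacques described.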

But the same techniques can be applied to other layers of the IT infrastructure: networks, storage, laptops and server hardware.

His discussions ranged from consolidating servers and cutting server management costs, to moving virtual machines across shared hardware, to boosting server utilization rates from the typical 5 percent to 15 percent range up into the 60 percent to 80 percent range.
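That utilization pitch reduces to back-of-the-envelope arithmetic. Using the ranges quoted above, and a starting server count invented purely for illustration, the same workload fits on a fraction of the machines once utilization climbs:

```python
# Rough consolidation math from the utilization ranges quoted above.
# The starting server count (100) is invented for illustration.
servers_before = 100
util_before = 0.10   # midpoint-ish of the 5-15 percent range
util_after = 0.70    # midpoint-ish of the 60-80 percent range

total_work = servers_before * util_before   # 10 "servers' worth" of actual work
servers_after = total_work / util_after     # hosts needed at the higher rate
print(round(servers_after, 1))              # ~14.3 hosts instead of 100, roughly 7:1
```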

Virtualization could also mean deploying a test and development environment that standardizes application testing; handling disaster recovery by encapsulating entire systems into single files that can be replicated on any target server; or securing unmanaged PCs and laptops with enterprise desktop virtual machines.

At times he sketched out server virtualization scenarios; at others, he explained how a virtualized storage area network might look.

Elsewhere, software vendors traded notes on whether new licensing models are gelling around a computing model that is turning traditional licensing plans, which are based on per-core pricing, on their ears. The consensus: no consensus, yet.

Hardware vendors shrugged off any suggestion that they’d be selling less hardware with each virtual machine.

Enterprise decision-makers extolled the cost savings, server-sprawl consolidation and security they could envision with virtualization technology.

Mark Hurd, CEO of HP, also chatted up the concept in his keynote address. He said HP plans to take 5,000 of its own global applications and shrink that number to fewer than 1,500.

The company has 21,700 servers churning out data at its offices and data centers around the globe. That’s shorthand for a lot of unused capacity, so look for HP to reduce that fleet to 1,400 through the use of virtualization technologies.

What that means, Hurd added, is that HP will not only reduce the number of servers it maintains, but will add 80 percent more processing power while increasing cooling and wattage by only 50 percent.
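Taken at face value, the keynote’s numbers work out as follows; this is simple arithmetic on the figures quoted above, nothing more:

```python
# Arithmetic check of the figures from Hurd's keynote.
servers_before, servers_after = 21_700, 1_400
print(f"{servers_before / servers_after:.1f}:1")  # ~15.5:1 consolidation

compute_growth = 1.80   # 80 percent more processing power
power_growth = 1.50     # cooling and wattage up 50 percent
# Power needed per unit of compute falls to about 83 percent of today's.
print(f"{power_growth / compute_growth:.2f}")     # ~0.83
```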

Charles King, industry analyst at Pund-IT Research, is among the many analysts giving VMware kudos for getting ahead of the trend back in 1998.

Eight years later, parent company EMC’s most recent earnings results tell the tale.

VMware recorded sales of $188.5 million, an 86 percent jump year-over-year.

King said VMware moved ahead of the trend years ago by taking x86-based servers, built on the widely used Intel microprocessor architecture for PCs, and virtualizing them.

Go back a few years, King continued, even to 1998 and 1999, before Linux really started to take off in the enterprise, and plenty of people, Sun’s former CEO Scott McNealy for example, poked fun at the workhorse x86 server market.

They sniffed that it wasn’t up to snuff compared with big-iron Unix machines or Sun’s own Solaris operating system.

One big factor in the change is that the tools for virtualizing commodity hardware systems have become easier to use.

“We’ve gotten to the point where we’re creating a virtual partition on a processor,” King said. “Years ago, you needed a degree in computer science to make these work. It was fairly inaccessible for many people.”

Even Sun’s recent Project Blackbox announcement, a datacenter-to-go service, is expected to support multiple desktops when it ships next year.

King is in the camp that thinks hardware vendors will see other doors open even as the spread of virtualized servers translates into fewer server sales.

For one, he added, many server consolidation efforts appear to be aimed at older equipment.

“But I do think at a certain point, [virtualizing] hardware features and hardware capabilities are not just about performance. You have to be able to define that performance in terms of a business value.”

As for licensing, software makers that historically relied on per-core pricing for servers face new challenges in figuring out how to charge when many instances of the same software run virtualized.

Many attendees and experts say approaches to the problem are all over the map. VMware, for one, is taking a core-neutral approach, pricing its licenses per virtual machine instead.
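The tension is easy to see with toy numbers. In the sketch below, all prices and counts are invented for illustration and are not VMware’s actual rates; on a heavily consolidated host, the two models produce very different bills:

```python
# Hypothetical license-cost comparison; all prices are invented.
cores_per_host = 8
vms_per_host = 20        # many virtualized instances of the same software

price_per_core = 1_000   # traditional per-core model
price_per_vm = 500       # core-neutral, per-virtual-machine model

print(cores_per_host * price_per_core)  # 8000: flat, no matter how many VMs run
print(vms_per_host * price_per_vm)      # 10000: scales with instances, not cores
```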

“Everybody seems to be approaching that subject pretty liberally,” King added. “I think the recognition is that virtualization is here, and that in order to preserve the value, the vendors are going to have to be flexible on how they approach licensing.

“From what I’ve seen, companies seem to be approaching it with options that will help preserve their profits and continue to make it attractive for their customers.”
