Virtualization's Dirty Little Secret: The Other 30 Percent

By TechZone360 Special Guest
Jim Thompson, Chief Technology Officer, Unisys Technology Products
June 23, 2014

Few technologies have garnered more attention over the past decade than server virtualization, and for good reason.  Flash back 10 years: the average organization was suffering from serious server sprawl.  “One app, one server” was the battle cry in IT shops back then, as security and predictable performance took precedence over efficiency and server utilization.

Hypervisors and virtual machines have helped tame that mess.  By enabling multiple applications to share memory, processing power and other resources on the same physical host server, virtualization has reduced costs and made IT shops more agile.  Need computing power for a new business application? Just provision a virtual machine on your host server and voila!  You’re in business.

However, there’s a dark side to virtualization.  It’s a secret that large IT shops know well but one that hardly ever gets mentioned in the trade press – and certainly not by the vendors that peddle virtualization wares.

The dirty little secret is this: traditional virtualization technologies are great for handling commodity work, but they are fundamentally unsuited for mission-critical applications – the ones organizations depend on to run their most important, sensitive business processes, such as those that drive customer relationships, financial transactions and supply chains.

The reasons largely come down to performance and security.  Virtualization works on the principle of resource sharing: the computing and memory resources of physical servers are pooled in a virtual environment, and applications compete for resources from that pool as needed.

That principle works fine for lighter applications that don’t require many resources.  Larger, mission-critical applications don’t play so nicely.  Enterprise resource planning applications, for example, are notoriously resource-intensive.  Put them into a shared environment, and suddenly you have a Darwinian scenario in which the biggest applications grab the resources and create performance bottlenecks for everyone else.
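The noisy-neighbor effect described above can be sketched in a few lines.  This is an illustrative toy model, not any hypervisor's actual scheduler: it assumes a pool of CPUs handed out strictly in arrival order, which is the worst case the article warns about.

```python
# Toy model of a shared resource pool with first-come, first-served
# allocation. A resource-hungry app can drain the pool and starve
# lighter workloads that arrive after it.

def allocate(pool_cpus, requests):
    """Grant each (app, wanted_cpus) request in arrival order
    until the pool runs dry."""
    granted = {}
    for app, want in requests:
        got = min(want, pool_cpus)   # take what's left, if anything
        granted[app] = got
        pool_cpus -= got
    return granted

# A 16-CPU host shared by one hungry ERP system and two smaller apps.
shared = allocate(16, [("erp", 14), ("web", 4), ("reporting", 4)])
# erp grabs 14 of 16 CPUs; web limps along on the last 2;
# reporting gets nothing at all.
```

Real hypervisors use far smarter schedulers (shares, reservations, limits), but under sustained pressure from a big workload the outcome trends toward this picture: the smallest tenants feel the squeeze first.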

Security and compliance form the second big area of concern when it comes to putting mission-critical applications into virtualized environments.  Because virtual servers share resources, it’s difficult, if not impossible, to isolate or harden specific applications and workloads for security.  Auditing such applications for compliance therefore becomes an issue as well.

For all these reasons, virtualization has hit a ceiling when it comes to the world of mission-critical applications.  Gartner estimates that about 70 percent of server environments have been virtualized as of 2013.  The remaining 30 percent, which largely represents complex, transaction-intensive, mission-critical workloads, remains untouched by virtualization.

Containerization or secure partitioning: a third choice

Until recently, CIOs looking to reduce the costs and increase the flexibility of their mission-critical applications have faced an unenviable choice: either take the risk of virtualizing these workloads and hope they don’t run into resource contention, or keep their mission-critical apps locked away on dedicated, expensive, underutilized proprietary servers.  That isn’t much of a choice at all.

But with the advent of fabric computing, which uses high-speed interconnects to link loosely coupled computing resources into elastic IT environments, organizations are seeing a new alternative to virtualization, one that delivers the cost savings and flexibility of virtualized servers without sacrificing performance and security. 

That option, colloquially called “containerization” or “secure partitioning,” involves creating secure, dedicated containers within a fabric environment based on standard Intel x86 processors.  Each container is dedicated to serving a specific mission-critical application, providing all the memory, computing power, storage and other resources that workload needs to operate securely and at mission-critical levels of performance and reliability.  This approach eliminates competition for resources.

In essence, these containers act like hard-wired partitions on a physical server, but because they are software-based and reside in a high-speed fabric environment, they gain the advantage of extreme flexibility.  Containers of resources can be put together and provisioned within minutes, and then quickly taken down when no longer needed.  A secure container can also serve geographically dispersed end users and locations. The results are fewer physical servers and significant cost savings.
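The provision-and-teardown lifecycle described above can be sketched as follows.  This is a hypothetical model assuming nothing about any vendor's actual fabric API; the `Fabric` class and its methods are invented for illustration.  The key property it shows is exclusivity: once a partition is carved out, its resources are invisible to every other workload until it is torn down.

```python
# Hypothetical sketch of fabric-style secure partitioning: a pool of
# host resources from which dedicated partitions are carved out and
# later returned, so one workload never competes with another.

class Fabric:
    def __init__(self, cpus, memory_gb):
        self.free_cpus = cpus
        self.free_memory_gb = memory_gb

    def provision(self, name, cpus, memory_gb):
        """Reserve resources exclusively for one partition."""
        if cpus > self.free_cpus or memory_gb > self.free_memory_gb:
            raise RuntimeError("insufficient free resources in the fabric")
        self.free_cpus -= cpus
        self.free_memory_gb -= memory_gb
        return {"name": name, "cpus": cpus, "memory_gb": memory_gb}

    def teardown(self, partition):
        """Release a partition's resources back to the pool."""
        self.free_cpus += partition["cpus"]
        self.free_memory_gb += partition["memory_gb"]

fabric = Fabric(cpus=64, memory_gb=512)
erp = fabric.provision("erp", cpus=32, memory_gb=256)
# The ERP partition's 32 CPUs are now reserved: no other workload
# can touch them, so its performance stays predictable.
fabric.teardown(erp)  # resources return to the pool when no longer needed
```

Contrast this with the shared-pool model: here the ERP workload's footprint is fixed at provisioning time rather than contested at runtime, which is what makes performance predictable and the partition straightforward to harden and audit.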

All this comes without sacrificing security.  Containers can be isolated and hardened for highly sensitive workloads.  And again, all of this is done via software, not hardware.

For the other 30 percent of IT environments that haven’t been virtualized, this is good news.  CIOs no longer need to compromise when it comes to their mission-critical applications.  They can get the cost savings and flexibility of virtualization without sacrificing security or performance.

Compare three platforms for mission-critical applications

Dedicated Host: one application per host.  Virtualized Host: multiple applications per host as virtual servers.  Secure-Partitioned Host: multiple applications per host in secure partitions.

Operational requirement               | Dedicated Host | Virtualized Host | Secure-Partitioned Host
--------------------------------------|----------------|------------------|------------------------
Deploy in minutes                     | No             | Yes              | Yes
Move workloads in minutes             | No             | Yes              | Yes
Consolidate servers, reducing sprawl  | No             | Yes              | Yes
Save on power, cooling, licensing     | No             | Yes              | Yes
Performance: predictable              | Yes            | No               | Yes
Security: easy to harden/isolate      | Yes            | No               | Yes
Compliance: easily auditable          | Yes            | No               | Yes




Edited by Maurice Nagle

