Category Archives: Gartner Group

OVA Members Hope to Close Ground

I discussed the fast-growing Open Virtualization Alliance (OVA) in a recent post about its primary objective, which is to erode VMware’s daunting market advantage by commoditizing the hypervisor. In catching up on my reading, I came across an excellent piece by InformationWeek’s Charles Babcock that puts the emergence of OVA into historical perspective.

As Babcock writes, the KVM-centric OVA might not have come into existence at all if an earlier alliance supporting another open-source hypervisor hadn’t foundered first. Quoting Babcock regarding OVA’s vanguard members:

Hewlett-Packard, IBM, Intel, AMD, Red Hat, SUSE, BMC, and CA Technologies are examples of the muscle supporting the alliance. As a matter of fact, the first five used to be big backers of the open source Xen hypervisor and Xen development project. Throw in the fact Novell was an early backer of Xen as the owner of SUSE, and you have six of the same suspects. What happened to support for Xen? For one, the company behind the project, XenSource, got acquired by Citrix. That took Xen out of the strictly open source camp and moved it several steps closer to the Microsoft camp, since Citrix and Microsoft have been close partners for over 20 years.

Xen is still open source code, but its backers found reasons (faster than you can say vMotion) to move on. The Open Virtualization Alliance still shares one thing in common with the Xen open source project. Both groups wish to slow VMware’s rapid advance.

Wary Eyes

Indeed, that is the goal. Most of the industry, with the notable exception of VMware’s parent EMC, is casting a wary eye at the virtualization juggernaut, wondering how far and wide its ambitions will extend and how they will impact the market.

As Babcock points out, however, by switching horses in mid-race from one hypervisor (Xen) to another (KVM), the big backers of open-source virtualization might have surrendered ground to VMware — and perhaps even to Microsoft — that could prove insurmountable. Much will depend on whether VMware abuses its market dominance, and whether Microsoft succeeds with its mid-market virtualization push into its still-considerable Windows installed base.

Long Way to Go

Last but perhaps not least, KVM and the Open Virtualization Alliance (OVA) will have a say in the outcome. If OVA members wish to succeed, they’ll not only have to work exceptionally hard, but they’ll also have to work closely together.

Coming from behind is never easy, and, as Babcock contends, just trying to ride Linux’s coattails will not be enough. KVM will have to continue to define its own value proposition, and it will need all the marketing and technological support its marquee backers can deliver. One area of particular importance is operations management in the data center.

KVM’s share of the server-virtualization market, as reported by Gartner earlier this year, was less than one percent. It has a long way to go before it causes VMware’s executives any sleepless nights. That it wasn’t the first choice of its proponents, and that it has lost so much time and ground, doesn’t help the cause.

What Philosophers Counsel on the Private Cloud

“The beginning of wisdom is the definition of terms.” — Socrates

“If you wish to converse with me, define your terms.” — Voltaire

As Socrates and Voltaire knew, meaningful discourse depends on a shared understanding of the topic under discussion. Shared understanding, in turn, requires clearly defined and mutually agreed terms.

In the sphere of cloud computing, defined terms and shared understanding have been at a premium. Protracted debates have ensued regarding the definitions of various permutations of cloud computing. Debate and discourse should eventually resolve into enlightenment, but discussion of cloud computing often produces more heat than light. It’s been that way for a long time.

Consensus Hard to Find

This year, the latest cloud-computing flashpoint has been a battle over the definition of the private cloud. As far as I can tell, that battle still rages with no end in sight. Notable individuals and groups have advanced many definitions — sometimes to heated responses — but the disputation has yet to produce consensus or shared understanding, though some organizations, such as the National Institute of Standards and Technology (NIST), have made valiant efforts to deliver much-needed clarity.

Technical criteria aside — such characteristics and qualifiers have been discussed at length by accomplished software architects, illustrious infrastructure personages, and all manner of engineers and market analysts — what seems to be occurring is a power struggle.

Our philosopher friends quoted above observed that a solid definition of terms was the starting point for meaningful conversation and wisdom. That’s true, of course. But it’s also true, as political philosophers will tell us, that knowledge — which is based on the understanding that accrues from agreed terminology — is power.

Battle for Control

Accordingly, we can understand the battleground of cloud computing — some have called it a “circus,” but a circus aims at entertainment and amusement — if we consider its political aspects. It’s here that we can see the central conflict.

On one side, we have an established order or status quo — represented by the way IT has been supplied, delivered, and managed until now — arrayed against the forces of change, represented by cloud computing’s purest adherents and most passionate proponents. Each side wants to control the discussion, which is why the battle over terminology is so intense and why definitions are constantly revised and challenged.

For those waging the battle, the fight itself makes sense. Each side, as noted, wants to control terminology so that it can condition and facilitate a desired result. Unfortunately, disinterested observers, including enterprises and organizations trying to set their IT-related roadmaps and strategies, are confused by the tumult. They’re not getting what they need from the principals in the debate.

Cutting Through the Noise

Fortunately, there is a way for these organizations to cut through the noise, to gain the insight and understanding they require to set their course and ascertain whether, how, and where new methodologies, service models, and technologies are applicable.

How they do it, of course, is by being as self-interested as the disputants on either side of the cloud-computing debate. What they must do is demand clear answers as to how what’s being pushed at them will add value to their organizations.

When a vendor or service provider makes a pitch, prospective customers must step back and consider the implications. Who benefits? Is it just the vendor or service provider making the pitch, either by retarding change or hastening it, or is it the customer, which must support established applications and processes while charting an assured course toward lower costs, higher productivity, and greater overall efficiency? The answer to that question is the real litmus test, for the solicitous vendor as well as the prospective customer.

It Is What It Does

According to mathematician-cum-philosopher Alfred North Whitehead, a thing is what it does. If a vendor is pushing a hard sell on the public or private cloud, customers should challenge the vendor to state clearly the customer-centric benefits, costs, and implications of the proposed offering. Then, after the vendor has made its case, the customer can evaluate it. In the end, each customer should and must make its own decision, based on its objectives, its needs, its requirements, its risk tolerance, and its culture.

If there is no common industry-wide definition — and the vendor community has been responsible for the cloud-computing muddle — then each prospective customer will have to reach its own conclusions about what’s really being discussed. That’s how it needs to be, anyway.

Lurid Bribery Case Doesn’t Reflect Poorly on Gartner

It’s a headline nobody at Gartner wants to see. Thankfully, the underlying story isn’t nearly as distressing.

The well-known consultancy and IT market-research concern probably is mortified that a news story posted on the Network World website — and covered elsewhere, too — offers the lurid headline, “Former Gartner manager gets jail for accepting bribes.” If you’re a company in Gartner’s business, you don’t ever want your name to appear in the same sentence as the words “jail” and “bribes.”

What’s important to note, however, is that the manager in question wasn’t involved in consulting or research activities. Instead, he was responsible for the purchase of multimedia services and equipment.

Moreover, an earlier press release from the United States Attorney for the Southern District of New York makes clear that Gartner was a victim of the bribery scheme. It’s an unseemly and unfortunate tale, replete with attention-grabbing headlines, but it doesn’t and shouldn’t reflect negatively on Gartner’s credibility and repute as a purveyor of advisory services and market research.

Gartner Sees Better Times Ahead for Smartphones in 2010

Gartner says smartphone sales didn’t quite meet its expectations in 2009, even though the high-end handsets continue to account for a growing percentage of overall mobile-phone sales.

In 2013, according to Gartner, more than one in every three phones sold will be a smartphone. The market-research firm says the biggest threat to that forecast comes from wireless operators, which could inhibit sales if they persist in packaging smartphones with flat-rate data plans that put the handsets beyond the financial means of many prospective buyers.

For 2009 — which might go down as an annus horribilis in so many respects — sales of mobile phones to consumers are expected to drop less than one percent, to a total of about 1.2 billion units. Gartner predicts that mobile-phone sales will resume growing, at about 9 percent, in 2010, with average selling prices dropping approximately $2 per unit from 2009 levels. Average selling prices of mobile phones dropped about $10 per unit in 2009.
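To make the forecast concrete, here is a quick back-of-the-envelope projection from the figures above (a sketch only: the 1.2-billion-unit base is approximate, and the variable names are mine, not Gartner’s):

```python
# Rough projection of 2010 mobile-phone sales from the Gartner figures
# cited above. The 1.2-billion base is approximate; names are my own.
units_2009 = 1.2e9      # estimated 2009 consumer sales (units)
growth_2010 = 0.09      # Gartner's forecast growth rate for 2010

units_2010 = units_2009 * (1 + growth_2010)
print(f"Projected 2010 sales: {units_2010 / 1e9:.2f} billion units")
# -> Projected 2010 sales: 1.31 billion units
```

That works out to roughly 1.31 billion units in 2010 — the rebound about which, as I note below, I remain skeptical.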

I haven’t seen Gartner’s base assumptions for growth across global markets, so I’ll refrain from commenting in detail. That said, I’m skeptical of a sharp rebound in market growth, especially one that won’t be goosed by lower prices.

What’s Gartner Saying?

As I perused Gartner’s press release announcing its “top 10 technologies and trends that will be strategic for most organizations in 2010,” two of the listed items annoyed me, though for slightly different reasons.

At the top of Gartner’s list of top 10 strategic technologies is cloud computing, that much-discussed but nebulous technological phenomenon that is reputedly taking hold in the minds and planning processes of enterprises worldwide.

I am not going to take the position that cloud computing isn’t important, or that it doesn’t have a potentially lucrative future, but I am going to take the position, alongside Oracle CEO Larry Ellison, that it is ambiguously and poorly defined by most of those who like to talk about it.

Alas, Gartner is no exception to that rule. Gartner, coming down the mountain with its tablet of 10 strategic technologies, says the following on the subject:

Cloud computing is a style of computing that characterizes a model in which providers deliver a variety of IT-enabled capabilities to consumers. Cloud-based services can be exploited in a variety of ways to develop an application or a solution. Using cloud resources does not eliminate the costs of IT solutions, but does re-arrange some and reduce others. In addition, consuming cloud services enterprises (sic) will increasingly act as cloud providers and deliver application, information or business process services to customers and business partners.

Could that have been more muddled? Does anybody understand what Gartner is on about? Shouldn’t we expect a modicum of clarity and cogency from a research firm that is paid so richly to tell enterprises and IT vendors what to think?

Yes, my apoplexy is in full-tilt boogie. But I feel my cause is righteous. So-called thought leaders should express their thoughts articulately and clearly. Coherence and intelligibility should not be negotiable.

Further down the list, Gartner says the following about another allegedly strategic technology, social computing:

Workers do not want two distinct environments to support their work – one for their own work products (whether personal or group) and another for accessing “external” information. Enterprises must focus both on use of social software and social media in the enterprise and participation and integration with externally facing enterprise-sponsored and public communities. Do not ignore the role of the social profile to bring communities together.

Again, the sentence structure and wording leave something to be desired, but I’ll put that objection aside. What I will not put aside, however, is my complaint that Gartner has not put forward a compelling reason for enterprises to countenance their employees spending time on social-networking sites while at the office, presumably during business hours.

Really, what’s the business case for untrammeled Facebook access at work? Shouldn’t employees who report to the office, you know, actually work there? Does Gartner realize that Facebook owns the content posted to it? How does that comport with corporate or government policies relating to information confidentiality?

What’s the ROI-related business case for allowing employees to spend time on Facebook or MySpace? It’s impossible to know, because Gartner has stated no clear business argument for opening the social-networking floodgates.

I’m taken aback that Gartner has issued this press release. Not enough thought has gone into the substance and presentation of its content. That should be a worrying sign for the clientele that pay the company for its research and opinions.

ConSentry Latest NAC Vendor to Close Shop

ConSentry Networks, a pioneer in network-access control (NAC) switching, is closing its doors.

Founded in 2003, ConSentry raised lifetime funding of approximately $80.4 million across five rounds, roughly one per year. Its final funding round, announced in January of this year, was for approximately $9.4 million and came from existing investors.

During the course of its existence, ConSentry was financially backed by some prominent Silicon Valley VCs, including Accel Partners (one of Accel’s partners became ConSentry’s CEO for a brief period) and Sequoia Capital.

Network Access Control, sometimes referred to as Network Admission Control — which fortunately resolves into the same acronym — is a market that has failed to deliver on its hype and promise. For a long time, it was billed as the next major wave in enterprise security.

To date, though, it has fallen well short of becoming a $1-billion market. Gartner reported that the NAC market grew 51 percent in 2008, but was worth just $221 million. That’s not enough to support dozens of players, including many established, brand-name vendors who tend to capture their fair share of NAC from within their broader installed bases of customers.
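For context, a 51-percent growth rate on a $221-million market implies a modest 2007 base. A minimal sketch of the arithmetic (the derived figure is my own inference, not a Gartner-published number):

```python
# Implied 2007 NAC market size, inferred from the Gartner figures above.
# The derived number is my own arithmetic, not a Gartner statistic.
market_2008 = 221e6   # reported 2008 NAC market size (USD)
growth_2008 = 0.51    # reported year-over-year growth for 2008

market_2007 = market_2008 / (1 + growth_2008)
print(f"Implied 2007 NAC market: ~${market_2007 / 1e6:.0f} million")
# -> Implied 2007 NAC market: ~$146 million
```

Even after a year of 51-percent growth, in other words, the segment remained tiny by enterprise-security standards.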

In that respect, ConSentry is not alone in failing to survive the economic downturn and the natural consolidation of the marketplace. All technology markets consolidate as they mature — with the marketplace eventually selecting winners and losers — and they consolidate faster during economic downturns. Consolidation is heightened further if it occurs in a market segment that isn’t living up to commercial expectations.

A few of ConSentry’s former rivals had closed their doors previously, and others have been acquired by bigger players.

In general, whatever spoils the NAC market has yielded have gone to the leading network-infrastructure players (Cisco, Juniper, HP ProCurve, etc.), who’ve built or bought their own NAC capabilities, and to the vendors who own security real estate on computing end points. Those vendors include Symantec, McAfee, Sophos, and, increasingly, Microsoft.

Each time the market’s music stops, signaling that another vendor must leave the party, fewer chairs are available at the table. Some of those chairs are permanently reserved for the major players. Less and less room is available for standalone NAC switch or appliance vendors, even if they’ve attempted to tailor their value proposition toward providing security intelligence to the network infrastructure or endpoints belonging to the big guys.

Some observers still claim the “Year of NAC” will come, but it won’t arrive soon enough for ConSentry and others.

Should Market Researchers Declare Potential Conflicts of Interest?

I have been wondering lately whether market researchers at firms such as Gartner, IDC, and Forrester should have to declare conflicts of interest when speaking for attribution with the trade press and business journalists about vendors with whom they have business relationships.

Market analysts at investment banks often are asked by journalists to disclose whether their firms have current or prospective business ties to technology vendors on which they pass comment. By revealing whether a relationship exists, the analyst provides the journalist and his or her readers with additional context in which to judge the objectivity of the commentary. It is a form of disclosure that adds to the credibility of both the article and the sources quoted in it.

I think the same practice should be applied to technology’s leading research houses. Journalists and their readers have a legitimate right to consider, for example, whether a business relationship Gartner or IDC has with Cisco, Oracle, or Microsoft might inhibit or otherwise color commentary and opinions that might be offered regarding a vendor’s product strategy, launch campaign, or prospects for continued success.

At the very least, such a practice would provide full disclosure to readers, so that they could more completely evaluate the pronouncements of the technology industry’s pundits. What do you think?