Category Archives: Business models

PC Market: Tired, Commoditized — But Not Dead

As Hewlett-Packard prepares to spin off or sell its PC business within the next 12 to 18 months, many have spoken about the “death of the PC.”

Talk of “Death” and “Killing”

Talk of metaphorical “death” and “killing” has been rampant in technology’s new media for the past couple of years. When observers aren’t noting that a product or technology is “dead,” they’re saying that an emergent product of one sort or another will “kill” a current market leader. It’s all exaggeration and melodrama, of course, and it’s not helpful. It lowers the discourse, and it makes the technology industry seem like professional wrestling, but with nerds. Nobody wants to see that.

Truth be told, the PC is not dead. It’s enervated and its best days are behind it, but it’s still here. It has, however, become a commodity with paper-thin margins, and that’s why HP — more than six years after IBM set the precedent — is bailing on the PC market.

Commoditized markets are no place for thrill seekers or for CEOs of companies that desperately seek bigger profit margins. HP CEO Leo Apotheker, as a longtime software executive, must have viewed HP’s PC business, which still accounts for about 30 percent of the company’s revenues, with utter disdain when he first joined the company.

No Room for Margin

As I wrote in this forum a while back, PC vendors these days have little room to add value (and hence margin) to the boxes they sell. It was bad enough when they were trying to make a living atop the microprocessors and operating systems of Intel and Microsoft, respectively. Now they also have to factor original design manufacturers (ODMs) into the shrinking-margin equation.

It’s almost a dirty little secret, but the ODMs do a lot more than just manufacture PCs for the big brands, including HP and Dell. Many ODMs effectively have taken over hardware design and R&D from cost-cutting PC brands. Beyond a name on a bezel, and whatever brand equity that name carries, PC vendors aren’t adding much value to the box that ships.

For further background on how it came to this — and why HP’s exit from the PC market was inevitable — I direct you to my previous post on the subject, written more than a year ago. In that post, I quoted and referenced Stan Shih, Acer’s founder, who said that “U.S. computer brands may disappear over the next 20 years, just like what happened to U.S. television brands.”

Given the news this week, and mounting questions about Dell’s commitment to the low-margin PC business, Shih might want to give that forecast a sharp forward revision.

Divining Google’s Intentions for Motorola Mobility

In commenting now on Google’s announcement that it will acquire Motorola Mobility Holdings for $12.5 billion, I feel like the guest who arrives at a party the morning after festivities have ended: There’s not much for me to add, there’s a mess everywhere, more than a few participants have hangovers, and some have gone well past their party-tolerance level.

Still, in the spirit of sober second thought, I will attempt to provide Yet Another Perspective (YAP).

Misdirection and Tumult

It was easy to get lost in all the misdirection and tumult that followed the Google-Motorola Mobility announcement. Questions abounded, Google’s intentions weren’t yet clear, its competitors were more than willing to add turbidity to already muddy waters, and opinions on what it all meant exploded like scattershot in all directions.

In such situations, I like to go back to fundamental facts and work outward from there. What is it we know for sure? Once we’re on a firm foundation, we can attempt to make relatively educated suppositions about why Google made this acquisition, where it will take it, and how the plot is likely to unspool.

Okay, the first thing we know is that Google makes the overwhelming majority (97%) of its revenue from advertising. That is unlikely to change. I don’t think Google is buying Motorola Mobility because it sees its future as a hardware manufacturer of smartphones and tablets. It wants to get its software platform on mobile devices, yes, because that’s the only way it can ensure that consumers will use its search and location services ubiquitously; but don’t confuse that strategic objective with Google wanting to be a hardware purveyor.

Patent Considerations 

So, working back from what we know about Google, we now can discount the theory that Google will use Motorola Mobility as a means of competing aggressively against its other Android licensees, including Samsung, HTC, LG, and scores of others. There has been some fragmentation of the Android platform, and it could be that Google intends to use Motorola Mobility’s hardware as a means of enforcing platform discipline and rigor on its Android licensees, but I don’t envision Google trying to put them out of business with Motorola. That would be an unwise move and a Sisyphean task.

Perhaps, then, it was all about the patents? Yes, I think patents and intellectual-property rights figured prominently in Google’s calculations. Google made no secret that it felt itself at a patent deficit in relation to its major technology rivals and primary intellectual-property litigants. For a variety of reasons — the morass that is patent law, the growing complexity of mobile devices such as smartphones, the burgeoning size and strategic importance of mobility as a market — all the big vendors are playing for keeps in mobile. Big money is on the table, and no holds are barred.

Patents are a means of constraining competition, conditioning and controlling market outcomes, and — it must be said — inhibiting innovation. But this situation wasn’t created by one vendor. It has been evolving (or devolving) for a great many years, and the vendors are only playing the cards they’ve been dealt by a patent system that is in need of serious reform. The only real winners in this ongoing mess are the lawyers . . . but I digress.

Defensive Move

Getting back on track, we can conclude that, considering its business orientation, Google doesn’t really want to compete with its Android licensees and that patent considerations figured highly in its motivation for acquiring Motorola Mobility.

Suggestions also surfaced that the deal was, at least in part, a defensive move. Apparently Microsoft had been kicking Motorola Mobility’s tires and wanted to buy it strictly for its patent portfolio. Motorola wanted to find a buyer willing to take, and pay for, the entire company. That apparently was Google’s opening to snatch the Motorola patents away from Microsoft’s outstretched hands — at a cost of $12.5 billion, of course. This has the ring of truth to it. I can imagine Microsoft wanting to administer something approaching a litigious coup de grace on Google, and I can just as easily imagine Google trying to preclude that from happening.

What about the theory that Google believes that it must have an “integrated stack” — that it must control, design, and deliver all the hardware and software that constitutes the mobile experience embodied in a smartphone or a tablet — to succeed against Apple?

No Need for a Bazooka

Here, I would use the market as a point of refutation. Until the patent imbroglio reared its ugly head, Google’s Android was ascendant in the mobile space. It had gone from nowhere to become the leading mobile operating system worldwide, represented by a growing army of diverse device licensees targeting nearly every nook and cranny of the mobile market. There was some platform fragmentation, which introduced application-interoperability issues, but those problems were and are correctable without Google having recourse to direct competition with its partners. That would be an extreme measure, akin to using a bazooka to herd sheep.

Google Android licensees were struggling in the court of law, but not so much in the court of public opinion as represented by the market. Why do you think Google’s competitors resorted to litigious measures in the first place?

So, no — at least based on the available evidence — I don’t think Google has concluded that it must try to remake itself into a mirror image of Apple for Android to have a fighting chance in the mobile marketplace. The data suggests otherwise. And let’s remember that Android, smartphones, and tablets are not ends in themselves but means to an end for Google.

Chinese Connection?

What’s next, then? Google can begin to wield the Motorola Mobility patent portfolio to defend and protect its Android licensees. It also will keep Motorola Mobility’s hardware unit as a standalone entity for now. In time, though, I would be surprised if Google didn’t sell that business.

Interestingly, the Motorola hardware group could become a bargaining chip of sorts for Google. I’ve seen the names Huawei and ZTE mentioned as possible buyers of the hardware business. While Google’s travails in China are well known, I don’t think it’s given up entirely on its Chinese aspirations. A deal involving the sale of the Motorola hardware business to Huawei or ZTE that included the buyer’s long-term support for Android — with the Chinese government’s blessing, of course — could offer compelling value to both sides.

Bit-Business Crackup

I have been getting broadband Internet access from the same service provider for a long time. Earlier this year, my particular cable MSO got increasingly aggressive about a “usage-based billing” model, which caps bandwidth use and adds charges for “overage,” otherwise known as exceeding one’s bandwidth cap. Exceed the cap, and you’re charged extra — potentially a lot extra.
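To make the mechanics concrete, here is a minimal sketch of how such a plan computes a monthly bill. The tier names, fees, caps, and overage rates below are hypothetical placeholders for illustration, not my provider’s actual price list.

```python
# Minimal sketch of usage-based billing with tiers, caps, and overage charges.
# All tier names, fees, caps, and overage rates here are hypothetical.

TIERS = {
    # tier name: (monthly fee $, cap in GB, overage $ per GB)
    "Lite":     (35.00,  25, 2.00),
    "Standard": (50.00,  75, 1.50),
    "Extreme":  (65.00, 125, 1.00),
}

def monthly_bill(tier: str, usage_gb: float) -> float:
    """Base fee plus overage charges for whatever usage exceeds the cap."""
    fee, cap_gb, overage_rate = TIERS[tier]
    overage_gb = max(0.0, usage_gb - cap_gb)
    return fee + overage_gb * overage_rate

# A household that streams a lot of video can blow past the top cap easily:
print(monthly_bill("Extreme", 200))  # 65 + 75 * 1.00 = 140.0
```

Whatever the exact numbers, the structure does the work: the cap converts heavy use into either a surcharge or a reason to ration.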

On the surface, one might suppose the service provider’s intention is to bump subscribers up to the highest bandwidth tiers. That’s definitely part of the intent, but there’s something else afoot, too.

Changed Picture

I believe my experience illustrates a broader trend, so allow me to elaborate. My family and I reached the highest tier under the service provider’s usage-based-billing model. Even at the highest tier, though, we found the bandwidth cap abstemious and restrictive. Consequently, rather than pay exorbitant overages or ration bandwidth as if it were water during a drought, we decided to look for another service provider.

Having made our decision, I expected my current service provider to attempt to keep our business. That didn’t happen. We told the service provider why we were leaving — the caps and surcharges were functioning as inhibitors to Internet use — and then set a date when service would be officially discontinued. That was it.  There was no resistance, no counteroffers or proposed discounts, no meaningful attempt to keep us as subscribers.

That sequence of events, and particularly that final uneventful interaction with the service provider, made me think about the bigger picture in the service-provider world. For years, the assumption of telecommunications-equipment vendors has been that rising bandwidth tides would lift all boats.  According to this line of reasoning, as long as consumers and businesses devoured more Internet bandwidth, network-equipment vendors would benefit from steadily increasing service-provider demand. That was true in the past, but the picture has changed.

Paradoxical Service

It’s easy to understand why the shift has occurred. Tom Nolle, president of CIMI Corp., has explained the phenomenon cogently and repeatedly over at his blog. Basically, it all comes down to service-provider monetization: where the revenue comes from and where the margin lies.

Service providers can boost revenue in two basic ways: They can charge more for existing services, or they can develop and introduce new services. In most of the developed world, broadband Internet access is a saturated market. There’s negligible growth to be had. To make matters worse, at least from the service-provider perspective, broadband subscribers are resistant to paying higher prices, especially as punishing macroeconomic conditions put the squeeze on budgets.

Service providers have resorted to usage-based billing, with its associated tiers and caps, but there’s a limit to how much additional revenue they can squeeze from hard-pressed subscribers, many of whom will leave (as I did) when they get fed up with metering, overage charges, and the paradox of a service provider that discourages its subscribers from actually using the Internet as a service.

The Problem with Bandwidth

The twist to this story — and one that tells you quite a bit about the state of the industry — is that service providers are content to let disaffected subscribers take their business elsewhere. For service providers, the narrowing profit margins related to providing increasing amounts of Internet bandwidth are not worth the increasing capital expenditures and, to a lesser extent, growing operating costs associated with scaling network infrastructure to meet demand.

So, as Nolle points out, the assumption that increasing bandwidth consumption will necessarily drive network-infrastructure spending at service providers is no longer tenable. Quoting Nolle:

 “We’re seeing a fundamental problem with bandwidth economics.  Bits are less profitable every year, and people want more of them.  There’s no way that’s a temporary problem; something has to give, and it’s capex.  In wireline, where margins have been thinning for a longer period and where pricing issues are most profound, operators have already lowered capex year over year.  In mobile, where profits can still be had, they’re investing.  But smartphones and tablets are converting mobile services into wireline, from a bandwidth-economics perspective.  There is no question that over time mobile will go the same way.  In fact, it’s already doing that.

To halt the slide in revenue per bit, operators would have to impose usage pricing tiers that would radically reduce incentive to consume content.  If push comes to shove, that’s what they’ll do.  To compensate for the slide, they can take steps to manage costs but most of all they can create new sources of revenue.  That’s what all this service-layer stuff is about, of course.”
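Nolle’s “revenue per bit” point reduces to simple arithmetic: if traffic grows much faster than the revenue it generates, the profitability of each bit erodes quickly. Here is a rough sketch, using assumed growth rates rather than any operator’s actual figures.

```python
# Illustrative only: assumed growth rates, not any operator's reported numbers.
revenue = 1.0          # broadband revenue index, year 0
traffic = 1.0          # traffic index, year 0
REVENUE_GROWTH = 0.03  # assume revenue creeps up 3% a year
TRAFFIC_GROWTH = 0.40  # assume traffic grows 40% a year

for year in range(1, 6):
    revenue *= 1 + REVENUE_GROWTH
    traffic *= 1 + TRAFFIC_GROWTH
    print(f"year {year}: revenue per unit of traffic = {revenue / traffic:.2f}")

# Under these assumptions, revenue per unit of traffic falls from 1.00 to
# roughly 0.22 in five years: each increment of capacity earns far less
# than the capacity it replaces.
```

Under assumptions like these, it is easy to see why capex is the variable that gives.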

Significant Implications

We’re already seeing usage-pricing tiers here in Canada, and I have a feeling they’ll be coming to a service provider near you.

Yes, alternative service providers will take up (and are taking up) the slack. They’ll be content, for now, with bandwidth-related profit margins less than those the big players would find attractive. But they’ll also be looking to buy and run infrastructure at lower prices and costs than did incumbent service providers, who, as Nolle says, are increasingly turning their attention to new revenue-generating services and away from “less profitable bits.”

This phenomenon has significant implications for consumers of bandwidth, for service providers who purvey that bandwidth, for network-equipment vendors that provide gear to help service providers deliver bandwidth, and for market analysts and investors trying to understand a world they thought they knew.

Intel’s Fulcrum Buy Validates Merchant Silicon, Rise of Cloud

When I wrote my post earlier today on Cisco’s merchant-silicon dilemma, I had yet to read about Intel’s acquisition of Fulcrum Microsystems, purveyor of silicon for 10GbE and 40GbE switches.

While the timing of my post was fortuitous, today’s news suggests that Intel has been thinking about data-center merchant silicon for some time. Acquisitions typically don’t come together overnight, and Intel doubtless has been taking careful note of the same trends many of us have witnessed.

Data Center on a Chip

In announcing the deal today, Intel has been straightforward about its motivations and objectives. As Intel officials explained to eWeek, Fulcrum’s chip technology will not only allow network-equipment vendors to satisfy demand for high-performance, low-latency 10GbE and 40GbE gear, but it also will put Intel in position to fulfill silicon requirements for all aspects of converged data centers. With that in mind, Intel has stated that it is working to integrate a portfolio of comprehensive data-center components — covering servers, storage, and networking — based on its Xeon processors.

With converged data centers all the rage at Cisco, HP, Dell, IBM (and many other vendors besides), Intel wants to put itself in position to meet the burgeoning need.

Intel did not disclose financial details of the acquisition, which is expected to close in the third quarter, but analysts generally believe the deal will have only modest impact on Intel’s bottom line.

Strategically, though, the consensus is that it offers considerable upside. Intel apparently has told Deutsche Bank analysts that it now captures only about two percent of overall expenditures dedicated to data-center technology. Fulcrum is seen as a key ingredient in helping Intel substantially boost its data-center take.

Unlikely to Repeat Past Mistakes

The deal puts Intel into direct competition with other merchant-silicon vendors in the networking market, including Broadcom and Marvell. Perhaps a bigger concern, as pointed out by Insight64 analyst Nathan Brookwood, is that Intel failed in its previous acquisitions of network-chip suppliers. Those acquisitions, executed during the late 1990s, included the $2.2-billion purchase of Level One.

Much has changed since then, of course — in the market in general as well as in Intel’s product portfolio — and Brookwood concedes that the Fulcrum buy seems a better fit strategically and technologically than Intel’s earlier forays into the networking space. Obviously, data-center convergence was not on the cards back then.

Aligned with March to Merchant Silicon, Rise of Cloud

To be sure, the acquisition is perfectly aligned with the networking community’s shift to merchant silicon and with the evolution of highly virtualized converged data centers, including cloud computing.

One vendor that’s enthusiastic about the deal is Arista Networks. In email correspondence after the deal was announced, Arista CEO Jayshree Ullal explained why she and her team are so excited at today’s news.

Arista Thrilled 

First off, Ullal noted that Arista is one of Fulcrum’s top customers. Intel’s acquisition of Fulcrum, Ullal said, “validates the enterprise-to-cloud networking migration.” What’s also validated, Ullal said, is merchant silicon, as opposed to “outdated clunky ASICs.” Now there are three major merchant chip vendors serving the networking industry: Intel, Broadcom, and Marvell.

Ullal also echoed others in saying that the deal is great for Intel because it moves the chip kingpin into networking/switch silicon and cloud computing. Finally, she said Fulcrum benefits because, with the full backing of Intel, it can leverage the parent company’s “processes and keep innovating now and beyond for big data, cloud, and virtualization.”

Even though, monetarily, there have been bigger acquisitions, today’s deal seems to have a strategic resonance that will be felt for a long time. Intel could play a significant role in expediting the already-quickening commoditization of networking hardware — in switches and in the converged data center — thereby putting even more pressure on networking and data-center vendors to compensate with the development and delivery of value-add software.

What Philosophers Counsel on the Private Cloud

“The beginning of wisdom is the definition of terms.” — Socrates

“If you wish to converse with me, define your terms.” — Voltaire

As Socrates and Voltaire knew, meaningful discourse depends on a shared understanding of the topic under discussion. Shared understanding, in turn, requires clearly defined and agreed-upon terms.

In the sphere of cloud computing, defined terms and shared understanding have been at a premium. Protracted debates have ensued regarding the definitions of various permutations of cloud computing. Debate and discourse should eventually resolve into enlightenment, but discussion of cloud computing often produces more heat than light. It’s been that way for a long time.

Consensus Hard to Find

This year, the latest cloud-computing flashpoint involved a battle over the definition of the private cloud. As far as I can tell, the battle still rages with no end in sight. Many definitions have been advanced by a number of notable individuals and groups — and sometimes heated discussion has followed their introduction — but the disputation still hasn’t resulted in consensus and shared understanding, though some organizations, such as the National Institute of Standards and Technology (NIST), have made valiant efforts to deliver much-needed clarity.

Technical criteria aside — such characteristics and qualifiers have been discussed at length by accomplished software architects, illustrious infrastructure personages, and all manner of engineers and market analysts — what seems to be occurring is a power struggle.

Our philosopher friends quoted above observed that a solid definition of terms was the starting point for meaningful conversation and wisdom. That’s true, of course. But it’s also true, as political philosophers will tell us, that knowledge — which is based on the understanding that accrues from agreed terminology — is power.

Battle for Control

Accordingly, we can understand the battleground of cloud computing — some have called it a “circus,” but a circus aims at entertainment and amusement — if we consider its political aspects. It’s here that we can see the central conflict.

On one side, we have an established order or status quo — represented by the way IT has been supplied, delivered, and managed until now — arrayed against the forces of change, represented by cloud computing’s purest adherents and most passionate proponents. Each side wants to control the discussion, which is why the battle over terminology is so intense and why definitions are constantly revised and challenged.

For those waging the battle, the fight itself makes sense. Each side, as noted, wants to control terminology so that it can condition and facilitate a desired result. Unfortunately, disinterested observers, including enterprises and organizations trying to set their IT-related roadmaps and strategies, are confused by the tumult. They’re not getting what they need from the principals in the debate.

Cutting Through the Noise

Fortunately, there is a way for these organizations to cut through the noise, to gain the insight and understanding they require to set their course and ascertain whether, how, and where new methodologies, service models, and technologies are applicable.

How they do it, of course, is by being as self-interested as the disputants on either side of the cloud-computing debate. What they must do is demand clear answers as to how what’s being pushed at them will add value to their organizations.

When a vendor or service provider makes a pitch, prospective customers must step back and consider the implications. Who benefits? Is it just the vendor or service provider making the pitch, either by retarding change or hastening it, or is it the customer, which must support established applications and processes while charting an assured course toward lower costs, higher productivity, and greater overall efficiency? The answer to that question is the real litmus test, for the solicitous vendor as well as the prospective customer.

It Is What It Does

According to mathematician-cum-philosopher Alfred North Whitehead, a thing is what it does. If a vendor is pushing a hard sell on the public or private cloud, customers should challenge the vendor to clearly state the customer-centric benefits, costs, and implications of the proposed offering. Then, after the vendor has made its case, the customer can evaluate it. In the end, each individual customer should and must make its own decision, based on its objectives, its needs, its requirements, its risk tolerance, and its culture.

If there is no common industry-wide definition — and the vendor community has been responsible for the cloud-computing muddle — then each prospective customer will have to reach its own conclusions about what’s really being discussed. That’s how it needs to be, anyway.

Microsoft Past Its Prime, but Even Blodget Wouldn’t Bet on Its Imminent Demise

Henry Blodget’s Business Insider is a guilty pleasure. From the tabloid headlines to the flashpoint content, carefully contrived to generate criticism and heated debate, Blodget gives you plenty of sizzle even when he forgets to put a steak on the grill.

Occasionally, though, he’ll provide some food for thought alongside the traffic-seeking sensationalism. In one of his latest pieces — portentously titled “The Odds Are Increasing That Microsoft’s Business Will Collapse” — Blodget injects enough plausibility into his argument to evoke the image of an erstwhile software giant staggering incontinently toward an open grave.

To summarize, Blodget contends that Microsoft draws the vast majority of its profits from its Windows and Office franchises. He provides colorful charts to illustrate the point, which is indisputable. He then posits Microsoft’s predicament: the Internet, the rise of mobility (where Microsoft’s performance has been abject), the ascent of cloud computing, and the determined competitive incursions of Apple and Google have put Microsoft’s cash cows in mortal peril.

As Blodget phrases it:

The desktop PC isn’t the center of anyone’s universe anymore. The Internet is. And the Internet doesn’t require Windows.

As for Office, he points to the rise of Google Apps, which Blodget perceives as a “classic disruptive technology” that is “cheaper, easier, and more convenient to use than Microsoft Office.”

At the end of the piece, Blodget presents three scenarios for Microsoft:

Right now, the investors are concluding that Microsoft will gradually become the equivalent of a technology utility–a boring but necessary provider of the software that runs the world’s business community.  A smaller, more optimistic crowd is still arguing that, one day, Microsoft will be able to turn its fortunes around, and fight its way back into an industry leadership position.

What almost no one is talking about is a third possibility, one that becomes more likely by the day: The possibility that, a couple of years down the road, Microsoft’s business may just completely collapse.

Given enough time, anything is possible. Still, is there a strong likelihood that Microsoft’s business will “completely collapse” in two years? I doubt it. The primary reason for such doubt is that customers aren’t moving to the cloud fast enough to bring about Microsoft’s immediate demise.

Startup companies, free of established processes and prior IT investments, increasingly are adopting cloud models that tend to leave Microsoft out of the action (or with only a small piece of it). Even so, Microsoft has a Windows installed base of SME and enterprise customers that will think at least twice before abandoning the devil they know. That’s human nature, especially during a period of great and persistent economic uncertainty.

The situation is similar, though perhaps more tenuous, for Office. Google will win defections, starting in vertical markets where Microsoft’s Office pricing is most onerous and its high-end features less necessary. There’s no question that Microsoft will see erosion in its licensed and shrink-wrapped Office business, but that erosion is not likely to become a catastrophic landslide within two years.

Are Microsoft’s best days behind it? Yes, I think so. The company is extremely unlikely to reach anything approaching market leadership in mobile platforms and smartphones, its former hold on PC and mobile-device OEMs has slackened, and it’s at a perennial loss in areas such as web search and in most consumer markets. It needs to invest more in its SME and enterprise offerings, including its business-oriented cloud services, and less in consumerist boondoggles.

But the collapse of Microsoft in two years? All things considered, I’d bet against that outcome. I tend to think Blodget would, too. Then again, he’s drawn traffic with his provocatively headlined post, so he probably won’t mind the hedging strategy.

GridPoint Among Startups Recalibrating in Search for Smart-Grid Gold

Using GridPoint as an example, Martin LaMonica of CNET examines the hardships some smart-grid startup companies are experiencing as utilities take a discriminating approach to expenditures on technology upgrades.

Even though the general consensus holds that the smart grid eventually will fulfill its commercial promise — most of it, anyway — many market analysts and startup investors now concede that they were overly optimistic regarding their industry forecasts and commercial expectations.

Like a pop star trying to appeal to a fickle audience, GridPoint has reinvented itself on a number of occasions, with its transformations perhaps prompted as much by anxious investors as by customer demand. Depending on one’s perspective, GridPoint is a market visionary seeking to provide comprehensive grid-management software or an increasingly desperate company firing shotgun blasts in all directions.

As often is the case, however, it’s not that simple. In nascent markets, such as the smart-grid space, startup vendors often recalibrate their strategic plans as expectations meet reality. It isn’t unusual for companies to go through several metamorphoses before finding the right path to prosperity — or getting irredeemably lost in the wilderness (where there are no paying customers). It remains to be seen how it will end for GridPoint, but the company is leaving no stone unturned in its quest for viability.

GridPoint has struggled as a purveyor of residential energy-management software, partly because consumers remain unconvinced they need such a product and partly because utilities are ambivalent about acting as a sales channel for such products. GridPoint also has tried, with varying degrees of success, to sell grid-management software, including vehicle-to-grid (V2G) solutions, directly to utilities. Now, the company believes it has found the right formula: offering energy-management software to commercial, industrial, and government customers, bypassing utilities in the process.

There is an identifiable, well-contested market for demand-response software in the commercial and industrial sectors, but GridPoint’s play is a bit different. Through its acquisitions of Standard Renewable Energy and ADM Micro, GridPoint has put together a relatively comprehensive offering for businesses and government organizations pursuing “optimized energy consumption” — comprising reduced costs, longer equipment life cycles, and attainment of corporate-sustainability goals — as well as a greater integration of renewable energy (and, thus, lower emissions) into their consumption profile.

It’s true that organizations can derive efficiency gains and cost savings from these sorts of solutions, but customers tend to be keener on adoption if they have a self-appointed or externally enforced “green mandate.” For that reason, large departments and agencies at various levels of government, even in these straitened times, might be the low-hanging fruit for GridPoint’s latest near-term revenue focus.

As for what LaMonica’s interlocutors, including GridPoint, tell us about the state of the rest of the smart-grid space, I agree and disagree with some of their salient observations. Yes, I agree that it’s exceedingly difficult at this juncture to sell home-energy management solutions to consumers. Most consumers aren’t fully cognizant of the smart grid, and many whose homes have been equipped with smart meters aren’t much interested in the new devices. They typically don’t notice the smart meters until time-of-use (ToU) billing is activated, at which point they are as likely to react with indignation as with bemused curiosity.

It’s not clear to me that utilities know how to sell residential energy-management systems, nor is it obvious that they want to sell them. At the same time, utilities are equally concerned, perhaps for good reason, about allowing third-party vendors, such as Google and Microsoft, to circumvent them and go directly to consumers with home-energy management offerings.

Another challenge, of course, is proving to consumers that the time and money they’ll spend on such systems will be rewarded with a compelling ROI, whether measured monetarily or in environmental gratification. Utilities haven’t worked it out, and neither — as far as I can see — have vendors of such products. For their part, regulators and public-utility commissions (PUCs) seem undecided about how to proceed.

So, yes, the smart-meter-connected consumer remains a tough nut to crack. Interests haven’t yet aligned to bring the consumer the type of value proposition that will persuade him or her to become an active market agent.

But, contrary to what the article seems to suggest, utilities are spending on smart-grid upgrades to their electricity generation, transmission, distribution, and substation infrastructure. Vendors are making money selling products and services to utilities in those areas. GridPoint might have missed that particular target, but others are hitting it. The spending occurs in phases — not all at once, and not in a huge wave — but it is proceeding in measurable increments that continue to grow.

The smart grid is an expansive, sprawling, heterogeneous mosaic of functions, products, technologies, interdependencies, and ecosystems. Depending on one’s particular vantage point, it will look different. What’s more, not all smart grids, in all parts of the world, are created equal. Some utilities are ahead of others, and some have different priorities based on economic, environmental, financial, geographical, and policy considerations.

For vendors, as GridPoint will attest, the challenge is in determining who in the smart-grid constellation is willing to spend now on urgent near-term priorities as well as on long-term strategic initiatives. For some vendors, that will mean reaching outside utilities, while others have found ready markets for their products and services inside utilities.

Facebook Croons New Tune, But Song Remains the Same

Bruce Nussbaum, a former assistant managing editor at Business Week who now serves as a professor at Parsons School of Design, makes the argument that some of Facebook’s current privacy-related woes stem from its inability to remain attuned to cultural changes affecting its audience.

I’m not sure whether I buy the argument in its entirety, partly because Facebook long ago left behind its singular focus and dependence on college and high-school kids. Still, two brief sentences in Nussbaum’s blog post at Harvard Business Review are undeniably true:

At the moment, it (Facebook) has an audience that is at war with its advertisers. Not good.

No, it’s not good. But, as I argued early last year, Facebook was destined to be in conflict with its audience. The outcome was inevitable, resulting from Facebook’s inability or unwillingness to be transparent about the specifics of its business model and its exploitative relationship with its audience.

Facebook was neither forthcoming nor honest. Then, as now, Facebook continues to play a cynical game with those who use its service. It continues to lead them to believe they incur no downside for using a nominally free service. Then, as subscribers drop their guard, Facebook exacts a price, furtively dismantling privacy protections and trading on the sorts of sliced-and-diced demographic data that advertisers crave.

Now, as Facebook goes through another privacy overhaul, promising to make amends for what has become a pattern of deception and dishonesty, subscribers to the service ought to recall a hackneyed admonition about violated trust: Fool me once, shame on you. Fool me twice, shame on me. (George W. Bush emphasized a variation on this theme, you might remember.)

The truth is, Facebook can’t change. It’s too late. It’s caught in the bind I described in that blog post back in early 2009. Still, even though Facebook is ensnared in a trap of its own design, its audience doesn’t have to go along for the ride.

Will Electric Cars Redeem the Smart Grid’s Reputation?

Michael Kanellos of Greentech Media has written a commentary suggesting that electric vehicles might be the silver bullet that overcomes public apathy and outright antagonism toward smart meters and the smart grid.

After explaining that utilities in the United States and Australia have discovered that consumers aren’t enamored of the concept of demand response or of the higher electricity bills that frequently accompany smart-meter rollouts, Kanellos writes the following:

Even avid greenies seem blasé. In Canada, Toronto Hydro has scrutinized the behavior of around 115,000 customers on time-of-use plans. Has cut rate power at night goosed them to shift their behavior? “No. Not really,” said Toronto’s Karen France during a meeting at eMeter’s customer event.

Matt Golden, co-founder of retrofitter/software vendor Recurve, told me recently that the company has installed some energy management dashboards in the homes of clients. After two weeks, the frequency of interaction with the dashboards drops considerably. There have been success stories — customers surveyed in a test conducted by Silver Spring Networks and Oklahoma Gas and Electric were overwhelmingly surprised to learn about their rate of energy consumption — but people seem to be dozing off on what is a very important technology.

So what’s the problem? Utilities and building management outfits are asking people to change their behavior to save pennies. PG&E’s residential rates range from 11 to 49 cents a kilowatt hour. Will you alter your laundry schedule to save 37 cents? Toronto’s spread is 9.9 cents at peak and 4.4 at night.
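To put rough numbers on those spreads: assuming a typical laundry load draws on the order of one kilowatt-hour (my assumption for illustration, not a figure from the article), the per-load savings from shifting to off-peak hours are trivial.

```python
# Per-load savings from shifting a chore to off-peak hours.
# Rates are the ones quoted above; ~1 kWh per laundry load is an assumption.

def shift_savings(peak_rate: float, offpeak_rate: float, kwh_per_load: float = 1.0) -> float:
    """Dollars saved per load by running it off-peak instead of at peak."""
    return (peak_rate - offpeak_rate) * kwh_per_load

print(shift_savings(0.49, 0.11))    # PG&E's widest spread: about $0.38 per load
print(shift_savings(0.099, 0.044))  # Toronto Hydro: about $0.06 per load
```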

Indeed, Kanellos identifies the problem, in Toronto and elsewhere. But the problem runs deeper than that, and Kanellos, to his credit, addresses it.

A little later in his commentary, Kanellos writes that consumers are wary of smart meters, and of the larger smart grid, because they suspect strongly that utilities will be the only parties to benefit from them. There’s some truth to that assessment, too, especially when one considers utilities’ operational cost savings: no more truck rolls for meter reading or for shutting down or activating service, plus the capacity to shave peak demand and to avoid having to add costly electricity-generation capacity.

For the consumer? Well, the benefits aren’t so clear, and certainly not as compelling. In some jurisdictions, careful consumer ministrations to smart meters mean only the difference between small increases in electricity bills and larger hikes.

Kanellos thinks electric cars will enhance the consumer appeal of the smart grid. To his way of thinking, electric cars are destined to be a huge hit with consumers, who will come to understand that the smart grid, including charging stations at home and out in the wider community, is essential to the sustenance of their new vehicles. At that point, Kanellos believes, consumers will grasp the importance and value of the smart grid, and they’ll buy into the program the utilities are pushing.

Maybe Kanellos is right. Perhaps electric vehicles will rescue the smart grid from public apathy and infamy. Then again, electric cars will not become ubiquitous overnight. A year from now, even a few years from now, not everybody will have one.

In the meantime, the brain trusts at utilities, regulators, and smart-grid vendors will have to devise other means of engaging, rather than alienating, electricity consumers.

Resisting Facebook’s f8

So, Facebook is opening its f8 developer conference today, and there’s some debate regarding how one should pronounce the event’s title.

Some say the pronunciation should be two syllables, as in the letter F and the number 8. Others, though, suggest that the pronunciation should be “fate,” as in the word denoting “the development of events beyond a person’s control, regarded as determined by a supernatural power.”

Well, there are some big egos at Facebook, and I would imagine the reality-distortion fields in the company’s boardrooms and hallways have the power to scramble logical thinking and to engender delusions of grandeur. Facebook might actually believe that it is fated to conquer the world, or at least that portion of it that exists online.

Lately much debate has ensued about whether Facebook will render Twitter irrelevant. Like many others, I don’t see a close similarity between the two companies, the online services they offer, their subscriber demographics, or even their current business models. Given Facebook’s prodigious user base, however, there obviously is overlap between its subscribers and Twitter’s.

But the services themselves are very different. From my perspective, Twitter is about communication and information sharing. Facebook, though I haven’t been on it for a long time, seems to be about frivolity and triviality: a veritable online water cooler. It’s designed to be a place where people go for distractions, like television but more interactive.

That’s not surprising, because Facebook’s real purpose is to serve as a giant consumer-analytics engine for advertisers. To the extent that it can cover the web, vacuuming up information about what its subscribers do and where they go in their online existence, Facebook stands to make a lot of money.

But there’s not much to Facebook beyond that. It’s trying to transform its subscribers into an enormous database of likes and dislikes that can be segmented and sold to corporate marketers and advertisers. That’s always been Facebook’s game — as I’ve said here for a long time — and that’s why consumer and customer privacy just isn’t a priority for Facebook.

I’ve always enjoyed the delicious irony that Facebook originated at Harvard University. An institution renowned for erudition and scholarly achievement has produced a commercial entity that does its utmost to culturally impoverish the Internet, and to turn its subscribers into nothing more than data points for advertising campaigns.

Facebook is so malevolently vacuous that it reminds me of the corporate fascism depicted in RoboCop. Facebook is the online manifestation of Omni Consumer Products (OCP), the movie’s fictional, omnipresent megacorporation. Facebook probably would like nothing more than to have its subscribers function solely as consumers, focused only on their likes and dislikes of products and services that advertisers want them to buy.

Like the lecherous huckster in RoboCop who kept repeating the phrase “I’ll buy that for a dollar!”, Facebook will try continually to dumb down and commercially condition online communication and interaction.

But I’m not buying what it’s selling, not for a dollar or for any other amount.

Google-China Conflict Must Be Viewed in Context of Bigger Story

As the old saw goes, we sometimes can’t see the forest for the trees. What’s happening is hidden in plain sight, but we don’t see it, either because we’re focusing too closely on an incidental element or because we don’t want to confront an unpalatable reality.

I feel that way as I watch the Google-China conflict play out. In truth, the dispute between Google and China is a symptom of a larger problem, one that has far-reaching implications for Western economies and entire industries, including the technology sector.

No, censorship is not the core issue. Censorship is a MacGuffin, a plot device that keeps the story moving in the media but doesn’t get to the heart of what’s really happening. As much as we like to think our companies value human rights above all else, it’s simply not true. Companies are businesses, and they behave like businesses. They’re guided by the profit motive, and they seek to grow revenue and earnings. It’s what they do.

Occasionally, ethical and moral considerations play a role in corporate strategies. There are companies that practice enlightened self-interest, and Google is one of them.

Google knows, for instance, that its search engine is more popular and valuable if it is seen to be objective, delivering the best possible results, not beholden to the solicitations of commercial interests or the fiats of oppressive governments. Paradoxically, by refusing to capitulate to those who would have Google skew its search results, Google actually makes its search engine more valuable to everybody, including Google. That’s enlightened self-interest.

So, what’s really happening? What’s the big picture? Google is one of dozens of Western multinational companies finding that China, though the fastest-growing major economy in the world, will not provide them with the riches they had anticipated. That’s because of China’s nationalist mercantilism, as reflected in its “indigenous innovation” industrial policies.

A story in today’s Wall Street Journal is instructive. Titled “Business Sours on China,” the article explores the growing disillusionment of foreign businesses in China. These businesses are discovering that Chinese authorities are increasingly favoring homegrown state-owned companies across a range of industries, including almost all involving technology-related growth sectors.

What follows is a salient excerpt from the WSJ story:

“The Google issue has had a crystallizing effect,” says Lester Ross, managing partner in Beijing for U.S. law firm Wilmer Cutler Pickering Hale and Dorr. “It raised the consciousness of government and of the boardrooms and other stakeholders” about the difficulties of doing business in China, he says.

Foreign investors have long complained about China’s haphazard legal system and regulation.

These were mere annoyances when China was an emerging market. Today, the huge Chinese market is increasingly fundamental to the health of large Western multinationals. Lose here, say Western executives, and multinationals are weakened globally.

So, as you can see, the stakes are huge. Companies that have built robust Chinese growth into their business models and revenue projections are increasingly anxious — and for good reason.

It doesn’t help that China’s systematic efforts to create state-backed, homegrown, market-leading behemoths doesn’t stop at “indigenous innovation.”

Remember that these issues are being raised by foreign transnationals in the immediate aftermath of what McAfee calls Operation Aurora, an outbreak of corporate espionage in which China-based hackers allegedly attempted to purloin source code, product formulas, and other intellectual property from the “software configuration management systems” of at least 20 (and perhaps as many as 100) US-based companies. (Yes, Google was one of them, and that’s how and when its latest conflagration with China began.)

We don’t know what intellectual property was stolen from which companies. That information is not being volunteered. What’s not at issue is that somebody was trying to get what McAfee calls the corporate “crown jewels.”

I’m not saying censorship and human-rights abuses are not important issues. I wish they were more important than they are. But the fact is, this story is even bigger, with ramifications that could affect the health of Western economies as well as the profitability of the corporations they host.

Dell at the Crossroads

As I read the news coverage of Dell’s fourth-quarter financial results, I noticed a salient question from Shannon Cross of Cross Research:

“You have higher revenue but we didn’t see it on the bottom line. The question is, what is the potential profitability of their model?”

That’s a good question. I don’t have the answer, and I’m not sure Dell does. Which raises another question: Just what is Dell’s strategic focus?

The company is caught between a rock and a hard place. When Michael Dell returned to the company, he said he would boost gross margin and find a way to bring back the balanced profitability and growth for which Dell was known in its halcyon days.

He’s struggled to recreate the old magic, but it’s not because he’s doing things differently from how he and his team did them previously. In fact, the problem is that Dell hasn’t adapted enough to current circumstances. Dell needs to make some hard choices, and that means answering some difficult questions.

For example, should it continue to play in the consumer-PC market? I think the answer to this question depends primarily on whether it has the wherewithal to succeed in the market, and secondarily on whether the company can reduce its component and production costs to the point where it makes decent margins on sales. As things stand, Dell isn’t getting it done, and one has to wonder whether the situation will change. As it slides down the PC market-share charts, its economies of scale won’t improve.

The company also has a branding problem in the consumer space, and that exacerbates the situation. The tarnished brand can be burnished, but that will take sustained effort and resources, both of which might be more gainfully employed in other areas of the business.

Recently, Dell has added a smartphone and a five-inch tablet PC to its consumer-product portfolio. I understand the motivations. Dell wants to get a piece of the relatively high-margin smartphone market, and it’s also keen to ride the iPad wave in the seemingly resurgent tablet space. However, does Dell have a market mandate to play in these spaces? Does it have a reasonable expectation of being anything more than a non-medal contender in those areas?

A given market segment might be attractive, for reasons of margin or other considerations, but not every company should try to compete in it. The dynamics of the aforementioned consumer segments overwhelmingly favor the top market-share players, and I don’t think Dell can become a leader in smartphones or tablets, especially with products that seem compromised, inspired by calculations of margin percentage rather than by an implicit understanding and appreciation of consumer interest.

On the other side of Dell’s business, there’s promise. In the SMB and enterprise markets — as well as in verticals bolstered by its acquisition of Perot Systems — Dell can compete effectively and win. It has a decent brand, it has the customer relationships, it has a reasonably attractive product and services portfolio, and it’s growing its profile in the emerging BRIC economies (though I wonder whether any technology company that is not of China can truly thrive for long in the Chinese market).

In those business-oriented markets, the company also possesses a good appreciation of what the customers want today and what they might want tomorrow. As some news coverage suggests, Dell might be discounting more than is necessary to maintain presence in SMB, enterprise, and government accounts. But, with careful calibration, that problem can be readily fixed.

Another area that Dell needs to reconsider, on the product side, is its networking strategy. I think this is an area where it cannot be ambiguous. Dell can follow HP’s lead and go all in against Cisco as a direct competitor, or it can take a software-based, services-led approach that is agnostic toward network infrastructure, responding impartially and objectively to customer needs. Sometimes that will mean working with Cisco and its gear and sometimes not, but it doesn’t entail the same stark dynamics as an unambiguously antagonistic relationship.

Here’s the question that should drive that decision: In all honesty, and without the reality distortion that comes from wearing one’s own marketing goggles, does Dell believe that enterprise customers want the integrated, proprietary data-center pitch of Cisco’s Unified Computing System (UCS)? If Dell believes that’s what enterprise customers want, and that HP will follow suit once it has integrated 3Com into its full-service offerings, then Dell probably has no choice but to acquire and own the necessary networking assets to play the same game.

If Dell doesn’t believe that UCS is what customers want, and instead expects customers to seek an open, interoperable approach to data-center integration, then the company ought to take an IBM-like, integrator’s approach to the market. It can leverage Perot, focus on extending its software portfolio in areas such as data-center management and orchestration, build value at the application layer, and invest in the glue that brings everything together rather than in the underlying plumbing.

Dell knows what its customers are telling it. The company just has to listen to what’s being said.