Monthly Archives: March 2010

Update on Status of HP’s Pending 3Com Acquisition

For those of you wondering whether or when HP’s pending acquisition of 3Com will be approved by China’s Ministry of Commerce (MOFCOM), what follows is the latest word, excerpted from a 3Com Form 8-K SEC filing:

On November 11, 2009, we announced an agreement to be acquired by Hewlett-Packard Company pursuant to a merger agreement executed by the parties (the “Merger”). The parties are currently targeting completion of the merger by the end of April 2010, however the exact timing cannot be predicted.

The closing of the Merger is subject to the satisfaction or waiver of specified closing conditions, including, without limitation, (i) the adoption of the Merger Agreement by 3Com’s stockholders and (ii) the expiration or termination of waiting periods, and obtaining of requisite approvals or clearances, under specified antitrust and competition laws (including, without limitation in China, the European Union and the United States, among others). On December 22, 2009, the relevant U.S. antitrust authorities granted early termination of the waiting period under the U.S. Hart-Scott-Rodino Antitrust Improvements Act of 1976, as amended. On February 12, 2010, the European Commission cleared the Merger under the EU Merger Regulation. In addition, on January 26, 2010, 3Com’s stockholders adopted the Merger Agreement at a special meeting of stockholders.

So, the wait continues, but the end is in sight.

Smart-Grid Success Depends on Alignment of Interests

I wonder whether utility executives are suffering from survey fatigue. Every week or so, new smart-grid market research, derived from surveys of the utility industry, hits the streets.

Maybe the utility executives enjoy the attention. That, at least, would explain why so many of the studies achieve participation rates that yield useful sample sizes.

Interestingly, the surveys are producing similar findings. One thing that seems clear is that utilities aren’t certain when the smart-grid payoff will come — for them or for their customers.

For instance, according to the results of a Comverge survey of 100 utility-industry attendees at last week’s DistribuTech utility conference in Tampa, nearly as many respondents believe that measurable smart-grid benefits are one to three years from realization (27 percent) as believe that measurable benefits will accrue in 10 or more years (29 percent). Ten years, or more, is a long way off.

While spending on smart-grid initiatives seems strong, with 77 percent of respondents indicating budgets allocated to energy-management resources have increased in the past year, the industry’s smart-grid vision would benefit from greater clarity, coherence, and consistency.

Indeed, if the utilities aren’t sure when they’ll realize a return on their smart-grid investments, they seem even less certain about how and when their customers will derive benefits. I get the distinct impression the utility industry hasn’t fully thought through the customer angle. The formation of the Smart Grid Consumer Coalition (SGCC) was a sign that the industry at least recognized the problem, but the smart-grid spending priorities of the utilities suggest that remedial action is lagging.

How else to explain why smart meters are getting an aggressive push while smart energy outlets, in-home energy displays, and residential energy-management systems remain afterthoughts?

Let’s consider: The smart meter is just a digital two-way, real-time meter. It allows the utility to implement time-of-use (TOU) billing based on variable pricing. Consumers will pay more when energy is in peak demand, and presumably they will pay less when consuming energy during off-peak hours.

But if consumers have no means of accessing information and services that help them reduce their energy consumption during periods of peak demand, they will not realize savings on their energy bills. In fact, if experience is any guide, they’ll receive higher energy bills, which will sour them on smart meters and the smart grid.
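To make the arithmetic concrete, here’s a back-of-the-envelope sketch of how time-of-use billing can play out. The rates and consumption figures are purely hypothetical, invented for illustration rather than drawn from any utility’s actual tariff:

    # Hypothetical time-of-use (TOU) billing arithmetic. The rates and usage
    # figures below are invented to illustrate one point: without shifting
    # load off-peak, a TOU customer can pay more than under a flat rate.

    FLAT_RATE = 0.12       # $/kWh, hypothetical flat tariff
    PEAK_RATE = 0.20       # $/kWh, hypothetical TOU peak price
    OFF_PEAK_RATE = 0.08   # $/kWh, hypothetical TOU off-peak price

    def monthly_bill(peak_kwh: float, off_peak_kwh: float, tou: bool) -> float:
        """Return the monthly charge for the given peak/off-peak consumption."""
        if tou:
            return peak_kwh * PEAK_RATE + off_peak_kwh * OFF_PEAK_RATE
        return (peak_kwh + off_peak_kwh) * FLAT_RATE

    usage = {"peak_kwh": 400, "off_peak_kwh": 200}     # typical pattern, behavior unchanged
    shifted = {"peak_kwh": 250, "off_peak_kwh": 350}   # same total, load shifted off-peak

    print(f"flat rate:          ${monthly_bill(**usage, tou=False):.2f}")   # $72.00
    print(f"TOU, unchanged use: ${monthly_bill(**usage, tou=True):.2f}")    # $96.00
    print(f"TOU, load shifted:  ${monthly_bill(**shifted, tou=True):.2f}")  # $78.00

The toy numbers make the point: the meter alone changes nothing, and the savings materialize only when consumers have the tools and information to shift their consumption off-peak.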

That would be unfortunate, not just for utilities and for the vendors that supply them, but for everybody. The smart grid has the potential to deliver value that flows downstream from technology vendors and utilities all the way to their commercial, industrial, institutional, and residential customers.

What needs to happen is a proper alignment and reconciliation of interests. It will not be enough for utilities to foist feel-good sentiments on consumers about how smart meters contribute to energy conservation and environmental sustainability. That’s great, as far as it goes, but it doesn’t go nearly far enough — not during an era in which tapped-out consumers already will be trying to do more with less.

The smart grid can’t be exclusively about utilities saving money, either as a result of not having to utilize higher-cost energy-generation facilities for peak loads or of not having to build new electric-generation facilities to cope with rising demand. Consumers will have to share in the benefits; if they don’t, they’ll become adversaries, not partners.

Of course, the smart grid shouldn’t just be about making or saving money. It plays into a bigger picture, having to do with economic viability, industrial growth, technological innovation, and environmental sustainability. (That’s another post, which I will save for another time.)

Nonetheless, to the extent that money can and will be made, the utility industry must give careful consideration to ensuring that everybody in the value chain derives real benefits. If such consideration isn’t paid, then the rosy market forecasts might never come to fruition.

IBM-Johnson Controls Partnership Signals New Automation Wave

As Internet protocol proliferates, pervading every device that can possibly be networked, automated solutions are reaching into new realms and exploring untapped possibilities.

They’re also providing a dynamic foundation for new technology partnerships. One such partnership, which includes a smart-grid aspect, involves IBM and Johnson Controls.

The two companies are combining forces to improve the energy efficiency of office buildings and other commercial and industrial facilities. They’ll achieve that result by integrating IBM’s business-analytics software and middleware with Johnson’s building-control technology. The objective is to deliver a system that automatically turns off lights in unoccupied rooms or buildings, identifies areas of heat loss, and shuts HVAC systems on and off as required.
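To give a sense of what that kind of automation amounts to in practice, here’s a minimal sketch of the sort of rule such a system might enforce. The zone model, thresholds, and control logic are my own invention for illustration, not a description of the IBM-Johnson Controls product:

    # Illustrative building-automation rules: cut lighting in unoccupied zones
    # and run HVAC only for occupied zones outside a comfort band. The Zone
    # class and its thresholds are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Zone:
        name: str
        occupied: bool
        temp_c: float
        lights_on: bool
        hvac_on: bool

    def apply_rules(zone: Zone, setpoint_c: float = 21.0, band_c: float = 1.5) -> Zone:
        """Apply basic energy-saving rules to a single building zone."""
        # Lights: only when someone is actually in the space.
        zone.lights_on = zone.occupied
        # HVAC: run only for occupied zones that drift outside the comfort band.
        zone.hvac_on = zone.occupied and abs(zone.temp_c - setpoint_c) > band_c
        return zone

    office = Zone("3F-east", occupied=False, temp_c=24.0, lights_on=True, hvac_on=True)
    print(apply_rules(office))   # lights and HVAC both switched off for the empty zone

The value IBM brings is the analytics layer above rules like these; the value Johnson Controls brings is the instrumentation and control plumbing beneath them.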

IBM and Johnson Controls worked together previously to deliver energy-efficiency solutions for the data center. As a result of those customer engagements and others, they realized more was possible.

Clay Nesler, Johnson Controls’ VP for Global Energy and Sustainability, explained how the partnership evolved:

“These capabilities have been available for a long time, but they haven’t been widely applied . . . . Both organizations are committed to open standards and Web technologies. So while this would have required a lot of engineering and R&D work several years ago, we now hope to leverage as many standards as possible.”

That’s the key: Standards are facilitating the development and delivery of automated-management solutions, including many of those applicable to the smart grid, that were too cumbersome, too unwieldy, too customized (and therefore too expensive) in the past.

This is why the networked machine-to-machine (M2M) market is seen to offer so much promise, not just for IBM and Johnson Controls, but for every major information-technology vendor that has its eyes open and its skin in the game. Think of Intel, which can provide chips for the embedded devices; Cisco and its network infrastructure; Oracle with its analytics applications and databases; Dell with its servers; Ericsson with its wireless-networking technologies. Also consider the new business possibilities for wireless operators worldwide.

It’s a huge opportunity, and the smart grid, broadly defined, is a big piece of it. Technology vendors have a correspondingly large stake in ensuring that it is done right. Expect them to become active and involved partners with utilities, regulators, and governments.

Dell Makes Right Move Providing Financing for SMB Customers

Dell is being more aggressive in extending financing to its SMB customers, according to an article in the Wall Street Journal.

Although I recognize the risks inherent in providing relatively generous credit terms to SMBs, many of which have suffered inordinately during the savage economic downturn and the current joyless recovery, Dell is doing the right thing, for itself and for the economy.

As the money spigots are turned tightly off by banks and other traditional sources of credit, Dell and other vendors that depend on the ongoing patronage of SMBs confront a difficult dilemma: refuse to provide vendor financing to these customers, and see them perish or go to other vendors willing to extend financial largesse; or provide them with the financing they need to buy your products and services, incurring the risk that some of them will fail anyway and not repay your loans.

The second option is better, particularly if Dell is selective in how and to whom it provides its vendor financing. In that regard, Dell, which derives about 23 percent of its revenue from SMBs (companies with fewer than 500 employees), is taking a conservative, prudent approach in assessing the credit risks of its clientele.

That’s good practice, of course. But it’s also good practice for Dell to reach out and help those customers who can be helped, and who will continue buying Dell PCs, servers, and services when times improve. The concept of enlightened self-interest in business often is viewed cynically, and there’s no question that it features a hard edge of commercial realpolitik. It can work, though, delivering benefits for all involved, and this is one example where it definitely serves more than one party’s interests.

Dell has $7 billion in credit available for small companies, and it extends most of the credit itself rather than through financial intermediaries. Good for Dell, and good for the companies and organizations that qualify for the financing.

Brocade Regional Director: Ethernet Space a “Red Ocean with Blood Everywhere”

Computer Reseller News is running an interview with Charlie Foo, Brocade’s regional director in the company’s partner business group for Asia-Pacific and Japan.

His answers are more forthcoming than one might expect. Usually, these types of interviews offer neither candor nor insight, and the news value is negligible. There’s nothing earthshaking in what Foo tells CRN, but he admits to a few issues Brocade is trying to correct.

As you might expect, most of the challenges relate to Brocade’s Foundry operation, which produces Ethernet switches. Foundry has been struggling, underperforming and losing ground to rival vendors. Foo uses graphic language to illustrate the dilemma:

With the acquisition of Foundry, we got into the Ethernet space. The IP market is a red ocean with blood everywhere. We’re parachuting in this ocean, not knowing where we’re going to land. But what we will do is play in the verticals we are strong in. These include education, media, entertainment, healthcare, service providers and government.

A red ocean with blood everywhere? That can’t be good. It’s worse when your Ethernet IP-switching company is the one doing the most hemorrhaging.

Foo goes on to discuss Brocade’s plans for the SMB space — I’ll withhold judgment there, though I count myself among the somber skeptics — how Brocade intends to enlist and motivate Select Partners, and the company’s demand-creation plans. He also touches on a security partnership with McAfee.

At one point, while talking about the moves Cisco and HP have made to provide “converged networking” solutions for the data center, Foo contends that the 3Com-fortified HP still will not have IP-based Ethernet switching products that overlap with Brocade’s Foundry gear. That seems a hopeful assertion.

Given the challenges Brocade faces on the Ethernet switching side of the house, however, one can allow that a dose of optimism is a necessary tonic.

Cisco’s Tandberg Acquisition Officially Approved, Dance for Polycom Begins

When I first learned of the alleged acquisitive interest Apax Partners was said to have expressed toward Polycom, I dismissed it as nothing more than a media head fake.

Let’s consider: When news of that sort is leaked, it’s made public for a reason. In this context, it seemed, the reason was to bring others to the table. Somebody who has an interest in Polycom being acquired wanted to engender a bidding war for the company. It happens all the time.

There was something else, too. Apax didn’t seem a likely acquirer. Where were the direct synergies with Polycom in Apax’s investment portfolio? Where were the connections between Apax’s people and major vendors in the videoconferencing and unified-communications worlds? The deal didn’t offer enough risk mitigation for Apax; the pieces didn’t fit together.

Even if Apax had wanted to acquire Polycom, I’m not sure it had the conviction or the stomach to conclude the deal at the price Polycom would have commanded.

Now, though, Cisco’s acquisition of Tandberg has been consummated, and Polycom stands exposed. Polycom was Tandberg’s videoconferencing rival, and it’s a company of considerable importance to the UC strategies of more than one vendor.

We must consider the Cisco-Tandberg context, because contrivances like the leaked report of Apax’s interest in Polycom tend not to occur in a vacuum. Who’s supposed to step from the shadows and make a welcome bid, at an appetizing price, for Polycom?

There are a few candidates, including one that already has tipped its hand. That player is The Gores Group, 51-percent owner of Siemens Enterprise Communications. But The Gores Group’s bid was leaked, too, and we have to wonder why. Expect others to enter the picture, publicly or otherwise.

An obvious candidate is Avaya. Even though Avaya has barely digested its acquisition of Nortel’s enterprise business, it might feel as though it cannot let Polycom fall into other hands. In a perfect world, Avaya would not have to pursue Polycom now, immediately after assimilating and integrating Nortel.

Nonetheless, strategic imperatives might necessitate a move. Avaya is backed by the high rollers at Silver Lake, who rarely think small. They might not be willing to pass up the opportunity of taking Polycom off the board.

Who else? Not Dell. I can’t see it happening.

I don’t think HP will make the move, either. It’s got its own telepresence systems already, it’s very close to Microsoft in unified communications, and it wants to leverage Microsoft in the battle against their common enemy, Cisco.

Juniper is a possibility, but the company has signaled that it will grow organically, not through big-ticket M&A. Juniper will stay focused on building its intelligent network infrastructure and try not to get distracted by the action in the M&A casino.

IBM could make a move for Polycom, but I don’t think it will. Microsoft also enters the equation.

Yes, Polycom sells hardware, and Microsoft has steered clear of stepping on the toes of hardware partners such as HP. But there’s a way Microsoft could structure a deal that would be amenable to HP and its other hardware partners. All it takes is a little creativity and ingenuity, and Microsoft retains plenty of that commodity on the enterprise side of its business.

If I were making book on which company will acquire Polycom, I’d make Silver Lake-backed Avaya the favorite, with Gores-backed Siemens Enterprise Communications the second choice, Microsoft the third option, and IBM next. Of course, in no way do I encourage illicit gambling on prospective M&A activity.

If you have a theory on whether Polycom will be acquired, and by whom, feel free to share your thoughts below.

The Long, Winding Road to Application-Intelligent Networks

It seems that we’ve been talking about application-aware networking for a long time. I admit that I was talking about it more than a decade ago, back when I initiated a technology partnership between my employer at the time, a workload-management vendor, and a network colossus (a previous employer of mine) that shall also remain nameless.

My brainwave — if I might beg your kind indulgence — was that the availability of applications and services over the Internet could be greatly enhanced if one could somehow feed upper-layer intelligence about content and applications down to the switches and routers that were making load-balancing decisions at the network and transport layers.

This was back in 1998, so it was considered heady stuff. The partnership was necessary because my employer at the time knew a lot about the characteristics and real-time behavior of applications but knew little about the network; whereas the network-infrastructure vendor knew all about the network but not nearly enough about the applications it supported.

Because I had worked for both companies, I understood how they could combine forces to create an application-intelligent infrastructure that would provide unprecedented availability over and across the Internet. In theory, everything should have worked without a hitch. In theory.
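For a concrete picture of the underlying idea, here’s a minimal sketch of a load-balancing decision informed by application-layer signals such as health, queue depth, and response time, rather than connection counts alone. The server model and the weighting are hypothetical, not a description of either company’s technology, then or now:

    # Sketch of an application-aware load-balancing decision: instead of
    # simple round-robin or connection counting at the network/transport
    # layers, the chooser consults application-layer signals. The fields
    # and weights are invented for illustration only.

    from dataclasses import dataclass

    @dataclass
    class Server:
        name: str
        healthy: bool           # application-level health check, not just a TCP ping
        queue_depth: int        # pending application requests
        avg_response_ms: float  # recent application response time

    def choose_server(servers: list[Server]) -> Server:
        """Pick the healthy server with the lowest blended application load."""
        candidates = [s for s in servers if s.healthy]
        if not candidates:
            raise RuntimeError("no healthy servers available")
        return min(candidates, key=lambda s: s.queue_depth * 10 + s.avg_response_ms)

    pool = [
        Server("app-1", healthy=True, queue_depth=12, avg_response_ms=180.0),
        Server("app-2", healthy=True, queue_depth=3, avg_response_ms=95.0),
        Server("app-3", healthy=False, queue_depth=0, avg_response_ms=0.0),
    ]
    print(choose_server(pool).name)   # "app-2", the least loaded at the application layer

The hard part in 1998 wasn’t logic of this sort; it was getting the application-side signals into the hands of the network gear in the first place.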

The real challenge was getting the two companies to understand each other. It wasn’t a clash of corporate cultures so much as a failure to speak the same language. One focused on how and where applications were processed and the other focused on the pipes that stitched everything together. They had different assumptions, used different vocabularies, and they viewed the world — the data center, anyway — from different perspectives. Each had difficulty understanding what the other was saying. For a long time, I felt like an interpreter in the diplomatic service.

You’d think by now these problems would be behind us, right? You’d think that in an era when networking titans are buying videoconferencing vendors, computer vendors are buying networking vendors, and everybody seemingly has a strategy for the converged data center, the networking world and the computing world would speak the same language, or at least have an implicit understanding of what the other side is saying. Apparently, though, the problem persists. The chasm between the two worlds might not be as vast as it once was, but a schism still precludes a meaningful mind meld.

I reached that sad conclusion after reading a blog post by Lori MacVittie at F5’s DevCentral. In her post, she recounts finding an article that promised to explore the profound smartness of application-aware networking. Instead of reading about a network capable of understanding and dynamically supporting an application’s data and behavior, she read a piece that talked about “container concerns” such as CPU, RAM, and memory, with a dollop about 10-GbE networking tossed into the mix for good measure. As she writes:

Application-awareness is more than just CPU and RAM and network bandwidth. That’s just one small piece of the larger contextual pie. It’s about the user’s environment and the network and the container and the application and the individual request being made. At the moment it’s made. Not based on historical trends.

So, the technology isn’t the problem. The defining concept of application-aware networking has been with us for some time, and the technologies clearly exist to facilitate its widespread deployment. What’s preventing it from coming together is the balkanized thinking and — why not say it? — the entrenched politics of the traditional data center and its vendor ecosystem.

We’ll get there, though. The wheels are in motion, and the destination is in sight. The trip is just taking longer than some of us thought it would.

Lesson learned: Never underestimate institutional resistance to change that is seen to threaten an established order. Keep that lesson in mind as you consider all those rosy near-term projections for cloud computing.