Category Archives: PCs

PC Market: Tired, Commoditized — But Not Dead

As Hewlett-Packard prepares to spin off or sell its PC business within the next 12 to 18 months, many have spoken about the “death of the PC.”

Talk of “Death” and “Killing”

Talk of metaphorical “death” and “killing” has been rampant in technology’s new media for the past couple of years. When observers aren’t noting that a product or technology is “dead,” they’re saying that an emergent product of one sort or another will “kill” a current market leader. It’s all exaggeration and melodrama, of course, and it isn’t helpful. It lowers the discourse, and it makes the technology industry appear akin to professional wrestling with nerds. Nobody wants to see that.

Truth be told, the PC is not dead. It’s enervated, its best days are behind it, but it’s still here. It has, however, become a commodity with paper-thin margins, and that’s why HP — more than six years after IBM set the precedent — is bailing on the PC market.

Commoditized markets are no place for thrill seekers or for CEOs of companies that desperately seek bigger profit margins. HP CEO Leo Apotheker, as a longtime software executive, must have viewed HP’s PC business, which still accounts for about 30 percent of the company’s revenues, with utter disdain when he first joined the company.

No Room for Margin

As I wrote in this forum a while back, PC vendors these days have little room to add value (and hence margin) to the boxes they sell. It was bad enough when they were trying to make a living atop the microprocessors and operating systems of Intel and Microsoft, respectively. Now they also have to factor original design manufacturers (ODMs) into the shrinking-margin equation.

It’s almost a dirty little secret, but the ODMs do a lot more than just manufacture PCs for the big brands, including HP and Dell. Many ODMs effectively have taken over hardware design and R&D from cost-cutting PC brands. Beyond a name on a bezel, and whatever brand equity that name carries, PC vendors aren’t adding much value to the box that ships.

For further background on how it came to this — and why HP’s exit from the PC market was inevitable — I direct you to my previous post on the subject, written more than a year ago. In that post, I quoted and referenced Stan Shih, Acer’s founder, who said that “U.S. computer brands may disappear over the next 20 years, just like what happened to U.S. television brands.”

Given the news this week, and mounting questions about Dell’s commitment to the low-margin PC business, Shih might want to give that forecast a sharp forward revision.

Is Li-Fi the Next Wi-Fi?

The New Scientist published a networking-related article last week that took me back to my early days in the industry.

The piece in question dealt with Visible Light Communication (VLC), a form of light-based networking in which data is encoded and transmitted by varying the rate at which LEDs flicker on and off, all at intervals imperceptible to the human eye.

Also called Li-Fi — yes, indeed, the marketers are involved already — VLC is being positioned for various applications, including those in hospitals, on aircraft, on trading floors, in automotive car-to-car and traffic-control scenarios, on trade-show floors, in military settings, and perhaps even in movie theaters where VLC-based projection might improve the visual acuity of 3D films. (That last wacky one was just something that spun off the top of my shiny head.)
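Mechanically, the simplest form of the flicker-based encoding described above is on-off keying: each bit of the payload maps directly to an LED state (on for 1, off for 0), toggled far faster than the eye can register. Here is a toy sketch in Python; it is purely illustrative, and real VLC systems use faster, more sophisticated modulation and error coding.

```python
# Toy illustration of on-off keying (OOK), the simplest flicker-based
# encoding for visible light communication. Purely illustrative: real
# VLC systems use far faster, more robust modulation schemes.

def encode_ook(data: bytes) -> list[int]:
    """Map each bit of the payload to an LED state: 1 = on, 0 = off."""
    states = []
    for byte in data:
        for i in range(7, -1, -1):  # most significant bit first
            states.append((byte >> i) & 1)
    return states

def decode_ook(states: list[int]) -> bytes:
    """Reassemble sampled LED on/off states back into bytes."""
    out = bytearray()
    for i in range(0, len(states), 8):
        byte = 0
        for bit in states[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

if __name__ == "__main__":
    payload = b"Li-Fi"
    led_states = encode_ook(payload)  # 40 on/off samples
    assert decode_ook(led_states) == payload
```

A real receiver would also have to recover the transmitter’s clock and cope with ambient light, which is where much of the engineering effort in VLC actually goes.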

From FSO to VLC

Where I don’t see VLC playing a big role, certainly not as a replacement for Wi-Fi or its future RF-based successors, is in home networking. VLC’s requirement for line of sight will make it a non-starter for Wi-Fi scenarios where wireless networking must traverse floors, walls, and ceilings. There are other room-based applications for VLC in the home, though, and those might work if device (PC, tablet, mobile phone), display, and lighting vendors get sufficiently behind the technology.

I feel relatively comfortable pronouncing an opinion on this technology. The idea of using light-based networking has been with us for some time, and I worked extensively with infrared and laser data-transmission technologies back in the early to mid 90s. Those were known as free-space optical (FSO) communications systems, and they fulfilled a range of niche applications, primarily in outdoor point-to-point settings. The vendor for which I worked provided systems for campus deployments at universities, hospitals, museums, military bases, and other environments where relatively high-speed connectivity was required but couldn’t be delivered by trenched fiber.

The technology mostly worked . . . except when it didn’t. Connectivity disruptions typically were caused by what I would term “transient environmental factors,” such as fog, heavy rain or snow, and dust and sand particulates. (We had some strange experiences with one or two desert deployments.) From what I can gather, the same parameters generally apply to VLC systems.

Will that be White, Red, or Resonant Cavity?

Then again, the performance of VLC systems goes well beyond what we were able to achieve with FSO in the 90s. Back then, laser-based free-space optics could deliver maximum bandwidth of OC-3 speeds (155Mbps), whereas the current high-end performance of VLC systems reaches transmission rates of 500Mbps. An article published earlier this year provides an overview of VLC performance capabilities:

“The most basic form of white LEDs are made up of a bluish to ultraviolet LED surrounded by a yellow phosphor, which emits white light when stimulated. On average, these LEDs can achieve data rates of up to 40Mb/sec. Newer forms of LEDs, known as RGBs (red, green and blue), have three separate LEDs that, when lit at the same time, emit a light that is perceived to be white. As these involve no delay in stimulating a phosphor, data rates in RGBs can reach up to 100Mb/sec.

But it doesn’t stop there. Resonant-cavity LEDs (RCLEDs), which are similar to RGB LEDs and are fitted with reflectors for spectral clarity, can now work at even higher frequencies. Last year, Siemens and Berlin’s Heinrich Hertz Institute achieved a data-transfer rate of 500Mb/sec with a white LED, beating their earlier record of 200Mb/sec. As LED technology improves with each year, VLC is coming closer to reality and engineers are now turning their attention to its potential applications.”

I’ve addressed potential applications earlier in this post, but a sage observation is offered in the piece by Oxford University’s Dr. Dominic O’Brien, who sees applications falling into two broad buckets: those that “augment existing infrastructure,” and those in which visible-light networking offers a performance or security advantage over conventional alternatives.

Will There Be Light?

Despite the merit and potential of VLC technology, its market is likely to be limited, analogous to the demand that developed for FSO offerings. One factor that has changed, and that could work in VLC’s favor, is RF spectrum scarcity. VLC could potentially help to conserve RF spectrum by providing much-needed bandwidth; but such a scenario would require more alignment and cooperation between government and industry than we’ve seen heretofore. Curb your enthusiasm accordingly.

The lighting and display industries have a vested interest in seeing VLC prosper. Examining the membership roster of the Visible Light Communications Consortium (VLCC), one finds it includes many of Japan’s big names in consumer electronics. Furthermore, in its continuous pursuit of new wireless technologies, Intel has taken at least a passing interest in VLC/Li-Fi.

If the vendor community positions it properly, standards cohere, and the market demands it, perhaps there will be at least some light.

Reviewing Dell’s Acquisition of Force10

Now seems a good time to review Dell’s announcement last week regarding its acquisition of Force10 Networks. We knew a deal was coming, and now that the move finally has been made, we can consider the implications.

It was big news on a couple of fronts. First, it showcased Dell’s continued metamorphosis from being a PC vendor and box pusher into becoming a comprehensive provider of enterprise and cloud solutions. At the same time, and in a related vein, it gave Dell the sort of converged infrastructure that allows it to compete more effectively against Cisco, HP, and IBM.

The transaction price of Dell’s Force10 acquisition was not disclosed, but “people familiar with the matter” allege that Dell paid about $700 million to seal the deal. Another person apparently privy to what happened behind the scenes says that Dell considered buying Brocade before opting for Force10. That seems about right.

Rationale for Acquisition

As you’ll recall (or perhaps not), I listed Force10 as the second favorite, at 7-2, in my Dell Networking Derby, my attempt to forecast which networking company Dell would buy. Here’s what I said about the rationale for a Dell acquisition of Force10:

“Dell partners with Force10 for Layer 3 backbone switches and for Layer 2 aggregation switches. Customers that have deployed Dell/Force10 networks include eHarmony, Yahoo, and F5 Networks.

Again, Michael Dell has expressed an interest in 10GbE and Force10 fits the bill. The company has struggled to break out of its relatively narrow HPC niche, placing increasing emphasis on its horizontal enterprise and data-center capabilities. Dell and Force10 have a history together and have deployed networks in real-world accounts. That could set the stage for a deepening of the relationship, presuming Force10 is realistic about its market valuation.”

While not a cheap buy, Force10 went for a lot less than an acquisition of Brocade, at a market capitalization of $2.83 billion, would have entailed. Of course, bigger acquisitions always are harder to integrate and assimilate than smaller ones. Dell has found a targeted acquisition model that seems to work, and a buy the size of Brocade would have been difficult for the company to digest culturally and operationally. In hindsight, which usually gives one a chance to be 100% correct, Dell made a safer play in opting for Force10.

IPO Plans Shelved

Although Force10 operates nominally in 60 countries worldwide, it derived 80 percent of its $200 million in revenue last year from US customers, primarily data-center implementations. Initially, at least, Dell will focus its sales efforts on cross-pollination between its and Force10’s customers in North America. It will expand from there.

Force10 has about 750 employees, most of whom work at its company headquarters in San Jose, California, and at a research facility in Chennai, India. Force10 doesn’t turn Dell into an overnight networking giant; the acquired vendor had just two percent market share in data-center networking during the first half of 2011, according to IDC. Numbers from Dell’Oro suggest that Force10 owned less than one percent of the overall Ethernet switch market.

Once upon a time, Force10 had wanted to fulfill its exit strategy via an IPO. Those plans obviously were not realized. The scuttlebutt on the street is that, prior to being acquired by Dell, Force10 had been slashing prices aggressively to maintain market share against bigger players.

Channel Considerations

Force10 has about 1,400 customers, deriving half its revenue from direct sales and the other half from channel sales. Dell doesn’t see an immediate change in the sales mix.

Dell will work to avoid channel conflict, but I foresee an increasing shift toward direct sales, not only with Force10’s data-center networking gear, but also with any converged data-center-in-a-box offerings Dell might assemble.

Converged Infrastructure (AKA Integrated Solution Stack) 

Strategically, Dell and its major rivals are increasingly concerned with the provision of converged infrastructure, otherwise known as an integrated technology stack (servers, storage, networking, and associated management and services) for data centers. The ultimate goal is to offer comprehensive automation of tightly integrated data-center infrastructure. These things probably will never run themselves — though one never knows — but there’s customer value (and vendor revenue) in pushing them as far along that continuum as possible.

For some time, Dell has been on a targeted acquisition trail, assembling all the requisite pieces of the converged-infrastructure puzzle. Key acquisitions included Perot Systems for services, EqualLogic and Compellent for storage, Kace for systems management, and SecureWorks for security capabilities. At the same time, Dell has been constructing data centers worldwide to host cloud applications.

Dell’s converged-infrastructure strategy is called Virtual Network Services Architecture (VNSI), and the company claims Force10’s Open Cloud Networking (OCN) strategy, which stresses automation and virtualization based on open standards, is perfectly aligned with its plans. Dario Zamarian, VP and GM of Dell Networking, said last week that VNSI is predicated on three pillars: “managing from the edge,” where servers and storage are attached to the network; “flattening the network,” which is all the rage these days; and “scaling virtualization.”

For its part, Force10 has been promoting the concept of flatter and more scalable networks comprising its interconnected Z9000 switches in distributed data-center cores.

The Network OS Question

I don’t really see Dell worrying unduly about gaining greater direct involvement in wiring-closet switches. It has its own PowerConnect switches already, and it could probably equip those boxes to run Force10’s FTOS. It seems FTOS, which Dell is positioning as an open networking OS, could play a prominent role in Dell’s competitive positioning against Cisco, HP, Juniper, IBM, and perhaps even Huawei Symantec.

Then again, Dell’s customers might have a say in the matter. At least two big Dell customers, Facebook and Yahoo, are on the board of directors of the Open Networking Foundation (ONF), a nonprofit organization dedicated to promoting software-defined networking (SDN) using the OpenFlow protocol. Dell and Force10 are members of ONF.

It’s possible that Dell and Force10 might look to keep those big customers, and pursue others within the ONF’s orbit, by fully embracing OpenFlow. The ONF’s current customer membership is skewed toward high-performance computing and massive cloud environments, both of which seem destined to be aggressive early adopters of SDN and, by extension, the OpenFlow protocol.  (I won’t go into my thoughts on OpenFlow here — I’ve already written a veritable tome in this missive — but I will cover it in a forthcoming post.)

Notwithstanding its membership in the Open Networking Foundation, Force10 is perceived as relatively bearish on OpenFlow. Earlier this year, Arpit Joshipura, Force10’s chief marketing officer, indicated his company would wait for OpenFlow to mature and become more scalable before offering it on its switches. He said “big network users” — presumably including major cloud providers — are more interested in OpenFlow today than are enterprise customers. Then again, the cloud ultimately is one of the destinations where Dell wants to go.

Still, Dell and Force10 might see whether FTOS can fit the bill, at least for now. As Cindy Borovick, research vice president for IDC’s enterprise communications and data center networks, has suggested, Dell could see Force10’s FTOS as something that can be easily customized for a wide range of deployment environments. Dell could adapt FTOS to deliver prepackaged products to customers, which then could further customize the network OS depending on their particular requirements.

It’ll be interesting to see how Dell proceeds with FTOS and with OpenFlow.

Implications for Others

You can be sure that Dell’s acquisition of Force10 will have significant implications for its OEM partners, namely Juniper Networks and Brocade Communications. From what I have heard, not much has developed commercially from Dell’s rebranding of Juniper switches, so any damage to Juniper figures to be relatively modest.

It’s Brocade that appears destined to suffer a more meaningful hit. Sure, Dell will continue to carry and sell its Fibre Channel SAN switches, but it won’t be offering Brocade’s Foundry-derived Ethernet switches, and one would have to think that the relationship, even on the Fibre Channel front, has seen its best days.

As for whether Dell will pursue other networking acquisitions in the near term, I seriously doubt it. Zeus Kerravala advises Dell to buy Extreme Networks, but I don’t see the point. As mentioned earlier, Dell already has its PowerConnect line, and the margins are in the data center, not out in the wiring closets. Besides, as Dario Zamarian has noted, data-center networking is expected to grow at a compound annual growth rate of 21 percent through 2015, much faster than the three-percent growth forecast for the rest of the industry.

The old Dell would have single-mindedly chased the network box volumes, but the new Dell aspires to something grander.

HP’s TouchPad: Ground to Make Up, but Still in Race

After I wrote my last post about the limited commercial horizons of Cisco’s Cius tablet, I was asked to comment on the prospects for HP’s webOS-based TouchPad.

A Tale of Two Tablets

Like Cisco’s Cius, the TouchPad made its market debut this month, a few weeks ahead of its Cisco counterpart. The two tablets also have an enterprise orientation in common. Moreover, like Cisco’s Cius, the TouchPad was greeted with ambivalent early reviews. Actually, I suppose the early reviews for the TouchPad, while not glowing, were warmer than the tepid-to-icy responses occasioned by Cisco’s Cius.

There are other differences between the two tablets. For one, HP’s TouchPad sports its own mobile operating system, whereas Cisco has chosen to ride Google’s Android. There’s nothing wrong with Cisco’s choice, per se, but HP, in buying Palm and its webOS, has a deeper commitment to making its mobile-device strategy work.

As we’ve learned, Cisco is casting the Cius as an entry point — just one more conduit and access device — to its collaboration ecosystem as represented by the likes of WebEx and its Telepresence offerings.

Different Aspirations and Objectives

Put another way, HP clearly sees itself as a player in the tablet wars, while, for Cisco, tablets are incidental, a tactical means to a strategic end, represented by greater adoption of bandwidth-sucking collaboration suites and videoconferencing systems by enterprises worldwide. Consequently, it would come as no surprise to see Cisco bail on the tablet market before the end of this year, but it would come as a genuine shock if HP threw in the towel on webOS (and its associated devices) during the same timeframe.

That won’t happen, of course. HP believes it can carve out a niche for itself as a mobile-device purveyor for enterprise customers. To accomplish that goal, HP will port webOS to PCs and printers as well as to a growing family of tablets and smartphones. It also will license webOS to other vendors of tablets and smartphones — and perhaps to other vendors of PCs, too, presuming such demand materializes. Cisco doesn’t have an OS in the mobile race, so it doesn’t have those sorts of aspirations.

Multiple Devices, Bundling, and Services

Another difference is that HP actually knows how to make money selling client devices with more than a modicum of consumer appeal. That’s still uncharted territory for Cisco. In a period in which “consumerization of IT” is much more than a buzz phrase, it helps that HP has some consumer chops, just as it hurts that Cisco does not. Presuming that HP can generate demand from end users — maybe that’s why it is using the decidedly non-corporate Russell Brand as its TouchPad pitchman — it can then use bundling of webOS-based tablets, smartphones, printers, and PCs to captivate enterprise IT departments.

To top it all off, HP can wrap up the whole package with extensive consulting and integration services.

I’m not saying HP is destined for greatness in the tablet derby — the company will have to persevere and work hard to address perceived weaknesses and to amass application support from the developer community — but I’d wager that HP is better constituted than Cisco to stay the course.

Pondering Intel’s Grand Design for McAfee

Befuddlement and buzz jointly greeted Intel’s announcement today regarding its pending acquisition of security-software vendor McAfee for $7.68 billion in cash.

Intel was not among the vendors I expected to take an acquisitive run at McAfee. It appears I was not alone in that line of thinking, because the widespread reaction to the news today involved equal measures of incredulity and confusion. That was partly because Intel was McAfee’s buyer, of course, but also because Intel had agreed to pay such a rich premium, $48 per McAfee share, 60 percent above McAfee’s closing price of $29.93 on Wednesday.

What was Intel Thinking?

That Intel paid such a price tells us a couple things. First, that Intel really felt it had to make this acquisition; and, second, that Intel probably had competition for the deal. Who that competition might have been is anybody’s guess, but check my earlier posts on potential McAfee acquirers for a list of suspects.

One question that came to many observers’ minds today was a simple one: What the hell was Intel thinking? Put another way, just what does Intel hope to derive from ownership of McAfee that it couldn’t have gotten from a less-expensive partnership with the company?

Many attempting to answer this question have pointed to smartphones and other mobile devices, such as slates and tablets, as the true motivations for Intel’s purchase of McAfee. There’s a certain logic to that line of thinking, to the idea that Intel would want to embed as much of McAfee’s security software as possible into chips that it heretofore has had a difficult time selling to mobile-device vendors, who instead have gravitated to designs from ARM.

Embedded M2M Applications

In the big picture, that’s part of Intel’s plan, no doubt. But I also think other motivations were at play.  An important market for Intel, for instance, is the machine-to-machine (M2M) space.

That M2M space is where nearly everything that can be assigned an IP address and managed or monitored remotely — from devices attached to the smart grid (smart meters, hardened switches in substations, power-distribution gear) to medical equipment, to building-control systems, to televisions and set-top boxes  — is being connected to a communications network. As Intel’s customers sell systems into those markets, downstream buyers have expressed concerns about potential security vulnerabilities. Intel could help its embedded-systems customers ship more units and generate more revenue for Intel by assuaging the security fears of downstream buyers.

Still, that roadmap, if it exists, will take years to reach fruition. In the meantime, Intel will be left with slideware and a necessarily loose coupling of its microprocessors with McAfee’s security software. As Nathan Brookwood, principal analyst at Insight 64, suggested, Intel could start off by designing its hardware to work better with McAfee software, but it’s likely to take a few years, and new processor product cycles, for McAfee technology to get fully baked into Intel’s chips.

Will Take Time

So, for a while, Intel won’t be able to fully realize the value of McAfee as an asset. What’s more, there are parts of McAfee that probably don’t fit into Intel’s chip-centric view of the world. I’m not sure, for example, what this transaction portends for McAfee’s line of Internet-security products obtained through its acquisition of Secure Computing. Given that McAfee will find its new home inside Intel’s Software and Service division, as Richard Stiennon notes, the prospects for the Secure Computing product line aren’t bright.

I know Intel wouldn’t do this deal just because it flipped a coin or lost a bet, but Intel has a spotty track record, at best, when it comes to M&A activity. Media observers sometimes assume that technology executives are like masters of the universe, omniscient beings with superior intellects and brilliant strategic designs. That’s rarely true, though. Usually, they’re just better-paid, reasonably intelligent human beings, doing their best, with limited information and through hazy visibility, to make the right business decisions. They make mistakes, sometimes big ones.

M&A Road Full of Potholes

Don’t take it from me; consult the business-school professors. A Wharton course on mergers and acquisitions spotlights this quote from Robert W. Holthausen, Nomura Securities Company Professor, Professor of Accounting and Finance and Management:

“Various studies have shown that mergers have failure rates of more than 50 percent. One recent study found that 83 percent of all mergers fail to create value and half actually destroy value. This is an abysmal record. What is particularly amazing is that in polling the boards of the companies involved in those same mergers, over 80 percent of the board members thought their acquisitions had created value.”

I suppose what I’m trying to say is that just because Intel thinks it has a plan for McAfee, that doesn’t mean the plan is a good one or, even presuming it is a good plan, that it will be executed successfully. There are many potholes and unwanted detours along M&A road.

Dell and HP Face Direction, Leadership Questions

Quarterly earnings results are on tap later today from Dell and HP. While the two companies would never be confused for twins, they have much in common. Not only do they sell similar products into similar markets, serving similar types of customers in the process, but both are bedeviled by serious questions about direction and leadership.

At HP, of course, the strange circumstances surrounding the sudden departure of former CEO Mark Hurd continue to generate more questions than answers. The details and machinations behind Hurd’s ouster might never be known. That presents a problem for a public company, because shareholders don’t usually like the firms in which they invest to be enveloped in a fog of murk, mystery, and intrigue.

For HP, the game of Clue will have to end. Whatever Mr. Hurd might have done, and how and where he might have done it, will have to take a decisive back seat to issues pertaining to HP’s direction, focus, strategies, and tactics. Investors and market watchers will be looking for clear indications tonight, when the company conducts its conference call with analysts, that HP has a firm hand on the tiller and is heading in the right direction. Given what’s transpired in the last couple weeks, HP will have to place particular emphasis on candor and clarity in its communications this evening. The substance of the message is always important, but tone now is critical for HP, too.

Nobody is Indispensable

My own view is that CEOs are like quarterbacks on football teams. They tend to get too much credit for corporate successes and too much blame for setbacks. Honest CEOs who’ve enjoyed success will tell you that they’ve been surrounded by excellent teams that contribute to the plans and bring the execution to fruition. The business media, though, likes to personalize and simplify, so it tends to focus on the CEO as the apotheosis of corporate culture. That’s not really how or why technology companies are successful, but it makes for good copy. The truth is, nobody is indispensable.

On Cisco earnings calls recently, I’ve noticed that CEO John Chambers has been giving prominent credit to his bench strength, noting the contributions of specific executives who run various parts of the company.

As much as it might pain it to do so, HP should follow Cisco’s lead and correct the ridiculous media misconception that the company’s wheels will fall off now that Mark Hurd isn’t sitting at the front of the bus. Considering that too much Mark Hurd might well have been a bad prescription for what had begun to ail HP, I think HP should be confident in showing that it knows how to correct its course.

No More Frankentablets

As for Dell, leadership issues also are on the front burner. About a quarter of votes cast at Dell’s recent shareholder meeting withheld support for Michael Dell’s re-election to the company’s board. The company’s shares are down a staggering 50 percent from where they stood in August 2008, and Dell recently paid $100 million to settle a government probe into questionable accounting practices.

Alleged accounting shenanigans aside, I think the primary problem for Dell is one of focus. It tries to have the breadth of an HP, but it doesn’t have the resources to pull it off. Like a lot of other market watchers, I’d like to see Dell show more solution focus and market discipline.

The company is going nowhere in consumer markets. The Streak, for example, looks like something hatched by a committee that couldn’t decide whether it wanted to devise a smartphone or a tablet. Consequently, it wound up producing a Frankentablet with a five-inch display.  I have a feeling it came as much from Dell’s ODM partners as from its own design labs.

I know it won’t happen on this call — certainly not with the current composition of Dell’s board of directors — but I’d like to see the company recognize, for once and for all, that it’s out of its depth in the consumer space. I’d like to see it turn its attention, focus, and resources to the SMB and enterprise markets, and to further enhancing its evolving virtualization and cloud strategies.

It won’t happen, but it should.

RealD’s 3D Promise and Peril

I should have an opinion on RealD’s IPO today. Fortunately, I do have one, and I will share it with you now.

If 3D goes big, RealD will scale right along with it. The company is the leading purveyor of 3D projection systems for digital cinemas. By its own estimates, it owns more than half of that market, holding off competitors such as Dolby Laboratories, Inc., IMAX Corporation, MasterImage 3D, and X6D Limited.

It’s interesting to see Dolby among RealD’s primary competitors. In many respects, RealD is emulating the approach Dolby used to dominate the sound market in cinemas worldwide. RealD has read Dolby’s playbook, and heretofore it’s done better applying it to 3D cinema than Dolby has.

You can peruse RealD’s prospectus yourself, but here’s an excerpt to whet your appetite:

As of December 25, 2009, there were approximately 16,000 theater screens using digital cinema projectors out of approximately 149,000 total theater screens worldwide, of which 4,286 were RealD-enabled (increasing to 5,966 RealD-enabled screens as of June 1, 2010). In 2009, motion picture exhibitors installed approximately 7,500 digital cinema projectors, an approximately 86% growth rate from 2008, and in 2008, motion picture exhibitors installed approximately 2,300 digital cinema projectors, an approximately 36% growth rate from 2007. Digital Cinema Implementation Partners, or DCIP, recently completed its financing that is providing funding for the digital conversion of up to approximately 14,000 additional domestic theater screens operated by our licensees AMC, Cinemark and Regal. We believe the increasing number of theater screens to be financed by DCIP provides us with a significant opportunity to deploy additional RealD Cinema Systems and further our penetration of the domestic market.

The salient point is that the addressable market is large, the overall penetration rate for 3D projection systems is relatively low, and the market is nascent. Moreover, this is a worldwide opportunity, not one restricted to the North American marketplace.
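The prospectus figures quoted above make that point concrete; a quick back-of-the-envelope calculation in Python, using only the numbers from RealD’s filing, shows how early the market is:

```python
# Back-of-the-envelope penetration figures, taken from the prospectus
# numbers quoted above (as of December 25, 2009).
total_screens = 149_000    # all theater screens worldwide
digital_screens = 16_000   # screens with digital cinema projectors
reald_screens = 4_286      # RealD-enabled screens

digital_share = digital_screens / total_screens           # ~10.7%
reald_share_of_digital = reald_screens / digital_screens  # ~26.8%
reald_share_of_all = reald_screens / total_screens        # ~2.9%

print(f"Digital projection: {digital_share:.1%} of all screens")
print(f"RealD share of digital screens: {reald_share_of_digital:.1%}")
print(f"RealD share of all screens: {reald_share_of_all:.1%}")
```

In other words, roughly one digital screen in four was RealD-enabled, but digital projection itself had reached barely a tenth of the world’s screens.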

That’s a good thing, too, though RealD — like everyone else with valuable intellectual property — is concerned about the fate that might befall it in China. Among noted risk factors in the company’s prospectus, we find the following:

Our business is dependent upon our patents, trademarks, trade secrets, copyrights and other intellectual property rights. Effective intellectual property rights protection, however, may not be available under the laws of every country in which we and our licensees operate, such as China.

Even though that’s a legitimate concern, it isn’t RealD’s biggest worry. The real worries, in my view, are industry dynamics (namely, 3D’s spread from cinemas to consumer electronics such as televisions, PCs, cell phones, and game consoles), the quantity and quality of 3D entertainment fare (also known as content), and the ability of the industry ecosystem and consumers to foot the 3D bill.

3D has proven marketable in cinemas, but now it is trying to expand its empire into consumer electronics. That’s an opportunity and a threat for RealD, which obviously wants to extend its hegemony beyond the three-dimensional silver screen.

RealD will have to rejig its business model and its technologies to capture consumer-electronics markets. It will have to enter into new relationships, build or buy new products and capabilities, and market and sell its wares differently. And that’s presuming that 3D makes a successful commercial leap into living rooms, mobile devices, and other display-bearing devices. Much remains to be done on that front.

Then we come to the content issue. You might have noticed that not all 3D films have the box-office wallop of Avatar. Movie exhibitors like the premium they charge consumers for watching 3D movies (though they are less enamored of the added cost of 3D projection systems), but the willingness of the masses to pay more per view is contingent on cinemas offering them experiences they deem worthy of the 3D surcharge.

I’ve scanned the lineup of 3D films slated to hit theaters over the next several months. I am noticing — how shall I say? — the pungent whiff of ripe schlock arresting my olfactory senses, even though, incredibly, RealD has not entered the “Smell-O-Rama” business yet.

Sadly, a lot of cheesy horror movies are queued up for the 3D treatment. That’s not good. I’m of the aesthetic view that ostentatious protrusive effects, used to goose the shock value of severed heads and buzzing saws, aren’t the best utilization of 3D technology. I like the immersive depth 3D can bring to quality entertainment and live sports, but I’m not sold on the viability of cheap gimmicks, or of 3D as ornamental gossamer for bad content. Look, a crap movie is a crap movie. A 3D turd is still a turd.

And a proliferation of 3D turds will not do the 3D industry any good. Does anybody in Hollywood remember the 1950s . . . or perhaps read history?

Anyway, presuming that 3D is used judiciously, that it is applied to good movies rather than as a decorative wrapper for bad ones, RealD still will have to contend with the nasty array of macroeconomic uncertainties that beset us all.

There’s considerable risk in RealD as an investment vehicle, and there’s also a commensurate measure of promise. Today, on its first day of trading, RealD’s stock was snapped up eagerly by investors who see more promise than peril. The shares rose sharply from the open, and the company was able to price its offering well above expectations.

That’s an important consideration, by the way. Earlier in this post, I mentioned that RealD intends to take its 3D technology to consumer electronics. As part of that foray, the company is also looking at developing autostereoscopic (3D without glasses) technologies to eventually supersede its stereoscopic (3D with glasses) technology.

All things considered, I don’t think the glasses are going to cut it for casual television viewing in living rooms; nor do I think anybody but the geekiest of geeks will want to be wearing 3D glasses for extended periods while using a mobile device or playing a game console. The company that does autostereoscopic 3D right stands to reap massive rewards. RealD wants to be that company, but it’s not alone — Sony, Samsung, Dolby, 3M, Nintendo, and many others are in the mix, and their advances are closely monitored by HP, Dell, Apple, IBM, Cisco, and other major players.

RealD needs a war chest to fight that battle. Today’s IPO delivers it, as the company makes clear:

We will continue to develop proprietary 3D technologies to enhance the 3D viewing experience and create additional revenue opportunities. Our patented technologies enable 3D viewing in theaters, the home and elsewhere, including technologies that can allow 3D content to be viewed without eyewear. We will also selectively pursue technology acquisitions to expand and enhance our intellectual property portfolio in areas that complement our existing and new market opportunities and to supplement our internal research and development efforts.

Today’s IPO will help RealD pursue its strategic plan. Numerous external factors, however, are beyond its direct control.