Showing posts with label standards. Show all posts

Tuesday, December 30, 2014

Web standards exist for a reason

Back at the end of the browser wars — i.e. the late 20th century — it looked like Microsoft had won and Netscape had lost. A number of Windows-centric shops designed their websites for Internet Explorer, either in terms of full functionality (“works best with Internet Explorer”) or actual access (“requires Internet Explorer”). Microsoft encouraged this by promulgating APIs for Visual Basic, .Net, DirectX and the like.

Fast forward to today. Over the past five years, Microsoft’s desktop market share has been in a freefall. Statcounter — the widely cited arbiter of browser usage — chronicles how Google Chrome has come from nowhere to take share from IE and (to a lesser degree) Firefox (heir to Netscape’s customers and developers). At 55% in January 2010, the IE share is now under 22%:


When you include all platforms — tablets, mobile phones and consoles — the news for Microsoft is even worse — with an IE share of 13.5%:


As a Mac owner, I found this particularly galling, since Microsoft offered a Mac version of IE (as one MS employee pointed out to me) only as long as it served its purposes during the browser wars. MS discontinued IE for OS X in 2003. Fortunately, with IE now a small fraction of the web audience, it no longer matters — except at one site crucial for business professors, as I discovered today while working on a paper.

The Virtue of Bad Design
One of the more popular proprietary business databases is called Thomson One, from Thomson Corporation (later Thomson Reuters). For entrepreneurship scholars (like me), the most relevant content is VentureXpert, a database of investments by VCs, angel networks, corporate VC and other private equity investments. This data is used by PWC and its partners to announce their quarterly VC funding stats at the PWC MoneyTree site.

Unfortunately, Thomson One is compatible only with Internet Explorer. Worse yet, it is not supported by (and doesn’t fully work in) any version of IE newer than IE 8 (as documented by the IT support desks at Wharton, Harvard, Columbia, and other schools).

Internet Explorer 8 was introduced in 2009 and last updated in February 2011 (almost four years ago), just before IE 9 was released in March 2011. IE 8 is not compatible with Microsoft’s current desktops, laptops, tablets or mobile phones, which require Internet Explorer 10 or 11. StatCounter estimates the November 2014 market share of IE 6+7+8 at 4.03% of the desktop market.

For Windows users, there is an IE Tab plug-in that helps Chrome and Firefox imitate IE, but not all the Thomson One features are available in this emulation mode.

Customers Lose, and (So Far) Thomson Still Wins
So to recap, here is where we are:
  • The virtue of the web (particularly HTML 4+) is interoperability between browsers.
  • One or more IT architects at Thomson Corp. decided years ago to lock their database to specific features of one browser, rather than support Internet standards.
  • Those features are so non-standard that they are not supported by Microsoft browsers released since March 2011.
  • The company has done nothing to upgrade its site to support the 96% of the world that uses other browsers.
I'd like to think that whoever made this architecture design error was fired for his (it was most likely a he) mistake, but that would assume a level of IT competence that the legacy team of Thomson Corp has not yet demonstrated. (Meanwhile, other Thomson Reuters sites seem to work with a wider range of IE versions and in some cases even have a mobile client).

One thing that is clear is that Thomson Reuters is pretty confident of their monopoly position in this particular niche: if not, their customers would be defecting in droves, and fixing this broken IT infrastructure would finally become a priority. I’m not holding my breath (on either competence or customer orientation suddenly breaking out).

Saturday, June 29, 2013

Collision of mobile business models

It’s no secret that automakers are building fancy navigation and entertainment systems into their cars. The LA Times this morning has a great article about how they’re not doing so well in competition with cell phone makers, who have products that are better, faster and cheaper.

At least in the US, the automakers want to control the customer, selling them expensive add-on systems; the most successful recently has been the Ford Sync, and before that the GM OnStar. But today the consumers who might buy such systems all have smartphones that offer most of the same functions (and more). They also run headlong into some of the freemium Internet business models (Exhibit A: Google Inc.) that give away stuff that automakers want to sell.

Here are a couple of great passages from the article by Jerry Hirsch:

[C]ar companies are spending millions of dollars developing interfaces, voice recognition software and navigation systems. Many of these functions either already come loaded on phones or can be downloaded at the swipe of a finger. Honda Motor Co., for instance, charges $2,000 for a satellite-linked navigation and traffic system on the premium version of its popular Accord sedan. But Waze, a division of Google Inc., provides the same functionality in a free app.

"People today bond more with their smartphones than their car," said Tom Mutchler, the senior engineer at the Consumer Reports Auto Test Center. "Car companies are going to have to live up to the expectations that come with that."
The whole article is recommended and doesn’t seem to be behind a paywall.

The article points out two problems the automakers face. First, in a race of innovative software between Ford and Apple, or Honda and Google, who do you think is going to win?

Second, you keep a car for 10 years and a phone for 2. So for the average consumer, which one is going to provide the better experience?

But the third problem Hirsch misses is that (as any strategy professor will tell you) Apple and Google and Samsung have economies of scale, and the car makers don’t. 700 million smartphones were sold globally in 2012, most using one of two platforms. In the US, 16 million cars were sold, with in-dash entertainment systems fragmented among a dozen makers.

As I teach my students, R&D is a fixed cost amortized as (total R&D) ÷ (number of units). Apple sold 137 million iPhones in 2012 (not counting iPads and iPod Touches using iOS). Assuming GM or Ford gets a 17% share and half of those buyers take the fancy infotainment system, about 1.4 million Americans are buying each car-based platform. (Toyota, Fiat/Chrysler and Honda sell even fewer.) Given that’s two orders of magnitude fewer than the #2 smartphone platform, no wonder carmakers have to charge $2,000 for their systems.
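The scale gap is easy to see with back-of-the-envelope arithmetic. In this sketch, the unit counts come from the figures above, but the R&D budget is a purely hypothetical placeholder (no actual R&D figure appears in the article):

```python
# Amortized R&D burden = (total R&D) / (number of units sold).
def rd_per_unit(total_rd: float, units: int) -> float:
    """R&D cost spread across each unit sold."""
    return total_rd / units

RD_BUDGET = 500_000_000  # hypothetical $500M platform R&D spend (illustration only)

iphone_units = 137_000_000      # iPhones Apple sold in 2012 (from the text)
car_platform_units = 1_400_000  # buyers of one carmaker's system (from the text)

print(rd_per_unit(RD_BUDGET, iphone_units))        # ≈ $3.65 per phone
print(rd_per_unit(RD_BUDGET, car_platform_units))  # ≈ $357 per car
```

Whatever budget you plug in, the per-unit burden on the carmaker is roughly 100 times that on the smartphone maker, which is the whole point about economies of scale.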

It’s clear carmakers are going to lose this fight. Obviously, if you can’t beat ’em, join ’em — which is what Honda, Toyota and Hyundai appear to be doing. The article refers to the Car Connectivity Consortium producing the MirrorLink standard, which includes these three carmakers, as well as Samsung and HTC. In addition to these car companies, the CCC website also lists Daimler, GM and VW as charter members, with Ford and Subaru (“Fuji Heavy”) as non-voting “adopter” members (BMW, Fiat and Mazda have limited voting rights). Nissan and Kia are nowhere to be found.

Volkswagen was early in partnering with Apple, so they may continue to lead on iPhone connectivity. Android compatibility could be a good foot in the door for the 3 Asian carmakers. But from the article and their market actions, it seems like some automakers (led by Ford) are stuck in the slow lane trying to sell overpriced, soon-to-be-obsolete dashboard systems as though they can dictate what options American auto buyers will use on the road.

Wednesday, June 5, 2013

Apple's loss is a loss for consumers and the industry

In a shocking decision, the International Trade Commission voted to ban imports of old iPhones and iPads for infringing five claims of a Samsung W-CDMA patent. The ban would affect the AT&T models of the iPhone 4, 3GS and 3G, as well as the iPad 3G and iPad 2 3G.

That Samsung sued (in retaliation for Apple’s earlier win) or won an infringement judgement is not what’s surprising. The surprise is that the ITC granted what amounts to injunctive relief for infringement of a standards-essential patent.

In telecom, standards-essential patents are different from any other type of patent. These are patents where a company (usually a handset or chip vendor) tells the standards setting organization (SSO) that they believe their patent is essential for implementing the standard.

Companies try to accumulate lots of these patents to force companies into cross-licensing (or royalty-bearing) agreements. Sometimes these patents are of dubious quality, as Rudi Bekkers & I showed in our 2009 study of W-CDMA patents.

However, in exchange for declaring a patent “essential,” the patent-holder promises to license it to all comers. As ETSI, the main GSM (and 3G W-CDMA) standardization body, notes, the two declarations are inseparable:

declare your essential IPRs and to tell ETSI about your preparedness to grant irrevocable licenses on fair, reasonable and non-discriminatory [FRAND] terms and conditions pursuant to Clause 6 of the ETSI IPR Policy
That policy reads:
6.1 When an ESSENTIAL IPR relating to a particular STANDARD or TECHNICAL SPECIFICATION is brought to the attention of ETSI, the Director-General of ETSI shall immediately request the owner to give within three months an irrevocable undertaking in writing that it is prepared to grant irrevocable licences on fair, reasonable and non-discriminatory terms and conditions under such IPR to at least the following extent:
  • MANUFACTURE, including the right to make or have made customized components and sub-systems to the licensee's own design for use in MANUFACTURE;
  • sell, lease, or otherwise dispose of EQUIPMENT so MANUFACTURED;
  • repair, use, or operate EQUIPMENT; and
  • use METHODS.
The above undertaking may be made subject to the condition that those who seek licences agree to reciprocate.
So for these patents, the only questions are a) whether or not a royalty is due and b) how much the price is. Once patent infringement is determined by a court, it’s a question of damages and not an injunction.

For this very reason, Samsung faces sanctions in Europe for similar efforts. The definitive site for telecom patent war news, FOSS Patents, reported in December 2012:
European Commission Vice President Joaquín Almunia already indicated that the adoption of a Statement of Objections (SO) against Samsung over its pursuit of injunctions against Apple based on standard-essential patents (SEPs) was imminent.

At close of business today [21 Dec] the Commission issued a press release announcing that Samsung has been served an SO, which is a preliminary ruling. This means we're past the stage of Samsung merely being suspected of abuse of a dominant market position (this theory is not based on Samsung's smartphone market share but on the leverage that SEPs give their owners), but that the Commission has preliminarily determined, after almost a year of formal investigations (which followed several months of preliminary ones), that Samsung has indeed committed abuse and should be sanctioned.
As blog author Florian Mueller has noted, Google has faced similar criticism for asserting the Motorola SEP that it bought.

The industry has noticed this case and the damaging impact of Samsung’s (apparently successful) legal arguments. Various industry groups filed briefs against an exclusion order. As Matt Rizzolo blogged on April 9:
We noted that several other parties also submitted responses, offering their views on how an exclusion order in this case might affect the public interest. These parties include:
Each of these parties warns the ITC that allowing exclusion orders for FRAND-pledged standard-essential patents may have adverse effects on U.S. consumers and the U.S. economy, particularly future standards-setting activity.
This morning, Mueller noted that price is at the heart of the dispute:
Some will say Apple should have taken a license, but Samsung's initial 2.4% demand was far outside the FRAND ballpark (as a Dutch court said in a ruling), and it's not known what Samsung has demanded more recently (other than that Apple still considers it excessive).
And, as Rizzolo noted, the Cisco/HP/Micro filing proposed a mechanism for independently establishing a “reasonable” royalty.

The largest cellphone patent holder, Qualcomm, had earlier filed an opposition at ITC to Apple’s proposed interpretation of FRAND, but hastily withdrew the criticism of its major customer.

It’s not clear what the next step is. The ITC will seize the older (lower priced) models in 60 days, unless the president or a Federal court blocks that decision. Congress might reform the law, but given they can’t even resolve simple budgetary issues, they’re not going to pass a major piece of patent reform in two months.

Wednesday, December 30, 2009

Productive publishing period

[HornTooting]
In terms of quantity, 2009 was my most productive year ever for academic publishing. After publishing three journal articles in 2008, in 2009 I published five: one about telecommunications, one about standards, two about telecommunications standards and one about open innovation.

The first two articles were based on a four-year collaboration with my now-friend Rudi Bekkers, looking at patents in W-CDMA. One paper focused more on the case study of standardization, while the second paper looked at the quality and timing of essential patents as laid against the standardization process. The latter paper was published in Telecommunications Policy, the leading academic journal on, well, telecommunications policy.

Building on Rudi’s earlier pathbreaking papers on GSM patenting, we noted several shifts from the 2G to 3G era in European mobile phone standards. The number of essential patents increased eightfold and the number of claimants increased threefold. Equipment makers retained about the same proportion of overall patents, but the network operators virtually disappeared, replaced by component suppliers (notably Qualcomm) and technology licensing firms (notably InterDigital).

These were our conclusions:

The sources of UMTS patent proliferation have often been ascribed to IPR-focused companies outside the ETSI process, particularly Qualcomm and InterDigital. However, this study shows that the largest numbers of patents are held by two firms (Nokia and Ericsson) centrally involved in the UMTS standardization, and the timing of their patenting suggests that they used their knowledge of the standard’s development for anticipatory patenting—further contributing to patent proliferation.

Still, a cozy oligopoly of four main UMTS patent holders might have produced a manageable IPR regime comparable to that of the five major holders of GSM patents. However, the number of firms claiming at least one patent has grown threefold, increasing the risk of holdup, transaction costs and royalty stacking for firms implementing the newer standard. This uncertainty is magnified by the self-determination of essentiality: while it is virtually impossible to determine how many of the 1227 patents are actually necessary to implement UMTS, at the same time other parties may fail to provide an itemized list of essential patents.
This was not even the longest collaboration among the papers. One paper was based on an eight-year collaboration with my friend Scott Gallagher, which began when we met during the bubble-era conference of the Strategic Management Society (2001) in San Francisco.

Then as now, the goal was to re-examine, critique and extend the traditional view of positive feedback in the adoption of standardized goods. We brought together a number of observations that (when we started) were somewhat novel, although the field has not stood still during that period. After many delays (including other projects, work and life), the paper was published in September in the Journal of Engineering and Technology Management, a respectable journal that has attracted papers from some of the top names in the field.

I’ve already mentioned on my other blogs the two other papers published in 2009.

In April, I published the cover article in the Journal of San Diego History, based on my research into the origins of the San Diego telecom industry. The paper was entitled “Before Qualcomm” to make it more relevant to the general readership, and traced the early round of spinoffs of Linkabit, the region’s seminal company. It also included discussions of the role of Qualcomm co-founders Andy Viterbi and Irwin Jacobs in applying Claude Shannon’s information theory to space communications, drawing on my 2008 article in the Journal of Management Studies.

The fifth paper is the first in what I hope will be a series of papers that contrast open innovation with user innovation and related theories. Published in the Washington University Journal of Law and Policy, it (not surprisingly) focuses on policy issues related to open, user (and cumulative) innovation.

I’ll be glad to send out a PDF of the published version of any paper to anyone who’s interested.

This morning I got email notice of acceptance of my first paper for 2010, a paper on the success of the iPhone that will be published by Telecommunications Policy. Michael Mace and I have been working on this paper since 2007 — actually before the iPhone shipped — although our understanding of the phenomenon has shifted significantly since then. Additional details as they become available.

[/HornTooting]

References

Rudi Bekkers and Joel West, “Standards, Patents and Mobile Phones: Lessons from ETSI’s Handling of UMTS,” International Journal of IT Standards & Standardization Research, 7, 1 (January 2009), 13-34.

Rudi Bekkers and Joel West, “The Limits to IPR Standardization Policies as Evidenced by Strategic Patenting in UMTS,” Telecommunications Policy, 33, 1-2 (Feb.-March 2009): 80-97. DOI: 10.1016/j.telpol.2008.11.003

Scott Gallagher and Joel West, “Reconceptualizing and expanding the positive feedback network effects model: A case study,” Journal of Engineering and Technology Management 26, 3 (Sept. 2009): 131-147. DOI: 10.1016/j.jengtecman.2009.06.007

Joel West, “Before Qualcomm: Linkabit and the Origins of the San Diego Telecom Industry,” Journal of San Diego History, 55, 1-2 (Winter/Spring 2009): 1-20.

Joel West, “Policy Challenges of Open, Cumulative, and User Innovation,” Washington University Journal of Law & Policy 30 (2009): 17-41.

Tuesday, August 25, 2009

Reversing arbitrary path dependence

Some standards are arbitrary, with little if any technical justification. Certainly anyone who’s carried a pocket full of European AC plugs will attest to this.

The WSJ has an interesting article Monday on how Samoa is changing the handedness of its automobiles and roads. It’s giving up the convention of its neighbor American Samoa, which matches its “imperial” master (left-hand drive), to match that of its much larger neighbors, Australia and New Zealand, which follow their former imperial master (right-hand drive). Apparently the Americans (and the French) and their allies account for 70% of the world, while the Commonwealth countries (plus Japan) account for 30%.

In searching for cheap leftover cars from Aus/NZ (much as Mexico takes cheap leftover cars from the US, and Eastern Europe does so from the EU), Samoa is swimming against the tide. The WSJ lists 11 countries that switched between 1946 and 1974 — all from the losing standard (British) to the winner (Franco-American).

Tuesday, June 16, 2009

Pyrrhic victory

Although Sony won the Blu-ray standards war, the FT reports that it’s failing to help the bottom line:

Hollywood's expensive bet on Blu-ray technology has failed to prevent a sharp decline in home entertainment sales, dealing a blow to the film industry as it grapples with the global economic downturn.

Home entertainment revenues across the industry fell more than $2.6bn in 2008 as sales of standard DVDs tumbled, according to a new report from Screen Digest, a research company.
There are a variety of explanations for falling DVD sales. Some say piracy, some say rentals, some say downloads. I personally think the issue is the declining marginal utility of additional DVDs: in 1998 it was novel to own a random-access (semi) durable movie, but eventually everyone had video pacifiers for their kids and more movies than they ever watch.

Meanwhile, the lack of demand for Blu-ray has been noted for some time. Again, I wonder whether the marginal value of the increased quality (and slight manufacturing cost increase) justifies the 50% price premium.

Whatever the reason behind declining DVD sales or the failure of the next generation to catch on, this is a reminder that it’s impossible to accurately estimate a priori the value of winning a standards war. The Blu-ray circumstances may be unique, but as David Wood reminds us, it’s hard to predict the future.

Thursday, April 9, 2009

Innovation, standardization and commoditization

If Silicon Graphics ran Christmas...
Ornaments would be priced slightly higher, but would hang on the tree remarkably quickly. Also the colors of the ornaments would be prettier than most all the others. Options would be available for 'equalization' of color combinations on the tree. — Internet joke, Dec. 1998.


Merc columnist Mike Cassidy had a poignant column Tuesday (also available here) on the liquidation of SGI (born Silicon Graphics). One excerpt:

There was a can-do and why-not attitude all rolled into one. One of the first people I met at SGI was Joe DiNucci, the vice president of marketing who worked at the company for five years ending in 1997.

Yes, he remembers the good times.

"There really was an entrepreneurial spirit there," he says. "The dark side of it was that it was kind of frat boy, locker-roomish."

But that sort of zeal and bravado meant that everyone talked to everyone and good ideas were infectious. "You could turn the company," says DiNucci, a valley veteran. "If you had a great idea and you had balls, you could make something happen."

And plenty did, starting with developing a way to create moving 3-D images on a computer screen. It was a lightning bolt at the time, and it was SGI's franchise.
DiNucci recalls when SGI was on the cover of Business Week, giving additional credence to the Paul Krugman line: “Whom the Gods would destroy, they first put on the cover of Business Week.”

Sure enough, Peter Burrows of BW, linked to some of this early coverage in his own column last week on the death of SGI (complete with a faux cover):
There was a time when Silicon Graphics Corp. was the Apple Inc. of corporate computing. It received coverage out of all proportion to its size, certainly by BW. And for good reason: It involved larger-than-life characters such as Jim Clark, who went on to co-found Netscape. SGI was forever on the cutting edge of technology innovation, and pioneered use of powerful computing technology in the making of movies, game consoles and for early Web companies in the mid-1990s. And it was a lightning rod in the best sense, always a central player in the big debates roiling the computer industry (workstation vs minicomputer, Risc vs Cisc and UNIX vs Windows, come to mind).
I think once upon a time, I would have shared in this nostalgia, but not today. Perhaps it was because I just missed becoming a Unix workstation geek when I jumped to the Mac in 1986. But I think it’s because as a researcher (and middle-aged industry veteran), the death of SGI seems like the natural order of things.

The normal progression is that maturing technologies get standardized, and standards enable commoditization. Doing something unique lasts for a while, but unless you control a proprietary standard (possible but not likely), eventually there will be less control and more competition.

Competition engenders efficiency and price cuts for buyers, fueling adoption. The impact on innovation is mixed: competition can fuel innovation wars (as in cellphones) or it can squeeze margins and squeeze out R&D dollars (as in PCs).

So if some companies create cool innovations in the high-risk, high-growth period — but fail during commoditization — that’s a dog-bites-man story. A few companies such as Apple and IBM have re-invented themselves multiple times, but the cadre of once-great innovators seems much larger: Cray, DEC, Motorola, Sun. (Ironically, according to Wikipedia, SGI once owned Cray, which in its latest incarnation is still making top-ranked supercomputers — only now using commodity processors.)

I didn’t have this perspective as a 21-year-old software engineer or a 29-year-old entrepreneur — in part because I was young, and in part because (other than the BUNCH) the computer industry hadn’t seen a lot of casualties yet.

For my students, I try to sensitize them to this perspective: both the natural life-cycle of an industry and also the different sources of competitive advantage that firms need at different points in the life-cycle. As in previous years, I believe the best way to bring this realism to 20-something students is to assign Geoff Moore’s Dealing with Darwin, and I will be using it again in the fall.

Monday, February 9, 2009

Kindle: 2 no greater than 1

Amazon has announced the Kindle 2 (due Feb. 24), which features the sort of technical improvements you would expect from any consumer electronics device. It’s thinner and has a minimalist keyboard, but is still over $300. It’s still sending data traffic over Sprint’s underutilized EVDO network.

A few IP lawyers are in a huff because the device has the ability to create derivative works:

Some publishers and agents expressed concern over a new, experimental feature that reads text aloud with a computer-generated voice.

"They don't have the right to read a book out loud," said Paul Aiken, executive director of the Authors Guild. "That's an audio right, which is derivative under copyright law."
What was interesting is what Amazon didn’t announce.

They didn’t announce opening their content to book readers on other platforms, like the iPhone. Amazon still hopes to rule the world with its own proprietary format, as the NY Times reported:
“Our vision is every book, ever printed, in any language, all available in less than 60 seconds,” said Jeffrey P. Bezos, Amazon’s founder and chief executive.
although Amazon is passing on the manufacturing and distribution cost savings (of not killing trees) to the readers, over the objections of the publishers:
Amazon generally charges $9.99 for the digital versions of best sellers, although many publishers still sell the digital content to Amazon for the same price that they sell physical books. That means that for now, Amazon is taking a loss or making a small margin on the sale of some e-books.

“We do not agree with their pricing strategy,” said Carolyn K. Reidy, chief executive of Simon & Schuster. “I don’t believe that a new book by an author should ipso facto be less expensive electronically than it is in paper format.”

Mr. Bezos disagreed. “E-books should be cheaper than physical books. Readers are going to demand that, and they are right because there are so many supply chain efficiencies relative to printing a paper book,” he said.
Amazon also did not announce any sales figures, so everyone is using the speculation of 500,000 units. How do we know how big the market is or how much impact the reader had without sales figures?

Speaking of speculation, we do have speculation that the shortage was not due to Kindle’s contract manufacturer, but due to Amazon being overly cautious in ordering a key component. As the WSJ reported this morning:
The $359 Kindle, which allows people to read books in an electronic format, has been out of stock on Amazon's Web site since November, which meant it was unavailable over the crucial holiday shopping season. Now clues from the contract-manufacturing industry in China and Taiwan suggest the Seattle company may have been blindsided by demand for the book-size device and that it has since been ramping up production for the launch of its new Kindle.

The maker of the Kindle's special screens, Taiwanese manufacturer Prime View International, says the Kindle shortages came from Amazon's conservative sales forecast for the device. Prime View adds that Amazon is now trying to avoid repeating the current shortage by asking it to pump out more screens, which it is now doing in case orders increase suddenly.

"It wasn't about delivery delay," says a Prime View spokeswoman. "The sales were just faster than expected." The company says the new version of the Kindle is set to have a slightly bigger screen than the first-generation model.
So the data suggests that the Kindle is a modest success so far, and that the new model is slightly enhanced but is ignoring (or forestalling) the Innovator’s Dilemma.

As I recall, the iPod started out as a modest success, and (as they say) the rest was history. The Newton also started as a modest success, but never crossed the chasm to the mass market.

Thursday, October 2, 2008

Stifling interoperability and competition

Mike Madison has an interesting post on the lawsuit between the proprietary software package EndNote and its open source rival Zotero, from George Mason U, funded by the Mellon Foundation. Zotero is a Firefox plugin for organizing academic papers that I found less useful than a commercial alternative.

In its lawsuit, the owner of EndNote (Thomson Scientific, now Thomson Reuters) argues that its contract prohibits reverse engineering. Thus, it contends, Zotero’s developers violated contract law by using reverse engineering to achieve interoperability with EndNote.

The lawsuit seems prompted by a recent Zotero feature that enables importing EndNote journal styles, thus reducing the lock-in Thomson thought it owned. Since EndNote compatibility is mainly for one-time imports, it seems like Zotero could easily strip this functionality and have it distributed as a plug-in by an open source coalition located in .ru or some other scofflaw domain.

I’m no lawyer and haven’t kept up with the law, so I largely have to defer to Prof. Madison’s analysis of the case. Madison notes that under federal law, reverse engineering for compatibility is probably legal fair use, and thus Thomson filed in state court.

My first exposure to reverse engineering started in practice. Like many other software engineers making Apple-compatible software, I did a fair amount of reverse engineering of Apple’s code for just this purpose. Apple knew it was going on (or at least the tech support people did) but never sued anyone except the folks who tried to make unlicensed clone computers.

I also immersed myself in reverse engineering law for a night school term paper in an Asian studies class I took with Chalmers Johnson. This led in 1995 to my first published journal paper (and second† academic paper) about a Japanese policy controversy over how copyright law handled reverse engineering of software for compatibility. At the time, Sun (and some Japanese firms) wanted to reverse engineer IBM systems to provide interoperability and the IBM people wanted to discourage such reverse engineering. The identical issues played out with Linux and Microsoft in more recent years.

As I found in 1995, two major US appeals verdicts in 1992, Sega v. Accolade and Atari v. Nintendo, set a precedent under US law for allowing reverse engineering. After the 1992 precedents, two other important articles were published on reverse engineering and interoperability: one by Julie Cohen in the California Law Journal and another by (my recent friend) Charles McManis in the High Technology Law Journal.

Then as now, I think the precedent and the policy outcomes were reasonable, since copyright law provides protection only for the expression of an idea, and not against revealing trade secrets (which are compromised by public distribution, even of the object code). The cost of reverse engineering and re-implementation usually provides adequate value and protection for the author’s original expression, while allowing it to be used for compatibility enables competition.

So while I have not inspected the Thomson lawsuit, I am hoping that their competitor (who happens to be a university developing open source) wins the case on its merits. No software rents are perpetual: software gets commoditized over time, and if you can’t deal with it, you’re in the wrong business.

† The first was designing a programming language, predating by nearly a decade my decision to become an academic.

Updated 3:30pm: Oops. As anonymous Bruce points out, the lawsuit is about importing the EndNote styles, which are a set of templates defining the reference format of a wide range of publications. (I have never used EndNote, so I didn’t understand the distinction.) Since this is not conversion of user content but of EndNote’s content, it seems to me that Zotero is in a much more precarious legal position.

Tuesday, February 19, 2008

Toshiba surrenders

As speculated for nearly a year, and specifically rumored for the past week, Toshiba officially threw in the towel this morning, pulling the plug on HD DVD. The proximate cause was Warner’s decision in early January to abandon HD DVD to exclusively support Blu-ray.

The HD DVD lifetime sales figures were reported by Engadget from the Toshiba press conference:

600,000 players in the US and 300,000 Xbox 360 HD DVD drives. 100,000 units were sold in Europe. And about 10,000 players and 20,000 recorders in Japan. So about 1,030,000 units worldwide.

Toshiba’s president was also asked about lawsuits from angry orphans, but he (correctly) noted “there will always be a risk in buying.”

In addition to its coverage, Engadget also kindly offers a Top 10 list of things to do with your defunct player.

Sony — which lost the original Betamax battle — is officially the winner with Blu-ray. It appears its gamble of bundling Blu-ray with PS3 has paid off — at least for Blu-ray, if not for PS3. (BTW, the WSJ reports that the 6.3 million Blu-ray players sold worldwide include PS3 devices, but Wikipedia reports 10.5 million PS3s sold, all of which have Blu-ray.)

Today people are assuming that the format war was causing consumer confusion and hesitation, and thus sales will take off now that it’s over. But I can think of two other explanations for the slow uptake of Blu-ray. One is that it (the player and/or discs) is too expensive: the HD DVD price war last Christmas made prices more acceptable and spurred demand, but now that Sony has won, will it be so aggressive on pricing?

The other possibility is that the need for HD discs isn’t all that strong yet. We all have lots and lots of DVDs on our shelves, and buying a new player does nothing for those. Also, HD penetration is still (I suspect) less than 50% of households, so lots of people will see little or no benefit from an HD player.

Sony will need to aggressively price cut next Christmas to get players into people’s hands. Will that work, or will it need a final push from the March 2009 changeover to digital TVs?

Or will there still be skinflints like me who object to Hollywood charging $40 for the HD version of a $15 DVD? (It’s not like they have to pay their actors or writers by the pixel.)

Tuesday, August 21, 2007

HD DVD says “I’m not dead yet!”

While I was never a huge Monty Python fan, I still enjoyed the British comedy troupe in small doses. After all, who could forget the infamous exchange from Monty Python and the Holy Grail: “Bring out your dead” and the riposte: “I’m not dead!”†

This week my MBA students are picking term projects and several have suggested the next-generation video format war, which pits Blu-ray (Sony, Panasonic, Samsung) vs. HD DVD (Toshiba, Microsoft). As a standards researcher, there’s nothing I like better than a good standards fight, but since Blockbuster dumped HD DVD two months ago, it looked like things were tipping to Blu-ray and the battle was all but over.

Now HD DVD has proven “I’m not dead yet!” It shows that when you’re behind in a standards war, there’s always something you can do. In this case, Toshiba and/or Microsoft bribed two major studios to switch sides. Here’s the new lineup:

  • HD DVD: Universal, Paramount, Dreamworks Animation
  • Blu-ray: Sony, Disney, Fox
  • Both: Time Warner
The story is a little messy, because Dreamworks Animation is a public company still controlled by two of its billionaire founders (Katzenberg and Geffen), but in 2005 they sold Dreamworks Pictures to Paramount. Also, pictures directed by Steven Spielberg (the “S” in SKG) are not exclusive to either format. Still, the announcement means that November’s release of the popular Shrek the Third will be exclusive to HD DVD (plus plain old DVD).

Of course, when you’re behind in a standards war, you have to do something or you’ll get wiped out. The NY Times quoted anonymous executives of Viacom (parent of Paramount) as saying that the HD DVD group was paying $150 million for the switch. It’s not a huge risk, because the exclusive was promised only for 18 months (according to the NY Times), just long enough for two Christmas seasons.

Uncertainty is bad for Hollywood, which has long memories of VHS vs. Beta. However, the HD DVD camp is pushing down prices of players — at $150-200 less than Blu-ray — which is likely to fuel adoption. The stakes are more than just players. Sony is pushing Blu-ray in its PS3, while Microsoft’s Xbox 360 offers an add-on HD DVD drive. One estimate said 90% of the Blu-ray movies are being played on the PS3.

HD DVD buyers still have the legitimate fear of becoming (as economist Paul David put it in a 1987 paper) an “angry orphan.” Given this, consumers may be willing to risk $200 on a player but not more money on a large library of content — ideal for the rental market, except that Blockbuster is exclusively Blu-ray. Netflix is quietly supporting both HD DVD and Blu-ray, but it’s not like they’re promoting either one. Would winning over Hollywood Video be enough to matter? If I were Microsoft (or Toshiba), I’d offer a joint promotion with Hollywood Video to mail introductory coupons to affluent neighborhoods near any Hollywood store that carries HD DVD.

† For the past 30 years, I’ve heard it quoted as “I’m not dead yet!” but the “yet” is clearly missing from the original.


Monday, June 18, 2007

Sony 2, Toshiba 0

3rd in a series of insomniac blogging.

Monday brought the news that the leading US video rental chain, Blockbuster, is rolling out Blu-ray nationwide and dropping HD DVD. Engadget seems to think it’s game over in the next-generation DVD format war between Toshiba and Sony.

I’m more inclined to agree with Forbes that Blu-ray is pulling ahead. The interesting thing is that Blockbuster’s decision, made after a 250-store trial, reflects consumer demand rather than any power of Blockbuster (a shell of its former self) to sway markets.

Of course, Sony already won the important first round: five of the big six Hollywood studios back Blu-ray, with only Universal in the HD DVD camp. So it’s not surprising that consumers would want to rent Pixar or Paramount movies instead of limiting themselves to one studio. (This raises the question of why Toshiba launched HD DVD without broader backing.)

If Sony does win, this would also influence the Microsoft vs. Sony video game war, since Xbox 360 has HD DVD and the PS3 (natch) has Blu-ray. (Neither format will have any impact on Wii, which is what American consumers really want for a videogame console).

Competition is messy, but competition is good. There was some talk of merging the two standards (as happened with DVDs), but that likely would have doubled the patent royalties paid by every manufacturer of disc players.

The final round will come on the store shelves. Will Best Buy and Wal-Mart stop shelving HD DVD players? I remember how hard it was to find a Beta VCR back in 1990 after ours was stolen in a burglary. At that point, there were only two manufacturers left for what had long since become a declining market: Sony and Toshiba.


Tuesday, March 6, 2007

Open vs. Profitable Strategies

[Palm stock 2000-2007]
With my morning breakfast, I happened to read a column on Palm’s current woes in the Wall Street Journal. There was nothing dramatically new, but it provided a nice summary of how they got to where they are. And the stock price results were an eye-opener. Apple has been volatile, but Palm is only a slightly better investment than Iridium.

The article (subscription required) by Robert Cyran of BreakingViews emphasizes the failure of Palm to be proprietary enough. Normally we think of Palm as being a proprietary OS platform, but his view is that being proprietary in one layer is not enough:

How did this happen? In a nutshell, Palm failed to build competitive barriers around its devices, so consumers weren't locked into its products. The Palm Pilot became a dinosaur once cellphones could store contact details and other information. All the data stored on a Palm Pilot was easily transferred to other devices.
(It’s a short article, so under fair use I won’t quote more).

After recounting the fall in market cap from $92 billion to $2 billion, Cyran goes on to praise the Apple moat built using the iTunes Music Store, and Research in Motion (and its BlackBerry product) with its server (and associated services). This is consonant with the advice given by Berkeley’s Carl Shapiro and Hal Varian in Chapter 6 of Information Rules almost a decade ago.

It turns out that the Apple refugees at Palm copied Apple’s old playbook at a time when Apple had to learn a new playbook. Once upon a time, a proprietary platform was enough to extract profitable rents. Now, however, with the co-existence and interconnection of devices (via the Internet, Wi-Fi, etc.), the individual product technology may not matter. The good news (as I argued four years ago) is that a niche platform (like the Mac) can co-exist via open Internet and web standards.

The bad news is that buyers care less and less about the software technology that goes into a phone to make it work, any more than they care about the software technology that goes into a TV or DVD player. Despite new products like HDTV, all vendors release similar products in parallel, and the consumer electronics industry has long since been mired in brutal commoditization and price wars. Perhaps a few products can be truly innovative and unique (e.g. the iPhone), but most products are me-too. That would suggest opportunities for pooled R&D, either through a for-profit consortium (like Symbian) or open source done right (which is easier said than done).

Nowadays, it appears that enterprise infrastructure like servers and the associated services have much higher switching costs than client devices and appliances. In addition to RIM and Apple, there’s Microsoft BackOffice, and the granddaddy of them all, IBM Global Services. Of course, the closed interfaces for these servers have gotten Microsoft and Apple in trouble.

Update (4:45pm): Unstrung claims that interest in Palm is fueled by Jeff Hawkins’ mystery device, which may not be a cellphone, while PalmAddict says it will be demo’d in May. This would both be a Steve Jobs-style re-invention of Palm, and give Nokia a reason to get involved beyond the phone technology it doesn’t need.


Monday, March 5, 2007

Microsoft’s Anti-Competitive Motivations

With some time on my hands, I hope to catch up with blogging this week.

Microsoft’s antitrust problems are back with two items in the news: one concerns its failure to comply with the terms of a prior case; the other offers new evidence of Microsoft’s naked assertion of its market power.

Europe and the Windows APIs

Last week, European competition authorities said Microsoft is not complying with the terms of the 2004 ruling ordering it to release its Windows server APIs to competitors. You won’t find me a one-handed economist on this one, because I have seriously mixed feelings.

Microsoft is saying that it’s a simple patent case — it has intellectual property and its competitors aren’t willing to pay a fair price. Its general counsel says: “we believe that we have been fair and reasonable in setting the proposed protocol prices.”

On the one hand, Microsoft’s competitors complain that its terms for providing the technology are unrealistic, because Microsoft wants to charge royalties for use of the APIs, including royalties on open source software. As a result, no firm has licensed the APIs — clearly the EU edict is not having its desired effect, and the lack of takers suggests Microsoft is charging an unreasonable price.

On the other hand, should the government decide? Yes, if I were French or Swedish, I’d feel differently about aggressive government intervention in the free market. But for a capitalist system, government coercion should only be used to correct a “market failure”. Much as I dislike Microsoft’s predatory practices, it’s not clear this is one. Perhaps the competitors aren’t negotiating in good faith, but instead are waiting for the EU to beat Microsoft over the head again.

The editorial page of the Wall Street Journal feels no ambivalence. Today it weighed in (registration required):

Brussels no longer acts as merely prosecutor, judge, jury and executioner in antitrust cases; it now claims the power to assess the worth of patents, too. …

EU antitrust officials make no secret of their preference for the open-source business model practiced by the companies behind all the complaints against Microsoft, among them such heavyweights as IBM, Sun Microsystems and Oracle.
Microsoft vs. Apple

The second piece of news is a byproduct of the disclosure from Microsoft’s recently-settled antitrust case in Iowa. While the settlement was portrayed as a happy compromise, Microsoft got the plaintiffs to close the website with 3,000 documents discovered in the case, while the plaintiffs got to add to their millions in fees gained from suing Microsoft.

Apparently some of the interesting documents have been mirrored before they were pulled. One document (“Plaintiff’s Exhibit 6060”) related to Microsoft’s plans for Office 97 for Macintosh. The Mac press finds it interesting that the internal MS memo promises to ship lousy Mac software, but I think there was a more important revelation, consisting of an e-mail from the general manager of the Macintosh business unit to Bill Gates:
From: Ben Waldman
Sent: June 27, 1997 1:56 AM
To: Bill Gates
Cc: Jon DeVaan; Greg Maffei
Subject: RE: Moving forward with Mac Office 97

I am writing to argue for making a final decision to FINISH Mac Office 97, and detach this issue from the current Apple discussions.

… The threat to cancel Mac Office 97 is certainly the strongest bargaining point we have, as doing so will do a great deal of harm to Apple immediately. I also believe that Apple is taking this threat pretty seriously. …

Regardless of the outcome of these discussions, though, I believe we should ship Mac Office 97. Furthermore, I believe we need to decide this immediately … as we are not only close to shipping code externally, but need to finally start press and customer communications, especially with MacWorld a month away.
and the CEO’s reply:
From: Bill Gates
Sent: June 27, 1997 9:37 AM
To: Ben Waldman
Cc: Jon DeVaan; Greg Maffei
Subject: RE: Moving forward with Mac Office 97

I have 2 things I need to understand

Realistically, when do we think we would ship this product?
Can we avoid Apple knowing how far along we are for the next 30 days?
[Time cover]
Although Microsoft made $300 million a year (sales) off Mac Office, cancelling it threatened to destroy Apple’s ability to sell to the corporate (and university) market. The threat worked. Six weeks later, Gates had a deal which included Apple dropping Netscape (effectively killing it) in exchange for Internet Explorer, and Steve Jobs could announce that with Gates’ help, he had saved Apple. The announcement was so unreal (with echoes of the famous 1984 “Big Brother” ad) that it provided the conclusion for the 1999 dramatization (i.e. TV docudrama) about the two men.


Monday, January 29, 2007

WiFi trapped by its own success?

WiFi (aka 802.11a, b, g) has been a tremendous success. In fact, given its modest goals as a way to connect handheld computers in a warehouse, its widespread adoption in every laptop and an increasing number of PDAs and cell phones is remarkable.

If anything, it’s been too successful. Too successful, you say? Isn't that like being too rich or too thin?

The problem is that a large installed base creates an upward compatibility constraint that can be irresistible. Inertia for an existing standard is the cumulative effect of the number of customers times the individual switching costs (plus producer-related switching costs — in this case the base station and chip makers). As Brian Dipert of EDN reports, the committee took its time in standardizing, and meanwhile various greedy and impatient vendors shipped so many “draft 802.11n” products that no one would vote for a final standard that was incompatible with all the nonstandard product in the field.
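To make the inertia calculus concrete, here is a toy back-of-the-envelope sketch. All of the numbers below are hypothetical, purely for illustration — nothing in the post (or the standard) supplies actual figures:

```python
# Sketch of standards inertia as described above:
# inertia ≈ (installed base × per-customer switching cost) + producer switching costs.
# Every number here is made up for illustration only.

def standards_inertia(installed_base, switching_cost_per_user, producer_costs):
    """Rough total cost of moving an installed base to an incompatible standard."""
    return installed_base * switching_cost_per_user + producer_costs

# Hypothetical: 50 million draft-802.11n devices at $30 each to replace or
# upgrade, plus $200 million in sunk base-station and chipset redesign costs.
inertia = standards_inertia(50_000_000, 30, 200_000_000)
print(inertia)  # prints 1700000000
```

Even with these invented numbers, the point of the formula is visible: once the installed-base term dwarfs the producer term, no committee vote can afford to break compatibility.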

Meanwhile, George Ou has a provocative post where he argues that the 802.11n standardization committee wimped out, deciding to create something that's not really all that much better than 802.11g. As Ou tells it, the problem was that rather than spend a few extra bucks (initially) on a chip that also supported 5 GHz, they stuck with the crowded 2.4 GHz band. The existing 2.4 GHz spectrum supports only 3 (or 4) simultaneous channels, which are already crowded, so (my reading of it is) unless you’re on a deserted mountaintop you’ll never see the claimed 100 Mbps throughput.

If I were the Enhanced Wireless Consortium, when the final standard gets blessed I’d get the press some sample units to demonstrate actual performance.


Saturday, January 27, 2007

Qualcomm’s Stunning Court Loss

Friday a jury ruled in favor of Broadcom in the first of seven patent lawsuits between the two companies. Even if the H.264-related patent is not core to Qualcomm’s assertion of IPR on 3G mobile phones, it’s hard to see this as anything but a big negative for the company.

Any loss emboldens the various mobile phone vendors and component makers, none of whom want to pay Qualcomm patent royalties. I would expect more aggressive lawsuits by Qualcomm rivals and less willingness to sign (or renew) Qualcomm patent licenses.
