Wednesday, January 31, 2007

Building an Open “Cyberinfrastructure”

Monday and Tuesday I was at an NSF-sponsored conference at the National Academies of Science and Engineering. Because it’s in DC, it had a real mouthful of a name — “Designing Cyberinfrastructure for Collaboration and Innovation: Emerging Frameworks and Strategies for Enabling and Controlling Knowledge” — until the organizers realized it was a distraction and shortened it.

The title and the cyberinfrastructure jargon made it sound more intimidating than it was: although participants were expected to use the cyberinfrastructure buzzword (tied to an NSF program of the same name), the conference offered an interesting assortment of 12-minute talks about real problems of openness from a cross-section of views. Based on their printed bios, the 47 scheduled speakers came from academia (44%), government (US, UK and OECD; 26%), companies (17%) and nonprofits (13%).

Most of the talks related directly to IT, although a few focused on IP and innovation issues related to the sciences (mainly biopharma).

Major Themes
There were three major themes:

  • The Need for Cyberinfrastructure. Why do we need cyberinfrastructure? Having released the Atkins report in 2003 — named for the blue-ribbon panel led by Dan Atkins — the NSF was already sold on the idea. This part was mainly an introduction for the rest of us who haven’t been thinking about the problem.
  • How Infrastructure is Different. The dictionary defines infrastructure as shared facilities necessary for economic activity. Many of the speakers offered intriguing examples of how infrastructure enables research, innovation and economic activity.
  • Partitioning Between Public Good and Private Gain. My own talk fit in this category, as did the many talks about the IP system. It was clear that infrastructure is not the same as taxpayer-supported: the government pushed rural electrification and telephone deployment (or the Internet) even though facilities were owned by private firms.
As Yogi Berra would say, this was like déjà vu all over again. Organizer Brian Kahin noted that the issues of cyberinfrastructure today resemble those of 1994 and the plans for the NII (a term that Al Gore and other policy wonks were then using to refer to what became the Internet). Brian, then at Harvard’s Kennedy School, hosted a 1996 conference and published a 1997 book on global NII policy, which is where I first met him. The issues of creating a digital communications infrastructure are similar to those of a decade ago: architecture, implementation, funding and use.

What’s different is the scale of data transmission and storage required for things like patient trials, medical imaging and digital astronomy. Mark Ellisman of UCSD (founder of the Biomedical Informatics Research Network) talked about 5 petabytes (that’s about 5 million gigabytes) for a single 3D image of a rodent organ. Imagine what movies would require.
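
For readers unused to the units, a quick sanity check on that conversion (using decimal SI prefixes):

```python
# Unit sanity check: 1 petabyte = 1,000 terabytes = 1,000,000 gigabytes (decimal SI)
GB_PER_PB = 1000 ** 2

image_size_pb = 5                         # one 3D rodent-organ image, per Ellisman's talk
image_size_gb = image_size_pb * GB_PER_PB # about 5 million gigabytes, as stated above
print(image_size_gb)
```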

What Was Interesting
It’s hard to compress 16 hours of meetings into 750 words, which is why I have separate posts on IP (coming soon), the general question of “what is open,” and my own talk about open standards/open source/open innovation.

My favorite talks were (in no particular order)
  • Siobhán O’Mahony (Harvard) and Fiona Murray (MIT), with 2 interesting case studies of how private innovation is becoming more public and vice versa. They concluded that there’s a need to create boundary organizations if resources (technology, IP, other assets) are going to be shared between public and private interests.
  • Shane Greenstein (Northwestern), excerpted from his in-progress book on the history of the U.S. internet service provider (ISP) industry. He talked about how individual (often small) firms performed economic experiments, and how these economic (i.e. business) innovations spread and thus were cumulative across the industry.
  • Steve Jackson (Michigan): once an infrastructure is created, there is an “inside,” an “outside” and often “orphans.” Certainly the decisions made in creating the infrastructure define winners and losers.
  • Brett Frischmann (Loyola Chicago) talked about how infrastructure is a sharable generic input into a wide variety of output, and how the value of infrastructure is actually realized downstream by consumers of outputs.
  • Sara Boettiger (PIPRA) whose talk I mentioned earlier.
Carl Cargill (Sun) also spoke forcefully about how the standards system is broken, while others talked about the issue in less apocalyptic terms. Because standardization is a central issue of openness in the IT industry, I’ll summarize these arguments later.

The slides for some of the talks have been posted and more will be posted later in the week. The individual papers are likely to end up in an online journal in a few months.


Tuesday, January 30, 2007

Is Open a Process or an Outcome?

One of the themes of this week’s conference is defining “open.” The word “open” means different things to different people. Our host, institutional entrepreneur Brian Kahin, joked that not only should we put quotes around “open,” but perhaps we should increase/decrease the number of quotes to indicate our uncertainty over the term.

A key question for “““open”””-ness is whether it is a process or an outcome. As I noted in my own paper, this ties back to research on organizational behavior, which has found that workers perceive two types of “fair” outcomes: procedural justice (process fairness) and distributive justice (outcome fairness).

(Of course, open-ness could just be a hollow marketing slogan, as in the example I cited several years ago of the proprietary OpenVMS operating system. This was best captured by J.C. Herz, a videogame industry analyst (and DoD consultant), who delivered the best one-liner of the conference, likening “open” to a magic flavoring: “We will sprinkle a little ‘open’ on it and people will eat it.”)

Today, from the realm of biotech IPR came a dramatic example of these two types of openness. The occasion was the talk by Sara Boettiger of the Public Intellectual Property Resource for Agriculture (PIPRA). PIPRA is an interesting animal: a Rockefeller Foundation-funded nonprofit, based at UC Davis, created to help manage and disseminate the IP of university and public-sector agricultural researchers. There is an obvious public good in having taxpayer-supported agricultural research be used to improve crop yields in developing countries.

Boettiger used the hypothetical example of the perfect AIDS vaccine to illustrate the gap between what I call openness in process and openness in outcome. Suppose someone discovers the vaccine and puts it in the public domain. It would still require millions of dollars to produce and administer that vaccine to the more than 500 million residents of sub-Saharan Africa.

She then briefly summarized how this was handled with the polio vaccine, whose inventor, Jonas Salk, famously told Edward R. Murrow “There is no patent. Can you patent the sun?” Boettiger showed that it took a considerable amount of private money from the National Foundation for Infantile Paralysis (created by FDR, now known as the March of Dimes) to fund the research and manufacturing, as well as volunteer efforts (with schoolchildren as subjects) to conduct and analyze the trials.

Such nonprofit funding and volunteer effort are appropriate for something of such urgent and noble purpose. For more prosaic purposes, we have the volunteer efforts of creating Wikipedia. But of course the vast majority of innovation resources available in our economy are generated by firms reinvesting their profits for their own private gain. (It’s called capitalism, folks).

Ironically, thanks to the quasi-monopoly of the not-very-open Windows standard, there will be nonprofit money to distribute that AIDS vaccine when the time comes.


Monday, January 29, 2007

WiFi trapped by its own success?

WiFi (aka 802.11a, b, g) has been a tremendous success. In fact, given its modest goals as a way to connect handheld computers in a warehouse, its widespread adoption in every laptop and an increasing number of PDAs and cell phones is remarkable.

If anything, it’s been too successful. Too successful, you say? Isn't that like being too rich or too thin?

The problem is that a large installed base creates an upward compatibility constraint that can be irresistible. Inertia for an existing standard is the cumulative effect of the number of customers times the individual switching costs (plus producer-related switching costs — in this case for the base station and chip makers). As Brian Dipert of EDN reports, the committee took its time in standardizing, and meanwhile various greedy and impatient vendors shipped so many “draft 802.11n” products that no one would vote for a final standard incompatible with all the nonstandard product in the field.
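
That inertia argument can be written as a toy formula: inertia equals installed base times per-customer switching cost, plus producer-side costs. A minimal sketch (every number below is hypothetical, not from Dipert's reporting):

```python
def standard_inertia(n_customers, cost_per_customer, producer_costs):
    """Toy model: (customers x individual switching cost) + producer-side switching costs."""
    return n_customers * cost_per_customer + producer_costs

# Hypothetical draft-802.11n installed base (all figures made up for illustration)
inertia = standard_inertia(
    n_customers=2_000_000,       # draft-n units already in the field
    cost_per_customer=50.0,      # cost of replacing an incompatible router or card
    producer_costs=10_000_000.0, # base-station and chip makers' retooling cost
)
print(inertia)
```

The point of the model is simply that inertia scales linearly with the installed base, so every draft unit shipped raised the cost of voting for an incompatible final standard.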

Meanwhile, George Ou has a provocative post where he argues that the 802.11n standardization committee wimped out, deciding to create something that's not really all that much better than 802.11g. As Ou tells it, rather than spend a few extra bucks (initially) on a chip that also supported 5 GHz, the committee stuck with the crowded 2.4 GHz band. That band supports only 3 (or 4) simultaneous channels and is already congested, so (my reading of it is) unless you’re on a deserted mountaintop you’ll never see the claimed 100 Mbps throughput.

If I were the Enhanced Wireless Consortium, once the final standard gets blessed I’d get sample units to the press to demonstrate actual performance.


Open {Standards|Source|Innovation}

Note to non-CS types: the title is a representation in Backus-Naur form of the formulation: “open standards” or “open source” or “open innovation.”
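
For readers who would rather see the expansion than the grammar, the alternation can be enumerated mechanically (a trivial sketch):

```python
# Expand "Open {Standards|Source|Innovation}" into its three alternatives
prefix = "Open"
alternatives = "Standards|Source|Innovation".split("|")
phrases = [f"{prefix} {alt}" for alt in alternatives]
print(phrases)  # ['Open Standards', 'Open Source', 'Open Innovation']
```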

Today I’m speaking at an NSF-sponsored conference at the National Academies in Washington, DC. (More on the conference later.) Within the conference, I’m on a panel entitled “The Ecology and Design of ‘Open’.” Other than the opportunity to join a top-tier group of speakers, I accepted the invitation (despite the 7-hour trip to the right coast) as a way to force myself to start writing about something I’ve been mulling over for a while.

My own presentation is entitled “Contrasting Firm Strategies for Open Standards, Open Source and Open Innovation.” As the title suggests, it provides the opportunity to synthesize the commonalities from a decade of research on standards, open source and open innovation. Today I focused on understanding how the firm’s desire for value capture conflicts with (or must be weighed against) the value created across a value network.

This is only my first cut at a topic I expect to be writing about for several years. I’d welcome any thoughts by readers, either as comments below or via e-mail.

PS: Within the same panel, Elliot Maxwell had an interesting talk on these same three topics; I hope to comment on his talk in a future posting.

[Photo of Joel speaking, courtesy of Alan Wolf]


Verizon Said Nyet to iPhone?

In my 8 previous postings to this blog, I’ve thus far resisted the temptation to mention the word “iPhone.” That’s remarkable because my private commentary on the iPhone announcement (via e-mail to friends) was the impetus that convinced me I had something to say in a blog. This week I’m at a conference, so my take on the iPhone will have to wait another week or two.

Because I’m staying in a hotel, I happened to see USA Today this morning, and, in particular, the article “Verizon rejected Apple iPhone deal.” It’s obviously a Verizon leak, spun to explain to shareholders why AT&T (née Cingular) has an exclusive on the hottest vaporware of the year. Given that Cingular claimed a few weeks ago that they “made Apple bend” — a claim rejected as implausible by professional Apple-watchers — I don’t know who to believe.

Today’s report claims that Apple wanted Verizon to

  • exclude mass market retailers
  • let it control hardware support and returns
  • share an ongoing revenue stream from the phone
The last is an interesting wrinkle on the mobile phone business model that potentially would change the relationships within the industry. Certainly it’s in the opposite direction from Vodafone’s vision of a future of commoditized cell phones.


Saturday, January 27, 2007

Qualcomm’s Stunning Court Loss

Friday a jury ruled in favor of Broadcom in the first of seven patent lawsuits between the two companies. Even if the H.264-related patent is not core to Qualcomm’s assertion of IPR on 3G mobile phones, it’s hard to see this as anything but a big negative for the company.

Any loss emboldens the various mobile phone vendors and component makers, none of whom want to pay Qualcomm patent royalties. I would expect more aggressive lawsuits by Qualcomm rivals and less willingness to sign (or renew) Qualcomm patent licenses.


Is LiMo Gated or Open Source?

The use of Linux in mobile phones is one of the big growth areas for embedded Linux. As mobile CPUs have gotten faster and RAM bigger, the once ludicrous idea of putting Linux on a phone has become feasible. (For those who don’t know the players in the mobile OS platform wars, the most detailed free analysis can be found in a comprehensive 52-page report by Andreas Constantinou of VisionMobile Ltd.)

Last June, several existing vendors of Linux-based mobile phones (as well as two of the world’s biggest operators) announced an alliance to try to create unified Linux standards. The players were:

  • NTT DoCoMo, NEC and Panasonic Mobile Communications, creators of the MOAP-L Linux stack used for some of NTT’s phones. (Other DoCoMo phones use the MOAP-S platform based on the Symbian OS)
  • Motorola, the 2nd largest cell phone maker, whose EZX platform is based on code from MontaVista and Trolltech
  • Samsung, the 3rd largest cell phone maker
  • Vodafone, which, as the world’s largest mobile phone operator, would like to sell more commodity handsets
Conspicuously absent was Nokia (the world’s largest handset maker again in 2006), which has released two Linux-based WiFi tablets but (thus far) no Linux GSM phones.

This week, the alliance got a new name and formal structure. The LiMo Foundation has been incorporated in Delaware but is headquartered in England. Aspects of the foundation appear patterned after the Eclipse Foundation, the most successful open source R&D consortium to date.

The Mobile Phone Development blog (where I learned of the new foundation) dug through the documents and found that they suggest the technical scope and architecture for the planned collaboration.

As someone who researches open source governance and communities, I found something else to be even more interesting. The nine-page white paper outlining the Foundation’s goals and structure says:
Membership in the Foundation is open, subject only to the payment of dues while access to the Foundation source code is subject solely to being a Member and compliance with a security-related self-certification and other specified security measures. [Emphasis added]
Unless I misread this, they are creating what Sonali Shah terms a “gated source” community — not an open source community — in which only members inside a “gate” get access to the source code. (Her recently published paper explains more about the concept).

One way to read this is that they made a simple mistake, and they actually intend to share derivative works of the GPL-licensed Linux with the entire world, without fee or membership. After all, this is the way that embedded Linux is supposed to work. The other way to read this is that the partners just don’t “get” open source: Motorola has been criticized for not sharing EZX changes, while NEC and another Panasonic division were part of the CE Linux Forum, which also had trouble cooperating in an open community.

So far, Nokia has done much better. While Nokia often tries to be a vertically integrated company and push its own standards, on open source (and the 770 and 800), they appear to get it. Another open source research colleague, Sebastian Spaeth of ETH Zürich, is preparing a case study documenting what Nokia did and why.


Friday, January 26, 2007

Is it Fair to Force Open FairPlay?

The news keeps getting worse for Apple on the DRM front, with the Dutch and Germans joining the French, Danes, Finns, Swedes and Norwegians in kvetching about Apple quasi-locking iTunes downloads to work only on the iPod.

The press coverage has been a little misleading. When the head of the Consumentenbond tells Dutch consumers they should subscribe to eMusic instead of iTunes, that’s like Ralph Nader speaking, not a government minister. In any country there are activists who attack business decisions (some deservedly, some not). Also, in France and the Netherlands (if not Germany) the proposed changes would help domestic consumer electronics companies that compete with Apple.

Here are a few questions about the “fairness” of Apple’s decision to restrict FairPlay-encrypted downloads to Apple players:

  • Is the iTunes Music Store open? No, it’s pretty much a vertically integrated distribution and platform system.
  • Is Apple misleading people on their openness? No.
  • Is Apple trying to create lock-in like other proprietary platforms before it? Of course.
  • Do they provide a switching option? Sorta.
  • The payoff pitch: Did Apple gain its dominant market position in music players through illegal means, such as tying? No. How could Apple use the market power of its 3% PC market share to force anyone to buy anything?
Apple got its music market share by making a system that works better end to end. The whole business model rests on a cross-subsidy from the iPod to the music store. Apple makes good margins selling new iPods (as long as people are willing to buy them), but (according to an Oct. 9, 2003 estimate in the Wall Street Journal) pays 2/3 of all revenues to record companies and makes a single-digit gross profit off the downloads. If you kill the IP model, then effectively Apple is being asked to run its music store so its competitors can compete away the profit margins that keep the store open.
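
The arithmetic behind that single-digit claim can be sketched as follows; the track price reflects iTunes pricing of the era, the 2/3 label share is the Journal's estimate, and the remaining per-track cost figure is purely a hypothetical placeholder:

```python
price = 0.99                    # per-track iTunes price of the era
label_share = price * 2 / 3     # ~2/3 of revenue paid to record companies (WSJ estimate)
other_costs = 0.28              # hypothetical: hosting, bandwidth, payment fees per track

gross_profit = price - label_share - other_costs
gross_margin_pct = gross_profit / price * 100   # lands in the single digits
print(round(gross_margin_pct, 1))
```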

The U.S. government said Apple’s system is legal, because consumers can choose between competing systems from Microsoft, Sony and others. The case certainly highlights the risk (particularly for American firms) of pursuing IP-based business models, for several reasons.

As the Microsoft case illustrated, European antitrust standards are different from the U.S. ones — in the U.S. you have to demonstrate consumer harm, whereas in Europe harm to competitors is a legitimate criterion. Secondly, outside the Anglo-American rule of law, the division between making law and interpreting it is not as clearly defined. (To be fair, last summer’s changes in French policy were enacted through the legislative process.)

The textbook definition of political risk is a change of government policy that reduces or eliminates the value of a company’s property held in a foreign country. While OECD countries don’t go around confiscating physical property like some Latin American strongman, the rules regarding intellectual property remain far less predictable. Whether or not Apple’s system is ultimately determined to be legal across the EU, as with open source licenses, legal predictability would be good for industry.


Thursday, January 25, 2007

Cause and Effect: Shared Governance Brings Community

In his blog today, Matt Asay makes an impassioned plea for the open source world to draw a sharp distinction between companies that are sincere about open source, and those who are not. He argues the firms that fully embrace open source will succeed — while those that are half-hearted or “treat open source as a ball-and-chain” will fail.

Based on my research on firm open source strategies over the past 7 years, I am sympathetic to the argument. My hunch is that in most cases there is a structural reason for insincerity: Microsoft (or another big proprietary software vendor) is insincere about open source and fails, but the root cause is that a full embrace of open source would cannibalize its business. Clay Christensen calls this The Innovator’s Dilemma.

However, I take violent exception to the throwaway line at the end:

Open source without community is a sham.
IMHO this is confusing the cause with the effect. There are 100,000+ SourceForge projects without community. Are they a sham, too? Or are they just otherwise sincere people who failed to attract an audience?

For the past 4 years I’ve been researching how firms work with open source communities, together with Siobhán O’Mahony, who did her Stanford Ph.D. on the subject and was the first academic sociologist to study the phenomenon. We’ve concluded that open source communities have three dimensions:
  1. intellectual property
  2. development practices and
  3. governance.
IP (i.e., OSI licenses) and development practices (e.g. CVS, Bugzilla etc.) fall into a fairly narrow range. However, the governance decisions (both transparency and accessibility) that firms make for sponsored open source projects strongly influence whether outsiders decide to join a community.

The obligation of a real open source company is to create governance structures which allow external participants to feel that they have the full right to participate and influence the outcome of the open source project; the best example of this is what IBM did with Eclipse. If no one shows up, then it might be because the sponsor failed to share control — or it might be because the technology has a small potential user base or because another project has already attracted the largest pool of potential contributors.

Matt is a friend, and one of the most insightful people I know on open source strategies, but I think he’s plain wrong here. It may have just been a minor slip of the fingers.

Wednesday, January 24, 2007

Enforcing the GPL on Embedded Linux

Presumably by now everyone knows about the GPL and (depending on who you ask) its reciprocal aka “viral” properties. In the past, to avoid being a propagandist for either side I’ve called this “compulsory sharing” although the reality is that no shorthand description will make everyone happy.

To me, the remarkable thing has been how little enforcement there has been of the GPL on Linux (as opposed to dual-license sponsored projects like MySQL). Maybe it's because Linux proponents don't want to discourage adoption. Or because of the legally unresolved question as to whether linking constitutes a derivative work. Or perhaps it’s because (for server and desktop use) there's the fig leaf of doing separate installs — shipping a CD with a Linux install and (say) a proprietary application, which are thus installed separately so that the application doesn't fall under the “work based on the Program” clause of the GPLv2.

No such fig leaf exists if you have a WiFi router or a DVD player or a cellphone that uses embedded Linux. When I first started researching embedded Linux, one of the embedded BSD vendors argued that this made the more mature and non-GPL BSD operating system more suitable for embedded use.

This week Cisco admitted to violating the GPL on its iPhone (the one no one’s buying, not to be confused with the much-hyped Apple vaporware). They didn't volunteer it, but had to be prodded by someone who reverse-engineered the firmware and figured it out. This sort of GPL violation is probably pretty common, and obviously a big company like Cisco is an attractive target, but there were two interesting points. One is that the inquisitive engineer was from the formal GPL Violations Project. The other was that the alleged violation was over Cisco not disclosing its own software, rather than over modifications to the Linux software (which would also be violations of the LGPL, MPL, EPL, etc.).

Such disclosure of one’s own software is exactly the unresolved legal question that Larry Rosen (and others) have been pointing out with the Free Software Foundation's interpretation of “derivative work.” It could have huge implications for the many mobile phone manufacturers who are today shipping Linux-phones with unshared proprietary modifications.

If such enforcement becomes common, then it is going to change the nature of innovation in segments that use embedded Linux, forcing them to become more open in their innovation practices. Some would choose to make sharing a virtue and actively collaborate, while others might decide the cost of sharing is not worth the incremental value of Linux over non-GPL alternatives. On the other hand, if the current regime of sporadic ad hoc enforcement continues, then it isn't going to serve as a deterrent for most vendors, and lots of schlocky little companies are going to design business plans assuming they can fly under the radar and not get caught.

It would be helpful for all concerned if there were a test case that told us what the law actually is. Of course, if the GPL were tested in a US court, it’s possible it could lose, which could account for the lack of a test case.

Monday, January 22, 2007

Did OSDL’s Business Model Fail?

As many people have reported, Open Source Development Labs (OSDL) and the Free Standards Group (FSG) are merging. I first learned about it from Andy Updegrove’s blog (imagine, a lawyer blogging at 6pm on a Sunday night). It is being presented as a merger, but (as with the Alcatel-Lucent or HP-Compaq “mergers”) it’s obvious that the FSG is taking over the OSDL. And as Steven Vaughan-Nichols remarked, given their overlapping missions, “What took them so long?”

Only one of the articles that I saw, by Charles Babcock at InformationWeek, linked the merger to OSDL’s obvious financial difficulties, as evidenced by last month’s announcement that its CEO had quit and it was reducing its staff by 30%. I don’t know if such a linkage is considered too subjective (aka “analysis” or “commentary”) or if the FL/OSS fans among the press feel obliged to report only good news.

Which raises the question: if it couldn’t pay its bills, did OSDL’s business model fail? As I noted in 2004 (in a journal article that was finally published last year in R&D Management), OSDL is just an industrial R&D consortium whose results spill over to non-members more readily than those of a typical consortium.

Any consortium — whether in the auto industry or one of Updegrove’s computer industry standards consortia — must serve the interests of its members to get the revenues necessary to keep the doors open. So if the OSDL consortium wasn’t supporting the business models of its members — or it was supporting the business models of an ever-decreasing number of members — then that would account for the financial problems.

Another possibility is mission creep, a chronic temptation for nonprofits that don’t want to go out of business. The best-known example is the March of Dimes, which helped cure polio but switched to birth defects and prenatal health to stick around for another 50 years. So perhaps OSDL has accomplished its original mission — legitimating Linux — and earned its retirement.

Absent inside information, right now I have more questions than answers. Perhaps an enterprising reporter will follow up.

Sunday, January 21, 2007

The Business of Blogging Software

I started in 2007 resolute in my vow to never become a blogger. All that changed last week in my SJSU course on entrepreneurship, when the topic was business models.

Since my own research is about open source business models, I thought I'd offer students an example of a current startup pursuing a business model that includes open source. For my technology strategy class, I've explicitly taught my current research on open source business models, but that's more detail on open source than appropriate for this class.

For Saturday's class, my idea was to look at the wiki or blogging software companies that release their software as open source. By Friday, I'd signed up for 5 free services — 2 for wikis and 3 for blogs. I also created 3 blogs on 2 services. I thought trying the software would be a good way to learn about the business (or at least that was my rationalization).

My original interest was piqued by Wikia, the company co-founded by Jimbo Wales using the MediaWiki technology developed for Wikipedia. As a speaker at the Wikimania conference last August, I heard Wales talk about his company and its 2006 relaunch.

The problem is, does Wikia have a business model? I study this sort of thing for a living, and right now I just don’t get it. Of course I like open business models, but if you don't capture value somehow, how do you pay the bills? (And are banner ads alone enough?)

So I spent a day checking out various wiki and blogging startups, including the two (JotSpot and Pyra Labs) purchased by Google. As a teaching case, I decided to use Socialtext, because at least their revenue model is pretty clear.

What did we conclude? The exit strategy here is tricky; 5 years from now there isn't going to be a stand-alone blogging software market and probably not one for wikis either. This is all an economies of scope play (as defined by Panzar & Willig back in the 1970s).

Now that Google has one of each, are they done buying? And once MSN and Yahoo have dance partners, what will happen to the remaining belles of the ball? It doesn't sound like a Cinderella ending.

New Books on Open Innovation

I'm starting out this blog with some observations on open innovation, a recent area of research for me.

I am the co-editor of a recent book on innovation, Open Innovation: Researching a New Paradigm. The authors and I are justifiably proud of the serious and (IMHO) thoughtful treatment of the ideas of open innovation, an innovation paradigm that is becoming increasingly important in industry and academia. If you want a deep understanding of open innovation from multiple perspectives, this is the place to start.

My friend Henry Chesbrough also has a new book entitled Open Business Models, which offers a more practical outlook on how to incorporate open innovation into a firm's business model.

Finally, Henry’s original 2003 book is now available in paperback.

More information can be found at www.OpenInnovation.net.