Thursday, October 29, 2009

Insider blasts California dysfunction

One of the longest-serving California politicians testified last week about the train wreck that is the California legislative process. Testimony by State Treasurer Bill Lockyer included this (as reported by the LA Times):

"There's too much junk," Democrat Lockyer told the committee members, raising his voice. "I'm sorry, but two-thirds of the bills I see come out of the Assembly, if they never saw the light of day, God bless it. . . . Just stop it! Just stop it! . . . Just say 'No.' "
Meanwhile, in an excerpt posted to YouTube, Lockyer predicted a train wreck if the state doesn’t limit public employee pensions:
It's impossible for this legislature to reform the pension system and if we don’t we bankrupt the state. I don't think anybody can do it here because of who elected you.
If anyone is a Sacramento insider, it’s Bill Lockyer. He spent 25 years in the state legislature, ending his stint as the leader of the state senate in 1998. He then joined the California musical chairs gig, in which Democratic officials swap statewide jobs every eight years to avoid term limits. He spent eight years as attorney general and is nearing the end of his term as state treasurer.

Some argue that the solution to the problems of California misgovernance is weaker checks and balances, but I don’t think that was Lockyer’s intention. Lockyer said last week that he believes there should be a cap on spending, which suggests he does not expect to ever again compete in a contested Democratic primary (where public employee unions usually pick the nominee).

Lockyer was testifying before the legislative committee nominally interested in reforming itself. As the LA Times reported:
Legislative leaders created the reform committee for two basic reasons.

One is only whispered about. It's the threat of mass demotion. There's a proposed ballot initiative to reduce the full-time Legislature to part-time status, cutting members' salaries in half. That wouldn't be reform, only public retribution.

The part-time movement is struggling. But lawmakers are responding to the threat, trying to show voters that they can reform themselves.

As the 18th century English writer Samuel Johnson observed: When a man knows he is about to be hanged, "it concentrates his mind wonderfully."

The second, more important and publicly stated reason for the committee's creation is that the state Capitol has fallen into dire disrepair. Politicians too often are paralyzed. California needs a functional Legislature to resolve its nagging budget, education, water, transportation and economic development problems, to list a few pressing priorities.
If California history is any precedent, the legislature will give up power (or otherwise make major reforms) only when it’s too late. (Exhibit A: Proposition 13.) Potemkin reform (from the same people who give us Potemkin balanced budgets) will be the norm, until a ballot initiative qualifies.

The big question mark is the governator. A man of no discernible political or ideological principles, he has run against and embraced both parties as well as various special interests. A lame duck with a little over a year left in office, he could easily come out with both guns blazing, cajole the legislature into symbolic reform, or help them resist reforms in exchange for one last budget deal.

While workers in the real economy ignore Sacramento and get on with their lives, lack of budget discipline is wreaking havoc on the state’s once-vaunted higher education system. Of course, the mismanagement of the state economy means that employers are seeking to move jobs (if not entire operations) out of the state. But neither seems urgent enough to change how the state government works.

Wednesday, October 28, 2009

Why "too big to fail" must end forever

By John Kay of the FT:

There are people who believe that, in future, better regulation, co-ordinated both domestically and internationally, will prevent such failures. The interests of consumers and the needs of the financial economy will be protected by such co-ordinated intervention, and there will never again be major calls on the public purse. There are also people who believe that pigs might fly. Mervyn King, governor of the Bank of England, has made enemies by pointing out that they will not.

It is impossible for regulators to prevent business failure, and undesirable to pursue that objective. The essential dynamic of the market economy is that good businesses succeed and bad ones do not. There is a sense in which the bankruptcy of Lehman was a triumph of capitalism, not a failure. It was badly run, it employed greedy and overpaid individuals, and the services it provided were of marginal social value at best. It took risks that did not come off and went bust. That is how the market economy works.

The problem now is how to have greater stability while extricating ourselves from the “too big to fail” commitment, and taking a realistic view of the limits of regulation. “Too big to fail” exposes taxpayers to unlimited, uncontrolled liabilities. The moral hazard problem is not just that risk-taking within institutions that are too big to fail is encouraged but that private risk-monitoring of those institutions is discouraged.

When the next crisis hits, and it will, that frustrated public is likely to turn, not just on politicians who have been negligently lavish with public funds, or on bankers, but on the market system. What is at stake now may not just be the future of finance, but the future of capitalism.
The latest installment in outsourced economic criticism in these difficult times.

Thursday, October 22, 2009

Web 2.0 shakeout? Now they tell us!

“Early Tremors: Is It Time for Another Social Network Shakeout?” is the title of a new article this month by Knowledge@Wharton, published by the PR arm of the highly-ranked University of Pennsylvania business school. The article was highlighted on Wednesday by a Wharton column at Forbes.com and I read it after coming home from a discussion with my MBA students about Web 2.0 business models.

The conclusion of the article is a combination of optimism and skepticism:

"Even in the U.S., the most mature market for social networking, there's user growth at every category level," says [Andrew] Lipsman [of Comscore].

Experts at Wharton, however, question how long that growth can last. "The bubble hasn't popped yet and there's tremendous value in social media," says [Wharton Professor Peter] Fader. "But it's wishful thinking to [believe] that others on the 'me too' bandwagon will survive."
In a 2006 article, the Wharton faculty were cautiously optimistic:
When it comes to placing a valuation on the social network sites, Wharton marketing professor Leonard Lodish says traditional tools, such as the discounted present value of the profit stream, apply to these new Internet networks as much as they do to any other business. He recalls an argument he had with marketing students during the Internet boom of 2000 about Internet music seller CDNow. Lodish said the firm would never be able to justify costs of $70 to attain each customer. The following year the firm declared bankruptcy.

In the case of MySpace and Facebook, Lodish points out, the cost of gaining new customers is practically nothing because users join voluntarily and provide their own content through their profiles. In addition, the cost of running the sites' web servers is relatively low. If a classic advertising or subscription revenue model is used, he says, low-cost social network sites could be highly profitable.

Yahoo must buy or develop content for its site to attract advertisers and Google has to invest in its search capabilities, Lodish notes. "Yahoo makes a lot of money selling ads on its sites. Why can't Facebook and MySpace do the same thing?"
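Lodish’s valuation test is easy to make concrete. Here is a minimal sketch of the discounted-present-value arithmetic he describes, with illustrative numbers of my own (the $10/year profit and five-year horizon are assumptions, not figures from the article):

```python
def customer_npv(annual_profit, years, discount_rate):
    """Discounted present value of a per-customer profit stream."""
    return sum(annual_profit / (1 + discount_rate) ** t
               for t in range(1, years + 1))

# Hypothetical customer: $10/year of profit for 5 years, discounted at 10%.
value = customer_npv(annual_profit=10, years=5, discount_rate=0.10)
print(round(value, 2))  # ~37.91 -- far below a $70 acquisition cost
```

By this test, a CDNow-style $70 acquisition cost never pays back, while a social-network user acquired at near-zero cost clears the bar even with tiny per-user profits.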
I don’t know any of these guys, but Wharton gets its pick of each year’s graduating PhDs, and every Wharton prof I’ve met has been somewhere between very smart and extremely smart. (They only keep the faculty who are good at publishing in “A” journals, a somewhat unrelated skill.)

There’s no way to criticize the opinions of other academics without appearing catty, but I think these distinguished faculty are way late to the table.

I’ve been a skeptic since the earliest days of this blog — going out on a limb to say that all of these Web 2.0 firms face a challenge monetizing their traffic and that when it comes to profitability, most of them will fail to deliver on the hype. Let me summarize how I’ve argued for almost 2½ years that the Web 2.0 emperor is buck nekked.

May 18, 2007, “The looming crash in Web 2.0 hype”:
[Guy] Kawasaki’s new business is heavily swathed in Web 2.0 hype.

To me, Kawasaki’s entry is a convincing sign that the Web 2.0 fad is peaking. The hype has been promoted by a book publisher [O’Reilly & Associates] trying to sell books and conference registrations. Of course the web is changing, but it’s a silly or self-serving idea that there’s a discrete transition (2.0 vs. 1.1 or 3.2) or that all of these changes are somehow part of a unified industry change.

Once the hype/fad/boomlet crashes, then we’ll see which business ideas are sustainable and thus can survive without the hype. There is the obvious parallel to the dot-com crash, when the viable businesses dropped the “.com” and went back to selling their goods (and stock) based on a real value proposition rather than riding a web fad.
May 27, 2008, “Web 2.0 just like Web 1.0”:
Because of my master’s student’s 2007 project studying Web 2.0 in the mobile phone industry, I’ve been tracking Web 2.0 more closely over the past year. And there was certainly a sense of déjà vu all over again.

I’ve done consulting for a dot-com and several open source startups. With or without Web 2.0, these businesses are good at creating value and less good at capturing it. In particular, if you want profits, you first need to get a growing, dependable source of revenues.
August 1, 2008, “Web 2.0: deja vu all over again”:
When I was asked to speak [at USC], I decided to talk about Web 2.0 business models. … The first part summarized the use of the term “Web 2.0,” starting from the definition by Tim O’Reilly back in September 2004.

I then summarized the Web 1.0 problems and how they apply to Web 2.0. Effectively, Web 1.0 was commoditized due to low entry barriers (compared to, say, retailing or radio stations), too many entrants, low perceived customer value for commoditized content, and questionable revenue models. Web 2.0 has exactly the same problems.

How will it turn out? Clearly a Web 2.0 shakeout is coming; this seems like 1999 of the Web 1.0 (i.e. dot-bomb) era, which means that the shakeout should happen in the next 2-3 years.
October 12, 2008, “Web 2.0 most likely to crater”:
A regular topic on this blog is the problem of Web 2.0 business models, and in particular that these emperors have no clothes.

To this same end, on Friday CNET published a list of 11 Web 2.0 companies most likely to run out of money and die…
In my August 1, 2008 column, I quoted this 2005 observation:
[B]ubbles and consequent shakeouts appear to be a common feature of all technological revolutions. Shakeouts typically mark the point at which an ascendant technology is ready to take its place at center stage. The pretenders are given the bum's rush, the real success stories show their strength, and there begins to be an understanding of what separates one from the other.
As I concluded then:
What makes this even more delicious is that the comment was made back in 2005 — when Tim O’Reilly was explaining what happened with the end of the Web 1.0 era and how it would mark the beginning of the Web 2.0 era.

Hopefully before “Web 3.0” is coined, someone will take seriously the problem of inadequate revenue models.
As Michael Arrington of TechCrunch pointed out last month, the hype and concomitant inflated valuations make it particularly difficult for Web 2.0 companies to even attempt revenue models, for fear that they will bomb and destroy their valuations. Thus, there are perverse incentives keeping firms from addressing their most fundamental problem — until the eventual Web 2.0 crash in valuations forces all firms to demonstrate profitability or die.

Monday, October 19, 2009

Not many Nokias come with music

Nokia has wanted to be a vertically integrated mobile services and devices company. It has its own app store, its own maps and its own music service.

How’s that working out, Nokia? The data is in on the latter, as PaidContent (and others) report:

Free unlimited music for more than a year seemed like a good proposition, but, nine months after its launch, Nokia’s Comes With Music bundle is not proving particularly popular with consumers.

In July, the scheme had just 107,227 users across the nine countries in which it’s launched, according to figures Nokia (NYSE: NOK) sent out to labels and distributors, obtained by MusicAlly.
By comparison, PaidContent mentions that Spotify has 6 million users. As of May, Pandora claimed to have 4 million iPhone listeners. Sirius XM radio quickly got 1 million downloads for its music app (although it’s not clear how many of those pay the monthly fee).

The granddaddy of them all, the iTunes Store, has sold more than 6 billion songs, even if most of them (so far) are only for use on PCs.

PaidContent’s assessment:
Nokia has placed a lot of importance, and a lot of marketing money, on Comes With Music. But the programme has been hampered by DRM and complex PC operation and confusing PC-mobile sync. And that advertising campaign has really failed to communicate exactly what CWM means.

Even the official blog on which Nokia is celebrating CWM’s first birthday acknowledges: “Sure, it didn’t start out that rosy, with lots of folk not really certain about what Comes With Music offered ... we never shied away from the important education process that is needed in order to fathom that you can download and forever-keep as many tracks as you like – but the past 365 days have seen a much greater understanding and appreciation for the service emerge.”
DRM is dead; simplicity and ubiquity are in. It sounds like Comes With Music has a long way to go. Is the problem the product concept? The execution? The business model?

Whatever the problem, it’s clear that the benefits of vertical integration (compared to all the point solution rivals) are minimal for Nokia. One could argue that vertical integration has paid off handsomely for Apple, but for the past decade the company also had the best execution in the IT industry for product design and ease of use. The point products can’t copy Apple’s vertical integration, but they can hope to copy (or surpass) its other advantages.

If my experience with my Nokia E61 and S60 is any indication, Nokia is a long way from winning awards for ease of use.

Saturday, October 17, 2009

What happened to disintermediation?

A front page story in the WSJ Friday talked about Wal-Mart’s price war for online books, with $10 bestsellers. The money quote is pretty good:

"If there is going to be a 'Wal-Mart of the Web,' it is going to be Walmart.com," said Walmart.com CEO Raul Vazquez in an interview. "Our goal is to be the biggest and most visited retail Web site."
Wal-Mart is also planning on getting into electronic books. Amazon is responding every way that it can.

But there was one passage that I think that deserved more scrutiny:
The price war sent shivers through the publishing world. Wal-Mart's move, and similarly low prices for electronic books, may ultimately condition consumers to expect new titles to cost $10, a price that would force the publishing industry to re-scale its entire business, including the advances paid to writers.

"The endgame is rather scary for authors," said one book executive.
Here’s my question: why is it scary for authors?

Authors sell to publishers who (sometimes) sell to distributors who sell to retailers who sell to consumers. If there’s a price war that cuts the retail price to consumers, where is it written that authors are the ones who should suffer?

A decade ago, pundits and academic researchers and MBA teachers were saying that e-commerce would bring “disintermediation.” This means the starting and ending point of the value chain (content creator and content consumer) are essential, but one or more of the intermediaries is superfluous or obsolete.

Under this scenario, either the publisher or the retailer could get cut out of the deal entirely. For example, my favorite band released their latest album by self-publishing, cutting out their 30-year publisher Warner/Elektra/Asylum. (BTW, the album was initially released directly and exclusively to Wal-Mart, and sold quite well.)

So at best, for the publishers this suggests a three-way scramble between authors, publishers and resellers (mostly Amazon) to create a 2 vs. 1 coalition to squeeze the margins of the third. At worst, the top selling authors will emulate the top music artists and begin to bypass the publishers for direct distribution.

Of course, the returns and economies of scale are seriously skewed here. Amazon & Wal-Mart are running their price wars with the very top bestsellers, the same books that earn back their typesetting and printing setup costs in a day — and that keep the publishers’ doors open. Authors of million-copy novels can dictate terms to (or bypass) publishers in a way that authors of 500-copy academic books cannot.
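The amortization arithmetic behind that point is worth spelling out. A rough sketch with hypothetical figures (the $20,000 setup cost and $2 marginal cost are my assumptions, chosen only to illustrate the effect):

```python
def unit_cost(fixed_setup, marginal_cost, copies):
    """Per-copy cost: fixed setup amortized over the run, plus marginal cost."""
    return fixed_setup / copies + marginal_cost

# Hypothetical: $20,000 of typesetting/printing setup, $2 marginal cost per copy.
bestseller = unit_cost(20_000, 2.00, 1_000_000)  # setup adds 2 cents per copy
academic = unit_cost(20_000, 2.00, 500)          # setup adds $40 per copy
print(round(bestseller, 2), round(academic, 2))
```

At a million copies the setup cost is noise; at 500 copies it dominates, which is why bestselling authors have leverage over publishers that academic authors do not.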

Still, this re-opens the question of which intermediaries (if any) uniquely add value — and which ones have negotiating leverage. There are many publishers, but in the US only one Amazon. Even adding Wal-Mart and BN.com, the publishers don’t have a lot of options to disintermediate retailers. To me, the endgame for publishers seems scarier than a Stephen King novel.

Hat tip: Good Morning Silicon Valley on SJMercury.com

Friday, October 16, 2009

A thousand posts later

Yesterday’s posting marked the 1,000th posting on Open IT Strategies since I began in January 2007. It seems worthwhile to note the milestone and changes for the next phase of the blog.

From the first 1,000 postings, the top 5 topics are about what I would have expected after the first six months:

  • business models (107)
  • iPhone (104)
  • Apple (101)
  • mobile phones (94)
  • Google (80)
Of the next 10 categories, open source (43) and smartphones (39) are surprisingly low, while academia (39) and off-topic (39) are surprisingly high. Commoditization (53), Microsoft (46), Web 2.0 (42), and Nokia and open standards (36 each) are about where I’d expect them.

The blog has become a bit more of a mobile phone industry blog than an open standards blog, in part because interesting issues of open standards strategies are relatively rare.

To set direction going forward, I ran a reader survey in August. Almost 300 people saw the survey, but only 19 replied. The first page (asking opinions of specific posts) was very useful but also destroyed the response rate. Overall, business models was the most popular topic with non-tech topics the least popular. (I was surprised that there wasn’t more interest in media industries and information goods).

The articles that attracted the most negative reaction were on issues of economic policy (57 articles total) and related topics about (mis)management of the US economy. Given the high representation of respondents in the Bay Area and Europe, that’s probably to be expected. (I never intended to write about these topics, but 13 months ago I succumbed to baiting by a friend and former John Kerry elector who wanted to see if I would defend “W”; as it turns out, my last posting about his economic policies was harshly critical.)

To understand what attracts interest, I also poke at Google analytics now and again. From both the user comments and the analytics, the most consistently popular story in 2009 was my 2,800 word eulogy for the Mervyn’s department store, apparently because it was one of the few personal tributes to a store that was loved by many.

From the survey, one thing came through loud and clear: more is not better. Readers can’t keep up with more than 3-4 postings a week — which is good, because some weeks I have spent 8 hours or more on the blog. Right now, the press of teaching has slowed down my blogging (as it did in Fall 2007), but even when I have more time I’m going to limit my postings and spend most of those on the topics of greatest interest.

Should I get even more time, I will spend it keeping my other blogs alive with weekly postings rather than writing more articles for this blog than people can read. I have more ideas than I can possibly cover in one week, so I’ll have to be selective to fit my time available to write and the time of my readers to read.

To supplement blogging, I’ve also discovered the benefits of Twitter®. I have begun to vent some of my ideas there, whether it’s griping about threats to free markets or reposting the interesting OSS/tech policy insights of my buddy Matt Asay.

Also to reduce time blogging on economics issues, I’ve decided to feature in the sidebar four of my favorite blogs by economists defending free markets: Greg Mankiw (Harvard), John Lott (Maryland), EconLog (a group of contributors) and the official blog of the libertarian Cato Institute. This supplements the four tech blogs previously/still listed on the right.

And speaking of economics, I believe the Google AdSense experiment is now at an end. I earned $12 in three months, which certainly didn’t make it worth the tackiness of highlighting their ads. Instead, I’ll go back to occasionally mentioning a book that I personally find interesting, and rely on Amazon Associates to bring in the $50-100 in referral fees that I earn every year — less than the minimum wage for the time I spend in any given month, but of greater value to readers than the ads.

Thursday, October 15, 2009

Don't buy Snow Leopard

If previously I was equivocal, now I’m clear: buying Snow Leopard was a mistake.

Sure, I was already irritated that Mac OS X 10.6 deletes AppleTalk and PowerPC support, although admittedly some people may have no need for these technologies. I was running it on my Intel-based iMac, which relies on two printers, neither of which has changed in the past six years.

But today, trying to scan a document for my tax return, I found that 10.6 breaks existing scanner drivers. When I tried to re-install the scanner software, the installer refused to install one of the kernel extensions (kexts), implying that some or all 32-bit kernel drivers are toast.

Sure enough, my scanner is known to be incompatible with OS X 10.6, and it’s not on the list of scanners HP plans to support for 10.6. This is the second time this decade that Apple has broken drivers, a record that seems to surpass Microsoft’s.

Facing this, I wasted most of the afternoon re-installing 10.5, copying user files and preferences, re-configuring printer and network settings, and installing post-10.5 updates. And I paid $50 for the privilege of having to do so. (If this had been Mac OS 7, 8 or 9, the re-install would have been done in 20 minutes.)

As a systems programmer and standards strategist, I can see why Apple developed 10.6 to help the NeXTies force a transition away from legacy architectures and APIs. It appears to have succeeded wonderfully here.

However, unless you own a large-RAM desktop or server, I can’t see why anyone would pay for this less capable software. Wait for the next release that actually does something new.

Monday, October 12, 2009

This year's Nobel prize

An unusual duo won this year’s “Nobel” prize in economics (awarded by the Bank of Sweden).

Oliver Williamson has been a “future Nobel Prize winner” since I entered grad school in 1994, for his work on transaction cost economics, i.e. the choice between markets and administrative hierarchies to solve problems. EconLog summarizes his most famous idea, while both the Hayek and Mises blogs claim him as a disciple of the Austrian school. (That’s certainly more feasible than the same claim about last year’s winner.)

The less expected choice was Elinor Ostrom, a political scientist who uses economic rationality to examine questions of public choice. I know her from her book (and papers) on the tragedy of the commons. When considering open source and other examples of social production, Ostrom (and subsequent authors) offers ideas on how to mitigate the free-rider effects that would otherwise cause these communities to collapse.

Libertarian blogger Virginia Postrel notes an important common thread between the two. Along with prior winners Ronald Coase and Douglass North, both are key contributors to the New Institutional Economics. I agree with Postrel that a major implication of NIE is that free market economies are governed in part by non-governmental economic institutions, and that getting such institutions right is crucial to economic growth.

So in a year when conventional wisdom is rejecting free markets — and the power of economics as an explanatory social science — this year’s economics selections (at least) validate key pillars of the field as they have been recognized for decades.

Thursday, October 8, 2009

Cellular smackdown

Visiting the CTIA Wireless IT conference here in San Diego has brought into focus the direct rivalry between the two largest carriers, which together account for two-thirds of all US cellular subscribers. AT&T is the innovative device company and Verizon Wireless is the network quality company.

Neither was present at the MobileFocus press party Wednesday night, which featured the other two major carriers and four of the world’s five largest handset makers (plus RIM but not Apple). However, one of the keynotes this morning (after the former and current Qualcomm CEOs) was by John Donovan, CTO for AT&T.

Donovan presented five principles for a mobile network operator:

  • Growth happens. Eyeballing the graph, it looks like mobile data on AT&T’s network is about 15,000-20,000 terabytes per day. Wireless data has grown 4,932% (roughly 50-fold) over the past 12 quarters, and not just because of the 8.3 million iPhones activated in the last year.
  • Law of the jungle. AT&T is spending heavily on 3G rollout to provide HSPA 7.2 in cities like LA and Chicago this quarter, everywhere by Q2 2011. (To me there was an implication that smaller carriers would be unable to keep up.)
  • Innovate or Die. There's 85% penetration, so innovation in services and business models (like renting rather than buying music) will drive growth.
  • Customers Demand Open. It’s not clear if AT&T is more open than VZW, but they say they are.
  • Invest for the Future. AT&T plans to trial 4G LTE in 2010 and rollout beginning in 2011.
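The growth figure in the first bullet checks out arithmetically: 4,932% cumulative growth means the final level is about 50 times the starting one, or roughly 39% compound growth per quarter (my calculation, not AT&T’s):

```python
# 4,932% cumulative growth: final level = 1 + 49.32 = 50.32x the start.
cumulative_growth_pct = 4932
multiple = 1 + cumulative_growth_pct / 100   # the "50-fold" figure
quarterly_cagr = multiple ** (1 / 12) - 1    # compounded over 12 quarters
print(round(multiple, 1), round(quarterly_cagr * 100, 1))  # 50.3 38.6
```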
Interestingly — in contrast to Verizon’s reputation (deserved or not) for network reliability — Donovan’s presentation was a bit defensive about network reliability. He said he is well aware of press coverage of network issues, but “I don’t plan our network based on blogs. … no one knows more about the wireless network experience than AT&T”.

(AT&T sent both its wireless CEO and corporate CTO to keynote on successive days, but unlike Verizon didn’t have a booth to answer questions.)

However, Donovan recounted AT&T’s impressive record of firsts in innovative mobile devices. AT&T (or more likely its predecessors like Cingular) was the first US carrier for Palm, Windows Mobile, BlackBerry and the iPhone. Windows Mobile launched Pocket IE (for better or worse) at a time when the rest of the world still thought WAP was the future.

Donovan argues that AT&T’s experience with the iPhone (and its 75,000+ apps) means the carrier understands what consumers expect from the latest mobile applications. It’s a plausible argument.

Carriers may differ in quality today, but eventually wireless bandwidth will be a commodity. The higher levels of the stack — applications, services, connecting people and not hardware — will become the drivers of growth, usage and revenues.

All of this is happening at the device level and above, which is why VZW will need to shed its parent (Vodafone’s) historic desire to commoditize handsets, and seek out the latest hardware. The rumored Dell Android phone for 2010 will be just a start.

Android's certainly uncertain future

In a week of CTIA-related mobile platform announcements, industry analyst Gartner predicts that in the next 39 months, Android will rise from less than 2% market share to 14%, becoming second in the global market after Symbian.

I don’t buy it: 14% seems plausible, but I think it implausible to assume that neither iPhone nor BlackBerry will get to 14% by then, given their strong recent growth. Also, by some estimates RIM is already well above 14%.

But then there is the spurious precision, typical of the genre, of the Gartner forecast for 2012, three years from now: it has Android edging out the iPhone, BlackBerry and Windows Mobile by a fraction of a percent in 2012 (not 2011 or 2013).

As Computerworld reported:

The complete Gartner forecast for smartphone OSes by the end of 2012 puts Symbian on top with 203 million devices sold, and 39% of the market. Android will be second with nearly 76 million units sold, and 14.5% of the market.

Coming in a close third, the iPhone will ship on 71.5 million devices in 2012, giving a 13.7% market share. Windows Mobile will finish fourth, with 66.8 million units sold, or 12.8% of the market.

Very close behind Windows Mobile, the BlackBerry OS will sell on 65.25 million devices in 2012, Gartner forecasts, making it fifth with 12.5% market share.

Various Linux devices will sell 28 million units, at 5.4% market share, in sixth place. Palm Inc.'s webOS will sell on 11 million units in 2012, about 2.1% of the market, in seventh place, Gartner says.
Why 65.25 million? Not 65.3 million or 64.9 million? This sort of precision is GIGO.

Last year, Gartner said Android would get 10% share in 2011. At least that’s an estimate with a single significant digit, without the pseudo-precision.

In the end, what was published is just a guess — maybe more of a WAG than a SWAG. It doesn’t take an industry analyst to predict that Android will grow rapidly, but how fast, and what the natural limit is, remain unclear.

This also points out the stupidity of point estimates. If we accept the calculation as an unbiased estimate, then it is more reasonable to say Android will have 8-20% market share in 2012.
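To make that interval concrete: applying even a modest error band to Gartner’s 14.5% point estimate recovers roughly the 8-20% range. (The ±40% band is my assumption, purely for illustration; Gartner publishes no error bars.)

```python
point_estimate = 14.5   # Gartner's 2012 Android share forecast, in percent
relative_error = 0.40   # assumed +/-40% forecast error (illustrative only)

low = point_estimate * (1 - relative_error)   # ~8.7%
high = point_estimate * (1 + relative_error)  # ~20.3%
print(round(low, 1), round(high, 1))
```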

Wednesday, October 7, 2009

Sharing in faux openness

The world's richest Internet property has landed America’s largest mobile phone carrier, after launching a year ago with the smallest of the major carriers.

The result of 18 months of negotiations, this presumably ends the feuding between the two over Google’s efforts to cajole, incent or mandate open networks in a mobile version of net neutrality.

But perhaps the two parties have a shared interest in using openness as a marketing slogan, some magic pixie dust sprinkled over a non-open strategy.

As noted earlier, Google (and other dot-coms) want open networks (dumb pipes) as a way to cement their existing market positions. And Android has an association (the Open Handset Alliance) of companies that want to catch the code Google is throwing over the wall, but not a fully open, shared-governance open source community à la Eclipse.

As for Verizon Wireless, they’ve been claiming that their network is open, but there’s understandable skepticism for the operator that has proven the most controlling and difficult of the US carriers to work with.

Google won a major concession in that unlike AT&T, Verizon promised not to block Google Voice. However, this decision will provide no competitive advantage because (as was inevitable) AT&T announced Tuesday that it had decided to capitulate and (also) allow Google Voice.

However, open does not mean that all applications and libraries are treated the same. For example, the revenue split between Verizon and Google is unknown: some claim that Google offers a 30% split, while others claim that the Android Market deal is so generous that Verizon will never take the iPhone.

Presumably Verizon will stop denigrating handsets — its current ads attacking the iPhone basically say who cares about the handset, it’s the network that provides the value. Certainly Verizon’s decision to seek advanced handsets will increase competition in the US smartphone market: T-Mobile and Verizon with Android, AT&T with iPhone and Sprint with webOS.

Verizon promises the new deal is not exclusive, causing Forbes to term the relationship an “open marriage”:

During Tuesday's call, Verizon's Chief Marketing Officer John Stratton said the company intends to continue working with multiple handset partners and software companies. "We don't see expanding business opportunities with one company as taking opportunities away from another," he added.
I’m guessing that Google will someday let go and OHA will eventually become open.

Conversely, based on decades of Baby Bell DNA (held by both Cingular/AT&T and Verizon), I doubt the operators are going to suddenly get very open on their own. However, there may be enough pressure from the new administration and new FCC majority (at least until the 2010 Congressional elections) that faux openness could very well be replaced by real openness before they’re done.

The AT&T and Verizon announcements are the first of a series of major announcements this week at the CTIA Wireless show in San Diego. (I hope to have some live postings on Thursday).

Update, Wed 11pm: Somehow I missed that last month Sprint announced an Android device from HTC, as I predicted back in April.

Tuesday, October 6, 2009

Keep out the lawyers

Free software, open source software and Creative Commons are all efforts to use copyright law to enforce a particular philosophy of collaborative innovation. (In many cases, the philosophy is more important than the technology, as Jason Dedrick & I discussed in a 2007 paper on the minimal impact that F/OSS ideology has upon F/OSS adoption.)

At the same time, possession (as they say) is 9/10 of the law. Proprietary software was a barrier to imitation because of trade secrets, not copyright. Inherently, giving out your source code means that other people have it and that you need a lawyer (and a court) if someone uses your code in a way you don’t want them to.

A friend (and blog reader) forwarded to me an interesting summary of an OSS dispute involving Jin (a GPL-licensed Java client for chess servers) and iChessU, a new company.

Sasha Maryanovsky, maker of Jin, licensed his code for US $4,000. iChessU used the code but reneged on paying, so Maryanovsky sued in Israeli court. Both sides hired lawyers, filed briefs, and also argued the case in the court of public opinion. (Linux.com wrote up the dispute in 2006).

Maryanovsky summarizes the saga, which eventually settled in 2008, in his blog. I don’t know the billing rate of an Israeli IP attorney, but I’m guessing that both sides would be lucky if they paid less than $100k to their attorneys. (The settlement terms are confidential, but if the case against iChessU is as open and shut as GPLniks would claim, it’s likely that it paid both sets of legal fees.)

Interestingly, Maryanovsky’s home page (and LinkedIn account) imply that he’s an unemployed Java programmer. Open source is often used by student programmers to signal their skills and availability to the labor market, so either the Java programmers labor market is very soft, or Maryanovsky sent the wrong signal. (Perhaps a JME/J2ME social networking client would have been a better signal).

Penultimate Flash holdout falls

On Monday, Adobe announced Flash 10.1 (not Flash Lite) will be available in beta form for Windows Mobile and webOS in 2009, and for Android, BlackBerry and Symbian in 2010. Adding support for the Research in Motion platform to the previously announced list leaves only one smartphone conspicuously absent.

Apple has been conspicuously rejecting Flash for a variety of reasons, both implicitly and by explicit Steve Jobs pronouncements. It uses its terms of service to ban such technologies. PC World argues that Apple is fighting Flash to protect its App Store business model against encroachments by direct Flash downloads.

In the Apple vs. Adobe battle of wills, Adobe seems to have blinked first: also on Monday, Adobe announced it is making a Flash to iPhone App translator, so that individual applications can be translated one at a time. This solves the iPhone performance issue but does not provide Flash-infested websites for iPhone/iPod Touch owners.

I once predicted that Apple could say no to Flash as long as RIM stood with it; by this time next year, iPhone users will stand alone. Will Apple announce Flash for the iPhone by then? If it’s a performance issue, eventually Apple will say yes. I think the odds are 1:3 that the iPhone still won’t have Flash on its 5th anniversary.

Monday, October 5, 2009

Startup Success: the importance of Plan B

A month ago, in my Engineering Entrepreneurship blog, I noted an essential skill that entrepreneurs need: the flexibility to find Plan B. Or as I wrote:

Many if not most tech entrepreneurs eventually face a wrenching problem: when do I give up on Plan A and go to Plan B?
As it turns out, a friend of mine, John Mullins has just co-authored a book called Getting to Plan B: Breaking Through to a Better Business Model, released last month.

John is still in London, but his co-author lives here in Silicon Valley and is familiar to both SV denizens and entrepreneurship students alike: Randy Komisar, author of The Monk and the Riddle.

Randy is talking about their new book Tuesday at 7pm in Menlo Park, in an event sponsored by SVASE. It’s an important topic, and I hope some of my readers will be able to make it.

Saturday, October 3, 2009

Snow Leopard: 2 steps back, and …

Like a lot of Mac fanboys, I went out and bought Snow Leopard (Mac OS X 10.6) when it came out. It was cheap ($49 for the family pack, $29 for the single copy) and supposedly was faster and more compact.

After installing it on one computer, so far as I can tell, there’s no there there. Yes, it’s 64 bit — which makes sense for a server but not my underpowered laptop or bottom of the line desktop.

What’s clear is that there are two major ways that 10.6 is less than 10.5. The first is eliminating PowerPC support. OK, we knew it was coming, since Apple announced it in the run-up to 10.6, and these are the oldest machines around. However, at our house, half of the machines in daily use are PowerPC (not counting two archaic laptops we keep in reserve). This makes the economics of the family pack more suspect.

It was also interesting that Rosetta — the PowerPC emulator for Intel Macs — is not installed by default, although the OS will install it for you when needed. Given Apple’s desire to wipe out compatibility modes as quickly as possible (cf. PowerPC, Classic, etc.), I suspect that Apple is trying to find out how many Macs actually have it running — e.g. from crash logs — and then will use that to decide how quickly they can kill it.

Which brings me to the other major step backwards. Snow Leopard deletes AppleTalk.

I know the Nexties who came to Apple in the reverse takeover of 1997 have been wanting to kill AppleTalk for years. When I was a developer, I heard numerous speeches that justified killing “legacy” code. So I can’t say they didn’t warn me.

Without AppleTalk, almost any laser printer can be used via LPR/LPD. But this is a giant step backwards, because LPR provides no feedback — paper out, paper jam — something that has been available in Mac network printing for 25 years.

The bigger problem is that AppleTalk is the only way to discover most older printers — including my 2002 laser printer at home. So if I go to a strange office and don’t have AppleTalk on my laptop, I won’t be able to find the printers unless I can manually figure out the IP address of the printer. This is very un-Mac-like.

So right now 10.6 is installed on the iMac, where it does little damage, but it’s definitely not going to mess up my main machine, the MacBook Air. Paying more for less? Doesn’t make a lot of sense to me.

Friday, October 2, 2009

iPhone, Android ride the rising tide

The latest AdMob statistics show that the iPhone share of global mobile web browsing has risen from 33% to 40%. Interestingly, its share in Western Europe (67%) and Oceania is higher than in North America (52%).

By eyeballing it, the iPhone’s impact is diluted due to an impressive 13% share here for Android. (Android is only 6% in Western Europe). Since February, Android has jumped from 2% to 7% worldwide, in part due to the new HTC myTouch (Magic).

Palm’s webOS jumped from 0 to 4% in the same period, or 9% in North America, which suggests that North America is 45% of the sample.
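A back-of-the-envelope check on that 45% figure, using only the webOS numbers: if webOS is 4% of worldwide requests but 9% of North American requests, and nearly all webOS traffic comes from North America (an assumption on my part, since the Pre was a US-only phone at the time), then North America's weight in the sample is roughly 4/9:

```python
# webOS share of mobile web requests in the AdMob sample
webos_worldwide = 0.04      # 4% of worldwide requests
webos_north_america = 0.09  # 9% of North American requests

# If essentially all webOS traffic is North American, then
# worldwide share = NA share * (NA fraction of sample), so:
na_fraction = webos_worldwide / webos_north_america

print(f"North America is roughly {na_fraction:.0%} of the sample")
# prints roughly 44%, consistent with the 45% estimate above
```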

GigaOM and others have analyzed the data. Here’s a few points I haven’t seen.

First, market share data doesn’t capture the effect of growing the market. Windows, Symbian and RIM have lost share, but I don’t think this means fewer Blackberry owners are surfing the web. The AdMob data said that mobile web page requests grew 9% from July to August, translating to a 180% annualized growth rate.
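The annualized figure follows from compounding the monthly rate. A quick sketch of the arithmetic (the 9% month-over-month growth is from the AdMob report; compounding it over 12 months is my own calculation):

```python
monthly_growth = 0.09  # July-to-August growth in page requests

# Compound the monthly rate over 12 months to get an annualized rate
annualized_growth = (1 + monthly_growth) ** 12 - 1

print(f"Annualized growth: {annualized_growth:.0%}")
# about 181%, i.e. roughly the 180% figure cited above
```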

After many years of hopes and predictions, the mobile Internet is growing rapidly, and that growth is coming from the most exciting devices — the various incarnations of the iPhone, and the more recent (and more limited) Android and webOS choices. As others have noted, the latter two platforms seem to have lots of potential for growth.

This reinforces the point that Mike Mace and I made two years ago — that the iPhone found a way to deliver the mobile web that everyone was waiting for. Apple got there first by recognizing the failings of early offerings (see: WAP). By the way, this cements the dominance of WebKit — these three hot platforms are all WebKit, and when you add in Symbian (mostly WebKit on S60), that’s 85% of all mobile web browsing in this sample.

The data are somewhat distorted by the fact that these three growing platforms are being sold (at least in the US) with a bundled data plan at an additional $40-50/month — while it’s still possible to buy a Nokia or Windows Mobile (or Treo) smartphone without such a data bundle. There is a selection effect: the only people who buy these devices are those who expect to use the Internet enough to make that cost worthwhile.

The US carriers will eventually run out of people who want to permanently increase their phone bills $600 a year, so they will either have to loosen up on tethering (allowing 20-somethings to cancel their cable or DSL), cut their prices, or expect a cap in the growth of their 3G data usage.

In fact, one place where Android (so far) is not playing is the PDA market. The iPod Touch is about 40% of the total iPhone OS market share, and while they’re not being used to surf the web out in the national parks, they can be used to surf the web at home, work, a college campus or public library. The Zune HD has won good reviews — will this bring up the share of Windows Mobile more than any phone?

Finally, what is the future of Flash for the mobile web? Apple remains adamantly opposed, even if most of its rivals (save RIM) have embraced Adobe’s ubiquitous multimedia platform.

Personally, I’m hoping Apple wins this fight. At least as implemented by Adobe and used by web designers, Flash is an abomination — the single worst thing about the Internet. A computer, web browser and cable modem that would otherwise zip through the web suddenly grind to a halt because of some idiot’s self-centered effort to create catchy graphics in a futile hope of distinguishing their website. Phones have even slower connections and processors, so the last thing we need is to waste that limited bandwidth on websites that benefit the website owner rather than the web visitor.