Tuesday, January 28, 2014

When the brand trumps the product

In strategy, we often debate the cause and effect of success: how much is the product, and how much is the perception of the product? Fortune 500 companies spend billions trying to build the brand, in hopes that the brand builds loyalty beyond (or instead of) anything the product does.

This question has come up in higher education, but I’ve never heard the brand value put so starkly as in this FT article this morning:

“A degree has value only if the degree is scarce, and the MBA is completely unscarce,” says Jeffrey Pfeffer, professor of organizational behavior at Stanford Graduate School of Business.

Prof Pfeffer has published on a wide range of topics but is well known for taking on the industry in which he works – business education. He argues that schools’ reputations have suffered from promoting themselves as a route to enhanced future salaries. The professor has been pointing out for a decade that the value of a degree is linked to the prestige of an institution rather than what it teaches – but few people have been listening.

“People don’t hear what they don’t want to hear,” he says, adding that he nonetheless believes that unless you go to an elite school – by which he means one ranked in the top 15 worldwide – an MBA is a complete waste of money.

So, Prof. Pfeffer’s argument goes, the value of the MBA is the transfer of the elite brand to your resume.

My coauthor, fellow blogger, former Apple and Palm executive and entrepreneur Michael Mace listed the bundle of services in a degree:
universities bundle several services in that thing called a degree:
--Teaching the students
--Credentialing (ensuring that the students have learned the material)
--Giving the students social connections (Yale, Stanford)
--Helping young people turn into adults in a semi-safe setting

The last of these was on Mike’s mind as the parent of an undergraduate (now two), but the other three certainly apply to MBA programs.

By Pfeffer’s theory (or conjecture), the value of a Stanford MBA is the credential, not the actual content of the courses. This is consistent with what other academics have wondered: are Stanford MBA students successful merely because of the selective admissions, i.e. is all the value of the MBA added by the admissions office letter?

In fact, in the FT article a soured MBA graduate recommended just that. Mariana Zanetti, author of The MBA Bubble, told the FT:
“But I don’t think it’s the MBA that adds the value – it’s the selection process that makes the difference,” she says. She would even recommend getting a place at a top school and turning it down to prove you were of a high calibre without wasting money.

Even if Pfeffer is correct, he leaves out a crucial factor in calculating the net value-added: cost, both out of pocket and opportunity cost.

Some MBA programs are more expensive than others. Obviously, at the same cost-benefit ratio, a low-cost program doesn’t have to deliver as much benefit. Forbes calculates the ROI on Stanford, which has the second-highest price (after Harvard). The public schools on the usual top-25 lists are charging market prices, but the Forbes list identifies some state schools (like Iowa, Michigan State or Washington) that gouge their students less than others.

The second cost is the opportunity cost. No one would argue that Bill Gates or Mark Zuckerberg would have been more successful if they had finished Harvard. During the dot-com mania, Stanford and Harvard MBA students were dropping out to start companies — both learning on the job (in a way new graduates rarely do) and getting lottery tickets (stock options) that could allow them to retire before 30.
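
Since the net value-added argument is ultimately arithmetic, here is a back-of-the-envelope sketch (in Python) of how the calculation changes once tuition and foregone salary are both counted. Every number, and the helper function itself, is a hypothetical placeholder for illustration — not actual tuition figures, salaries, or anything from the Forbes rankings:

    # Back-of-the-envelope net value of a full-time MBA.
    # Every number below is a hypothetical placeholder, not real data.

    def mba_net_value(tuition_per_year, pre_mba_salary, post_mba_salary,
                      years_in_program=2, working_years=20, discount_rate=0.05):
        """Present value of the salary uplift, minus the out-of-pocket
        and opportunity costs of attending."""
        # Cost while enrolled: tuition paid, plus the salary foregone
        # by leaving the workforce (the opportunity cost).
        cost = sum((tuition_per_year + pre_mba_salary) / (1 + discount_rate) ** t
                   for t in range(years_in_program))
        # Benefit: the post-degree salary premium, discounted over a career.
        uplift = post_mba_salary - pre_mba_salary
        benefit = sum(uplift / (1 + discount_rate) ** t
                      for t in range(years_in_program,
                                     years_in_program + working_years))
        return benefit - cost

    # Two stylized programs with identical salary outcomes:
    print(mba_net_value(60_000, 80_000, 120_000))  # pricier elite program
    print(mba_net_value(25_000, 80_000, 120_000))  # cheaper state school

With identical salary outcomes, the cheaper program starts the calculation roughly $68,000 ahead in this toy example, which is the earlier point in concrete form: a low-cost program doesn’t have to deliver as much benefit to pay for itself.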

Whether or not the (self-serving) observations of elite b-school professors are correct, the question of the value added by education is one that needs to be addressed. However, students who don’t get an elite credential might get other value from the bundle of services, i.e. if they actually learn something in college or grad school. To me, the acid test is what middle-aged students do with their own money: their careers are determined by their prior experience, not any burnishing of their resume by a mid-career degree. Instead, they go to school to improve their own human capital. These students are a breath of fresh air for any teacher.

And overall, I think Pfeffer’s analysis (true or not) sets a terrible example for his students. It’s a serious mistake for any business — including a nonprofit — to focus strictly on the brand and forget about the quality of the product. Even Apple — held out as a master of PR and branding — was left for dead in the 1990s due to its product execution, and only came back (and once again changed the world) because it made stuff people wanted to buy.

Sunday, January 26, 2014

Nobody beats Amazon

This week, there were two data points on the irreversible transformation of distribution (and employment) wrought by Jeff Bezos.

The first came in Friday’s WSJ, reporting on Q4 results by Seattle’s fifth-largest company:

Starbucks Corp. reported slightly lower-than-expected revenue and same-store sales growth in its fiscal first quarter due, in part, to consumers' shift to online shopping during the holidays.

Howard Schultz, Starbucks's chief executive and chairman, told investors on Thursday that "2013 was the first holiday that many traditional retailers saw in-store foot traffic give way to online shopping in a very big way. Customers watched, waited, compared prices and then bought the brands and products they wanted online—frequently using a mobile device to do so" and that Starbucks is well-positioned to benefit from the shift thanks to its investment in mobile-payment options and electronic gift cards.

What does it mean to society if Starbucks — Starbucks! — is losing sales because people are shopping in their PJs rather than at the malls? Presumably Schultz will still get the commuters and business travelers, even if their seasonal bump is never again what it used to be.

The second data point came from entrepreneur Howard Lindzon, who makes his money collating stock insights on Twitter. In a (dead tree only) column in our local birdcage liner, he wrote this morning how “bricks and mortar” has become “bricks and mortuary,” as America’s largest e-tailer continues to put storefronts out of business:
Over time, more and more of us have become comfortable doing an increasing portion of our buying online. Amazon has won. With tortoise-like patience, it has conditioned us to expect low prices, vast selection, quick delivery and one-click transactions. It is hard to compete against it today. Ask Best Buy.

Best Buy was once the quintessential “big box” retailer. Of course, Blockbuster used to be the category killer for video rental, and look how that worked out.

Lindzon expressed skepticism about Best Buy’s plan to improve online sales efforts in hopes of catching Amazon:
Good luck competing with Amazon on price and selection or on customer awareness. They have already won this battle. Your only chance is customer service, but it has never been in Best Buy’s blood. It is hard to teach an old dog new tricks.

If I were still teaching undergraduate strategy, I’d point my seniors to the three essential truths here:
  1. The transformation of retailing is ongoing and unstoppable.
  2. Traditional retailers will be unable to out-Amazon Amazon, and only a handful will come close to competitive parity.
  3. The only hope firms have to compete with the volume leader is to offer superior service.
Some firms already value service, and may be able to find a way to monetize those competencies. For those that can’t or won’t, it’s hard to imagine a viable path forward (other than exit via M&A).

Friday, January 24, 2014

30 years of changing the world

Thirty years ago today, Steve Jobs unveiled the Macintosh. I didn’t see it live, but I bought the Time magazine report and put myself on the waiting list later that week.

It wasn’t much of a computer. At $2500, the Mac 128 was an overpriced and underpowered toy that didn’t do much other than print doodles on a dot-matrix printer. As I showed in my dissertation, it was another three years before Apple had something that could compete credibly with the IBM AT, and by that point the wheels were falling off Apple as an organization as a series of mediocre CEOs replaced Jobs.

That said, Apple Computer changed the world — not once, but several times over. And nearly all these breakthroughs occurred when Steve Jobs was in charge.

Other firms changed the world, too. From the PC era, that would include IBM, Intel, Microsoft, Compaq, VisiCorp and Lotus Development. When cellphones became smartphones, Nokia, Symbian and Research in Motion helped define the category before Apple released its first phone in 2007.

Some companies did a better job of creating a technology (or product category) than a sustainable business. The first Palm PDA was an incredible breakthrough, but in the end, the world didn’t want a PDA; it wanted an all-in-one communication and computing device.

Just like aerospace engineers during the space race, I’m pleased to have lived through this era with a front-row seat. I’m also pleased never (so far) to have owned a Windows machine, which (given the PC industry’s ongoing decline) is a record I’m likely to sustain to the end (Windows’ or mine).

Friday, January 10, 2014

Incremental progress against bureaucracy run amok

One of the principles of bureaucracy is that power, once seized, is never voluntarily relinquished.

A major problem for a small number of Americans is the tyranny of the local Institutional Review Board (aka “Human Subjects” committee). Originally created to prevent physical harm to humans from new substances or devices, these boards’ domain expanded to include psychological harm after the notorious Milgram experiments. As with any excess of power or authority, good intentions to solve a real problem were used to justify mission creep and a power grab.

This morning, our trade journal (the Chronicle of Higher Education) reported progress on this front:

January 10, 2014

National Research Council Panel Seeks Rules Overhaul on Human-Subjects Studies
By Paul Basken

The long-snarled bid to revise federal rules governing research on human subjects got a boost on Thursday with a National Research Council analysis that endorsed a substantial overhaul.

In a 139-page report, a 15-member panel assisted by dozens of field-specific experts, mostly from American universities, backed changes that include exempting many social-science researchers—such as those conducting oral histories—from the current set of rules.

The problem is that merely talking to (adult) people subjects researchers to this bureaucratic oversight. As the Chronicle reported in the summer of 2011:
Since the federal government began to establish stringent ethical oversight of medical research in the 1970s, institutional review boards like Hunter's have become a regular feature in the bureaucratic landscape of American colleges and universities, with the responsibility of safeguarding the rights of individuals who become subjects of research. While it is clear that vaccine trials and experimental drug tests demand a watchful eye from the outside, cases like Ms. [Bernadette] McCauley's are less cut and dried.

Institutional review boards have purview over any research on "human subjects" that creates "generalizable knowledge," but whether that murky definition encompasses or excludes conducting interviews, obtaining oral histories, or doing other types of humanities research involving living people has largely been left up to individual institutions, creating decades of inconsistent policy toward the disciplines.

"It's not clear to me what kind of harm you can do to somebody by interviewing them for an oral history," Mr. [Robert] Townsend says.

In the past, members of his organization have complained that institutional review boards insisted that historians keep their interview subjects anonymous to protect their privacy or avoid topics that might upset or embarrass them.

While perhaps reasonable in scientific or behavioral research, Mr. Townsend says, those demands go against the fiber of historical inquiry, where the credibility of an interview often depends on the researcher and reader knowing who the narrator is. And why, historians ask, shouldn't consenting adults be allowed to decide for themselves what subjects they feel comfortable discussing?

"A lot of the people on these review boards are amateurs when it comes to the humanities and social sciences," Mr. Townsend says. "So they filter those proposals narrowly through the written rules to make sure they are compliant."

That can lead to boards imposing onerous conditions on scholars conducting interviews, like requiring them to provide an advance list of the exact questions they will ask—something that can be nearly impossible to do for a personal interview that is intended to be open-ended. And review boards have been known to tell researchers that they must destroy their tapes after the research project is complete to safeguard their subjects.

Unfortunately, the latest article implies that little progress has been made in the past 29 months.

Perhaps the publication of the report by the National Academies will prompt local IRBs to reconsider. However, given the self-perpetuating nature of bureaucracies and the glacial rate of change in academia, there’s little reason for hope.

Note to readers: I apologize for the slow rate of postings during the fall, due to increased professional and personal responsibilities. In the next month, I hope to resume weekly (or nearly weekly) postings.