Oreos and Edward Snowden: The coming crisis for companies
[Editor's note: Nicco Mele is the author of The End of Big and will be delivering the opening keynote for the KMWorld 2013 Conference (kmworld.com/conference/2013) on Nov. 6 at the Renaissance Downtown Hotel in Washington, D.C.]
In 1912, Scott's expedition reached the South Pole. The Titanic sank. And the Oreo cookie was invented. In 2012, Oreo celebrated the 100th birthday of the cookie—on Facebook. Each day for 100 days the brand ran a new online ad celebrating a different "twist" on the iconic cookie. The images in the ads celebrated everything from Shark Week to the Mars Rover to Gay Pride. Oreo's social media presence exploded, going from a handful of "likes" on Facebook to more than 34 million fans today. Long after the campaign ended, engagement around the brand continues to significantly exceed the online activity prior to the campaign.
Oreo sells about 7.5 billion cookies a year—around 20.5 million a day. Everybody loves an Oreo: They're even "accidentally vegan." So why does Oreo only have 34 million fans on Facebook?
By some measures, 34 million fans for Oreo is a lot. There is hardly a television show in the country that draws that kind of audience. But that 34 million is a global audience. Oreo is sold in 100 countries, and to review the daily comments from fans is to experience a towering Babel of languages.
The top 25 most popular "fan" pages on Facebook include just one big company: Coca-Cola. Among the top 100 Twitter accounts with the most followers, there is not a single large consumer product brand, and almost no large companies or organizations. They are nearly all individuals: celebrities, journalists and politicians, but also some ordinary folks.
Focus on people
Social media is inherently, by design, a personal medium. It fosters individuals—and it hates institutions, including brands. The technology is designed to empower and engage individuals, and to better connect people to each other. Your space online—whether it is social media or your inbox—is an intimate space. Paradoxically, it is not private, but it is intimate. We're naturally resistant to having institutions interrupt our conversations in person—and so we don't really want to "engage" with big organizations online.
When crafting a social media strategy for your organization, focus on individuals, on people—not on the faceless organization. How do you empower the people in your organization to better share their knowledge? How do you build their personal brands and online reputation while at the same time building your organization's brand and reputation? The paradox is that while the fundamental unit of social media—and indeed most of the digital universe—is the person, this creates real and significant challenges for organizations and institutions that seek credibility, authority and value in the digital world.
One of the challenges is that we don't even have an adequate vocabulary to talk about what's happening. The word "technology" is weak; a wheel is "technology" and so is the printing press, whereas our present-day "technology" collapses time, distance and lots of other barriers. "Networked" doesn't quite capture the dramatic global reach, the persistent presence, the mobile nature of the world we're living in. You often hear "social" used in connection with technology—social media, social business, social sharing. But it is not actually social, and that word is a weak way to describe the larger disruption we're facing.
Sometimes people utter the catchall phrase "digital," but it's not clear what that means either; remember the "digital" watches of the 1980s? "Open" sounds good: open government, open source politics, open source policy. But WikiLeaks brings severe diplomatic and political consequences that "open" doesn't capture. Just because something is "machine readable" or online doesn't mean most people can access it. Also, "openness" describes the result of technology, but it ignores the actual code that underlies most of this work—code that is closed and inaccessible to people who aren't programmers or developers.
I've settled on the words "radical connectivity" to describe what is happening today. We are all connected, all the time, with a radical intensity and constant presence. It is powerful, exhilarating and terrifying. The sheer volume associated with radical connectivity creates enormous challenges for leaders—how do you keep up with the overflowing inbox, the constant stream of information? How do you make good decisions and keep your organization focused and performing amidst distraction and a technology that empowers the individual at the expense of existing institutions?
This all started a long time ago. We take "personal," individual computers for granted, but in the history of computer science, they are a relatively recent arrival. During the 1940s and 1950s, most computer science took place inside large organizations—militaries, large corporations, universities. Even by the late 1960s, a freshly minted computer nerd looking for a job would likely have gone into a large institution. That's because computers were giant devices requiring a substantial amount of money and real estate.
In 1969, Control Data Corporation began selling the CDC 7600, a supercomputer designed by Seymour Cray with a base price of about $5 million. Imagine one of today's stainless steel side-by-side refrigerators. Now take two of them and put them next to one another. You've got a wall of refrigerator. Now add two more to the left and two more to the right. Build yourself a large office cubicle with walls made of giant stainless steel refrigerators and you've got the CDC 7600. Its clock speed was about 36.4 MHz; the iPhone 4's chip runs at 1 GHz—nearly 30 times faster—and does far more work per cycle, so the phone in your pocket handily outclasses that room-filling machine. Not many places could afford those bad boys, and they used them not to watch and share videos or listen to the neighbor's kid's noisy garage band, but for complex calculations in mathematics, nuclear physics and other disciplines.