How Television Ratings Portend the Death of Mass Media

First published on FEED, 10/14/1999.

The fall Nielsen ratings are now in for the new crop of prime time TV shows, and in what is becoming a predictable pattern, the results are underwhelming. Despite the promotional avalanche that accompanies every new series, this year’s new shows failed to attract as much attention as the hits of previous seasons. This steady decline in popularity of each successive season’s offerings is fairly dramatic. Not since “Seinfeld” has a show managed to reach one out of four American households. The top three (“E.R.,” “Frasier,” “Friends”) only reach about one household in five, six, and seven, respectively, and even those numbers top anything in the current season. The most popular new show (“Stark Raving Mad”) reached barely one in eight. There’s no mystery about this process: Proliferation of choices divides attention, and every year sees increased choices on cable, the video store, and the web. The mystery is that the price to run a commercial on one of these hit shows is at an all-time high.

If the most popular shows reach fewer people every year, why are advertisers willing to pay more for commercial slots on those shows? The first and simplest reason is that the dot-coms are pouring a huge amount of money into advertising, and driving up prices. But the more interesting answer has to do with the large-scale splintering of mass media. Broadcast networks and advertisers are used to buying audience attention in bulk, and in prime time bulk means at least 20 million households at a time. These massive audiences for prime time shows, developed when the entire TV universe included just ABC, NBC, and CBS, provided the kind of reach that made it possible for the General Foods of the world to turn a new brand into a household word in a few weeks, by simply buying the attention of a quarter of the country at once. As the shows which can offer this kind of mass audience disappear, things are getting trickier for the networks — they can sell the advertiser two shows that each reach an eighth of the country, but how can they prove that you are reaching two different people and not just the same person twice? They can’t. The symbiotic relationship between mass media and mass marketers is being threatened by the increasing number of media outlets, which sub-divides the mass audience into smaller and smaller niches.

Broadcast TV can charge higher prices for fewer households because the mass marketers simply have nowhere else to go. Despite the increasingly anemic performance of the most popular TV shows, there isn’t another medium that offers the option of reaching 10 million households at the same time with a single ad — even giant portal sites fragment their reach across a number of offerings. Indie mutiny has been in the works for years, but these recent Nielsen numbers may be a cultural bellwether — among the first concrete data to confirm the decay of mass media. Like a small island with a receding coastline, the total area the broadcast networks cover shrinks every year, but for mass marketers, it’s still the only ground above water. It is obvious that both the networks and their advertisers are soon going to have to adapt to a fragmented media market where nothing regularly reaches 20 million people, and the only way to get mass will be niche plus niche plus niche. In the meantime, though, old habits die hard, and it is these old habits of looking for mass that are driving ad rates up for hit shows even as those shows lose the very audience that makes them valuable. If the truth in advertising law had any real teeth, CBS would have to change the name to “Almost One Person In Eight Loves Raymond.” Don’t expect even that to last.

Kasparov vs. The World

10/21/1999

It was going to be acres of good PR. After the success of Garry Kasparov’s chess
matchup with IBM’s Deep Blue, Microsoft wanted to host another computerized chess match this summer — Kasparov vs. The World. The setup was simple: Kasparov, the John Henry of the Information Age, would play white, posting a move every other day on the Microsoft chess BBS. “The World” consisted of four teenage chess experts who would analyze the game and recommend counter-moves, which would also be posted on the BBS. Chess aficionados from around the world could then log in and vote for which of the four moves Black should play. This had everything Microsoft could want — community, celebrity, online collaboration, and lots of “Microsoft hosts The World!” press releases. This “experts recommend, The World votes” method worked better than anybody dared hope, resulting in surprisingly challenging chess and the ascension of one of The World’s experts, Irina Krush, into chess stardom. Things were going well up until last week, when Microsoft missed a crucial piece of email and the good PR began to hiss out of the event like helium from a leaky balloon.

While Deep Blue was a lone computer, here Kasparov’s opponent was to be the chess
community itself, a kind of strategic “group mind.” Since communication was the glue
that held the community together, it’s fitting that the game came unglued after a missed email. During last week’s endgame, it was generally agreed that The World had made a serious tactical error on move 52, but that there was still the possibility of a draw. Then, on October 13th, Ms. Krush’s recommendation for move 58 was delayed by mail server problems, problems compounded by a further delay in posting the information on the Microsoft server. Without Ms. Krush’s input, an inferior move was suggested and accepted, making it obvious that despite the rhetoric of collaboration, the game had become Kasparov v. Krush with Kibitzing by The World. Deprived of Ms. Krush’s strategic vision, the game was doomed. The World responded to this communication breakdown with collective hara-kiri, with 66% of the team voting for a suicidal move. Facing the possibility of headlines like “The World Resigns from Microsoft,” the corporate titan rejected the people’s move and
substituted one of its own. The World, not surprisingly, reacted badly.

Within hours of Microsoft’s reneging on the vote, a protest movement was launched,
including press releases, coordinating web sites, and even a counter-BBS which archived articles from the Microsoft chess server before they expired. Microsoft had run afoul of the first rule of online PR: On the internet, there is no practical difference between “community” and “media”; anyone with an email address is a media outlet, a tiny media outlet to be sure, but still part of the continuum. Since online communities and online media outlets use the same tools — web sites, mailing lists, BBS’s — the border between “community interest” and “news” is far easier to cross. The Microsoft story took less than a week to go from the complaints of a few passionate members of the chess community to a story on the BBC.

Microsoft made the same famously bad bet that sidelined Prodigy: By giving The World a forum for expressing itself, it assumed that The World’s gratitude would prevent criticism of its host, should anything go wrong. As Rocky the Flying Squirrel would say, “That trick never works.” What started as a way to follow on IBM’s success with Deep Blue has become a more informative comparison than even Microsoft knew. Two computerized chess games against the World Champion — one meant to display the power of computation, the other the power of community — and the lesson is this: While computers sometimes behave the way you want them to, communities never do. Or, as the Microsoft PR person put it after the game ended: “Live by the internet, die by the internet.”

An Open Letter to Microsoft

First published on SAR, 09/99.

Dear Microsoft,

I tried to give you some of my money last weekend and you wouldn’t let me. As this sort of behavior might be bad for your long-term profitability, I thought I’d write and explain how you can fix the problem.

Last Sunday night while visiting friends, I remembered that I was running out of time for a discount plane ticket, so I opened Internet Explorer 3.0, the browser running on my friend’s machine, and went to Expedia to make my purchase. (I trust it will not be lost on you that both IE and Expedia are Microsoft products). After submitting my flight details on Expedia, the results page created so many browser errors that I couldn’t even see what flights were available, much less buy a ticket.

Do you understand that? I wanted to use your product on your service to give you my money, and you wouldn’t let me. Being a good citizen, I wrote to the Customer Service address, expecting no more than a piece of ‘Thanks, we’ll look into it’ mail. What I got instead was this:

“Thank you for contacting Expedia with your concerns. Since Microsoft has upgraded the website, they have also upgraded the version you should use with Expedia. The version will be more user-friendly with Internet Explorer 4.0 and above.”

Now before you think to yourselves, “Exactly right — we want to force people to get new browsers”, let me contextualize the situation for you:

I cannot use IE 3.0 to buy tickets on Expedia, but…

I can use IE 3.0 to buy tickets on Travelocity.
I can use Netscape 3.0 to buy tickets on Expedia.
I can even use Netscape 3.0 to buy tickets on Travelocity.

Let me assure you that I bought my tickets as planned. I just didn’t buy them with you.

I understand from what I read in the papers that your desktop monopoly has atrophied your ability to deal with real competition. Get over it. If you really want to start building customer-pleasing businesses on the Web (and given the experience I had on Expedia, I am not sure you do) then let me clue you in to the awful truths of the medium.

First, you need to understand that the operating system wars are over, and operating systems lost. Mac vs. Windows on PCs? Dead issue. Linux vs. NT? Nobody but geeks knows or cares what a web site is running on. There’s no such thing as a “personal” computer any more – any computer I can get to Amazon on is my computer for as long as my hands are on the keyboard, and any operating system with a browser is fine by me. Intentionally introducing incompatibilities isn’t a way to lock people in any more, it’s just a way to piss us off.

Second, media channels don’t get “upgrades”. You have made a lot of foolish comments about AOL over the years, because you have failed to understand that AOL is a media company, not a software company. They launched AOL 1.0 at about the same time as you launched MS Word 1.0, and in all that time AOL is only up to version 4.0. How many “upgrades” has Word had – a dozen? I’ve lost count. What AOL understands that you don’t is that software is a means and not an end. As a media company, you can no longer force people to change software every year.

Third, and this is the most awful truth of all, your customers are the ones with the leverage in this equation, because your competition is just a click away. We get to decide what browser we should use to buy plane tickets or anything else for that matter, and guess what? The browser we want to use is whatever browser we have whenever we decide we want to buy something. Jeff Bezos understands this – Amazon works with Netscape 1.22. Jerry Yang understands this – there is not a single piece of javascript on the Yahoo home page. Maybe somebody over at Microsoft will come to understand this too.

Thinking that you can force us to sit through a multi-hour “upgrade” of a product we aren’t interested in solely for the pleasure of giving you our money is daft. Before you go on to redesign all your web sites to be as error-prone as Expedia, ask yourself whether your customers will prefer spending a couple of hours downloading new software or a couple of seconds clicking over to your competition. Judging from the mail I got from your Customer Service department, the answer to that question might come as a big surprise.

Yours sincerely,

Clay Shirky

The Internet and Hit-driven Industries

First published on SAR, 07/99.

ALERT! New community-based email virus on the loose!

This virus, called the “DON’T GO” virus, primarily targets major Hollywood studios, and is known to proliferate within 24 hours of the release of a mediocre movie. If you receive a piece of email from a friend with a subject line like “Notting Hill: Don’t Go”, do not open it! Its contents can cause part of your memory to be erased, replacing expensive marketing hype with actual information from movie-goers. If this virus is allowed to spread, it can cause millions of dollars of damage to a Hollywood studio in a single weekend.

Hit-driven industries (movies, books, music, et al.) are being radically transformed by Internet communities, because the way these industries make money is directly threatened by what Internet communities do best – move word of mouth at the speed of light. Internet communities are putting so much information in the hands of the audience so quickly that the ability of studios, publishing houses, and record labels to boost sales with pre-release hype is diminishing even as the costs of that hype are rising.

Consider Hollywood, the classic hit-driven industry. The financial realities can be summed up thusly:

Every year, there are several major flops. There is a horde of movies that range from mildly unprofitable to mildly profitable. There is a tiny core of wildly profitable hits. The hits pay for everything else.

This is true of all hit-driven businesses – Stephen King’s books more than earn back what Marcia Clark lost, the computer game Half-Life sells enough to pay for flops like Dominion, Boyzone recoups the Spice Girls’ latest album, and so on. Many individual works lose money, but the studios, publishers, or labels overall turn a profit. Obviously, the best thing Hollywood could do in this case would be to make all the movies worth watching, but as the perennial black joke goes, “There are three simple rules for making a successful movie, but unfortunately nobody knows what they are.” Thus the industry is stuck managing a product whose popularity they can’t predict in advance, and they’ve responded by creating a market where the hits more than pay for the flops.

For Hollywood, this all hinges on the moment when a movie’s quality is revealed: opening weekend. Once a movie is in the theaters, the audience weighs in and its fate is largely sealed. Opening weekend is the one time when the producers know more about the product than the audience — it isn’t until Monday morning water cooler talk begins that a general sense of “thumbs up” or “thumbs down” becomes widespread. Almost everything movie marketers do is to try to use the media they do control — magazine ads, press releases, commercials, talk show spots — to manipulate the medium they don’t control — word of mouth. The last thing a studio executive wants is to allow a movie to be judged on its merits — they’ve spent too much money to leave things like that to the fickle reaction of the actual audience. The best weapon they have in this fight is that advertising spreads quickly while word of mouth spreads slowly.

Enter the Internet. A movie audience is a kind of loose community, but the way information is passed — phone calls, chance encounters at the mall, conversations in the office — makes it a community where news travels slowly. Not so with email, news groups, and fan web pages — news that a movie isn’t worth the price of admission can spread through an Internet community in hours. Waiting until Monday morning to know about a movie’s fate now seems positively sluggish — a single piece of email can be forwarded 10 times, a newsgroup can reach hundreds or even thousands, a popular web site can reach tens of thousands, and before you know it, it’s Saturday afternoon and the people are staying away in droves.
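To put the speed difference in rough numbers, here is a small sketch of how forwarding compounds (the fan-out figures are assumptions for the sake of the example, not data from anywhere): each round of “Don’t Go” email multiplies the audience reached, which is why a Friday-night verdict can blanket a community before the Saturday matinee.

    # A toy model of word-of-mouth fan-out; the numbers are illustrative
    # assumptions, not figures from the essay.

    def reach_after_rounds(initial_senders: int, fanout: int, rounds: int) -> int:
        """Total people reached after `rounds` of forwarding."""
        reached = initial_senders
        current = initial_senders
        for _ in range(rounds):
            current *= fanout   # each current reader forwards to `fanout` friends
            reached += current
        return reached

    # One disappointed moviegoer, forwarding to 10 friends, three rounds deep:
    print(reach_after_rounds(initial_senders=1, fanout=10, rounds=3))  # 1111 people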

This threat — that after many months and many millions of dollars the fate of a movie can be controlled by actual fans’ actual reactions — is Hollywood’s worst nightmare. There are two scenarios that can unfold in the wake of this increased community power: The first scenario (call it “Status Quo Plus”) is that the studios can do more of everything they’re already doing: more secrecy about the product, more pre-release hype, more marketing tie-ins, more theaters showing the movie on opening weekend. This has the effect of maximizing revenues before people talk to their friends and neighbors about the film. This is Hollywood’s current strategy, which hit its high-water mark with the marketing juggernaut of The Phantom Menace. The advantage of this strategy is that it plays to the strengths of the existing Hollywood marketing machine. The disadvantage of this strategy is that it won’t work.

Power has moved from the marketer to the audience, and there is no way to reverse that trend, because nothing is faster than the Internet. The Internet creates communities of affinity without regard to geography, and if you want the judgement of your peers you can now get it instantly. (Star Wars fan sites were posting reactions to “The Phantom Menace” within minutes of the end of the first showing.) Furthermore, this is only going to get worse, both because the Internet population is still rising rapidly and because Internet users are increasingly willing to use community recommendations in place of the views of the experts, and while experts can be bought, communities can’t be. This leaves the other scenario, the one that actually leverages the power of Internet communities: let the artists and the fans have more direct contact. If the audience knows instantly what’s good and what isn’t, let the creators take their products directly to the audience. Since the creators are the ones making the work, and the community is where the work stands or falls, much of what the studio does only adds needless expense while inhibiting the more direct feedback that might help shape future work. Businesses that halve the marketing budget and double the community outreach will find that costs go down while profits from successful work go up. The terrible price of this scenario, though, is that flops will fail faster, and the studio will have to share more revenue with the artist in return for asking them to take on more risk.

The technical issues of entertainment on the Internet are a sideshow compared to community involvement — the rise of downloadable video, MP3 or electronic books will have a smaller long-term effect on restructuring hit-driven industries than the fact that there’s no bullshitting the audience anymore, not even for as long as a weekend. We will doubtless witness an orgy of new marketing strategies in the face of this awful truth — coupons for everything a given movie studio or record label produces in a season, discounts for repeat viewings, Frequent Buyer Miles, on and on — but in the end, hit driven businesses will have to restructure themselves around the idea that Internet communities will sort the good from the bad at lightning speed, and only businesses that embrace that fact and work with word of mouth rather than against it will thrive in the long run.

An Open Letter to Jakob Nielsen

First published on ACM, 06/99.

[For those not subscribing to CACM, Jakob Nielsen and I have come down on opposite sides of a usability debate. Jakob believes that the prevalence of bad design on the Web is an indication that the current method of designing Web sites is not working and should be replaced or augmented with a single set of design conventions. I believe that the Web is an adaptive system and that the prevalence of bad design is how evolving systems work.

Jakob’s ideas are laid out in “User Interface Directions for the Web”, CACM, January 1999.

My ideas are laid out in “View Source… Lessons from the Web’s Massively Parallel Development”, networker, December 1998, and http://www.shirky.com/writings/view-source

Further dialogue between the two of us is in the Letters section of the March 1999 CACM.]

Jakob,

I read your response to my CACM letter with great interest, and while I still disagree, I think I better understand the disagreement, and will try to set out my side of the argument in this letter. Let me preface all of this by noting what we agree on: the Web is host to some hideous dreck; things would be better for users if Web designers made usability more of a priority; and there are some basics of interface usability that one violates at one’s peril.

Where we disagree, however, is on both attitude and method – for you, every Web site is a piece of software first and foremost, and therefore in need of a uniform set of UI conventions, while for me, a Web site’s function is something determined only by its designers and users – function is as function does. I think it presumptuous to force a third party into that equation, no matter how much more “efficient” that would make things.

You despair of any systemic fix for poor design and so want some sort of enforcement mechanism for these external standards. I believe that the Web is an adaptive system, and that what you deride as “Digital Darwinism” is what I would call a “Market for Quality”. Most importantly, I believe that a market for quality is in fact the correct solution for creating steady improvements in the Web’s usability.

Let me quickly address the least interesting objection to your idea: it is unworkable. Your plan requires both centralization and force of a sort it is impossible to achieve on the Internet. You say

“…to ensure interaction consistency across all sites it will be necessary to promote a single set of design conventions.”

and

“…the main problem lies in getting Web sites to actually obey any usability rules.”

but you never address who you are proposing to put in the driver’s seat – “it will be necessary” for whom? “[T]he main problem” is a problem for whom? Not for me – I am relieved that there is no authority who can make web site designers “obey” anything other than httpd header validity. That strikes me as the Web’s saving grace. With the Web poised to go from 4 million sites to 100 million in the next few years, as you note in your article, the idea of enforcing usability rules will never get past the “thought experiment” stage.

However, as you are not merely a man of action but also a theorist, I want to address why I think enforced conformity to usability standards is wrong, even in theory. My objections break out into three rough categories: creating a market for usability is better than central standards for reasons of efficiency, innovation, and morality.

EFFICIENCY

In your letter, you say “Why go all the way to shipping products only to have to throw away 99% of the work?” I assume that you meant this as a rhetorical question – after all, how could anybody be stupid enough to suggest that a 1% solution is good enough? The Nielsen Solution – redesign for everybody not presently complying with a “single set of design conventions” – takes care of 100% of the problem, while the Shirky Solution, let’s call it evolutionary progress for the top 1% of sites, well what could that possibly get you?

1% gets you a surprising amount, actually, if it’s properly arranged.

If only the top 1% most trafficked Web sites make usability a priority, those sites would nevertheless account for 70% of all Web traffic. You will recognize this as your own conclusion, of course, as you have suggested on UseIt (http://www.useit.com/alertbox/9704b.html) that Web site traffic is roughly a Zipf distribution, where the thousandth most popular site only sees 1/1000th the traffic of the most popular site. This in turn means that a very small percentage of the Web gets the lion’s share of the traffic. If you are right, then you do not need good design on 70% of the web sites to cover 70% of user traffic; you only need good design on the top 1% of web sites to reach 70% of the traffic.
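As a quick sanity check on that arithmetic, here is a short sketch, assuming a pure 1/rank Zipf distribution and the 4-million-site figure cited above (both idealizations), showing that the top 1% of sites does indeed carry roughly 70% of the traffic under that model.

    # A back-of-the-envelope check of the Zipf argument: assume the k-th most
    # popular site gets traffic proportional to 1/k, then measure how much of
    # the total the top 1% of sites captures. The 4,000,000 figure is the site
    # count cited in the letter; the distribution itself is an idealization.

    def zipf_share_of_top(num_sites: int, top_fraction: float) -> float:
        """Fraction of total traffic captured by the top `top_fraction` of sites."""
        top_count = max(1, int(num_sites * top_fraction))
        top_weight = sum(1.0 / rank for rank in range(1, top_count + 1))
        rest_weight = sum(1.0 / rank for rank in range(top_count + 1, num_sites + 1))
        return top_weight / (top_weight + rest_weight)

    print(f"{zipf_share_of_top(4_000_000, 0.01):.0%}")  # roughly 71%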

By ignoring the mass of low traffic sites and instead concentrating on making the popular sites more usable and the usable sites more popular, a market for quality is a more efficient way of improving the Web than trying to raise quality across the board without regard to user interest.

INNOVATION

A market for usability is also better for fostering innovation. As I argue in “View Source…”, good tools let designers do stupid things. This saves overhead on the design of the tools, since they only need to concern themselves with structural validity, and can avoid building in complex heuristics of quality or style. In HTML’s case, if it renders, it’s right, even if it’s blinking yellow text on a leopard-skin background. (This is roughly the equivalent of letting the reference C compiler settle arguments about syntax – if it compiles, it’s correct.)

Consider the use of HTML headers and tables as layout tools. When these practices appeared, in 1994 and 1995 respectively, they infuriated partisans of the SGML ‘descriptive language’ camp who insisted that HTML documents should contain only semantic descriptions and remain absolutely mute about layout. This in turn led to white-hot flamefests about how HTML ‘should’ and ‘shouldn’t’ be used.

It seems obvious from the hindsight of 1999, but it is worth repeating: Everyone who argued that HTML shouldn’t be used as a layout language was wrong. The narrowly correct answer, that SGML was designed as a semantic language, lost out to the need of designers to work visually, and they were able to override partisan notions of correctness to get there. The wrong answer from a standards point of view was nevertheless the right thing to do.

Enforcing any set of rules limits the universe of possibilities, no matter how well intentioned or universal those rules seem. Rules which raise the average quality by limiting the worst excesses risk ruling out the most innovative experiments as well by insisting on a set of givens. Letting the market separate good from bad leaves the door open to these innovations.

MORALITY

This is the most serious objection to your suggestion that standards of usability should be enforced. A web site is an implicit contract between two and only two parties – designer and user. No one – not you, not Don Norman, not anyone – has any right to enter into that contract without being invited in, no matter how valuable you think your contribution might be.

IN PRAISE OF EVOLVEABLE SYSTEMS, REDUX

I believe that the Web is already a market for quality – switching costs are low, word of mouth effects are both large and swift, and redesign is relatively painless compared to most software interfaces. If I design a usable site, I will get more repeat business than if I don’t. If my competitor launches a more usable site, it’s only a click
away. No one who has seen the development of Barnes and Noble and Amazon or Travelocity and Expedia can doubt that competition helps keep sites focussed on improving usability. Nevertheless, as I am a man of action and not just a theorist, I am going to suggest a practical way to improve the workings of this market for usability –
let’s call it usable.lycos.com.

The way to allocate resources efficiently in a market with many sellers (sites) and many buyers (users) is competition, not standards. Other things being equal, users will prefer a more usable site over its less usable competition. Meanwhile, site owners prefer more traffic to less, and more repeat visits to fewer. Imagine a search engine that weighted usability in its rankings, where users knew that a good way to find a usable site was by checking the “Weight Results by Usability” box and owners knew that a site could rise in the list by offering a good user experience. In this environment, the premium for good UI would align the interests of buyers and sellers around increasing quality. There is no Commissar of Web Design here, no International Bureau of Markup Standards, just an implicit and ongoing compact between users and designers that improvement will be rewarded.
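Here is a minimal sketch of how such a “Weight Results by Usability” box might work, assuming each site already carries a relevance score from the index and a usability score from somewhere (user ratings, a seal of approval); the names and the blend weight are illustrative assumptions, not a description of any real engine.

    # Hypothetical ranking sketch: blend an ordinary relevance score with a
    # usability score when the user asks for it. All names and weights here
    # are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class Site:
        url: str
        relevance: float   # 0.0 - 1.0, from the normal search index
        usability: float   # 0.0 - 1.0, from user ratings, seals of approval, etc.

    def rank(results: list[Site], weight_usability: bool, blend: float = 0.3) -> list[Site]:
        """Sort results, optionally mixing usability into the score."""
        def score(site: Site) -> float:
            if not weight_usability:
                return site.relevance
            return (1 - blend) * site.relevance + blend * site.usability
        return sorted(results, key=score, reverse=True)

    results = [
        Site("example-store.com", relevance=0.9, usability=0.2),
        Site("rival-store.com", relevance=0.8, usability=0.9),
    ]
    print([s.url for s in rank(results, weight_usability=True)])  # rival-store.com ranks first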

The same effect could be created in other ways – a Nielsen/Norman “Seal of Approval”, a “Usability” category at the various Web awards ceremonies, a “Usable Web Ring”. As anyone who has seen “Hamster Dance” or an emailed list of office jokes can tell you, the net is the most efficient medium the world has ever known for turning user preference into widespread awareness. Improving the market for quality simply harnesses that effect.

Web environments like usable.lycos.com, with all parties maximizing preferences, will be more efficient and less innovation-dampening than the centralized control which would be necessary to enforce a single set of standards. Furthermore, the virtues of such a decentralized system mirror the virtues of the Internet itself rather than fighting them. I once did a usability analysis on a commerce site which had fairly ugly graphics but a good UI nevertheless. When I queried the site’s owner about his design process, he said “I didn’t know anything when I started out, so I just put up a site with an email link on every page, and my customers would mail me suggestions.”

The Web is a marvelous thing, as is. There is a dream dreamed by engineers and designers everywhere that they will someday be put in charge, and that their rigorous vision for the world will finally overcome the mediocrity around them once and for all. Resist this idea – the world does not work that way, and the dream of centralized control is only pleasant for the dreamer. The Internet’s ability to be adapted slowly, imperfectly, and in many conflicting directions all at once is precisely what makes it so powerful, and the Web has taken those advantages and opened them up to people who don’t know source code from bar code by creating a simple interface design language.

The obvious short-term effect of this has been the creation of an ocean of bad design, but the long-term effects will be different – over time bad sites die and good sites get better, so while the short-term advantages of enforced standards seem tempting, we would do well to remember that there is rarely any profit in betting against the power of the marketplace in the long haul.

Social Computing in Student Populations

Shortly after I sold my company (I was the Chief Technology Officer of a Web design firm in Manhattan’s Silicon Alley), Hunter College hired me to teach classes in Web design and uses of the Internet. One of the first things I did while setting up my lab was to drag an old computer into the hall outside my office, connect it to the Internet, and give students unfettered access to it. My thought was to bring the students into the present — little did I know that they would show me the future. 

For a few years now, there has been a deathwatch for the personal computer. It is an article of faith in much of my industry that every 15 years or so sees a major shift in computing hardware, and since the PC supplanted the mainframe in the early 80s, it’s time, or so the thinking goes, for the PC in turn to give way to the “Next Big Thing”, as these shifts are sometimes jokingly called. 

That shift is happening today — I can see it in the way my students are using the computer outside my office door — but unlike the previous revolution, this change is not about hardware but about patterns of use. The age of the personal computer is ending because computers are no longer personal, they’re social. My students are using the Web not just as a research tool but as a way to communicate with one another and to organize their lives. Web-based email, from companies like Hotmail and Yahoo, forms the center of their computer use; they use it for everything from communicating with professors to keeping in touch with family overseas to organizing forays to the pizza parlor. 

Furthermore, they’re not just using computers to run their social lives; they’ve begun to treat computers themselves as social objects, like phones. If you need to check your voice mail, it doesn’t matter to you what phone you use; any phone with an outside line is the same as any other. Even if it’s someone else’s phone, and they’ve programmed the speed dialer to suit themselves, you can still use it to get to your messages. In the same way, any computer with an “outside line” to the Internet is as good as any other for checking email, even if someone else owns that computer and they’ve got their own programs on the hard drive. From my students’ point of view, a computer is merely a way to get access to their stuff (email, web syllabi, online research, et al.) rather than being a place to store their stuff. Hotmail wins over Eudora because with web-based email, they can get their mail from any computer they happen to be at, whether they own it or not.

This seemingly minor change has major ramifications for academic uses of computers. If a computer is for access to a network and not for storage or local computation, then the definition of a “good” computer changes. Any computer connected to the Internet is as good as any other computer connected to the Internet, and any computer not connected to the Internet is no good at all. Furthermore, the impact on student use is enormous. If computers are social objects like phones are, then any computer is their computer as long as their hands are on the keyboard. This is the real demise of the “personal” computer – many of my students don’t have their own computers, but that doesn’t stop them from running their lives via email. 

This isn’t just a question of affordability – the PC is coming unglued in many academic environments. An affluent student who has more than one computer has given up on the idea of the “personal” computer just as surely as the student who decides not to buy a computer at all. A student on a wired campus, with computers in the classroom and the dorm room, sooner or later realizes that it is easier to save their work on the network or email it to themselves than to carry disks or even laptops back and forth. Even faculty, usually the people with the most computing power on their desks, are seeing Web-accessible research and email discussion lists become as much a part of their academic work as the data stored on their hard drive. 

The personal has been taken out of the personal computer because the Internet has transformed computing into a social activity. With email from Hotmail, a free home page from Geocities, a personalized news page from my.yahoo, there is no need to own your own computer, lug a laptop from place to place, or pay $20 a month for Internet access. Computing is where you find it — campus labs, work, an Internet cafe — and the guts of any individual computer become unimportant. The old Macintosh vs. Windows debate is meaningless if they both run Netscape, since social computing turns the browser into the operating system and the Internet into the hard drive. 

Looking at my students’ instinctive sense of network use, I’ve boiled the elements of social computing down into three basic principles:

Hardware is Boring.

Who cares what kind of computer you have? The old concerns of the PC era – which operating system, what CPU speed, how much RAM – have been replaced by the concerns of connection – dial-up or ethernet, what connection speed, what browser version. If all hardware can connect, each individual piece is less important.

Imagine that your institution plans to upgrade your computer, but that you have to choose: you can have either double the speed of your CPU, or double the speed of your Internet connection. Which would you choose? 

Computers are for Finding, not Keeping. 

Finding new information is more important than storing old information. A computer with a CD-ROM drive only becomes more valuable when you spend the time and money to buy a new CD, while a computer with network access becomes more valuable every time someone creates a new web site, which happens several times a day, for free. Computers used for keeping information lose value over time, as the information on them goes out of date. Computers used for finding information increase in value over time, as the breadth and depth of the accessible information increases with no additional investment. 

Imagine another choice: your institution will pay to quadruple the size of your hard drive, so you can store more stuff locally, or pay for you to get access to the online archives of the New York Times, the Wall Street Journal and the Encyclopedia Britannica. Which would you choose?

Information Leaks Out of its Containers. 

Information can no longer be trapped in objects. Multi-media is leaking out of CD-ROMs and into Web sites, music is leaking out of CDs and into Internet radio, text is leaking out of books and into online databases. At first, online academic materials were simply additions to existing research infrastructure, but they are starting to replace, rather than augment, traditional methods: MIT’s Sloan School of Management now only accepts electronic applications, and 1998 is the last year that Encyclopedia Britannica will publish a paper edition.

A third choice: imagine that your next computer can have either a CD-ROM drive or a connection to the Internet, but not both. Which would you choose? 

You may not face choices as stark as the ones outlined above, but your students do. Lacking an office, living on a budget, possibly balancing work and school, they will gravitate towards social computing wherever possible. Why spend money if they don’t have to? Why pay for a fast CPU if they have a slow modem, why pay for a large disk drive if TIME magazine stores all their articles online, why pay for a CD-ROM if the same material exists on the Web? 

The changes in a campus which embraces social computing are far-reaching. Computers become less a product than a service, less an occasional and expensive purchase like office furniture, and more a steady stream of small purchases, like office supplies. PCs wither and networks bloom, and value is created by access instead of ownership. Anyone who has done the math for a ‘one computer per student’ policy sees millions of dollars tied up in machines that are unused most of the day, and that’s when the student body is affluent enough to afford such a requirement. For a student body already straining to meet tuition, the idea of personal computing is an absurdity, but social computing is not.

If hardware is boring, then homogeneity of computing resources becomes unimportant. With the browser as the interface and the Internet as the hard drive, the differences between a 486 running Linux and a Pentium II running Windows 98 shrink in importance, and instead of a massive overhaul of computer labs every 3-5 years, it becomes possible to replace 20% of the machines annually. This takes advantage of continually rising quality and falling cost, raises the quality of the average machine every year, eliminates the problem of wheezing equipment at the tail end of a “complete replacement” cycle, and frees up low-end machines for use as hallway terminals or print servers as a side-effect. 

If computers are for finding instead of keeping, then the role of the librarian changes drastically. With a computer as a finding tool, every library suddenly acquires a rapidly expanding global collection. Librarians will be increasingly freed from the part of their job that turns them into literate janitors, constantly reshelving paperback copies of The Iliad, while returning to their roles as taxonomists and sleuths, guiding students to useful resources and teaching them how to separate the good from the mediocre.

If information leaks out of objects, then every professor becomes a publisher, and the separation between the classroom lecture and the textbook begins to fade. “Handouts” handed out over the web spare faculty the trip to the copy center (and the attendant cost), online syllabi don’t get out of date because the URL always points to the most recent copy, and one semester’s preparation means less editing to make it ready for future classes. An entire textbook can be built up online, one lecture at a time, and tested in the real world before it is published, rather than after. 

For us as members of the faculty, staff and administration of colleges and universities, the ramifications of the rise in social computing are clear. The amount of money and effort it takes for a “One personal computer per student” program puts it out of the reach of many institutions and many student bodies even as access to the Internet is becoming increasingly critical for both academic work and social life on college campuses. The rise in social computing suggests a way to bridge that gap, by providing the advantages of access without the overhead of ownership.

Our students are gravitating towards the Internet, using it in ways that aren’t intuitive for those of us still using personal computers. They will tend to prefer connectivity over storage and accessibility over raw processing power. If we aren’t sensitive to their preference for cheaper computers out where they can use them rather than expensive computers under lock and key, we will end up misallocating precious resources in ways that aren’t in our students’ best interests, or, ultimately, in our own.

Don’t Believe the Hype

First published in ACM, 10/96.

One of my responsibilities as CTO of a New York new media agency is to evaluate new networking technologies – new browsers, new tools, new business models – in order to determine where my company should be focussing its energies in the distant future. (Like May.) The good part of that job is free t-shirts. The bad part is having to read the press releases in order to figure out what’s being offered:

“Our company has a totally unique and unprecedentedly original proprietary product which will transform the Internet as we know it. If you don’t get in on the ground floor, you will be tossed into the nether darkness of network anachronism where demons will feast on your flesh. Expected out of alpha in the mumbleth quarter of next year, our product, Net:Web-Matrix@Thing, will …” 

Faced with the growing pile of these jewels of clarity on my desk, I realized that I could spend all my time trying to evaluate new technologies and still only be making (barely) educated guesses. What I needed was a set of principles to use to sort the wheat from the chaff, principles which take more account of what users on the network actually spend their time doing instead of what new media marketers want them to be doing. So, without further ado, I present my modest little list, “Clay Shirky’s Five Rules For Figuring Out When Networking Marketers Are Blowing Smoke.” 

RULE #1: Don’t Believe the Hype. 

This could also be called the “Everything New is Old Again” rule. The network we know and love is decades old, and the research which made it possible is older still. Despite the cult of newness which is surrounding the Internet, the Web, and all their cousins right now, the engines which drive the network change very slowly. Only the dashboards change quickly. 

Changing the way networking works is a complex business. Events which alter everything which comes after them (a paradigm shift, to quote Thomas Kuhn) are few and far between. The invention of DARPANet was a paradigm shift. The deployment of TCP/IP was a paradigm shift. The original NCSA Mosaic may even turn out to be a paradigm shift. The launch of Netscape Navigator 4.0 for Macintosh will not be a paradigm shift, no matter how many t-shirts they give out. 

Right now, the only two big ideas in networking are the use of a browser as an interface to the network, and the use of distributable object-oriented programs on virtual machines. Everything else is a detail, either some refinement of one of those two ideas (the much-hyped Active Desktop is a browser that looks at the local disk as well) or a refinement of something that has gone before (the Network Computer is Sun’s “diskless workstation” ten years later). We will spend the next few years just digesting those two big ideas – everyone else, no matter what their press releases say, is just helping us color inside the lines.

RULE #2: Trust the “Would I use it?” Test. 

The first question to ask yourself when looking at new technology is not “Will it run on my server?” but rather “Would I use it?” 

I once saw a demo of a product whose sole function was to accompany the user around the Web, showing them extra ads while taking up more screen space. The sales rep was taking great pains to point out that if you charged 2 cents an ad, you could collect $20 per thousand ads served. While this certainly comported with what I remember about multiplication, I was much more interested to know why he thought people would tolerate this intrusion. He never got around to explaining that part, and I considered it rude to ask.

We already know what people using networks want: they want to do what they do now, only cheaper, or faster, or both. They want to do more interesting stuff than they do now, for the same amount of money. They strongly prefer open systems to closed ones. They strongly prefer open standards to proprietary ones. They will accept ads if that’s what pays for interesting stuff. They want to play games, look at people in various states of undress, read the news, follow sports scores or the weather, and most of all they want to communicate with one another.

Beware any product which claims that people would prefer information to communication, any service which claims that people will choose closed systems over open ones, and any protocol which claims that people will tolerate incompatibility to gain features. Any idea for a networking service which does not satisfy some basic desire of network users is doomed to fail. 

RULE #3: Don’t Confuse Their Ideas With Your Ideas 

A time-honored marketing tool for sidestepping criticism is to make ridiculous assertions with enough confidence that people come to believe them. The best possible antidote for this is to simply keep a running rebuttal going in your head while someone is telling you why network users are going to flock to their particular product.

What follows is a list of things I have actually heard marketers say to me in all seriousness. Anyone saying anything like this is not to be trusted with anything more technological than a garage door opener: 

“The way we figure it, why _wouldn’t_ users adopt our [product]?” (Same reason they don’t use Lotus Notes.)

“We’re a year ahead of the competition.” (Nope.) 

“The way we figure it, why _wouldn’t_ users be willing to register a name and password with us?” (Because they already can’t remember their other 17 8-character-mix-of-letters-and-numbers passwords.) 

“We don’t have any competition.” (Then you don’t have a market.) 

“The way we figure it, why _wouldn’t_ users subscribe to [our service]?” (Because they can get something better for free elsewhere.)

“If you get everyone in your organization to adopt our product all at once, then you won’t need compatibility with your current tools.” (That’s the way Communism was supposed to work, too.)

RULE #4: Information Wants to be Cheap. 

Information has been decoupled from objects. Forever. Newspaper companies are having to learn to separate news from paper; CD-ROM companies are having to choose between being multi-media producers and vendors of plastic circles. Like the rise of written text displacing the oral tradition, the separation of data from its containers will never be reversed. 

With the Internet, the incremental cost of storing and distributing information is zero: once a Web server is up, serving 10,000 pages costs no more than serving 1000 pages. Network users recognize this, even if they can’t articulate it directly, and behave accordingly. 

Anyone who is basing their plans on the willingness of network users to pay for electronic information in the same way that they now pay for physical objects like CDs and books will fail. Many Web-based services have tried to get users to pay for pretend scarcity, as if downloading information should be treated no differently from printing, shipping, storing, displaying and selling physical objects offline. People will pay for point of view, timely information, or the imprimatur of expertise. They will not pay for simply repackaging information from other sources (they can now do that themselves) or for the inefficient business practices bred by the costs of packaging. 

RULE #5: It’s the Economy, Stupid.

The future of the network will be driven by the economy – not the economy of dollars and cents, but the economy of millions of users acting to maximize their preferences. 

Technology is not the network, technology is just how the network works. The network itself is made up of the people who use it, and because of its ability to instantly link up people separated by distance but joined in outlook, it is the best tool for focussing group endeavors the world has ever seen. 

This ability of groups to form or change in a moment creates a fluid economy of personal preference: Individual choices are featherweight when considered alone, but in cumulative effect they are an unstoppable force. Anyone trying to control people’s access to information these days has learned this lesson, often painfully. Prodigy, the Serbian Government, and the Church of Scientology all have this in common – when they decided that they could unilaterally deny people certain kinds of information, they found that the networks made this kind of control impossible. 

The economics of personal preference will shape the network for the foreseeable future. Online services will wither unless they offer some value over what the Internet itself offers. Network phones will proliferate despite the attempts of long-distance companies to stifle them. “Groupware” programs will embrace open standards or die. The economics which will guide the network will be based on what users want to buy, not on what marketers want to sell.

These rules have been around the world 9 times. Don’t break the chain! If you remember these rules, you will be able to ward off some of the unbearable hype currently surrounding the unveiling of every two-bit Java animation program and Web server add-on, all while keeping on top of the real user-driven forces shaping the network. If, on the other hand, you fail to remember these rules, demons will feast on your flesh.

Cyberporn!!! (So What???)

First published on Urban Desires, 03/96.

So much has been said on the net and off, about Philip Elmer-DeWitt’s Time magazine cover story, “Cyberporn” (July 3, 1995), that I have little to add except this: as bad as the decision to use Marty Rimm’s methodologically worthless study was, the subsequent damage control was worse. Compare these two passages, both written by Elmer-DeWitt after it became clear that an inadequately fact-checked story had come to grace Time’s cover:


Some [network users] clearly believe that Time, by publicizing the Rimm study, was contributing to a mood of popular hysteria, sparked by the Christian Coalition and other radical-right groups, that might lead to a crackdown. It would be a shame, however, if the damaging flaws in Rimm’s study obscured the larger and more important debate about hard-core porn on the Internet.
and
I don’t know how else to say it, so I’ll just repeat what I’ve said before. I screwed up. The cover story was my idea, I pushed for it, and it ran pretty much the way I wrote it. It was my mistake, and my mistake alone. I do hope other reporters will learn from it. I know I have.


The first quote is from a Time followup three weeks after the original story (“Fire Storm on the Computer Nets”); the second is from the Usenet newsgroup alt.culture.internet. This was the worst aspect of the cyberporn debacle – Elmer-DeWitt, living in two worlds, exploited the difference between the two cultures rather than bridging them. For those of us on the net, he was willing to write simply and directly – “It was my mistake, and my mistake alone.” Off the net, however, his contempt for his readers led to equivocation, “It would be a shame, however, if the damaging flaws…” In other words, his offline readers are not to know the truth about the decisions which went into making an unreviewed undergraduate study into a Time cover story.

This split between “us” and “them,” the people on the net and off, and what each group sees and thinks about online pornography, is at the heart of the debate about porn on the net. We are often not even speaking the same language, so the discussion devolves into one of those great American dichotomies where each side characterizes the other as not merely wrong but evil. I will have to confine my remarks here to the US, because that is where the Elmer-DeWitt article has gotten the most play – it was even entered into the US Congressional Record – and because his article frames the debate in terms of the First Amendment to the US Constitution, which guarantees the freedom of speech and of the press. This makes the debate especially thorny, because the gap between what most US citizens know about the First Amendment and what they think they know is often extreme.

Who’s On First?

The two sides of this debate are not what they seem. At first glance, the basic split appears to be between net-savvy libertarians who are opposed to government regulation of pornography on the net in any form, and anti-smut crusaders of all political stripes who want to use Federal regulations to reduce the net’s contents to the equivalent of a prime time TV broadcast. This is a false opposition – both of these groups are merely squabbling over details. The libertarians want no new laws ever, and the crusaders want strong ones now, but they share the same central conviction, namely that without new laws, the US has no legal right to control obscene material on the net which lies within its boundaries.

Both groups are wrong.

Obscene material on the net which is within the jurisdiction of the U.S. does not now and has not ever enjoyed the protection of the First Amendment. The Supreme Court was unambiguously clear on the right to regulate obscene material in Miller v. California (413 U.S. 15 1973), the landmark case which sets the current U.S. standard for obscene material. The ambiguity in that decision is in defining what precisely is obscene (about which more later), but once something is so declared, states may regulate it however they like, no matter what medium it is in.

Many people who have been sold on the net as a kind of anarchist paradise react badly to this news, arguing that since the U.S. has not bothered enforcing these laws on the net so far, it is unfair to do so now. Indeed, it probably is unfair, but the law concerning obscenity has little concerned itself with fairness. Prosecution for obscenity is almost never directed at an elite – as long as sexually explicit materials are not readily available to the general populace, the law has usually ignored them. Once they become widespread, however, both regulations and enforcement generally become much more stringent.

The Current Regulations

The most memorable thing ever said about “hard-core” pornography in the Supreme Court was the late Justice Stewart’s quote “I know it when I see it.” The force of this sentiment is so strong that although it no longer has anything to do with case law, more people know it than know the current Miller standard, which comes much less trippingly off the tongue. Miller does not define pornography per se, but rather “obscenity,” which is roughly the same as hard-core pornography. Material is obscene if it matches all three of these criteria:

1. The average person, applying contemporary community standards, would find the work, taken as a whole, appeals to the prurient interest; and 
2. The work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and 
3. The work, taken as a whole, lacks serious literary, artistic, political or scientific value.

Every phrase in that tripartite test has been gone over with a fine-toothed legal comb, but it is the first part which is the oddest, because it introduces as a test for obscenity the legal entity “contemporary community standards.”

Which Community? What Standards?

“Contemporary community standards” is a way of recognizing that there are different standards in Manhattan, New York and Manhattan, Kansas. Both “prurient interest” and “patently offensive” are considered local constraints. (The third test, for “serious literary, artistic, political or scientific value,” is taken to be a national standard, and most of the recent notable obscenity trials have hinged on this clause.) The effect of these locally determined qualities is to grant First Amendment protection to materials which a community does not consider offensive. Put another way, with the “contemporary community standards” benchmark, pornography which is popular cannot be obscene, and pornography which is obscene cannot be popular.

Back to the Marty Rimm study. The study, and Elmer-DeWitt’s treatment of it, were designed to play on the hypocrisy which surrounds most of the debates on pornography, where people profess to be shocked – SHOCKED – that such filth is being peddled in our fair etc., etc. The tenor of this debate is designed to obscure what everyone knows about pornography: many people like it. Furthermore, the more anonymously they can get to it, the more they will consume. (Increased anonymity also seems to be a factor in increasing use of pornography by women.) This is a pattern we have seen with the growth of video tapes, CD-ROMs and 1-900 numbers as well as on the net.

That many people like pornography seems self-evident, but the debate is usually couched in terms of production, not consumption. The ubiquity of pornography is laid at the feet of the people who make and distribute sexually explicit material: they are accused of pandering, of trafficking in porn, even of being pushers, as if the general populace had some natural inclination not to be interested in erotic stimulation, an inclination being deviously eroded by amoral profiteers.

The net is the death knell for this fiction, and the anti-smut crusaders are in terror lest the law take note of this fact.

We are so comfortable with the world of magazines and books and video tapes that we often fail to notice the differences between data in a physical package, which has to exist in advance of a person wanting it, and data on the net, which does not. Physical packages of data have to be created and edited and shipped and stored and displayed, all in the hopes that they will catch someone’s eye. Data on the net, in contrast, only gets moved or copied when someone implicitly or explicitly requests it. 

This means that anyone measuring net pornography in terms of data – bandwidth, disk space, web hits, whatever – is not measuring production, they are measuring desire. 

Pornography on the net is not pushed out, it is pulled in. For a million people to have a picture of Miss September, Playboy does not have to put a million copies of it on its Web site; it only has to place one copy there and wait for a million people to request it. This means that if there are more copies of “Blacks with Blondes!!!” than “Othello” floating around cyberspace, it is because more people have requested the former than the latter. Bandwidth measures popularity.
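
To make the distinction concrete, here is a minimal sketch, not from the original essay: a few lines of Python that tally requests in a hypothetical Apache-style server log (the file name access.log and the log format are assumptions for illustration). Every line in such a log records a request somebody made, so counting lines per file counts demand, not supply.

    # Minimal sketch (hypothetical log file and format): count how often each
    # file was requested. Each log line records one request someone made.
    from collections import Counter

    demand = Counter()
    with open("access.log") as log:              # hypothetical server log
        for line in log:
            parts = line.split('"')              # the request string sits inside quotes
            if len(parts) < 2:
                continue
            request = parts[1].split()           # e.g. ["GET", "/miss-september.jpg", "HTTP/1.0"]
            if len(request) >= 2 and request[0] == "GET":
                demand[request[1]] += 1          # one more person pulled this file

    # The most-requested files are the most popular, however few copies exist.
    for path, hits in demand.most_common(10):
        print(hits, path)

The point of the sketch is what it never asks: how many copies of anything exist, or who put them there. It only counts how often each file was asked for, which is exactly the pull-side measure described above.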

You Can See Where This Is Going

The legal effects of demonstrating porn’s popularity on the net may well be precisely the opposite of what the anti-smut crusaders believe they will be. If they can truly demonstrate that pornography is ubiquitous on the net, it will be much harder to regulate under current case law, not easier. In particular, if the net itself can be legally established as a “community,” whose contemporary standards must be consulted in determining obscenity, all the hype will likely mean that it will become difficult to have much net porn declared obscene under Miller.

This notion of the net as a community is not as far-fetched as it sounds. There is a case winding its way through the courts right now in which a California man, Robert Thomas, running the “Amateur Action BBS” out of his home, was convicted of violating the community standards of Memphis, Tennessee, where his images were downloaded by a postal inspector trying to shut his BBS down. Part of the defense is the question of which community’s standards should be used. If the verdict is ultimately upheld, every community in the country will have the same standards for net porn as Memphis, because any BBS in any other U.S. jurisdiction can be reached from Tennessee by phone. If the verdict is overturned, no community anywhere in the U.S. will be able to regulate obscenity any more stringently than the most permissive community that has a porn-distributing BBS. If the net itself is found to be a non-geographic community, then there will be two sets of laws: one for people with modems and one for people without.

The Likeliest Casualty

In any case, the likeliest casualty of this case and others like it will be Miller itself. The very notion of contemporary community standards resonates with echoes of Grover’s Corners, of a tightly knit community with identifiable and cohesive values. Communities like that, already mostly a memory in 1973, are little more than a dream now. The original Miller decision, which was 5-4, has held together mostly because challenges to individual materials have usually focused on the idea that the works in question had some serious value (e.g., the Mapplethorpe photos trial in Cincinnati). Sometime in the next couple of years, however, Miller is going to run directly into a challenge to the idea of local control over a non-localized entity like the net. The result will likely be the first obscenity law geared to 21st-century America. Watch this space.