Commentary

Universally Accessible Web Content Cannot Survive

If you're at all lucky, you'll spend at least part of this summer far away from spreadsheets, reports, ad consoles, and bid management software. While there's nothing wrong with single-mindedly focusing on all the granular data associated with search, too much data, all the time, can rot your brain.

With this in mind, this week's column attempts to wrestle with some big-picture issues that I think deserve attention, because, as we learned from the financial crisis, it's often the most basic, most obvious issues (such as whether all the subprime mortgages being issued could ever possibly be repaid) that can blow everyone out of the water. In other words, just because nobody's talking about something doesn't mean that it's not real enough to kill you.

There's been a lot of ink spilled in recent months about what it takes to differentiate a search engine enough that users switch from one to another. Frankly, I think this conversation misses the mark entirely. Journalists may care a lot about the bells and whistles adorning a particular engine's UI, but users seem totally oblivious. The only thing that really matters to end users is finding the information they want, as quickly as possible -- and, as we all know, Google's PageRank system represented a genuine breakthrough in an era when SERPs were chaotic, meaningless, repetitive nightmares. Today, however, all of the major engines use very similar algorithms, yielding largely identical (or at least broadly similar) results.

I find it useful to think of search engines as digital cameras all aimed at the same landscape. One camera might have slightly better resolution than another, or better low-light performance, or a faster shutter. But they all take pretty good pictures, and unless you're a professional photographer or a digital camera reviewer, you really don't care. Most of us just want to capture the moment and move on.

What changes things is when one of these cameras can capture colors, parts of the landscape, or parts of the visible light spectrum that the others can't; that's the camera you'd want if you needed to capture those elements. Put another way, if you were going to visit Mount Rushmore, and you knew that Mount Rushmore could only be satisfactorily photographed with a Nikon, you'd pack that one and leave the others behind.

Let's get back to search engines. Today, they all index the same body of information: the crawlable World Wide Web. But suppose a large part of this body of data suddenly "went dark" to all but one engine? Suppose additionally that the portion which became invisible happened to be essential to your information task at hand? Well, you'd likely switch to that engine immediately, because it would be your only portal into this body of knowledge.

Search engines obviously have a powerful economic incentive to "lock up" such data, because it is the only real way they can differentiate themselves meaningfully enough to compel users to switch. Nor am I convinced that the current state of affairs -- in which most Web site operators have, en masse, opted to welcome any spider attempting to access their sites' content -- is immutable. My opinion of most site operators (and yes, I am one) is quite cynical: most of them share characteristics with street musicians performing in the subway for spare change (which is not to say that some of them don't get a lot of spare change through AdSense and other ad networks). In other words, it wouldn't take much to bribe them into refusing another spider's visit, provided the price is right -- and right now, this price is almost ridiculously low.

The problem, of course, is that The New York Times and other A-level content sources aren't street musicians: they're world-class concert performers who happen to be working at street-musician rates, which is why they'll keep failing until something radical changes in the fundamentals of the information economy. In fact, they have even more of an incentive to "lock down" their content than the search engines do, which is why, I think, within the next year or so, we'll start seeing the most authoritative sources of content on the World Wide Web "going dark" to information aggregators (which is all the search engines really are) that don't cut them a better deal. Many will mourn the fracturing and Balkanization of the Web into pieces that are no longer universally accessible in the way we're accustomed to, but unless somebody comes up with a better solution very soon, such a Balkanization will come to pass.

11 comments about "Universally Accessible Web Content Cannot Survive".
  1. Chris Nielsen from Domain Incubation, July 27, 2009 at 11:15 a.m.

    "...just because nobody's talking about something doesn't mean that it's not real enough to kill you."

    Very true, and that certainly opens the door for much concern, since there is an almost endless number of things that no one is talking about.

  2. Matt Howard from SMBLive, July 27, 2009 at 11:16 a.m.

    Everyone these days seems to be talking about the inevitable death of the newspaper industry. This is a brilliant and simple analysis of why they're in so much trouble (world-class concert performers playing at street musician rates). Nicely done!

  3. Howie Goldfarb from Blue Star Strategic Marketing, July 27, 2009 at 11:23 a.m.

    Sorry, Mr. Baldwin, your name brings up bad memories of a horrible classmate/person who is also an actor!

    LOL

    Great post. I agree. The problem is that people forget the reason content is cheap/inexpensive/free is that it is subsidized by advertisers. Since no content companies ever force you to view ads, they just sold the advertisers a chance to be seen. Using movies as an example of paid content, I took the price of a movie ticket, split in two (a movie runs about two hours), as the per-hour charge a TV/cable company would have to charge if advertising went away.
    I asked my folks if they would pay $7/hour to watch TV and they said no. But they also have TiVo! Something will have to give.

    The other catastrophe the web brought was ease of competition. Prior to the web, most cities had 1 to 4 major newspapers. Now all newspapers on earth are technically the competition, because I can read any of them online for free! As newspapers cut journalists and start sharing content, this will slowly reduce competition, because I will see mostly the same content on each site. It is a vicious cycle.

    Lastly, if they are successful with a micro-payment system, then people can pay per article they read and not feel ripped off.

    Long term, I think we will see a few major news sites that compete on big news while the rest go out of business and leave the local stuff to others.

  4. David Culbertson from LightBulb Interactive, July 27, 2009 at 11:30 a.m.

    "why, I think, within the next year or so, we'll start seeing the most authoritative sources of content on the World Wide Web "going dark" to information aggregators (which is all the search engines really are) that don't cut them a better deal"

    Then this will be their death, because only a small portion of search engine visitors are actual A-level content customers/subscribers. If the 'average Joe' internet user doesn't stumble upon NYT content via a Google search, their web traffic will fall right off a cliff. I have seen the analytics data for more than a hundred websites, and most of them would be starved of traffic if they disappeared from Google.

    Web advertising is cheaper than print advertising only because advertisers really know what they're getting because the measurement tools are so much better.

    One place all newspapers could improve is using print to drive people online. They've barely tapped their ability to do this. They need to shift their thinking, not hide.

  5. Kevin Pike from Kevin Pike, July 27, 2009 at 11:47 a.m.

    Interesting idea; however, I don't think "locking up" data is the future of the web. I'm not so sure Marshall Simmonds or anyone at the NY Times would agree this is the best fiscal approach, either.

    Search engines treat "A-level" content sources as such. Although The NY Times has seen online ad revenues drop recently, they are still getting paid like world-class performers compared to the average Joe's AdSense account.

    To block certain search bots from their content would further their revenue decline - not make people switch engines.

    The world has enough media outlets today that I bet search engines will stick to ranking the best articles they can crawl for free. If you want to lock your site behind a subscription fee, great; the rest of the world will move on without you.

    I fear any engine that did this and tried to pass along fees to its users would also lose market share. I kind of see it as XM Radio vs. free radio: some may like to pay for content, but I'll stick with free for now.

  6. Bruce May from Bizperity, July 27, 2009 at 12:28 p.m.

    David Culbertson has got the main issues right. When he says, "Web advertising is cheaper than print advertising only because advertisers really know what they're getting because the measurement tools are so much better," he touches on the hidden truth of print ads: they don't reach nearly as many people as you think. If you are on a web page, even if you don't actually look at a banner, at least you are looking at the page it's on. Most people don't read every page of a newspaper (or of a magazine, either), so in the print world, actual viewers are always exaggerated. Buyers focus on total circulation only, since there is no way to measure "page views" in print. Making ads this accountable underscores the dirty little secret that ads just don't work as well as media sellers outside the Internet would have you believe. The larger question is: can a print model ever work in an online environment? I think it can, but it will require a combination of text and video, and the video will require interstitial commercials. We are at the beginning of an evolution toward a viable new-media business model, but we are not there yet. And local news media are losing out to national outlets because they can't compete on national news; the local news, by itself, just isn't enough to fill up a paper. New media will have to exploit niche markets, rather than focus on local or regional geography, if they want to add enough value to get into the game.

  7. Gary Senser, July 27, 2009 at 12:44 p.m.

    It's unlikely that you can "put the genie back in the bottle"!

  8. Paula Lynn from Who Else Unlimited, July 27, 2009 at 1:28 p.m.

    Used to be... newspapers would claim an average of 2-3 readers per issue, while magazines claimed an average of 7. It's enough to kill ya'! What's your impression? ;)

  9. Monica Bower from TERiX Computer Service, July 27, 2009 at 1:45 p.m.

    It wasn't "the internet" that killed newspapers, though; newspapers have been dying for thirty years. Certainly it hurt when craigslist took away the main reasons people buy local papers -- to find a job and to buy/sell things -- and it hurt more when news aggregators like Drudge and, yes, even Fark made it possible to get in and get out in less time than it takes to smoke a cigarette. The fact that paper alone costs orders of magnitude more now than ever before doesn't help, nor do the crushing expenses of keeping huge buildings with huge press rooms and huge numbers of employees, nor the expensive contracts negotiated by those employees, who assumed newspapers would never go away. Steel workers thought the same things, and they went away before the internet existed -- but ultimately for the same reason: buyers could get more, better product, cheaper, from somewhere else.

  10. Josh Mchugh from Attention Span Media, July 27, 2009 at 5:39 p.m.

    @David C. - wholeheartedly agree with your point about using old media not as the payload, but as a marketing channel that drives people online. I think it makes sense for print now and, very soon if not already, for broadcast airtime.

  11. Stuart Long, July 28, 2009 at 4:17 p.m.

    Choosing to go dark is choosing to become irrelevant. All of the content on the Web is part of a global conversation, and muzzled sources cannot participate. If The New York Times decides to go dark, they might as well turn off all the lights and go home.
