Commentary

Murdoch's Publishers Begin Blocking Search Engine From Indexing Content


It appears Rupert Murdoch has begun making good on his promise to block search engines from indexing his newspapers' content and serving up the headlines and links on their own sites. U.K.-based NewsNow.co.uk, a search engine that aggregates news, has become the first target.

NewsNow.co.uk has been blocked from aggregating content from the Times Online, which belongs to News Corp subsidiary News International. Soon it will also be barred from indexing thesun.co.uk and newsoftheworld.co.uk, according to NewsNow Managing Director and Chairman Struan Bartlett.

In August 2009, NewsNow.co.uk sent about 1.7 million initial page views to the three sites by redirecting people searching for news and information from its site to theirs. Bartlett suggests these stats, compiled by NewsNow, capture only a portion of the page views it generates: bounce rates on the sites are low, he argues, and visitors typically stick around to read more content.

"We think NewsNow performs a public service by linking to news from a wide variety of different providers," he says, insisting NewsNow has been "unfairly" targeted. "It lets people compare and contrast reported views in the press. This makes NewsNow a kind of 'meta-newspaper.'"

Calling News International's decision a move to "undermine press freedom," Bartlett says that freedom rests on the same rights newspapers themselves rely on for free speech. He also notes that News International has not applied the same robots.txt restrictions to Google News.
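
For context, robots.txt is a plain-text file placed at a site's root that tells crawlers which paths they may index, and it can single out an individual crawler by its user-agent string. A minimal sketch of how a publisher might shut out one aggregator's spider while leaving everyone else alone -- the "NewsNow" user-agent name here is an assumption for illustration:

    # Block one named crawler from the entire site
    # (the "NewsNow" user-agent string is an illustrative assumption)
    User-agent: NewsNow
    Disallow: /

    # Every other crawler may index everything
    User-agent: *
    Disallow:

An empty Disallow value means nothing is off limits, which is how a publisher can turn away one aggregator while Google, Yahoo and Microsoft keep indexing.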

So why did News International block NewsNow and not Google, Yahoo and Microsoft? I can't say for certain, but I have my suspicions. For starters, NewsNow offers a paid aggregation subscription service that lets companies run keyword searches for articles on specific topics.

Unfortunately for Bartlett, News International views NewsNow's service as making money off the publisher's content without sharing the revenue it generates. Murdoch has been quoted as calling companies that do this "content kleptomaniacs and plagiarists" that "simply pick up everything and run with it" by "stealing stories without payment."

Through the Newspaper Licensing Agency (NLA), eight national U.K. newspaper publishers on Jan. 1 began asking news aggregators that redirect readers to their Web sites to pay for licenses. Murdoch's subsidiary News International belongs to the NLA, but has not joined the Web licensing initiative, according to at least one report.

Bartlett explains that the NLA wants to charge a fee to NewsNow's customers who subscribe to the search engine's "paid" service, and to charge companies for permission to circulate links as part of NewsNow's subscription service. "We thought that was the beginning of a slippery slope, because if you start there, what is the next step?" he says. "Would it be to go to businesses and organizations that are not customers of monitoring companies like ours, asking them to pay a license fee to make commercial use of newspaper articles?"

Technically, if you're reading a newspaper article at work and send the link to a co-worker because it's relevant, that's arguably commercial use. Bartlett questions whether it would require a separate license, and whether this could become a way to tax linking to articles.

Think of the uproar that taxing links to news articles would create in the SEO community, whose link-building strategies depend on exactly such links.

2 comments about "Murdoch's Publishers Begin Blocking Search Engine From Indexing Content".
  1. Dave Woodall from fiorano associates, January 11, 2010 at 5:26 p.m.

    NewsNow's paid service doesn't strike me as much different from a traditional clipping service provided by any decent PR firm. Obviously Mr. Murdoch singled out NewsNow because he's frustrated by his own inability to monetize his news content. Unfortunately, what Mr. Murdoch doesn't yet realize is that much of his content is a low-value commodity, of commercial interest only when repackaged and enhanced...as NewsNow has done.

    Besides, Rupert, it's not like your outlets have never engaged in a little "content kleptomania and plagiarism." I seem to remember your American Fox News correspondents relying pretty heavily on the TMZ.com website when news of Michael Jackson's death broke.

  2. John Jainschigg from World2Worlds, Inc., January 12, 2010 at 10:41 a.m.

    I disagree.

    In brutally simple terms, Murdoch paid to produce the content in question and has the right to determine how it is used. Whether that works for his business in the long term is for him and his shareholders to decide.

    Bartlett's assertion that automated content aggregation and linking constitute authentic journalism worthy of protection as a public good is hugely disingenuous. As Dave Woodall notes, this is just a clipping service. An old-school clipping service, however, was a specialized, labor-intensive product, produced and consumed solely by businesses with a special relationship to one another and to news -- thus non-competitive with, and hence tolerable by, publishers. Automated and vastly scaled up, however, aggregation (and arguably search) can become a strip-mining operation carried out on private property without permission.

    Scale and automation are the relevant factors. All sorts of applications -- forwarding links between co-workers, quoting stories in other stories, clipping services aimed at professional markets, buying books for lending libraries, individuals composing their own aggregations of RSS-feed-mediated content in XML readers -- can be tolerated perfectly well, and can remain, in various spheres, authentic public goods, so long as they happen at such a scale (or in such a diversity of forms) that they don't collectively break the business model of the people downstream who are paying to have the writing done.

    It's also worth noting that all the NLA and Murdoch are apparently doing (so far) is placing a robots.txt file at the top level of the site -- the equivalent of saying "Please don't index this." That's not the same as sending black helicopters and ninjas to aggregator datacenters, right? It's merely the assertion of a preference -- one that aggregator spiders are obliged to respect or else run afoul of internet governance, which historically esteems the sharing of information but also seeks to protect the rights of content creators.
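
    To make that concrete: a compliant spider downloads a site's robots.txt and checks its own user-agent against the rules before fetching anything. A rough sketch using Python's standard-library parser -- the URL and the "NewsNow" user-agent string are illustrative assumptions, not the aggregator's actual crawler:

        # Sketch of a well-behaved crawler consulting robots.txt before indexing.
        # The URL and user-agent string below are illustrative assumptions.
        from urllib.robotparser import RobotFileParser

        parser = RobotFileParser()
        parser.set_url("https://www.timesonline.co.uk/robots.txt")
        parser.read()  # download and parse the site's rules

        if parser.can_fetch("NewsNow", "https://www.timesonline.co.uk/news/article"):
            print("Allowed: fetch and index the page")
        else:
            print("Disallowed: skip the page and move on")

    Nothing enforces that check technically; it is honored by convention, which is exactly why robots.txt amounts to a stated preference rather than a locked door.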
