The New York Times is reporting that influential members of the publishing industry, including the AP and other news organizations, are pushing for extensions to the Robots Exclusion Standard that would give publishers more control over how search engines index their sites.
The extensions, known as the Automated Content Access Protocol (ACAP), will reportedly allow content providers to “limit how long search engines may retain copies in their indexes, or tell the crawler not to follow any of the links that appear within a Web page.”
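For context, the existing Robots Exclusion Standard already gives sites some of this control: a robots.txt file can keep crawlers out of whole sections, and per-page meta tags can ask that a page not be indexed or its links not be followed. What follows is a rough sketch of what publishers can express today, not ACAP itself (ACAP's directive syntax is still being drafted); the retention-limit example uses Google's "unavailable_after" meta tag, which only Googlebot honors:

```
# robots.txt -- ask all crawlers to stay out of the archive section
User-agent: *
Disallow: /archive/

<!-- per-page meta tag: don't index this page, don't follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Google-specific: drop this page from the index after a given date -->
<meta name="googlebot" content="unavailable_after: 25-Dec-2007 00:00:00 GMT">
```

ACAP's pitch is essentially to make controls like these richer and uniform across search engines, with retention limits and usage terms expressed as machine-readable policy rather than one-off vendor extensions.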
On the surface, this seems rather mundane. However, the publishers are pushing for these extensions for copyright reasons, claiming that by indexing and redisplaying content from their sites, search engines are violating their rights as content producers.
I can’t see how this is anything but a boneheaded move by the publishers, akin to the brilliant tactics used by the music and film industries to alienate users while attempting to shore up their failing business models. The description of ACAP above sounds like it will merely prevent potential readers from finding content—if key areas don’t get indexed, users won’t find them in searches, and if crawlers can’t follow links, then that will further limit the effectiveness of searches by hobbling search algorithms like Google’s PageRank. I can’t imagine how content providers can think this is a good thing, unless, of course, they hope to turn the internet into a segregated collection of walled gardens where users must pay for content at every site they visit.
It seems that the trend for online content providers is in the opposite direction, as previously walled sites are opening up access. The New York Times and Wall Street Journal, two of the largest and most successful subscriber-based content sites, recently unlocked their content. It is likely that they did so because the numbers suggest that the increased traffic brought by open access will more than replace subscriber fees with ad revenue.
I think the locking down of content that ACAP represents is a move in the wrong direction. In the Times piece, the executive director of the European Publishers Council, Angela Mills Wade, is quoted as saying that “ACAP could . . . make Web sites more comfortable about putting more material online, including scholarly journals and other items requiring subscriptions.”
So, scholarly journals are afraid of putting their content online because search engines might index it, display summaries of it, and allow more people to read it? This makes absolutely no sense to me; the trend for content providers should be to monetize the long tail of users who would never find their content without it being freely searchable on the web. How is the content making any money—or acquiring any value at all, for that matter—by mouldering, unaccessed, offline? Publishing companies need to wake up and follow the lead of the Times and WSJ, not that of the RIAA and MPAA.
Thursday, November 29, 2007
Blind leading the blind: Publishing industry follows lead of MPAA, RIAA