I’ve so far refrained from commenting on the Rupert Murdoch de-indexing comment and ensuing brouhaha. But Google’s recent policy change throws everything into the air.

Whether you like or dislike him, it’s time to stand up and recognize that Murdoch’s threat to pull News Corp sites from Google’s index has worked brilliantly.

Publisher unrest, the threat of a Bing deal, and the prospect of serious search engine competition over site indexing have pushed Google into a major concession. Google’s change to its First Click Free guideline is a bigger deal than many people realize. What appears to be a simple change in degree is actually a change in kind.

Google has now said that it’s okay with sites showing different content to its crawler than to a human following a search results link. There is no longer a guarantee that what shows on a search results page will actually be on the destination page. The Google search user experience will suffer slightly, and publishers will now find it much easier to run a paysite.

A Little Background on First Click Free

I’ve seen First Click Free described by some bloggers as a Google “program” or “service”. It’s neither; it’s more accurate to call it a guideline or policy. Google has always taken a strict stance against “cloaking”, or showing different content to its crawler than to a human visitor. In effect, First Click Free told publishers of paysites that they had three options:

(1) Opt out of indexing entirely.

(2) Let the crawler index all content, but direct a human reader to a sign-up page, and risk the wrath of Google, which could include de-indexing or ranking penalties.

(3) Implement First Click Free: check every incoming request to see whether it comes from the Googlebot or carries a Google referrer, and if so, show the content for free. (This is what WSJ.com implemented; a rough sketch of the check follows below.)
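To make option (3) concrete, here is a minimal sketch of the kind of check a publisher might run. The function names and the subscriber flag are my own illustration, not WSJ.com’s actual implementation, and a production version would verify the crawler’s IP rather than trust the User-Agent string.

```python
# Hypothetical sketch of a First Click Free check: serve the full article to
# Google's crawler and to visitors clicking through from a Google search
# results page; send everyone else to the sign-up page.

def is_googlebot(user_agent: str) -> bool:
    # Rough check based on the User-Agent header alone. User-Agents are
    # trivially spoofed, so a real implementation would also verify the
    # requesting IP (e.g. via reverse DNS) before trusting this.
    return "Googlebot" in user_agent

def came_from_google_search(referrer: str) -> bool:
    # Treat any click-through from a google.* results page as a "first click".
    return referrer.startswith("http://www.google.") or \
           referrer.startswith("https://www.google.")

def should_serve_full_article(user_agent: str, referrer: str,
                              is_subscriber: bool) -> bool:
    if is_subscriber:
        return True   # paying readers always get the full story
    if is_googlebot(user_agent):
        return True   # let the crawler index the full text
    if came_from_google_search(referrer):
        return True   # the free click for a search visitor
    return False      # everyone else hits the paywall / sign-up page
```

Note that this is exactly the loophole discussed below: anyone who pastes a headline into Google arrives with a Google referrer and gets the article for free.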

Now, with what amounts to “First Five Clicks Free”, Google has given sites permission to stop showing a user the same content the Googlebot sees once that user has used up five free clicks in a day.
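Enforcing that limit is straightforward in principle. Here is a hypothetical sketch, assuming the publisher keys the count to a visitor cookie ID and resets it daily; the names and the in-memory counter are mine, purely for illustration.

```python
# Hypothetical sketch of a "five free clicks per day" counter. In practice the
# count would live in the visitor's cookie or a shared store; an in-memory
# dict keeps the example self-contained.

from collections import defaultdict
from datetime import date

DAILY_FREE_CLICKS = 5

_clicks = defaultdict(int)  # (visitor_id, date) -> Google-referred views today

def allow_free_click(visitor_id: str) -> bool:
    # Return True if this Google-referred visit should still be free,
    # recording the click against today's quota.
    key = (visitor_id, date.today())
    if _clicks[key] >= DAILY_FREE_CLICKS:
        return False  # sixth click and beyond: show the paywall instead
    _clicks[key] += 1
    return True
```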

First Click Free Created the Leaky Paywall

I had never understood the complaints about search engines “stealing” content that emanated from the top of News Corp. If anything, search engines were providing free advertising and new visitors to convert into paying subscribers. Pulling News Corp sites from Google’s index wouldn’t hurt Google, and it wouldn’t help the sites either. I thought that the Journal was allowing visitors from Google past the paywall voluntarily to increase traffic. Now it’s clear that the Journal was choosing between keeping that loophole open and closing it, which would have violated Google’s rule against cloaking and risked the loss of Google-derived traffic. In that context, the ire directed at Google makes much more sense.

Editors and staff at the WSJ are well aware of both the power of Google to drive traffic and visitors to the site, and the degree to which people were using it to circumvent the paywall. Every morning, an email report goes out to editors and staff detailing which search keywords were driving traffic to the site and which stories and trends were hot online. During my internship, compiling and writing this email report was one of my responsibilities. Visitors searching for the exact headlines of Journal stories often ranked among the top sources of Google referrer traffic.

That Google has reacted so clearly and so quickly means that some negotiating power is returning to the big publishers. Five free clicks per day is probably still too many to make them happy, though. But the more search share Bing gains, the more leverage publishers will have.

Predictions for the Future

I predict that major publishers will soon let search engines see and index the full text of a story while showing users just a teaser and a “Purchase” button. In fact, paywalled sites could try it now, if they feel like playing chicken with Google. Would Google actually follow through and penalize a site’s ranking, or de-index it entirely? For a site like WSJ.com in particular, it’s plausible that doing so would noticeably hurt the quality of web and news search results. And if Bing doesn’t penalize the site for doing the same thing, will its results look better in comparison?

By explicitly ignoring Google’s guidelines, publishers would throw the ball back into Google’s court and see how it responds. “First Five Clicks” is a sign that Google may cave on this. My advice to Rupert Murdoch would be to patch that hole in the WSJ.com paywall (give away maybe one free click per day) and see what Google does.