Monday 17 February 2014

Google Adds, Then Pulls, Advice Not To Block Ad Landing Pages From Its Crawler

On Wednesday/Thursday of last week, Google added a new guideline to its Webmaster Guidelines to “not block a destination URL for a Google Ad product” via your robots.txt file. 24 hours later, Google reversed that guideline and restored the Webmaster Guidelines to exactly how they were Wednesday morning.
Here is the guideline that was added and, a day later, removed:
Make efforts to ensure that a robots.txt file does not block a destination URL for a Google Ad product. Adding such a block can disable or disadvantage the Ad.
We asked Google why they made the change and then, once we noticed the guideline was reversed, why they reversed it. Google responded:
We were considering something along these lines since advertising programs (not just ours) often need to crawl the landing pages, but we decided not to add it to the webmaster guidelines. Sorry for the confusion there.
As you may know, Google has told webmasters to make sure ads do not influence the search results, either by blocking ads with the robots.txt file or by using the nofollow attribute. In fact, it is one of the existing guidelines:
Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google’s AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.
But then Google added this new guideline, which communicated almost the exact opposite.
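To make the conflict concrete, here is a minimal robots.txt sketch of the two rules side by side; the /ads/ redirect path and /go/offer landing page are hypothetical examples, not paths Google named:

User-agent: *
# Per the existing guideline: block crawling of a site's own
# ad-redirect URLs so paid links do not pass ranking signals.
Disallow: /ads/
# Per the short-lived new guideline: a rule like the one below,
# which blocks an ad's destination URL, could "disable or
# disadvantage the Ad" and should be avoided.
# Disallow: /go/offer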
To be fair, Google’s ad ranking algorithm does need to assess landing page quality and thus needs to crawl those landing pages. But as you can see, placing that advice alongside the existing guideline can get confusing.
This guideline change and reversal is very interesting. It may have been a complete accident not worth reading into, or it may be a sign of two different departments butting heads at the company.