Google: URL Parameters Tool Is Not A Replacement For Robots.txt
Google’s John Mueller said on Twitter that the URL Parameters tool is not a replacement for a robots.txt file when it comes to blocking content. John was asked how reliable the tool is when a parameter is set to “crawl no URLs” for a certain URL pattern. John replied, “It’s not a replacement for the robots.txt — if you need to be sure that something’s not crawled, then block it properly.”
I should note that last year John said not to use robots.txt to block indexing of URLs with parameters. Robots.txt controls crawling, not indexing, so a blocked URL can still end up indexed if it is linked to from elsewhere. So keep that in mind as well.
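For reference, blocking crawling of parameterized URLs “properly” means a Disallow rule in robots.txt rather than a URL Parameters tool setting. A minimal sketch might look like this (the `sessionid` parameter name is just an example, not from the source):

```
# robots.txt — block crawling of any URL carrying the example parameter
User-agent: *
Disallow: /*?sessionid=
Disallow: /*&sessionid=
```

Google supports the `*` wildcard in robots.txt paths, so these two rules match the parameter whether it appears first or later in the query string. Note this only stops crawling; per John’s earlier advice, keeping such URLs out of the index is a separate question.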