Note: Don’t believe everything that’s posted on Twitter, or on any other social media website. In this case, DejanSEO and John Mueller from Google were joking about rogue bots and cloaking. Parts of this post have been edited.
There was a recent Tweet from DejanSEO to John Mueller from Google about technical SEO. Apparently DejanSEO has rogue crawlers that disobey its robots.txt file: they crawl the website even though the file tells them not to. Dan Petrovic asked John how to deal with the rogue crawlers.
Technical SEO question. Rogue crawler disobeys my robots. @JohnMu what can I do to stop it?
John Mueller responded with:
Maybe redirect or serve it content from another website?
Keep in mind that John Mueller and DejanSEO were joking in this Twitter thread. What John suggested, like it or not, is what I would define as website cloaking. Cloaking is when you serve up one set of content to one visitor and different content to another visitor to your website. From an SEO perspective, website cloaking is done to deceive: to serve certain content to the search engines and other content to a website’s human visitors. There are generally two ways to cloak for SEO purposes (both are sketched below):
- Identify the visitor based on their user-agent and serve them certain content.
- Identify the visitor based on their IP address and serve them certain content.
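To make those two methods concrete, here’s a minimal sketch of what that branching looks like, written in Python with Flask purely for illustration. The user-agent substrings and IP prefix are hypothetical placeholders, not anything from the Twitter thread, and this pattern is exactly what the guidelines prohibit:

```python
# Illustrative sketch of cloaking; the bot names and IP prefix are made up.
from flask import Flask, request

app = Flask(__name__)

# Substrings commonly found in search engine crawler user-agents (illustrative).
SEARCH_ENGINE_AGENTS = ("Googlebot", "bingbot")
# A hypothetical IP prefix a cloaker might associate with crawlers.
CRAWLER_IP_PREFIXES = ("66.249.",)

@app.route("/")
def home():
    user_agent = request.headers.get("User-Agent", "")
    ip = request.remote_addr or ""

    # Method 1: branch on the user-agent string.
    is_crawler = any(bot in user_agent for bot in SEARCH_ENGINE_AGENTS)
    # Method 2: branch on the IP address.
    is_crawler = is_crawler or ip.startswith(CRAWLER_IP_PREFIXES)

    if is_crawler:
        # Search engines see keyword-stuffed "optimized" content...
        return "<h1>Optimized content served only to search engines</h1>"
    # ...while human visitors see something entirely different. That's cloaking.
    return "<h1>Regular content served to human visitors</h1>"
```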
If your website intentionally serves one set of content to regular visitors and different content to the search engines (or redirects them to another website or web page), then you’re cloaking. That violates the search engines’ guidelines. This is different from customizing content. Customization is generally considered OK: you display certain content on a page that other visitors may or may not see, based on who the visitor is. Generally that requires the visitor to have previously visited your website and accepted cookies.
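For contrast, here’s what that acceptable kind of personalization might look like in the same hypothetical Flask setup. The page varies based on a cookie the visitor accepted on a prior visit, not on who the requester claims to be; the cookie name is invented for illustration:

```python
# Sketch of cookie-based personalization, the kind generally considered OK.
from flask import Flask, request

app = Flask(__name__)

@app.route("/welcome")
def welcome():
    # "returning_visitor" is a hypothetical cookie set on an earlier visit.
    if request.cookies.get("returning_visitor"):
        return "<h1>Welcome back! Here are your saved items.</h1>"
    return "<h1>Welcome! Take a look around.</h1>"
```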
But if you serve up content or redirect visitors based on their user-agent or IP address in an attempt to deceive, or to hide optimized content, that could be considered website cloaking. As I said, that could be against search engine guidelines. Here are Google’s guidelines on cloaking, a practice Google strictly prohibits.
So while John Mueller was joking with DejanSEO when he suggested redirecting (cloaking) certain visitors (in this case, a crawler), I actually don’t recommend that at all. Frankly, I would identify the crawler and simply block it from visiting the website; a minimal sketch of that approach follows below. If you don’t have the technical expertise to do that, services such as CloudFlare can help stop rogue bots and crawlers that crawl your website too quickly or request too many pages.
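Here’s a minimal sketch of the “just block them” approach, again in Python/Flask for illustration. “BadBot” is a placeholder signature; in practice you’d first identify the offending crawler’s user-agent from your server logs:

```python
# Sketch of blocking a rogue crawler outright instead of cloaking.
from flask import Flask, request, abort

app = Flask(__name__)

BLOCKED_AGENTS = ("BadBot",)  # hypothetical rogue crawler signature

@app.before_request
def block_rogue_crawlers():
    user_agent = request.headers.get("User-Agent", "")
    if any(bot in user_agent for bot in BLOCKED_AGENTS):
        abort(403)  # refuse the request with a 403 Forbidden
```

The design point is the difference from cloaking: every visitor matching the block list gets an explicit refusal rather than alternate content, so there’s no deception involved.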
Join the discussion over on Twitter.
Update: DejanSEO told me, in a Tweet, that John Mueller “wasn’t talking about visitors, website or traffic. He meant I literally cover my walking robot from the video with a cloak to stop it from attacking my green robot.”