Is this Anti-Scraping technique viable with Crawl-Delay?

Posted by skibulk on Pro Webmasters
Published on 2012-10-17T23:06:19Z

I want to prevent web scrapers from abusing the 1,000,000 pages on my website. I'd like to do this by returning a "503 Service Unavailable" status code to users who access an abnormal number of pages per minute.
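For illustration, here is a minimal sketch of the kind of per-IP throttle described above, assuming a plain WSGI setup; the 30-pages-per-minute threshold and the helper names are hypothetical, not taken from the post.

import time
from collections import defaultdict, deque

PAGES_PER_MINUTE_LIMIT = 30   # hypothetical 503 threshold
WINDOW_SECONDS = 60

_hits = defaultdict(deque)    # client IP -> timestamps of recent requests

def is_rate_limited(client_ip):
    """Return True if this IP exceeded the pages-per-minute limit."""
    now = time.time()
    window = _hits[client_ip]
    # Drop hits that fell out of the sliding one-minute window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    window.append(now)
    return len(window) > PAGES_PER_MINUTE_LIMIT

def application(environ, start_response):
    """Plain WSGI app: serve normally, or answer 503 when throttled."""
    ip = environ.get("REMOTE_ADDR", "unknown")
    if is_rate_limited(ip):
        start_response("503 Service Unavailable",
                       [("Content-Type", "text/plain"), ("Retry-After", "60")])
        return [b"Too many requests, please slow down."]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Page content here."]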

I don't want search engine spiders to ever receive the error. My inclination is to set a robots.txt Crawl-delay that keeps compliant spiders below my 503 pages-per-minute threshold, as sketched below.
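For example, a robots.txt entry like the following (the 10-second value is just an assumption for illustration) would, for crawlers that honor the directive as a per-request delay, cap them at roughly 6 requests per minute, well under a 503 threshold of, say, 30 pages per minute:

User-agent: *
Crawl-delay: 10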

Is this an appropriate solution? Do all major search engines support the directive? Could it negatively affect SEO? Are there any other solutions or recommendations?

