How to create robots.txt for a domain that contains international websites in subfolders?

Posted by aaandre on Server Fault
Published on 2010-05-13T18:44:02Z

Hi, I am working on a site with the following structure:

site.com/us - US version
site.com/uk - UK version
site.com/jp - Japanese version

etc.

I would like to create a robots.txt that points each local search engine to a localized sitemap page and has it exclude everything else from the local listings.

So, google.com (US) will index ONLY site.com/us and take into consideration site.com/us/sitemap.html.

google.co.uk will index only site.com/uk and site.com/uk/sitemap.html.

The same goes for the rest of the search engines, including Yahoo, Bing, etc.

Any idea on how to achieve this?
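As a starting point, here is a minimal sketch of what the single robots.txt at site.com/robots.txt could look like. Note the caveats: robots.txt cannot serve different rules per country or per Google TLD, so it cannot by itself restrict google.co.uk to /uk; the closest it offers is declaring one sitemap per regional subfolder. The sitemap.xml paths below are assumptions (the Sitemap directive expects machine-readable XML sitemaps, not HTML sitemap pages like the ones mentioned above):

```
# Served at https://site.com/robots.txt (one file per host).
# Caveat: robots.txt is site-wide; it cannot vary rules by the
# searcher's country. Regional targeting is normally done with
# per-region sitemaps plus webmaster-tools geotargeting instead.

User-agent: *
Allow: /

# One XML sitemap per regional subfolder (paths are hypothetical):
Sitemap: https://site.com/us/sitemap.xml
Sitemap: https://site.com/uk/sitemap.xml
Sitemap: https://site.com/jp/sitemap.xml
```

Under these assumptions, which regional section a given local Google index favors would then be controlled in each engine's webmaster tools rather than in robots.txt itself.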

Thank you!

© Server Fault or respective owner
