Thursday, February 21, 2013

What is Robots.txt and Why is it Used?



A robots.txt file is used to prevent the indexing of duplicate content on a website, or to apply a "no follow" rule to an area of a website. Martijn Koster invented this file in the early 1990s, and its use grew steadily as the well-known search engines developed.
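For example, a minimal robots.txt placed at the root of a site might look like this (the Disallow paths below are hypothetical placeholders, not recommendations for any particular site):

    User-agent: *
    Disallow: /print/
    Disallow: /search/

The User-agent line names which robots the rules apply to (* means all of them), and each Disallow line lists a path prefix those robots should not crawl.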

Why is it Used?


If a webmaster or the owner of a website wants to direct web robots, he must create a robots.txt file and provide exact directives for the robots to read before they access other files on the site. If the file is not present, web robots simply conclude that no explicit instructions exist for the site. Also, if a website has more than one subdomain, a separate robots.txt file must be used for each subdomain, as shown below.
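For instance, each host serves its own file from its root. Using example.com as a placeholder domain:

    http://www.example.com/robots.txt
    http://blog.example.com/robots.txt
    http://shop.example.com/robots.txt

Rules in any one of these files have no effect on the other hosts.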

Here is a standard robots.txt file we use on the majority of our clients' Magento websites – Magento robots.txt
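As a rough sketch of what such a file typically contains (these Disallow paths are common Magento 1 conventions given purely as an illustration, not necessarily the contents of the file linked above):

    User-agent: *
    Disallow: /app/
    Disallow: /downloader/
    Disallow: /errors/
    Disallow: /includes/
    Disallow: /var/
    Disallow: /checkout/
    Disallow: /customer/
    Disallow: /catalogsearch/

The first group keeps crawlers out of Magento's internal application directories, and the last few block session-specific pages such as checkout and on-site search results, which are classic sources of duplicate content.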
