robots.txt

robots.txt is a rather obscure file that would normally only be fully appreciated by deep geeks.
In 2007, however, robots.txt was elevated to contemporary relevance through the introduction of the Sitemaps protocol.
Historically, robots.txt was used to inform search engine spiders which website files and folders they may not crawl and index. Web developers could then hide website content from the search engines, and consequently from the internet public. This would apply to back office data or other information stored in the web space that is not intended for general public exposure.
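As an illustration, a minimal robots.txt that keeps all spiders out of a couple of back office folders might look like the sketch below (the folder names are purely hypothetical):

    User-agent: *
    Disallow: /backoffice/
    Disallow: /internal-reports/

The asterisk after User-agent means the rules apply to every spider; each Disallow line names a path that compliant crawlers will leave alone.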
In April 2007, the Sitemaps protocol was extended with a directive that allows robots.txt to also point to the location of the website’s XML Sitemap.
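Building on the hypothetical file above, the same robots.txt could therefore also declare where the XML Sitemap lives via the Sitemap directive (the URL here is a placeholder):

    User-agent: *
    Disallow: /backoffice/
    Sitemap: http://www.example.com/sitemap.xml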
robots.txt is always located in the root of the website, e.g. http://<your website>/robots.txt