When a search engine crawls (visits) your website, the first thing it looks for is your robots.txt file. This file tells search engines which parts of your site they should and should not index (save and make available to the public as search results). It may also indicate the location of your XML sitemap.
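
For example, a minimal robots.txt might contain only a Sitemap line pointing crawlers to your sitemap (yourdomain.com below is a placeholder; use your own domain):

Sitemap: https://yourdomain.com/sitemap.xml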

The robots.txt file belongs in your document root folder; crawlers only look for it at the top level of your site.
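
On many hosting accounts the document root is a folder such as public_html, so the file would sit at a path like the example below and be served from the root of your domain (the username and domain shown are placeholders; yours may differ):

/home/username/public_html/robots.txt  ->  https://yourdomain.com/robots.txt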

You can simply create a blank file and name it robots.txt. This prevents "file not found" errors when crawlers request it and allows all search engines to crawl and index anything they want.
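
If you would rather not leave the file empty, the rules below are equivalent to a blank file: an empty Disallow value blocks nothing, so every crawler may index the whole site.

#Code to allow all search engines!

User-agent: *
Disallow: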

If you want to stop all search engines from indexing your site, use this code:

#Code to block all search engines from the entire site!

User-agent: *
Disallow: /
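
You can also block crawlers from just part of a site. For example, the rules below (the /private/ folder name is only an illustration) hide a single folder while leaving the rest of the site open to indexing:

#Code to block a single folder only!

User-agent: *
Disallow: /private/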
