Search engines are remarkably good at finding content on the web; sometimes they are a little too good! When running a WordPress site you might assume you'd want people to be able to find everything on it, but that isn't always the case. The truth is there are a lot of core files working away behind the scenes to make your blog run the way it does, and when they contain sensitive data, the last thing you want is for them to show up on the first page of Google!
Robots.txt is a file designed to tell search engines like Google which areas of your site they shouldn't crawl. You simply create a text file with the proper formatting and add it to your site's root directory, and compliant crawlers will then skip the content you don't want them to explore. Below is a sample robots.txt file that I've created based on my research into the latest WordPress security and SEO advice.
User-Agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /wp-includes/
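If you'd like to sanity-check which URLs these rules actually block before going live, Python's standard-library urllib.robotparser can evaluate them locally. This is just a quick sketch; example.com is a placeholder domain standing in for your own site.

```python
from urllib.robotparser import RobotFileParser

# The same sample rules as above, parsed locally (no web request needed)
rules = """\
User-Agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /wp-includes/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Disallowed areas come back False; ordinary post URLs come back True
print(parser.can_fetch("*", "https://example.com/wp-admin/"))      # False
print(parser.can_fetch("*", "https://example.com/2024/my-post/"))  # True
```

This only tells you what well-behaved crawlers should do with the file; it is not a guarantee, since robots.txt is advisory rather than an access control.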
To use the example above, open a plain text editor such as Notepad and copy and paste in the code. When you're done, save the file as robots.txt and upload it to the root (home) directory of your WordPress installation.
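If you prefer to script the save step, here is a minimal Python sketch. The write_robots helper is my own illustration, and writing to the current directory is just for demonstration; in practice you would point it at your actual WordPress root (commonly something like /var/www/html, though your host may differ).

```python
from pathlib import Path

# The sample rules from this post
RULES = """User-Agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /wp-includes/
"""

def write_robots(root: Path) -> Path:
    """Write robots.txt into the given WordPress root directory."""
    target = root / "robots.txt"
    target.write_text(RULES)
    return target

# Demonstration only: writes into the current directory.
# Replace Path(".") with your real WordPress root when deploying.
written = write_robots(Path("."))
print(written.read_text().splitlines()[0])  # User-Agent: *
```

After uploading, you can confirm the file is live by visiting yourdomain.com/robots.txt in a browser; if it doesn't load at the site root, search engines won't see it either.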
Note: The file name must end in .txt rather than .doc or some other extension.