How To Optimize WordPress Robots.txt File For SEO

If you are working on your site’s SEO, knowing how to optimize the WordPress robots.txt file is something you need to keep in mind. Search engines look to this file for instructions before crawling your website, and they also look for your XML sitemap there.


Why Robots.txt?

 

The primary purpose of the robots.txt file is to restrict search engine robots’ access to parts of your website. It is a plain text file that can be created and edited in almost any text editor.

The robots.txt file resides in your site’s root folder. To view or edit it, connect to your site using an FTP client or the cPanel file manager.

If you do not have a robots.txt file in your site’s root directory, you can always create one. All you need to do is create a new text file on your computer, save it as robots.txt (the name must be lowercase), and upload it to your site’s root folder.
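Once uploaded, the file should be reachable directly under your domain. As a quick check, assuming your site lives at example.com (a placeholder domain), you could open this URL in a browser and confirm the file loads:

http://example.com/robots.txt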

Let’s see how to optimize the WordPress robots.txt file for SEO.

 

Don’t use robots.txt to hide low-quality content or to stop the indexing of category and date archives. Instead, use WordPress plugins like WordPress SEO Plugin By Yoast and Robots Meta to add noindex and nofollow meta tags. The reason is that robots.txt only controls crawling, not indexing: a page blocked in robots.txt can still end up in search results if other pages link to it, whereas a noindex meta tag reliably keeps it out of the index.
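Such plugins typically work by printing a robots meta tag into the page’s head section. As a rough illustration (the exact output depends on the plugin and its settings), the tag looks like this:

<meta name="robots" content="noindex, follow">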

To make a start, name your file robots.txt and add it to the root directory of your website.

The format of the robots.txt file is actually quite simple. The first line usually names a user agent, which is the name of the search bot you are giving instructions to, for example Googlebot or Bingbot. You can use an asterisk (*) to address all bots.

The next lines contain Allow or Disallow instructions for the search engines, so they know which parts of your site you want them to crawl, and which ones you don’t.

The User-agent line addresses the instructions to specific search bots.

The Allow and Disallow lines specify your restrictions; they tell the bots which directories of the website, including the homepage, they may or may not visit.
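Put together, a minimal robots.txt following this format might look like the sketch below (using directories that also appear in the full example later in this article):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-content/uploads/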

The paths used in these directives are case sensitive, so make sure the path you specify in robots.txt is an exact match for your folder name.
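For example, assuming your site has a folder named /Photos/ (a made-up example), the following rule would block it, but it would not block a folder named /photos/, because the case differs:

Disallow: /Photos/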

Consider disallowing readme.html. The readme.html file reveals which WordPress version you are using to anyone who browses to it, which helps attackers target known vulnerabilities in that version. Disallowing it makes your site a less obvious target for such attacks.

Simply write

Disallow: /readme.html 

You should also disallow the WordPress plugins directory for security reasons.

Simply write

Disallow: /wp-content/plugins/

Consider adding your site’s XML sitemaps to robots.txt for better and faster indexing of your blog posts.
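This is done with a Sitemap line giving the full sitemap URL, one line per sitemap. For example, assuming your sitemap lives at example.com (a placeholder URL):

Sitemap: http://example.com/sitemap.xml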

 

Robots.txt file optimization

 

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /comments/feed/
Disallow: /trackback/
Disallow: /index.php
Disallow: /xmlrpc.php
Disallow: /wp-content/plugins/

User-agent: NinjaBot
Allow: /

User-agent: Mediapartners-Google*
Allow: /

User-agent: Googlebot-Image
Allow: /wp-content/uploads/

User-agent: Adsbot-Google
Allow: /

User-agent: Googlebot-Mobile
Allow: /

Sitemap: http://www.onlineshouter.com/sitemapindex.xml
Sitemap: http://www.onlineshouter.com/sitemap-image
