How To Optimize WordPress Robots.txt File For SEO
If you are working on your site's SEO, you need to know how to optimize the WordPress robots.txt file. Search engines read this file for crawling instructions before visiting your website, and it is also where they look for your XML sitemap.
The robots.txt file usually resides in your site's root folder. To view it, connect to your site with an FTP client or with the cPanel file manager.
If there is no robots.txt file in your site's root directory, you can create one. Simply create a new text file on your computer, save it as robots.txt (the lowercase name is what search engines look for), and upload it to your site's root folder.
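As a minimal sketch, assuming a Unix-like shell on your own computer, you could create the file locally before uploading it with your FTP client or the cPanel file manager:

```shell
# Create a minimal robots.txt locally. Search engines expect the
# lowercase name "robots.txt" in the site root.
cat > robots.txt <<'EOF'
User-agent: *
Allow: /
EOF

# Inspect the file before uploading it to your site's root folder.
cat robots.txt
```

This permissive starting point lets all bots crawl everything; the sections below tighten it.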
Let’s see how to optimize WordPress Robots.txt for SEO
Don't use robots.txt to hide low-quality content or to stop the indexing of category and date archives. Instead, use WordPress plugins such as the WordPress SEO Plugin by Yoast or Robots Meta to add noindex and nofollow meta tags. The reason is that robots.txt only stops search engine bots from crawling pages; it does not guarantee they will be removed from the index, since a blocked page can still be indexed if other sites link to it.
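Plugins such as Yoast SEO generate this tag for you, but as an illustration, the standard robots meta tag they output into a page's head looks like this:

```html
<!-- Tells compliant bots not to index this page and not to follow
     its links. It must appear inside <head>, and the page must be
     crawlable (not blocked in robots.txt) for bots to see it. -->
<meta name="robots" content="noindex, nofollow">
```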
To start, name your file robots.txt and place it in the root directory of your website.
The format of a robots.txt file is quite simple. The first line usually names a user agent, which is the name of the search bot you are trying to communicate with, for example Googlebot or Bingbot. You can use an asterisk (*) to address all bots.
The next lines carry Allow or Disallow instructions for search engines, so they know which parts of your site you want them to crawl and which ones you don't.
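Putting those two parts together, a minimal rule group might look like this (the /wp-admin/ path is just an illustrative example):

```
# Applies to all bots (* is a wildcard)
User-agent: *
# Do not crawl anything under this directory
Disallow: /wp-admin/
```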
User-agent addresses the instructions to a specific search bot.
Allow and Disallow rules specify your restrictions: they control which parts of the website directory, including the homepage, the bots may or may not crawl.
The directives are case sensitive, so make sure the paths you specify in robots.txt exactly match your folder names.
Consider disallowing readme.html. The readme.html file reveals which WordPress version you are running to anyone who browses to it, which helps attackers target known vulnerabilities in that version. Disallowing it makes this kind of reconnaissance harder.
You should also disallow the WordPress plugin directory for the same security reasons.
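The two security-related rules above can be written as follows, assuming a standard WordPress install where plugins live under /wp-content/plugins/:

```
User-agent: *
# Hide the version-revealing readme file
Disallow: /readme.html
# Keep bots out of the plugin directory
Disallow: /wp-content/plugins/
```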
Consider adding your site's XML sitemap to robots.txt for better and faster indexing of your blog posts.
Robots.txt file optimization

Here is a sample optimized robots.txt file putting these rules together:

```
User-agent: *
Disallow: /trackback/
Disallow: /index.php

User-agent: NinjaBot
Allow: /

User-agent: Mediapartners-Google*
Allow: /

User-agent: Adsbot-Google
Allow: /

User-agent: Googlebot-Mobile
Allow: /

Sitemap: http://www.onlineshouter.com/sitemapindex.xml
```