How to Optimize WordPress Robots.txt SEO Tutorial

This tutorial demonstrates how to optimize the WordPress robots.txt file for SEO. Do you need to optimize your robots.txt file but don't know why it matters for WordPress SEO? Many beginners in blogging and WordPress don't know why robots.txt is important for SEO or how to optimize it. Don't worry, I am going to cover everything about the robots.txt file. First we need to understand what a robots.txt file is, in WordPress or on any other website, and why it is important for WordPress SEO.

Basically, robots.txt plays a major role in WordPress SEO. You can think of robots.txt as a medium between your WordPress site and search engines: they communicate through this file. In simple words, this file defines which directories and files should be indexed by search engines and which should be excluded from crawling and indexing.

What is the WordPress robots.txt file?

The robots.txt file tells search engines which folder and file paths are allowed for crawling and indexing and which parts should be excluded. The absence of a robots.txt file in your WordPress directory will not stop search engines from crawling and indexing your site. However, I highly recommend creating one for your WordPress website if it is not already present.

Where is the Robots.txt File Located? How Do You Create One?

If the file already exists, you will find it in the root directory of your site. You can access and modify it through your file manager, via an FTP client, or through cPanel.

If it is not there yet, create a robots.txt file for your site. It's very easy: robots.txt is just a plain .txt file, so you can right-click and create a new text document. After creating the text document, name it robots.txt and upload it to your server's root directory.
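If you are not sure what to put in the new file, a minimal starting point (before you add any site-specific rules) simply allows every bot to crawl everything; an empty Disallow value means nothing is blocked:

User-agent: *
Disallow: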

How to Use the Robots.txt File?

Using the robots.txt file on a website is very simple. Here I will show a basic and general robots.txt file for WordPress.

User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /index.html

The very first line defines which user agents the rules apply to during crawling and indexing. User-agent means the type of search bot, such as Googlebot or Bingbot. Here we used an asterisk, which means the rules apply to all bots: every crawler is instructed to crawl and index the site according to the lines that follow.
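If you wanted rules that apply to one particular bot only, you could name it in the User-agent line instead of using the asterisk. A quick illustration (the folder path here is just a placeholder, not part of the example above):

User-agent: Googlebot
Disallow: /example-private-folder/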

The next lines follow with Allow and Disallow instructions for the search bots. The Disallow rule tells them which parts of your website should be excluded from crawling and indexing, while Allow tells them which parts you do want indexed. In the example above, we don't want bots to index the plugins directory, which lives under the wp-content folder of your WordPress site, and we also don't want them to crawl or index the index.html file in the root directory. The second line, however, instructs bots that we do want the uploads folder and all of its files crawled and indexed.
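Note that major crawlers such as Googlebot and Bingbot also understand simple wildcard patterns in these rules (* matches any sequence of characters and $ marks the end of a URL), which the full example later in this post relies on. The paths below are only illustrative:

Disallow: /*?replytocom=
Disallow: /*.pdf$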

Optimize WordPress Robots.txt for SEO

In its webmaster guidelines, Google advises webmasters not to use robots.txt to hide low-quality content or keyword stuffing. The main purpose of robots.txt is to tell bots which parts of your website you want indexed in search engines. For WordPress there are also many free plugins available that let you add noindex and nofollow to your archive pages, so you can use one of these plugins instead.
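For reference, such plugins typically do this by adding a robots meta tag to the HTML head of the pages you mark; the exact output depends on the plugin, but it usually looks something like this:

<meta name="robots" content="noindex, follow">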

But remember that you do not need to block pages like login and registration, because WordPress adds a noindex tag to these pages by default. We do, however, recommend disallowing the readme.html file via the robots.txt file of your WordPress site, for a security reason: if this page is crawled and indexed, anyone can easily find out which version of WordPress your site is running and try malicious queries against it. Adding a Disallow rule for it helps protect your site from this kind of attack.
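Based on this advice, a single extra line in your robots.txt file is enough to keep crawlers away from that file:

Disallow: /readme.html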

Adding Your XML Sitemap to the Robots.txt File

Nowadays there are lots of free SEO plugins for WordPress that automatically generate an XML sitemap for your website's pages and posts and add it to your robots.txt file. But if you create the sitemap manually, just add its full path at the end of the robots.txt file, like below.

Sitemap: http://click4knowledge.com/sitemap.xml

Below is the robots.txt file that we are using for click4knowledge, as an example of an optimized WordPress robots.txt file.

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /comments/feed/
Disallow: /trackback/
Disallow: /index.php
Disallow: /xmlrpc.php
Disallow: /wp-content/plugins/
Disallow: /comment-subscriptions/?*

User-agent: Mediapartners-Google*
Allow: /

Sitemap: http://click4knowledge.com/sitemap.xml

I hope this tutorial helps you understand the basics of the WordPress robots.txt file and how to optimize it for SEO. Shares and likes are appreciated and help us improve.
