How to Optimize Your WordPress Robots.txt for SEO

In this post we will learn how to optimize the WordPress robots.txt file for SEO.

That is, how to optimize the robots.txt file so that your blog's SEO becomes even better.

With the help of the robots.txt file, search engine robots know which pages and parts of our site should be crawled and which should not.

Whenever we create a WordPress site, WordPress generates a robots.txt file for it by default, but it is not a perfect robots.txt file.

We optimize this file ourselves and create a perfect robots.txt file, because it is very important for our blog's SEO.

In this post, we will learn to create an ideal robots.txt file for SEO, but before that, let us understand a little about what this robots.txt file is.

What is a Robots.txt File?

robots.txt is the name of a file containing text directives that tell search engines which parts of our site they should crawl and which parts they should not.

Every search engine sets a crawl budget for our site. For example, if you have 10 posts on your blog, the search engine may budget to crawl about 10 URLs.

The next time a search engine crawler comes to crawl your site and you have not created a perfect robots.txt file, it may use up those 10 URLs on plugin files or other useless things on your site, and then it cannot even crawl your useful posts.

And when useful posts are not crawled, they get deindexed and your site suffers a big loss of traffic.

Therefore, by creating a perfect robots.txt file, we tell search engines not to crawl the useless parts of our site that do not need to be crawled, and to crawl only the useful posts.

When is a Robots.txt File Required?

Even if you do not create a robots.txt file, all search engines will still crawl your blog, but then you will not be able to tell them which parts of your site to crawl and which parts not to crawl.

If you have just created your blog and it does not have many pages yet, it can get by without a robots.txt file; but as it grows, use this file to set your crawl budget.

With the help of the robots.txt file, we specify which pages on our site should be crawled, so that our crawl budget is spent on the useful pages.

Without a robots.txt file on our site, search engine crawlers cannot crawl it in one go and keep coming back for it, and in the process our site's loading speed can also become slow.

That is why, by creating a robots.txt file on our site, we tell the crawlers to crawl only the useful pages and skip the rest, so that the entire site gets crawled in one go.

What is a Perfect Robots.txt File?

WordPress creates a simple robots.txt file by default.
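On most sites this default file looks something like the following; the exact lines can vary slightly with your WordPress version and settings.

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php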

This file lets search engine bots crawl almost your entire website, including plugin files, but we will optimize it to disallow some parts of the site from being crawled in order to save our crawl budget.

A perfect, or ideal, robots.txt file can look like the following.

Perfect robots.txt
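Here is a sketch of such a file. The domain example.com is a placeholder for your own domain, and /refer/ stands for whatever path your site uses for cloaked affiliate links.

User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: https://example.com/post-sitemap.xml
Sitemap: https://example.com/page-sitemap.xml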

This file tells search engine robots to crawl and index WordPress images and uploaded files, while preventing the WordPress admin area, the readme file, and affiliate links from being crawled and indexed. It should be this way so that our crawl budget is saved for useful posts.

We have also added the sitemap link to this robots.txt file so that search engine bots can easily find all the pages of your site, crawl them, and index them faster.

Some new bloggers feel that once they write and publish a post, submit its link in Search Console, and get it indexed, their work is done.

But that is not so. The post does get indexed at that time, but when the search engine bots come to your site again and cannot find that page, it gets deindexed again.

That is why we need to create an ideal robots.txt file and it is necessary to add the sitemap URL to it.

How to Create a Robots.txt File in WordPress?

If you have not yet created a robots.txt file for your WordPress blog, WordPress may have already created a simple one for it by default.

But now we will create a perfect robots.txt file ourselves with the help of the Yoast SEO plugin. First of all, log in to your WordPress admin panel.

If you have not installed the Yoast SEO plugin yet, go to the Plugins section and install it.

Then, in your WordPress admin panel, an SEO icon will appear on the bottom left side. Click on it, then click on the Tools option, and then click on the File Editor link on the right side. (see picture below)

create robots.txt

As soon as you click on the File Editor link, a Create robots.txt button will appear in front of you.

Now click on the Create robots.txt button, and a default robots.txt file will appear in front of you. (see picture below)

Create robots.txt on Yoast

Now we will not save this default robots.txt file as it is; we will first modify it and then save it.

To make the changes, delete all the text written inside this box and type the text given below into the box.

Perfect WordPress robots.txt
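Here is the same sketch again so you can copy it; as before, example.com and /refer/ are placeholders for your own domain and affiliate-link path, and the two Sitemap lines must point to your own sitemaps.

User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

# Replace these with the sitemap URLs of your own posts and pages
Sitemap: https://example.com/post-sitemap.xml
Sitemap: https://example.com/page-sitemap.xml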

Keep in mind that you must delete the two sitemap URLs in that file and enter the sitemap URLs of your own posts and pages; if you have not created a sitemap yet, read this guide to create one: XML Sitemap Kaise Banaye (How to Create an XML Sitemap).

Then, to save this robots.txt file, click on the Save changes to robots.txt button below.

Now that you have created a perfect robots.txt file for your site, let us test it.

Testing the Robots.txt File

To test the robots.txt file, open Google Search Console's Robots Testing Tool in your browser.

Now click on the "Please select a property" button and choose the property whose robots.txt file you want to test. (see picture below)

test robots.txt

After selecting the property, you will see the robots.txt file of your site that you just created.

Another way to view your robots.txt file live is to type your domain name in your browser followed by /robots.txt, for example example.com/robots.txt.

By doing this you can check the robots.txt file of any site.
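If you prefer to check the rules programmatically, here is a minimal sketch using Python's built-in urllib.robotparser module; example.com and the sample URLs are placeholders for your own site.

from urllib.robotparser import RobotFileParser

# Point the parser at your live robots.txt (example.com is a placeholder)
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # download and parse the file

# "*" means any crawler; use "Googlebot" to test Google's rules specifically
print(rp.can_fetch("*", "https://example.com/wp-admin/"))        # False if the admin area is disallowed
print(rp.can_fetch("*", "https://example.com/my-useful-post/"))  # True if posts are crawlable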

Read this also:
How to do Image SEO Optimization

Can you take blogging as a career

How To Get Your Website In Google Discover Feed

How To Get Traffic To a New Website

And Finally

We optimize the robots.txt file to prevent crawling and indexing of things we do not want to be made public, such as pages in the plugin folder or the WordPress admin folder.

Some people also disallow categories and tags in their robots.txt file, but this is not appropriate; there are other solutions for that.

We hope that by reading this post, How to Optimize WordPress Robots.txt for SEO, you have optimized your WordPress robots.txt file to improve your SEO.

If you still have any query related to this post, or you want to share your thoughts, do write in the comment box below.
