How To Add Custom Robots.txt File in Blogger
Robots.txt is a simple text file that sits in the root directory of your site. It tells "robots" (such as search engine spiders) which pages to crawl on your site and which pages to ignore.

While not essential, the robots.txt file gives you a lot of control over how Google and other search engines see your site. Better SEO also helps you get organic traffic from search engines, and one easy step towards better SEO is adding a robots.txt file to your blog.

What Is a Robots.txt File?

Robots.txt is a text file containing a few lines of simple code. It is saved on the website or blog's server and instructs web crawlers on how to crawl and index your blog in the search results.
That means you can restrict any web page on your blog from web crawlers so that it doesn't get indexed by search engines, such as your blog's labels page, a demo page, or any other page that is not important to have indexed.
Always remember that search crawlers scan the robots.txt file before crawling any web page. It tells search engine crawlers which pages to crawl and which not to. In Blogger (Blogspot) we have the option to customize the robots.txt file according to our needs.
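Polite crawlers perform exactly this check before fetching a page. As a sketch of that behavior, Python's standard-library urllib.robotparser can evaluate a set of rules; the rules, bot name, and URLs below are hypothetical:

```python
from urllib import robotparser

# Hypothetical rules a crawler might find at example.com/robots.txt
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)  # a real crawler would use rp.set_url(...) and rp.read()

# The crawler consults the rules before requesting each page
print(rp.can_fetch("MyBot", "https://example.com/about.html"))          # True
print(rp.can_fetch("MyBot", "https://example.com/private/notes.html"))  # False
```

Any user agent matches the `*` entry here, so the only thing that decides the answer is whether the page's path falls under a Disallow rule.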

Uses of Robots.txt

Robots.txt is not an essential document for a website. Your site can rank and grow perfectly well without this file. However, using robots.txt does offer some benefits:

Discourage bots from crawling private folders 

Although not perfect, disallowing bots from crawling private folders makes those folders much harder to index – at least by legitimate bots (such as search engine spiders).
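Such a rule is a single Disallow line; the folder name here is just an illustration:

```
User-agent: *
Disallow: /private/
```

Keep in mind this is a request, not access control: anything truly sensitive should be protected with authentication, since robots.txt itself is public and malicious bots can ignore it.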

Control resource usage 

Every time a bot crawls your site, it drains your bandwidth and server resources – resources that would be better spent on real visitors. For sites with a lot of content, this can escalate costs and give real visitors a poor experience. You can use Robots.txt to block access to scripts, unimportant images, etc. to conserve resources.
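As a sketch, such resource-saving rules might look like this (the paths are hypothetical; note that blocking CSS or JavaScript that a page needs to render properly can hurt how Google evaluates that page):

```
User-agent: *
Disallow: /scripts/
Disallow: /images/decorative/
```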

Prioritize important pages 

You want search engine spiders to crawl the important pages on your site (like content pages), not waste resources digging through useless pages (such as results from search queries). By blocking off such useless pages, you can prioritize which pages bots focus on.
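Major crawlers such as Googlebot also support * wildcards in robots.txt paths, so internal search-result URLs can be blocked by pattern; the query parameter here is hypothetical:

```
User-agent: *
Disallow: /*?q=
```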

Steps for Adding Robots.txt to Blogspot Blogs:

To add a custom robots.txt file to your blog, follow the basic steps below:

  • Go to your Blogger blog.
  • Navigate to Settings ›› Search Preferences
  • Crawlers and indexing ›› Custom robots.txt ›› Edit ›› Yes
  • Now paste the robots.txt code below into the box.
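A typical custom robots.txt for a Blogger blog looks like the following; https://www.example.com is a placeholder for your own blog's address:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```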

Note: Replace the URL in the code with your own blog's URL.

What Is the Meaning of the Above Code?

This code is divided into three sections. Let's study each of them, and then learn how to add a custom robots.txt file to Blogspot blogs.
User-agent: Mediapartners-Google - This rule is for the Google AdSense robot, which helps it serve better ads on your blog. Whether or not you use Google AdSense on your blog, simply leave it as it is.
User-agent: * - This applies to all robots, marked with an asterisk (*). In the default settings, our blog's label links are restricted from being indexed by search crawlers, which means the web crawlers will not index our label page links, because of the code below.
Disallow: /search - This means that links having the keyword "search" just after the domain name will be ignored. For example, the link of a label page named SEO takes the form yourblog.blogspot.com/search/label/SEO.
And if we remove Disallow: /search from the above code, then crawlers will access our entire blog to index and crawl all of its content and web pages.
Here, Allow: / refers to the homepage, which means web crawlers can crawl and index our blog's homepage.
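The effect of these three sections can be checked with Python's standard-library urllib.robotparser; the blog address below is a placeholder:

```python
from urllib import robotparser

rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The AdSense bot may fetch everything, including label pages
print(rp.can_fetch("Mediapartners-Google",
                   "https://yourblog.blogspot.com/search/label/SEO"))  # True
# Other crawlers are kept out of /search but may crawl the homepage
print(rp.can_fetch("Googlebot",
                   "https://yourblog.blogspot.com/search/label/SEO"))  # False
print(rp.can_fetch("Googlebot",
                   "https://yourblog.blogspot.com/"))                  # True
```

The empty Disallow: under Mediapartners-Google means "nothing is disallowed" for that bot, which is why it can still reach label pages that other crawlers cannot.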