I have already written about Blogger's custom robots header tags. Custom robots header tags are predefined tags that tell crawlers what they may crawl and index on each page. If you want to customize crawling fully, though, robots.txt is what helps you, and that is what I am going to discuss today.
What is Robots.txt?
Generally, a robots.txt file is located in the root directory of a blog. Search engines scan that robots.txt file frequently; if the spiders/crawlers find new pages or posts that you allow, they index them in the search engine. In Blogger you can't go into the root directory of your blog, but you can still reach your robots.txt file by adding /robots.txt to your blog's address. To see your robots.txt, simply go to "yourdomain.blogspot.com/robots.txt".
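For reference, the default robots.txt that Blogger generates for a new blog usually looks something like the lines below ("yourdomain" is just a placeholder here, and your own copy may differ slightly):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yourdomain.blogspot.com/sitemap.xml

Each of these directives is explained in the next section.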
How To Create a Robots.txt File?
There are basically four directives in a robots.txt file: 'User-agent', 'Allow', 'Disallow' and 'Sitemap'. (User-agent: *) addresses all kinds of crawlers or robots; the asterisk (*) means the rules that follow apply to every robot/crawler.
(Disallow) with an empty value is the same as 'Allow: /', but it is not the same as 'Disallow: /'. If 'Disallow:' has no value, you are allowing everything to be crawled and indexed, while 'Disallow: /' tells crawlers not to crawl or index anything.
(Allow: /) means web crawlers can crawl and index our blog's homepage.
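To make the difference concrete, here is a minimal sketch (only an illustration, not something you need to paste into your blog): the first group lets crawlers fetch everything, while the second blocks the entire blog.

# everything may be crawled and indexed
User-agent: *
Disallow:

# nothing may be crawled or indexed
User-agent: *
Disallow: /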
(Sitemap) With the default sitemap, robots crawl only the 25 most recent posts. To increase that number, replace the default sitemap with the one below; it covers the first 500 posts.
--> Sitemap: http://yourdomain.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
If you have more than 500 posts, then use two sitemaps like below:
--> Sitemap: http://yourdomain.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://yourdomain.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
--> Also read: how to set custom robots header tags in Blogger.
Add Custom Robots.Txt to Blogger
To add a custom robots.txt, go to your Blogger blog -> Settings -> Search Preferences -> Crawlers and indexing -> Custom robots.txt -> Edit -> Yes -> paste your robots.txt code -> Save changes.
Best robots.txt for Blogger:
User-agent: Mediapartners-Google   # Google AdSense crawler
Disallow:                          # allowed to crawl every page

User-agent: *                      # all other crawlers
Disallow: /search                  # block label/search-result pages (avoids duplicate content)
Allow: /                           # everything else may be crawled

Sitemap: http://yourdomain.blogspot.com/sitemap.xml
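If you prefer the expanded post-feed sitemap from the section above (so crawlers are pointed at up to 500 posts instead of the default 25), the same file could look roughly like this; yourdomain.blogspot.com is again a placeholder for your own address:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yourdomain.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500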
That's all for today's post. If you have any questions, do comment below. Do share and subscribe to support me. Happy blogging!