Sunday 16 June 2013

HOW TO SET UP CUSTOM ROBOTS.TXT IN BLOGGER | Crawlers and Indexing


HOW TO SET UP CUSTOM ROBOTS.TXT IN BLOGGER FOR BETTER SEO

In this tutorial I will show you how to set up a custom robots.txt in Blogger for better SEO.

What is Robots.txt?

It is a text file that tells search engines which parts of your blog or website you want them to index and which parts you don't. Search engines are not strictly required to follow your robots.txt, but in practice the major crawlers respect the rules it sets out.
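To see how a crawler applies these rules, here is a small illustrative sketch using Python's standard urllib.robotparser module. The rules are simplified prefix rules (the standard-library parser does not understand * wildcards), and example.blogspot.com is just a placeholder blog address.

# Illustrative only: how a crawler applies Disallow/Allow rules, using
# Python's standard urllib.robotparser.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Search result pages are blocked, ordinary post pages are allowed.
print(rp.can_fetch("*", "http://example.blogspot.com/search/label/seo"))       # False
print(rp.can_fetch("*", "http://example.blogspot.com/2013/06/some-post.html")) # True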

STEPS TO SET UP A CUSTOM ROBOTS.TXT FOR YOUR BLOG


1. Go to the Blogger Dashboard and click on Settings.

2. Then click on Search preferences.

3. In the Crawlers and indexing section you will see Custom robots.txt.

4. Click on the Edit link next to it and select Yes.

5. A text area will appear. Paste the code below into that text area:

User-agent: Mediapartners-Google
Disallow: 
User-agent: *
Disallow: /search?q=*
Disallow: /*?updated-max=*
Allow: /

Sitemap: http://www.chillofyblogging.blogspot.com/sitemap.xml
Just remember to change the Sitemap line to point to your own blog. All you need to do is replace my blog URL with yours, followed by /sitemap.xml. For example, if your blog URL is www.xyz.blogspot.com, the Sitemap line becomes www.xyz.blogspot.com/sitemap.xml. Blogger generates this sitemap automatically, so you don't have to create one yourself; you only have to append /sitemap.xml to your blog URL.
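If you want to confirm that everything is in place after saving, a quick sketch like the one below fetches both files and prints their status. It uses only Python's standard library, and www.xyz.blogspot.com is again just a placeholder for your own blog address.

from urllib import request

BLOG = "http://www.xyz.blogspot.com"  # placeholder: replace with your own blog URL

# Fetch robots.txt and sitemap.xml and report the HTTP status and size of each,
# just to confirm that Blogger is serving both files for your blog.
for path in ("/robots.txt", "/sitemap.xml"):
    with request.urlopen(BLOG + path) as resp:
        body = resp.read()
        print(path, "->", resp.status, "-", len(body), "bytes")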
