The Robots.txt File

The robots.txt file is one of the most important things to check when it comes to on-site SEO.

I recently came across a website whose homepage (the most important page of any website) was blocked by mistake. That is a bit like inviting someone to a party and then shutting the front door when they arrive.

So check right now whether your site is blocked to search engine spiders.

Go to your browser and type http://www.example.com/robots.txt (replacing example.com with your own domain).

See if you have a file that says:

User-agent: *
Disallow: /

If it looks like the above, you are in trouble.

User-agent: * means the rules apply to all robots.

Disallow: / instructs robots not to visit any page on the site.
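
If you prefer to check from a script rather than the browser, here is a minimal sketch using Python's built-in urllib.robotparser module. The example.com URLs are placeholders for your own domain, and Googlebot is just one crawler name you might test.

from urllib.robotparser import RobotFileParser

# Point the parser at your own robots.txt (example.com is a placeholder)
parser = RobotFileParser("http://www.example.com/robots.txt")
parser.read()

# Check whether Google's crawler is allowed to fetch the homepage
if parser.can_fetch("Googlebot", "http://www.example.com/"):
    print("Homepage is open to Googlebot")
else:
    print("Homepage is BLOCKED - fix your robots.txt")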

However, not all robots obey robots.txt; some ignore it completely, but Google usually follows it strictly. It is safe to use the directives below, which allow all robots to visit your website.

User-agent: *
Disallow:

You can also use robots.txt to keep robots away from pages such as login pages so that they are not crawled or indexed; a small example is shown below. To find out more about how to use robots.txt, visit www.robotstxt.org/robotstxt.html
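
For example, a file like this (the /login/ path is just an illustration) keeps robots out of a login area while leaving the rest of the site open:

User-agent: *
Disallow: /login/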

Farhan Fawzer is a Sri Lankan SEO Specialist and Online Marketing Consultant.