
Wednesday, October 13, 2010

Robots.txt for SEO

What is Robots.txt?

For those unfamiliar with robots.txt, it is a plain text file that should sit in the root of your website's directory. The robots.txt file controls what the search engines can look at and index throughout your website.
Simply put, a robots.txt file is a set of instructions that tells search engine robots which pages of your site to crawl and index. These bots are automated, and before they access any section of a site, they check whether a robots.txt file exists that prevents them from indexing certain pages. In many cases, your site contains files or folders, such as admin folders, cgi-bin, image folders, and Flash content, that are not relevant to the search engines.
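For instance, a robots.txt file that hides those kinds of private folders could look like the following (the folder names here are just illustrative examples, not paths from any real site):

```
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /flash/
```

Every path listed under a Disallow line is off-limits to the robots named in the User-agent line; everything else remains crawlable.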

Benefits of a robots.txt file
Many site owners who don't have much knowledge of SEO avoid using a robots.txt file, but its main benefit is improved site indexation: it tells search engine crawlers to index only your content pages and to ignore the rest.

Reasons for using a robots.txt file

  • To keep information you don't want searched or crawled out of the search engines
  • To protect your site from duplicate content issues

How to create a robots.txt file?

The robots.txt file is just a simple text file. To create your own robots.txt file, open a new document in a simple text editor (e.g. Notepad).

The content of a robots.txt file consists of "records", which tell specific search engine robots what to index and what not to access.

Each of these records consists of two fields: the User-agent line (which specifies the robot being addressed) and one or more Disallow lines.
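A record that applies only to Google's crawler, for example, would name Googlebot in the user-agent field (the disallowed path is a hypothetical example):

```
User-agent: Googlebot
Disallow: /private/
```

Other robots ignore this record entirely, because the User-agent line does not match them.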

The URL path (web address) of your robots.txt file should look like this: http://www.yoursite.com/robots.txt

Things to remember

  1. Don't add an "Allow" command. Everything is allowed by default.
  2. Don't add more than one file or directory to a single Disallow line.
  3. Don't put "Disallow" before the User-agent line, as shown in the figure.

A big question
Is the robots.txt file solely part of an SEO strategy, or something much more?

Yes, it is for SEO purposes. You'll generally want all search engines indexing the same content, so using "User-agent: *" is the best strategy.
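With that wildcard user agent, an empty Disallow line permits every robot to crawl the whole site:

```
User-agent: *
Disallow:
```

A Disallow line with no path means "nothing is disallowed", which has the same practical effect as a blank robots.txt file but makes your intent explicit.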

If you do not have a robots.txt file, your server logs will record 404 errors whenever a bot tries to access it. To avoid this, you can upload a blank text file named robots.txt to the root of your site.
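The check that well-behaved bots perform before crawling can be sketched with Python's standard-library robots.txt parser. The rules below are hypothetical examples, not taken from any real site:

```python
# Sketch of how a polite crawler consults robots.txt before fetching
# a page, using Python's standard urllib.robotparser module.
from urllib.robotparser import RobotFileParser

# Example rules a site might serve at /robots.txt (illustrative only).
rules = """User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Content pages are crawlable; the private folders are not.
print(parser.can_fetch("*", "/index.html"))   # True
print(parser.can_fetch("*", "/admin/login"))  # False
```

In real use, the crawler would fetch the live file with `parser.set_url(...)` and `parser.read()` instead of parsing a hard-coded string.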

It's also a good idea to check your robots.txt file regularly. Websites are constantly changing, so make sure your current robots.txt file stays up to date.

So if you haven't created a robots.txt file for your website yet, get going!

