- One thing to do while registering with a search engine is to create a robots.txt file.
- The robots.txt file is an invitation for bots and spiders to crawl your website.
- This file is seen as a type of permission and provides direction for crawlers such as Yahoo!, MSN, and Ask on how to view your website.
- Create a plain-text file in Notepad called robots.txt.
- Add this code:

  User-agent: *
  Disallow:

  Sitemap: http://www.domain.com/sitemap.xml

- User-agent: * applies the rules to all crawlers; an empty Disallow: line permits crawling of the entire site; and the Sitemap: line tells crawlers where to find your sitemap.
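A minimal sketch of how a crawler interprets the rules above, using Python's standard-library urllib.robotparser (the domain is the placeholder from the example, not a real site):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content from the example above.
rules = """User-agent: *
Disallow:

Sitemap: http://www.domain.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# An empty Disallow: means every path is allowed for every user agent.
print(parser.can_fetch("*", "http://www.domain.com/any/page.html"))  # True

# The parser also picks up the Sitemap: line for crawlers to follow.
print(parser.site_maps())  # ['http://www.domain.com/sitemap.xml']
```

In practice a crawler fetches http://yourdomain.com/robots.txt itself and runs exactly this kind of check before requesting any page.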
Speeding It Up: Robots.txt