Using robots.txt

Per the Web Robots Page:

Web Robots (also known as Web Wanderers, Crawlers, or Spiders) are programs that traverse the Web automatically. Search engines such as Google use them to index web content, spammers use them to scan for email addresses, and they have many other uses.

The robots.txt file must be placed in the root of the Base Folder, so that crawlers can retrieve it at /robots.txt.
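As a minimal sketch of what such a file might contain (the paths and sitemap URL below are illustrative assumptions, not required values):

    User-agent: *
    Disallow: /admin/
    Allow: /

    Sitemap: https://example.com/sitemap.xml

Here every crawler is allowed to index the site except the /admin/ path, and the Sitemap line points crawlers to the site's sitemap; adjust the rules to match your own site structure.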

Learn more about using robots.txt.