robots.txt

robots.txt files let you tell search engine crawlers which parts of your site they may and may not crawl (note that robots.txt controls crawling, not indexing).

To add one, create a file named robots.txt in your pages/templates/ directory.
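Assuming the project layout implied by that path (only the directories named above), the file sits at:

pages/
  templates/
    robots.txt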

Here's an example robots.txt that allows all bots to crawl every page and points them at your sitemap (the Sitemap directive should be an absolute URL; replace example.com with your own domain):

User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
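If you want to sanity-check the rules before deploying, one option is to parse them with Python's standard-library urllib.robotparser. This is a minimal sketch using the example rules above, with example.com standing in for your own domain:

from urllib.robotparser import RobotFileParser

# The rules from the example above, parsed locally (no network request).
rules = """
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(rules)

# With "Allow: /" for every user agent, any path is crawlable.
print(parser.can_fetch("*", "/"))              # True
print(parser.can_fetch("Googlebot", "/blog"))  # True

# The parser also records any Sitemap directives it finds.
print(parser.site_maps())                      # ['https://example.com/sitemap.xml']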

See Also