Robots.txt Generator

Create robots.txt files to control how search engines crawl and index your website.

How to Use

1. Select user-agent
   Choose which search engine bots to create rules for, or use '*' for all bots.

2. Add allow/disallow rules
   Specify which paths to allow or block. Use the quick-add buttons for common patterns.

3. Add sitemap and download
   Add your sitemap URL, then download the robots.txt file and upload it to your website's root directory. A sample of the finished file follows these steps.
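
For reference, here is a minimal sketch of the kind of file these steps produce; example.com, /admin/, /tmp/, and the crawl-delay value are all placeholders to replace with your own:

    # Placeholder rules applied to all crawlers
    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/
    Allow: /

    # Placeholder per-bot rule; Crawl-delay is honored by some crawlers
    # (e.g., Bing) but ignored by Google
    User-agent: Bingbot
    Crawl-delay: 10

    Sitemap: https://example.com/sitemap.xml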

Features

  • Multiple user-agents
  • Common patterns library
  • Sitemap support
  • Download file
  • Crawl delay

Frequently Asked Questions

What is robots.txt?
Robots.txt is a text file placed in your website's root directory that tells search engine crawlers which pages or sections they can or cannot access. It helps control how search engines crawl and index your site.
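
To see how a compliant crawler actually applies these rules, the sketch below uses Python's standard-library urllib.robotparser; the example.com URLs are placeholders:

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the site's robots.txt (placeholder domain)
    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # A polite crawler checks each URL against the rules before requesting it;
    # can_fetch returns False for paths disallowed for the given user-agent
    print(rp.can_fetch("*", "https://example.com/admin/page.html"))
    print(rp.can_fetch("*", "https://example.com/index.html"))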

Where should I place the robots.txt file?
The robots.txt file must be placed in your website's root directory. For example, if your domain is example.com, the file should be accessible at example.com/robots.txt.
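
Crawlers only ever request the file at the root path, so a robots.txt placed anywhere else is silently ignored (example.com is a placeholder):

    https://example.com/robots.txt         consulted by crawlers
    https://example.com/blog/robots.txt    never consulted

Note that each subdomain needs its own file: a robots.txt on example.com does not apply to blog.example.com.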

Did You Know?

The robots.txt standard was created in 1994 and is one of the oldest web standards still in use. It's based on the Robots Exclusion Protocol, a voluntary agreement between webmasters and search engines; there's no enforcement mechanism!

  • robots.txt is a suggestion, not a security measure; don't use it to hide sensitive data (see the note after this list)
  • Use 'User-agent: *' to apply rules to all bots
  • Always include your sitemap URL in robots.txt
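
As a quick illustration of the first tip, a Disallow line only asks compliant crawlers to stay away; it does nothing to stop a direct request (the path below is a placeholder):

    User-agent: *
    Disallow: /private/    # compliant crawlers skip this path,
                           # but anyone can still request it directly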
