Free Online Robots.txt Generator

Create a well-structured robots.txt file with HubKit's robots.txt generator. Configure allow and disallow rules for search engine crawlers, AI bots, and social media bots. Add sitemap references, crawl-delay directives, and LLMs.txt links — all from a visual editor that generates the correct syntax automatically. No data is sent to any server.

How to Use

Start with a preset (Allow All, Block All, Block AI Bots, or Standard) or build from scratch. Add user-agent groups and configure Allow/Disallow rules with path suggestions. Set optional crawl-delay values, add sitemap URLs, and specify an LLMs.txt URL. The generated robots.txt updates in real time. Copy or download the file.
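
For example, a file built from the Standard preset with one sitemap added might look like the following (example.com and the blocked paths are placeholders, not the tool's fixed output):

    User-agent: *
    Disallow: /admin/
    Disallow: /cart/
    Allow: /

    Sitemap: https://example.com/sitemap.xml

The User-agent: * group applies to any crawler that has no more specific group of its own, and the Sitemap line can point to any absolute URL.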

Frequently Asked Questions

What is a robots.txt file?

robots.txt is a plain text file placed at the root of a website (e.g., example.com/robots.txt) that tells web crawlers which pages they may and may not access. It follows the Robots Exclusion Protocol, which all major search engines support.
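
For instance, a minimal robots.txt that allows every crawler to access everything is just:

    User-agent: *
    Disallow:

An empty Disallow value blocks nothing, whereas Disallow: / would block the entire site.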

Can robots.txt block AI bots like GPTBot and ChatGPT?

Yes. Major AI crawlers, including GPTBot (OpenAI), Google-Extended, CCBot (Common Crawl), and anthropic-ai, are documented to respect robots.txt directives. Add a User-agent group for each bot with Disallow: / to block it from crawling your site.
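
As a sketch, blocking the crawlers named above (bot tokens can change, so check each vendor's documentation) produces a file like:

    User-agent: GPTBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: anthropic-ai
    Disallow: /

Several User-agent lines can also share a single group, so the four groups above could be collapsed into one group with four User-agent lines followed by a single Disallow: / rule.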

Does robots.txt guarantee pages will not be indexed?

No. robots.txt tells crawlers not to visit pages, but it does not remove pages from search results; a blocked URL can still be indexed if other sites link to it. For de-indexing, use the noindex meta tag or the X-Robots-Tag HTTP header, and leave the page crawlable so bots can fetch it and see the directive. Some bots may also ignore robots.txt entirely.
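
For reference, the two mechanisms look like this. As an HTML meta tag in the page's head:

    <meta name="robots" content="noindex">

Or as an HTTP response header:

    X-Robots-Tag: noindex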

What is the Crawl-delay directive?

Crawl-delay tells bots to wait a specified number of seconds between requests, which helps reduce server load from aggressive crawlers. Note that Googlebot ignores Crawl-delay entirely; Google manages its crawl rate automatically.
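
For example, asking a crawler to wait ten seconds between requests looks like the following (Bingbot is used here because Bing documents support for the directive):

    User-agent: Bingbot
    Crawl-delay: 10

Bots that do not support Crawl-delay simply ignore the line.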