The Robots.txt Generator helps you create simple crawler instructions for search engines. Add a user-agent, allow and disallow paths, and an optional sitemap URL to generate clean robots.txt text.

The tool is designed for straightforward rules and runs entirely in your browser.


How to Use

1. Set User-agent

Use * to apply the rules to all crawlers, or enter a specific crawler name such as Googlebot.

2. Add Allow and Disallow Paths

Enter one path per line, such as / or /admin/.

3. Add Sitemap and Copy

Include your sitemap URL, generate the output, and copy it into your robots.txt file.
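The steps above can be sketched as a small function. This is an illustrative outline of how such a generator might assemble its output, not the tool's actual code; the function name and parameters are assumptions.

```javascript
// Sketch: build robots.txt text from user-agent, path lists, and an
// optional sitemap URL. Names here are illustrative, not the tool's API.
function buildRobotsTxt({ userAgent = "*", allow = [], disallow = [], sitemap = "" }) {
  const lines = [`User-agent: ${userAgent}`];
  for (const path of allow) lines.push(`Allow: ${path}`);     // one Allow line per path
  for (const path of disallow) lines.push(`Disallow: ${path}`); // one Disallow line per path
  if (sitemap) lines.push(`Sitemap: ${sitemap}`);             // Sitemap is optional
  return lines.join("\n") + "\n";
}

console.log(buildRobotsTxt({
  userAgent: "*",
  allow: ["/"],
  disallow: ["/admin/"],
  sitemap: "https://example.com/sitemap.xml",
}));
```

Because generation is pure string assembly like this, everything can run locally in the browser with no server round-trip.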

Why Use This Tool?

Simple crawler rules: Build a clean robots.txt file without memorizing syntax.

Sitemap support: Add a sitemap reference for search engines.

Local generation: No external API or server processing is used.

Practical Example

A site owner can allow crawling of the main site, disallow private admin paths, and reference the XML sitemap URL in a single clean robots.txt output.
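Under those settings, the generated file might look like this (the domain and sitemap path are placeholders):

```text
User-agent: *
Allow: /
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
```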

FAQ

What is robots.txt?
It is a plain-text file that gives crawler instructions for allowed and disallowed paths.
Does robots.txt remove pages from Google?
No. It only controls crawling behavior. To keep a page out of search results, use a noindex directive or other SEO controls instead.
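For example, a page you want excluded from search results could carry a standard robots meta tag (this is ordinary HTML, not something this tool generates):

```text
<meta name="robots" content="noindex">
```

Note that a crawler must be allowed to fetch the page to see this tag, so a path blocked in robots.txt cannot also be reliably noindexed this way.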
Can I add a sitemap?
Yes. Enter a full sitemap URL and the tool adds a Sitemap line.