How to Use This Generator
A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site. It acts as a guide for well-behaved crawlers, not an unbreakable command.
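For example, a minimal robots.txt that blocks every crawler from a single directory (the /private/ path is just an illustration) looks like this:

```
User-agent: *
Disallow: /private/
```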
- Use Presets (Optional): For a quick start, click a preset button like "WordPress" to load a standard configuration.
- Set Sitemap: Enter the full URL to your sitemap. This helps search engines find all your important pages.
- Define Rules:
- User-agent: Specify the bot you're making rules for. Use * for all bots, or be specific (e.g., Googlebot, Bingbot).
- Add Paths: Click "Add Disallow" or "Add Allow" to create a new input field for each path you want to block or permit.
- Copy the Code: The text in the "Live Preview" box updates automatically; a sample of the generated output appears after this list. Once you are satisfied, click the "Copy" button.
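For reference, output for the WordPress preset typically looks something like the sample below. The exact directives depend on the options you choose, and the sitemap URL is a placeholder for your own.

```
# Sample generated file - adjust paths and the sitemap URL to your site
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourwebsite.com/sitemap.xml
```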
How to Implement on Your Website
- Create a new plain text file on your computer.
- Paste the copied code into this file.
- Save the file with the exact name robots.txt.
- Upload this file to the root directory of your website. This is usually the public_html, www, or main domain folder.
- Verify it's working by visiting https://yourwebsite.com/robots.txt. You should see the text you pasted; for a stricter check, see the script after this list.
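If you want to check more than that the file simply loads, the sketch below uses Python's standard urllib.robotparser module to fetch the live file and test whether a specific bot may crawl a specific path. The domain and the /wp-admin/ paths are placeholders taken from the sample above; swap in your own.

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain - replace with your own site.
ROBOTS_URL = "https://yourwebsite.com/robots.txt"

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # downloads and parses the live robots.txt

# Ask whether a given user-agent may fetch a given URL.
# These paths are examples; test paths you actually disallowed or allowed.
print(parser.can_fetch("Googlebot", "https://yourwebsite.com/wp-admin/"))
print(parser.can_fetch("Googlebot", "https://yourwebsite.com/wp-admin/admin-ajax.php"))
```

If the first call prints False and the second prints True, the Disallow and Allow rules from the sample are being served and parsed as intended.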