Format: robots.txt and XML sitemap (sample structure).
Context: You are configuring site crawlability for optimal indexing.
Task: Generate robots.txt and sitemap structure for {{SITE_DOMAIN}}, {{NUM_PAGES}} pages.
Constraints: Block unnecessary directories, allow important pages, provide an XML sitemap with priorities, follow the standard formats.
Do NOT: Block important content, create overly restrictive rules, or miss priority pages.
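A minimal sketch of the kind of output this prompt should produce, using example.com as a placeholder domain and hypothetical directory names (adjust disallowed paths and priorities to the actual site):

```
# robots.txt (served at https://example.com/robots.txt)
User-agent: *
# Block directories that should not be indexed (placeholders)
Disallow: /admin/
Disallow: /tmp/
# Everything else is allowed by default
Sitemap: https://example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage: highest priority -->
  <url>
    <loc>https://example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- Important section page: high but lower priority (placeholder URL) -->
  <url>
    <loc>https://example.com/products/</loc>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Note that `priority` and `changefreq` are hints, not directives; crawlers may ignore them, and sitemaps over 50,000 URLs must be split into multiple files referenced from a sitemap index.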
Variables
Replace these variables with your own values before using:
{{SITE_DOMAIN}}: the domain of the site (e.g., example.com)
{{NUM_PAGES}}: the number of pages the sitemap should cover