The robots.txt file: a source of SEO juice just waiting to be unlocked.
Better Robots.txt:
- Lets you allow or block each of the 16 most popular search engine bots from crawling your website
- Detects the sitemap index (sitemap-index.xml) generated by the Yoast SEO plugin (or any other sitemap generator) and adds it automatically to your robots.txt
- Lets you add your own sitemaps manually
- Lets you edit your robots.txt to add custom rules
- Lets you define a specific crawl-delay for search engine bots (see the example after this list)
- Blocks bad bots from crawling your website
- Blocks most SEO crawlers so that your backlinks remain undetectable
- Boosts your online store while blocking useless WooCommerce links
- Avoids crawler traps that cause crawl-budget issues
- Provides 150+ growth-hacking tools
- …
The Better Robots.txt plugin also adds standard rules to your robots.txt file to maximize search engines' crawl budget by telling them not to crawl the parts of your site that aren't displayed to the public by default.
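As a rough illustration, these standard crawl-budget rules for a WordPress site typically look something like the following (the plugin's actual defaults may differ):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /?s=

These lines keep crawlers out of the admin area and the internal search results while still allowing the admin-ajax.php endpoint that many themes and plugins rely on.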
Want to become familiar with the syntax used in a robots.txt file? See Google's basic robots.txt guidelines.