Features

Better Robots.txt is organized around practical control areas rather than a single raw file editor.

Search engine visibility

Choose between visibility levels or advanced control to decide how broadly you want to expose the site to search crawlers.
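As an illustration only (the exact directives the plugin emits are not documented here), a broad visibility setting could correspond to rules like:

```
# Broad visibility: allow all crawlers and advertise the sitemap
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

A more restrictive level would typically replace the empty Disallow with specific path rules.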

AI & LLM governance

Define how AI training bots and AI search systems should be treated. You can also publish optional machine-readable guidance such as llms.txt.
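The user-agent tokens below are real AI crawler names (OpenAI's GPTBot, Common Crawl's CCBot, Google's Google-Extended training opt-out token), but whether the plugin targets exactly this set is an assumption; a sketch of an AI opt-out block:

```
# Opt out of AI training crawls without affecting normal search indexing
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

The optional llms.txt file mentioned above is a separate plain-text file served from the site root, not part of robots.txt itself.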

SEO tool protection

Restrict SEO crawlers such as Semrush, DotBot, Ahrefs, and similar agents when your plan allows it.
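A minimal sketch of what blocking these SEO crawlers could look like, using their publicly documented user-agent tokens (assumed here, not taken from the plugin's own output):

```
# Keep third-party SEO crawlers out
User-agent: SemrushBot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: DotBot
Disallow: /
```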

Bad bot protection

Enable a curated blocklist and, on higher editions, expand the list further.
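The plugin's curated blocklist is its own; as a hypothetical illustration, entries for crawlers that site owners commonly blocklist might look like:

```
# Example blocklist entries (illustrative, not the plugin's actual list)
User-agent: MJ12bot
Disallow: /

User-agent: PetalBot
Disallow: /
```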

Archive & Wayback control

Choose whether archive services may store public copies of the site.
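Opting out of archiving is typically expressed by disallowing the archive services' crawlers; the tokens below are commonly associated with the Internet Archive but are an assumption here:

```
# Ask archive crawlers not to store copies of the site
User-agent: ia_archiver
Disallow: /

User-agent: archive.org_bot
Disallow: /
```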

Spam, feeds, and crawl traps

Reduce low-value crawling from feed endpoints, search URLs, comment spam parameters, and trap parameters.
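On a standard WordPress install, these low-value URL patterns have well-known shapes; a sketch of rules against them (the plugin's actual patterns may differ):

```
# Cut crawl waste from feeds, internal search, and comment-spam parameters
User-agent: *
Disallow: /feed/
Disallow: /*/feed/
Disallow: /?s=
Disallow: /search/
Disallow: /*?replytocom=
```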

WooCommerce cleanup

Keep cart, checkout, account, and parameter-heavy store routes from consuming unnecessary crawl budget.
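Assuming WooCommerce's default page slugs (they are configurable per site), the store routes in question could be excluded with rules like:

```
# Keep transactional and parameter-heavy store URLs out of the crawl
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /*?add-to-cart=
Disallow: /*?orderby=
```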

Resources & assets

Decide whether CSS, JavaScript, and image assets should be explicitly allowed via dedicated rules.
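Explicit Allow rules for assets are a common pattern so that crawlers can render pages fully; a sketch under that assumption:

```
# Make render-critical assets explicitly crawlable
User-agent: *
Allow: /wp-content/uploads/
Allow: /*.css$
Allow: /*.js$
```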

Social media crawlers

Allow or restrict social preview bots based on your sharing goals.
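The tokens below are the well-known user agents that major platforms use to fetch link previews; whether the plugin manages exactly these is an assumption:

```
# Permit social preview bots so shared links render cards correctly
User-agent: facebookexternalhit
Allow: /

User-agent: Twitterbot
Allow: /

User-agent: LinkedInBot
Allow: /
```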

Ads & revenue

Keep ads.txt and app-ads.txt verification paths accessible when you need them.
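ads.txt and app-ads.txt are standard IAB verification files served from the site root; keeping them reachable could be expressed as:

```
# Ensure ad-verification files stay accessible to crawlers
User-agent: *
Allow: /ads.txt
Allow: /app-ads.txt
```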

Review & Save

Preview the generated output before publishing it.

Editions

Some features are available in all editions, while others are reserved for Pro or Premium. Use the pricing page to compare the Free, Pro, and Premium plans.

Better Robots.txt — human-friendly, machine-first documentation for WordPress crawl governance.