Support
Feel free to contact us if you have any questions or problems with the Better Robots.txt plugin.
Frequently Asked Questions
What is a robots.txt file?
What is a sitemap?
What is a Crawl-Delay?
A line like the one below tells Yahoo! and Bing to wait 10 seconds after each crawl action, whereas Yandex will only access your site once in every 10-second timeframe. A semantic difference, but interesting to know. Here is the example crawl-delay line: crawl-delay: 10
Do take care when using the crawl-delay directive. By setting a crawl delay of 10 seconds, you only allow these search engines to index 8,640 pages a day. This might seem plenty for a small site, but on large sites it isn't all that much. On the other hand, if you get little to no traffic from these search engines, it is a good way to save some bandwidth.
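As a minimal sketch of how the directive is usually applied per crawler (the user-agent tokens below are the commonly used names for Bing's and Yandex's bots; adjust them to the crawlers you actually want to slow down):

    # Example only: slow down Bing and Yandex to one action per 10 seconds
    User-agent: Bingbot
    Crawl-delay: 10

    User-agent: Yandex
    Crawl-delay: 10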
Why can adding your sitemap to the robots.txt file make a difference to your ranking?
Back in 2006, Yahoo, Microsoft and Google united to support a standardized protocol for submitting a site's pages via sitemaps. You were required to submit your sitemaps through Google Webmaster Tools, Bing Webmaster Tools and Yahoo, while some other search engines, such as DuckDuckGo, use results from Bing/Yahoo.
About six months later, in April 2007, they jointly added support for a system of finding the sitemap via robots.txt, known as sitemap autodiscovery. This meant that even if you did not submit your sitemap to the individual search engines, they would find its location from your site's robots.txt file first.
(NOTE: Sitemap submission is still supported, however, by most search engines that allow URL submissions.)
Hence, the robots.txt file became even more significant for webmasters, as it easily paves the way for search engine robots to discover all the pages on their website.
Which sitemap will be added by the plugin?
Better Robots.txt will detect your configuration and add your sitemap index URL (sitemap_index.xml), containing all your sitemaps (page, post, category, etc.), to the robots.txt file.
Once activated, the Better Robots.txt plugin will add it automatically.
However, if you want to use another sitemap (from another SEO/sitemap plugin), simply copy-paste it into the empty field dedicated to the sitemap and save.
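For reference, the line added to robots.txt typically looks like the following (a sketch assuming the default sitemap_index.xml filename and example.com as a placeholder for your own domain):

    # Illustrative only: replace example.com with your real domain
    Sitemap: https://example.com/sitemap_index.xml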
How to add more instructions to the robots.txt file?
Make sure not to delete or add content if you don't know what you are doing. This file is highly sensitive, and any misuse could lead to serious trouble with search engines (and consequently a significant loss of ranking).
Also, there is one thing you need to pay attention to when manually adding the Sitemap directive to the robots.txt file.
Generally, it is advised to add the 'Sitemap' directive, along with the sitemap URL, anywhere in the robots.txt file, but in some cases this has been known to cause parsing errors. You can check Google Webmaster Tools for any such errors about a week after you have updated your robots.txt file with your sitemap location.
To avoid this error, it is recommended that you leave a blank line after the sitemap URL, as sketched below.
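As a sketch of that layout (again using example.com as a placeholder domain, and an arbitrary example rule after the blank line):

    # Example layout only: blank line separates the Sitemap directive from other rules
    Sitemap: https://example.com/sitemap_index.xml

    User-agent: *
    Disallow: /wp-admin/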
Write a review? Our press team loves working with experts and journalists around the world to share their experience with our product.
If you want to write a review about Better Robots.txt,
we can offer you a free license for that purpose.
Send an email to [email protected] to get in touch with our team.