Support

Feel free to contact us if you have any questions or troubles about the Better Robots.txt plugin.


    We reply to every message; please check your Spam folder if you haven’t heard from us.

    Frequently Asked Questions

    What is a robots.txt file?

    Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website. Robots.txt files indicate whether certain user agents (web-crawling software) can or cannot crawl parts of a website. These crawl instructions are specified by “disallowing” or “allowing” the behavior of certain (or all) user agents.
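
    As a quick illustration (the “BadBot” user agent and the paths below are placeholders, not rules generated by the plugin), a robots.txt file combining both kinds of instructions could look like this:

        # Block one specific crawler from the entire site
        User-agent: BadBot
        Disallow: /

        # Allow every other crawler, except inside /wp-admin/
        User-agent: *
        Disallow: /wp-admin/
        Allow: /wp-admin/admin-ajax.php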

    What is a sitemap?

    Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
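
    For illustration only (the URL, date and values below are placeholders), a minimal one-entry sitemap could look like this:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>https://example.com/sample-page/</loc>
            <lastmod>2023-01-01</lastmod>
            <changefreq>monthly</changefreq>
            <priority>0.8</priority>
          </url>
        </urlset>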


    What is a Crawl-Delay?

    Supported by Yahoo!, Bing and Yandex, the crawl-delay directive can be very useful to slow down these three, sometimes fairly crawl-hungry, search engines. These search engines have slightly different ways of reading the directive, but the end result is basically the same.

    A line like the one below would lead to Yahoo! and Bing waiting 10 seconds after each crawl action, whereas Yandex would only access your site once in every 10-second timeframe. A semantic difference, but interesting to know. Here’s the example crawl-delay line: crawl-delay: 10

    Do take care when using the crawl-delay directive. By setting a crawl delay of 10 seconds, you’re only allowing these search engines to crawl 8,640 pages a day (86,400 seconds in a day divided by 10). This might seem plenty for a small site, but on large sites it isn’t all that much. On the other hand, if you get little to no traffic from these search engines, it’s a good way to save some bandwidth.
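
    To make the syntax concrete (the 10-second value is only an example; adjust it to your own needs, and note that the usual crawler names are Slurp for Yahoo!, bingbot for Bing and Yandex for Yandex), the directive is normally placed under the user agent it should apply to:

        User-agent: Slurp
        Crawl-delay: 10

        User-agent: bingbot
        Crawl-delay: 10

        User-agent: Yandex
        Crawl-delay: 10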


    Why can adding your sitemap to the Robots.txt file make a difference to your ranking?

    A sitemap is an XML file that contains a list of all webpages on your site. It may also contain additional information about each URL in the form of metadata. And just like robots.txt, a sitemap is a must-have: it helps search engine bots explore, crawl and index all the webpages on a site.

    Back in 2006, Yahoo, Microsoft and Google united to support a standardized protocol for submitting the pages of a site via sitemaps. You were required to submit your sitemaps through Google Webmaster Tools, Bing Webmaster Tools and Yahoo, while some other search engines, such as DuckDuckGo, use results from Bing/Yahoo.

    After about six months, in April 2007, they joined in support of a system for finding the sitemap via robots.txt, called autodiscovery of sitemaps. This meant that even if you did not submit your sitemap to individual search engines, they would find its location from your site’s robots.txt file.

    (NOTE: Sitemap submission is still possible, however, on most search engines that allow URL submissions.)

    Hence, the robots.txt file became even more significant for webmasters, because it easily paves the way for search engine robots to discover all the pages on their website.
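
    As a sketch of what autodiscovery relies on (example.com is a placeholder domain), a single extra line in your robots.txt is enough for crawlers to locate the sitemap on their own:

        User-agent: *
        Disallow:

        Sitemap: https://example.com/sitemap.xml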


    Which sitemap will be added by the plugin?

    The Better Robots.txt plugin was made to work with the Yoast SEO plugin (probably the best SEO plugin), if and only if the Yoast SEO plugin is installed (and activated) and its sitemap feature is also activated.

    Better Robots.txt will detect your configuration and add your Sitemap index URL (sitemap_index.xml), containing all your sitemaps (page, post, category, etc.), to the robots.txt file.

    Once activated, the Better Robots.txt plugin will add it automatically.

    However, if you want to use another sitemap (from another SEO/sitemap plugin), simply copy-paste it into the dedicated Sitemap field and save.
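
    Assuming your site runs at https://example.com (a placeholder domain), the sitemap line in your robots.txt would typically look like one of the following, depending on whether the Yoast sitemap index or a custom sitemap URL is used:

        Sitemap: https://example.com/sitemap_index.xml
        Sitemap: https://example.com/my-custom-sitemap.xml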

    How to add more instructions to the Robots.txt file?

    When we made this plugin, we thought that you might want to add additional information to your robots.txt file (in order to specify some instructions). That’s why we created a “Custom rules” section giving access to the robots.txt content.

    Make sure not to delete or add content if you don’t know what you are doing. This file is highly sensitive, and any misuse could lead to serious trouble with search engines (and consequently a significant loss of ranking).

    Also, there is one thing you need to pay attention to when manually adding the Sitemap directive to the robots.txt file.

    Generally, it is advised to add the ‘Sitemap’ directive along with the sitemap URL anywhere in the robots.txt file. But in some cases it has been known to cause parsing errors. You can check Google Webmaster Tools for any such errors about a week after you have updated your robots.txt file with your sitemap location.

    To avoid this error, it is recommended that you leave a blank line after the sitemap URL.
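
    In other words, if you add the directive manually through the Custom rules, it is safer to write it like this (the URL is a placeholder), leaving an empty line right after it:

        Sitemap: https://example.com/sitemap_index.xml

        User-agent: *
        Disallow: /wp-admin/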

    Looking for Growth Hacking Tools? Get our list of 150+ tools to skyrocket your website

    Write a review? Our press team loves working with experts and journalists around the world to share their experience with our product.

    If you want to write a review about Better Robots.txt, we can offer you a free license to do so.

    Send an email to [email protected] to get in touch with our team.


    Thousands of People and
    Companies Love Using Better Robots.txt PRO