If you are a WordPress user, whether as a company or an agency, you certainly know Yoast SEO, one of the best plugins available on WordPress for optimizing your content and your performance on search engines.
That's why, when designing the Better Robots.txt plugin, we naturally chose to pair with the best in order to offer… the best of optimization. Because that is what it is all about: optimizing your website to get the maximum value from your content.
Did you know? Yoast SEO lets you configure many aspects of your website: META data, all the sitemaps associated with your pages, posts, products, … and even the editing of your robots.txt and .htaccess files (Advanced Configuration). But until now, there was no solution that combined the sitemap and robots.txt with the management of indexing instructions for search engines. That is exactly what Better Robots.txt offers, working hand in hand with Yoast via the sitemap index.
But why is Robots.txt so important?
For the record, in 2006, Yahoo, Microsoft, and Google joined forces to support and deploy a standard protocol for submitting a site's pages using sitemaps. Since then, several methods have been put in place to encourage web designers (agencies, developers) to diligently submit their sitemaps via tools such as Google Search Console (formerly Google Webmaster Tools), Bing Webmaster Tools, and Yahoo, while other search engines such as DuckDuckGo draw their results directly from Bing / Yahoo.
About six months later, in April 2007, Yahoo, Microsoft, and Google (again jointly) added support for discovering sitemaps via robots.txt, known as "sitemap autodiscovery". This meant that even if you had not submitted a sitemap to individual search engines (through a traditional submission), search engines could still find the location of the sitemap in your site's robots.txt file, since that file is read by default by search engine crawlers.
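For illustration, autodiscovery simply means adding a single Sitemap directive to the robots.txt file (the domain below is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

Crawlers that support the protocol pick up the sitemap location from this one line, with no manual submission required.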
(NOTE: Sitemap submission is still available on most search engines that allow URL submission.)
Consequently, from that moment on, the robots.txt file became an essential tool for communicating with search engines: independently of any manual submission, it opens the way for search engine crawlers to explore and index the website it belongs to.
Hence the importance of robots.txt, and of optimizing it!
What does Better Robots.txt do?
Among other features (see the description), Better Robots.txt first detects whether Yoast SEO is actually installed on the website (which we recommend) and whether its sitemap generation feature is enabled, and then automatically adds the URL of the Yoast sitemap index (which lists all the sitemaps of the site) to the robots.txt file. Of course, if you use an SEO plugin other than Yoast, you can still add your sitemaps manually, but given Yoast's success (several million downloads), we thought it more appropriate to pair with a reference known to all.
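For the curious, here is a minimal sketch (not the plugin's actual source code) of how such an integration can be done in WordPress, assuming Yoast SEO defines its usual WPSEO_VERSION constant when active and serves its sitemap index at its default /sitemap_index.xml location:

```php
<?php
// Hypothetical sketch: append the Yoast sitemap index to the virtual
// robots.txt that WordPress generates (when no physical file exists).
add_filter( 'robots_txt', function ( $output, $public ) {
    // Yoast SEO defines WPSEO_VERSION when it is active (assumption:
    // its sitemap feature is enabled and uses the default location).
    if ( defined( 'WPSEO_VERSION' ) ) {
        $output .= "\nSitemap: " . home_url( '/sitemap_index.xml' ) . "\n";
    }
    return $output;
}, 10, 2 );
```

Hooking the standard robots_txt filter, rather than writing a file to disk, means the directive stays in sync with the site's home URL and disappears cleanly if the plugin is deactivated.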
How can adding the sitemap to the robots.txt file make a difference?
In its simplest form, a sitemap is an XML file (.xml) that lists the URLs of a site along with additional metadata for each URL (date of last update, how often the page usually changes, and its importance relative to the other URLs of the site) so that search engines can crawl the site more intelligently. The sitemap therefore contains information that is very valuable to crawlers, in particular the dates of last modification. Since robots.txt is the very first file a search engine crawler reads when it visits your site, taking this opportunity to invite crawlers to browse your sitemaps (and thus identify new or modified pages) is the best assurance that your most recent content will actually be indexed and displayed in the search results.
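To make this concrete, here is what a minimal sitemap entry looks like under the sitemaps.org protocol (the URL and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/sample-page/</loc>
    <lastmod>2019-01-15</lastmod>    <!-- date of last modification -->
    <changefreq>weekly</changefreq>  <!-- how often the page usually changes -->
    <priority>0.8</priority>         <!-- importance relative to other URLs -->
  </url>
</urlset>
```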
Looking for performance? Have a look at our list of 150+ Growth Hacking tools.
Please note that Lifetime users/plans get a free listing on Baidu Webmaster Tools.