Nowadays, it is increasingly difficult to stand out with a website. The Web grows denser every day, and organic ranking remains one of the major challenges facing most businesses. There are, of course, faster short-term ways to appear in search engines, such as advertising (AdWords), but the investment can quickly become a money pit if the chosen positioning is not optimal, and above all it leaves the company strictly dependent on paid traffic to survive.
Organic ranking allows natural, long-term growth and delivers results that have now become indisputable. But to get there, you have to put the odds on your side, and that is where SEO comes into play, along with all the processes that allow search engines to identify you and understand what you do (understanding & credibility).
But what does all this have to do with Better Robots.txt?
In fact, Better Robots.txt directly affects your website’s ability to generate organic ranking. In a “traditional” on-page optimization process, an SEO technician will usually work at two levels, the container and the content: what builds your website and what it contains.
Apart from the usual content optimization processes, which are relatively well known to all (keywords, tags, structured data,…), the optimization of the container is often secondary or limited. We are talking here about:
- Loading optimization (cache)
- Coding errors (script execution) and broken links (404)
- Canonicalization errors
- Indexing optimization (robots.txt, sitemap)
- … (this list is long)
And it turns out that in this list, the sitemap and the robots.txt, crucial elements for the indexing of a website’s content, are very often forgotten or even neglected, whether the site was built by an agency or not (!!). This reality is all the more striking in that, through our SEO audits (covering all types of companies, from the most prestigious to the most modest), we were able to establish that:
- 1 time out of 5, the audited site has no sitemap at all
- 1 time out of 15, the audited site has no robots.txt (less common)
- 4 times out of 5, the audited site contains no indexing instructions in its robots.txt (other than the basic ones)
- More than 9 times out of 10, the audited site does not list the URL of its sitemap(s) in the robots.txt
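To make those last two findings concrete, here is a minimal sketch of what a configured robots.txt might look like. The domain, sitemap URL, and rules below are illustrative placeholders following common WordPress conventions, not the plugin’s exact output:

```text
# Allow the AJAX endpoint that themes and plugins rely on,
# but keep the rest of the admin area out of the crawl.
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/

# Tell crawlers where the sitemap lives.
Sitemap: https://example.com/sitemap.xml
```

The Sitemap line is precisely the instruction that, per our audits, is missing more than 9 times out of 10.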
About 95% of WordPress websites are unable to deploy the full SEO potential of their content due to the lack of configuration of the robots.txt file. It is a PURE LOSS of efficiency.
Better Robots.txt allows you, in a few clicks, to correct this problem through an appropriate configuration of the robots.txt, facilitating access to your content by search engines and maximizing your “past and future” creative potential (text, etc.). The idea here is to put an end to this situation by offering a simple but effective solution: a user-friendly plugin accessible to all, with no technical knowledge required.
Learn more about the plugin?
How can we explain that this situation is so common?
First of all, you should know that it is neither your fault nor that of WordPress! It is just the way it is… There are so many things to think about when creating a website that a simple file (robots.txt) can easily get lost in the maze of changes and adjustments inherent to Web creation.
However, as of today, if there is one thing you need to worry about, among all the SEO techniques with which you have already tried to improve the search engine ranking of your website, the robots.txt must be at the top of your list (whether handled manually or through Better Robots.txt).
Take the test: what does your Robots.txt look like?
You can copy and paste the URL of your website (with http:// or https://) into the field below; it will first identify whether you have a robots.txt and, if so, show its content:
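If you prefer to run the check yourself, the same test can be sketched with Python’s standard-library robots.txt parser. The rules and URLs below are illustrative examples, not any particular site’s actual file:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules; in practice you would fetch your own
# site's /robots.txt and paste its content here.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The admin area is blocked for a generic crawler...
print(parser.can_fetch("*", "https://example.com/wp-admin/"))                # False
# ...but the AJAX endpoint and normal content remain crawlable.
print(parser.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(parser.can_fetch("*", "https://example.com/blog/my-post/"))            # True
```

Note that Python’s parser applies the first matching rule, which is why the Allow line is listed before the broader Disallow in this sketch; Google instead applies the most specific (longest) matching rule.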
For comparison, here is ours: https://www.better-robots.com/robots.txt. Can you see the difference?
How did we get here?
It was both experience and daily practice that led us to wonder how we could solve this problem without having to connect to every website individually.
It turns out that PAGUP, the entity behind Better Robots.txt, is an SEO agency (Canada) 100% specialized in “on-page” optimization for search engines. After completing a very large number of website SEO audits, we noticed the recurrence of this “sub-optimization” on most of the websites analysed, even those built by established, popular agencies or companies of international renown.
However, it was within the framework of our optimization services that we saw the almost immediate impact that an optimal configuration of this file could generate. Our optimization process, initially dedicated to WordPress websites, consisted of two steps: a first, more technical part at the beginning of the mandate, consisting of correcting recurring technical errors (which included the robots.txt and the sitemap), and a second part, a little later (2-3 weeks), strictly content-oriented. And it is by monitoring the organic ranking of these websites daily that we observed that, in the vast majority of cases, a few weeks after the first part of the mandate (and without touching the second stage, i.e. the content), these websites had gained much more ranking (more positions occupied in the SERPs), with, for some, a 150% increase in ranked keywords.
After more than 250 similar observations, under the same circumstances, we came to the conclusion that a well-configured, optimized robots.txt could have a massive impact on organic performance (SEO). And because, at the time, there was no WordPress solution to automate or simplify this optimization process for everyone, we decided to create Better Robots.txt.
A simple innovation, but one that will change your business on the Web!