What's New in the Google Index Coverage Report?
Posted: Mon Dec 09, 2024 7:19 am
Last week, Google announced on its official blog and via a tweet on its Twitter account that it is making some changes to the “Index Coverage” report in Search Console. So what are these changes and, more importantly, why do they matter to us?
What is an Index Coverage Report?
First, a brief refresher: the Index Coverage report is a small report we can access via Search Console to check the index status of our pages. In it, we can see data such as how many of our pages have problems, how many are in the index, and how many we have deliberately excluded so that they do not enter the index. Or rather, that is what we could see until now.
What's New in the Index Coverage Report?
You can see some of the data that Search Console previously shared in the Index Coverage report in the image below. First of all, don't panic: this data is still available. On the contrary, Google has made a change that should make us happy this time, announcing that some of the data this report used to present only superficially will now be reported in more detail. So what are these new features?
First of all, Google announced that the generic “crawl anomaly” error is being replaced with more specific crawl issue types. This is really important for us, because whenever we saw this error we had to roll up our sleeves and start a detailed investigation: a crawl problem, yes, but which one? We will no longer need to do this, because Google will now report in more detail which crawl problem it encountered on the relevant URL. We both save time and reduce the risk of missing a problem on pages with multiple issues.
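Once the report tells you which specific issue a URL has, a quick local check can confirm it. Below is a minimal sketch in Python (the URL list is hypothetical, and it assumes the third-party requests library) that sorts a handful of URLs into the same kinds of buckets the more detailed reporting distinguishes, such as server errors, redirect problems and 404s:

import requests

# Hypothetical list of URLs flagged in the Index Coverage report.
urls = [
    "https://example.com/ok-page",
    "https://example.com/missing-page",
    "https://example.com/old-page",
]

for url in urls:
    try:
        # allow_redirects=False so redirect issues show up as 3xx here.
        resp = requests.get(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        # Network-level failure: the kind of thing "crawl anomaly" used to hide.
        print(f"{url}: request failed ({exc})")
        continue

    code = resp.status_code
    if 500 <= code < 600:
        print(f"{url}: server error ({code})")
    elif 300 <= code < 400:
        print(f"{url}: redirect ({code}) -> {resp.headers.get('Location')}")
    elif code == 404:
        print(f"{url}: not found (404)")
    else:
        print(f"{url}: status {code}")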
The “submitted but blocked by robots.txt” error we used to receive will now appear as an “indexed, though blocked by robots.txt” warning. In other words, if we have URLs that managed to enter the index even though we blocked them in robots.txt, we will now see them under this warning.
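If you want to see which of your indexed URLs are actually blocked in robots.txt, you can check them programmatically. Here is a minimal sketch using Python's standard urllib.robotparser (the domain and URLs are hypothetical):

from urllib.robotparser import RobotFileParser

# Parse the live robots.txt (hypothetical domain).
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# URLs that appear in the index despite your robots.txt rules.
indexed_urls = [
    "https://example.com/private/page",
    "https://example.com/blog/post",
]

for url in indexed_urls:
    if not rp.can_fetch("Googlebot", url):
        # Blocked from crawling yet indexed: exactly what the
        # "indexed, though blocked by robots.txt" warning reports.
        print(f"blocked in robots.txt: {url}")

Keep in mind that a robots.txt Disallow only blocks crawling, not indexing; if you want a page out of the index entirely, it needs to be crawlable and carry a noindex directive instead.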
With this update, a completely new warning, “Indexed without content”, also enters our lives. It means that Google visited your page and indexed it, but found no content on it.
So why is it important to receive this warning?
First, if you are receiving this warning for certain pages, you will know which of your pages lack content, and you can make the necessary improvements to them.
Secondly, when you check the pages receiving this warning, you may notice that they actually do have content. This means that Google may not be able to see or render that content. If you have pages that trigger this warning but do contain content, you can eliminate the problem by finding out why Google cannot see it. I recommend using the “URL Inspection” tool in Search Console to find out why the content cannot be seen.
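Alongside URL Inspection, a quick way to test the rendering hypothesis is to compare the raw HTML your server returns with what you see in the browser. A minimal sketch (the URL and check phrase are hypothetical, and it assumes the requests library): if the text is visible in the browser but absent from the raw response, it is probably injected by JavaScript, which can explain why Google indexed the page “without content”:

import requests

url = "https://example.com/empty-looking-page"  # hypothetical page with the warning
phrase = "text you can see in the browser"      # hypothetical content to look for

# Fetch the raw HTML, i.e. what a crawler receives before any JavaScript runs.
resp = requests.get(url, timeout=10)

if phrase in resp.text:
    print("Content is in the raw HTML; the problem lies elsewhere.")
else:
    print("Content missing from raw HTML: likely rendered client-side by JavaScript.")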
The final update is that the “soft 404” report is becoming more consistent. With this Search Console update, Google will be better able to identify pages that should actually return a 404, even if their status code is 200. You can then solve the problem by renewing or correcting the content of these pages, or by changing them to return a 404 status code, and proceed strategically as you wish.
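On the status-code side, fixing a soft 404 usually just means returning a real 404 instead of a 200 for missing content. A minimal sketch, assuming a Flask app with a hypothetical in-memory catalogue in place of a real data store:

from flask import Flask, abort

app = Flask(__name__)

# Hypothetical catalogue standing in for a real data store.
PRODUCTS = {"42": "Blue widget"}

@app.route("/product/<product_id>")
def product(product_id):
    item = PRODUCTS.get(product_id)
    if item is None:
        # Return a real 404 instead of a 200 "not found" page,
        # so Google does not have to guess that this is a soft 404.
        abort(404)
    return f"<h1>{item}</h1>"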
I think that the more granular crawl error reporting and the new “indexed without content” report, in particular, will make our work much easier in the coming days.
If you would like to get more detailed information or review Google's official statement, you can check out this link.