Monday, January 11, 2021
Helping people understand how Google crawls and indexes their sites has been one of the main objectives of Search Console since its
early days. When we launched the
new Search Console, we also introduced the
Index Coverage report, which shows the indexing state of URLs
that Google has visited, or tried to visit, in your property.
Based on the feedback we got from the community, today we are rolling out significant improvements to this report so you're better informed about issues
that might prevent Google from crawling and indexing your pages. The change is focused on providing a more accurate state for existing issues,
which should help you solve them more easily. The list of changes includes:
- Removal of the generic "crawl anomaly" issue type: all crawl errors are now mapped to an issue with a finer resolution.
- Pages that were submitted but blocked by robots.txt and got indexed are now reported as "indexed but blocked" (warning) instead of "submitted but blocked" (error).
- Addition of a new issue: "indexed without content" (warning).
- `Soft 404` reporting is now more accurate.
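To see how the "indexed but blocked" situation arises, here is a minimal sketch using Python's standard-library robots.txt parser; the robots.txt rules and URLs are hypothetical examples, not taken from this post:

```python
# Minimal sketch: check whether a URL would be blocked for Googlebot,
# using Python's standard-library robots.txt parser.
# The robots.txt content and example.com URLs are hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A page under /private/ is disallowed, so Googlebot cannot crawl it --
# but if other pages link to it, the URL itself can still end up indexed,
# which is the situation the "indexed but blocked" warning describes.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

The key point is that robots.txt controls crawling, not indexing: a blocked URL can still be indexed from links alone, which is why this case is now surfaced as a warning rather than an error.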
The changes above are now reflected in the Index Coverage report, so you may see new types of issues or changes in issue counts. We hope that
this change will help you better understand how we crawl and index your site.
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],[],[[["\u003cp\u003eGoogle Search Console's Index Coverage report has been significantly improved to provide website owners with more accurate and detailed information about indexing issues.\u003c/p\u003e\n"],["\u003cp\u003eThe update includes the removal of the generic "crawl anomaly" issue, refined categorization of indexing errors, and the addition of a new "indexed without content" warning.\u003c/p\u003e\n"],["\u003cp\u003eThese changes aim to make it easier for users to identify and resolve issues that prevent Google from crawling and indexing their web pages, ultimately improving their site's visibility in search results.\u003c/p\u003e\n"],["\u003cp\u003eGoogle encourages users to provide feedback on the updated report through the Search Central Help Community or Twitter.\u003c/p\u003e\n"]]],["The Index Coverage report in Google Search Console has been updated to provide more accurate information on website indexing. Changes include eliminating the \"crawl anomaly\" issue, replacing \"submitted but blocked\" with \"indexed but blocked\" for robots.txt conflicts, and adding a new \"indexed without content\" warning. Soft 404 reporting is now more precise, offering users a more detailed understanding of crawling and indexing issues. Feedback on these improvements is encouraged.\n"],null,["Monday, January 11, 2021\n\n\nHelping people understand how Google crawls and indexes their sites has been one of the main objectives of Search Console since its\n[early days](/search/blog/2005/11/more-stats). 
When we launched the\n[new Search Console](/search/blog/2018/01/introducing-new-search-console), we also introduced the\n[Index Coverage report](https://support.google.com/webmasters/answer/7440203), which shows the indexing state of URLs\nthat Google has visited, or tried to visit, in your property.\n\n\nBased on the feedback we got from the community, today we are rolling out significant improvements to this report so you're better informed on issues\nthat might prevent Google from crawling and indexing your pages. The change is focused on providing a more accurate state to existing issues,\nwhich should help you solve them more easily. The list of changes include:\n\n- Removal of the generic \"crawl anomaly\" issue type - all crawls errors should now be mapped to an issue with a finer resolution.\n- Pages that were submitted but blocked by robots.txt and got indexed are now reported as \"indexed but blocked\" (warning) instead of \"submitted but blocked\" (error)\n- Addition of a new issue: \"[indexed without content](https://support.google.com/webmasters/answer/7440203#indexed_no_content)\" (warning)\n- `Soft 404` reporting is now more accurate\n\n\nThe changes above are now reflected in the index coverage report so you may see new types of issues or changes in counts of issues. We hope that\nthis change will help you better understand how [we crawl and index](https://www.google.com/search/howsearchworks/crawling-indexing/) your site.\n\n\nPlease share your feedback about the report through the [Search Central Help Community](https://support.google.com/webmasters/community)\nor via [Twitter](https://twitter.com/googlesearchc).\n\nPosted by Tal Yadid, Software Engineer, Search Console"]]