Top 5 Google SEO Performance Indicators

by Fili Wiese

Search Engine Optimization (SEO) is all about making websites better understood by search engines instead of merely hoping they would figure it all out by themselves. It’s both an art and a science.

While many algorithm updates and policies remain obscured, Google does provide, in Google Search Console, performance indicators that offer insight into the future search visibility of large sites.

Read on to discover my top five SEO performance indicators that matter, and how to read their signals!

Page Download Speed

User-experience-focused Internet companies are obsessed with speed, and Google is one of them. A number of studies, such as this one on the impact of HTML delays, have demonstrated that latency kills conversions, even long term. With faster devices and Internet connections becoming widely available, user tolerance for latency has only declined in the years since.

This is why page download speed matters tremendously. Google Search Console (GSC, formerly Google Webmaster Tools) indicates the average time Googlebot requires to crawl a page. The aim must be to keep this value as low as possible, preferably consistently under 500 milliseconds.
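
To get a rough feel for this number yourself, here is a minimal Python sketch that times full page downloads, assuming the third-party requests library; the URLs are placeholders:

```python
import time

import requests  # third-party: pip install requests

# Hypothetical URLs; substitute key pages from your own site.
urls = [
    "https://www.example.com/",
    "https://www.example.com/category/page.html",
]

for url in urls:
    start = time.perf_counter()
    response = requests.get(url, timeout=10)  # fetches the full body
    elapsed_ms = (time.perf_counter() - start) * 1000
    verdict = "OK" if elapsed_ms < 500 else "SLOW"  # 500 ms benchmark
    print(f"{url}: {elapsed_ms:.0f} ms, HTTP {response.status_code} [{verdict}]")
```

Bear in mind this measures latency from wherever you run it, while GSC reports what Googlebot experiences; treat it as a sanity check, not a substitute.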

Increasing values, such as those shown in the graph below, exceed the benchmark and will likely reduce the number of pages crawled per day over time.
[Figure: Page Download Speed. 500 milliseconds is currently a good benchmark value for page speed]

Pages Crawled Per Day

The faster the website, the more connections it can handle from Googlebot. The average number of pages Googlebot crawls on a single day tends to decline as average page download time goes up, because Google automatically tries to work out a safe rate at which to crawl the website in question.

Together, these two values are a strong signal that a website is not living up to its full potential, either in Google's organic search or for its users. Both indicators failing in combination, growing time spent downloading and fewer pages crawled per day, almost always indicates site performance issues that are likely to impact organic search visibility over time. Infrastructure and architecture must be examined and improved in order to reverse these trends.
[Figure: Pages Crawled Daily. Google Search Console indicates the number of pages crawled daily]
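
GSC reports this number for you, but it can also be cross-checked from your own server logs. A minimal sketch, assuming an Apache/Nginx combined-format access log at a hypothetical path:

```python
import re
from collections import Counter
from datetime import datetime

LOG_PATH = "access.log"  # assumed path to a combined-format access log
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # matches e.g. [12/Apr/2016

daily_hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        # Naive user-agent match; it is spoofable, so verify hits with a
        # reverse DNS lookup (host ending in googlebot.com) in production.
        if "Googlebot" not in line:
            continue
        match = DATE_RE.search(line)
        if match:
            day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
            daily_hits[day] += 1

for day in sorted(daily_hits):
    print(f"{day}: {daily_hits[day]} Googlebot requests")
```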

Sitemap Pages vs. Indexed Pages

Significant discrepancies between the number of indexed pages and the pages in the XML sitemaps submitted to GSC can indicate serious SEO issues. These range from orphan pages and a broken internal linking structure to duplicate content and slow page speed, to name just a few. There are many reasons why Google may not crawl and/or index all the URLs listed in the XML sitemaps.

To get the most useful data possible from the GSC Sitemaps feature, it's imperative to submit all unique canonicals of the website in the XML sitemaps and, when possible, to divide the XML sitemap by URL type into several files submitted independently to GSC. That way it is much easier to identify where on the website the discrepancies occur.
[Figure: Sitemap Pages vs. Indexed Pages. A growing discrepancy between submitted and indexed pages indicates a serious issue]
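
One way to implement the split is to write one sitemap file per URL type and tie them together with a sitemap index. A minimal sketch; the file names and URL groupings are hypothetical:

```python
from xml.sax.saxutils import escape

NS = 'xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"'

# Hypothetical grouping of the site's canonical URLs by page type.
url_groups = {
    "sitemap-products.xml": [
        "https://www.example.com/products/1",
        "https://www.example.com/products/2",
    ],
    "sitemap-categories.xml": [
        "https://www.example.com/categories/shoes",
    ],
}

# One sitemap file per URL type...
for filename, urls in url_groups.items():
    entries = "\n".join(f"  <url><loc>{escape(url)}</loc></url>" for url in urls)
    with open(filename, "w", encoding="utf-8") as f:
        f.write(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            f"<urlset {NS}>\n{entries}\n</urlset>\n"
        )

# ...plus a sitemap index referencing them. Submitting each file to GSC
# separately shows where in the site the indexing gaps occur.
index = "\n".join(
    f"  <sitemap><loc>https://www.example.com/{name}</loc></sitemap>"
    for name in url_groups
)
with open("sitemap-index.xml", "w", encoding="utf-8") as f:
    f.write(
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        f"<sitemapindex {NS}>\n{index}\n</sitemapindex>\n"
    )
```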

Crawled Pages vs. Indexed Pages

Another important indicator of site performance is the ratio of pages crawled per day to pages indexed in Google. This value answers an important question: how many pages does Google need to crawl to index and rank just one? Experience shows that for large commercial sites a ratio of 10:1 is a reasonable value. Ratios ten to one hundred times less favorable are unfortunately not unheard of.

If identified, such trends may indicate site resources are not being used efficiently, possibly due to structural and content issues, which are likely to negatively impact site performance in organic search.

In short: the site's true SEO potential is not being fully realized. Managing a site's crawl budget more efficiently will, however, require a step beyond evaluating GSC data: diving into log files in order to identify which pages are being crawled by Googlebot.
[Figure: Crawled Pages vs. Indexed Pages. How many pages does Google crawl to index one?]
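
The ratio itself is simple arithmetic on two GSC figures; a minimal sketch with made-up numbers for illustration:

```python
# Illustrative numbers only; read the real figures from GSC's Crawl
# Stats and Index Status reports for your own site.
pages_crawled_per_day = 120_000
pages_indexed = 12_000

ratio = pages_crawled_per_day / pages_indexed
print(f"Roughly {ratio:.0f} pages crawled for every page indexed")
# Around 10 is reasonable for a large commercial site; values of 100
# or 1000 suggest crawl budget is being wasted.
```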

Structured Data

There are few good reasons for site-wide structured data, such as structured breadcrumbs or the organization schema, to be in decline, unless recent content review initiatives have led to widespread noindexing. The structured data trend in GSC, compared against both the pages submitted through sitemaps and the number of pages actually indexed, can serve as a site health indicator.

Once again, significant deviations may indicate inconsistent structured data or, worse, crawling and indexing issues holding a website back in organic search.
[Figure: Structured Data. A decline in structured data over time may indicate site health issues]
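
For reference, structured breadcrumbs of the kind mentioned above can be emitted as JSON-LD. A minimal Python sketch producing a schema.org BreadcrumbList; the page names and URLs are placeholders:

```python
import json

# Placeholder breadcrumb trail for a hypothetical product page.
trail = [
    ("Home", "https://www.example.com/"),
    ("Shoes", "https://www.example.com/shoes/"),
    ("Running Shoes", "https://www.example.com/shoes/running/"),
]

breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(trail, start=1)
    ],
}

# Embed the output in the page inside a
# <script type="application/ld+json"> element.
print(json.dumps(breadcrumbs, indent=2))
```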

All of these SEO performance indicators from Google Search Console offer a glimpse into how well a large site is doing in organic search. They rarely work for small sites, which cannot provide enough data to draw actionable conclusions. On top of that, most data available in GSC is limited to the past 90 days.

It is important to keep in mind that while Google Search Console is an indispensable, and free, SEO data source, it is anything but fail-safe, and the data provided must be taken with a grain of salt. Occasional spikes may or may not indicate an issue.

Google also experiences problems sometimes. Similarly, seasonal variations may play a part and must be factored in. Keep in mind, too, that improvements introduced to a site can result in new trends becoming visible only weeks or even months later.

The Google SEO performance indicators mentioned above are among the proven SEO targets that no large website can ignore, and they allow for drawing actionable conclusions from Google data over time.

Fili Wiese is a technical SEO expert and former member of Google’s Search Quality team. At SearchBrothers.com he recovers websites from Google penalties and offers SEO consulting.

Comments

topsportweb:
Great article Fili, thanks for sharing these valuable insights, hope you have more of the same.
alanbleiweiss:
LOL Fili it's like you are inside my head with this stuff. Of course, we know why you understand how important and valuable these data points are... :-)
Fili Wiese (replying to alanbleiweiss):
Don’t worry, I can’t read minds yet (as my wife can confirm) so you are safe. But yes, these points are in my opinion indeed important :)
Kevin Pike:
When you show 559K URLs submitted vs. under 30K actually indexed, this seems like an XML sitemap that needs to be optimized, right? I'm not sure I agree with "submit every canonical URL" because it suggests that you have included tags, category pages, and other duplicate content pages in the sitemap. Google is not big on indexing these. I understand that a 'growing' discrepancy is bad over time, but do you have any thoughts on having a poor ratio out of the gate?
Fili Wiese (replying to Kevin Pike):
This poor ratio can be due to a number of factors, for example: a low number of pages crawled per day in combination with a huge site and poor internal linking. Googlebot may deprioritize the URLs mentioned in the sitemap in favor of the links found on the site during a crawl. I think it is best to include every canonical in an XML sitemap; however, you can divide the XML sitemap up into several XML sitemaps to identify where there may be potential indexing issues.
Eric Van Buskirk:
Fili, I'd say 50% of signals come from external linking if we think about the challenges of making great sites that rank. What do you think is the best performance indicator here, or are you implying that performance from their reporting can't give important insight here?
Fili Wiese (replying to Eric Van Buskirk):
External linking is indeed important when it comes to SEO. Hyperlinks are a fundamental part of the design of the World Wide Web. However, on-page signals need to be positive to utilize the full external linking potential for search engines.
Patrick Mulder:
Great article Fili, thanks for sharing these valuable insights
Fili Wiese (replying to Patrick Mulder):
Thanks for reading!
Steven van Vessum:
Good article Fili!

Regarding the XML sitemap: what if there is a big difference in the number of submitted URLs and indexed URLs, but Google is showing the correct number of pages indexed for a "site:" query (keeping in mind that the "site:" query isn't that precise)?
Fili Wiese (replying to Steven van Vessum):
As the number in the site-operator is an estimation, I only use it as an indicator. More important is the number in the Google Index Status as reported in Google Search Console.
Perry Bernard:
Nice work Fili. These are all indicators I also use to track down issues, but of course they are only part of the answer.

Considering your work here, you might be interested in the following content I wrote a while ago on the same subject:
https://crankedseo.com/google-...

and this one too:
https://crankedseo.com/google-...

I just rated your article now at 5/5. Thanks!
Fili Wiese (replying to Perry Bernard):
Thanks. Glad you liked it.
Neeraj Pandey:
Yeah Fili, I liked this article.
Fili Wiese (replying to Neeraj Pandey):
Thanks for the positive feedback. Appreciated ☺
Dawn Anderson:
Thanks for sharing. Finding server log analysis really helps a lot too :)
Fili Wiese (replying to Dawn Anderson):
Thanks for reading! And I totally agree ☺
Modestos Siotos:
Great points Fili. What process do you follow to figure out the crawled vs. indexed pages ratio? Site: shows the total number of pages indexed historically, but how could you figure out how many pages get indexed daily?
Fili Wiese (replying to Modestos Siotos):
Hello Modestos, thank you for reading my article. In addition to checking the site-operator in Google, check the Google Index Status in GSC for index stats on your website. As mentioned in the article, if you want to know exactly which pages are being crawled daily, dive into your log files. Combine that data with which pages are indexable and, assuming you have no other on-page issues preventing proper indexing, you can get a good estimate of how often and which pages get indexed regularly.