At SMX this past year, Matt Cutts made some interesting statements about how page speed affects SERPs. A seemingly innocuous sentence from his interview with Matt McGee ignited a debate that spanned continents:
You don’t get a boost for having a fast site. Sites that are outliers in terms of being slow will rank lower. All other things being equal, a site that’s too slow will rank lower. – Matt Cutts
Since then, a firestorm of articles has surfaced concerning site speed and its impact on Google search results. So what was the result? Most experts seem to agree that the impact site speed has on the SERPs isn’t as simple as previously thought. Specifically, most concluded that the benefit of having a fast site pales in comparison to the negative impact of having a slow one.
The patent that I’ll be referring to in this post, along with some additional observations based on my own anecdotal experiences, may shed some new light on these widely held conclusions.
On February 4, 2013, a patent filed on November 12, 2010 (“Using Source Load Times in Ranking Search Results”) was granted. The patent says (Emphasis Mine):
This specification relates to search systems.
The Internet provides access to a wide variety of resources, examples of which include video or audio files, web pages for particular subjects, book articles, and news articles. A search engine can identify resources in response to a user query that includes one or more search terms or phrases. The search engine ranks the resources based on their relevance to the query and importance and provides search results that link to the identified resources.
The resources referred to by the search results may take different amounts of time to load in users’ web browsers. For example, for any particular resource, the size of the resource, the number of images the resource includes or references, the web server that serves the resource, and the particular network connection can impact the amount of time the resource takes to load in a user’s web browser. Given two resources that are of similar relevance to a search query, a typical user may prefer to visit the resource that has the shorter load time.
Later on in the summary, it gets a bit more specific:
Particular embodiments of the subject matter described in this specification can be implemented so as to realize one or more of the following advantages:
A search result for a resource having a short load time relative to resources having longer load times can be promoted in a presentation order; and search results for the resources that have longer load times can be demoted. The demotion can, in some situations, result in a search result for a resource that has a short load time being presented earlier in the order than a search result for a similarly relevant resource that has a longer load time.
Thus, for two resources that, apart from load times, appear to equally satisfy a user’s informational needs, the user will often select the resource that will likely load the most quickly of the two, resulting in a better user experience.
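The tie-breaking behavior the patent describes can be sketched in a few lines of Python. The field names, relevance scale, and sample numbers below are my own assumptions for illustration, not anything from the patent:

```python
def order_results(results):
    """Order search results by relevance (higher first),
    breaking ties by load time (shorter first)."""
    return sorted(results, key=lambda r: (-r["relevance"], r["load_ms"]))

# Hypothetical results: a and b are equally relevant, but b loads faster.
results = [
    {"url": "a.example", "relevance": 0.9, "load_ms": 3000},
    {"url": "b.example", "relevance": 0.9, "load_ms": 800},
    {"url": "c.example", "relevance": 0.7, "load_ms": 200},
]
ordered = order_results(results)
# b.example outranks a.example: equal relevance, shorter load time.
```

Note that the faster-but-less-relevant c.example still ranks last; load time only reorders similarly relevant resources.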
Extrapolated from the patent itself, I have described what each picture represents.
What Does the Patent Actually Say?
This patent describes technologies that use loading times to order the resources in search results. ‘Loading time,’ in this context, means the total time it takes to load a page, and there are many different ways to collect this information. In particular, the patent describes an incremental refinement of the original results based upon the resources’ loading times in relation to several factors, including:
- A time limit for loading (minimum and maximum) relative to a specific set of results
- The device used to complete the search
- The connection used to complete the search
- Where the search was made from
- The type of search (which determines whether or not refinement is necessary)
The patent’s language can be hard to decipher if you’re not an engineer. I suggest you read Bill Slawski’s excellent summary of the patent before you dive into the complete version. The crucial point is that, in practice, there is no absolute time limit defining the speed or slowness of any given resource; instead, there is a kind of clustered time limit, built from an aggregate of the consolidated loading times behind a given search result, which allows each set of search results to have its own individual time limit. Let me give you some in-depth examples:
- A SERP with pages full of multimedia resources will have higher time limits than a SERP with only text pages.
- A SERP presented on a mobile device using a 3G network will have higher time limits than a SERP presented on a desktop browser that is operating through a cable connection.
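This clustering idea can be illustrated with a short sketch. The context keys, the 85th-percentile choice, and the sample load times below are my own assumptions, not values from the patent:

```python
from collections import defaultdict

def percentile(values, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(pct / 100 * len(ordered)) - 1))
    return ordered[k]

def cluster_thresholds(samples, pct=85):
    """Compute a per-cluster 'slow' threshold from observed load times.

    `samples` is an iterable of (context, load_time_ms) pairs, where
    `context` is a hashable key such as (device, connection)."""
    by_cluster = defaultdict(list)
    for context, load_ms in samples:
        by_cluster[context].append(load_ms)
    return {ctx: percentile(times, pct) for ctx, times in by_cluster.items()}

# Hypothetical observed load times for two search contexts.
samples = [(("mobile", "3g"), t) for t in (900, 1200, 1500, 2500, 4000)] \
        + [(("desktop", "cable"), t) for t in (200, 300, 350, 500, 900)]
thresholds = cluster_thresholds(samples)
# The mobile/3G cluster tolerates longer load times than desktop/cable.
```

The point of the sketch is the shape of the mechanism: each context gets its own “slow” cutoff derived from its own distribution, so a page that would be an outlier on desktop/cable can be perfectly ordinary on mobile/3G.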
This is an important distinction, because it helps us understand how information about a resource’s speed or slowness is collected in relation to the context and type of search.
The picture above will help you to get an idea of the amount of data that is available from Google Analytics reports that are dedicated to loading times.
The refinement is applied only where a statistically significant amount of consolidated data is available; searches for which sufficient data do not exist are excluded from the results. Navigational searches, for example, as well as weather and news searches, can be excluded from the refinement process.
How Does the Scoring Work?
Once the limit value has been established, one percentile is defined for the first adjustment level and another for the second. The resulting adjustments can be based either on demoting resources slower than a determined value or on promoting resources faster than a determined value. Here is an excerpt from the patent specifically pertaining to this information (Again, Emphasis Mine):
The threshold values can be determined using a sample of all collected page load times. For example, the first threshold value can be any value in a range of values corresponding to the 96th percentile of all collected page load times to the 99th percentile of all collected page load times. The second threshold value is lower than the first threshold value and can be any value in a range of values corresponding to the 85th percentile of all collected page load times to the 95th percentile of all collected page load times. Alternatively, other ranges can be used. The demotion values can be adjusted in accordance with the desired effect on search result scores, with the first demotion value being lower than the second demotion value to further demote resources with extremely long load times.
Alternatively, the search results adjusting engine can promote search results with shorter load times by comparing the load time measures for resources to a different threshold value (e.g., any value in the range of the 5th percentile of all collected page load time values to the 15th percentile of all collected page load time values), and setting the multiplier factor equal to a promotion value, if the load time measure is below the threshold value.
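Putting those two excerpts together, the scoring scheme can be sketched as a multiplier applied to a result’s score. The specific percentiles and multiplier values below are illustrative choices within the ranges the patent gives; they are not values Google is known to use:

```python
def load_time_multiplier(load_ms, all_load_times,
                         demote_hi_pct=97, demote_lo_pct=90, promote_pct=10,
                         first_demotion=0.5, second_demotion=0.8,
                         promotion=1.1):
    """Return a score multiplier for a resource, per the patent's scheme.

    Percentile cutoffs and multiplier values here are assumptions:
    the patent only gives ranges (96th-99th, 85th-95th, 5th-15th)."""
    ordered = sorted(all_load_times)

    def pct_value(p):
        k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
        return ordered[k]

    if load_ms >= pct_value(demote_hi_pct):
        return first_demotion   # slowest tail: strongest demotion
    if load_ms >= pct_value(demote_lo_pct):
        return second_demotion  # moderately slow: milder demotion
    if load_ms <= pct_value(promote_pct):
        return promotion        # fastest tail: optional promotion
    return 1.0                  # typical load times: score unchanged
```

A result’s final ordering score would then be something like `relevance_score * load_time_multiplier(...)`: the great majority of pages, sitting between the 10th and 90th percentiles, keep a multiplier of 1.0 and are unaffected.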
Enrico Altavilla‘s blog post “How a Ranking Signal Works” explains Google’s claims. Matt Cutts’ clarification (faster websites do not get any push, but slower websites usually get worse positions) means that the velocity signal is always set to one for fast websites, while its value decreases (yes, decreases) below one as the number of seconds the user has to wait for the resource increases.
It results in the following graphic:
However, this appears to contradict what is written in Google’s patent, which would transform the graphic into this one:
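To make the contrast between the two curves concrete, here is a sketch of both signal functions. The thresholds, slope, and floor value are invented purely to show the shapes of the curves; neither figure comes from Google:

```python
def signal_cutts(load_s, slow_threshold=4.0, floor=0.5):
    """Cutts-style signal: fast sites are capped at 1.0 (no boost);
    slower sites decay below 1.0 down to a floor."""
    if load_s <= slow_threshold:
        return 1.0
    return max(floor, 1.0 - 0.1 * (load_s - slow_threshold))

def signal_patent(load_s, fast_threshold=1.0, slow_threshold=4.0,
                  boost=1.1, floor=0.5):
    """Patent-style signal: same decay for slow sites, but the
    fastest tail can be promoted above 1.0."""
    if load_s <= fast_threshold:
        return boost
    if load_s <= slow_threshold:
        return 1.0
    return max(floor, 1.0 - 0.1 * (load_s - slow_threshold))
```

The two functions agree everywhere except on the fastest tail, which is exactly where Cutts’ “no boost” statement and the patent’s promotion language diverge.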
The above examples would seem to contradict Matt Cutts’ earlier assertions from SMX.
So What Does it All Mean?
Be it negative, positive, or neutral, it seems clear that a website’s speed factors into its aggregate ranking score.
As of 2010, site speed has officially become a ranking factor, according to the official announcement:
Faster sites don’t just improve user experience; recent data shows that improving site speed also reduces operating costs. Like us, our users place a lot of value in speed — that’s why we’ve decided to take site speed into account in our search rankings.
Personally, I think a website’s speed is just as important as the correct structuring of its resources. If a website is fast, search engines can crawl it more quickly, which reduces infrastructure costs.
A website benefits from this because it won’t waste its allotted crawl budget. Spending that budget wisely (by speeding up pages or removing superfluous resources) can, in return, yield better visibility in search engines. The less popular your website is, the smaller your crawl budget; therefore, it is crucial to maintain a high-performing website.
Recently, on our forum, many cases of drastic ranking drops connected to performance have surfaced. Surely one of the most interesting is found in this discussion, which I will sum up with a picture kindly provided by forum user Peppinosh.
Note: The intention of this picture is not to demonstrate that site speed affects rank, but that speed is directly important to a website’s efficacy in relation to search engines.
Let’s Go Beyond the SERPs
From a purely business standpoint, a fast website converts more and costs less. Studies have shown that optimizing loading times results in a measurable boost in conversions. The benefits of a high-performing website are self-evident: for example, a site that performs well will be more accessible through multiple channels, be they organic, paid, or referral. Understanding the direct impact of speed on user experience is fundamentally important for a business’s marketing managers, and the exponential growth of smartphone and tablet use makes optimizing for site speed even more important.
► A Fast Website Attracts Repeat Visitors
We won’t go into detail here; but, according to studies that Google, Microsoft, and Facebook began in 2006, it is well established that loading times affect user behavior. Slow response times are directly related to high bounce rates. Another test, from Yotta, shows that a decrease in loading time on an average-sized site corresponds directly to a decrease in bounce rate.
- “Marissa Mayer at Web 2.0” (2006): http://goo.gl/maFR8v
- “Speed Matters” (2009): http://goo.gl/o5oWzk
- “Bing and Google Agree: Slow Pages Lose Users” (2009): http://goo.gl/7guoCF
- “Every Millisecond Counts” (2009): http://goo.gl/LJow9q
- “How to Measure Page Load Time with GA”: http://goo.gl/KN4xEc
- “Slow Pages Lose Customers”: http://goo.gl/O5MBcf
► A Faster Website Converts Better than a Slow One
One additional second of load time = a 10 percent loss of conversions. In 2009, Amazon discovered that 100 milliseconds of latency cost the company one percent of its sales; extrapolating linearly, a full second of latency would mean a 10 percent loss in earnings. In the e-commerce sector, Shopzilla, another big player, found that reducing its loading times from seven seconds to two seconds led to a 25 percent increase in site visits and a 12 percent increase in revenue.
In a more recent study, Walmart noticed a two-percent increase in conversions after merely improving response times on its pages, which confirms that users are conditioned to act according to their own perception of speed. The following graph sums up a few surveys concerning conversions and load times. A dramatic decrease in conversions as response times rise from one second to four seconds is evident. (I would remind the reader that a response time of four seconds is not considered abnormally slow.)
Those few seconds make a big difference! Another recent study, from Tagman, showed that a one-second delay in page loading can lead to as much as a seven percent loss in online sales. The following graphic shows how an increase in response times corresponds to a decrease in conversions, confirming Walmart’s findings.
As further confirmation of these studies, an analysis conducted by Gomez on 33 online retailers showed that a site’s conversion rate can drop by up to 74 percent as its pages’ response times increase from two to eight seconds.
► Optimal Loading Time
Over time, other analysts have tried to come up with a “magic number” concerning the average user’s tolerance for loading times.
From 2006 to 2012, the loading times users were willing to tolerate shrank from four seconds (on desktops) to two seconds (on tablets). According to surveys conducted in 2006, most users expected a page to load in less than four seconds.
Further surveys conducted in 2009 by Forrester Consulting for Akamai showed that 47 percent of consumers expect a page of an e-commerce website to respond in less than two seconds. This means that, in just three years, the average buyer’s patience for loading times had shrunk by two seconds.
In fact, in 2012, a Compuware survey confirmed that the great majority of tablet users (70 percent) expected response times of two seconds.
- “Online Experiments: Lessons Learned,” Ron Kohavi and Roger Longbotham (Stanford University, 2007): http://goo.gl/89gkaM
- “Shopzilla Site Redesign – We Get What We Measure” (2009): http://goo.gl/IJBJiA
- “Walmart: Real User Monitoring” (2012): http://goo.gl/PyHKUh
- “Just One Second of Delay in Page Loading Can Cause a Seven-percent Loss in Customer Conversions”: http://goo.gl/BoMpfs
- “Gomez’s Best of the Web 2010: Web and Mobile Performance Awards” (2010): http://webperformanceguru.files.wordpress.com
► Repeat Purchases
Site speed can also impact brand loyalty. Even a few milliseconds of delay can drive a consumer away from a site.
A site with a lot of slow pages not only damages conversions, but a slow page response can also affect a user’s overall perception of an online brand.
The Forrester-Akamai survey that was mentioned above has shown that 79 percent of users are less inclined to re-visit a page that is too slow, which means that these users are even less likely to make purchases on such a page.
This shows that website speed can have a huge impact on a brand’s online and offline reputation.
- “Retail Web Site Performance: Consumer Reaction to a Poor Online Shopping Experience,” Jupiter Research (2006): http://goo.gl/HvoGJ5
- “Akamai Reveals Two Seconds as the New Threshold of Acceptability for e-Commerce Web Page Response Times” (2009): http://goo.gl/IcmVSY
- “Poor Web Performance Results in Tablet Users Less Likely to Make Purchases Online,” Compuware (2012): http://www.compuware.com/
► Speed Is Also Crucial for Mobile Users
Mobile users are no different than their desktop counterparts when it comes to tolerating a slow page. Nowadays, users expect the same loading times for mobile devices as for desktop devices. From a search engine’s point of view, this element also factors into a site’s ranking.
A study by Gomez shows that users expect websites to load on their smartphones as quickly as they would on their desktop computers. According to this survey, 74 percent of users will abandon a website if they are required to wait more than five seconds for it to load.
The following graphic shows that drop-out rates increase as response times increase.
► The Perception of Speed
Although we should always consider our website’s total loading time, it is more important to consider how quickly we can make our site usable. The perception of speed, which I spoke about at the Convegno Nazionale sul Search Marketing GT 2013, is something that we need to focus on. In my speech at the Web Marketing Festival, I explained that a decrease in TTI (Time to Interactive) will lead to an increase in conversion rates.
At WebReevolution, I spoke only about optimization for desktop users. Later, at the Convegno, I focused on both desktop and mobile users; by that point, I’d found that mobile conversions had increased relative to the WebReevolution case study.
Around the same time, at London Velocity 2013, Radware presented the findings of a really interesting survey explaining the neurological impact on consumers of even a few milliseconds of website latency.
Velocity or a lack thereof?
So which is it? A search engine could and should consider both, but people (apparently) prefer faster websites.
Speed kills! But not in the way you would normally think.
Andrea Pernici is Search Marketing Manager at GT Idea. His duties there include supervising internal development, speaking at conferences, managing their community magazine and forums, and consulting for big brands.