A website owner’s most precious treasure is the user base visiting the site or buying products through it. Without users, a website is nothing. With this thought in mind, developers create apps and websites for the users, built with the users in mind; they want to satisfy the users’ needs.
What they often forget is that users won’t wait forever for a page to load; they navigate away if it takes too long, which results in fewer page views and a lower ROI, no matter how good the application is. A slow-loading page is simply bad user experience.
Back in April 2010, Google announced that page speed is considered a ranking factor, and if you think this means that Google is now a TimeCop, you’re quite right. Google was the first major search engine to recognize that a slow-loading page is a very bad user experience, and even though page speed is only one of the more than 200 signals on which a page’s rank in search results is based, in certain situations it may mean a precious clickthrough.
How is Page Speed measured?
Google measures a page’s speed using reports made by the Google Toolbar installed on millions of users’ computers, as well as the time spent by Googlebot(s) downloading a specific page. This means the new ranking algorithm potentially relies on millions of reports for every single website, so it’s fairly accurate.
The fact that the download time measured by Googlebot also counts raises an interesting question: what if your website is hosted on high-latency servers, or in a country from which Googlebot cannot connect speedily? This is likely why Google does not rely solely on the data provided by Googlebot and takes the Toolbar data into consideration too; in practice, one compensates for the other.
When is Page Speed taken into consideration?
Matt Cutts confirmed that only about 1% of search results are affected by this ranking algorithm. That sounds like almost nothing, and it is very likely that page speed is taken into consideration only in very specific cases. A possible case is when two pages on two distinct hosts provide very similar content; page speed may then have the veto right in deciding which page should rank higher for a specific query. Another case, separate from Web Search, may be the landing page quality of AdWords ads: faster-loading pages may receive a higher landing page quality score.
What are the best practices to speed up the pages?
There are only a few things a developer has to address in order to speed up a page, and they fall into three major categories:
- decrease payload
- minimize the number of requests
- optimize code for rendering
As you may have noticed, there is no suggestion to upgrade servers or uplinks; this is because if Googlebot measures the download speed as slow, that will likely be compensated by the Toolbar reports. In other words, webmasters don’t necessarily have to switch web hosts just because one is faster than another; switching web hosts is likely to cause more issues than it solves.
About Page Speed data
Webmaster Tools, as well as Google’s own PageSpeed add-on for Firebug, shows webmasters a speed score broken down by page. The score is calculated from various signals, and it is quite hard to achieve a top-notch 100/100 score in PageSpeed, for example. A webmaster’s aim should be to get as close to that score as possible; however, they must not shoot themselves in the foot with the speed optimization.
The tool, advert, and gadget scripts grabbed from the Google delivery network are often slow to download or render, for example, and they frequently prevent the webmaster from achieving the top score; yet removing these scripts may have a worse overall impact than leaving them in place.
If the webmaster considers that a script is useful for the users or for the website, but it also slows down the site, they must make a decision. Removing the adverts from a page, for example, leaves the website without income and the site may eventually have to close; removing the analytics scripts makes it harder to optimize the site further, which slows its evolution. Webmasters have to decide what is better for the users in the long run. Again, what is better for the users, not for the tools created by engineers.
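Before removing a useful but slow script entirely, one middle ground worth weighing is loading it asynchronously, so it downloads without blocking the page from rendering. A minimal sketch of that technique follows; the function name and script URL are purely illustrative.

```javascript
// Sketch: inject a third-party script tag so the browser fetches it
// without blocking HTML parsing and rendering. The URL is hypothetical.
function loadScriptAsync(doc, url) {
  var s = doc.createElement("script");
  s.src = url;
  s.async = true; // hint: download in the background, don't block the parser
  var first = doc.getElementsByTagName("script")[0];
  first.parentNode.insertBefore(s, first);
  return s;
}

// In a page: loadScriptAsync(document, "https://example.com/widget.js");
```

This is the same pattern the asynchronous analytics snippets of the era use: the slow script still runs, but the user sees the page first.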
Google says that webmasters should create content for the users, not for the search engines. This guideline applies to page speed as well: we shouldn’t optimize our pages’ load time because Google demands it; we should optimize it because users will reward it. Reports show that faster pages increase user engagement: users search more, visit more pages, stay longer on the website, and buy more; everyone’s happy.
Webmasters should achieve the best score they can without ruining other, beneficial aspects of the site; the Google engineers are smart enough not to send the web back to the Web 1.0 era. Speeding up websites is a good deed, rewarded by the users and now by Google as well. SEOs will slowly start to offer services focused on speeding up websites, web-based applications will emerge to help webmasters optimize their pages for speed, and as the optimizations are implemented, the result will be a better web. It won’t happen in a day, a week, or a month; it is a slow but steady process that may take years. Either way, the die is cast, thanks to Google.