Google recently announced (on its webmaster blog) that its PageRank algorithm will include “page load time” among the factors used to rank search results.

The inclusion of “page load time” will likely have a limited effect for now; in Google’s words:

“while site speed is a new signal, it doesn’t carry as much weight as the relevance of a page”.

This is nevertheless an important change, as it introduces performance as a ranking factor in an approach that was so far based entirely, directly or indirectly, on the semantics and popularity of the page itself.

While the announcement did not surprise me much (I was honestly convinced this was already happening), the hundreds of comments on the blog entry did express surprise, and a significant number of them heavily criticized the decision. Some of the criticisms:

  • the approach is unfair to semantically rich web sites (e.g. sites with lots of photos, rich textual entries, etc.)
  • the approach is unfair to small / personal sites and favors corporations with large infrastructures and computing power
  • the approach is for Google’s sole convenience, because faster pages mean quicker and cheaper indexing

We think that Google’s decision is strongly positive, for several reasons.

Firstly, the main implication of this change is that web pages are no longer considered documents, but rather services (of which informational services are just one relevant part), and for services QoS (and especially performance) is known to be a primary concern.

Secondly, we think fairness is not an issue here. It is fair to reward a page with faster load time (better for the user) with a higher ranking than semantically equivalent but slower pages. It is also fair to rank a semantically worse page higher if it largely outperforms a better piece of writing, assuming the relative weight of performance against semantics is reasonable. And since PageRank is among the most studied algorithms in history, that seems very likely.

Thirdly, web page performance is nowadays mostly influenced by good or bad design rather than by the availability of computing power. For static web pages, which are still the most relevant for search engine results, the greatest source of optimization is client time, and the actions that can be taken (optimizing caching, minimizing round-trip times, minimizing request overhead, minimizing payload size, optimizing browser rendering) derive from design aligned with best practices, which can be followed using free resources (see here what Google itself offers).
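As a small illustration of the “minimizing payload size” point, the sketch below (plain Python, using a made-up, deliberately repetitive HTML payload) shows how much redundant markup shrinks under gzip compression. This is why enabling compression on the server, a free configuration change, is one of the cheapest optimizations available:

```python
import gzip

# Hypothetical payload, used only for illustration: real HTML pages
# contain similarly redundant markup (repeated tags, class names, etc.).
html = ("<div class='entry'><p>Lorem ipsum dolor sit amet.</p></div>\n" * 200).encode("utf-8")

# Compress the payload the way a web server with gzip enabled would
# before sending it to a browser that advertises Accept-Encoding: gzip.
compressed = gzip.compress(html)

ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"(ratio: {ratio:.2%})")
```

Most mainstream web servers support this out of the box; it only needs to be switched on in the configuration, which is exactly the kind of no-cost, design-level action available to small sites as much as to large ones.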

Finally, SEO, which is a de facto obligation for an effective Internet presence, will incentivize positive behaviors from page authors, from which users will ultimately benefit. A better Internet driven by better rules. The best way to describe it is the closing sentence of Google’s blog post:

“We encourage you to start looking at your site’s speed not only to improve your ranking in search engines, but also to improve everyone’s experience on the Internet.”

Our way of seeing it: we cannot see any way in which rewarding quality can be a bad decision.

Paolo