

Browser compatibility as a ranking factor?

   
11:25 am on Nov 30, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



We've long established that W3C compliance of websites is not a ranking factor per se, but what about browser compatibility?

Say you have a website that's optimized for modern browsers, employing techniques such as CSS sprites, and you have no fallback methods for older browsers in place (for whatever reason). In the case of CSS sprites, for example, your images won't load for visitors using IE 7 or older.
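For illustration only, here's a minimal sketch of one possible fallback (not something from the thread; the stylesheet paths and UA test are my own assumptions): sniff the User-Agent server-side and hand old IE a non-sprite stylesheet, so image-based elements still render for those visitors.

```typescript
// Hypothetical server-side fallback: serve a non-sprite stylesheet to old IE
// so image-based UI elements still appear. Stylesheet paths are invented.
function stylesheetFor(userAgent: string | undefined): string {
  const ua = userAgent ?? "";
  // "MSIE 7.0"-style tokens identify IE 7 and older in UAs of this era.
  const isOldIE = /MSIE [1-7]\./.test(ua);
  return isOldIE ? "/css/site-no-sprites.css" : "/css/site.css";
}

// Usage: a page template would inject the returned path, e.g.
//   const href = stylesheetFor(req.headers["user-agent"]);
//   html = html.replace("{{stylesheet}}", href);
```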

Given Google's never-ending quest to give their users the best possible experience, and supposing they're capable of detecting the above, wouldn't it make sense for them to lower your rankings in searches performed by users of such older browsers?

I'm just throwing this out there, even though I realize this is already indirectly embedded in Panda. After all, if you don't support older browsers, the resulting poor user experience is likely to affect engagement metrics such as bounce rate and time on site.
4:08 pm on Nov 30, 2011 (gmt 0)



I use my visitor metrics to determine which browsers I need to optimize for. Typically, my rule is:

- The site is compatible with the latest versions of IE, Firefox, Chrome, and Safari
- The site is compatible with the previous two versions of IE, Firefox, Chrome, and Safari
- The site is compatible with any publicly available beta versions of IE, Firefox, Chrome, and Safari
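Not the poster's actual setup, but a rough sketch of how visitor metrics can drive that support list: tally the User-Agent strings from an access log and look at each browser family's share of traffic (the family patterns below are simplified).

```typescript
// Simplified UA patterns; order matters because Chrome UAs also contain "Safari".
const FAMILIES: Array<[string, RegExp]> = [
  ["IE", /MSIE \d+/],
  ["Firefox", /Firefox\/\d+/],
  ["Chrome", /Chrome\/\d+/],
  ["Safari", /Version\/\d+.*Safari/],
];

// Count visits per browser family and convert to a share of total traffic.
function browserShare(userAgents: string[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const ua of userAgents) {
    const match = FAMILIES.find(([, re]) => re.test(ua));
    const family = match ? match[0] : "Other";
    counts.set(family, (counts.get(family) ?? 0) + 1);
  }
  const total = userAgents.length || 1;
  const shares = new Map<string, number>();
  counts.forEach((count, family) => shares.set(family, count / total));
  return shares;
}

// Anything above some threshold (say 2% of visits) goes on the "must test" list.
```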

As a data point, I rank very well with a site where I make sure browser compatibility works as intended, so it definitely doesn't hurt.
7:27 pm on Nov 30, 2011 (gmt 0)



I'd say making sure your site works at least reasonably well on a smartphone is far more important than worrying about older browsers. Anything with IE6 or IE7 in the user agent and not in compatibility mode is more than likely a bot anyway, often an msnbot faking the UA.
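As an aside, the usual way to confirm whether a visitor claiming to be msnbot (or hiding behind an old-IE UA) is the real crawler is a forward-confirmed reverse DNS check. A hedged Node/TypeScript sketch, assuming the documented search.msn.com hostname convention:

```typescript
import { promises as dns } from "node:dns";

// Reverse-resolve the IP, check the hostname belongs to search.msn.com,
// then forward-resolve it and make sure it points back at the same IP.
async function isGenuineMsnbot(ip: string): Promise<boolean> {
  try {
    const hostnames = await dns.reverse(ip);
    const host = hostnames.find(h => h.endsWith(".search.msn.com"));
    if (!host) return false;
    const forward = await dns.lookup(host);
    return forward.address === ip; // forward-confirmed reverse DNS
  } catch {
    return false; // no PTR record or lookup failure: treat as unverified
  }
}
```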
8:02 pm on Nov 30, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



It's an interesting question - really made me think. After all, we've seen over many years that even different CURRENT browsers can get different versions of the search results.

The challenge Google would face is knowing whether the website loses critical functionality in some older browser, not just whether the visitor's experience degrades in some way or other - because it could be graceful degradation, after all.

And there's the rub, as I see it. How on earth could that challenging determination be scaled and made machine-based?

If Panda's attempt at measuring quality through an algorithm seems to be a mess at times, just imagine trying to program an algorithm to detect "ungraceful" degradation or loss of "critical" functions on some specific older browser. First it would need to understand what specific functions are critical to that specific website, just to have a place to start.
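Just to underline that scaling problem: the mechanical part of a cross-browser comparison is easy (a naive pixel diff between two rendered screenshots, sketched below under the assumption that the screenshots are captured elsewhere), but the number it produces says nothing about whether the difference is graceful degradation or a broken checkout button.

```typescript
// Fraction of pixels that differ between two same-sized RGBA screenshots.
// A high ratio flags that something changed, but not whether it was "critical".
function pixelDiffRatio(a: Uint8ClampedArray, b: Uint8ClampedArray): number {
  if (a.length !== b.length) return 1; // different dimensions: treat as fully different
  let differing = 0;
  for (let i = 0; i < a.length; i += 4) {
    // Compare the RGB channels of each pixel (every fourth byte is alpha).
    if (a[i] !== b[i] || a[i + 1] !== b[i + 1] || a[i + 2] !== b[i + 2]) {
      differing++;
    }
  }
  return differing / (a.length / 4);
}
```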
2:47 am on Dec 1, 2011 (gmt 0)



I think they must end up lowering your rankings for all your users if your site really doesn't display well in older browsers, because Google has said they use bounce rates as a ranking signal.
1:53 pm on Dec 1, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



wouldn't it make sense for them to lower your rankings in searches performed by users of such older browsers?

Yes, it does, and they can detect it to some extent. Two factors come to mind: the browser UA, and the bounce rate on requests from that specific UA. They could take it further to individual pages.

But how practical would that be? They would need to keep track of it in some way, store and manage the data, and then filter the search results they return.
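Purely as a speculative sketch of the bookkeeping being described (the Visit shape is invented for illustration): group visits by page and browser family and compute a bounce rate for each combination, so an unusually high figure for one UA on one page stands out.

```typescript
interface Visit {
  page: string;
  browserFamily: string; // e.g. derived from the User-Agent
  bounced: boolean;      // single-page visit with no further interaction
}

// Bounce rate per (page, browser family) combination.
function bounceRateByBrowser(visits: Visit[]): Map<string, number> {
  const totals = new Map<string, { visits: number; bounces: number }>();
  for (const v of visits) {
    const key = `${v.page} | ${v.browserFamily}`;
    const entry = totals.get(key) ?? { visits: 0, bounces: 0 };
    entry.visits++;
    if (v.bounced) entry.bounces++;
    totals.set(key, entry);
  }
  const rates = new Map<string, number>();
  totals.forEach((t, key) => rates.set(key, t.bounces / t.visits));
  return rates;
}
```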
4:14 pm on Dec 1, 2011 (gmt 0)

10+ Year Member



I believe this is a ranking factor, which is why I made sure the latest redesign of one of my sites works perfectly well in IE6 and IE7.

Only a small percentage of visitors use these browsers, but if it means the user experience is improved for a few thousand visitors, then I think it's worth doing.

Catering for these browsers will improve overall user metrics (even if only to a small degree).
10:52 pm on Dec 1, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



First it would need to understand what specific functions are critical to that specific website, just to have a place to start.

Right, and so the percentage of collateral damage (i.e. incorrect interpretations of what's critical) would probably be too high for this to work at a large scale. A cross-browser comparison of usage statistics isn't perfect either, but good enough perhaps to be allowed some weight in the grand scheme of things.
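A rough sketch of that cross-browser usage comparison (data shapes and threshold invented for illustration): if a browser's share of a site's visits falls far below that browser's share of web traffic overall, the gap is a weak hint that the site may not work well in it.

```typescript
// Flag browsers whose share of this site's visits is well below expectation.
function compatibilitySuspects(
  siteShare: Map<string, number>,   // browser -> fraction of this site's visits
  globalShare: Map<string, number>, // browser -> fraction of overall web traffic
  ratioThreshold = 0.5              // flag if observed share < 50% of expected
): string[] {
  const suspects: string[] = [];
  globalShare.forEach((expected, browser) => {
    const observed = siteShare.get(browser) ?? 0;
    if (expected > 0 && observed / expected < ratioThreshold) {
      suspects.push(browser);
    }
  });
  return suspects;
}
```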

You could take this in other directions, too: why send someone on a dial-up connection to a resource-heavy site?