|Should Google be held socially responsible for web accessibility, usability and web standards?|
From an Adam Lasnik interview on StoneTemple [stonetemple.com], discussing site accessibility, web standards and code cruftiness.
[...] we cannot use this in our scoring algorithms currently: There are a ton of very high quality sites, pages and sites from universities, from research institutions, from very well respected ecommerce stores, of which I won't name any, that have really crufty sites, and sites that won't validate. On some of these you can view the source and cry. And, because this is quality content, we really can't use that as an effective signal in search quality.
I know that some of you may think:
"Google is a commercial entity, why should they play a part in these issues?"
Then I would reply to you:
"Google has so much power on the internet today and they are making so much money out of 'us'. If they can have a major 'positive' impact on the web, why shouldn't they?"
Never has a single company held so much power on the net. If Google said today that standards-compliant, accessible and usable sites would get a significant boost in the SERPs, how long do you think it would take for all the 'universities, research institutions, and very well respected ecommerce stores' to get their act together and have their sites validate, be more usable and more accessible?
I am thinking that within 3 to 6 months, most of these would. How about corporate CMS merchants, custom-built CMSs, software vendors, etc.? How long before they all adhered to the 'standards'?
Who's coming with me to the 'plex to demonstrate for a significant boost for standards-compliant, accessible and usable sites? :)
I disagree completely. There is tons of valuable content out there created by people who neither know nor care about web standards. A search engine is about getting people to information, and the value and quality of information is not correlated in any way with the manner in which it is marked up.
Google really needs to get its own house in order first.
My experience of working as part of a (UK) university web team is that:
a) accessibility is a very high priority
b) SEO is not (rankings happened naturally due to the depth and breadth of content)
c) there are too many people involved with publishing, outside of the central team, with varying levels of skill and enthusiasm, to have validation across the board (I have been called a nazi for banning font tags!) - you cannot police everyone all the time.
d) wide-scale change moves at a glacial pace
I knew someone would :).
|There is tons of valuable content out there created by people who neither know nor care about web standards. |
That's exactly my point. If Google said that standards-compliant sites would get a significant boost, guess how long it would take for them to learn about it. How long do you think it would take software vendors to make their latest web design software 100% standards-compliant, if everyone was asking for compliance?
|value and quality of information is not correlated in any way with the manner in which it is marked up. |
Maybe not now, but I think that, for equally valuable information, the version that is properly marked up, accessible, and shows extra care from the designer in the markup should appear first.
|a) accessibility is a very high priority |
and so it should be
|b) SEO is not (rankings happened naturally due to depth and breadth of content) |
and maybe that is thanks to the accessibility focus in a)
|c) there are too many people involved with publishing, outside of the central team, with varying levels of skill and enthusiasm, to have validation across the board (I have been called a nazi for banning font tags!) - you cannot police everyone all the time. |
I assume that you use a CMS of some kind for those non-techie people. Does it produce standards-compliant code out of the box?
|d) wide-scale change moves at a glacial pace |
ok maybe for universities, but for businesses or ecommerce sites I am sure that within 6 months they'd all be standards-compliant.
|I am sure that within 6 months they'd all be standards-compliant. |
What world do you live in? What internet do you browse? :)
I am certain they wouldn't.
Gazillions of sites and pages are out there which are old, unedited for years. But very, very good. The original creators have often moved on to other things and possibly don't even have access to the pages anymore in some cases. Who on earth is going to go back and redo them all?
Gazillions of sites and pages are out there that have value as information and are created by people who not only don't know or care about standards, they have never heard of them; they are blissfully unaware that such things exist. It won't matter what software providers do; these people will carry on using the software they bought in 1999.
And anyway, which standards? I can make a site that validates perfectly but is a total disaster from a usability viewpoint, or equally one that rates highly for usability but doesn't necessarily validate. How do you assess usability in an algorithm? Would one standard be better than another? Why? In what situation? It's a minefield.
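The "how do you assess usability in an algorithm?" question is easy to illustrate. Below is a deliberately naive sketch in Python; the heuristics are entirely my own invention, not anything any search engine is known to use. The point is that a perfectly valid document can fail every one of these mechanical checks, while an invalid tag-soup page can pass them all.

```python
from html.parser import HTMLParser

class NaiveUsabilityCheck(HTMLParser):
    """Collects a few crude, mechanical 'usability' signals.

    These heuristics are hypothetical examples, chosen only to show how
    little of real usability an algorithm can capture.
    """
    def __init__(self):
        super().__init__()
        self.issues = []
        self.has_heading = False
        self.has_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Flag images with no alt text (ignores legitimate decorative images).
        if tag == "img" and not attrs.get("alt"):
            self.issues.append("img without alt text")
        if tag in ("h1", "h2", "h3"):
            self.has_heading = True
        if tag == "title":
            self.has_title = True

def check(html):
    checker = NaiveUsabilityCheck()
    checker.feed(html)
    if not checker.has_heading:
        checker.issues.append("no headings")
    if not checker.has_title:
        checker.issues.append("no <title>")
    return checker.issues

# This document is perfectly valid markup, yet trips every heuristic above:
valid_but_poor = (
    "<!DOCTYPE html><html><head></head>"
    "<body><img src='x.png'></body></html>"
)
print(check(valid_but_poor))
# ['img without alt text', 'no headings', 'no <title>']
```

And that is the easy part; signals like "is the navigation discoverable?" or "does the link text make sense out of context?" have no mechanical test at all.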
|ok maybe for universities, but for businesses or ecommerce sites.. |
Which would give business and ecommerce sites an advantage in Google over information/educational sites. This would not necessarily serve Google's users well.
It's just a non-runner. Theoretically it is a nice idea, I agree, but in practice it rewards those with money and resources, not necessarily those with knowledge.
|Which would give business and ecommerce sites an advantage in Google over information/educational sites. This would not necessarily serve Google's users well. |
Guess who really would be the quickest to jump on the gravy train for ranking advantages, without bureaucracy to wade through?
I'm sure Matt Cutts and his webspam team would be absolutely thrilled to pieces if this happened.
[A note that, of course, the following is all broad generalization. But it accurately represents what I've seen.]
|If Google said today that standard compliant site, accessible and usable site will get a significant boost in the SERPs, how long do you think it will take for all the 'universities, research institutions, and very well respected ecommerce stores' to get their act together and have their site validate, be more usable and more accessible? |
I don't know about the ecommerce stores, but I can tell you that the majority of people who have anything to do with most university and (to a smaller degree) research institute websites would have no idea what that quote is talking about. (Just judging from my own experience, research institutes are more likely to have one centrally-run website than universities are, and can usually find someone who knows how to run it.)
The university I work for (in a medical research department) has one of the better IT teams around, but it's understaffed for even taking care of day-to-day operations, and can't control what departments, let alone individual faculty members, put online. The medical school is glacially introducing a CMS, which should help if it ever gets finished. Problem is, it's not mandatory, so even when it's fully implemented a lot of people will still be doing their own thing - which often means using FrontPage and being happy if they can get their nav buttons to work.
An idea of what the above quote would elicit from most university research department "webmasters":
What's Google? (MSN is much preferred, for the minority who know what a search engine is. But why wouldn't you just bookmark PubMed?)
Hey, we're really good at "standards compliant" when it comes to dealing with NIH and the IRB, but what does it have to do with the internet?
Validate? Oh, sure - We validate a study when we can reproduce the results. What's a "validating site?" - Of course, it's the site (NIH's term for an institution) where the research is done!
Accessibility is very important - Our entire campus is accessible. But I thought this was supposed to be about computers...?
"Hey, my nav buttons work!"
"Wow, how'd you do that?"
SERPS? -- C'mon, get real. The departments/faculty running their own sites don't know what the letters stand for. The pros on the IT team don't care. If they worry about accessibility it's because it's the right thing to do, not because of search engines.
The one possibly saving grace is that most of the websites are simple, which is usually a good thing for accessibility. But if a full professor learns how to implement Flash - RUN! :o
The best place to force some sort of validation and accessibility is the browser. Think about it: many people who publish a website don't know or care about SEO, and even fewer care about Google in particular, but they all care whether the website displays on major browsers. It should be the browsers (IE, Firefox, Opera, Safari) that are held responsible for validating pages and rejecting bad ones, maybe starting with the next version of HTML. Even better, all browsers should render the same code exactly the same way, but I suppose I'm really dreaming there...
Not that I'm asking for the next version of browsers to simply stop rendering old non-compliant websites. But starting with the next set of HTML standards, any document that wants to take advantage of the new features should only be rendered if it validates thoroughly.
|but they all care whether the website displays on major browsers |
No they don't. There are thousands and thousands of people regularly putting content online and creating websites who have barely heard of Firefox, much less downloaded it.
When you hang around places like this you are by definition much more informed than the vast majority of 'webmasters' and it's very easy to forget that.
The trouble with enforcing validity via the browser is that the end-user will download a new browser, try their favorite websites, see that those sites are broken, blame the browser rather than the site, and revert to what they were using before. And they would be quite right to do so, as a browser's job is to display web pages, not to evangelize ideals. The abject failure of XHTML shows the fallacy of a web with draconian error handling.
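The XHTML point can be made concrete with a small sketch, using Python's standard-library parsers as stand-ins for a strict and a lenient browser engine (an assumption for illustration only): the same slightly broken markup is fatal under draconian, XML-style parsing, but fully recoverable for a tag-soup parser.

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

# Typical real-world markup: <p> and <br> are never closed.
broken = "<html><body><p>unclosed paragraph<br></body></html>"

# Strict (XHTML-style) parsing: one unclosed tag aborts the whole document.
try:
    ET.fromstring(broken)
    strict_ok = True
except ET.ParseError:
    strict_ok = False

# Lenient (tag-soup) parsing: the parser shrugs and keeps going.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

collector = TagCollector()
collector.feed(broken)

print(strict_ok)       # False: the strict parser rejects the page outright
print(collector.tags)  # ['html', 'body', 'p', 'br']: everything recovered
```

A user whose browser behaved like the strict parser would see a blank error page; one whose browser behaved like the lenient parser would see the content. It is not hard to guess which browser they would keep.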
Same goes for Google, unfortunately. If they were to promote accessibility by increasing ranking for accessible sites, their results would suffer as accessibility is not a criterion for relevance. Even if it were, there would be significant difficulties in algorithmically measuring accessibility, as the severe limitations of the current crop of accessibility validators show. Markup validity against a declared DTD is an extremely poor guide to accessibility when taken alone, so that would add no value to Google's algorithm either.
And if they did ever attempt something like this, who would be able to fix their sites in consequence? Those with the funds and the expertise to do so, like commercial entities, large companies, IT professionals. How would that "other" accessibility be affected - the accessibility which comes from the universality of the web, the low barriers to entry, the non-professional publishers, the personal pages...?
If Google is serious about web accessibility, it could always start by looking at its own coding practices and product sites before reaching out to push the concept on others.
|No they don't. There are thousands and thousands of people regularly putting content online and creating websites who have barely heard of Firefox, much less downloaded it. |
True, but those people have heard of IE or Safari, or some other major browser, no? It's rare that someone publishes a website without trying it out in at least one major browser. If all major browsers strictly enforced validity, then people would be forced to create valid HTML, or at least their authoring programs would be forced to produce it.
|The trouble with enforcing validity via the browser is that the end-user will download a new browser, try their favorite websites, see that those sites are broken, then blame the browser not the site, and revert back to what they were using before. |
But if all major browsers enforced validity strictly then the author would be forced to fix the problems before publishing their site (assuming they test it on at least one major browser).
I agree with you that this will never happen; the net is too unregulated (and rightly so) for one entity to enforce anything beyond the minimum requirements. If one search engine gets picky, we all switch to another; if one browser gets picky, we all switch as well. The only way is a concerted effort by all the major players, and even then I believe it's better done at the browser level rather than the search engine level, and only for future versions of HTML, so no one has to go back and fix existing sites.