| 9:50 pm on Apr 30, 2004 (gmt 0)|
This was suggested at the ODP in the public forums a long time ago, and the answer to that was simply "we are not the internet standards police". I wonder how people would view Google if they started doing something like that. I would welcome it, in some ways, but I think there would be a backlash of some sort.
There are many problems with doing what you suggest, simply because there are probably over a billion static pages of information that the webmaster owner is never going to update ever again. There are loads of sites out there where the owner has lost interest, but they are still online. There are millions of university pages where the owner has moved on but all their old projects are archived, and so on.
I'd like to see the web "cleaned up", but I can't see it happening anytime soon.
| 10:05 pm on Apr 30, 2004 (gmt 0)|
>why don't they make the web a standards compliant place with strong steps towards accessibility
More often than not, standards-compliant pages are auto-generated by SEOs, while on the other hand hand-coding amateurs often leave their pages in a mess. If I were Google, with its reputation for penalizing SEOs with OOP, a standards-compliant site would wave a red flag at me, to be checked manually.
| 10:09 pm on Apr 30, 2004 (gmt 0)|
I wouldn't object if pages which don't validate received a -1 PR penalty if rated 4 or above. I.e., 3 would still be the threshold for an "interesting" page (where backlinks matter), regardless of whether it validates.
As far as accessibility goes, maybe a slight bonus would be in order for pages ranked 7 or below. The web fails miserably at being accessible to people with disabilities, and I agree that positive encouragement to improve this situation would be a good thing.
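For concreteness, the two-part rule proposed above could be sketched as follows. This is purely hypothetical (nothing Google has said it does); the thresholds come from the post, and the size of the accessibility bonus is a guess:

```python
def adjusted_pr(pr: int, validates: bool, accessible: bool) -> int:
    """Sketch of the ranking tweak proposed above (hypothetical rule):
      - pages at PR 4+ that fail validation lose 1 point;
      - pages at PR 7 or below get a small accessibility bonus.
    """
    if pr >= 4 and not validates:
        pr -= 1  # -1 PR penalty for non-validating pages rated 4 or above
    if pr <= 7 and accessible:
        pr += 1  # "slight bonus" for accessible pages; magnitude is a guess
    return pr

# A non-validating PR 5 page drops to 4; a PR 3 page is untouched.
```

Under this sketch, pages below the "interesting" threshold keep their rank regardless of validity, which matches the poster's intent.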
| 10:13 pm on Apr 30, 2004 (gmt 0)|
I think it's a great idea.
Even a 0.5% bias in favour of webpages that validate would send SEOs into a frenzy of creating valid code.
That would load pressure onto the toolset writers. The next iteration of their tools would be standards compliant, because that's suddenly what the market is screaming for.
And everyone can then generate valid code.
And yes, there will be billions of pages, normally non-commercial ones, of the sort that g1smd mentions that won't be validated or updated. But they will generally be ranked against each other, so the relative rankings will not be affected.
Google could justify it by saying something like "we give a small priority in the SERPS to pages that are likely to display on any user agent / browser with few problems"
| 10:43 pm on Apr 30, 2004 (gmt 0)|
While I consider validating pages to be a Good Thing, giving a bonus to pages that validate would most likely be a step backwards.
The ONLY people that would start validating their code are those that REALLY care about their ranking, i.e. SEOs.
So while it would be good that the SEOed sites would have cleaner code, it would be bad in that it would probably lead to a reduced quality of the SERPs.
Good for the web, bad for google. I think I know which way Google will decide.
| 10:52 pm on Apr 30, 2004 (gmt 0)|
|The ONLY people that would start validating their code are those that REALLY care about their ranking, i.e. SEOs. |
Talk about Google killing off the Mom and Pop sites ... this idea is a big step in that direction.
| 11:13 pm on Apr 30, 2004 (gmt 0)|
I don't really care for the standards, and I refuse to design my sites by them. My code isn't "sloppy" or anything to that effect, it's just that the standards are somewhat limiting and sometimes I produce content that's outside of that scope.
If Google (or any SE) were to start making adjustments based on standards, no good would come of it. The ONLY positive effect would be that the efforts of the "standards geeks" would finally be validated, and other than that would serve no real value to everyone else.
Search engine results SHOULD NOT be based on how "good" your code is or whether it's compliant with this or that. Search Engine results should be based on CONTENT - how closely does the CONTENT of the site match what you are searching for - not how good the web designer was in making that site "compliant" with certain standards.
If Google (or any other SE) were to rank sites based on standards, I would IMMEDIATELY stop using them. When I'm looking for content, I don't care what brand of HTML the designer used; just give me what I'm looking for.
This change would certainly NOT be "for the better".
| 12:33 am on May 1, 2004 (gmt 0)|
>> The ONLY people that would start validating their code are those that REALLY care about their ranking, i.e. SEOs. <<
Maybe the people who write FP, DW, et al. would at last write tools that produce valid, compliant code.
>> I don't really care for the standards, and I refuse to design my sites by them. <<
So, how many people arrive at your site then fail to get at the content, I wonder?
You know, requiring a certain level of validity would stop junk like what I see when I do a search for meta name= content=, which returns nearly three million results, most of them pages with a malformed <title> tag. Ten seconds with a validator, before publishing, would have caught that.
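The malformed-<title> bug described here (a title that is never closed, so the meta tags after it get swallowed into the title text) is easy to catch mechanically. A minimal sketch of such a pre-publish check, using Python's standard-library html.parser rather than the W3C validator; the two sample pages are invented for illustration:

```python
from html.parser import HTMLParser

class TitleCheck(HTMLParser):
    """Flags a <title> that is never closed, so the tags that follow
    (e.g. the meta tags) get swallowed into the page title."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.broken = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif self.in_title:
            # another tag opened before </title>: the title was never closed
            self.broken = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

def title_is_broken(html: str) -> bool:
    checker = TitleCheck()
    checker.feed(html)
    return checker.broken

# Invented sample pages, for illustration only:
good = '<head><title>Widgets</title><meta name="description" content="x"></head>'
bad = '<head><title>Widgets<meta name="description" content="x"></head>'
```

A real pre-publish check would of course run the page through a full validator; this only catches the one symptom described in the post.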
| 4:10 am on May 1, 2004 (gmt 0)|
"Improve" the web by making it easier for search-engine optimizers to boost no-content pages?
To call that idea "half-witted" would be overestimating it.
| 5:01 am on May 1, 2004 (gmt 0)|
When I was in high school and college, "zines" were all the rage due to the advent of inexpensive desktop publishing and photocopying.
I had been contributing to and creating my own small magazines and chapbooks since '84 or '85, but I was frustrated by the fact that the "million dollar" layout idea in my head couldn't be turned into something that could be copied off at the local copy shop for under $.25 per copy.
So I met the challenges of my budget: I learned how to silkscreen, use PageMaker, Quark, and Photoshop, and shoot, develop, and print my own photos, and ended up turning out some pretty good stuff with some "co-conspirators" on a broad range of topics.
Then one day in '94 or '95, while at a Kinko's copy shop at 3 AM, one of the clerks said, "dude, you should make this a webpage". Later that week, while making my first web page, I thought, "Wow - now I can make a design exactly like I envision it, and it's 'free'".
Despite the huge commercial interests on the web, its primary purpose is to allow the free and simple dissemination of information between disparate individuals.
So when I read something like "let's ban all non-compliant pages and make the web better", I wonder if you realize that, apart from commercial opportunity, the web has a huge unrealized potential to let people express themselves. Blogs are certainly a part of that.
I also know that there are probably 80 terabytes of crap for every kb of relevant and amazing information - but that's a fair trade off, as far as I am concerned.
If you want to make the web better, teach your granny how to blog and an underprivileged kid how to code HTML and get the web out of the hands of "high priests" and elitists. If IE can look at a page and understand the content on it, Googlebot et al should too.
i will now resume creating content for google's users. thank you.
| 5:44 am on May 1, 2004 (gmt 0)|
I'll make my web pages standards compliant when the people who write web browsers start making their browsers display stuff in a standard way. :)
| 6:19 am on May 1, 2004 (gmt 0)|
Amen to that abates :)
Think about it this way ... if we had been strictly going by standards all along, where would the web be today? It might have "cleaner" HTML code, but we wouldn't have had as many advances - basically we would all still be using HTML 1.0.
Microsoft and other companies have come out with products that enhance the web, in my opinion. Forcing yourself to live within the standards is like saying everyone should have a V4 engine because it's more "efficient" than a V8. Sometimes I want a little more power.
| 6:28 am on May 1, 2004 (gmt 0)|
|So when I read something like let's ban all "non-compliant pages and make the web better", |
The first post in this thread to use the word ban is yours.
You may be quoting from some other thread or forum. This topic is not about banning web pages. It's about making them more widely available.
Deliberately inserting bugs into HTML code is not a good way of making any page widely available.
If Google took that into account....well, that is what this thread is about.
| 7:13 am on May 1, 2004 (gmt 0)|
Yet another example of a "contradictio in terminis":
People say that they take their job as webdeveloper seriously, while they don't care about validation.
| 10:58 am on May 1, 2004 (gmt 0)|
|My code isn't "sloppy" or anything to that effect, it's just that the standards are somewhat limiting and sometimes I produce content that's outside of that scope. |
They are only limiting if you don't know how to code them in a compliant way in the first place. Everything you can put on a page can be made compliant without losing functionality.
Back on topic - I've heard people say that one of Google's mottos is "Don't be evil". I wonder if that necessarily implies "Be Good".
Hey, with the 100+ factors considered by their results algorithms, it could potentially be a (very minor) scorer already.
| 11:03 am on May 1, 2004 (gmt 0)|
Unfortunately, the extra overhead of running pages through a validator probably isn't feasible.
Good thought though....
| 11:22 am on May 1, 2004 (gmt 0)|
digitalv "Sometimes I want a little more power. "
You are so right :-)
I think this thread suggests that Google is not changing the web for the better, and I would probably dispute that if I had the time, which today I don't.
My feeling is that Google's contribution to the web is a progressive one, but that it is not superior to any other web user, search engine, directory, or spider out there. It may be for a moment, but that moment will pass, and the whole web will continue onwards as it has done.
Mathematically my argument would be:
Web = spider = pretty well omnipresent = google = user = progress = time
Any mathematicians out there who care to offer derivatives based on that simplistic view, to prove any of those relationships or to differentiate them, be my guest.
Google's future is tied up in the web.
Like yours and mine.
That's a good thing :-)
Vote with your mouse :-)
Sounds like an animal deal this web spider mouse?
| 11:26 am on May 1, 2004 (gmt 0)|
PatrickDeese, something I saw there ...
"high priests" and elitists
it's human behaviour to elect these ..
an elective democracy :-)
If you don't look up to anyone, you must think you are on a pinnacle?
| 11:38 am on May 1, 2004 (gmt 0)|
Web = spider = pretty well omnipresent = google = user = progress = time = equality
| 2:26 pm on May 1, 2004 (gmt 0)|
yes, that's kind of what I was thinking :-)
QED a circle
| 2:44 pm on May 1, 2004 (gmt 0)|
Google isn't evil, WOMEN are the root of all evil. Here is the mathematical proof:
Women = Time × Money
Time = Money, so Women = Money²
Money = √Evil
∴ Women = (√Evil)² = Evil
Anyway ... The comment was made that we should start making our sites standards-compliant when the BROWSERS work within those standards. Sorry, but as long as the browsers continue to support features that go above and beyond the standard, I will continue to utilize those features when applicable. Those of you whining about standards do kinda sound like Elitists :P
| 3:40 pm on May 1, 2004 (gmt 0)|
I am not suggesting that Google become the internet police, but they could help shape a better web.
>>I wonder how people would view Google if they started
>>doing something like that
Do you really think they care? Remember the last three updates: how many people got toasted, and did they change much? No. Simply speaking, Google can do what they want. Who cares what the WebmasterWorlds of the world think?
I agree with you about the legacy web that you talk of; it's a difficult situation.
>>standards compliant site waives a red flag at me, to be checked manually
Get out of town :) in the case of "If I were Google". Standards are standards, and if you are not a spammer it does not matter now, does it? If the flag is raised, then spammers will have the choice: valid code boost or potential red flag. The "law"-abiding webmaster need not worry.
I am not talking about penalising sites. This 100-factor algo: why not 101, so you get a slight boost? Encourage us as a whole. Word will spread; people will go to the W3C to find out what this validation is all about.
Stop worrying about TBPR :). Links below TBPR 4 still count; they just don't show in the link: command.
>>That would load pressure onto the toolset writers.
Now you are cooking. That did not cross my mind, but it would help: rather than having to buy LIFT, you get it rolled into MX etc.
BigDave + pleeker
>>Talk about Google killing off the Mom and Pop sites
Why would it? If a mom + pop site is in a competitive field, it has likely already been severely beaten by SEOs. In less dangerous categories, it won't really matter. Again, I am not talking about a ban.
>>I don't really care for the standards, and I refuse to design my sites by them
LOL. You would if there was an increase in ranking to be had.
>>To call that idea "half-witted" would be overestimating it.
Ahh, you got me there, what a well thought out response. :)
>>So when I read something like let's ban all "non-compliant
>>pages and make the web better
Show me where I said ban? You were never banned for using CSS, but the cleanness of your site increased. No one was banned, but a benefit came from a ranking perspective.
With you on the tools and browsers mate.
>>I think this thread suggests that Google is not changing the web for the better
Well, Google has made the world better, and the web too, no doubt about that. Thanks :). Perhaps they could make it betterer :)
| 3:40 pm on May 1, 2004 (gmt 0)|
Google's job is to index content. Period.
If Joe User is looking for information on wombats or widgets, he expects Google to deliver the most relevant search results for that topic--not for the sites that best comply with standards.
| 3:45 pm on May 1, 2004 (gmt 0)|
|Google to deliver the most relevant search results for that topic--not for the sites that best comply with standards |
Then why did we all start using H* tags, good titles, etc. to dominate the SERPs over sites that don't know of such voodoo?
Are you also suggesting they do return the best results? I am talking about fields I have no vested interest in, full of guff that is far from proper markup. Whose fault is that: Google's, for not being able to combat spam, or the webmaster's, for knowing how to work it?
| 4:18 pm on May 1, 2004 (gmt 0)|
... and the next thing will be qualifications. No thanks.
I'm in favour of standards but only, in this case, if they're optional.
| 6:56 pm on May 1, 2004 (gmt 0)|
Whether you think there should be a ban, a ranking penalty for non-compliance, or a ranking bonus for compliance, the issue remains the same.
If someone creates a page that throws a browser into "quirks mode" - who cares? The searcher only wants the best information on "albino ferrets" and it is google's job to present it to them - and trust me - the end user doesn't care if a page "validates" or not.
FWIW - I am not saying that standards aren't good, nor am I saying that one shouldn't strive to create valid pages - however I disagree with G forcing compliance.
[edited by: PatrickDeese at 7:49 pm (utc) on May 1, 2004]
| 7:05 pm on May 1, 2004 (gmt 0)|
|I am not saying that standards aren't good, nor am I saying that one shouldn't strive to create valid pages - however I disagree with G forcing compliance |
I agree. That would have the potential to evolve into a mess.
| 7:15 pm on May 1, 2004 (gmt 0)|
In its SERPs, Google could add a small W3C image below each compliant page, and a few other symbols to show the presence of other desirable traits, just like those hotel guides that have tiny images to show whether the listings have valet parking, restaurants, or whatever.
| 7:49 pm on May 1, 2004 (gmt 0)|
> In its serps Google can add a small W3C image below each compliant page
I wonder how much that would cost Google in extra bandwidth on an annual basis - a million dollars? 2 million?