Page Quality metrics
What would Google use?
Mohamed_E




msg:239694
 8:23 pm on Aug 10, 2003 (gmt 0)

In another thread [webmasterworld.com] I wrote (slightly edited):

There is no evidence that Google currently uses any "page quality" metrics, but it is very possible that at some time they will do so. Popups and dead links would be two possible targets of any "page quality" metric.

Assuming that Google did start evaluating page quality, what features do you think they would take into account?

 

g1smd




msg:239695
 10:53 pm on Aug 11, 2003 (gmt 0)

I hope they give a penalty for Frontpage IE-only type code bloat, and for sites with massive amounts of HTML errors.

MonkeeSage




msg:239696
 11:08 pm on Aug 11, 2003 (gmt 0)

- More than 25-30% of browser real estate (i.e., how much is visible at any given time--scrollHeight / Width) taken up by ads. <edit>That is, using some standardized size (800 X 600 perhaps).</edit>

- Multiple (over three maybe?) applets / embeds / objects.
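As a purely illustrative sketch of how those two rules of thumb might be scored (the 800 x 600 reference viewport and the thresholds are just the numbers suggested above, the ad areas and embed count are assumed to be already extracted, and none of this is anything Google has confirmed):

# Purely illustrative scoring of the two heuristics above.
VIEWPORT_AREA = 800 * 600  # the standardized size suggested above

def too_many_ads(ad_areas, max_fraction=0.30):
    """True if ads cover more than ~30% of the reference viewport."""
    return sum(ad_areas) / VIEWPORT_AREA > max_fraction

def too_many_embeds(embed_count, limit=3):
    """True if the page has more than `limit` applets / embeds / objects."""
    return embed_count > limit

# e.g. two 468x60 banners plus a 300x250 box (~27% of 800x600), and 4 embeds:
print(too_many_ads([468 * 60, 468 * 60, 300 * 250]))  # False
print(too_many_embeds(4))                             # True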

Jordan

daamsie




msg:239697
 12:41 am on Aug 12, 2003 (gmt 0)

I figure they should take into account:

- javascript (the more, the less quality imho)
- css (if it isn't there, less quality imho)
- filesize (although this is kind of already done for pages over 100k I believe)

Of course, those are only probability things: it is just 'less probable' that a site would be high quality if it has lots of javascript and no css. It doesn't rule it out, though; that's the nature of these things.

I'm sure there's more, but that's all I can think of off the top of my head. It would be an interesting addition to the algorithm if you ask me.
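To make the "less probable, not ruled out" idea concrete, here is a toy scoring sketch; every threshold and weight in it is invented for illustration and is not anything Google is known to use:

def quality_hints(html: str) -> float:
    """Nudge a score down for heavy scripting, no CSS, or a very large page.
    Nothing here rules a page out on its own."""
    lower = html.lower()
    score = 0.0
    if lower.count("<script") > 5:           # lots of javascript
        score -= 1.0
    if "stylesheet" not in lower and "<style" not in lower:
        score -= 0.5                          # no CSS at all
    if len(html) > 100_000:                   # the ~100k cutoff mentioned above
        score -= 1.0
    return score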

killroy




msg:239698
 12:56 am on Aug 12, 2003 (gmt 0)

- validation errors
- spelling errors

Don't get me wrong, I'm not beeign anal about this, jsut that I've often thought that if google would penalise spelling errors, a lot of people would start spell cehcking their sites, AND ergo, top results would contain well seplled clean pages made by people who really care about their pages, rather then fire-and-forget mess-content.
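For what it's worth, a toy version of the spelling idea might just look at the ratio of unrecognised words; the tiny word list below is only a stand-in for a real dictionary:

import re

KNOWN_WORDS = {"clean", "pages", "made", "by", "people", "who", "care"}  # stand-in dictionary

def misspelling_ratio(text: str) -> float:
    """Fraction of words not found in the dictionary."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return sum(1 for w in words if w not in KNOWN_WORDS) / len(words)

# A page might only be flagged above some generous threshold, say 10%:
print(misspelling_ratio("clean pages made by people who caer") > 0.10)  # True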

SN

MonkeeSage




msg:239699
 2:54 am on Aug 12, 2003 (gmt 0)

I'm not sure what to make of the idea of including use of JavaScript in such a metric. I do all the stuff that can't be done with CSS or markup in JavaScript (event handling, data manipulation, media display, &c., what else is there for client side?).

I use markup as content, CSS for form, and JavaScript for tasks (isn't that the way the standards are written to be used?). A quality site often needs a good amount of tasking (e.g., see Gemal's Psyched Blog in the BrowserSpy section; not my site, but a good example of the kind of thing I'm referring to). No banners or spam or anything like that, just tasking in order to let you see different information about your browser, which AFAIK could not be accomplished any other way.

For example, just a few things that could not be done (client side) without JavaScript, which almost any quality site would want to do at least some of:

Cannot highlight text on page.
Cannot dynamically evaluate / change content.
Cannot load new pages in frames or iframes except from links (not from a button for example, and technically, the "target" attribute of links is deprecated in XHTML even though it still works in the browsers).
Cannot link buttons to any other content.
Cannot read text input values.
Cannot do math.
Cannot alert or prompt or warn users.
Cannot determine content dimensions (for whole or individual elements).
Cannot determine current value of listbox content.
Cannot get or set cookies.

Jordan

PeterD




msg:239700
 5:22 am on Aug 12, 2003 (gmt 0)

Code quality is not the same thing as content quality. I see where you're coming from, and I agree that page bloat should be minimized, pages should validate, images should be optimally compressed, and so on--but I strongly suspect that users who aren't designers (i.e., 99% of users) don't care. (As long as pages aren't completely broken and don't take forever to load.)

Usefulness of content is what matters, and I don't think any of the metrics mentioned are strongly correlated with usefulness of content. In fact, I don't think there is any decent metric for usefulness of content. Page Rank and related linkage analysis methods are probably the best things anyone has come up with so far, but clearly they're flawed, blunt instruments.

G has a content-centric view of search, and I doubt they'd dilute that by handing out rewards and punishments based on coding issues.

Pete

Perplexed




msg:239701
 7:34 am on Aug 12, 2003 (gmt 0)

As I have said in another thread:

A mini survey I did in the offices around me showed that "ordinary" people thought PageRank was about a site's quality of content.

That is probably as it should be, but I cannot imagine any way that an algo could do this; it has to be done by a human reviewer.

And then we would be thinking in terms of the DMOZ problems... :) :) :)

natural




msg:239702
 9:02 am on Aug 12, 2003 (gmt 0)

a) a site's quality is totally subjective. each individual cares about different things.

b) pr seemingly was supposed to be about site quality, but the fact that it was also tied to serps created an exploit which many webmasters employ: mining for pr. it would seem that, in a perfect world, sites would only link to other quality sites. and, if you think about the internet as a community, as it was much more of in the mid/late 90's, then folks would only link to other sites that were either on-topic, or considered cool. unfortunately, now there are webmasters securing 30,000 inbound links, buying pr, and putting no-index, no-follow on pages with outbound links, in an attempt to mine and hoard pr. unfortunate, but true. can't blame anybody for it, although i wish that weren't the case.

when i think of a quality site, i think of clean design, no script errors (html errors who cares...like closing font tags, no alt tags in images, etc), and easy to use navigation. but, then again, i've seen some wicked cool sites that google can't even accurately crawl. i don't see how an algo could do page quality. that's especially true when you look through dmoz, and see that even human editors have to leave page quality at the doorstep, and focus instead on content quality.

it would be nice if page quality could actually be measured. then we might actually get the most desired results, instead of artificially inflated pr, spammy keyword-laden sites that 'appear' to be most relevant to the algo.

se results tend to remind me of school text books, while there are tons of works-of-art on the web that don't ever get ranking. my bet is that someone will try page quality metrics, however, and i wouldn't put it past the UI happy folks over at MS. their sites are all elegantly done (in a corporate kinda way), and easy to use...even though they're huge. even though i don't use their OS for anything more than a paper weight/file server, i do think that they're the most likely to try and implement any type of page quality metrics to their serps, as a differentiating factor.

anybody entering the se market will have to come up with a competitive advantage that brings in users. with msn already fully established, they will have to spark the curiosity of surfers who don't currently use their engine. pqm would make sense.

sorry, a little long winded.

Dave_Hawley




msg:239703
 9:29 am on Aug 12, 2003 (gmt 0)

RE: I hope they give a penalty for Frontpage IE-only type code bloat, and for sites with massive amounts of HTML errors.

I disagree. I have found that most sites/pages built with FrontPage often have good quality content. At least compared to those written by webmasters who assume everyone else on the planet has a super high speed connection just like them. You can pick these sites easily, as it takes at least 10 minutes to figure out the reason for the site's existence. That's after getting past the 10-minute Flash intro.

Short, to-the-point descriptive text and short, to-the-point descriptive links are number 1 IMO.

Dave

GrinninGordon




msg:239704
 9:43 am on Aug 12, 2003 (gmt 0)

They should penalise for bad taste in background colours and using multiple font styles.

Also, broken links using c://mypc/mysites/iuseddreamweaver1/iamadummy.html should be banned for life.

Actually, I just think Google should boost sites with good clean html and navigation. Oh, they do already?! OK, how about sending to everlasting Internet purgatory all those affiliate sites that see fit to put garbage like www.google.com&search=mydomain.com, www.whitehouse.gov&search=braincells in their html?

Dave_Hawley




msg:239705
 10:05 am on Aug 12, 2003 (gmt 0)

: They should penalise for bad taste in background colours and using multiple font styles.

Bit hard, as it all comes down to personal taste. Besides, most people search the net for info, not good colour schemes :o)

Dave

killroy




msg:239706
 11:48 am on Aug 12, 2003 (gmt 0)

What about the old issue of Google using its power to improve the net? What do you think about Google adjusting for spelling, broken links, the sort of stuff everybody SHOULD do, but nobody cares enough to do?

If doing things properly (2 minutes more per page) would get you higher up, more would do so, and Google could have a far-reaching, deep influence over the quality of the web.

I have plenty of old spaghetti sites, and I'd be glad if they got buried in the serps until I can fix them; they're an embarrassment. ;)

SN

bull




msg:239707
 4:37 pm on Aug 12, 2003 (gmt 0)

pages w/ large flash intros, and, generally, pages using flash extensively should be penalised.

PeterD




msg:239708
 8:10 pm on Aug 12, 2003 (gmt 0)

I really just don't think it's the place of Google to use its influence to improve the quality of the web--what it should do is accurately reflect whatever is out there. Why should it penalize people who are experts on their particular subject matter, but who aren't very au fait with coding issues? If I want to find out about fishing poles, say, I'd rather see the site of Old Joe, the 75-year-old expert who's been hand-making custom fishing poles his whole life, even if his code is a nightmare created by the WYSIWYG editor from hell.

I do realize, though, that Google can't avoid influencing the web in some ways--all the linking and not-linking that goes on to try to game the system--but still, to me, there's a difference between creating a system that will have unintended but unavoidable consequences and intentionally trying to impose one's idea of what the web should be.

Natural said:

my bet is that someone will try page quality metrics, however, and i wouldn't put it past the UI happy folks over at MS. their sites are all elegantly done (in a corporate kinda way), and easy to use...even though they're huge. even though i don't use their OS for anything more than a paper weight/file server, i do think that they're the most likely to try and implement any type of page quality metrics to their serps, as a differentiating factor.

That's an extremely interesting and extremely frightening thought.

g1smd




msg:239709
 9:12 pm on Aug 12, 2003 (gmt 0)

>> >> I hope they give a penalty for Frontpage IE-only type code bloat, and for sites with massive amounts of HTML errors. << <<

>> I disagree. <<

There is an awful lot of junk code out there, much of it very easy to tidy up.

As an example of something seen recently, changing from this style of code:

<br></font><br><b><font face="Times New Roman, Times, serif" size="2">COUNTYSIDE <br>
<br></font></b><font face="Times New Roman, Times, serif" size="2">North
Countyside Widget Centre<br></font><font face="Times New Roman, Times, serif" size="2">
Someplace, Nr. Town, Countyshire CC00 0AA<br></font><font face="Times New Roman, Times, serif" size="2">
Tel: 01000 000000<br></font><font face="Times New Roman, Times, serif" size="2"><b></b>
Widget Starter : approx &pound;80<br></font><font face="Times New Roman, Times, serif" size="2">
Widget Foobar: approx &pound;80<br></font><b></b><font face="Times New Roman, Times, serif" size="2">
Widget Weekend: approx &pound;100<br></font><font face="Times New Roman, Times, serif" size="2">
Contact: Some Person</font><font face="Times New Roman, Times, serif" size="2">
<br></font> <br><b><font face="Times New Roman, Times, serif" size="2">COUNTY</font></b>
<b><font face="Times New Roman, Times, serif" size="2"><br><b></b></font><br></b>
<font face="Times New Roman, Times, serif" size="2">Someplace
Widget Centre<br></font><font face="Times New Roman, Times, serif" size="2">
Someplace Road, Someplace TT00 0AA<b><br></b>
Tel: 01000 000000</font><font face="Times New Roman, Times, serif" size="2"><br>
Contact: Mr Someone</font></p>
<p><font face="Times New Roman, Times, serif" size="2">Big Manor Widget
Centre <br>Some Lane, Little Village<br></font><font face="Times New Roman, Times, serif" size="2">
Someplace, County, MM00 0AA<br></font><font face="Times New Roman, Times, serif" size="2">
Tel: 01000 000000</font><font face="Times New Roman, Times, serif" size="2"><br>
Email: info@widget.co.uk<br></font><font face="Times New Roman, Times, serif" size="2">
Web site: www.widget.co.uk <br></font><font face="Times New Roman, Times, serif" size="2">
Contact: Jimmy Foobar<br></font><font face="Times New Roman, Times, serif" size="2"> <br>

Actually the above code isn't as bad as some examples I could include. I really hate those with thousands of <o:p> tags dotted throughout, and embedded inline CSS styles on every tag.

.

Changed to this style of code, as basic HTML. It would have been even better to dump the <font> tags completely and do it with CSS instead, but that is for later:

<h3 align="left">COUNTYSIDE</h3>

<h4 align="left">North Countyside Widget Centre</h4>

<p align="left"><font face="Times New Roman, Times, serif" size="2">
Someplace, Nr. Town,<br>
Countyshire CC00 0AA<br>
Tel: 01000 000000<br>
Widget Starter: approx &pound;80<br>
Widget Foobar: approx &pound;80<br>
Widget Weekend: approx &pound;100<br>
Contact: Some Person</font></p>

<h3 align="left">COUNTY</h3>

<h4 align="left">Someplace Widget Centre</h4>

<p align="left"><font face="Times New Roman, Times, serif" size="2">
Someplace Road,<br>
Someplace TT00 0AA<br>
Tel: 01000 000000<br>
Contact: Mr Someone</font></p>

<h4 align="left">Big Manor Widget Centre</h4>

<p align="left"><font face="Times New Roman, Times, serif" size="2">
Some Lane, Little Village,<br>
Someplace, County, MM00 0AA<br>
Tel: 01000 000000<br>
Email: <a href="mailto:info@widget.co.uk"
title="email Jimmy at Big Manor">info@widget.co.uk</a><br>
Website: <a href="http://www.widget.co.uk/"
title="Big Manor Widgets"
target="_blank">www.widget.co.uk</a><br>
Contact: Jimmy Foobar</font></p>

That was a basic start. I realise that there are a lot of other things that can be done, but the main ones were to get rid of bloat (knocking 25% to 35% off most files), eliminate nesting errors, and add structure (<hx> tags) to the data. Each page is now about 25K. The next step might be to subdivide the content a little more.

daamsie




msg:239710
 12:29 am on Aug 13, 2003 (gmt 0)

Monkeesage, let me step through the things you mentioned as JS-only. Some can be done client side using CSS, but many are better done server side. This is generally far better for the user, because it isn't browser dependent.

Cannot highlight text on page.

Use CSS - just look at the site you are on now

Cannot dynamically evaluate / change content.

Not on the same page at least.. all that is needed is another page to go to. A lot more friendly to users without JS.

Cannot load new pages in frames or iframes except from links (not from a button for example, and technically, the "target" attribute of links is deprecated in XHTML even though it still works in the browsers).

Frames should also be factored in as a bad thing.

Cannot link buttons to any other content.

What do you mean? Rollovers? Again, use CSS for simple rollovers. The complex ones are generally unnecessary.

Cannot read text input values.
Cannot do math.

What's a form for?

Cannot alert or prompt or warn users.

Better to not use JS, that way EVERYONE gets the message, not just those with JS turned on. Particularly if it's anything important. Just create an intermediary page with the message on it.

Cannot determine content dimensions (for whole or individual elements).

Fair enough.

Cannot determine current value of listbox content.

Agreed, it is useful for that.

Cannot get or set cookies.

Actually, I can use Coldfusion to set cookies..

Don't get me wrong, I'm not saying that ALL sites that use JS are bad, I'm just saying that the probability of a good site having lots of JS is small. Same as the fact that sites with no incoming links aren't necessarily bad; there's just a higher probability that they are.

pages w/ large flash intros, and, generally, pages using flash extensively should be penalised.

ahem, isn't that already a self-inflicted penalty? Flash sites rarely rank well because, using flash, they have very little indexable content to show in the serps.

- spelling errors

Don't get me wrong, I'm not beeign anal about this, jsut that I've often thought that if google would penalise spelling errors, a lot of people would start spell cehcking their sites, AND ergo, top results would contain well seplled clean pages made by people who really care about their pages, rather then fire-and-forget mess-content.

Did I just find four spelling mistakes in there? :-) I like the idea of checking for spelling, but considering the large amount of user-submitted content out there, I don't think it would work. I feel that, similar to the flash thing, there is kind of a penalty for that already anyway. After all, if there are typos, those sites are missing out on ranking higher in the serps for the term spelt correctly.

GrinninGordon




msg:239711
 12:40 am on Aug 13, 2003 (gmt 0)

Dave_Hawley

That part was not meant to be serious. If you like, I can do this <joke>in future</joke>!

natural




msg:239712
 12:51 am on Aug 13, 2003 (gmt 0)

i hope that everybody who sees graphic rollovers and flash and other eye-candy type features as unnecessary, or thinks they should be penalized, isn't as boring irl. a whole closet full of white, shortsleeve, button-down shirts, and poly-blend slacks in different muted dark colors, and a matching pocket protector for every day of the week?

this is why page quality metrics will never work on a se designed for the lowest common denominator. too many people have no sense of style.

yawn.

Dave_Hawley




msg:239713
 1:00 am on Aug 13, 2003 (gmt 0)

RE: That part was not meant to be serious. If you like, I can do this <joke>in future</joke>!

Erm yes. Most would at least use a smilie if they ARE joking.

RE: There is an awful lot of junk code out there, much of it very easy to tidy up.

Yes, it is easy to tidy up... if you know HTML. Most who use FrontPage etc. know little or nothing about HTML. I find these sites stick to good quality content without all the bells and whistles. True, they are larger than they need to be, but as it's not Flash etc. the extra download time, in most cases, is not noticeable.

Quite frankly, I wish some of these 'so-called' web designers would go back to basics and stop adding all the fruit they can lay their hands on simply because they know how to.

Dave

daamsie




msg:239714
 1:07 am on Aug 13, 2003 (gmt 0)

i hope that everybody who sees graphic rollovers and flash and other eye-candy type features as unnecessary, or thinks they should be penalized, isn't as boring irl. a whole closet full of white, shortsleeve, button-down shirts, and poly-blend slacks in different muted dark colors, and a matching pocket protector for every day of the week?

lol. fair point. eye-candy has its place on the web, certainly.

Unfortunately, many of those sites are
a) slow to load
b) badly done because they are done by people who decided they should just use every gadget available rather than the ones best suited to the job.

I love eye-candy and do a lot of work in flash - I limit the distribution of that pretty much to CD-rom though, because I can bundle it as a projector and won't be inflicting bandwidth congestion on anyone. I do a lot of animation for tv as well and will make that as complex and as enjoyable to watch as possible. No restrictions there.

It's all about knowing the medium you are working in. Certainly there are times when you can use the web to dazzle people and I have seen flash and javascript used well for that purpose. Excess use of both of those though (in my experience) has never been a good thing and will rarely be what I am after in the search engines.

chiyo




msg:239715
 1:25 am on Aug 13, 2003 (gmt 0)

spelling is always a problem. I'm yet to find a spell checker which knows all scientific and professional terms (say those used in legal, IT, medical, academic and any other specialist fields), foreign words and phrases often used in English pages, slang and local idioms (which can often be used quite legitimately in pages), and place/people names (names of people, cities, towns and regions). Valid accented characters and special characters often cause spell checkers to choke. Then of course you would have to have a spell checker for every language. I don't think even google can develop the perfect spell checker, which they would need if they were going to implement this as a measure of quality.

I can see some argument for checking for dead links. Maybe that is one area where webmasters should be responsible for making sure links still work. This is a major time-taker for us. Nowadays we don't link to any inner page unless we are almost 100% sure that the url won't change. So if we want to refer to newspaper news pages, we will link to the newspaper domain only, and not the actual page. That has saved a lot of time. We assume that if people really do want the exact page, they can find it by clicking around the referred domain or doing a search (if indeed the actual news item still exists at a later date).
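That kind of check is also easy to automate. A minimal sketch using only the Python standard library, assuming the outbound URLs have already been extracted from the pages:

import urllib.request

def dead_links(urls, timeout=10):
    """Return the urls that fail to resolve or answer with HTTP >= 400."""
    bad = []
    for url in urls:
        try:
            req = urllib.request.Request(url, method="HEAD")
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                if resp.status >= 400:
                    bad.append(url)
        except (OSError, ValueError):  # URLError, timeouts, malformed urls
            bad.append(url)
    return bad

# e.g. print(dead_links(["http://www.example.com/", "http://does-not-exist.invalid/"]))

A HEAD request is enough here; any network failure or status of 400 and above counts as dead.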

I can't see any justification for penalizing sites for using js or NOT using CSS. We use CSS in several sites, but in some, it is just not needed, and the idea that all sites should naturally or eventually use CSS, and are somehow lower quality if they don't, is badly flawed.

There are more important things, however, that Google still has to perfect - like dropping all redirect pages whatsoever, and those with substantial amounts of duplicate content.

MonkeeSage




msg:239716
 1:33 am on Aug 13, 2003 (gmt 0)

daamsie:

I agree with you for the most part, but a few points:

Use CSS - just look at the site you are on now

I was meaning like this, e.g., [squarefree.com...]
It could be done partially in pure CSS using a selector like *[hilite], but even at that you'd need JS to set the arbitrary hilite attribute on content elements according to user input.

What do you mean? Rollovers?

I was meaning actual input buttons, e.g., <input type="button" value="Do Something" onclick="func();"/> (where func() reads / manipulates other content elements, such as text inputs).

What's a form for?

*Looks down, shuffles feet...*
I never really learned forms...I always thought that they had to be used with server side to get information from them (i.e., action="/path/to/cgi.pl")? Looks like I've been mistaken about that?

Better to not use JS, that way EVERYONE gets the message, not just those with JS turned on. Particularly if it's anything important. Just create an intermediary page with the message on it.

Good point. I agree that for alert and warning another page is usually best. But I don't know about for prompt...I'm not sure how that would be possible.

Jordan

daamsie




msg:239717
 2:12 am on Aug 13, 2003 (gmt 0)

MonkeeSage, webmasterworld does a very similar highlighting trick using CSS. Have you ever noticed that when you search for a term on Google and end up here, the terms are all highlighted? I'm not sure how Brett set it up, but it must be possible.

Basically, a lot of the things you mentioned are problems because you are relying on client side only. But why not do those things server side? It is so much more elegant to do a lot of that stuff through something like Coldfusion or PHP that won't be dependent on whether a user has JS or not. Too often I go to sites that simply won't work because they are using too much JS. Most hosts offer at least some server side scripting, and it really isn't that hard to learn. I find it easier than JS scripting anyway.

For prompting (I presume if someone has left a form field blank, for example?), I would make a validating page and send the user back to the page with the fields they needed to fill out marked in red. That way, the message will be there for everyone (not just those with JS).
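Since the thread is talking about Coldfusion/PHP, treat this only as a language-neutral sketch of that flow (the required field names are made up), written here in Python:

REQUIRED_FIELDS = ["name", "email"]  # made-up example fields

def missing_fields(form: dict) -> list:
    """Names of required fields the user left blank."""
    return [f for f in REQUIRED_FIELDS if not form.get(f, "").strip()]

submitted = {"name": "Jordan", "email": ""}
problems = missing_fields(submitted)
if problems:
    # re-render the form with these field labels marked in red
    print("Please fill in:", ", ".join(problems))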

BTW, I do use JS on my site also. I use it for popup windows (for larger versions of photos, for example) and I use it for a form that evaluates a country and populates a city select accordingly. The only other place I use it is in my admin section, where I know there won't be anyone else using it. Other than that, I have gotten rid of it. I had rollovers, but I ended up finding them unnecessary and decided to live without the extra loading time. It's the excess use of JS that I feel gives a higher chance of a lower quality site. Again, I'm not saying it's impossible, I just feel it's less probable that a site with heavy use of JS would be as good as one without it. For starters, the one without the JS is more likely to be accessible to a wider range of users.

Maybe counting CSS as a bonus isn't really necessary. Although font tags aren't ideal, they don't really affect the user experience. A simple filesize check may be a better concept, and that would discourage code bloat at the same time.

These are really only my personal opinions.. probably not grounded in any factual data at all :-)

MonkeeSage




msg:239718
 2:27 am on Aug 13, 2003 (gmt 0)

daamsie:

Hmmm, those are some good points, especially about accessibility. Thanks for taking the time to explain your POV, I understand your inclusion of JS in such a metric better now. :)

BTW, I have been planning on learning PHP, but I only recently got a host running PHP4. I guess it's about time to stop procrastinating and start learning...I should really do that...maybe later. ;D

Jordan

daamsie




msg:239719
 2:37 am on Aug 13, 2003 (gmt 0)

Monkeesage, definitely you should get into PHP :) If you are grasping JS, then it really should not be much trouble at all. There are tons of tutorials out there that can get you started really quickly. I am definitely no PHP expert, but I know that working with server side languages makes all the difference to the user (and webmaster) experience. Let alone when you start driving your site from a database.

As an example, one of the things I cannot live without now is includes. This saves going through every page of a site every time I want to change something like the body tag properties. And that has got to be one of the simplest tasks for PHP (and even easier in Coldfusion).

mfishy




msg:239720
 12:59 pm on Aug 14, 2003 (gmt 0)

<<There is an awful lot of junk code out there, much of it very easy to tidy up. >>

Still don't see why that justifies Google penalizing sites?

The point is, why would the average user care? Google isn't attempting to deliver results to the .001% of people who know html.

Spelling would be tricky. Imagine what Google would do to this forum :)

GoogleGuy




msg:239721
 3:18 pm on Aug 14, 2003 (gmt 0)

I think good content comes in a lot of different ways. If you look at pages by PageRank values, you realize that pages with lower PageRank are actually more likely to misspell words, for example. But that doesn't mean that a lower PageRank page or site isn't a great source of info.

Another data point, from Brewer's (Inktomi) paper about crawling the web several years ago: some 40% of html pages had some form of actual error (not just a validation problem). Someone may want to check my memory, but there's a lot of good info out there that isn't always on perfectly written pages. :) That said, I like the lists that people are mentioning (e.g. # of dead links on a site)..

messan




msg:239722
 3:57 pm on Aug 14, 2003 (gmt 0)

Are there any stats regarding the "html error factor" in websites, or is this only assumption?

I believe that properly written html helps, since it speeds up loading (the page is less heavy), but I am not sure about google counting the errors for ranking.

But hey SEO is a world of speculations ;)

tedster




msg:239723
 4:48 pm on Aug 14, 2003 (gmt 0)

Still don't see why that justifies Google penalizing sites?

Junk code may carry its own punishment, without a search engine adding on any penalty. Parts of the page may have errors that can't be handled by the error recovery routines that are used. Also, junk code can make the page size much too large in relation to the text content - I know that some algos used to look at that ratio. And I know I've boosted pages that I inherited, without changing any content but only cleaning up the tags.
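That ratio is easy to approximate. A crude sketch follows; a real engine would parse the page properly and ignore script/style content rather than regex-stripping tags:

import re

def text_to_code_ratio(html: str) -> float:
    """Visible-ish text length divided by total page length."""
    text = re.sub(r"<[^>]+>", " ", html)       # drop the tags
    text = re.sub(r"\s+", " ", text).strip()   # collapse whitespace
    return len(text) / max(len(html), 1)

bloated = '<p><font face="Times New Roman, Times, serif" size="2">Tel: 01000</font></p>'
lean = '<p>Tel: 01000</p>'
print(text_to_code_ratio(bloated) < text_to_code_ratio(lean))  # True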

One of the common problems with junk code is misuse of HTML tags. What if an algo boosts the value of spaces because that's all that exists inside many of the paragraph tags and the spaces have <b></b> around them?
