
Google SEO News and Discussion Forum

Can a site be too clean?
At what point is a site over-optimized?
oaktown




msg:3021593
 4:43 pm on Jul 25, 2006 (gmt 0)

I've seen a lot of people posting who say they got nailed at some point in the progress of everflux.

Frequently the advice given them is to the effect of "go back and clean up your code", in addition to other things. I haven't been nailed yet, but rather than wait for the axe to fall, I went through my small, 100-page site using the w3c validator. Almost every page showed some error. Not horrible errors, but errors nonetheless. I cleaned everything up and now the pages all validate.

My questions are these:

1) some of my links to external sites contained characters that the validator didn't like. Would G have penalized me for non-well-formedness because it too didn't like the characters?

2) if a site is TOO clean, might G penalize it for being over-optimized?

Thanks in advance.

 

Oliver Henniges




msg:3021824
 8:09 pm on Jul 25, 2006 (gmt 0)

> 1) some of my links to external sites contained characters that the validator didn't like. Would G have penalized me for non-well-formedness because it too didn't like the characters?

I don't think so. I guess googlebot has learned to be quite tolerant about these issues, though the details depend on the precise nature of the mistakes, of course. The worst that could have happened is that those links simply did not count as inbound links for your partners.

> 2) if a site is TOO clean, might G penalize it for being over-optimized?

There is definitely no penalty for syntactic well-formedness. That would be quite absurd. I'd reckon you should go even further and check accessibility and usability issues. There are no standards and test tools for the latter yet, but I assume google likes pages that follow the WCAG guidelines.

[webmasterworld.com...]

[cynthiasays.com...]

(hope the second link is within the TOS)
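
To give a rough idea of what those guidelines ask for (the image and field names below are only made-up examples), it's mostly simple things like alt text on images and labels on form controls:

<img src="blue-widget.jpg" alt="Blue widget in a box" width="200" height="150">
<label for="qty">Quantity</label> <input type="text" id="qty" name="qty">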

oaktown




msg:3021871
 8:56 pm on Jul 25, 2006 (gmt 0)

Thanks much, Oliver Henniges. I checked both and everything seems OK. I'm just concerned that G might start wondering whether a page is simply well-formed or very tightly SEO'd. Am I just being a Nervous Nelly?

oaktown




msg:3021875
 9:03 pm on Jul 25, 2006 (gmt 0)

Also, since I'm not listed in the ODP, do I need to have the robots/noodp tag?

buckworks




msg:3021882
 9:09 pm on Jul 25, 2006 (gmt 0)

Don't worry about that one unless you end up in the ODP with a description you don't like. Until then, omit it and keep your character count down.
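
(For reference, the tag being discussed, if you ever did need it, is just a robots meta tag along these lines:)

<meta name="robots" content="noodp">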

trinorthlighting




msg:3021915
 9:35 pm on Jul 25, 2006 (gmt 0)

Oaktown,

All of my pages validate via W3C. It does help out a lot, but it is only one small part of making pages rank well.

Good content and unique meta tags (and I mean unique) are the second part of the battle.

Do not worry about the ODP; I am not in it and I rank top 10 in the SERPs.

oaktown




msg:3021969
 10:38 pm on Jul 25, 2006 (gmt 0)

Trinorthlighting,

By unique, do you mean from page-to-page? My site is about Blue Widgets, so the meta tags (description, for example) tend to be Blue Widgets in boxes, Blue widgets in cans, Blue widgets in bags, etc. Is that sufficiently unique?

SuddenlySara




msg:3022026
 11:45 pm on Jul 25, 2006 (gmt 0)

Don't worry about validating via W3C for Google rankings.
Google, MSN and Yahoo do not validate on W3C terms (nor do most big keyword sites). This may be an issue 10 years from now... or, some say, 300 years from now.

That's not what this search engine mess is about. Sites are getting nailed now by spam-related filters that none of the SEs are able to control, so they keep coming up with filters that affect good sites here and bad sites there.

Besides these spam filters, I think AdWords is discovering "big money" keyword sites and pushing the natural results down the list for a bigger paycheck. We all know every search engine manually does some tweaking, right?

Google is digging deep, so correct broken links and lost orphan files to get out of a doorway filter. There is much more going on with Google filters than ever before. Also, forget about having a links directory off your site that is broken down like a Dmoz directory. Create a directory that is spot on with the theme of your site only.

So go ahead and keep your site clean.

I still think it is super easy to rank on top for keywords that only have thousands of competing sites. It's the sites that are able, or were able, to rank high on million-plus keyword searches that have the real fight, right?

trinorthlighting




msg:3022083
 1:36 am on Jul 26, 2006 (gmt 0)

The more unique, the better off you are. I have noticed this the most in the keywords tag especially.

Also, unique title tags.

Recently I looked at a w3c-valid site that was supplemental. It was a good site, and the reason it was supplemental was its title tags.

He had "Page A - My Company", "Page B - My Company", etc.

Once he removed the "- My Company" part (which was duplicated on every page), he came out of the supplementals and improved in google.

The more unique, the better; that way you do not trip a duplicate filter, and google will better understand how to send people to your site.

If you sell widgets, then yes, make sure the tags cover blue, red, in boxes, etc. Google will understand your widget site and direct traffic appropriately when a person types in red widgets.
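
In practice, something along these lines for two different product pages (the titles, descriptions and wording below are only made-up examples):

<title>Blue Widgets in Boxes - Widget Shop</title>
<meta name="description" content="Boxed blue widgets in small, medium and large sizes, shipped anywhere in the US.">

<title>Red Widgets in Cans - Widget Shop</title>
<meta name="description" content="Canned red widgets for industrial and home use, available in packs of 10 or 50.">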

SuddenlySara




msg:3022099
 2:06 am on Jul 26, 2006 (gmt 0)

Yep, I think that is one of the tickets. Be as original as possible.

oaktown




msg:3022149
 3:01 am on Jul 26, 2006 (gmt 0)

Thanks guys,

I'll check, but I think out of blind, dumb luck I have done this (it just seemed logical). Now for what may be the dumbest question asked on WW this year (if I win, I'll tell you where to send the trophy):

My site has very original content (with a few quotes from the person on whom it focuses) that I write myself. Frequently bloggers, forums and occasionally news sources (no s**t) take substantial portions of my text (sometimes 500 words or more) without attributing it to me, and use it on their websites.

Q: I don't mind them passing along my writing, but am I at risk for a dupe content penalty, even though they are using MY commentary?

And thank you all for your help. It may keep my site from getting killed off, and that's very important to me and others.

Davronin




msg:3022240
 5:39 am on Jul 26, 2006 (gmt 0)

I look at it this way: Google knows full well that backyard, do-it-yourself (and mostly clueless) web designers will also display a very distinctive quality in terms of content. That is to say, original content - that golden river that Googlebot loves to feed from.

Think about it: the common man doesn't have the degrees in Information Technology that most of us here have. It is the common man who doesn't know html and who makes the tiny little mistakes. Indeed, will blogging, for example, turn a page of search engine results into a sea of ego-centric tripe? It's getting hard enough as it is with the current state of S.E. technology.

Oliver Henniges




msg:3022625
 2:29 pm on Jul 26, 2006 (gmt 0)

> take substantial portions of my text (sometimes 500 words or more) without attributing it to me, and use it on their websites.

It's definitely copyright infringement. At the very least they owe you a link. If you have the time, ask those people to add a backlink to your site. Once your site has stabilized as the authority on that niche topic, you might try to draw some revenue from those who refused to backlink, with the help of a good lawyer.

Personally I'm no fan of pressing such copyright issues; good content should be spread as widely as possible. But refusing even to give the source of a quotation is so immodest that I'd really consider that option.

Halfdeck




msg:3022661
 2:56 pm on Jul 26, 2006 (gmt 0)

When you're writing unique description meta tags, avoid making them too short (less than 50-60 chars) or Google will dig into your HTML. No problem if Google finds the content and not the menu links, but why roll the dice when you got it in the bag? When your site first gets spidered, run a site: search and make sure you don't see any "omitted results." If you do, you need to tweak your title/descriptions. After a while, those omitted results may turn supplemental.
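
To illustrate the length point with a made-up example, a stub like:

<meta name="description" content="Blue widgets.">

is well under that 50-60 character mark, while something like:

<meta name="description" content="Hand-finished blue widgets in boxes, cans and bags, with free shipping on orders over $50.">

gives Google enough to build the snippet from without digging into the rest of the page.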

trinorthlighting




msg:3023434
 1:16 am on Jul 27, 2006 (gmt 0)

I agree with Halfdeck on the tags.

coosblues




msg:3023451
 1:23 am on Jul 27, 2006 (gmt 0)

It's impossible to validate your code if you are using any type of adsense, and since G's own code won't validate (and you cannot change it) I don't worry about it. I had every page validated before I added adsense, and now that it is added my position in the SERPs has not changed in the least. I agree with the others: your meta tags, H tags and such need to be used properly, links need to be checked, etc. You should be fine.

trinorthlighting




msg:3023581
 3:29 am on Jul 27, 2006 (gmt 0)

I have adsense ads on my sites and they validate. Use the banner type.

trinorthlighting




msg:3024712
 7:04 pm on Jul 27, 2006 (gmt 0)

One other thought on w3c validation is that it helps cut down bandwidth and the page loads faster. I will give an example: today, as an experiment, I created a page in microsoft word. When I completed the page, I uploaded it and it was 25 kb in size.

Then I cleaned up the page and made it w3c compliant. It did not change the layout of the page one bit. When I was done, the page was only 7 kb in size.

So, not only did I cut the bandwidth for the page by roughly 70%, the page also loads correspondingly faster.

These are two big keys that most webmasters miss.
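
For anyone who hasn't seen it, Word's HTML export wraps nearly every paragraph in its own classes and inline styles, roughly like this (a simplified, made-up fragment):

<p class="MsoNormal" style="margin:0cm 0cm 0pt"><span style="font-size:12.0pt; font-family:'Times New Roman'">Blue widgets in boxes</span></p>

whereas the cleaned-up equivalent is just:

<p>Blue widgets in boxes</p>

with the fonts and margins moved into one shared stylesheet. Multiply that by every paragraph and the 25 kb vs 7 kb difference is easy to believe.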

g1smd




msg:3027428
 8:02 pm on Jul 29, 2006 (gmt 0)

Non-valid code might just stop part of a page being indexed, or cause some links to not be followed to other parts of the site.

Checking the page for major errors should be automatic. There is no need to clear up absolutely everything, but if you want to do a good job, to make your site as cross-browser compatible as possible, and to be sure you are providing the best code to the spiders, then it is worth spending a couple of minutes running a few pages through the HTML validator and checking the results.

SuddenlySara




msg:3027461
 8:37 pm on Jul 29, 2006 (gmt 0)

Search engine robots such as Google's are NOT about to judge you on compliance. They just want to know who is most popular for a given search. Then all of the filters come into play. Be original!

SuddenlySara




msg:3027469
 8:42 pm on Jul 29, 2006 (gmt 0)

g1smd made a great post!

SEO will get harder as the internet grows, so start learning or hire someone who has learned more than you. Don't make changes to your site when you have no clue.

trinorthlighting




msg:3027670
 2:44 am on Jul 30, 2006 (gmt 0)

One other thing: a lot of people do not realize how a badly coded site will display in an Apple browser. We all know iPods are the "in thing", and I have a feeling that once the wireless technology is out there, a lot of people will be viewing the internet from an iPod or a PDA.

Just a thought, but in the future people will be surfing the net more via PDAs, iPods and similar devices.

Halfdeck




msg:3027714
 4:28 am on Jul 30, 2006 (gmt 0)

[code.google.com...]

The first thing to notice is the huge number of markup errors involving the meta element. Markup such as:

<meta name=description value=the best site for hot air balloons>

…which results in a meta element with eight attributes, and which doesn’t help anyone (least of all the search engines it’s aimed at, since the second attribute should have been content, not value, and therefore the entire element is likely to be ignored).
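
For comparison, the well-formed version of that element, with the correct content attribute and the value quoted, would be something like:

<meta name="description" content="the best site for hot air balloons">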

[edited by: Halfdeck at 4:29 am (utc) on July 30, 2006]

g1smd




msg:3028005
 4:21 pm on Jul 30, 2006 (gmt 0)

You don't have to look very far to find websites with broken title tags like:

<title> We Goofed <title>
<title> We Goofed /title>
<title> We Goofed </title
<title We Goofed </title>
<title> We Goofed <meta name="description" ...
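
...when the intended, well-formed tag in each case is simply:

<title>We Goofed</title>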

You think that their rankings are unaffected?

trinorthlighting




msg:3028344
 2:15 am on Jul 31, 2006 (gmt 0)

I agree g1smd

pageoneresults




msg:3028364
 3:09 am on Jul 31, 2006 (gmt 0)

1) some of my links to external sites contained characters that the validator didn't like.

Those are usually ampersands &. They need to be escaped...

&amp;
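
So, as a quick made-up illustration (the URL and parameter names here are just placeholders), a link written as:

<a href="http://www.example.com/widgets?color=blue&size=large">Blue widgets</a>

won't validate, while the escaped version:

<a href="http://www.example.com/widgets?color=blue&amp;size=large">Blue widgets</a>

will, and the browser still requests the same URL.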

trinorthlighting




msg:3028883
 4:00 pm on Jul 31, 2006 (gmt 0)

or they have capital tags <A HREF, <A href, etc......

It's very important to make sure google has no issues following links.

pageoneresults




msg:3029111
 6:36 pm on Jul 31, 2006 (gmt 0)

or they have capital tags <A HREF, <A href, etc.

Case would not be an issue in the HTML Markup. Google will index whatever you put there. As long as it is a valid attribute, etc., it's going to be indexed no matter what the case is.

gabidi




msg:3029122
 6:48 pm on Jul 31, 2006 (gmt 0)

According to Matt Cutts' latest video answers, it seems Google does take into account that the vast majority of NATURAL pages have errors (I think he mentioned around 40%), and that while it is a factor, as long as your page can be browsed with a text browser you shouldn't worry too much about it.

My guess would be that if you are in a push-and-shove situation with a competitor for a certain keyword ranking, then yes, clean code would be a MINOR factor.

trinorthlighting




msg:3029160
 7:36 pm on Jul 31, 2006 (gmt 0)

One thing he did stress was crawlability with a text browser.

If your site is w3c compliant it will be crawlable!
