
Can a site be too clean?

At what point is a site over-optimized?

     
4:43 pm on Jul 25, 2006 (gmt 0)

10+ Year Member



I've seen a lot of people posting who say they got nailed at some point during everflux.

Frequently the advice given to them amounts to "go back and clean up your code," among other things. I haven't been nailed yet, but rather than wait for the axe to fall, I went through my small, 100-page site with the W3C validator. Almost every page showed some error. Not horrible errors, but errors nonetheless. I cleaned everything up and now the pages all validate.

My questions are these:

1) Some of my links to external sites contained characters that the validator didn't like. Would G have penalized me for non-well-formedness because it, too, didn't like those characters?

2) if a site is TOO clean, might G penalize it for being over-optimized?

Thanks in advance.

8:09 pm on Jul 25, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> 1) Some of my links to external sites contained characters that the validator didn't like. Would G have penalized me for non-well-formedness because it, too, didn't like those characters?

I don't think so. I guess Googlebot has learned to be quite tolerant of these issues, though the details depend on the precise nature of the mistakes, of course. All that could have happened is that those links simply did not count as inbound links for your partners.

> 2) if a site is TOO clean, might G penalize it for being over-optimized?

There is definitely no penalty for syntactical well-formedness. That would be quite absurd. I'd reckon you could go even further and check accessibility and usability issues. There are no formal standards or test tools for the latter yet, but I assume Google likes pages that follow the WCAG guidelines.

[webmasterworld.com...]

[cynthiasays.com...]

(hope the second link is within the TOS)

8:56 pm on Jul 25, 2006 (gmt 0)

10+ Year Member



Thanks much, Oliver Henniges. I checked both and everything seems OK. I am just concerned that G might get curious about whether a page is simply well-formed or very tightly SEO'd. Am I just being a Nervous Nelly?
9:03 pm on Jul 25, 2006 (gmt 0)

10+ Year Member



Also, since I'm not listed in the ODP, do I need to have the robots/noodp tag?
9:09 pm on Jul 25, 2006 (gmt 0)

WebmasterWorld Administrator buckworks is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Don't worry about that one unless you end up in the ODP with a description you don't like. Until then, omit it and keep your character count down.
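(For reference, the tag being discussed is the single robots meta element below; you would only add it if you wanted Google to ignore an ODP description you didn't like.)

<meta name="robots" content="noodp">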
9:35 pm on Jul 25, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Oaktown,

All of my pages validate via the W3C validator. It does help, but it is only one small part of making pages rank well.

Good content and unique meta tags (and I mean unique) are the second part of the battle.

Do not worry about the ODP; I am not in it and I rank in the top 10 in the SERPs.

10:38 pm on Jul 25, 2006 (gmt 0)

10+ Year Member



Trinorthlighting,

By unique, do you mean from page to page? My site is about Blue Widgets, so the meta tags (description, for example) tend to be "Blue widgets in boxes," "Blue widgets in cans," "Blue widgets in bags," etc. Is that sufficiently unique?

11:45 pm on Jul 25, 2006 (gmt 0)



Don't worry about validating via W3C for Google rankings. Google, MSN and Yahoo do not validate on W3C terms (nor do most big keyword sites). This may be an issue 10 years from now... or, some say, 300 years from now.

That's not what this search engine mess is about. Sites are getting nailed now by spam-related filters that none of the SEs are able to control, so they keep coming up with filters that affect good sites here and bad sites there.

Besides these spam filters, I think AdWords is discovering "big money" keyword sites and putting the natural results down the list for a bigger paycheck. We all know every search engine does some manual tweaking, right?

Google is digging deep, so correct broken links and lost orphan files to get out of a doorway filter. There is much more going on with Google filters than ever before. Also, forget about having a links directory off your site that is broken down like a DMOZ directory. Create a directory that is spot on with the theme of your site only.

So go ahead and keep your site clean.

I still think it is super easy to rank at the top for keywords with only thousands of competing sites. It's the sites that rank, or used to rank, high for million-plus-result keyword searches that are getting hit, right?

1:36 am on Jul 26, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



The more unique, the better off you are. It's especially noticeable in the keywords, from what I've seen.

Also, unique title tags.

Recently I looked at a W3C-valid site that was supplemental. It was a good site, and the reason it had gone supplemental was its title tags.

He had "Page A - My Company", "Page B - My Company", etc.

Once he removed the "- My Company" (which made every title partly duplicate), he came out of the supplementals and improved in Google.

The more unique the better; that way you do not trip a duplicate filter, and Google will better understand how to send people to your site.

If you sell widgets, then yes, make sure the tags are for blue, red, in boxes, etc. Google will understand your widget site and direct traffic appropriately when a person types in "red widgets".
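For illustration only (the page names and wording here are made up, not from anyone's actual site), two widget pages with sufficiently distinct titles and descriptions might look like this:

<!-- blue-widgets-boxes.html -->
<title>Blue Widgets in Boxes</title>
<meta name="description" content="Boxed blue widgets in small, medium and large sizes, shipped flat-packed with assembly instructions.">

<!-- red-widgets-cans.html -->
<title>Red Widgets in Cans</title>
<meta name="description" content="Canned red widgets sold individually or by the case, with volume discounts on orders of ten cans or more.">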

2:06 am on Jul 26, 2006 (gmt 0)



Yep, I think that is one of the tickets. Be as original as possible.
3:01 am on Jul 26, 2006 (gmt 0)

10+ Year Member



Thanks guys,

I'll check, but I think out of blind, dumb luck I have done this (it just seemed logical). Now for what may be the dumbest question asked on WW this year (if I win, I'll tell you where to send the trophy):

My site has very original content (with a few quotes from the person on whom it focuses) that I write. Frequently bloggers, forums and occasionally news sources (no s**t) take substantial portions of my text (sometimes 500 words or more) without attributing it to me, and use it on their websites.

Q: I don't mind them passing along my writing, but am I at risk for a dupe content penalty, even though they are using MY commentary?

And thank you all for your help. It may keep my site from getting killed off, and that's very important to me and others.

5:39 am on Jul 26, 2006 (gmt 0)

5+ Year Member



I look at it this way. Google knows full well that backyard, do-it-yourself (and mostly clueless) web designers will also display a very unique quality in terms of content. That is to say, original content - that golden river that Googlebot loves to feed from.

Think about it - the common man doesn't have the degrees in Information Technology that most of us here have. It is the common man - the one who doesn't know HTML - who might make the tiny little mistakes. Indeed, will blogging, for example, turn a page of search engine finds into a sea of egocentric tripe? It's getting hard enough as it is with the current state of S.E. technology.

2:29 pm on Jul 26, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> take substantial portions of my text (sometimes 500 words or more) without attributing it to me, and use it on their websites.

It's definitely copyright infringement. At the very least they owe you a link. If you have the time, ask those people to add a backlink to your site. Once your site has stabilized as the authority on that niche topic, you might try, with the help of a good lawyer, to draw some revenue from those who refused to link back.

Personally I'm no friend of pressing such copyright issues; good content should be spread as widely as possible. But to refuse even to give the source of a quotation is so immodest that I'd really consider that option.

2:56 pm on Jul 26, 2006 (gmt 0)

5+ Year Member



When you're writing unique description meta tags, avoid making them too short (less than 50-60 characters) or Google will dig into your HTML for snippet text instead. That's no problem if Google finds your content rather than your menu links, but why roll the dice when you can have it in the bag? When your site first gets spidered, run a site: search and make sure you don't see any "omitted results." If you do, you need to tweak your titles/descriptions. After a while, those omitted results may turn supplemental.
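To put rough numbers on that (the wording below is just a placeholder), compare a description that falls short of the 50-60 character mark with one comfortably past it:

<meta name="description" content="Blue widgets.">
<meta name="description" content="Hand-finished blue widgets in boxes, cans and bags, with size charts, care tips and free shipping on larger orders.">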
1:16 am on Jul 27, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I agree with Halfdeck on the tags.
1:23 am on Jul 27, 2006 (gmt 0)

10+ Year Member



It's impossible to validate your code if you are using any type of AdSense, and since G's own code won't validate (and you cannot change it), I don't worry about it. I had every page validated before I added AdSense, and now that it is added my position in the SERPs has not changed in the least. I agree with the others: your meta tags, H tags and such need to be used properly, links need to be checked, etc. You should be fine.
3:29 am on Jul 27, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I have AdSense ads on my sites and they validate. Use the banner type.
7:04 pm on Jul 27, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



One other thought on W3C validation: it cuts down bandwidth and the page loads faster. I'll give an example. Today, as an experiment, I created a page in Microsoft Word. When I completed the page, I uploaded it and it was 25 KB in size.

Then I cleaned up the page and made it W3C compliant. That did not change the layout of the page one bit, but when I was done the page was only 7 KB.

So not only did I cut the bandwidth for that page by roughly 70%, I cut its download time by roughly the same amount.

Those are two big points that most webmasters miss.
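As a rough illustration (this is not the actual page, just the typical pattern), Word's HTML export wraps even a simple paragraph in proprietary classes, inline styles and Office tags, while the clean equivalent is a single element:

<p class=MsoNormal><span style='font-size:12.0pt;font-family:"Times New Roman"'>Blue widgets in boxes<o:p></o:p></span></p>

<p>Blue widgets in boxes</p>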

8:02 pm on Jul 29, 2006 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Non-valid code might just stop part of a page being indexed, or cause some links to not be followed to other parts of the site.

Checking a page for major errors should be automatic. There's no need to clear up absolutely everything, but if you want to do a good job, make your site as cross-browser compatible as possible, and be sure you are giving the best code to the spiders, it's worth spending a couple of minutes running a few pages through the HTML validator and checking the results.
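As a quick sanity check, a bare-bones page along these lines (doctype, declared character encoding, title) should come back from the validator with no errors; the specifics here are just one common setup of the time (HTML 4.01 Strict, UTF-8), not a requirement:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>Blue Widgets</title>
</head>
<body>
<p>Page content goes here.</p>
</body>
</html>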

8:37 pm on Jul 29, 2006 (gmt 0)



Search engine robots such as Google's are NOT about to judge you on compliance. They just want to know who is most popular for a given search. Then all of the filters come into play. Be original!
8:42 pm on Jul 29, 2006 (gmt 0)



g1smd made a great post!

SEO will get harder as the internet grows, so start learning or hire someone who has learned more than you. Don't make changes to your site when you have no clue.

2:44 am on Jul 30, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



One other thing: a lot of people do not realize how a badly coded site will display in an Apple computer's browser. We all know iPods are the "in thing", and I have a feeling that once the wireless technology is out there, a lot of people will be viewing the internet from an iPod or a PDA.

Just a thought, but in the future people will be surfing the net more via PDAs, iPods and similar devices.

4:28 am on Jul 30, 2006 (gmt 0)

5+ Year Member



[code.google.com...]

The first thing to notice is the huge number of markup errors involving the meta element. Markup such as:

<meta name=description value=the best site for hot air balloons>

…which results in a meta element with eight attributes, and which doesn’t help anyone (least of all the search engines it’s aimed at, since the second attribute should have been content, not value, and therefore the entire element is likely to be ignored).
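For comparison, what that author presumably intended is the two-attribute version, with quoted values and content instead of value:

<meta name="description" content="the best site for hot air balloons">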

[edited by: Halfdeck at 4:29 am (utc) on July 30, 2006]

4:21 pm on Jul 30, 2006 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



You don't have to look very far to find websites with broken title tags like:

<title> We Goofed <title>
<title> We Goofed /title>
<title> We Goofed </title
<title We Goofed </title>
<title> We Goofed <meta name="description" ...

Do you think their rankings are unaffected?
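(Every one of those was presumably meant to be the same well-formed element: <title>We Goofed</title>.)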

2:15 am on Jul 31, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I agree g1smd
3:09 am on Jul 31, 2006 (gmt 0)

WebmasterWorld Senior Member pageoneresults is a WebmasterWorld Top Contributor of All Time 10+ Year Member



1) some of my links to external sites contained characters that the validator didn't like.

Those are usually ampersands (&). They need to be escaped as &amp;.
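For example (example.com and the parameter names are placeholders), the raw and escaped forms of the same link:

<a href="http://www.example.com/widgets?color=blue&size=large">Blue widgets</a>      <!-- fails validation -->
<a href="http://www.example.com/widgets?color=blue&amp;size=large">Blue widgets</a>  <!-- validates -->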
4:00 pm on Jul 31, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Or they have uppercase tags: <A HREF, <A href, etc.

Very important to make sure google has no issues following links.

6:36 pm on Jul 31, 2006 (gmt 0)

WebmasterWorld Senior Member pageoneresults is a WebmasterWorld Top Contributor of All Time 10+ Year Member



> Or they have uppercase tags: <A HREF, <A href, etc.

Case would not be an issue in the HTML markup. Google will index whatever you put there. As long as it is a valid attribute, etc., it's going to be indexed no matter what the case is.

6:48 pm on Jul 31, 2006 (gmt 0)

5+ Year Member



According to Matt Cutts' latest video answers, it seems Google does take into account that the vast majority of NATURAL pages have errors (I think he mentioned around 40%), and that while it is a factor, as long as your page can be browsed with a text browser you shouldn't worry too much about it.

My guess would be that if you are in a push-and-shove situation with a competitor for a certain keyword ranking, then yes, clean code would be a MINOR factor.

7:36 pm on Jul 31, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



One thing he did stress was crawlability with a text browser.

If your site is W3C compliant, it will be crawlable!
