Frequently the advice given them is to the effect of "go back and clean up your code", in addition to other things. I haven't been nailed yet, but rather than wait for the axe to fall, I went through my small, 100-page site using the W3C validator. Almost every page showed some error. Not horrible errors, but errors nonetheless. I cleaned everything up and now the pages all validate.
My questions are these:
1) Some of my links to external sites contained characters that the validator didn't like. Would G have penalized me for that non-well-formedness because it, too, didn't like the characters?
2) if a site is TOO clean, might G penalize it for being over-optimized?
Thanks in advance.
I don't think so. I'd guess Googlebot has learned to be quite tolerant of these issues, though it depends on the precise nature of the mistakes, of course. The worst that could have happened is that those links did not actually count as inbound links for your partners.
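For what it's worth, the most common validator complaint about outbound links is a raw ampersand in a query string. I'm only guessing that's what you hit, but if so, the fix is simple entity escaping and costs nothing:

<a href="http://www.example.com/page?cat=widgets&color=blue">Blue widgets</a>

…is invalid, because the & inside the attribute value starts what the parser expects to be an entity reference. The valid form is:

<a href="http://www.example.com/page?cat=widgets&amp;color=blue">Blue widgets</a>

Browsers still request ...&color=blue; the &amp; only lives in the source.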
> 2) if a site is TOO clean, might G penalize it for being over-optimized?
There is definitely no penalty for syntactic well-formedness. That would be quite absurd. I'd reckon you could go even further and check accessibility and usability issues. There are no standards and test tools for the latter yet, but I assume Google likes pages that follow the WCAG guidelines.
(hope the second link is within the TOS)
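To give a rough idea (my own sketch, not taken from those guidelines' examples), the accessibility points are mostly small markup habits, such as text alternatives for images and labels tied explicitly to form controls:

<img src="balloon.jpg" alt="Hot air balloon over the valley at sunrise">

<label for="email">Email address</label>
<input type="text" id="email" name="email">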
All of my pages validate via the W3C. It helps out a lot, but it is only one small part of making pages rank well.
Good content and unique meta tags (and I mean unique) are the second part of the battle.
Do not worry about the ODP; I am not in it and I rank top 10 in the SERPs.
By unique, do you mean from page to page? My site is about blue widgets, so the meta tags (description, for example) tend to be "blue widgets in boxes", "blue widgets in cans", "blue widgets in bags", etc. Is that sufficiently unique?
That's not what this search engine mess is about. Sites are getting nailed now by spam-related filters that none of the SEs are able to control, so they keep coming up with filters that affect good sites here and bad ones there.
Besides these spam filters, I think AdWords is discovering "big money" keyword sites and pushing the natural results down the list for a bigger paycheck. We all know every search engine manually does some tweaking, right?
Google is digging deep, so correct broken links and lost orphan files to get out of a doorway filter. There is much more going on with Google filters than ever before. Also, forget about having a links directory off your site that is broken down like a DMOZ directory. Create a directory that is spot on with the theme of your site only.
So go ahead and keep your site clean.
I still think it is super easy to rank on top for keywords that have thousands of competing sites. It's the sites that are able, or were able, to rank high on million-plus keyword searches that are getting hit, right?
Also, unique title tags.
Recently I looked at a W3C-valid site that had gone supplemental. It was a good site, and the reason it was supplemental was its title tags.
He had "Page A - My Company", "Page B - My Company", etc.
Once he removed the "- My Company" suffix (which was a bit duplicate), he came out of the supplementals and improved in Google.
The more unique the better; that way you do not trip a duplicate filter, and Google will understand better how to send people to your site.
If you sell widgets, then yes, make sure the tags are for blue, red, in boxes, etc. Google will understand your widget site and direct traffic appropriately when a person types in "red widgets".
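To make that concrete, here is a sketch (page names and wording invented for illustration) of what page-to-page uniqueness looks like:

Page boxes.html:
<title>Blue Widgets in Boxes</title>
<meta name="description" content="Boxed blue widgets in packs of 10, 50, and 100.">

Page cans.html:
<title>Blue Widgets in Cans</title>
<meta name="description" content="Canned blue widgets for outdoor and marine use.">

Every page answers a different query, and no two tags are copies of each other.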
I'll check, but I think out of blind, dumb luck I have done this (it just seemed logical). Now for what may be the dumbest question asked on WW this year (if I win, I'll tell you where to send the trophy):
My site has very original content (with a few quotes from the person on whom it focuses) that I write myself. Frequently bloggers, forums, and occasionally news sources (no s**t) take substantial portions of my text (sometimes 500 words or more) without attributing it to me, and use it on their websites.
Q: I don't mind them passing along my writing, but am I at risk for a dupe content penalty, even though they are using MY commentary?
And thank you all for your help. It may keep my site from getting killed off, and that's way important to me and others.
Think about it - the common man doesn't have the degrees in Information Technology that most of us here have. It is the common man, the one who doesn't know HTML, who might make the tiny little mistakes. Indeed, will blogging, for example, turn a page of search engine finds into a sea of egocentric tripe? It's getting hard enough as it is with the current state of SE technology.
It's definitely copyright infringement. At the very least they owe you a link. If you have the time, ask those people to add a backlink to your site. After your site has stabilized as the authority on that niche topic, you might try to draw some revenue, with the help of a good lawyer, from those who refused to backlink.
Personally, I'm no fan of pressing such copyright issues; good content should be spread as widely as possible. But to refuse even to give the source of a quotation is so immodest that I'd really consider such an option.
Then I cleaned up the page and made it W3C compliant. It did not change the layout of the page one bit, and when I was done the page was only 7 kB in size.
So not only did I cut the bandwidth for the page by 66% (working back from 7 kB, the original must have been roughly 21 kB), I also cut its loading time by about 66%.
These are two big keys that most webmasters miss.
Checking the page for major errors should be automatic. There is no need to clear up absolutely everything, but if you want to do a good job, make your site as cross-browser compatible as possible, and be sure you are providing the cleanest code to the spiders, then spend a couple of minutes running a few pages through the HTML validator and checking the results.
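For anyone starting from scratch, here is a minimal skeleton that passes the validator. I'm assuming HTML 4.01 Transitional, the common choice at the time of writing; swap in whatever DOCTYPE you actually target:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
    "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<title>Blue Widgets in Boxes</title>
</head>
<body>
<h1>Blue Widgets in Boxes</h1>
<p>Page content goes here.</p>
</body>
</html>

The DOCTYPE is the part most broken pages are missing; without it the validator cannot even pick the right rule set to check against.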
SEO will get harder as the internet grows, so start learning, or hire someone who has learned more than you. Don't make changes to your site when you have no clue.
Just a thought, but in the future people will be surfing the net more via PDAs, iPods, and similar devices.
The first thing to notice is the huge number of markup errors involving the meta element. Markup such as:
<meta name=description value=the best site for hot air balloons>
…which results in a meta element with eight attributes, and which doesn’t help anyone (least of all the search engines it’s aimed at, since the second attribute should have been content, not value, and therefore the entire element is likely to be ignored).
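For comparison, the version its author presumably intended, with the attribute renamed to content and the value quoted:

<meta name="description" content="the best site for hot air balloons">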
Title elements get mangled in similar ways:
<title> We Goofed <title>
<title> We Goofed /title>
<title> We Goofed </title
<title We Goofed </title>
<title> We Goofed <meta name="description" ...
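In order, those are: a closing tag with no slash, a closing tag missing its <, a closing tag missing its >, an opening tag missing its > (so "We Goofed" is read as attributes inside the tag), and a title that is never closed at all, which swallows the following meta element into the title text. The form all of them were aiming for is simply:

<title>We Goofed</title>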
You think that their rankings are unaffected?
My guess would be that if you are in a push-and-shove situation with your competitor for a certain keyword ranking, then yes, clean code would be a MINOR factor.