Does Cleaning Up Code Help Google Rankings?

     
6:13 pm on Sep 10, 2007 (gmt 0)

Junior Member

5+ Year Member

joined:Aug 6, 2006
posts:47
votes: 0


I have a site that has been in existence for over a year now and has been moving up consistently in Google's rankings. To start the site, I bought a template and just used Photoshop and some basic HTML to build it out. Because I am not a professional web designer, I did not maintain "clean HTML code" as I built up the site. I know this is a bad thing, but the site originated as a hobby of mine and has since morphed into a business venture.

I recently put my site through the W3C HTML validation process and, needless to say, validation failed on most of my pages because I had not adhered to the HTML 4.01 Transitional guidelines.

I have since gone through and cleaned up all of the code so that it adheres to the W3C standards for HTML, and I was wondering if anyone has an opinion on the potential benefits of such an action.

Does Google reward sites with "perfect" coding over those that may not conform to strict validation standards (e.g., missing some alt attributes)? I was always under the impression that Google loves "Mom-and-Pop" sites, which may not have the resources to hire a web designer. Do you think improving the HTML code will help my site in the SERPs? And if so, is the effect significant enough to merit hiring a professional web designer?

Looking forward to hearing your opinions on the subject.

Kantro

6:39 pm on Sept 10, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3492
votes: 3


Yes and no. Some improper coding could cause the bots to get hung up and time out, but if that isn't happening, clean coding will not really improve your position in the SERPs.

What you did is a good thing, though: whatever browser your site is viewed in, you can now feel confident it will present itself as designed, in most cases.

I feel I have gotten a boost from clean coding in MSN more than in Google or Yahoo.

Cleaning up your coding is not about the rankings - it is about taking pride in your work.

Congratulations.

6:50 pm on Sept 10, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member billys is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:June 1, 2004
posts:3181
votes: 0


While cleaning up code may not have a direct effect on search engine rankings, it might:

- Help increase repeat visitors.
- Increase the likelihood of getting a link from an authority site.

There are probably many more benefits, but these two stand out in my brain (besides pride of ownership).

7:02 pm on Sept 10, 2007 (gmt 0)

Full Member

10+ Year Member

joined:Jan 31, 2001
posts:272
votes: 0


I spent a lot of time over the past six months cleaning up my code so that each page validates. While I don't think it matters to Google if you skip an alt attribute here and there, there is the idea that this could matter in the future. As Google's algorithm evolves, it keeps adding to the characteristics it considers important. By doing this housekeeping now, you are ready if that happens.
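
To make that concrete, here is the kind of one-line fix involved (made-up file and product names; in HTML 4.01 the alt attribute is required on img, so leaving it off is flagged by the validator):

    <!-- Flagged by the validator: the alt attribute is required on img -->
    <img src="widget.jpg">

    <!-- Valid: a short text alternative is supplied -->
    <img src="widget.jpg" alt="Blue widget, front view">
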
7:09 pm on Sept 10, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


There's also a benefit to you. By learning what it takes to write valid code - and especially the strict version rather than transitional - you will improve your comprehension of the medium you work in significantly - so you will be able to accomplish a wider variety of objectives on your pages, and with greater speed.

But Google absolutely does not currently rate a site by how valid its code is - many spokespeople from Google have said this, many times.
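
For reference, these are the two document type declarations being compared. A page declaring Strict is checked against the tighter rule set (no deprecated presentational markup), while Transitional uses the looser DTD:

    <!-- HTML 4.01 Strict -->
    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
        "http://www.w3.org/TR/html4/strict.dtd">

    <!-- HTML 4.01 Transitional -->
    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
        "http://www.w3.org/TR/html4/loose.dtd">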

7:47 pm on Sept 10, 2007 (gmt 0)

Junior Member

5+ Year Member

joined:Aug 6, 2006
posts:47
votes: 0


Interesting to hear all of your responses - Thanks for the input.

It definitely felt good to go through and clean up all the code, a positive step in the direction of becoming an actual (professional) webmaster.

Thanks again.

8:09 pm on Sept 10, 2007 (gmt 0)

New User

5+ Year Member

joined:Sept 7, 2007
posts:5
votes: 0


Cleaning up the code won't help your site in the SERPs, but it will help with page load speed and compatibility, and it will also stop bots from falling over themselves when they come across a piece of code they don't understand.
8:19 pm on Sept 10, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


This has been said before, but it bears repeating, I think. The most problematic kinds of true errors in your HTML mark-up can be very difficult to spot by eye - they really need a tool, such as the W3C HTML Validator [validator.w3.org], to be ruled out with certainty.

What are these problems? Things like an unclosed quotation mark or a missing angle bracket on a tag. You can stare for hours at your source code and miss that kind of thing. But until you fix that kind of error, there is a section you intended as content that just looks like an invalid attribute, or something like that. Browsers have different error recovery routines, and just because the content displays on screen is no guarantee that Google's index will "see" it as content.

Eventually Google's error recovery routines may pick up a clue farther along in the code - and after that point, the rest of the page can be indexed. But there can easily be a gap, sometimes with important content, that just gets skipped. I speak here from painful experience.
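
A made-up snippet to illustrate the kind of error being described - one missing closing quote, and a parser may read everything up to the next quote character as part of the attribute value, so the paragraph in between can effectively disappear from what gets indexed:

    <!-- Broken: the quote after "promo" is never closed, so the paragraph
         that follows can be swallowed into the class attribute value -->
    <div class="promo>
      <p>Free shipping on all orders this week.</p>
    <img src="logo.gif" alt="logo">

    <!-- Fixed: the quote is closed and the paragraph is ordinary content again -->
    <div class="promo">
      <p>Free shipping on all orders this week.</p>
    </div>
    <img src="logo.gif" alt="logo">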

8:23 pm on Sept 10, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


There are several types of HTML coding errors that potentially can be bot stoppers.

It is best to run a sample of pages through the HTML validator to make sure that nothing major is going on.

[Hmmm. Tedster types quicker.]

11:05 pm on Sept 10, 2007 (gmt 0)

Full Member

10+ Year Member

joined:July 29, 2003
posts:245
votes: 1


It also helps to use the Google Webmaster Central tools to submit your sitemap and see what errors they find.
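
For anyone who hasn't set one up yet, a sitemap is just an XML file listing your URLs (example.com is a placeholder here); you submit it in Webmaster Central and the tools report crawl errors against it:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2007-09-10</lastmod>
        <changefreq>weekly</changefreq>
      </url>
      <url>
        <loc>http://www.example.com/products.html</loc>
      </url>
    </urlset>
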
11:31 pm on Sept 10, 2007 (gmt 0)

Full Member

5+ Year Member

joined:Dec 8, 2006
posts:227
votes: 0


Cleaning up the code won't help your site in the SERPS but it will help via page load speed

I don't think that's the case. Pages that validate have, in most cases, more code than those that don't. That's probably the main reason the Google homepage doesn't validate at all...

3:47 am on Sept 11, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Apr 28, 2006
posts:1043
votes: 1


OK, I admit it, I use a WYSIWYG editor.
The one thing that constantly gets dinged when I try to validate is the <meta blah/> tag, which has no closing tag - the slash is built into the opening tag itself.

If I were to go through the site and change every <tag/> to <tag> </tag>, could this actually improve the ranking of the page?

I only mention this because in another thread, the front page "bullet" in the source was affecting someone's results.
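
For what it's worth, the two styles look like this (meta is an empty element in HTML 4.01, so there is no </meta> closing tag to add - you simply drop the slash):

    <!-- XHTML style: empty elements are self-closed with a trailing slash -->
    <meta name="description" content="Example page" />
    <br />

    <!-- HTML 4.01 style: empty elements take no end tag at all -->
    <meta name="description" content="Example page">
    <br>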

Jake

4:18 am on Sept 11, 2007 (gmt 0)

Junior Member

5+ Year Member

joined:Aug 22, 2006
posts:50
votes: 0


According to SES NYC, it doesn't have a big impact unless your code is so screwy that the bots can't spider it.

Also - the cleaner the code, the more efficiently the site should work across different platforms, so you can never go wrong having good code.

Whatever WYSIWYG editor you use is just fine. Spend the time creating content-rich pages and getting links from good neighborhoods.

11:28 am on Sept 11, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


I have never encountered the <tag /> problem, as I have never felt that XHTML offered anything extra that I might need compared to HTML 4.01 Strict. In fact, I use HTML 4.01 Transitional most of the time.
2:52 pm on Sept 13, 2007 (gmt 0)

Junior Member

5+ Year Member

joined:Nov 17, 2006
posts:80
votes: 0


It doesn't help in the SERPs. However, you should clean up the code so you're not making it difficult for the search engines to find the content. I have come across many sites that have great content, but when you look at the source code, the content is buried way down deep - especially if you're working with a lot of nested tables. Many times the code cleaning is as simple as fixing the mess of unclosed tags and JS. Using CSS where possible also helps when cleaning the code. If anything, your pages will load faster and give a better user experience.
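
As a rough, made-up sketch of what "buried way down deep" means in practice - the table-based version pushes the content under layers of structural markup, while the CSS version keeps it near the top of the source:

    <!-- Table-based layout: the real content sits several tables deep -->
    <table width="100%"><tr><td>
      <table><tr><td>
        <table><tr><td>
          <p>The actual article text finally appears here.</p>
        </td></tr></table>
      </td></tr></table>
    </td></tr></table>

    <!-- CSS-based layout: the same content, one element deep,
         with presentation handled in a stylesheet -->
    <div id="content">
      <p>The actual article text finally appears here.</p>
    </div>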