
Tables and Google

lots of tables on my site

         

SirFroggZ

1:57 pm on Apr 24, 2003 (gmt 0)

10+ Year Member



I currently use a lot of tables on my web site. Are tables robot-friendly, or is there a better format to use?

zuko105

1:59 pm on Apr 24, 2003 (gmt 0)

10+ Year Member



I'm all ABOUT tables SirFroggZ.

Just validate the HTML first and all should be good.

Zuko

SEO practioner

4:26 pm on Apr 24, 2003 (gmt 0)

10+ Year Member



Hi zuko!

Since you're much better than I am at tables, have you ever seen a case where a site had such bad design problems with its tables that it actually prevented Google or any other SE from indexing the site in any way?

I'd be curious to find out from the "table expert" himself

:-)

SEO

John_Caius

4:32 pm on Apr 24, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



All SE spiders cope fine with tables. However, there are better and worse ways of optimising tabular layouts for good keyword ranking.

Visit the CSS forum to find lots of people who try to replace tables with divs to reduce page size and get their content to the top of the HTML. Do a site search on "table trick" to find out about how to get your right-column content listed higher in the HTML than your left-column content. Higher in the HTML = better for ranking.
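For anyone who hasn't run across it, the "table trick" mentioned above usually looks something like the sketch below (cell contents and widths are made-up placeholders): the main-content cell spans both rows, so it appears before the left-hand navigation in the source even though the nav renders on the left.

```html
<table>
  <tr>
    <!-- small empty cell reserves the left column's width -->
    <td width="150">&nbsp;</td>
    <!-- main content spans both rows, so it comes first in the source -->
    <td rowspan="2">
      <h1>Main content and keywords appear early in the HTML</h1>
      <p>Body copy here...</p>
    </td>
  </tr>
  <tr>
    <!-- navigation renders in the left column but sits
         below the content in the source order -->
    <td>Nav link 1<br>Nav link 2<br>Nav link 3</td>
  </tr>
</table>
```

To a browser this looks like an ordinary two-column layout, but a spider reading top to bottom hits the content before the navigation.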

[edited by: John_Caius at 4:33 pm (utc) on April 24, 2003]

chiyo

4:33 pm on Apr 24, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As long as they are not heavily nested (one level of nesting or so is fine), and don't have masses of attributes, they shouldn't be a problem. Sometimes using tables means the text you want read sits lower down in the code, but there are ways around that using spans and blank cells.

Make sure it validates. Forgetting a TR or TD, or mixing up a /TD and TD, can make a page look fine in IE but turn up blank in other browsers. Don't know how that affects spidering, though.

Rule of thumb: the less code as a percentage of page weight, the better.

zuko105

4:45 pm on Apr 24, 2003 (gmt 0)

10+ Year Member



In the beginning, parsers for the spiders could not index a site if the HTML was incorrect. Anyone who has hand-coded pages...(man, I hope there are still people like me out there; I'm so anti-auto-generated HTML, probably because I've had to step in at times and maintain sites built with these crappy tools...HomeSite, where are you?!?!)</end rant>.....has probably hand-coded for more than the IE browser.

If you viewed your page in Netscape 4.x and half or all of the page was missing, that probably meant the page's HTML was not correct. Most of the time (for me at least) it meant my table tags were not right, i.e. missing closing tags.

ex: <table>
<tr>
<td>
</tr>
</table>

missing closing table data element tag: </td>
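For comparison, the same structure with the closing tag in place validates cleanly (the cell content here is just a placeholder):

```html
<table>
  <tr>
    <!-- this time the closing </td> is present -->
    <td>Some body copy</td>
  </tr>
</table>
```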

Spiders would not index these sites, because they did not know where the body copy started and stopped. If you looked at that example in IE, with content of course, it would appear just fine. Spiders are getting better at this, though.

Also, editors at directories like DMOZ and Yahoo will scrutinize your site and might tell you to go back and fix your mess (....at least, I've heard :) )

My ghetto way of validating HTML and table tags is just breaking out the Netscrape 4.x browser and seeing if everything looks cool. If so, you're ready for the spiders; otherwise, go back to Notepad. There are various free online tools that do the same.

P.S. I know HomeSite was purchased by Macromedia and is now included in Dreamweaver, blah, blah, blah.... It just seems like, to get a chair to sit in, I had to buy the house the chair was in. I'm not bitter, not at all...

Zuko

zuko105

4:58 pm on Apr 24, 2003 (gmt 0)

10+ Year Member



Sorry for the long post, but it looks like a couple of people beat me to the response.

Two of the replies pose a good argument, and here's where I stand on it.

Where content sits in a table, higher or lower, has never been a concern of mine, and it has never kept me from reaching #1. It might have been a factor a year ago, though. I can list a number of sites that are #1 for competitive keywords that aren't even in the body copy. Keep link text in mind when trying to get ranked for keywords.

Keep good site quality in mind: navigation is always a concern if you want to retain visitors, plus good content at the top and at the bottom, and lots of animated GIFs! (just kidding on that last one).

Remember, the search engine is the first doorway for your users to get to your site. After that, if your users like it they will come back (not through the search engine) and use it for whatever you were offering the first time.

Basically, don't sacrifice quality and good design just to get your <h1> keywords at the top of the site. Besides, you look like a spammer if you do.

Don't look like a spammer...

John_Caius

5:07 pm on Apr 24, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What I was referring to was things like using CSS absolute positioning to get your content at the top of the code but wherever you like on the page, and using CSS font control to make a heading look reasonable, e.g. {font-size: 1.3em; font-weight: bold;}, rather than the massive H1 a browser usually displays. These aren't spammy techniques - they're good site design. Using absolute positioning to push content outside the viewable area of the page - now that's spam.
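A minimal sketch of the two techniques described above; the selector names (#nav, #content) and pixel values are made up for illustration:

```css
/* size an H1 like ordinary bold text instead of the huge browser default */
h1 { font-size: 1.3em; font-weight: bold; }

/* absolutely position the navigation, so the content markup can come
   first in the source while the nav still renders at the top left */
#nav {
  position: absolute;
  top: 0;
  left: 0;
  width: 150px;
}

/* leave room on the left for the positioned nav */
#content { margin-left: 160px; }
```

With this, the HTML can list the #content block before the #nav block, putting the body copy higher in the source without changing the visible layout.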

In my experience the single most important factor affecting ranking is inbound link anchor text, closely followed by title text. But if you can code your page more optimally and still have it look the same, why not?