Forum Moderators: open

Message Too Old, No Replies

Should Copy Still be at the Top of the Page?

         

166geary

6:53 pm on Jul 9, 2004 (gmt 0)

10+ Year Member



Hi everyone,

Back in the old days, I remember being told that within the HTML coding, copy placed higher on the page gets more weight. That would mean that if you have a complex table layout which pushes the copy down the page, Google would be busy looking at all the coding rather than the copy at the bottom of the page.

Is this really still true? Was this ever true?

dirkz

6:51 pm on Jul 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> Is this really still true? Was this ever true?

Definitely.

JimM

7:46 pm on Jul 10, 2004 (gmt 0)

10+ Year Member



I would also like to hear more about what the experts say about this.

On my site, the Flash slide show (pictures from my area) and my navigation show up before the text in the source code.

Is there an easy fix to change the tables around so that the text shows up first?

Thanks,
Jim

troels nybo nielsen

8:41 pm on Jul 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> Is there an easy fix to change the tables around so that the text shows up first?

You might use CSS [webmasterworld.com], but some of us think that CSS is not really easy. ;)
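For what it's worth, here is a minimal sketch of the kind of CSS layout being suggested, with the copy first in the source and the navigation floated alongside it (the ids and widths here are just illustrative, not taken from anyone's actual site):

```html
<style>
  /* Content comes first in the source but displays on the right. */
  #content { float: right; width: 75%; }
  /* Navigation comes second in the source but displays on the left. */
  #nav     { float: left;  width: 25%; }
</style>

<div id="content">Main copy, first in the HTML.</div>
<div id="nav">Navigation links, second in the HTML.</div>
```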

166geary

6:58 pm on Jul 12, 2004 (gmt 0)

10+ Year Member



Thanks for your input guys.

CSS (with DIV tags of course) is definitely the way to go. We were able to do that for our own site, but it's next to impossible to get clients to do that - even if they are redesigning their web site.

suidas

10:02 pm on Jul 12, 2004 (gmt 0)

10+ Year Member



Doesn't Google attempt to detect recurrent navigational elements, and screen them out of their calculations? I would think this would moot the issue, without requiring CSS positioning solutions which—assertions to the contrary notwithstanding—do not always fit the bill.

From my experience (limited, of course), tightly-coded, recurrent navigation does not impair positioning.

Swanson

10:17 pm on Jul 12, 2004 (gmt 0)

10+ Year Member



JimM - there is a way to make the text show before the navigation if your links are on the left-hand side.

Split your page into a header table, and for the content and links use two tables: put a table at the top with the content in it, aligned to the right, then put a table below it, aligned to the left, with the navigation in it. Use percentages for the widths, e.g. 75% for content and 25% for links.

That puts the content higher in the HTML, but it still shows up next to the navigation in a browser.

Works for me.
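A rough sketch of that two-table arrangement (the widths and placeholder text are just illustrative), with the content table first in the source and the navigation table floated beside it:

```html
<table align="right" width="75%">
  <tr><td>Main copy goes here, first in the HTML source.</td></tr>
</table>
<table align="left" width="25%">
  <tr><td>Navigation links go here, after the copy.</td></tr>
</table>
```

Because both tables are given an align attribute, they float next to each other on screen even though the copy comes first in the file.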

BigDave

10:35 pm on Jul 12, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think it depends on how you define things.

I would actually be rather shocked if HTML tags before the copy made any difference. By that, I mean what is strictly between '<' and '>'. Of course you can end up with a lot of text that is not a part of your content mixed in with those tags pushing down your content, and that is a different story.

buckworks

11:07 pm on Jul 12, 2004 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



On some sites you have to scroll through screenfuls and screenfuls and screenfuls of source code clutter before you get to any real content. That can't possibly be a good thing, SEO-wise.

A site might get away with that if the competition was no better, but if a site came along that had a better "signal to noise" ratio in its source code, that new site would likely gain ground, all other things being equal.

Swanson

11:22 pm on Jul 12, 2004 (gmt 0)

10+ Year Member



There is definitely a limitation in the G algo with regard to the placement of links and content within the HTML. That is probably because the algo can't take into account what the user sees in the browser, as this is often different from the content that is actually indexed and weighted.

I have seen this personally: changing the page construction produced radically different rankings, even after taking links into account.

BigDave

11:46 pm on Jul 12, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



buckworks,

From a programming point of view, it would be significantly more difficult to take the position in the file into account than the position within the content.

The first thing you do is pull out the content between the tags that you give bonus points to, such as title and h1.

Then you just strip out all the tags. They only slow down the additional processing you'll do, and they complicate your parsing algos. That's file parsing 101: throw out the garbage before doing your real work.
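As a rough illustration of that strip-first pipeline (nothing here is Google's actual code; the tag list and bonus weights are made up for the example):

```python
import re

# Purely illustrative bonus weights for "important" tags.
BONUS_TAGS = {"title": 3.0, "h1": 2.0}

def extract_weighted_terms(html):
    """Score terms: first credit the bonus tags, then strip all tags."""
    terms = {}
    # Pass 1: pull out the content between the bonus tags.
    for tag, bonus in BONUS_TAGS.items():
        pattern = r"<%s[^>]*>(.*?)</%s>" % (tag, tag)
        for match in re.findall(pattern, html, re.I | re.S):
            for word in match.lower().split():
                terms[word] = terms.get(word, 0.0) + bonus
    # Pass 2: throw out every tag, then score the remaining plain text.
    text = re.sub(r"<[^>]+>", " ", html)
    for word in text.lower().split():
        terms[word] = terms.get(word, 0.0) + 1.0
    return terms
```

Once the tags are gone, the remaining passes (phrase matching, snippet generation) only ever see plain text, which is the speed argument being made above.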

Patrick Taylor

12:30 am on Jul 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Regarding tables, a Google search for "the table trick" will reveal a solution. Personally I don't use tables any more - my text content is CSS-ed hard up against the <body> tag, irrespective of the visual page layout. That said, I've seen no evidence of any SERPS benefit, nor of such methods being in common use amongst the pages that rank highly on a search for "search engine optimisation".

buckworks

12:35 am on Jul 13, 2004 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



> more difficult to take the position in the file into account than the position within the content.

I don't know enough about programming to comment on that. Intuitively I'd have guessed it would be the reverse.

What I do know is this: when I find ways to move the content "up" and reduce code clutter on my own pages, or persuade a client to do the same, the effect on rankings is always either neutral or positive, never negative.

dirkz

9:58 am on Jul 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> Then you just strip out all tags.

You're talking about a very 'simple' algo for digesting HTML: just take note of some important stuff (h1, strong, etc.) and then discard the markup.

I can't prove this, but I think they are using something more sophisticated that leaves all the tags in place while parsing. That way you don't lose ANY information, and you can always react according to the state you're currently in.
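A toy version of that state-keeping idea (again, not anyone's real algorithm; the weights and tag choices are invented for the sketch) could use a stack of open tags so the score of each piece of text depends on the state the parser is currently in:

```python
from html.parser import HTMLParser

class WeightedTextParser(HTMLParser):
    """Score words by the innermost 'important' tag currently open."""

    WEIGHTS = {"title": 3.0, "h1": 2.0, "strong": 1.5}  # illustrative

    def __init__(self):
        super().__init__()
        self.stack = []   # open tags we are currently inside (the "state")
        self.terms = {}

    def handle_starttag(self, tag, attrs):
        self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop back to the matching open tag (tolerates sloppy HTML).
            while self.stack and self.stack.pop() != tag:
                pass

    def handle_data(self, data):
        # The weight depends on the innermost scoring tag we are inside.
        weight = 1.0
        for tag in reversed(self.stack):
            if tag in self.WEIGHTS:
                weight = self.WEIGHTS[tag]
                break
        for word in data.lower().split():
            self.terms[word] = self.terms.get(word, 0.0) + weight
```

Nothing is thrown away up front here: every tag is seen as it arrives, and the parser reacts to it, which is the trade-off being debated in this thread against stripping first.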

BigDave

4:18 pm on Jul 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It makes sense that they would keep a copy that retains all the structural elements, but it makes no sense to leave those elements in while processing the regular content, or in the stored copy used to search for complex sets of keywords and to generate the snippet.

Where it makes sense, you can expect that Google will choose the simple approach, especially when it leads to a speed improvement with no detrimental effect on their results.

doc_z

6:23 pm on Jul 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google will probably use whichever algorithm improves the SERPs. I doubt that retaining tags leads to any improvement. Moreover, I'd guess that the quality would be even worse (and, as mentioned before, it would take more time).

Also, in practice I haven't seen any sign that Google favours pages with fewer tags.