Forum Moderators: Robert Charlton & goodroi


Update Saga. Part 5

         

Brett_Tabke

8:26 pm on Nov 9, 2005 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



What say you?

Over and done with?

All done all through?

reseller

10:25 am on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Kimkia

Many thanks for your generous sharing and contribution. Much appreciated.

If you wish, you are most welcome to post the same message on the thread I have just started

Dealing With Consequences of Jagger Update
[webmasterworld.com...]

Where I hope to compile as many valuable tips as possible for the benefit of our kind fellow members.

Thanks a bunch.

Gimp

10:35 am on Nov 13, 2005 (gmt 0)

10+ Year Member



The layout of an internal page does in fact have a big effect on its duplicate content status. For something like a city guide, catalog, or hotel guide, every page is laid out around the same standard sections: how to get there, how many rooms, locations of offices, and so on. That framework is duplicated page to page; only the details change. The percentage of duplicate content in many cases is very high.

It is nice to say IF the content is different. The essential content is different, but the page reads as duplicate as far as the search engine is concerned, because there are so many duplicate lead-ins.
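Nobody outside Google knows exactly how they score duplication, but you can get a feel for the problem by comparing overlapping word sequences ("shingles") between two pages. A back-of-the-envelope sketch in Python - the 5-word window and the Jaccard measure are my own choices for illustration, not anything Google has confirmed:

def shingles(text, k=5):
    # Break the text into every run of k consecutive words.
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(page_a, page_b, k=5):
    # Jaccard overlap of the two shingle sets:
    # 0.0 = nothing shared, 1.0 = identical.
    a, b = shingles(page_a, k), shingles(page_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Two made-up hotel pages with identical boilerplate and different
# details still share a big chunk of their shingles.
page1 = "How to get there: take exit 5 off the motorway. Rooms: 40. Office hours: 9 to 5. Welcome to the Grand Hotel."
page2 = "How to get there: take exit 5 off the motorway. Rooms: 25. Office hours: 9 to 5. Welcome to the Plaza Hotel."
print(similarity(page1, page2))

On our catalog pages the boilerplate outweighs the unique details, so a score like that comes out very high.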

tigger

10:39 am on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



one thing I am noticing is that sites that have been "neglected" for a while and haven't had any real link development work are starting to rank -

Eazygoin

10:58 am on Nov 13, 2005 (gmt 0)

10+ Year Member



Gimp:
How on earth do companies like EBay survive, when their page layout is similar, but the content is different? Are you saying that this is duplicate content?

It's absolute nonsense to say that the layout of the page has to be different on all internal pages. Dynamic websites run on a set layout/structure, so that A goes to B, B goes to C, and so on.

I also think the comment on internal linking is wrong, as pages that link to one another from any or all parts of a website create a sitemap and make it easy for robots to follow.

Gimp

11:17 am on Nov 13, 2005 (gmt 0)

10+ Year Member



I do not know how ebay survives.

If two hotel pages have a high percentage of their content identical because of standard text, they are being penalized as duplicates.

The catalog sites on our Chamber of Commerce site have been hit particularly hard.

It is not ridiculous, it is fact. It has happened to us and we are doing everything we can to work around the algo.

Nick0r

11:20 am on Nov 13, 2005 (gmt 0)

10+ Year Member



I'm just wondering if GG can confirm this is a rollback on .9.104

I track about 15 keywords in one sector and they've all gone back to exactly how they were at j2.

About eBay... I believe that when a site hits a certain threshold of links, PR, or whatever, it is exempt from some of the minor penalties that we receive as just ordinary small-to-medium websites on the internet.

Miop

11:42 am on Nov 13, 2005 (gmt 0)

10+ Year Member



<How on earth do companies like EBay survive, when their page layout is similar, but the content is different? Are you saying that this is duplicate content?>

The layout is similar but the sections are themed, so only content relevant to that section appears on the page. One of the decent top sites in my field is also built like this and rarely moves from the top. I also notice that eBay has keywords in the page URLs (can't remember if that was always the case). eBay is like a massive collection of shops, each with unique content. I've just checked two pages from the same section on eBay and the pages are only 1% similar!
On my site, other content appears to such a degree, due to the templates and menus, that each page is at least 80% similar.
My previous handmade HTML site always ranked highly because I knew how to organise the content so that each section was hardly similar at all. If I were a half-decent web designer I could use the PHP content to build a site with unique content on most pages, but I'm not (which is why I bought an out-of-the-box solution - the wrong one, probably!)
I doubt it is the same for all templated sites - there are some at the top of my sector, but most use a particular site design which eliminates the top menus (mine can't do that without tweaking I don't know how to do).

<It's absolute nonsense to say that the layout of the page has to be different on all internal pages. Dynamic websites run on a set layout/structure, so that A goes to B, B goes to C, and so on.

I also think the comment on internal linking is wrong, as pages that link to one another from any or all parts of a website create a sitemap and make it easy for robots to follow. >

There's nothing wrong with that, but what would happen if for some reason you ended up with 5 links to one particular section (due to linking from articles or promotions in addition to the normal menus)?

Miop

11:43 am on Nov 13, 2005 (gmt 0)

10+ Year Member




<one thing I am noticing is sites that have been "neglected" for a while and not had any real link development work are starting to rank - >

I'm noticing several new sites appearing - maybe they are coming out of the 'sandbox'.

Gimp

11:46 am on Nov 13, 2005 (gmt 0)

10+ Year Member



Would it make sense to use the rel="nofollow" attribute to avoid excessive internal linking?
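For instance, on the extra menu or promotion links, something like <a href="/widgets/page2.html" rel="nofollow">page 2</a> (the URL is just an example). Though I don't know whether Google honours nofollow on internal links, or only treats it as a comment-spam measure.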

tigger

11:51 am on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>I'm noticing several new sites appearing - maybe they are coming out of the 'sandbox'.

nope, these sites have been around for years; prior to this update they weren't ranking at all and therefore had next to no link development work done. It almost makes you think this is an anti-link update
