Forum Moderators: Robert Charlton & goodroi
Many thanks for your generous sharing and contribution. Much appreciated.
If you wish, you are most welcome to post the same message on the thread I have just started
Dealing With Consequences of Jagger Update
[webmasterworld.com...]
Where I hope to compile as many valuable tips as possible for the benefit of our kind fellow members.
Thanks a bunch.
It's nice to say "IF the content is different". The essential content is different, but as far as the search engine is concerned the content is duplicate, because there are so many duplicate lead-ins.
It's absolute nonsense to say that the layout of the page has to be different on all internal pages. Dynamic websites run on a set layout/structure, so that A goes to B, B goes to C, and so on.
I also think the comment on internal linking is wrong: pages that link to one another from any or all parts of a website create a sitemap and make it easy for robots to follow.
If two hotel pages have a high percentage of identical content because of standard text, they are being penalized as duplicates.
The catalog sites on our Chamber of Commerce site have been hit particularly hard.
It is not ridiculous, it is fact. It has happened to us and we are doing everything we can to work around the algo.
I track about 15 keywords in one sector and they've all gone back to exactly how they were at j2.
About eBay... I believe that when a site hits a certain threshold of links, PR or whatever, it is exempt from some of the minor penalties that we receive as ordinary small-to-medium websites on the internet.
The layout is similar, but the sections are themed so only content relevant to that section appears on the page. One of the decent top sites in my field is also built like this and rarely moves from the top. I also notice that eBay has keywords in the page URLs (can't remember if that was always the case). eBay is like a massive collection of shops, each with unique content. I've just checked two pages from the same section on eBay and the pages are only 1% similar!
On my site, the templates and menus repeat so much content that each page is at least 80% similar.
My previous handmade HTML site always ranked highly because I knew how to organise the content so that each section was hardly similar at all. If I were a half-decent web designer I could use the PHP content to build a site with unique content on most pages, but I'm not (which is why I bought an out-of-the-box solution - the wrong one, probably!)
I doubt it is the same for all templated sites - there are some at the top of my sector, but most use a particular site design that eliminates the top menus (mine can't do that without tweaking I don't know how to do).
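To make the "1% similar" vs "80% similar" comparisons above concrete, here is a minimal sketch of one common way to estimate page similarity: break each page's text into overlapping word shingles and take the Jaccard overlap of the two sets. This is only an illustration of the general shingling idea, not Google's actual duplicate-content algorithm; the hotel snippets are invented examples of pages sharing boilerplate "lead-in" text.

```python
# Illustrative sketch (NOT Google's actual algorithm): estimate how similar
# two pages look to a duplicate-content filter by comparing their word
# shingles (overlapping word n-grams) with Jaccard similarity.

def shingles(text, n=3):
    """Return the set of n-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(page_a, page_b, n=3):
    """Jaccard similarity of two pages' shingle sets, as a percentage."""
    a, b = shingles(page_a, n), shingles(page_b, n)
    if not a and not b:
        return 100.0
    return 100.0 * len(a & b) / len(a | b)

# Two invented hotel pages sharing standard boilerplate but differing in detail:
hotel_a = ("Book your stay today. Free parking and breakfast included. "
           "The Grand Hotel sits in the centre of York.")
hotel_b = ("Book your stay today. Free parking and breakfast included. "
           "The Station Inn sits beside the river in Bath.")

print(round(similarity(hotel_a, hotel_b), 1))  # → 33.3
```

The more boilerplate (menus, templates, standard lead-ins) two pages share relative to their unique text, the higher this number climbs, which is exactly the problem the hotel-page and template-site posts above describe.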
<It's absolute nonsense to say that the layout of the page has to be different on all internal pages. Dynamic websites run on a set layout/structure, so that A goes to B, B goes to C, and so on.
I also think the comment on internal linking is wrong, as pages that link to one another from any or all parts of a website create a sitemap and make it easy for robots to follow.>
There's nothing wrong with that, but what would happen if for some reason you ended up with 5 links to one particular section (due to linking from articles or promotions in addition to the normal menus)?