| 7:54 am on Dec 13, 2005 (gmt 0)|
Reseller! Good morning to you!
Let me tell you a story about my client's site.
I wrote many articles about the safety and usage of his product line.
I have a secondary index for those articles, all original content, and some have been picked up across the web and reprinted. Fine, no problem. They give full credit and, in most cases, a link back. All the articles carry our copyright in them and in the meta tags.
On some of them, I just did a description of them with a ...read more
and on some of them I used the first 3-4 sentences and a ...read more
Well, all the ones where I used direct content - the 3-4 sentences - have ALL gone dup content and been relegated to supplemental listings in all the test DCs, and it varies back and forth on my google.com.
So I think this means that they have turned the dup content filter up so much that it is kicking even 3-4 sentences of repetition into supplemental.
The bad thing about this is that the articles are very pertinent and I get many hits on other search engines from them.
If google ever gets it together and eliminates the supplementals- all those articles will go down the drain.
But it is telling, because the articles I used just a description of in the sub-index are not supplemental.
| 8:08 am on Dec 13, 2005 (gmt 0)|
I have been battling with this thing (and maybe some others) since August, when my main site was dropped from Google. I say dropped because traffic via G was 7,000 uniques a day, and the next morning 50 or something.
I do not know the exact reason, but have done several modifications since that.
- I have a cell phone related site and used to comment on press releases when a new phone hit the market. I would quote some key line from the press release and then comment on the new phone with my own editorial stuff. I think that isn't duplicate content for humans, but in fear of G's filters I stopped doing this.
- I had an affiliate template for selling ringtones as part of my site. There are many other of the same affiliate sites out there, so again I was afraid of being banned or filtered due to duplicate content, and I blocked the template via robots.txt (I guess I should also put the noindex on the template itself, as at least Yahoo is indexing the page even though I blocked it in robots.txt).
- I have cell phone reviews, and the technical specs of the phones are what they are, which is why similar-looking pages appear on many other sites as well. I do not know how this is treated, but here too I now try to use other words than the original phone specs given by the manufacturer (use "cell phone" instead of "handset", etc.).
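A side note on the robots.txt point above, since at least Yahoo kept indexing the blocked template: a robots.txt Disallow only stops crawling, it does not reliably remove a page from the index, and a bot that is blocked from crawling can never even see a noindex tag on the page. The usual fix is to allow crawling and put the meta robots tag on the template itself. A minimal sketch (the /ringtone-template/ path is hypothetical):

```
# robots.txt -- stops crawling only; an already-known URL can still
# show up in the index as a bare URL-only listing
User-agent: *
Disallow: /ringtone-template/
```

```html
<!-- On the template page itself: this is what engines treat as
     "do not index this page" (the bot must be allowed to crawl it
     for this tag to be seen at all) -->
<meta name="robots" content="noindex, follow">
```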
I think this rewriting of the tech specs is actually really stupid, but what can you do? Google says make pages for human readers, not for SEs. Right - then I wouldn't rewrite the specs, and I would quote the original press release, since I think that is what you do to make a good article (I'm also a journalist, so I think I know something about writing too).
All the above were quite easy steps to prevent duplicate content. Then came the off-page factors that I'm still working on.
In August I had more than 400 incoming links from scraper sites that used my content, or directory sites that had my URL along with snippets of my site. I guess those sites somehow could be treated as having the same content as my site. And most of them were spammy sites, so links from bad neighborhoods could be an additional problem. Google helped me out with these scrapers too (they have been banned / removed etc.), but there are still many of them. There is not much you can do about this.
Then there is the hijacking and related issues, and in a way you could say (?) they are duplicate content matters as well. There were several sites (and still some) that showed my page "within their URL". I'm not sure what techniques were used, but they showed (still show) my page and content even though the URL is theirs. Some of the sites have been removed from Google thanks to G, and some of them actually removed the redirect "thing" when I contacted them.
I also rewrote pages that I found had been copied elsewhere. Not a nice job - the content was originally written by me, but what can you do when it has been copied to tens or hundreds of sites out there? Can you be sure G knows yours is the original?
Now I have a copyscape logo on my site and info that anyone taking my content will be reported without any notice. I do not know if this helps, but you have to try...
And I'm still not ranking on Google, but I see some odd signs of hope. One page is now ranking 30+ on a very competitive keyword. All other pages are still nowhere, or buried at 60+ in searches (used to be 1-10).
| 8:18 am on Dec 13, 2005 (gmt 0)|
A similar thing happened to me back in June. A client's financial advice site, mainly for existing clients (not passworded, but nofollow tags inserted), got hit by Google. The only thing I could think of was our use of paid-for syndicated articles from the financial info body that advises firms like these.
I got round it, and back in the charts in August, by using quote marks around my 'read more' phrases and a full copyright notice on each intro page. I then linked these to framed pages which surrounded the basic info page supplied by the financial body. Thus my customers were none the wiser, and Google only credited the originators of the document.
Now, a few months on, the page views for this site have not fallen from their pre-June figures.
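For what it's worth, the framed-page approach described above can be sketched roughly like this: the host page carries only the original intro text and the copyright notice, and the supplied document is pulled in via an iframe, so its text is never part of the host page's own indexable HTML. All names and paths here are hypothetical:

```html
<!-- Intro page: only the original summary and the copyright notice
     live in the indexable HTML of this page -->
<html>
<head><title>Pension changes - summary</title></head>
<body>
  <p>"Read more" about the pension changes below.
     &copy; 2005 Example Advisers Ltd.</p>
  <!-- The syndicated document is framed in, not copied in, so the
       originator keeps the credit for its text -->
  <iframe src="/syndicated/pension-article.html"
          width="100%" height="600"></iframe>
</body>
</html>
```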
| 9:46 am on Dec 13, 2005 (gmt 0)|
"I also rewrote pages that I found had been copied elsewhere. Not a nice job - the content was originally written by me"
Yes, I have done that to every page on my whole site. It has made absolutely NO difference. A complete waste of time so far.
"Can you be sure G knows yours is the original? "
Google can/does not decide correctly, that is the problem.
"Pages still don't rank but I am hoping"
I think that about sums it up. Many of us are "hoping" that Google can sort this out along with the other Jagger/Sept 22 issues.
I have decided that running around trying to "deal" with the consequences of Jagger is the wrong strategy.
The majority of the issues sites are having are caused by Google bugs (canonical, duplicate, links etc.). It's impossible, with all the fog, to see the wood for the trees.
These are Google's problems, not ours. IMO if you run around in circles trying to fix your sites now, you may have to do it all over again in three months when the issues are finally resolved/rolled back.
| 9:57 am on Dec 13, 2005 (gmt 0)|
Hi Reseller from a blindingly sunny (but freezing) day here in Madrid!
Yes, I made some changes for a very small number of pages that I could use to draw conclusions from.
It worked to a degree, i.e. it increased the number of pages that were ranking in the index, so they ranked a bit better.
My problem is not with those few pages that are still in the index, but rather with the hundreds of pages that are now blocked in Google, and the complete original site lost to an (assumed) duplicate filter.
Remember, we are now nearly three months after Sept 22.
So I say again that we cannot go on running round in circles trying to fix things for Google; they have to clean house a bit first!
| 11:06 am on Dec 13, 2005 (gmt 0)|
" hundred of pages that are blocked now in Google "
Cleanup, my advice: Google will never index those pages again, but it will keep them in its daft brain, so transfer them to another directory and wait until they get new PR. Of course not all at once, because you will raise a red flag with hundreds of new pages. Also, don't delete the directory; make all the old pages just blank text, or put images on them, or whatever. Remember, Google holds on to all the data even if it is not in its index, so to be sure you don't get another dup penalty a few months down the line, don't delete the directory or the pages that went MIA. Just change them.
| 11:14 am on Dec 13, 2005 (gmt 0)|
Another piece of advice: don't underestimate Yahoo because you are obsessed with Google. If your pages have top rankings there, do not delete them or change them. If I hadn't had top rankings at Yahoo, I could be starving today.
| 11:42 am on Dec 13, 2005 (gmt 0)|
|Hi Reseller from a blindingly sunny (but freezing) day here in Madrid! |
You must be kidding? Can you be freezing in Madrid? Merry Christmas from the north (Finland :-)
Anyway. Could rewriting and a reinclusion request be something worth trying?
| 12:21 pm on Dec 13, 2005 (gmt 0)|
Reseller, as far as I can see the thread has no future; it will be boring like the saga one. Google is ice-dry cold, like the north wind that blows at Nyhavn and at Kongens Nytorv while you're skating (do they put the ice in this year?), even if it is a Sunday sunshine morning in the middle of December. Hmm... I'd better drink a Glühwein on a frozen Alster in Hamburg instead... Google. What a waste of time for us West Europeans... if you know what I mean.
| 12:24 pm on Dec 13, 2005 (gmt 0)|
Cold in Madrid? Oh yes, my friend. A continental climate (hot summers and cold winters) plus an altitude of 750m mean that, although it is nearly always bright, winter temperatures are often the same as or lower than in my native London.
"transfer them to another directory.And wait until they get a new PR."
Worth a try I suppose; I will run some tests. As you say, I don't want in any way to screw with my MSN and Yahoo success just for the sake of a screwed-up Google!
| 12:25 pm on Dec 13, 2005 (gmt 0)|
As far as I can see on this page, Google hates Western Europe; I can see only postings from DK, FI, DE, ES, UK... Funny, innit?
| 12:28 pm on Dec 13, 2005 (gmt 0)|
Cleanup, thank you very much, my friend. Don't forget your enemies.
| 12:42 pm on Dec 13, 2005 (gmt 0)|
Why not - cheers! ;)
Why do we keep on losing our national identities? In favor of a PAX ROMANA?
| 12:51 pm on Dec 13, 2005 (gmt 0)|
I have not lost my national identity; I am still a Londoner, despite having lived 12 years in Madrid.
| 12:53 pm on Dec 13, 2005 (gmt 0)|
Unfortunately this forum administration is not a European administration, so I know my future here. As soon as the insiders wake up back in (ROME), they will ban me for posting here. LOL.
But first they had better read the book
The Last Days of Pompeii.
| 12:54 pm on Dec 13, 2005 (gmt 0)|
Cleanup, sorry, I thought you were from Spain.
| 3:15 pm on Dec 13, 2005 (gmt 0)|
Right, so what are we talking about here?
| 3:39 pm on Dec 13, 2005 (gmt 0)|
Very nice "language" exchange :-)
Let's go back to:
Dealing With Consequences of Jagger Update
| 4:43 pm on Dec 13, 2005 (gmt 0)|
I agree about the dupe content now being turned up high.
My site departments are set out so that the user can easily access products (no more than two clicks away from the home page, ISTR someone saying), so we would have body widgetery, and that would be split up into subsections called, for example, gold body widgetery.
One very large section is split into three - metal body widgetery, black metal body widgetery, gold metal body widgetery, etc. All three of those KWs used to rank highly, but now they are all demoted in the SERPs, because when you search for metal body widgetery, G finds all three pages even though two of them are a different kind of metal body widgetery. It seems to be attaching more weight to the keyword on the page.
I have now moved the two X metal body widgetery sections inside the metal body widgetery section to see what happens, but again this is making it harder for the user, plus the PR won't be passed down as the sections are deeper in the site. Most of my third level sections have been dropped by G, so it may be that they won't get found at all, but being at position 100+ is pretty useless anyway.
It seems to me that New Google places much more importance on the *hierarchy* of the site structure than previously. I could be wrong but this is a message I am getting on a daily basis now I am seeing how the update is panning out for my site.
| 4:52 pm on Dec 13, 2005 (gmt 0)|
Actually I'm revising that a bit - G is ignoring the black metal widgetery section totally and returning a completely different page with *1* single item with that word in it, amongst 30 other items without that word!
I still reckon over-optimisation and dupe content factor highly.
| 5:01 pm on Dec 13, 2005 (gmt 0)|
I have a site that contains categories, sub-categories, and then individual items. Some of the sub-cats are broken down into as many as 30 individual titles, before seeing an item list, and so far it hasn't affected my site whatsoever.
I am mentioning this in case you are changing your site around for the wrong reasons. But more than that I can't help more....sorry :-(
| 5:05 pm on Dec 13, 2005 (gmt 0)|
Thanks Eazy - I had other issues with the site too so I don't know exactly, but we have a lot of complex and similar items on the site. I can't see why else G would ignore the main section for the particular kind of widget and just show the page where the word appears only once.
I am just guessing over-optimisation filter.
Our rankings are so pathetic I can't see that I'm doing harm by making a minor tweak and seeing what happens - if that is the problem, a minor tweak might be all it takes. :)
| 5:57 pm on Dec 13, 2005 (gmt 0)|
Sorry I can't be more helpful. You may want to post on the Google public support forum at Google Groups, as there are some good guys on there who post replies. I use it to keep up with certain info.
| 8:20 pm on Dec 13, 2005 (gmt 0)|
Why do some results for my pages use page text snippets in the description and others the "meta" description?
Is this good, bad, or neither?
| 8:34 pm on Dec 13, 2005 (gmt 0)|
Meta descriptions will be used as snippets usually only when the keyword / keyphrase is not present on the page (eg. you rank for the keyword because of link text or keyword in URL).
I have however seen instances where the meta description is shown as a snippet when the keyword / phrase is on page. When this occurs the site never ranks for that phrase in Google.
E.g. I believe I should be ranking for "blue widgets"; I am top in MSN and Yahoo and nowhere in Google. If I put "site:www.example.com blue widgets" into Google, it returns pages, but with the meta description as the snippets.
| 11:32 pm on Dec 13, 2005 (gmt 0)|
Thanks, reseller, for this topic. I believe there is much to be discussed.
As far as duplicate content goes... I find it hard to believe that a scraper scraping a percentage of content would be held against the originating site. To me that would be asinine. That would actually create another tool in an unsightly competitor's arsenal of ways to destroy another's site.
I don't know what it would take for Google to penalize/downgrade for onsite duplication. To me it would be a page-only thing, not a site-wide thing. Why would they penalize the complete site for, let's say, re-distributed articles, even if it had a good percentage of unique content? Maybe they view this content as a way to push up internal PR, which in turn links to other parts of the site? Why not just discount the single pages considered duplicate, and the links from those pages, leaving the original content in place?
Many webmasters worry about using snippets of their own articles/content when linking to the actual articles/content, as in topic sub-sections. How is this viewed by Google? How about RSS feeds to other sites that use snippets of articles?
You know, there are just too many questions along this line. We will never get a straight answer from G anyway, and since there are so many factors in play... even sharing experiences can lead us into false conclusions... still, it would be fun to discuss.
| 11:58 pm on Dec 13, 2005 (gmt 0)|
>>> I have however seen instances where the meta description is shown as a snippet when the keyword / phrase is on page. When this occurs the site never ranks for that phrase in Google. <<<
That's not true! Pre-Jagger had us at 1-10 for around 10 very competitive 2- and 3-word phrases (all with 50-200m results), all with the phrases in the meta description and body, and every one displayed the meta description.
Post-Jagger we now rank #8 for one of these phrases (60m results). Our meta description is still displayed. This includes the key phrase, as does the body text. Other phrases are down between 2nd and 4th pages, but all (except 1) are still displayed with meta descriptions.
Usually, it's the other way round from what you stated - i.e. If G cannot find the exact phrase in the meta description, it uses a body snippet instead. However, there are occasions where it drops the meta description in favor of either a snippet, or more often your DMOZ description -- I previously believed this was when the phrase was too prominent in the meta description, but not prominent enough on the page itself (also if the meta description was too long), but I now believe this also happens when it's too prominent site-wide (i.e. title straps, etc.).
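Pulling the observations in the last two posts together, the behavior reads like a rough preference order among snippet sources. Purely as an illustration (Google's actual selection logic is not public; this is just a toy model of what the posts describe, with made-up example data):

```python
def choose_snippet(phrase, meta_description, body_text, dmoz_description=None):
    """Toy model of the snippet selection described above.
    Illustrative only -- Google's real algorithm is not public."""
    p = phrase.lower()
    # 1. Exact phrase in the meta description -> show the meta description.
    if meta_description and p in meta_description.lower():
        return meta_description
    # 2. Otherwise, if the phrase occurs in the body, show a window of
    #    body text around the first occurrence.
    body_lower = body_text.lower() if body_text else ""
    if p in body_lower:
        i = body_lower.index(p)
        start = max(0, i - 60)
        return body_text[start:i + len(p) + 60].strip()
    # 3. Occasionally a directory (DMOZ) description is used instead.
    if dmoz_description:
        return dmoz_description
    # 4. Fall back to whatever is available.
    return meta_description or (body_text or "")[:150]
```

On this model, a page that ranks via link text or URL keywords alone (phrase absent from the body) would show its meta description or DMOZ description, which matches the "keyword not present on the page" case described above.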
| 12:49 am on Dec 14, 2005 (gmt 0)|
Just illustrating what I've seen. I have never seen the meta description used in favour of the body text for a 1-10 ranking. Doesn't mean it doesn't happen. I would be interested in an example, though.
| 2:19 am on Dec 14, 2005 (gmt 0)|
Just sent you a sticky.
| 7:14 am on Dec 14, 2005 (gmt 0)|
Good morning Folks
>> I don't know what it would take for Google to penalize/downgrade for onsite duplication. To me I would think it would be a page only thing not a site-wide thing. <<
In the case of onsite duplication, if the duplication happens on only a few pages, I guess it's mostly a page penalty.
| 8:00 am on Dec 14, 2005 (gmt 0)|
"In the case of onsite duplication. If duplications happen on few pages, I guess its mostly a page penalty."
That would be my guess. Well, not actually a penalty, but more like being filtered.
Now, in the case of re-distributed articles, I would say any single page that is re-distributed may get filtered, rather than there being a site-wide penalty/exclusion. The effect, of course, will depend on how many of your pages are re-distributed content. Which, in my mind, I have no problem with, as long as the unique content is left undisturbed and only benefits from links from other unique content on and off site. I can see how re-distributed articles can be used to "push" up PR internally, and in that case Google can just downgrade the effects of such pages. Distributing articles elsewhere can also be used to manipulate PR, for which G can (and I think does, in some instances) downgrade as well. This enables site owners to share information, benefiting directly from site to site, but not getting an unfair benefit in the G SERPs. I could be wrong, though.
Now, offsite duplication is a bit out of control, in my opinion. It remains a full-time job for many of us to keep up with all of that. And that is complete duplication of content (full articles and even design). If G did crank up their dupe filter for offsite occurrences, it really could make a mess of things, given the many thousands of scraper sites scraping bits of content (anyone with high-ranking sites will surely have a scraper problem). I really have a hard time believing that G would intentionally penalize a site for such occurrences. I believe they know that these scrapers are way out of control and that most webmasters cannot keep up with it all.
Now, if G did a sudden downgrade of any benefit from scraper links, that could affect sites and make it appear to be a "penalty". When links from such sites that once were keeping you high in the SERPs are taken away, your site falls off the face of the planet.