Jagger is winding down and life must go on. If Jagger has been kind to your site, congrats. But for the rest of our fellow members whose sites lost rankings or dropped out of the index, it's time to do some thinking and decide what to improve or change on your affected websites. Still, ethical measures are what interest me most.
Some food for thought.
My site was hit by Allegra (2-3 Feb 2005) and lost 75% of its Google referrals, then was hit a second time on 22nd July 2005, ending up with only 5-10% of its pre-Allegra Google referrals.
My site is now back to around 50% of its pre-Allegra Google referrals and growing... until further notice. I say "until further notice" because who knows what the next update or "everflux" will do to my site!
Before my site came back around 19-22 Sept 2005 (very slowly at the beginning), I had gone through it several times over several months and done the following:
- removed duplicate pages. In my case these were several test pages (some dating back to 1997) that I had simply forgotten on the server.
- removed one or two 100% frame pages.
- removed some pre-sell affiliate program pages with content provided entirely by affiliate program vendors.
- removed a few (affiliate referral) outbound links that were on the menu bar of every page (in other words, sitewide links).
- on resource pages, reduced the outbound links to fewer than 100.
- made a 301 redirect from non-www to www (thanks to my good Norwich friend Dayo-UK); there is a sketch of this after the list.
- finally filed a reinclusion request in accordance with the guidelines posted on Matt's blog (thanks Mr. Inigo).
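For anyone who wants to try the non-www to www redirect mentioned above, here is a minimal sketch for Apache with mod_rewrite enabled, with example.com standing in for your own domain:

# send every non-www request to the www hostname with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The R=301 flag makes the redirect permanent, so Google should consolidate the two hostnames onto the www version instead of treating them as duplicate sites.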
Would you be kind enough to tell us how the Jagger update affected your site, and what you intend to do about it?
I do not know the exact reason, but I have made several modifications since then.
- I have a cell-phone-related site and used to comment on press releases when a new phone hit the market. I would quote some key line from the press release and then comment on the new phone with my own editorial material. I don't think that is duplicate content to a human, but for fear of G's filters I stopped doing this.
- I had an affiliate template for selling ringtones as part of my site. There are many other copies of the same affiliate site out there, so again I was afraid of being banned or filtered for duplicate content, and I blocked the template via robots.txt (I guess I should also put a noindex tag on the template itself, as at least Yahoo is still indexing those pages even though they are blocked in robots.txt; see the snippet after this list).
- I have cell phone reviews, and the technical specs of the phones are what they are, which is why similar-looking pages appear on many other sites too. I do not know how this is treated, but here too I now try to use different words from the original specs given by the manufacturer (use "cell phone" instead of "handset", etc.).
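On the robots.txt point above, a minimal sketch of both approaches, assuming the template pages live under a /ringtones/ directory (the path is invented for illustration). In robots.txt:

# stop crawlers from fetching the affiliate template pages
User-agent: *
Disallow: /ringtones/

And in the <head> of the template itself:

<meta name="robots" content="noindex,follow">

Bear in mind that a robots.txt Disallow only blocks crawling; a URL blocked that way can still show up in an index if other sites link to it, which may explain what Yahoo is doing. The meta noindex tag, on the other hand, can only be seen if the page is crawlable, so the Disallow would have to come off for the tag to take effect.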
I think this rewriting of the tech specs is actually really stupid, but what can you do? Google says to make pages for human readers, not for SEs. Right - in that case I wouldn't rewrite the specs, and I would quote the original press release, because that is what you do to make a good article (I'm also a journalist, so I think I know something about writing too).
All of the above were fairly easy steps to prevent duplicate content. Then came the off-page factors, which I'm still working on.
In August I had more than 400 incoming links from scraper sites that used my content, or directory sites that listed my URL along with snippets of my site. I guess those sites could somehow be treated as having the same content as my site. And most of them were spammy sites, so links from bad neighborhoods could be an additional problem. Google helped me out with these scrapers too (they have been banned / removed, etc.), but there are still many of them. There is not much you can do about this.
Then there are the hijacking and related issues, which in a way you could say (?) are duplicate content matters too. There were several sites (and still are some) that showed my page "within their URL". I'm not sure what techniques were used, but they showed (and still show) my page and content even though the URL is theirs. Some of the sites have been removed from Google thanks to G, and some of them actually removed the redirect "thing" when I contacted them.
I also rewrote pages that I found had been copied elsewhere. Not a nice job - the content was originally written by me, but what can you do when it has been copied to tens or hundreds of sites out there? Can you be sure G knows yours is the original?
Now I have a Copyscape logo on my site and a notice that anyone taking my content will be reported without warning. I do not know if this helps, but you have to try...
And I'm still not ranking on Google, but I see some odd signs of hope. One page is now ranking at 30+ for a very competitive keyword. All other pages are still nowhere, or buried at 60+ in the results (they used to be 1-10).
I got round it, and back in the charts in August, by using quote marks around my 'read more' phrases and a full copyright notice on each intro page. I then linked these to framed pages that wrapped the basic info page supplied by the financial body. Thus my customers were none the wiser, and Google only credited the originators of the document.
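For anyone curious, the frame wrapper described above can be as simple as this (filenames invented for illustration):

<html>
<head><title>Intro and full details</title></head>
<!-- top frame: my own intro page with the copyright notice -->
<!-- bottom frame: the info page supplied by the financial body -->
<frameset rows="20%,80%">
<frame src="my-intro.html">
<frame src="supplier-info.html">
</frameset>
</html>

The supplied document sits in its own file, so what my visitors see is unchanged while the duplicated text stays out of my own indexable pages.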
Now, a few months on, the page views for this site have not fallen from their pre-June figures.
Yes, I have done that to every page on my whole site. It has made absolutely NO difference. Complete waste of time so far.
"Can you be sure G knows yours is the original? "
Google cannot, or does not, decide correctly; that is the problem.
"Pages still don't rank but I am hoping"
I think that about sums it up. Many of us are "hoping" that Google can sort this out along with the other Jagger/Sept 22 issues.
I have decided that running around trying to "deal" with the consequences of Jagger is the wrong strategy.
The majority of the issues sites are having are caused by Google bugs (canonical, duplicate, links, etc.). With all the fog, it's impossible to see the wood for the trees.
These are Google's problems, not ours. IMO, if you run around in circles trying to fix your sites now, you may have to do it all over again in three months when the issues are finally resolved/rolled back.
Yes, I made some changes for a very small number of pages that I could use to draw conclusions from.
It worked to a degree, i.e. the pages that were still ranking in the index ranked a bit better.
My problem is not with those few pages that are still in the index, but rather with the hundreds of pages that are now blocked in Google, and the complete original site lost to an (assumed) duplicate filter.
Remember, we are now nearly three months after Sept 22.
So I say again: we cannot go on running round in circles trying to fix things for Google; they have to clean house a bit first!
"transfer them to another directory.And wait until they get a new PR."
Worth a try, I suppose; I will run some tests. As you say, I don't want in any way to screw with my MSN and Yahoo success just for the sake of a screwed-up Google!
I have a site that contains categories, sub-categories, and then individual items. Some of the sub-cats are broken down into as many as 30 individual titles before you see an item list, and so far it hasn't affected my site whatsoever.
I am mentioning this in case you are changing your site around for the wrong reasons. But beyond that I can't help more... sorry :-(
Meta descriptions will usually be used as snippets only when the keyword / key phrase is not present on the page (e.g. you rank for the keyword because of link text or a keyword in the URL).
I have, however, seen instances where the meta description is shown as a snippet even when the keyword / phrase is on the page. When this occurs, the site never ranks for that phrase in Google.
E.g. I believe I should be ranking for "blue widgets"; I am top in MSN and Yahoo and nowhere in Google. If I put "site:www.example.com blue widgets" into Google, it returns pages, but with the meta description as the snippet.
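Just so we are all talking about the same thing, the meta description is the tag in the page head; the content below is invented purely to match the "blue widgets" example:

<head>
<title>Blue Widgets - Example Store</title>
<!-- the text Google may show as the SERP snippet -->
<meta name="description" content="Hand-finished blue widgets in all sizes, shipped worldwide.">
</head>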
As far as duplicate content goes... I find it hard to believe that a scraper scraping a percentage of content would be held against the originating site. To me that would be asinine. It would actually create another tool in an unscrupulous competitor's arsenal of ways to destroy another's site.
I don't know what it would take for Google to penalize/downgrade for on-site duplication. I would think it would be a page-only thing, not a site-wide thing. Why would they penalize the complete site for, let's say, redistributed articles, even if it had a good percentage of unique content? Maybe they view this content as a way to push up internal PR, which in turn links to other parts of the site? Why not just discount the single pages considered duplicates, and the links from those pages, leaving the original content in place?
Many webmasters worry about using snippets of their own articles/content when linking to the actual articles/content, as in topic sub-sections. How is this viewed by Google? How about RSS feeds to other sites that use snippets of articles?
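To be clear about the RSS pattern I mean, a feed item that carries only a teaser rather than the full article might look like this (URL and text invented for illustration):

<item>
<title>New Widget Phone Reviewed</title>
<link>http://www.example.com/reviews/widget-phone.html</link>
<!-- only the first sentence or two of the article, not the full text -->
<description>We spent a week with the new widget phone; here are our first impressions...</description>
</item>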
You know, there are just too many questions along these lines. We will never get a straight answer from G anyway, and since there are so many factors in play... even sharing experiences can lead us to false conclusions... still, it would be fun to discuss.
>>> I have, however, seen instances where the meta description is shown as a snippet even when the keyword / phrase is on the page. When this occurs, the site never ranks for that phrase in Google. <<<
That's not true! Pre-Jagger we were at 1-10 for around 10 very competitive two- and three-word phrases, all with 50-200m results, all with the phrases in the meta description and body, and every one displayed the meta description.
Post-Jagger, we now rank #8 for one of these phrases (60m results). Our meta description is still displayed; it includes the key phrase, as does the body text. Other phrases are down between the 2nd and 4th pages, but all (except one) are still displayed with meta descriptions.
Usually, it's the other way round from what you stated - i.e. if G cannot find the exact phrase in the meta description, it uses a body snippet instead. However, there are occasions where it drops the meta description in favor of either a snippet or, more often, your DMOZ description. I previously believed this happened when the phrase was too prominent in the meta description but not prominent enough on the page itself (also when the meta description was too long), but I now believe it also happens when the phrase is too prominent site-wide (i.e. in title straps, etc.).
>> I don't know what it would take for Google to penalize/downgrade for on-site duplication. I would think it would be a page-only thing, not a site-wide thing. <<
In the case of on-site duplication: if the duplication happens on only a few pages, I guess it's mostly a page-level penalty.