Forum Moderators: Robert Charlton & goodroi
I'm not an SEO or HTML expert; far from it, but I've taken what I learned at WebmasterWorld and applied it fairly consistently with excellent results -- until now.
Now, I've gone from over 7,000 visitors per day to just over 3,000 - and my affiliate revenue has gone down the toilet.
I've read this thread, trying to figure out this stuff about duplicate content penalties etc. And I've done the &filter=0 thing on my site - and, yep, it's showing me right at the top when I do that, and right out of sight on many, many pages without it.
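(For anyone who hasn't tried the check: you append &filter=0 to the Google results URL to turn the duplicate/similar-results filtering off. The query below is just an illustrative placeholder, not my actual search.)

```
# Normal results (filter on):
http://www.google.com/search?q=example+keyword

# Same search with filtering disabled:
http://www.google.com/search?q=example+keyword&filter=0
```

If your page jumps from nowhere to the top when you add the parameter, that's the pattern people in this thread are describing.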
Ironically, although lots of my original content pages have hit the dumper, one page that still ranks well is one of a small proportion that I copied with permission from a free reprint group.
One possibility occurred to me... a couple of months ago, I added a PHP-based affiliate product catalog to my site. I placed this in its own directory, linking to it from my home page and relevant places within my site. This brought my page count in Google up from about 350 pages to about 1,800 pages - but, of course, the extra pages were exact reproductions of the affiliate site. I hadn't considered this impact when I installed it, nor did I promote it heavily - it was more a service to readers than anything else. Now I'm wondering... could it be this catalog that has tripped the update filter?
From Google's perspective, I guess, I'm now featuring predominantly copied pages, and I've had a huge increase in pages in a very short time period. Makes me look bad.
I thought this experience might help us all to sort out the scope of this filter or penalty. And my question is: can I recover my respectability by dumping this catalog post-haste?
We have hundreds of thousands of pages from world news articles to info pages on movies, book, etc.
We don't get up to anything shifty with SEO, we have thousands of inbound links (mostly direct to articles), and in fact we are even a Google News source site. Until last week we were appearing on the Google News main page about 20 times a day.
However, whatever they have done has ripped our traffic down to our daily repeat visitors. I think currently we are getting next to nothing from Google Search or news. I also noticed our peers have all seen traffic drop like a stone.
So I would not change anything. I know I won't be; I will eat my hat if it stays like this... might have to, as I'll be short of cash :-)
I will be interested to see what anyone from Google has to say about all this.
I did note a small change Sunday morning, with another page of results showing on a certain search... so I think things are still afoot. Also, the fact that some questionable sites that disappeared a while ago are back makes me think this is almost like a regressive change. I noticed one of our peers who changed domains a year or so ago now comes up under their old domain - clearly how they used to be listed at that time, and incorrect.
If it does stay like this until the end of the week, then I will contact AdSense and ask them about it...
James
I don't know what to do. My AdSense income tanked big time. The only major change I saw was that the number of results when searching the name of our site increased from 18,000 to 84,000. The site is totally white hat (I don't even use H1 tags and all that; I just develop good content).
I really did think that it was as bad as it would get, but no. It's getting worse.
On Sunday, for the first time ever, the number of people coming from bookmarks exceeded the Google-supplied people. Jeez.
Re: Affected sites since 22nd Sept 2005
We observe changes, suggest, and guess in good faith. That's all we can do.
But we need to be open-minded too during any update, and look at all possibilities that might have caused a site to drop.
Fellow members have been focusing mostly on the duplicate (stolen) content issue. Fair enough... power to you. But aren't there other factors that we really need to look at?
- How about duplicates within your own site of your own making (in good faith of course)?
- How about the presence of javascript redirects?
- How about 100% frames pages with contents originated from other sites than yours?
- How about pages with "gibberish" texts?
- How about those "lovely" doorway pages?
- How about sitewide linking?
- How about...how about etc...
The reason I'm asking such questions is that, reading between the lines of the recent posts by our good friends at the 'plex, GoogleGuy and Matt, fighting spam is becoming more and more priority No. 1 for Google.
If we assume that the main priority of the current update is fighting spam, then we need to look at all the factors that Google marks as such.
Therefore this post ;-)
[edited by: reseller at 6:51 am (utc) on Sep. 26, 2005]
Wouldn't it be better if Google (GG or Matt Cutts) just explained, at least in general terms, what they are trying to fight against, so that all good-faith webmasters could try to follow such guidelines?
Just a thought in case GG or MC read this thread...
Over the last year or two, Google has realised that there is only a certain number of "money terms" - they have the best information available on this via AdWords and AdSense. Let's say they have taken the first 10,000 terms - and that they have two or three indexes that we cannot access.
For this example, let's say there are two test indexes and then the final index, which is the public one - this public index is introduced via all the data centers as normal.
Index one is EVERYTHING, with no filters applied and new sites ranking, etc. These are then manually reviewed and flagged as "sandboxed" - I have seen evidence of these manual reviews somewhere, maybe in this forum. Such sites appear in the public index but are automatically "sandboxed" for the SERPs, i.e. indexed but filtered completely out of the SERPs unless they avoid triggering one of the 10,000 (maybe it's 500,000) predefined terms (the "money dictionary"). If a site "passes" the spam guidelines during this process, it is released from the first index's sandbox. This then becomes index two.
In index two, the same process takes place, except this time if the site "passes" it is released from the sandbox straight into the main index.
This would explain the 3-to-12-month sandbox. I have looked at the so-called Google spam guide that was apparently released by Google to its reviewers.
A possible way around this is to get the site through the sandbox - once it's through, keep adding pages, which automatically go straight through. I think this is what people are now seeing, i.e. the manual review is now being applied to pages rather than sites - or, if content on existing old pages is altered too drastically, the site gets put back into the sandbox for review again.
The spam report, if real, does make sense - basically, a site needs to be there because users would want the information it provides, not a bunch of text muddled together or stolen, with affiliate links or PPC.
I just wish Google would give us a break and tell us how we should pass a manual review, if the above is correct. What are they looking for? They have told us in the past. They keep talking as if there are no manual reviews.
Bear sent me a message with some interesting hijack results on my site...but both links end up nowhere...one with an "account suspended page" and the other with a page not found...so i'm not sure if I can use the Google removal tool effectively for those.
Perhaps there's a combination of factors at play here, at least for me... the two 302 hijack links and the increase in pages from the affiliate catalog (last time I ever take the easy way out and let an affiliate "create" pages for my site), plus whatever weird dup-content filter Google has cooked up (it seems to work in reverse for me).
I had removed one of the hijack links using Google's removal tool in March. But the removal tool says it will only remove the link for 180 days... thought I should mention this, in case other people have hijackers coming back to haunt them after the 6-month break.
I agree December 04 is the time that I think problems started - did not see much improvements in Feb for the areas I watch though.
GG did mention that an update to the supplemental index was expected this summer - well, summer is just about over in the UK and there are no signs of it here.
But as has been mentioned here - even if Google does update the supplemental index there is always the old supplementals that keep popping back.
Whether this is the root of the problem I am not sure. The supplementals are certainly badly thought out and a mess, IMO - but with the other issues of 302s, canonical URL problems, and stupid page counts now embedded into the index (probably due to supplementals), I just wonder which problem causes the decline of good long-standing sites. Probably all related, of course - but how do you come back if G just won't give up on the old 302s, the old supplemental pages, and the old wrongly indexed sites (non-www vs. www)?
That time I got hit, and began to recover in Feb. The only change I made then was changing meta description on the home page.
I've never really figured out why it happened - but now the exact same thing has happened again.
There have been no major changes to my site recently, nothing out of the ordinary. I honestly think this is collateral damage caused by some seriously faulty duplicate content detection.
Hard to believe that it's nearly a year later and Google still hasn't got it right. In the meantime I scratch my head and have no idea why my site suddenly plunges in the SERPs for no apparent reason...
I am sorry to be so harsh, but this is business. It doesn't always go your way.
There is far too much emotion here. Successful businesses do not get ahead by becoming emotionally attached.
If you have lost out because someone has copied your site, then that's tough, but whinging about it will not bring in the dollars.
If you cannot detach emotion from business you might as well go back to being a corporate slave.
Our problem is/was that we are a large non-commercial site, of some importance in our niche area. Having had some staff research this over the weekend, we have concluded that the problem here is that we added an entire library a few weeks ago.
Many of the published papers in this had been published by our people elsewhere, as we don't restrict them, but many had not. We are assuming that this has 'tripped' the new filter in some way, either through duplication or through the simple addition itself.
For us the issue however is not the listing of the library, it is that the filter has hit the entire site and caused it to 'disappear' in terms of meaningful searches. We have therefore taken the step of restricting 'Googlebot' from all our library sections.
We frankly doubt that this will recover the situation, but it seems a sensible and simple step. We will not be contacting Google on the matter, as really we do not think we should be reacting too much to the vagaries of an individual search engine, and indeed, it could be that they will correct whatever they have done here themselves.
Also, to prevent the same thing happening to other institutes in our niche, we have this morning emailed them all with a full explanation of what has happened, and of how to restrict 'Googlebot' from specific website areas. Many of these will undoubtedly put that code in place before adding their own libraries, which at least will help them.
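For anyone wanting to do the same, the restriction is a couple of lines in robots.txt at the site root. This is a minimal sketch - the /library/ path is just a placeholder for wherever your library section actually lives:

```
# Block Google's crawler from the library section only.
# Other crawlers, and the rest of the site, are unaffected.
User-agent: Googlebot
Disallow: /library/
```

Note the trailing slash keeps the rule scoped to that directory; a bare "Disallow: /" would block the whole site.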
For us though now the chapter is closed, as I doubt we can do too much else. I do wish the rest of you well though, as it is clear that many quality websites have been unfairly penalized by this.
Maybe you can speed up the cleaning by using the Google URL removal console. But use it with care.
[services.google.com:8882...]
(I hope this URL is allowed here. If not, please delete...)
One day, if the FBI and Interpol investigate where organized crime has invested its money-laundering proceeds, they will find millions of AdSense pages...
A fantastic scenario:
#1: Since Google became a publicly traded company, who do you think bought the big portion of the cake?
Answer: a couple of years ago, the porn and gambling industry was the #1 internet business... now declining...
Once the organized-crime godfathers realised there is money out there to be made legally (billions), their consiglieri told them: let's get that money...
#2: The consiglieri told the Mafia: instead of pushing out drugs through our pushers, let them make AdSense pages.
#3: Organized crime buys authority sites (they have all the money you can imagine) with an offer you can't refuse, and puts AdSense on them.
#4: Holding many seats among the Google stockholders, they are able to tell the Googleplex bosses to give orders to their nerd engineers and staff about what to do every couple of months, so they can make more money.
Google is determined to have an end like Michael Jackson's.
"Bad eggs."
Traffic from G dropped to a quarter, and has slowly fluctuated back up to about a half of what it was
In response to reseller's questions:
- How about duplicates within your own site of your own making (in good faith of course)?
None that I know of. The site is static and I make sure that files are uploaded to the right folder.
- How about the presence of javascript redirects?
The only JavaScript I use is from phpAdsNew, to serve the ads that we sell, plus other ad networks like AdSense, etc.
- How about 100% frames pages with contents originated from other sites than yours?
Definitely none. We don't use frames, and we are so stingy with the free links we give to other websites.
- How about pages with "gibberish" texts?
Definitely none. The site's been around since 1997, and it does not need such desperate measures just to earn and get traffic.
- How about those "lovely" doorway pages?
I don't know how to make doorway pages (I'm a non-techie who has heard of CSS but doesn't know how to use it, and I still build the website in FrontPage).
- How about sitewide linking?
The only place where I have sitewide links is my navigation, which links to key sections of my site and is on all my pages.
- How about...how about etc...
I occasionally use articles submitted to us by other writers, but the last time I put up contributed content was early last month. Could that have been the cause?