| 4:13 pm on Sep 23, 2005 (gmt 0)|
50% drop in G-serps in one day on my site also (1,650 average per day in Sept up to and including the 21st, 824 on the 22nd); Yahoo and MSN show no change.
I had asked G to slow their Mozilla-bot down [webmasterworld.com] back in June (they switched it off) and at first assumed that that was the reason. It is a relief to know that there is probably no connection (relief?).
|For the past year I have experienced periodically being completely dropped off Google |
A year ago my site had a million hits/month. In November (then again in Dec, Feb, etc. etc.) the G-serps were slashed by 50%, 90%, etc., slowly rising again over the following weeks. On the 21st Sep the site was at about 50% of the previous year's hits. By Thursday that had dropped to 35%. Many other webmasters are reporting broadly similar behaviour from Google.
It seems that this must now be considered to be "normal" behaviour for the Google Search Engine.
As Webmasters, we either allow the G-bots on to our sites, or ban them; the latter option is a no-no for most of us, but not all [webmasterworld.com].
As regards the frustration, fear and worry that G's actions cause, I honestly believe that it is, in the end, a waste of time (and I have had a fair bit of all 3 myself in the last 9 months). Google is, by their own declaration, an inhuman company (a bit like the Borg?). They rely on technological fixes, and are distrustful of human-mediated actions. They are blinded by the bright lights of their achievements to date (those achievements are undeniable), and unlikely to respond to the nervous quiverings of any individual webmaster. If that is an accurate statement of reality, then it is best to face up to it and get on with life.
| 4:42 pm on Sep 23, 2005 (gmt 0)|
>>Why is it that nobody knows what is going on?<<
Let me guess ;-)
During the current part of the update, the folks at Google are testing new algos, aiming mostly at removing as many spam sites (and sites which are not in accordance with Google's quality guidelines) as possible.
Once they decide on which algos to proceed with, we shall have a weather report either on this forum from GoogleGuy or from Matt on his blog.
| 4:44 pm on Sep 23, 2005 (gmt 0)|
exactly - and the "inner-circle" will just be logging all the current serps and related info to analyse later and test :P
| 4:48 pm on Sep 23, 2005 (gmt 0)|
what is a..... 'frame breaking code on every page'?
how does one add this to the webpage... Dreamweaver 3
| 4:55 pm on Sep 23, 2005 (gmt 0)|
LadyTechy, it's a bit of code to break your site out of being in a frame on someone else's site. There's some code shown a couple of pages back in this thread [webmasterworld.com].
You just need to add it into the head section or body onload attribute, so it's executed as your page is loaded, and it'll break you out of the frame - hope that makes sense.
Personally I don't think that's at all related to the current changes some are seeing....
| 6:14 pm on Sep 23, 2005 (gmt 0)|
|'frame breaking code on every page' |
how does one add this to the webpage... Dreamweaver 3
(to flesh out mcavill's reply):
Whilst I cannot answer for specifics on Dreamweaver (I have never used it), here is general info which is accurate for HTML (I have just implemented one of them, and it works fine--essential code, since there are so many sites that will try to frame your pages):
This is the one I've used; place it between the <head>...</head> tags on each page, usually below the last <meta> statement:
<!-- start Frame-busting code -->
<script type="text/javascript">
if (self != top) top.location = self.location;
</script>
<!-- end Frame-busting code -->
The other option puts the same check in the onload attribute of the <body> tag instead:
<body onload='if (self != top) top.location = self.location;'>
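To see why the check works, here is the same logic factored into a plain function that can be exercised outside a browser. The function name and the window stand-ins below are illustrative only--in a real page you would pass the browser's own self and top objects:

```javascript
// Frame-busting check factored into a plain function so it can be
// tested outside the browser. In a page you would call it as:
//   if (shouldBustFrame(self, top)) top.location = self.location;
function shouldBustFrame(selfWin, topWin) {
  // We are inside someone else's frame whenever our window object
  // is not the topmost window object in the hierarchy.
  return selfWin !== topWin;
}
```

When the page is loaded directly, self and top are the same object and nothing happens; when it is framed, they differ and the redirect fires.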
| 7:17 pm on Sep 23, 2005 (gmt 0)|
when was Google's last major change? This could be the final one (minus tweaking) for this year.
Obviously, I believe this is an update.
| 8:21 pm on Sep 23, 2005 (gmt 0)|
"your site has lost Google's 'trust'."
Unless you click "repeat the search with the omitted results included", in which case "trust" seems to miraculously return.
"I seriously doubt Google's intention is to kill informative sites like mine, but that is exactly what this algo change is doing."
It's only an algo change in the sense of misreading their own data, so calling it that isn't useful. What we have is Google improperly ranking sites because it confuses itself, with many of the most authoritative domains in a niche being the ones hit, because they are the ones most likely to be copied.
Sites that "copied themselves" via non-www and www duplicate content have been hit for a long time, but the key difference now is that a webmaster can do nothing about this, since who steals or copies your content is entirely out of your control... plus the Supplemental index keeps these copies in its database (for all intents and purposes) permanently, meaning you can't even demand the stolen copies be removed, since Google will remember them forever.
The bottom line is, unless Google completely abandons what they have done this week, very clearly any competitor with enough time on their hands can (some percentage of the time) destroy any domain they target.
| 8:31 pm on Sep 23, 2005 (gmt 0)|
Steve, I think you've hit the nail on the head (as usual). In the areas I follow we're seeing the return of a lot of scrapers and other spammy sites to the index, having the effect of pushing down the sites they're scraping. This does not appear to be an algo change per se, but a re-inclusion (or de-filtering/de-penalization) of many sites previously removed from Google's index.
| 8:40 pm on Sep 23, 2005 (gmt 0)|
>> Unless you click "repeat the search with the omitted results included", in which case "trust" seems to miraculously return
that is the classic dupe symptom. Sadly it takes ages (at least it did) for 301-ed sites to come back--at least one major update cycle.
| 8:48 pm on Sep 23, 2005 (gmt 0)|
>>this time it really hurts hard. Even after Allegra my site wasn't wiped like this. I've lost all of my motivation to work on that site. Unfortunately I do not have any motivation to work at all today.<<
But we are only at the beginning of the update. Things tend to change during an update, especially during the last part of it.
>>I think I should take a break until monday...<<
Very wise decision. Have a great weekend ;-)
| 9:25 pm on Sep 23, 2005 (gmt 0)|
Hmmn update eh? And here was I thinking that my confessional re-inclusion request had finally hit a heart string.
| 9:26 pm on Sep 23, 2005 (gmt 0)|
The site I do work for has taken yet another hit.
Doing a bit of poking around I have located at least 3 sites using IP delivery methods to mirror other sites or pages from other sites.
One of these sites is also using a known IE exploit to deliver a virus as well as deliver a duplicate copy of a site.
This situation has been reported to Google.
You should be aware that one of these sites is using a DMOZ dump to provide listings of sites. If you are listed in DMOZ you may already be affected, it depends on how far along Google got with its indexing of the site doing IP delivery.
You should do a search for a fairly unique text string from your home page (put it in quotes for an exact-phrase search).
| 9:33 pm on Sep 23, 2005 (gmt 0)|
|The bottom line is, unless Google completely abandons what they have done this week, very clearly any competitor with enough time on their hands can (some percentage of the time) destroy any domain they target. |
I have several authoritative sites that regularly pick up one way inbounds without my asking for them. The content is outstanding, written by industry experts with whom I share a percentage of revenue.
One of them dropped dramatically from its rankings, and its traffic disappeared. I think it may have been flagged for adding too much content too fast, because I added a site map that brought a lot of forum posts out of Supplemental. It may have statistically fallen outside the norm for what occurs within this quiet niche, which is not known for seo spam.
It may be that what happened is a set of considerations were implemented, including what steveb is advocating. In any case I agree with steve that this has got to be dialed back because there seems to be heavy collateral damage.
| 10:12 pm on Sep 23, 2005 (gmt 0)|
[I think it may have been flagged for adding too much content too fast, because I added a site map that brought a lot of forum posts out of supplemental. It may have statistically fallen out of the norm for what occurs within this quiet niche that is not known for seo spam. It may be that what happened is a set of considerations were implemented, including what steveb is advocating. In any case I agree with steve that this has got to be dialed back because there seems to be heavy collateral damage]
This seems to be exactly what has happened to our site. We added a whole library (scientific) which the sci guys had been thinking about doing for a long time. Oops. The whole site suffers as a consequence.
The main guy here has today emailed the other institutions in our niche to tell them about it and to warn them. The consensus is that they will be fine if they block Google from crawling the new stuff.
That sounds entirely logical, and that is now how some of the others will proceed. It's too late for us though :-(
| 10:18 pm on Sep 23, 2005 (gmt 0)|
Walkman - you are so right!
I took a whole paragraph from my index page and the results surprised the "$%& out of me.
First I see another 250 other sites using my script.
Then when I do the repeat search with omitted results included, I see the same results but with mine at the top!
Update - most of those other sites are just using a snippet from my index page with a link, and a good many of them are out of date and give 404s when clicked on.
This looks like a holy mess to me. I too am going to check back in a few days. This is the craziest index I think I have ever seen.
| 10:40 pm on Sep 23, 2005 (gmt 0)|
IMHO the major SERPS changes are the dupe filter, and it is very smart, believe me. Pages with little unique content should use noindex,follow... don't spam googlebot with tons of useless/similar pages.
One directory of my site was penalized for two weeks, but it quickly came back and is doing very well after I cleaned up some similar old stuff with the removal tool--maybe it was whitelisted after a manual check, I don't know exactly.
| 11:16 pm on Sep 23, 2005 (gmt 0)|
I think my stats proggie has a problem ;-)
9/20/2005 Tuesday 16,369
9/21/2005 Wednesday 16,418
9/22/2005 Thursday 6,922
9/23/2005 Friday 3,959
| 11:17 pm on Sep 23, 2005 (gmt 0)|
And the weirdness just keeps on coming.
An interior page ranked #5 for a competitive term three days ago.
I just checked and that page now ranks #487...
... AND #550...
... AND #615.
Apparently I've just traded Brett Favre for three players to be named later.
(It is still at #5 if you click the more results thingee on the last result page... twenty million results returned for this search btw.)
| 11:26 pm on Sep 23, 2005 (gmt 0)|
>>An interior page ranked #5 for a competitive term three days ago.
I just checked and that page now ranks #487...<<
Any affiliate referral links or PPC spots on that particular page?
| 11:29 pm on Sep 23, 2005 (gmt 0)|
Yep, the ole duplicate content filter.
I'm more than ready to spew out some domain names but Brett would probably revoke my ability to post or exchange sticky mails.
| 11:56 pm on Sep 23, 2005 (gmt 0)|
I'm sitting going through my stats for the 23rd / 22nd / 21st, and one thing that's hit me is that there are obviously huge Google referrer drops overall, but local Google searches in particular are falling off in a big way.
On Sept 21st, 17 of my top 25 referrers were google.com / .co.uk / .ca etc.
3 were Google-fed SEs.
3 were other sites.
On Sept 23rd, 8 of my top 25 were google.com / .co.uk / .ca etc.
2 were Google-fed SEs.
11 were other sites.
When I see AltaVista in my top 25 referrers I know I'm in the s$*t.
Joking aside, I have always had a lot of local google referrals, has anyone else noticed this?
| 12:26 am on Sep 24, 2005 (gmt 0)|
I know I'm just repeating the experience of most other people here but, what the heck, I'm fed up with it too, so I need a moan.
My site is in a very competitive category (online casinos) but is--IMHO--a genuine diamond in a sea of useless junk, consisting largely of (expensive) paid-for articles.
Where does it rank now? Nowhere. You can search for it by site title and sites that mention mine are found - mine is somewhere on page 4 zillion.
| 12:28 am on Sep 24, 2005 (gmt 0)|
I am glad I still have my day job.
| 12:38 am on Sep 24, 2005 (gmt 0)|
I don't know if it is an update or not, but something is certainly going on. One of the sites I lost in Bourbon is back! Lots of Google referrals for the past couple of days now, compared to the last four months with nearly none. I hope it sticks.
I feel bad for those whose sites are having problems now - that's what happened to me before. But I sure am relieved to have one site back anyway!
| 1:15 am on Sep 24, 2005 (gmt 0)|
>> think it may have been flagged for adding too much content too fast, because I added a site map that brought a lot of forum posts out of supplemental.
Hmmm... we removed a bad robots.txt which had caused a few hundred pages to get de-indexed. I'd guess it is a statistical abnormality, but I'm not sure if that's the reason the site's been knocked down.
Steveb, well spotted. We're back on the front page with the filter=0 param, which might indicate that it's not a temporary glitch. Also, what do you think is causing it?
| 1:22 am on Sep 24, 2005 (gmt 0)|
What's the search format for 'filter=0'?
| 1:26 am on Sep 24, 2005 (gmt 0)|
Just paste &filter=0 after a google search page URL
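A concrete sketch of what that looks like--the query string and the helper function name here are just illustrative:

```javascript
// Append filter=0 to a Google results URL so the omitted-results
// filtering is switched off. A results URL always carries a query
// string already, but handle the bare case too for completeness.
function addFilterParam(url) {
  return url + (url.indexOf('?') === -1 ? '?' : '&') + 'filter=0';
}

// e.g. addFilterParam('http://www.google.com/search?q=widget+reviews')
//   -> 'http://www.google.com/search?q=widget+reviews&filter=0'
```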
| 1:35 am on Sep 24, 2005 (gmt 0)|
I didn't actually see any difference in the SERPs in my category when I added it.
| 1:36 am on Sep 24, 2005 (gmt 0)|
I am having the same issue with what were my best terms: when I add filter=0 my site is where it was; when I don't, it's gone completely. I understand that filter=0 removes the duplicate content filter, but what does that mean?
Does it mean someone is stealing my content, or that Google sees the page as having duplicate content, or that it thinks my site as a whole is duplicate content?
My site is all original content, virtually unchanged over the last year, and the site has been in existence for 4 years.
In addition it has had virtually the same placement with minimal fluctuations over the past 2 years for the search terms in question.
| 1:39 am on Sep 24, 2005 (gmt 0)|
Looks like this is a directory-wide or site-wide filter (not too sure about this). Pages added in the last 3 or 4 days--which never surfaced into the top 30--are in the top 30 with filter=0.
So, I don't think this is a dupe content filter like a couple of people suggested. filter=0 seems to just disable the new thingy that google rolled out.