Forum Moderators: Robert Charlton & goodroi
A set of holiday season penalties and filters just rolled out.
[edited by: tedster at 3:21 am (utc) on Nov. 1, 2008]
Nope, it wasn't a test, whitenight.
lol, ok, Matt, I'll play along.
Since I'm not into labels, just results...
Praytell, what is it then?
Why, or should I say HOW, is this the 2nd time this "bug" has happened like this?!
Why is the bug still floating around?
Since it has NOT been fixed yet, as promised, when shall we expect it to be fixed?
Care to explain why the "bugged" results still sit so firmly on 216.239.59.104?
Thanks in advance.
But if there is a penalty and there is no message in Webmaster Tools, how can you know what happened?
Yeah, I just want to make sure this one gets quoted for emphasis. Are we supposed to consult a psychic when we experience a sudden large loss of traffic for no apparent reason on a pure white hat site? I mean, I'd gladly correct whatever rule I've apparently broken if I had a single solitary clue what it was.
No change since the random 70-80% traffic loss I got a month ago after I made the apparently horrific mistake of cleaning up my template and cleaning out unused entries in my css, but leaving the content alone. I'm seriously reluctant to even do updates on it anymore because I have no idea if I'm making it worse when I do or not. All I can really do is hope it's one of these "tests" and that it's going to end at some point. It's like the hand of God just plucked my site up and threw it into the abyss without explanation.
I still hope that one day the web results get their own dedicated link for users wishing clean results in the old format.
MattC, since you're here?
Now whitenight, I am nowhere near as well informed as you, but could this not be an accidental sharing of datasets that should have sat ring-fenced on 216.239.59.104? Your uncanny predictions of this 'event' notwithstanding
edit for spelling
[edited by: Shaddows at 2:11 pm (utc) on Nov. 3, 2008]
Now whitenight, I am nowhere near as well informed as you, but could this not be an accidental sharing of datasets that should have sat ring-fenced on 216.239.59.104? Your uncanny predictions of this 'event' notwithstanding
Sure. Maybe it wasn't supposed to "roll-out" but it's quite obvious they are "testing" something on that dataset.
Otherwise, why have it exist?
I don't necessarily buy that argument, but it's plausible.
It still begs the questions I posed to Matt.
OR
It begs the question, "What the heck is going on at the 'plex that this has happened twice now?"
Sorry, but in GIVING Goog credit for KNOWING what they are doing, it gives away a lot of valuable information that they wouldn't want webmasters to reverse-engineer.
If they DON'T know what they are doing, then.. umm.. hello. <begin and end Google Noise rant here>
So I'm left with 2 arguments. The first makes more sense in MY mind.
It's a case of "Ignore the man behind the curtains".
At this point, it doesn't matter too much to me.
The "bug" gave me more than enough info to use advantageously.
I hope others think carefully and logically about the hows and whys of this event and make their own conclusions.
I confess to not sitting up for your 'golden goody' moment, nor having the tools to gain those goodies. However, the results evolved from ugly to nearly normal with some noticeable re-evaluations (not to say penalties), indicating something significant had been devalued.
Reading others' experiences and observations, I guess social networking and off-topic IBLs have been devalued, though that's a guess as they do not feature heavily in my backlink profile, and with few exceptions I've rather benefited from this. (Reasonably successful/respected ecommerce site with value added rather than simple product listings)
My guess, following this logic (I'm a programmer), is that they are calculating PR (real PageRank, I mean the actual equation) online. Making an algo do this live over a dataset of many GB of indexes is not so simple. Google also uses cloud computing based on many virtual "dedicated nodes" and many database clusters, so the "flushing", or as I'm guessing the "current dataset version", needs to be up to date all the time, which is nearly impossible.
Then, as you introduce new pages, or new inbound links appear, or the opposite happens (not-found pages stay cached, static unmodified content is kept, links go stagnant), Google needs to crawl, compute, and network-share all over again, propagating that data across all Goog servers.
Title, DMOZ and directory references, and other factors are giving a "penalty" feeling, but it could just be part of Goog's own growing pains. My PageRank is showing Not Yet Assigned (it's taking 5 to 7 days to propagate), and sitemap structures are being observed too. I think deleting your sitemap, if it isn't needed, could help old internal index caches expire and recycle faster.
We'll need a couple of months to have a clearer vision of all this stuff. I started tracking the traffic loss from my site a month ago, and it seems that any changes I make take weeks to propagate in the Google SERPs.
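For anyone who hasn't seen "the real equation", the computation being described is essentially PageRank's power iteration, which is expensive at web scale precisely because every pass touches every link in the graph. A minimal toy sketch (the graph, damping factor, and iteration count here are illustrative assumptions, not anything Google has published about its production setup):

```python
# Toy power-iteration PageRank. Illustrates why a full online recompute is
# costly: every single iteration walks the entire link graph.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # each page shares its damped rank equally among its outlinks
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # dangling page: spread its rank evenly over all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical three-page graph
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

On the full web graph that inner loop runs over billions of edges per iteration, which is why a continuously fresh "current dataset version" is so hard to maintain.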
It doesn't fit in with my other line of thought, but I just can't get my head around that early disappearance of homepages with inner pages still there, plus the simultaneous disappearance of sitelinks. My only thought is a forced relevancy check (hence ignoring the 'noisy' or heavily SEO'd front page). Anyone else got a theory on the missing-homepage phase of the 'event'?
[edited by: Shaddows at 4:41 pm (utc) on Nov. 3, 2008]
Flushing I like. It felt like an iterative rebuild with slightly different tweaks, then the clever stuff (filters and the sitelinks algo, for example) bolted on near the end, with more iterations... However, the results evolved from ugly to nearly normal with some noticeable re-evaluations (not to say penalties), indicating something significant had been devalued.
Yes, you saw exactly what Cain, I, and some others saw.
Perhaps "test" is the wrong word.
"Rebuild?" "Flush?"
Either way, it looked PURPOSEFUL. Not some bug.
Agreed?
There's something "new" and "big" in this "whatever-happened" that I believe will become more apparent at the beginning of the New Year.
My guess, following this logic (I'm a programmer), is that they are calculating PR (real PageRank, I mean the actual equation) online. Making an algo do this live over a dataset of many GB of indexes is not so simple. Google also uses cloud computing based on many virtual "dedicated nodes" and many database clusters.
Actually, the Jagger and BigDaddy updates were the "updates"/"infrastructure" changes that took care of this process, so it now runs fully through about every 72 hours or so (for "real" PR).
Before then, the "updates" WERE mostly based on what you describe above.
Oh, don't forget about AdWords... G won't change anything if profit increases.
Hi,
this is my first post in some years... Excuse my bad English. I own a multidomain project of over 300 domains, and since 2002 I have worked with the special problems of having a trademark and a multidomain network. My problem since 2003: I want to be listed only once for a search phrase :-)
The changes that happened on Friday, 31.10.2008, are not a test. When big changes hit a database (Google's), it is always a new process to bring the new pieces together. When something changes I see it very early, so on Friday morning I saw the changes (the filters' working process) in the database.
Some of my domains are filtered out, which for me is not bad, because I want to stay in the -30/-50 filter. Crazy, but that is the reality. In 2003, about 500 of my domains were listed from place 1 to 500 in Google for some search keys before the next competitor appeared. Since then I have had to think about duplicate content, trust, etc., and the difference between filters and a penalty.
On Friday morning Google set the filter(s) on their database, so a lot of projects were filtered out. After some calculating time the algo, step by step, put a lot of domains out of the filter and back into the SERPs. A lot of projects are still in this filter (take a look at the datacenter that is showing these effects). Many webmasters were lucky to get back into the index, but think about the following:
---------------------------------------
1. The new index has not rolled out yet.
2. A lot of projects were filtered out on Friday in the early steps... they got back into the SERPs, but...
---------------------------------------
In the next days, when things are clearer, I will tell you the complete story of my knowledge and experience with this filter thing. I will put the story on one of my websites... You will be informed...
[edited by: tedster at 6:32 pm (utc) on Nov. 3, 2008]
There was one site in particular that was at the very top of the SERPS for a bunch of competitive keywords. I knew there was some dubious-colored hat stuff going on. That site is now nowhere to be seen.
My site is the same place where it's always been--actually, I've moved up a few slots now that the spam stuff is out of there.
For the first time in a long time, I think that Google is finally showing the correct results.
It was such a vast change that I cannot believe it was a test. G must be VERY careful about making massive changes too quickly, as they know it affects the lives and livelihoods of so many people, unless it took them by surprise how many links were dynamic.
If so, they learned much from it....
but I just can't get my head around that early disappearence of homepages with inner pages still there
IMO with Universal search Google’s goal was to give more visibility to lesser seen and fresher content. Basically with an automated algo based upon links you can’t have both index pages and lesser linked to pages ranking well simultaneously without forcing the results. In other words you have to dilute or remove many index pages to get what you want seen or push sites out. The dilution would come by attacking the SEO. If you only push sites to other areas it likely creates a yo-yo effect because the sites don’t fit in those areas based upon the automated algo. In effect they bounce.
Even if this isn't the situation, it's completely beyond me why whitenight would assert this is a good thing for people experiencing this problem. If any page has gone untouched for months or years and suddenly it vanishes or takes a significant drop in ranking, we can't keep holding Google above reproach.
As I was implying earlier what people are reporting seems too targeted to be a bug or test.
Reading others' experiences and observations, I guess social networking and off-topic IBLs have been devalued, though that's a guess as they do not feature heavily in my backlink profile, and with few exceptions I've rather benefited from this. (Reasonably successful/respected ecommerce site with value added rather than simple product listings)
I do not know how many of you look at a large set of keywords across many spectrums, but I do, and I would point out that the trend across 1000+ keywords did not change abnormally during any period of time. There is a slight increase in rank churn percentages, but it's statistically marginal. The biggest churn area I see, both popping into and out of the top 20 rankings, is wiki (am I allowed to mention wiki? haha).
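For anyone wanting to measure that kind of churn on their own keyword set, one simple approach is to compare two ranking snapshots and count how many keywords crossed the top-20 boundary. A sketch (the keyword names, positions, and the 20-position cutoff are all made up for illustration; this is not gford's actual method):

```python
def churn_pct(before, after, cutoff=20):
    """Percentage of tracked keywords that crossed the top-`cutoff`
    boundary between two rank snapshots (dicts of keyword -> position).
    A missing keyword is treated as ranking far outside the cutoff."""
    keywords = set(before) | set(after)
    crossed = 0
    for kw in keywords:
        was_in = before.get(kw, 999) <= cutoff
        is_in = after.get(kw, 999) <= cutoff
        if was_in != is_in:  # popped into or out of the top `cutoff`
            crossed += 1
    return 100.0 * crossed / len(keywords) if keywords else 0.0

# Hypothetical snapshots for four tracked keywords
oct_30 = {"blue widgets": 3, "red widgets": 25, "widget shop": 18, "buy widgets": 40}
nov_03 = {"blue widgets": 5, "red widgets": 12, "widget shop": 35, "buy widgets": 41}
print(churn_pct(oct_30, nov_03))  # -> 50.0 (two of four crossed the top-20 line)
```

Tracking this number daily over a large keyword set is what lets you say whether an "event" like this one actually moved the needle or was statistically marginal.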
gford, are you saying that in all the chaos, trends were not significantly dislocated (to use a stock market term)? Or just that plotting trends today compared to 30th October shows strong correlation?
After much thought, it was definitely a 'recent' link devaluing.
I could not disagree more. I've found that 'stale' links pass less juice, and new ones are showing special powers. A bit like social-media links have done for a while, but more broadly (in keeping with bumping the latest fad, only for it to drop like a stone when the buzz dies, even though the link remains).
wheel, I'm surprised we are suffering different fates. You post a lot in link-building, and your outlook on this area is not dissimilar to mine- and as I say, I've done no worse, and broadly better.
[edited by: tedster at 11:09 pm (utc) on Nov. 3, 2008]
[edit reason] fix formatting [/edit]
Webmasterworld runs a SERPs thread on a monthly basis. This thread will soon be 8 pages long, after only 3 days. It is impractical to keep it as an unbroken thread- it would be stupidly long by now.
To see that post in context, please visit last month's thread:
[webmasterworld.com...]
The (massively oversimplified) context was that SERPs went crazy, which is usually down to filter and/or penalty re-evaluation. The consensus is this was something else. Read the whole thread for full info (over 200 posts now!)
And welcome to the Forums!
Even if this isn't the situation it's completely beyond me why whitenight would assert this is a good thing for people experiencing this problem. If any page has gone untouched for years or months and suddenly it vanishes or has a significant drop in ranking
Outland,
Sorry. talking two different things here.
Shaddows and I were talking about the "authority sites" that had their home pages go missing during that 24 hour period and then regained their rankings at, or better, than before.
If your pages and/or rankings are STILL missing/lowered, then yes, you should do some serious studying of the SERPs and figure out what the problem is.
< continued here: [webmasterworld.com...] >
[edited by: tedster at 9:26 pm (utc) on Nov. 5, 2008]