man.. google's results pages are getting so messy now. local results, sitelinks, shopping results, postcode filters etc. sure.. some of these things can be useful at times, but the results page is beginning to look like yahoo's homepage :) the relevant natural results are getting buried amongst all this crap - why does every service eventually seem to try and kill itself by adding features and complicating things?
JS_Harris, you are not alone. I have had a 33% decrease in traffic to my index page from Google since the October 31st SNAFU.
i still hope that one day the web results get their own dedicated link for users wishing clean results in the old format.
|Nope, it wasn't a test, whitenight. |
lol, ok, Matt, I'll play along.
Since I'm not into labels, just results...
Praytell, what is it then?
Why, or should i say HOW, is this the 2nd time this "bug" has happened like this?!
Why is the bug still floating around?
Since it has NOT been fixed yet, as promised, when shall we expect it to be fixed?
Care to explain why the "bugged" results still sit so firmly on 220.127.116.11?
Thanks in advance.
|but if there is a penalty and there is no message at WebMaster Tools, how can you know what happened? |
Yeah, I just want to make sure this one gets quoted for emphasis. Are we supposed to consult a psychic when we experience a sudden large loss of traffic for no apparent reason on a pure white hat site? I mean, I'd gladly correct whatever rule I've apparently broken if I had a single solitary clue what it was.
No change since the random 70-80% traffic loss I got a month ago after I made the apparently horrific mistake of cleaning up my template and cleaning out unused entries in my css, but leaving the content alone. I'm seriously reluctant to even do updates on it anymore because I have no idea if I'm making it worse when I do or not. All I can really do is hope it's one of these "tests" and that it's going to end at some point. It's like the hand of God just plucked my site up and threw it into the abyss without explanation.
|i still hope that one day the web results get their own dedicated link for users wishing clean results in the old format. |
With Sponsored Results on the right (possibly the top, but coloured) for monetisation.
MattC, since you're here?
Now whitenight, I am nowhere near as well informed as you, but could this not be an accidental sharing of datasets that should have sat ring-fenced on 18.104.22.168? Your uncanny predictions of this 'event' notwithstanding.
edit for spelling
[edited by: Shaddows at 2:11 pm (utc) on Nov. 3, 2008]
|Now whitenight, I am nowhere near as well informed as you, but could this not be an accidental sharing of datasets that should have sat ring-fenced on 22.214.171.124? Your uncanny predictions of this 'event' notwithstanding. |
Sure. Maybe it wasn't supposed to "roll-out" but it's quite obvious they are "testing" something on that dataset.
Otherwise, why have it exist?
I don't necessarily buy that argument, but it's plausible.
It still begs the questions I posed to Matt.
It begs the question, "What the heck is going on at the 'plex that this has happened twice now?"
Sorry, but if we GIVE Goog credit for KNOWING what they are doing, then this event gives away a lot of valuable information that they wouldn't want webmasters to reverse-engineer.
If they DON'T know what they are doing, then.. umm.. hello. <begin and end Google Noise rant here>
So I'm left with 2 arguments. The first makes more sense in MY mind.
It's a case of "Ignore the man behind the curtains".
At this point, it doesn't matter too much to me.
The "bug" gave me more than enough info to use advantageously.
I hope others think carefully and logically about the hows and whys of this event and make their own conclusions.
There is a 3rd argument
The 3rd argument is that they aren't "testing" and it's not a "bug"
They are FLUSHING something. So again, what are they "flushing"?
What can we learn from it?
I'd be happy with the results I'm seeing on 126.96.36.199, with the exception of the UK and AU sites littering the top 10. Some of the sites I know to be doing some blackhat linking are gone, or have had their links devalued. Without spending hours searching this DC, I don't see the problem with the data on it...
Flushing I like. It felt like an iterative rebuild with slightly different tweaks, then the clever stuff (filters and the sitelinks algo, for example) bolted on near the end, with more iterations.
I confess to not sitting up for your 'golden goody' moment, nor having the tools to gain those goodies. However, the results evolved from ugly to nearly normal with some noticeable re-evaluations (not to say penalties), indicating something significant had been devalued.
Reading others' experiences and observations, I guess social networking and off-topic IBLs have been devalued, though that's a guess as they do not feature heavily in my backlink profile, and with few exceptions I've rather benefited from this. (Reasonably successful/respected ecommerce site with value added rather than simple product listings)
188.8.131.52 gives very weird results for a certain kw from here.
Looks like having the kw three times in the title is good?
>They are FLUSHING something. So again, what are they "flushing"?
>What can we learn from it?
Following this logic, I can guess (I am a computer programmer) that they are calculating PR (real PageRank, I mean the real equation) online. Making an algo do this online over a dataset of many GB of indexes is not so simple. Google also uses cloud computing, based on many virtual "dedicated nodes" and many database clusters, so the "flushing", or as I am guessing, let's say the "current dataset version", needs to be up to date all the time, which is near impossible.
Then, as you introduce new pages, or new inbound links appear, or the opposite (not-found pages stay cached, static unmodified content is kept, links go stagnant), Google needs to crawl, compute, and propagate this data across all of Goog's servers again.
Titles, DMOZ and directory references, and other factors are giving a "penalty" feeling, but it could just be part of Goog's own growing pains. My PageRank is showing as not yet assigned (it is taking 5 up to 7 days to propagate), and Sitemap structures are being observed too; I think that deleting your sitemap, if it is not needed, could help old internal index caches expire and recycle faster.
Well, we will need a couple of months to have a clearer vision of all this stuff. I started tracking the traffic loss from my site a month ago, and it seems that any changes I made are taking weeks to propagate into the Google SERPs.
Just had a thought. What if Google deliberately excluded homepages (the first big indication that something was up) and rebuilt from there, seeing how homepages reflected the ACTUAL content of the site and not what the owner was pushing?
It doesn't fit in with my other line of thought, but I just can't get my head around that early disappearance of homepages with inner pages still there, plus the simultaneous disappearance of sitelinks. My only thought is a forced relevancy check (hence ignoring the 'noisy' or heavily SEO'd front page). Anyone else got a theory on the missing homepage phase of the 'event'?
[edited by: Shaddows at 4:41 pm (utc) on Nov. 3, 2008]
|Flushing I like. It felt like an iterative rebuild with slightly different tweaks, then the clever stuff (filters and the sitelinks algo for eg) bolted on near the end, with more iterations. |
....However, the results evolved from ugly to nearly normal with some noticeable re-evaluations (not to say penalties), indicating something significant had been devalued.
Yes, you saw exactly what Cain, I, and some others saw.
Perhaps "test" is the wrong word.
Either way, it looked PURPOSEFUL. Not some bug.
There's something "new" and "big" in this "whatever-happened" that I believe will become more apparent at the beginning of the New Year.
|Following this logic, I can guess (I am a computer programmer) that they are calculating PR (real PageRank, I mean the real equation) online. Making an algo do this online over a dataset of many GB of indexes is not so simple. Google also uses cloud computing, based on many virtual "dedicated nodes" and many database clusters |
Actually, the Jagger and BigDaddy updates were the "updates"/"infrastructure" changes that took care of this process, so it now runs fully through about every 72 hours or so (for "real" PR).
Before then, the "updates" WERE mostly based on what you describe above.
In the case of a bug in the PR calcs (or whatever we call it: TF/IDF, index-time boosting, etc.), it will take a few weeks to fix (MapReduce tasks on over 10k CPUs).
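For context on what "the real equation" being discussed involves: PageRank is defined by an iterative recurrence, so even a toy version needs repeated passes over the whole link graph before the scores settle, which is part of why running it continuously over a web-scale index, rather than in periodic batch recalculations, is so hard. A minimal sketch of the textbook power iteration (the toy graph and function are illustrative only, nothing like Google's actual MapReduce pipeline):

```python
# Textbook PageRank via power iteration on a toy link graph.
# Purely illustrative; not Google's implementation.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}           # start with a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Each page splits its damped rank evenly among its outlinks.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

On a three-page graph this settles in a handful of iterations; at web scale the same recurrence becomes a batch job over billions of nodes, which fits the reports above of recalculations taking days to propagate.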
I don't think it is a bug. It could be a fault of the system, or a kind of new feature which will be self-tuned soon (for instance, by tracking users via the G-Toolbar and/or their clicks on AdWords).
Oh, don't forget about AdWords... G won't change anything if profit increases.
|just can't get my head around that early disappearance of homepages with inner pages still there |
...Anyone else got a theory on the missing homepage phase of the 'event'
That's the Ghost Dataset.
I don't see it as a bad thing those home pages went "missing". I see it as a good thing.
< moved from another location >
This is my first post in some years... excuse my bad English. I own a multidomain project of more than 300 domains, and since 2002 I have worked with the special problems of having a trademark and a multidomain network. My problem since 2003: I want to be listed only 1 time for a search phrase :-)
The changes that happened on Friday 31.10.2008 are not a test. When big changes land in a database (Google's), there is always a new process to bring the new pieces together. If something changes I see it very early, so on Friday morning I saw the changes (the working process of the filters) in the database (Google).
Some of my domains are filtered out; for me that's not bad, because I want to stay in the -30/-50 filter. Crazy, but it is the real thing. In 2003, around 500 of my domains were listed from place 1 to 500 in Google for some search keys before the next competitor was listed. Since then I have had to think about dup content, trust, etc., and the difference between filters and penalties.
On Friday morning they (Google) set the filter(s) on their database, so a lot of projects were filtered out. After some calculating time, the algo, step by step, put a lot of domains back into the SERPs, out of the filter. A lot of projects are currently in this filter (take a look at the datacenter which is showing these effects). A lot of webmasters were lucky and got back into the index, but think about the following:
1. The new index has not rolled out yet
2. A lot of projects were filtered out on Friday in the early steps... they got back into the SERPs, but....
In the next days, when things are clearer, I will tell you the complete story of my knowledge and experience with this filter thing. I will put the story on one of my websites... You will be informed...
[edited by: tedster at 6:32 pm (utc) on Nov. 3, 2008]
|Small Website Guy|
These new results are much better than the results from last week. All of the spam sites are GONE from my corner of the web.
There was one site in particular that was at the very top of the SERPS for a bunch of competitive keywords. I knew there was some dubious-colored hat stuff going on. That site is now nowhere to be seen.
My site is the same place where it's always been--actually, I've moved up a few slots now that the spam stuff is out of there.
For the first time in a long time, I think that Google is finally showing the correct results.
After much thought, it was definitely a 'recent' link de-valuing. Accidental or not is not important.
A site normally at no.1 that I KNOW employs an SEO that cycles links (if paid) round its 300+ sites disappeared off the SERPs.
Old sites, with old stable links, even if not many, jumped up.
One site I had fell to the same position it had with not too many links (before I had started a link campaign).
It MAY have been a test to further the age-old Google campaign to weed out the link farmers, which went too far.
It was such a vast change I cannot believe it was a test. G must be VERY careful about making massive changes too quickly, as they know it affects the lives and livelihoods of so many people, unless it took them by surprise how many links were dynamic.
If so, they learned much from it....
|but I just can't get my head around that early disappearence of homepages with inner pages still there |
IMO with Universal search Google's goal was to give more visibility to lesser seen and fresher content. Basically with an automated algo based upon links you can't have both index pages and lesser linked-to pages ranking well simultaneously without forcing the results. In other words you have to dilute or remove many index pages to get what you want seen or push sites out. The dilution would come by attacking the SEO. If you only push sites to other areas it likely creates a yo-yo effect because the sites don't fit in those areas based upon the automated algo. In effect they bounce.
Even if this isn't the situation, it's completely beyond me why whitenight would assert this is a good thing for people experiencing this problem. If any page has gone untouched for years or months and suddenly it vanishes or has a significant drop in ranking, we can't keep holding Google above reproach.
As I was implying earlier what people are reporting seems too targeted to be a bug or test.
Ummmm ... I am no longer ranking for my terms ... but according to Analytics, I am still getting the same traffic from Google through those keywords.
Is this a SERPS analysing filter? ;)
Yeh, but as someone said earlier, most links are pointed to homepages.
Devaluing of links would hit homepages first
|Reading others' experiences and observations, I guess social networking and off-topic IBLs have been devalued, though that's a guess as they do not feature heavily in my backlink profile, and with few exceptions I've rather benefited from this. (Reasonably successful/respected ecommerce site with value added rather than simple product listings) |
Something else going on - either instead of or in addition to. I have neither social media nor off topic inbound links. Such a devaluation would only help me, not hurt me.
What are you using to check your links?
After closer scrutiny, it seems there were several weeks (~4-5) where link anchor text was losing value, and on or about the 31st of Oct. this trend began to reverse by as much as 20% from its low. Still, anchor text links are not as powerful as they were a month+ ago. At least not yet.
I do not know how many of you look at a large set of keywords across many spectrums, but I do, and I would also point out that the trend of 1000+ keywords did not change abnormally during any period of time. There is a slight increase in rank churn percentages but it's statistically marginal. The biggest churn area, both popping into and out of the top 20 rankings, is wiki (am I allowed to mention wiki? haha).
Strange, I had thought Wiki was doing reasonably well - and others have thought so too (I refer to Wheel on Nov 1st back on page 5). I'm getting frustrated with declaring my lack of broad-spectrum data- I'm going to start monitoring SERPs that have no relevance to me, except for analysis.
gford, are you saying that in all the chaos, trends were not significantly dislocated (to use a stock market term)? Or just that plotting trends today compared to 30th October shows strong correlation?
|After much thought it was definately a 'recent' link de-valuing |
I could not disagree more. I've found that 'stale' links pass less juice, and new ones are showing special powers. A bit like social-media links have done for a while, but more broadly (in keeping with bumping the latest fad, only for it to drop like a stone when the buzz dies, even though the link remains).
wheel, I'm surprised we are suffering different fates. You post a lot in link-building, and your outlook on this area is not dissimilar to mine- and as I say, I've done no worse, and broadly better.
[edited by: tedster at 11:09 pm (utc) on Nov. 3, 2008]
[edit reason] fix formatting [/edit]
Strange one for me. One of my sites has lost 80 or so positions for one particular phrase search, but for all other search positions the site ranks as before. I'm not sure what might have triggered this.
RE: "A set of holiday season penalties and filters just rolled out"?
Sorry, but I can't find exactly what this list of penalties and filters refer to. Is there a list somewhere in this post that explains what the "penalties and filters" are?
bkmed- nope, that was just the first post selected by tedster when he opened a new thread.
Webmasterworld runs a SERPs thread on a monthly basis. This thread will soon be 8 pages long, after only 3 days. It is impractical to keep it as an unbroken thread- it would be stupidly long by now.
To see that post in context, please visit last month's thread:
The (massively oversimplified) context was that SERPs went crazy, which is usually down to filter and/or penalty re-evaluation. The consensus is this was something else. Read the whole thread for full info (over 200 posts now!)
And welcome to the Forums!
|Even if this isn't the situation it's completely beyond me why whitenight would assert this is a good thing for people experiencing this problem. If any page has gone untouched for years or months and suddenly it vanishes or has a significant drop in ranking |
Sorry, we're talking about two different things here.
Shaddows and I were talking about the "authority sites" that had their home pages go missing during that 24 hour period and then regained their rankings at, or better, than before.
If your pages and/or rankings are STILL missing/lowered, then yes, you should do some serious studying of the SERPs and figure out what the problem is.
< continued here: [webmasterworld.com...] >
[edited by: tedster at 9:26 pm (utc) on Nov. 5, 2008]