Now this is interesting.
A site with 40,000 "real" pages and some 80,000 duplicate-content pages excluded using robots.txt (it's a forum - see my prior posts about vBulletin), and still some 80,000 duplicate pages that are not yet excluded.
Additionally, some 500,000 non-thread pages are also excluded in robots.txt, and most of those are already delisted. The whole site is listed as www; nothing at all is listed as non-www.
Looking purely at indexed threads:
site:domain.com shows 90,000 www pages, all as normal results, including some duplicate content that will eventually be excluded.
site:domain.com -inurl:www shows 24,000 www pages, all of which are marked as Supplemental Results and all of which also have an old cache date. This search should show zero results. It certainly should not be showing www pages at all, since the search was for "-inurl:www". What is going on?
[edited by: tedster at 8:44 pm (utc) on June 13, 2006]
[edit reason] split into new thread [/edit]
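For anyone following along, the exclusion described above can be checked locally before waiting on Googlebot. This is a minimal sketch using Python's standard urllib.robotparser; the paths and domain are hypothetical stand-ins for the kind of duplicate vBulletin views (print pages, archive pages) being blocked, not the poster's actual robots.txt.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules only: block duplicate vBulletin views
# (print and archive versions) while leaving canonical showthread
# pages crawlable.
rules = """User-agent: *
Disallow: /forum/printthread.php
Disallow: /forum/archive/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Canonical thread page: should remain fetchable.
print(rp.can_fetch("Googlebot", "http://www.example.com/forum/showthread.php?t=123"))
# Duplicate print view: should be blocked.
print(rp.can_fetch("Googlebot", "http://www.example.com/forum/printthread.php?t=123"))
```

Of course, robots.txt only controls crawling; URLs already in the index can linger as Supplemental Results (often with an old cache date, as seen above) long after the rule takes effect.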
I am now seeing "New Good Crap Stuff" spread out across all data centers!
The new good stuff just went back to the "good old crap" of several weeks ago on the "New Good Crap" data centers.
I think that we need to give the data sets a name so that we all know what we are talking about.
I suggest that we call the new good stuff "Copra" and the even newer crap "Turd".
Just to be clear, Copra is the set which Petehols and I were hoping would propagate, and Turd is the absolute crap that we noticed yesterday.
Let's face it, it's all crap by different names, and one man's crap is another man's fertilizer :)
The remainder have "Turd" on them.
[edited by: tedster at 7:29 am (utc) on June 14, 2006]
These are tidal waves.
Yes, in my little corner they are big changes, the biggest I’ve seen in 2 years, but they are not necessarily bad changes. Many of the old ‘part of the furniture’ sites have been uprooted and shuffled for the first time.
Age loses way to youth and beauty.
Also, to this point, BD had opened the gates to pro spamming that was previously under control...
Last night 3 more servers "defected" from Copra to Turd. None switched from Turd to Copra.
Copra - 26
Turd - 44
Additionally, the results (for my two word phrase) in both the Turd and Copra groups had a spread of 3-5 points between the individual servers. This morning, all the servers I looked at in the Turd group, 18 out of 44, are EXACTLY synchronized. I haven't got time to check every single one, but I'm betting that they're all like that. The Copra group still shows a spread of 3 points.
The only conclusions I draw from this are that A) whether they know it or not, G does have two distinct sets of results, which are based on two distinct algos or filter combinations. B) Speaking strictly from my own perspective, the Turd results are closer to what I was seeing before all this BS started than the Copra results are. C) The Turd results MAY be gelling into what we will have to work with in the immediate future. D) Steveb called it correctly. (paraphrase) They tried a new formula and it blew up in their faces. Now they are doing damage control and trying to recover.
[edited by: tedster at 7:30 am (utc) on June 14, 2006]
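The spread check described above can be sketched in a few lines: record the rank of one phrase on each datacenter IP, then call a group "synchronized" when its spread (highest rank minus lowest rank) is zero. The IPs and positions below are made-up illustrations, not real measurements.

```python
def spread(positions):
    """Max difference in rank for one phrase across a group of datacenters."""
    positions = list(positions)
    return max(positions) - min(positions)

# Hypothetical per-datacenter ranks for a single two-word phrase.
copra = {"216.239.x.1": 12, "216.239.x.2": 14, "216.239.x.3": 15}
turd = {"64.233.x.1": 9, "64.233.x.2": 9, "64.233.x.3": 9}

print(spread(copra.values()))        # 3-point spread: still varying
print(spread(turd.values()) == 0)    # True: exactly synchronized
```

A spread collapsing to zero across a whole group, as reported for the Turd servers, is what suggests that set of results has finished propagating.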
Is it now just a case of them being the biggest, so in effect they can do what they like? A "who cares if the SERPs are relevant as long as those AdWords get purchased" kind of attitude.
I concur with seeing the same data movement, and boy is it ugly!
I can also add that about half an hour ago they were displaying Google AdWords above the SERP results and under them (rather than down the sides), i.e. similar to Yahoo. So position 11 in the natural SERPs would become position 16, and position 21 would be about 32, etc. That was even worse, if that's possible.
All in all, imo Google have gone from previously being a quality provider of search to a greed-driven public company with no vision other than how to reduce SERP quality in order to try and increase AdWords spend, and frankly I think they have pushed the wire so far now that they are no longer a relevant search engine.
How long it will take for users to defect is anyone's guess, but I wouldn't want to be holding stock in Google, that's for sure. I think its days are numbered now; the boys were right to sell out when they did, imo.
Let's wait and see how this ends...
Old SERPs with Some Variations
[edited by: tedster at 7:27 am (utc) on June 14, 2006]
Going forward, you should focus on whether the links are relevant and useful to users of the web site, and located in a place where those users will find and use them.
Google is trying to distinguish between links that are only there for SEO purposes and links that are there for other (more "legitimate") reasons.
Lots of discussion of this topic is available elsewhere in WebmasterWorld.
Oh yeah! There are thousands of them. Now that is a programming bug.