Now this is interesting.
A site with 40 000 "real" pages, some 80 000 duplicate-content pages already excluded via robots.txt (it's a forum; see my prior posts about vBulletin), and a further 80 000 duplicate pages that are not yet excluded.
Additionally, some 500 000 non-thread pages are also excluded in robots.txt, and most of those are already delisted. The whole site is listed as www; nothing is listed as non-www at all.
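(For anyone following along, that kind of vBulletin exclusion is usually done with path-prefix Disallow rules. A minimal sketch; the paths here are hypothetical examples of typical duplicate views, not the poster's actual file:)

```text
User-agent: *
Disallow: /forum/printthread.php
Disallow: /forum/sendmessage.php
Disallow: /forum/memberlist.php
```

Worth remembering that robots.txt blocks crawling, not indexing, so already-indexed URLs excluded this way can linger in the index as URL-only listings for quite a while.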
Looking purely at indexed threads:
site:domain.com shows 90 000 www pages, all as normal results, including some duplicate content that will eventually be excluded.
site:domain.com -inurl:www shows 24 000 www pages, all of which are marked as Supplemental Results and all of which also have an old cache date. This search should return zero results; it certainly should not be showing www pages at all, since the query was for "-inurl:www". What is going on?
[edited by: tedster at 8:44 pm (utc) on June 13, 2006]
[edit reason] split into new thread [/edit]
What I have been calling the old results have now changed, and in my opinion for the worse. They were bad before, but now paid-for links with "search term" in the visible link text are being rewarded even more. A quick search in my niche shows two result sets, with some slight variability, in the "New Crap" DCs.
What I am calling "New Good Stuff" here is what some of us have been holding our collective breaths for.
New Good Stuff
I still live in hope that the "New Good Stuff" will win out.
[edited by: tedster at 7:29 am (utc) on June 14, 2006]
I totally agree with this conclusion: the new good stuff is the way forward, and hopefully the Big G will realise this. The new good stuff shows a far better return for searches and is far more targeted to what people are looking for. In the words of the Big G, "build it for the customers!" I think G should take a leaf out of their own book and propagate the new good stuff, as that would definitely be built for the customers.
Also, to the question of whether they would prefer showing good results in the natural listings, I think the answer should be YES! If they show good results in the natural listings, it benefits the customer, and customers would be more likely to stay on Google for longer, again increasing revenue and page impressions for the ads.
Just my 2 cents worth anyway.
If that's the shape of things to come...I love it! (at least in my nano-micro-niche)
SteveB, FWIW, I think you're on target as usual. G seems to have a standard practice of testing by implementation, rather than test - THEN implement.
We stand in the wading-pool filled with gasoline and they throw matches. Sometimes we get burned, sometimes we don't. There's just no telling what those wacky fellows in the Gplex will do next.
site:domain.com -inurl:www returns several thousand www pages marked as Supplemental with old cache dates (without the -inurl:www operator you get 100 000 normal www pages listed).
site:domain.com -inurl:forum returns thousands of forum (folder name) www pages marked as Supplemental, all with old cache dates.
site:domain.com -inurl:www -inurl:forum is just as bad, if not worse.
site:domain.com inurl:www -inurl:forum shows loads of URL-only non-forum URLs (which are nonetheless www URLs) that have been excluded by robots.txt for the last 6 months.
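(Side note for anyone auditing their own site the same way: before blaming the index, it's worth double-checking which URLs your robots.txt actually blocks. A small sketch using Python's standard-library robots.txt parser; the rules and URLs below are hypothetical stand-ins for the vBulletin-style exclusions discussed in this thread:)

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the kind of duplicate-page
# exclusions discussed above; a real file's paths will differ.
rules = """\
User-agent: *
Disallow: /forum/printthread.php
Disallow: /forum/memberlist.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A blocked duplicate view vs. an allowed thread page.
print(rp.can_fetch("*", "http://www.example.com/forum/printthread.php?t=123"))  # False
print(rp.can_fetch("*", "http://www.example.com/forum/showthread.php?t=123"))   # True
```

A URL that the parser reports as blocked can still show up as a URL-only listing, since Google knew about it from links even though it never crawled the page; that is consistent with what's described above.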
I've just been back to see what happened overnight, and 9 results that I reported as new good stuff have now become new crap, while 9 of the new crap have become new good stuff.
The IPs therefore seem meaningless to me in terms of watching for a migration. Are they switching machines with different algos/indexes on them, or are they simply messing around with their DNS servers and allocating different IPs to different machines?
Do they even realise that they have more than one data set?
Life's a lottery and I'm wondering if I've lost my ticket!
[edited by: tedster at 7:34 am (utc) on June 14, 2006]
I saw the exact same thing. It seems they are changing things over on a minute-by-minute basis. I don't think they know they have 2 sets of SERP results out there. What happened to that Google liaison guy who was hired to liaise with us webmasters and Google? Has anyone seen him in here recently? Maybe he can shed some light on this topic. Is it Andrew, or Alan, or something like that?
"if by update you mean 9 months of turmoil and constant churn then yes we are in the middle of it...but then to say its the middle means you know where the end is..do you?"
IMO, and I said it before, from now on Google's SERPs will be in continuous dynamic movement. You may call it For-Ever-Flux. However, there will of course be some short periods with relatively stable SERPs, IMO again :-)
"IMO, and I said it before, from now on Google's SERPs will be in continuous dynamic movement."
Websites will always shuffle regularly as certain factors change but an update is still definitely required...
As far as I am aware there has not been one since November.
There are so many webmasters now with websites under 12 months old that rank everywhere bar Google; in my opinion that's not a coincidence, and they are all awaiting an update.
We're not talking about a small drop in positions on Google: most sites aren't even recognised in the top 900 for competitive phrases, which just can't be right at all.
So I'm afraid I don't buy into your continuous dynamic movements theory at all.
I think you're confusing 'updating' the index (as in adding / refreshing data) with 'updating' the calculation / ranking of that index.