Welcome to WebmasterWorld Guest from 126.96.36.199
does anyone know if 188.8.131.52 is being used as a test server?
I seem to see my site at #9 across ALL datacenters except for this one, which has me fluctuating between #1 and #4.
[edited by: tedster at 11:16 pm (utc) on Oct. 3, 2006]
Thanks for the info. Does that mean no more data refreshes until the new year?
IMO, data refreshes will continue to occur once a month or so.
However, you may wish to read this part of what Matt Cutts wrote today:
I know that webmasters are especially sensitive to quality/webspam/ranking changes in Q4 because of the holiday season. If we've got something that evaluates well and that we think will improve quality, we can't just pause for 1/4th of the year, but if anything big launches I'll try to be available to answer questions and help get a handle on any changes. (Right now I'm not expecting radical changes in webspam ranking, but I know better than to make a promise.)
Following up on the TrustRank discussion.
It looks like the old-infrastructure datacenters (for example 64.233.183.*) are no longer indexing new websites, and they didn't pick up the PR change from a few weeks ago.
I had some major PR improvements in the last update, but these are not visible on those datacentres. Google News indexing from those datacenters is also not showing newly added sites on the Google News site, etc.
Why is Google no longer synchronising this infrastructure? Or am I totally wrong?
Datacentres that had the new infrastructure and results on them were being updated regularly, but the last few datacentres holding data that was going to be phased out were updated less regularly - until one day their data was completely swapped over to the new stuff. I guess, why waste resources on updating something that you know is going to be scrapped in just a few days' time?
But do a PR check on all the datacenters and see for yourself. Anyway, for some countries (NL/BE) these datacentres hold the main datasets.
So you can understand that not updating the web and Google News indexes for new websites is not really a nice thing. And probably other countries' datasets have this problem too.
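A rough sketch of what "do a PR check / result check on all the datacenters" amounts to in practice: query each datacenter IP directly and compare what each one reports. The IP list and the stubbed fetcher below are illustrative assumptions, not a real scraper; a real one would parse the "Results 1 - 10 of about N" line from each response.

```python
from urllib.parse import quote

# Hypothetical list of datacenter IPs from the era discussed in the thread.
DATACENTERS = ["64.233.183.104", "64.233.183.147", "216.239.59.104"]

def count_query_url(dc_ip: str, domain: str) -> str:
    """Build a site: query aimed at one specific datacenter IP."""
    return f"http://{dc_ip}/search?q={quote('site:' + domain)}"

def compare_datacenters(domain: str, fetch_count) -> dict:
    """Map each DC IP to the result count reported by fetch_count(url)."""
    return {ip: fetch_count(count_query_url(ip, domain)) for ip in DATACENTERS}

# With a stubbed fetcher standing in for real HTTP requests:
counts = compare_datacenters("example.com", lambda url: 160)
print(counts)
```

If the counts diverge wildly between IPs, you are likely looking at different (old vs. new infrastructure) datasets.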
>> - For a brief while last week, site: only returned three results from a host. Someone mentioned it to me by email, but the first web report I saw was by DaveN on Friday (there's your link, Dave). Fixed/working by the end of that day, I think. It was related to a binary/executable that was going out, but a different binary than the one mentioned above. <<
Matt Cutts says he first noticed the shrinking site: results on the 5th. That's several days after it was first posted in here.
Yes. I was surprised to read that. And that indicates again that Matt hasn't been well informed :-)
Maybe he needs to visit WebmasterWorld Forum 30 at least once each morning and each evening. His info would always be up to date, for sure ;-)
A few days ago, I thought that the Mother of All DCs liked my keyphrases. Today I see that the lady has changed her mind.
Oh well..... women and DCs (:(
Matt's forecast just makes me shake my head:
"It brought smarter Googlebot crawling, including tricks like full gzip support and a crawl caching proxy that means less bandwidth usage for site owners."
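For what it's worth, the bandwidth claim in that quote is easy to illustrate: serving HTML with gzip compression typically cuts the bytes transferred to a crawler by a large factor, since markup is highly repetitive. A minimal sketch with Python's standard `gzip` module (the sample page body is invented for illustration):

```python
import gzip

# A fake, repetitive HTML page standing in for a typical crawled document.
html = (b"<html><head><title>Example</title></head><body>"
        + b"<p>Repetitive page content.</p>" * 200
        + b"</body></html>")

compressed = gzip.compress(html)

# The round trip must be lossless, and the compressed form much smaller.
assert gzip.decompress(compressed) == html
print(f"{len(html)} bytes -> {len(compressed)} bytes "
      f"({len(compressed) / len(html):.1%} of original)")
```

Whether that saving matters more to webmasters than indexing coverage is, of course, exactly what the next post disputes.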
Seriously, the new googlebot crawling is pathetic, and nearly infinitely stupider, if only because maybe 1% of webmasters care about "less bandwidth" rather than "index my pages you inept search engine".
Sadly, Google really still seems awfully clueless about their key problems... and that's before even mentioning their idiotic "index every blog and freehost page we can find" priority.
Forecast = continuing problems with no fix in sight this year
>> Forecast = continuing problems with no fix in sight this year <<
The world of search engines and the way they crawl/index the web changed forever once PPC was introduced to the SERPs. The BigDaddy of all slippery slopes.
Forecast 2006-2007 = Polish and refine your PPC skills. You are going to need them if you want to survive this crap.
The pages were previously normally listed, and now just do not appear in the SERPs at all.
It is NOT due to a change in Supplemental, or "phantom supplemental" URL reporting. Those are still there in the searches that I looked at.
You know how I keep on saying:
>> "Supplemental Results cannot harm you if they are for URLs that are 404, or are for URL that are redirects?" <<
Well scrap that. Now they can. And very badly.
Disclaimer: This is based on looking at only three sites that have lost 20 to 40% of their pages from the index in the last 48 hours (site:domain.com). The effect occurred on all datacentres at the same time. This effect might be temporary. This might all be coincidence. Yada, yada, yada.
You'll recall how I wrote just a few weeks ago about a site that was perfectly indexed.
The site had 160 pages and all were listed as www. Site has all the usual redirects and fixes in place for a long time. There are also 20 pages that recently changed their URL, when they were moved to a different folder.
So as of last week, there were 160 normal results in the index for a site search, and a further 20 Supplemental Results for the 404 pages, when using a &filter=0 search. Omitting the &filter=0 parameter made about 20 normal pages hide behind the "click for omitted results" link.
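For anyone unfamiliar with the technique above: &filter=0 is just an extra query-string parameter appended to a Google search URL, which disables the duplicate/similar-result filtering and so exposes results that would otherwise hide behind the "omitted results" link. A small sketch of building such a URL (the helper name and default host are my own, for illustration):

```python
from urllib.parse import urlencode

def site_search_url(domain: str, unfiltered: bool = True,
                    host: str = "www.google.com") -> str:
    """Build a site: search URL; filter=0 disables result filtering."""
    params = {"q": f"site:{domain}"}
    if unfiltered:
        params["filter"] = "0"
    return f"http://{host}/search?{urlencode(params)}"

print(site_search_url("example.com"))
# http://www.google.com/search?q=site%3Aexample.com&filter=0
```

Comparing the counts with and without filter=0 is how you spot pages hidden as near-duplicates or Supplemental Results.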
Suddenly, all of the NEW URLs for stuff that has moved are NOT indexed at all. So, the site:domain.com search lists 120 pages that have not moved, and 20 Supplemental Results that are for the 404 pages. The 20 new URLs for the stuff that moved are NOT shown in the site search any more.
Moving pages to new URLs has always been a bad idea. If Google is now going to stick with old URLs, and delist the new ones, then it has become a very very bad idea to move pages, unless there is a 301 redirect from the old to the new to capture the traffic.
On this site, the old URLs currently serve a custom 404 page with basic site navigation within it. That is going to get changed tout de suite to individual 301 redirects for the 15 moved pages.
Again, this might only be a temporary effect in the SERPs, but changing the old URLs from returning a 404 to instead return a 301 will at least get visitors who see the old URLs flowing directly to the new URLs right now.
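The fix being described is a simple one-to-one redirect map: each old URL returns a 301 pointing at its new location instead of a 404. A minimal sketch of that logic (the paths are invented for illustration):

```python
# Hypothetical mapping of moved URLs: old path -> new path.
MOVED = {
    "/old-folder/widget.html": "/new-folder/widget.html",
    "/old-folder/gadget.html": "/new-folder/gadget.html",
}

def respond(path: str):
    """Return (status, location): 301 for a moved page, 404 otherwise."""
    if path in MOVED:
        return 301, MOVED[path]
    return 404, None

print(respond("/old-folder/widget.html"))  # (301, '/new-folder/widget.html')
```

On Apache the same thing is typically a list of `Redirect 301 /old-path /new-path` lines in .htaccess, one per moved page.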
There are another 20 URLs affected on that site, and I need to look and see what happened to them, and why too. So, it isn't conclusive that it is just moving pages about that is the problem.
I don't think that is the case here... at least I hope... site: results have been different every day for the past 2 weeks.
One day 70 pages the other day 140 pages, the third 40 pages, the fourth 19000 pages... today 960 pages... I am possibly looking through different data centres each time, but these types of differences don't make sense.
Only the first 20 or so results seem stabilised and proper...
I'd wait another week, or until things have stabilised, before jumping to conclusions... Unless you know something that we don't... ;)
On these domains, if you type a keyword they list almost nothing but articles about the keyword. All the big sites (mostly shops) are goners.
All other datacenters show the "old" SERPS.
Is Google trying to improve clicks on the ads this way?
[184.108.40.206...] - 13.400.000 results
[edited by: Gerwin7 at 1:32 am (utc) on Oct. 15, 2006]
When the results return 25,270,000,000, instead of a more manageable 27,000,000 for a keyword, then I know the SERPS prolly aren't too reliable. ;)
"I see very different SERPS on at least these two datacenters:
I see the same thing on all 64.233.183.x range.
Wonder what it is?
The only thing I can say at the moment is that some of my pages which lost ranking several months ago on competitive keyphrases are back (not as good as they were, but in fair positions) on that set of DCs. Maybe it's a test DC set.
With old PR data, less frequently updated datasets, etc., I think, and surely hope, this is just old infrastructure. The only problem is that a lot of people connect to these DCs and also get some strange results, as a lot of not-so-relevant sites appear on top.