I've lost faith in Google, and I think a lot of people are getting very frustrated.
"But it seems you are of the opinion that Big Daddy has done what it has intended to do and the serps are showing these changes - just dont agree that the serps are reflecting the changes Google are introducing yet - that is all. ;) "
IMO, we need to look at BigDaddy as a development project divided into stages with (somewhat) flexible time limits, which might continue throughout 2006. Later stages might be given different names than BigDaddy.
As such, and as seen by the folks at the plex, BigDaddy is at the stage they have decided it should be in. And we don't really know the details of each stage, beyond what Matt tells us in his "coded" language ;-)
However, we do observe the changes and bugs during the BigDaddy development process. And nobody dares to give those changes a name!
It will help us understand BigDaddy much better if we forget, for the moment, the classical updates and how they were conducted and announced.
The more I read what Matt has written, the more I realize he might be telling us that there will be no more "classical" distinct updates in the future, or at least not during 2006 - i.e. nothing like Allegra or Jagger, for example - but to expect smooth transitions from one stage to another. Let's call them White Revolutions ;-)
Having said that, I expect PR and backlinks updates during 2006 to continue in the same manner as before.
In one topic I have been working on recently, I see a lot of junk mixed in with some good stuff. But this isn't new.
I once posted on WW about college syllabi coming up for many terms and phrases. I asked: if I am searching for "quick dinner recipes beef", why would I want to see a cooking class syllabus from some community college? Some wise guys here thought that if I needed to search for recipes, maybe I needed a cooking class. Sorry - it's not relevant! It's not like I am searching for "cooking" or another vague term. We're talking about some very specific 3- and 4-word phrases and still getting crap.
I see links pages (ack!), pages that mention the phrase once (usually in a nav menu), and worse yet - pages that don't even contain the keyphrase. Example: I will search for some resource in a particular region and another resource from a different region will come up. #*$!? OK, so the resource and region I searched for ARE mentioned on the website somewhere, and the two resources are somewhat related, and if you like the one region you might like the other, but it was NOT what I was looking for.
Most of these crappy-result websites are huge, and the clout of the domain is supporting the somewhat irrelevant results. Same idea with the college syllabi. The university and/or college domain has a certain "value", and that pushes their pages higher in the SERPs than more relevant content from lesser-known, smaller, or newer domains.
Sometimes it's almost as though, if for whatever reason they have to remove or turn down filters for a period, they turn up the authority score to compensate. So you get huge sites that may give a clean look to the SERPs but are pretty lightweight for specific terms, along with garbage.
"BigDaddy is at the stage that they have decided it to be in."
Yeah, I'm sure admitting there is an error is the position they decided to be in.
Then stating results would be coming back within a few days. Well, it's been a few days and it's getting worse for many people.
Real great stage for all of us. Completely ridiculous.
"Yeah I'm sure admitting their is an error is the position they decided to be in."
"Errors" happen all the time in any developement project. I guess you are referring to the supplemental issue, right?
IMO, it's only the PR that the supplemental "error" has generated that the folks at the plex are concerned about - not the supplemental issue by itself.
I'd like to see a kind fellow WebmasterWorld member with a software engineering background visit this thread and explain a little about how development projects are conducted in practice.
That would surely help us understand BigDaddy much better.
THANK YOU GOOGLE!
I did some major 301'ing a while back in order to clean up my navigation. This sent my website into supplemental oblivion. I sat back and waited patiently for almost 9 months. During this time I kept building more and more content. I was frustrated by MFA and just plain crap sites outranking my site for everything. I came here and complained. I wondered why Google treats these garbage refuse sites so well and a real effort at quality and content (instead of blatant keyword stuffing) like crap. I was even considering starting my own mass-generated MFA sites and giving up on content sites.
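For anyone planning similar navigation cleanup, the "301'ing" mentioned above is usually done server-side. A minimal sketch for Apache's .htaccess, assuming mod_rewrite and mod_alias are available - the paths here are made up purely for illustration:

```apache
# Hypothetical example: permanently redirect an old URL structure to the
# cleaned-up one so engines transfer the old pages' credit to the new URLs.
RewriteEngine On
RewriteRule ^old-nav/(.*)$ /new-nav/$1 [R=301,L]

# Or, for a single moved page:
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

The key point is using 301 (permanent) rather than 302 (temporary), since a 302 tells the engine the old URL is still canonical - which is exactly the kind of canonical confusion this thread keeps coming back to.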
BigDaddy has completely changed my outlook. I am now ranking #1 for hundreds of keywords. I never imagined it could be like this. People looking for obvious things are now most likely going to come to my site first. Now I feel it is my responsibility to make these pages even better, so that I deserve these top spots.
Oh, and thanks to WW for putting up with my less than sunny outlook on website development.
See you guys next month when I fall back into supplemental hell! Better to have ranked #1 and lost than to have never ranked at all.
If I were Google, I would forget trying to mess with the SERPs, trying to get the perfect set. Just let it go, guys, and carry on with the innovation. To be honest, you're a lot better at coming up with good ideas for the future than you are at supplying good-quality search results.
IMHO leave it to Yahoo & MSN to fight over search engine stuff ... it's old hat now anyway. In a couple of years we'll all have suits fitted with GPS / Service Shoppers anyway ... so forget the internet altogether.
he he he ... I love to love
"Errors" happen all the time in any developement project. I guess you are referring to the supplemental issue, right?"
Problems happen. That's fine, but at least let people know what the situation is. GG has posted twice saying they "THINK" they know what the problem is and that sites should be returning to normal. It's quite obvious they don't fully understand the error if they are asking people to email them with specific sites. Here it is, days later, and it's getting worse while you don't hear anything from them. Oh, I forgot, Matt Cutts posted on his blog:
"Gary and Rahul #26: Iím looking into it. Lots of people have come back, and Iím asking someone to read the stillsupplemental emails from WebmasterWorld."
Gee, thanks for finally getting to our emails. Maybe they will fix the problem sometime in 2006.
> gradual continuous updating process
As a matter of fact, the PageRank algorithm implies that ALL websites indexed by Google are given their PageRank by calculating the value of ALL pages linking to them, in an iterative process of presumably around 100 loops, which requires an enormous amount of CPU load.
Because even on a cluster of several tens of thousands of PCs this would take days if not weeks, it became common to talk about "updates" whenever Google seemed to have performed this calculation, roughly once a month, until (dunno) 2004 or so. That is what the word "update" was used for. To talk about a 'continuous update process' is somewhat of a contradictio in adjecto and not very helpful.
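For readers who haven't seen it spelled out, the iterative calculation described above can be sketched in a few lines. This is the textbook power-iteration form of PageRank; the damping factor of 0.85, the 100 iterations, and the toy three-page graph are illustrative assumptions, not Google's actual parameters:

```python
# Minimal PageRank sketch: repeatedly redistribute each page's rank to the
# pages it links to, damped by a "random jump" factor.

def pagerank(links, damping=0.85, iterations=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with a uniform rank
    for _ in range(iterations):
        # Every page gets the baseline "random surfer" share.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # A page's rank is split evenly among its outgoing links.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling pages spread their rank evenly over all pages.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(graph))
```

Even on this toy graph the point about cost is visible: each iteration touches every link in the index, so at web scale the full recalculation was a massive batch job - which is exactly why it used to surface as a visible monthly "update."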
It is quite likely that this PageRank calculation does not play the same role in page evaluation as it did before. It has presumably been elaborated on or substituted by much more sophisticated means - e.g. noise analysis on Google engineers shaking their legs with laughter while reading threads on SEO. None of us actually knows at present, and I doubt we will ever know the details again the way many of us did a few years ago. In the past years I have continuously followed Google's law #1 - concentrate on the user (my customers: human beings) - and my visitor statistics and turnover are constantly rising. Not exploding, but rising.
If gradualism comes into play, we should abandon the concept of "updates."
This thread is about watching DCs. I sporadically read out the RK values for my site from a list of about 60 DCs. Today I noticed that about ten of them were unavailable, mainly those ending in *.*.*.106.
18.104.22.168: I see fresh results with yesterday's freshtag; this is the DC which will stay.
That DC hasn't changed (as in going supp) ever for me. It's the only one that hasn't. It has consistently shown fresh pages for my site.
I work as a professional developer and have delivered several large projects. Some very large.
In a "normal" project environment such as the banking sector, insurance or whatever else is coming to your mind, there will be a release date which has to be reached and a list of features which has to be realized / changed / implemented. The team usually tries to meet that goal - and the last few days are usually very busy. If the project leader is good - the new / modified solution will go live at the specified date. The solution will be stable, the features will work etc etc: JOB FINISHED. I have done that stuff in business critical environments for large companies several times.
For Google it's the same. Search is the most important, most business-critical application they have. We shall not forget - the logic to rate and relate content is not only used for search ... but also for AdSense etc. If you type in "abc" (or whatever), Google needs to know what kind of data you would like to find.
I have asked myself several times - why in God's name do they release datacenters that look "not finished"? Because to me, they look unfinished. It looks unprofessional at first sight.
On the other hand (remembering that we are talking about professionals - and they are): Google is based on data. I don't primarily mean the contents of our websites - but data that is used to calculate rankings. For example: we have the keyword "Bush". Which sites are being clicked by users searching for that term if they are served pages precalculated by the new "BigDaddy" algos? Answering this question helps to add ranking factors. And for that you need users who are using your live systems. So you set it live and gather data, as an early or late stage of the project plan.
Or for something else - imagine you have a page that works with DCs behind it to serve content. Whatever content it is. Imagine you can change some DCs, vary the content, and analyze which DC "pleased" your visitors best for certain phrases. In fact, where do people find the content they want best? Best means: where do they find it on page number 1.
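The kind of comparison described above can be sketched very simply: serve different result sets from different datacenters, log which result position (if any) each user clicked, and measure which variant got users to what they wanted on page 1. Everything here - the variant names, the log format, the "page 1 = positions 1-10" criterion - is an invented illustration, not anything Google has confirmed:

```python
# Hypothetical sketch: score each serving variant by the fraction of
# searches where the user clicked a result on page 1 (positions 1-10).
from collections import defaultdict

def ctr_by_variant(click_log):
    """click_log: list of (variant, clicked_position or None) tuples."""
    shown = defaultdict(int)       # searches served per variant
    first_page = defaultdict(int)  # searches with a page-1 click
    for variant, position in click_log:
        shown[variant] += 1
        if position is not None and position <= 10:
            first_page[variant] += 1
    return {v: first_page[v] / shown[v] for v in shown}

log = [("dc_old", 3), ("dc_old", None), ("dc_new", 1),
       ("dc_new", 2), ("dc_new", 14), ("dc_old", 11)]
print(ctr_by_variant(log))  # higher score = users found it on page 1 more often
```

On the invented log, "dc_new" scores 2/3 against "dc_old"'s 1/3 - which is the sense in which a live datacenter experiment could tell the engineers which precalculated index "pleased" visitors best.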
etc etc ...
Which ones end in .106? I didn't know any ended in .106. Where did they come from? I'm watching about 50 but don't have those. This is not fun to watch, as it appears BD is on just about all of them now, and BD has my listings screwed up and supplemental.
the figures on the right are page counts
When I wrote "gradual continuous updating process" I was referring to change in algorithms or switch over to a new piece of infrastructure. In the past such process use to take place during a specific and limited period of time as for example Allegra and Jagger updates.
While in the case of BigDaddy it is "gradual continuous updating process", IMO.
As to PR updates, I wrote in my previous post:
"Having said that, I expect PR and backlinks updates during 2006 to continue in the same manner as before."
<22.214.171.124: I see fresh results with yesterdays freshtag, this is the DC which will stay.>
This DC will get taken over by BD just like all the other DCs with current results.
It has hung on much longer than the others for some reason.
Your page count pattern looks identical to mine.
Thanks a bunch for taking the time to write such a generous, informative post. Much appreciated.
Your post should help many of us do better Google datacenter watching and better understand the BigDaddy process.
No good news for me, as most of my PR3 pages are now PR2 or PR0 on some of the Google datacenters (particularly at 126.96.36.199). Is it time to look into it, or should I wait for the final results?
The other strange thing I've noticed is that in the morning, a search on my website brings up 26,000+ pages indexed, while in the evening it shows only 762 pages. Is this what's called the Google dance too?
Strangely, PR on [mydomain.com...] is different from PR on [mydomain.com...]. Is this something I should work on?
Any and all help would be highly appreciated.
188.8.131.52 is interesting, as it has been the main default Google DC for some time now (over a month or so). The DC is not listed by MCDar and was rarely if ever listed by forum members until recently.
Could it be that this DC was opened up to retain default results while all the others switch to BigDaddy - just in case they need a quick fix?
--- 184.108.40.206: I see fresh results with yesterdays freshtag, this is the DC which will stay.
Dude... that's not even BD
itloc + reseller
"the last few days are usually very busy" and then the client changes their mind on something :) twice :)
If the database structure is correct, the data input/output can normally be "tweaked" to suit in a post-implementation review.
Any change in Google "infrastructure" or whatever is simply mind-blowing given the amount of data. Oops - defending Google there - sorry!
AdWords is simpler, as it relates to the keywords bid on and the description or title text. Therefore revenue is not affected - just the "natural" SERPs.
With the large number of DCs it may take considerable time for any "adjustments" to filter through, even if the base BD is on all DCs.
>>> Sometimes it's almost as though, if for whatever reason they have to remove or turn down filters for a period, they turn up the authority score to compensate. So you get huge sites that may give a clean look to the SERPs but are pretty lightweight for specific terms, along with garbage.
Soapystar - Inspired post. This describes most of the results I am seeing at present. Crap topped off with authority to keep the punters happy.
> AdWords is simpler, as it relates to the keywords bid on and the description or title text. Therefore revenue is not affected - just the "natural" SERPs.
Google Search, AdWords & AdSense belong together. A page that displays AdSense is, technically speaking, nothing but a big search query. Lots of text that identifies the topic of a page.
To find the topic is one thing then - to display the right ads another one.
I have one great example - the "out of focus" thing (sorry, best formulation I can find as a non-native English speaker).
Imagine you are Joe Sixpack. Google is the Internet for you. You are looking for a product. If you type in some words that deliver, at first sight, the right information - would you click it? For sure. You land on the page, but it looks too complex, too nervous ... whatever. Or maybe it's not exactly what you have been looking for. But hey, there is an ad (AdSense) that shows you exactly what you have been looking for. So you click it and get exactly the stuff you need. Result: one happy Google, one happy user, one happy webmaster, and one happy advertiser.
twist, yes it can be like that. And it can stay like that, through update after update, at least from what I can see.
And of course, the better your content gets, the more likely you are to start getting high quality links, and those links will tell google what it needs to know. Some links cannot be faked, and those are the links you want.
My New Year's resolution was to not complain about things I can't control and to fix things I can. Sounds like AA, I guess, but it's just common sense. It's better to fix a problem 6 months before search engines consider it a problem than to wait until an update fixes your site for you.
My page counts for DCs (A sampling I took from reseller's list earlier in this thread.)
220.127.116.11 -> 21.000+
18.104.22.168 -> 21.000+
22.214.171.124 -> 21.000+
126.96.36.199 -> 21.000+
188.8.131.52 -> 23.000+
184.108.40.206 -> 23.000+
220.127.116.11 -> 23.000+
18.104.22.168 -> 23.000+
These were ranging from 300-900 pages last week. It's been a steady climb from there.
On non-BD sites I have 400k-500k pages.
22.214.171.124 -> 23.000+
126.96.36.199 -> 23.000+
188.8.131.52 -> 23.000+
184.108.40.206 -> 23.000+
These are now showing 24.000+
Good morning guru5571
Just wish to ask...
Do you also see different top 10 sites on the following two DCs sets, when using your testing keywords / Keyphrases?
Good morning reseller,
I checked all 4 DCs for my search terms. I get the same top 10 results for ALL four on ALL search terms.
The only differences I'm seeing is page counts. Some differences for some search terms on some DCs. It seems to vary. No patterns that I can see.
OK, I don't think we are really disagreeing.
My point is that although Big Daddy is now on 90% of DCs, I don't think we have seen any indication that any DCs have moved on to the stage of being able to correct the situation regarding missing sites due to canonical issues. OK, the odd site may come back, as we have seen over the last few months.
Your point seems to be that Big Daddy is now live and developing forward. Although I agree that new data is being added to Big Daddy and spam filters adjusted, I still feel that it relies on aspects of the old data to rank pages.
>>>>IMO, its only the PR that the supplemental "error" has generated which the folks at the plex are concerned about.
Now I think you are getting on the right lines. :) If Big Daddy has a different crawl (which we are pretty sure it has) but is using a ranking system from the old crawl (which it appears to be doing), then errors like that are bound to happen. IMO.
E.g. a hijacked page can be found in Big Daddy but cannot be ranked - as this hijacked page is relying on the old ranking structure, which maybe uses a system that cannot follow the 302 redirect.
I don't think it is a coincidence that some sites suddenly had problems on Big Daddy DCs shortly after a PR update using the old infrastructure - especially as the old infrastructure added more split-PR problems.
Just did a search on 220.127.116.11 to see if it's BD. And it shows for
#1: sanfrancisco.giants.mlb.com/ NASApp/mlb/index.jsp?c_id=sf
is that new? All other DCs still show