Current BigDaddy status from what I can see:
- BD is visible on (using 'sf giants' test)
That is around 1/3 of datacenters listed on the McDar tool (I think).
- For me at least BD is returning a significantly lower number of indexed pages.
- BD index is still clogged full of scrapers, doorways and 404s. Meanwhile the sites sticking by the guidelines are getting trounced.
Google puzzles me.
[edited by: tedster at 6:01 am (utc) on Nov. 8, 2006]
[edit reason] thread split [/edit]
Saying you shouldn't rely on Google is idiotic: they have 60+% of the search market, and if you have a web-based business - as opposed to bricks and mortar or telephone/late-night ads - you have no choice but to rely on Google.
I have a lot of top-3 SERP pages in MSN and Yahoo, and it didn't help me a whole lot when I went into the Google dumper for the 4 months of the year when I used to make 80% of my money in a seasonal industry.
Tell ya one thing. What I discovered is that the nature of the Google algo allows sites to manipulate both rankings AND - with a concerted effort - destroy the PR, and hence the traffic, of competitors. I got PR0'd in Google in large part because of a systematic campaign of content theft by a competitor who set up a ton of waste sites (mostly free hosted) for the purpose of duplicating competitors' content. It is an inherent problem in Google and not one that seems to be a problem in MSN or Yahoo.
Took me 4 months of solid work to undo the damage, guard against theft and restore PR (at least I think it's restored - the Google index keeps reverting back to 3 months ago lately).
By some quirk of fate my listings came flooding back just before the sales process was started ... so I'm 3/4 mil better off today than I was last year.
Still it's interesting to keep my hand in with a few new projects that I started last year ... and they're not getting hit by today's fun and games. Must be doing something right ;-)
The 302 problem in December 2005 cost me £705,000 in lost income. No blackhat - a straight, clean site.
People do place a lot of importance on Google results, and when you follow the guidelines and build good sites, then Google drops them because of the 302 issue, you begin to worry. Hey, MSN and Yahoo have 302 sorted, and my sites thankfully rank well there too.
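For anyone unfamiliar with the "302 problem" discussed above: a 302 tells a crawler the content only *temporarily* lives at the target URL, which is what let hijacking pages get credited for other sites' content, whereas a 301 says the move is permanent. A minimal Python sketch of how you might audit a suspect link for this - the local test server stands in for a hijacker's page, and `example.com` is just a placeholder target, not anyone's real site:

```python
# Sketch: detect whether a URL answers with a temporary (302) or
# permanent (301) redirect - the distinction behind the "302 hijack".
# The local server below simulates a hijacking page; example.com is
# a placeholder target only.
import http.server
import threading
import urllib.request
import urllib.error

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # A hijacking page answers 302: "the content temporarily
        # lives here", so crawlers may keep crediting this URL.
        self.send_response(302)
        self.send_header("Location", "http://example.com/")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # refuse to follow, so we see the raw status code

opener = urllib.request.build_opener(NoRedirect)
status = None
try:
    opener.open(f"http://127.0.0.1:{port}/")
except urllib.error.HTTPError as e:
    status = e.code  # the unfollowed redirect surfaces as an HTTPError

print(status)  # 302 here means "temporary" - the hijack-prone kind
server.shutdown()
```

A 301 in the same check would indicate a clean permanent move; it's the 302s pointing at your pages from sites you don't control that were worth worrying about.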
We may as well ignore supplemental results as they don't rank, and MC has stated that they are on a different crawl etc. to the main index.
Meaning that there are a lot of sites out there with effectively only 1 page in the index!
Now, is this a penalty/problem/bug - or is this a base from which Google are going to crawl going forward? (Relatively common for a new site to have just 1 page listed for a while before a proper crawl takes place.)
My head says it is a penalty/problem/bug - would love to be proved wrong - but can't see it.
At the time of the last posting it was across my general Google, a couple of random datacenters, Big Daddy and Little Daddy. The pages are now back as usual in Big Daddy [126.96.36.199...], nowhere else.
Another piece of info: on the rest of the sites with the supplemental problem, the cache shows Jun 05 - Aug 05; prior to the problem they showed Feb 2006. Big Daddy is now showing Feb 2006, with no supplementals in the first 1000. The rest are still showing supplementals after the homepage.
This started last week on Big Daddy, where we saw highly ranked pages disappear - they are still missing. At least they aren't supplemental... yet.
Has to be a bug!
Very hard to say whether it is a penalty/problem/bug amid this big confusion over what's going on. With all the pages being dropped EXCEPT the homepage, it looks like a penalty, BUT I see that the homepages still rank well for their competitive terms, which is quite contradictory to being penalized. Anyway, it's not a good sign to depend on a stand-alone homepage... I hope, and just hope, that it is a temporary bug and not a penalty. Who knows, I could be wrong.
It would seem simply that there are some very old pages in the index BD is using. Perhaps, unanticipated by Google, the BD algo applied whatever formula would normally be used to flag ancient pages as supplemental.
I know a couple of the senior people on here pointed out repeatedly that BD looks like it is very old - 2-6 months depending on the datacentre.
Did Google take a "snapshot" to test BD and is now in the process of merging that old data with the current index? Is this causing pages to be incorrectly identified by the algo as moribund?
Some of the pages Google shows in the site: command have been deleted for over a year.
I also show urls like:
that were only used for PPC campaigns in 2002 and 2003. Where is Google pulling these URLs from? It makes little or no sense.
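One thing worth checking in a case like this: whether those dead PPC URLs still return 200 or a soft error. A common remedy was to serve 410 Gone for long-retired URLs so crawlers drop them faster than they would a plain 404. A minimal Apache mod_rewrite sketch - the paths here are hypothetical placeholders, not the poster's actual URLs:

```
# Hypothetical .htaccess fragment: mark long-dead PPC landing pages
# as permanently gone (410) so crawlers stop re-requesting them.
# The path patterns are placeholders only.
RewriteEngine On
RewriteRule ^ppc-2002/ - [G]
RewriteRule ^ppc-2003/ - [G]
```

The `[G]` flag makes Apache answer 410 Gone (and implies `[L]`, so no later rules run); whether Google actually honoured 410 faster than 404 in this era was itself a matter of debate on these boards.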
IMHO the data-set will eventually be merged with a current update and all will be rosy again. We'll all make loads of money and be able to enjoy the spring & summer [unlike last year].
Love and Peace to Everyone ( Gimme Coffee )