Forum Moderators: Robert Charlton & goodroi
Oh. My. God. [72.14.207.99...]
Changed again since this morning. Still getting very weird results for this DC, but when I add &filter=0 all (and I do mean "ALL") of the old supplemental results (for 404 pages and expired domains) are thrown away.
Normal Google search: 1 to 15 of 15.
This DC with ordinary search: 1 to 12 of 40,000 (all weird results - you CANNOT get beyond 12).
This DC with &filter=0 applied: 1 to 1 of 1 - the CORRECT result for this search term.
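For anyone who wants to reproduce the comparison, here is a minimal sketch (Python; the query string is a placeholder, and the DC IP is the one in the title) of how the two URLs differ - it only builds the URLs and makes no assumption that the endpoint still behaves this way:

```python
# Build a datacenter search URL with and without &filter=0, the
# parameter that disables Google's duplicate/supplemental filtering.
from urllib.parse import urlencode

def dc_search_url(dc_ip, query, unfiltered=False):
    params = {"q": query}
    if unfiltered:
        params["filter"] = "0"  # include results normally filtered out
    return f"http://{dc_ip}/search?{urlencode(params)}"

print(dc_search_url("72.14.207.99", "your search term"))
print(dc_search_url("72.14.207.99", "your search term", unfiltered=True))
```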
[edited by: g1smd at 3:42 pm (utc) on Mar. 30, 2006]
[edited by: tedster at 9:56 pm (utc) on Mar. 30, 2006]
I am missing something... what are you talking about?
Answers.com, Amazon, eBay and others rank on multiple pages for the same keyword terms. They always have, and always should if the search engine works as it's supposed to.
What is it you think happens?
In its attempt to drive up AdWords revenue, it is now producing SERPs so far off the mark that the end user can't do anything with the search engine, IMO.
At first, with the canonical issues, lost pages, supplementals, etc., and Google's lack of attention to sorting the issues out, I thought that this was perhaps deliberate action on their part. Are they doing this to try to increase AdWords revenue? After all, how can a company with the market value of a large country make these kinds of errors and stuff so many authority and quality sites?
After all, if Google can stop large authority sites from ranking for different keywords, maybe those sites will buy more AdWords? That was my conclusion.
Is it a case of pushing SERP quality so far down that the end user sees nothing but directory sites and irrelevant junk, so that they then click on more of the sponsored adverts? In doing this, have they just turned the dial too far?
Having looked at all the results in the sectors that we get involved in on google.co.uk, I realised just how bad the results are. At one point I thought: is it me? Is it just the sectors I'm involved in? Am I just too close to see the bigger picture?
Then this evening a friend phoned me (she knows zero about this and the internet - she's an office administrator). During the conversation she mentioned that she had done a search today on Google, gave up with it, and asked her friend at work where else to try; they suggested Yahoo, which, amazingly, gave her the answers she wanted.
On asking her to clarify further, she said that she wanted to find information on courses available at a particular uni. Google returned results ranging from sites with nothing to do with the course, nothing to do with that uni, results about other online courses she wasn't searching for, and LOTS of directory sites and other places to search again, rather than links to the information she wanted. Meanwhile, Yahoo was able to deliver the relevant results she wanted straight away! She now claims she is not using Google in future.
Now, OK, this may be an isolated incident, and let's face it, Yahoo has its own fair share of problems too, but for Google to be unable to deliver relevancy is a problem it's never had before - well, not as long as I can recall, anyway.
In all, I now genuinely believe Google has dropped the ball big time with this BD update and, through its own greed and lack of attention to delivering quality, has been and still is failing its users, whom it may well start losing.
I think Google has less time than it thinks to get this right, and the clock's been ticking for a few weeks now. What started off as a minor issue is now a major problem - if I were holding Google stock I would be dumping it forthwith, I can tell you!
My site came back on 3/7 to its "rightful" position in the top 3 SERPs for all of my major keywords. That day, my site:mysite command had my index page listed first. Just this morning, we were back to page 4.
The ONLY change was that on a site:mysite command, my index page is nowhere to be found.
Looks like they are just testing. If your site's index page is not coming up first, then hang tight, G will figure it out and all should be fine.
I did have the non-www version redirecting to the www version. No subdirectories are involved with the site, so that issue didn't play a part.
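For anyone wanting to confirm their own redirect is behaving, here is a minimal sketch (example.com is a placeholder, not the poster's site) using only the Python standard library - http.client does not follow redirects, so the status code and Location header stay visible:

```python
# Check whether the non-www host answers with a 301 to the www version.
import http.client

def check_www_redirect(bare_host):
    conn = http.client.HTTPConnection(bare_host, timeout=10)
    conn.request("HEAD", "/")
    resp = conn.getresponse()
    print(bare_host, "->", resp.status, resp.getheader("Location"))
    conn.close()

check_www_redirect("example.com")  # hope for: 301 http://www.example.com/
```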
In an effort to get the site back, I killed the robots.txt and uploaded new sitemaps. I sent *countless* emails to the Gmail address (which, if you don't have it, please refer to part 1 or part 2 of this thread, when GG first responded).
I only report this to offer those of you still in supplemental hell a bit of hope. I do believe that all the sites will come back, eventually. The "time out" is definitely painful, so I empathize.
Good luck!
The main problem when any of my sites comes back from Google oblivion is that I've usually made so many changes in the meantime that I have no idea what, if anything, might have made the difference.
I guess this is what separates me from the truly professional SEOs ... that and a good slice of luck ;-)
All the Best
Col
They must have realised that the split-PR problem led to sites/homepages being downranked.
Where is the attempt to fix this problem?
Where are the new rankings that Mozilla Googlebot should be generating?
Why is Google still using the old data to rank sites?
How can a site still suffer from Canonical problems if it is truly a Mozilla Googlebot generated index?
Why have hijack sites not regained any rankings if Mozilla Googlebot is supposed to follow redirects?
Now I am finding pages that are crawled heavily, over and over, and yet are not making it into the main index. Pages that were created after my page was first indexed, and thus indexed later than mine, are showing in the SERPs while mine is not in the index.
More info here:
[webmasterworld.com...]
This same type of behaviour occurred for me right before this all went to hell. I also found that blogs tend to get added to Google fairly quickly (within a day or two)... yet a blog I made about 5 days ago, which shows lots of Googlebot activity, is not in the SERPs either.
So if you think you're out of hell... you may get pulled back in.
If sub1.domain.com and sub2.domain.com have the same or similar content, Google makes a wild guess about which subdomain to use and filters out the other.
I know that shouldn't even be possible. Is anyone else experiencing subdomain-related canonical problems?
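A rough way to test for the situation described above - whether two subdomains serve byte-identical content for the same path - is sketched below; the hostnames are hypothetical, and a hash comparison only catches exact duplicates, not merely "similar" content:

```python
# Fetch the same path from two subdomains and compare content hashes.
import hashlib
import urllib.request

def body_hash(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        return hashlib.md5(resp.read()).hexdigest()

h1 = body_hash("http://sub1.example.com/page.html")
h2 = body_hash("http://sub2.example.com/page.html")
print("identical content" if h1 == h2 else "content differs")
```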
Not that I am happy, though. I know Google's index is not yet settled. I still have many supplementals, and guess what - they are all blog posts!
The moral is: Google now hates blogs. Don't do blogs anymore!
I disagree. I feel blogs get listed very very quickly and usually beat other types of pages in rankings.
Yeah, dayo-UK.
I think Google has been forced to spend the bulk of its time trying to fix the problems it generates for itself through the entire PR component for a long time now. Most of the ranking problems, etc., ultimately stem from the PR system, which has likely become unworkable as an algo. Obviously all SEs use some ranking component in delivering search results, but presumably they employ a much more streamlined system that is far less prone to unanticipated problems.
Hard to guess the reasoning behind developing BD on a JUL-AUG index and then trying to merge that index with the current (mid-FEB) one. I presume there were constraints that made it an attractive approach to Google. However, they either were totally unprepared for the supplemental, backlink and PR problems that occurred when they implemented BD OR they were indifferent to the disruption it would cause.
Right now it's like they have almost given up on including the mid-Feb index. They seem to have an index of newly crawled results and the ancient BD working index merged into some weird mix. For a number of sites I see an index made up of pages crawled within the last 1-3 weeks and pages that existed in August but are now gone (and listed as supplemental).
For a long time I thought it would be pointless to use the Removal Tool to get rid of these long deleted supplemental pages but I'm doing it now as I think they're gonna be there until the next update (or at least I'm not confident they won't).
Hello, all.
Does anyone have any idea when Google will fix all these problems?
Or will Google not fix anything, so that I can't hope to see my site ranking as well as it did before BD?
The site went supplemental and is still there.
I have been with this hosting company for 3-4 years now and they have always handled moving DNS to a new IP correctly. Google has it indexed by the IP.
The cache had been dated Feb 17th; as of today I see the 24th, but it is still under the IP.
I have rebuilt the site, resubmitted the sitemap, changed some content, and waited for the reindex to come.
How would anyone suggest getting this fixed, or is there a way? I just had the hosting company check the DNS setup and, as I thought, it is correct. The site is coming up in the SERPs for good searches - under the IP.
I did a check on the DC showing sites coming out of supplemental, and pages were reindexed under the www correctly, but I am afraid I now have two sites in Google, and we all know what's fixing to happen.
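As a minimal diagnostic sketch (the hostname and the 192.0.2.x IP below are placeholders, not the poster's site), one could confirm both halves of this: that DNS resolves where the host says it does, and that a request to the raw IP is handed back to the www hostname rather than served directly - a 200 answer on the bare IP is what lets the same site get indexed twice:

```python
# Resolve the canonical hostname, then see how the raw IP answers.
import http.client
import socket

host = "www.example.com"
ip = socket.gethostbyname(host)
print(host, "resolves to", ip)

conn = http.client.HTTPConnection(ip, timeout=10)
conn.request("HEAD", "/")
resp = conn.getresponse()
# Want a 301 pointing at the hostname; a 200 means a duplicate "IP site".
print(ip, "->", resp.status, resp.getheader("Location"))
conn.close()
```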
Any suggestions would be appreciated
It's a bit sad that search engine results are a topic of conversation amongst friends. What a poor state of affairs; I'd suggest you try talking about the weather or what movies you may have seen.
Actually, someone else here also moved hosting in mid-March. Things aren't looking good for him either.
I did a reinclusion for a pharma website out of Canada that had 100,000 pages and 10,000 front-page listings in Google, last year around May '05. It moved from shared to dedicated hosting, and it took three months, the loss of all his links, and a reinclusion request to get him back in.
By July his pages were picking back up, and so were his rankings.
I can't say for sure that you'll have this long a wait - Google's new update may make it shorter - but you are looking at at least a brief wait, I believe.
But hey, search is difficult. They moved away from a PR-based search (it had become too easy to manipulate) to something based on semantics and 'trust'. What they didn't know was that this would actually make them more vulnerable to spam techniques than before. Hell, if you look at the results, most are relevant - if you want Amazon, epinions, rateitall and dealtime in your results. Personally, I already know those sites are there - I could put them in my favourites if I wanted to.
Their quality-control people are probably saying the results have been great over the past 18 months - very relevant. When a company grows it often loses its edge. Maybe Google has. If it weren't for the lack of serious competition, I would be predicting the demise of Google.
Thanks for the reply.
You're right about the lack of serious competition; I think that's why they are not suffering enough as a result of having poor SERPs, and I can't see MSN or Yahoo putting them under pressure, can you? Well, not in the short term anyway.
Mind you, they can only keep up this not-giving-a-damn attitude towards the SERPs for so long. Webmasters and users alike will remember this if they don't do something about it, and when MSN comes along with IE7, users won't need much of a push to switch.
In order for Google to remain top dog and stay ahead, they need to retain the confidence of webmasters and users - currently they are failing both, IMO.
BTW, Matt didn't like that comment and removed it a day later.
The only hope, I think, is those <RK> figures that were displayed for a short while on the BD DCs - these were pretty obviously a PageRank or PageRank-like calculation by the Mozilla Googlebot, and they gave virtually every site (not all) a logical and correct-looking set of values.
Homepages have the highest <RK>; pages linked off the homepage the same or minus 1, etc.
In other words, it looked like Mozilla Googlebot was correctly allocating PR.
But there is little indication that this is being used in the SERPs to rank and crawl sites (remember, Matt confirmed only days ago that PR is still very important for that aspect) - I agree that they seem to have gone back to an old calculation for ranking.
There is also little doubt that PR is a bit screwy - I mean, just looking at TBPR, we have pre-November, November-to-February and post-February figures depending on what DC you hit.
But I still think all these figures were calculated by the normal Googlebot rather than Mozilla Googlebot.
But Google are always saying that PR is continually calculated and applied - so we must be using Mozilla Googlebot data, surely? Then again, things might have changed from that aspect with the latest update, and we need that PR export to correct this utter mess in the SERPs.
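For readers unsure what a "PageRank-like calculation" even looks like, here is a toy power-iteration sketch - purely illustrative of the general technique, and in no way Google's actual code or the real <RK> formula:

```python
# Classic PageRank by power iteration over a tiny link graph.
def pagerank(links, d=0.85, iters=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = d * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += d * rank[p] / n
        rank = new
    return rank

# Toy graph: a homepage links to two inner pages, which link back.
demo = {"home": ["a", "b"], "a": ["home"], "b": ["home"]}
print(pagerank(demo))  # "home" ends up with the highest value
```

On a graph like this, the homepage accumulates the most rank and pages one link away sit a step lower - consistent with the pattern described above, where homepages carried the highest <RK>.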
[google.de...]
"In our continuous struggle against webmasters artificially influencing our ranking, we decided to implement a new software architecture, which has now been launched. The core basis of this new software comprises an algorithm, which performs some MAD CUTs off the main index, copying a randomly chosen number of pages to the secondary part.
Because - as everyone knows - true randomization is impossible on any given von-Neumann machine, and because we were quite sure webmasters would have found out about any randomization algo we'd have chosen, we decided to employ 365 chimps and sat them in front of some new 64-bit computer keyboards. Twelve of the monkeys were only given a numeric block, from which we chose the relevant IP numbers; the results of what the rest did were taken as the basis of a lexical semantic filter. Those pages of the relevant IPs which most closely matched this filter were transferred to the supplemental index.
To our surprise, one of the 353 remaining chimps came so close to natural English sentences whilst hammering his keyboard that we decided to publish his poems as a blog. His name was Nim, and the feedback we got from this blog amazed us more than anything else."
The one thing I did notice is that, of the 1,200 pages the site has, only 130 or so are indexed. Over the last few weeks this has come and gone, up and down, with loads of supplementals, etc.
I haven't been closely following Big Daddy so I was looking for advice on this.
Is it just a matter of wait it out and see if the remaining 1000 pages are reindexed? A lot of the missing pages have quality IBLs.
Cheers
It's pretty much impossible to figure out what the end is going to look like, and a lot of us have had page counts go from 10,000+ to 1 (or 0) and back up again in past weeks.
Unfortunately, that doesn't mean you or any of us are going to like the final result, but it's hard to think at this moment of anything that can actually be done to influence the outcome (other than the obvious - removing dead pages, bad links and all that kind of basic stuff, as Googlebot is crawling steadily and those results may get included in whatever "final" index is produced).
I think it's at least possible that - unlike usual - Google is updating the BD index daily with recent crawl results (I don't mean just adding new pages as usual; I mean making the kind of broad changes that are usually part of an update). It's a guess (obviously), but we all know the BD index contains results circa AUG-05, and I have noticed changes that had to come from the ongoing crawls. They could be trying to fix the BD index "live" via a very quick inclusion of newly indexed pages (LOL, or I could be totally wrong).
I saw about 15 pages return today from the supplemental index with page 1 or 2 rankings. Out of all the pages on this site, I could understand why these pages might be supplemental (possible dup content problem). This is the first "positive" I have personally experienced on the supplemental issue.
Maybe there is hope!