Forum Moderators: Robert Charlton & goodroi
I checked my robots.txt and my .htaccess for errors, but found nothing strange. My old content is also still indexed. What could be the cause?
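One quick way to double-check that robots.txt is not silently blocking Googlebot is Python's stdlib robotparser. This is only a hedged sketch - the rules and URLs below are placeholders, not the poster's actual file:

```python
# Sketch: verify robots.txt rules against specific URLs using the
# standard-library robot exclusion parser. The rules here are
# illustrative placeholders, not taken from any site in this thread.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A new content page should be fetchable; the disallowed path should not be.
print(rp.can_fetch("Googlebot", "https://example.com/new-page.html"))   # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x.html"))  # False
```

If both checks come back as expected, the dropped pages are unlikely to be a robots.txt problem.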
Add a new page link to your index page and Google will now take longer to cache it. For any links to pages off it, it's a few weeks minimum.
What concerns me, though, is that some sites do get their pages indexed rather quickly. That is what makes me wonder if this is a leftover issue from the supplemental ("supp hell") bug many of us went through.
During the following days I started creating internal links to it from other pages on my site. Halfway through this process the page climbed to #26.
By the time I had finished linking to this page internally (the whole process probably took over a week) the new page then fell totally out of the top 100 in the Google SERPs. Who knows?
If you are able to sticky me the URL of that site I would be interested - no worries if you don't want to.
Personally, I *still* think that it is due to the cross-over from the normal Googlebot ranking to the Mozilla Googlebot ranking.
If crawl depth, rank and indexing are supposed to be based on PR, then there is little doubt in my mind that the PR generated by the Mozilla Googlebot so far has not been applied. (Even when it was applied to the toolbar (TB) it was a bodge - whether it has been applied to the ranking, crawling and indexing of sites at all is what I am not sure about.)
By the time I had finished linking to this page internally (the whole process probably took over a week) the new page then fell totally out of the top 100 in the Google SERPs. Who knows?
No one does really. And, given the time factors being quoted within this topic, I don't think enough time has passed to really make any assumptions. One month, two months, etc. If the sites in question are fairly new (within the past 6 months), it's going to take a bit before things settle down and you start to see consistency in the results of your changes.
And again, no one really knows. Each and every one of us has a different circumstance and there is no "one answer" for all. Some may be dealing with whatever the Sandbox is. Others may be dealing with technical issues. And others may be dealing with penalties. Who knows? ;)
Personally, I find that watching this stuff is like watching paint dry. One page? Come on now. What about all of your other pages? The Infoseek days are gone. :)
One of the mega-brains at Google recently introduced a mega-bug into the PR scoring mechanism that resulted in millions of new pages being granted an artificially high PR. Hence, lots of accounts of brand new pages with inexplicably high PR values (often higher than the page that linked to them).
The end result is a bunch of confused and overstretched Googlebots that are unable to schedule a decent deep crawl of many sites due to the deluge of artificially high priority new pages.
It would be nice if Google could go a week, or even two, without introducing fresh bugs into the dreadful mix that is Big Daddy.
All conjecture of course...
Just so everyone else knows - the site definitely has canonical issues - I still have not seen a site with this problem (or the supplemental one) that does not have canonical issues.
Same problem that has been blighting the index for so long.
Just so everyone else knows - the site definitely has canonical issues - I still have not seen a site with this problem (or the supplemental one) that does not have canonical issues.
Sorry to be a pain. But could you please give a very precise definition of exactly what you mean, i.e. the definitive test that you just did to arrive at the "yes, this site has canonical issues" conclusion.
The reason I ask is that, by my yardstick, my site does not suffer from any canonical issues, but it has suffered greatly at the hands of Big Daddy - 1000s of dropped pages (but no supplemental issues).
There are different ways of testing a site - some can be a bit obscure, and it is slightly harder to see with the latest PR update.
For the site that wanderingmind stickied me, it was easy, as both the non-www and the www homepage were cached and indexed by Google.
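The test described above can be sketched as a simple rule: fetch the homepage under both hostnames without following redirects, and if both answer 200 (rather than one 301-redirecting to the other), the same content is indexable under two hosts - the classic canonical issue. This is a hedged illustration; the status codes are example inputs, and in practice you would obtain them by fetching each variant yourself:

```python
# Sketch of the www vs non-www canonical check. A properly canonicalised
# site returns 200 for one host and a 301/302 redirect for the other;
# 200 on BOTH hosts means Google can index duplicate copies of the page.
def has_canonical_issue(status_www: int, status_bare: int) -> bool:
    # status_www:  HTTP status of http://www.example.com/ (placeholder host)
    # status_bare: HTTP status of http://example.com/
    return status_www == 200 and status_bare == 200

print(has_canonical_issue(200, 301))  # False - non-www redirects, fine
print(has_canonical_issue(200, 200))  # True  - both indexable, problem
```

The usual fix at the time was a site-wide 301 redirect from one host to the other (for example via mod_rewrite in .htaccess), so only one canonical hostname ever returns 200.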
I'd guess (speculate) it's teething troubles stemming from the new bot infrastructure or whatever they've been twiddling with.
NOTHING is wrong with Google or its algo.
Do webmasters here not think that all this is pure deliberate action on Google's behalf to try and push up AdWords revenue?
It's that simple, IMO.
Look at the recent results. Last quarter revenue was up by 66% over expectations. During this quarter they have NOT indexed all sites fully, not listed webmasters' pages, had poor SERPs, dropped pages from authority sites, not indexed new content, etc. etc. etc.
The net result of all this has been webmasters moaning in here about it (me included) and a sharp increase in revenue for Google. Who's the idiot?
The only problem with the SERPs being so bad due to Google taking this action is that they risk losing market share should MSN or Yahoo get it right. However, I think I have more chance of having tea with the Queen Mum than one of Google's competitors has of taking any market share off them.
Google dominates; webmasters have put them there, and we are reaping what we sow as a result.
That would work only for purely commercial sites that need to sell. A drop in rankings would mean they are tempted to use AdWords.
However, I suspect that the larger number of sites would be informational - huge news websites for example. There is no way they can use AdWords as they are not selling anything.