I have an old site which is not in the sandbox; when searching for its title tag in Google, the site ranks in the top 10.
Even some inner pages with "good phrases" are ranking well.
But most of the indexed pages are not ranking well, even for very, very uncompetitive keyword combinations.
What is going on?
It seems to me that the site has a maximum number of pages that are allowed to rank "normally", and all other pages rank poorly. If I check without the filter, the "ignored" pages rank very well too.
Does anyone have a clue?
What would you call this?
Thank you!
I have a six-year-old site on which pages added since June this year have been very hit and miss since mid-October. The page topics are mostly informational, and some appear in the top ten whereas others are inexplicably buried in the mid-nineties or even further down.
On another site, an older page which ranked 4 or 5 for over a year (for its title, h1, etc.) has dropped markedly everywhere except on ww*.google.com.au, where it is still at 4 or 5. Australian Google seems to understand the correct ranking for the page, but the others do not.
This is purely a guess, but my impression is that using exactly the same terms in the page URL, title, h1, and internal anchor text is tripping a filter.
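If you want to test that guess on your own pages, here is a minimal sketch (an assumption-laden illustration: Python with the third-party requests and beautifulsoup4 packages, and a made-up example URL) that reports which terms appear in all three of a page's URL slug, title, and first h1:

    # Hypothetical audit for the over-optimization guess above:
    # compare the word tokens in a page's URL slug, <title>, and <h1>.
    import re
    from pathlib import PurePosixPath
    from urllib.parse import urlparse

    import requests
    from bs4 import BeautifulSoup

    def term_set(text):
        # Lowercase and split into a set of word tokens.
        return set(re.findall(r"[a-z0-9]+", text.lower()))

    def audit(url):
        slug = PurePosixPath(urlparse(url).path).stem
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        title = soup.title.get_text() if soup.title else ""
        h1 = soup.find("h1")
        h1_text = h1.get_text() if h1 else ""
        shared = term_set(slug) & term_set(title) & term_set(h1_text)
        print(url, "-> terms shared by URL, title, and h1:",
              sorted(shared) or "none")

    audit("http://www.example.com/blue-widgets.html")  # made-up URL

If page after page shows an identical term set in all three places, that is the kind of repetition such a filter, if it exists, would presumably key on.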
Because some pages are ranking very well and other pages are not, even though the optimization for the key phrases is the same.
It looks to me like there is now a maximum count of allowed pages per site. You can increase this amount, but only slowly! All new pages above the "normal amount" will be sandboxed.
I think there is now a maximum number of "allowed pages" which you must not increase too fast!
I have a new site with 200 pages, but just 4 were crawled by the right bot and came up in the index!
These 4 pages are in the same sub-folder! All other sub-folders were not crawled by the "right" Googlebot.
Other pages are still being crawled, but by the "Mozilla Googlebot", and are not finding their way into the index!
Even if the purpose of the Mozilla Googlebot is unknown, I can't understand its behavior, because it costs bandwidth to crawl many pages daily and not index them.
Do other people also find the Mozilla bot in their logfiles, even if they are not using AdSense on their site?
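For anyone who wants to check their own logs, here is a rough sketch (assumptions: a combined-format Apache access log named access.log, and the two user-agent strings Google is known to use, "Googlebot/2.1 (+http://www.google.com/bot.html)" for the classic crawler and "Mozilla/5.0 (compatible; Googlebot/2.1; ...)" for the Mozilla one) that counts the hits from each:

    # Count hits from the classic Googlebot vs. the "Mozilla" Googlebot.
    # Log filename and UA prefixes are assumptions, not from this thread.
    import re
    from collections import Counter

    counts = Counter()
    with open("access.log", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            # In combined log format the user agent is the last quoted field.
            quoted = re.findall(r'"([^"]*)"', line)
            ua = quoted[-1] if quoted else ""
            if "Googlebot" not in ua:
                continue
            if ua.startswith("Mozilla/5.0 (compatible; Googlebot"):
                counts["Mozilla Googlebot"] += 1
            elif ua.startswith("Googlebot/"):
                counts["classic Googlebot"] += 1
            else:
                counts["other Googlebot UA"] += 1

    for label, n in counts.most_common():
        print(label, n)

A large daily count for the Mozilla bot with no matching growth in the index would match the behavior described above.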
Greetz!
It would lead me to assume that the "sandbox" is not a site-wide penalty but a penalty on specific pages. It likely impacts entire sites because the responsible part of the algorithm probably uses factors that cut across pages on a site. On top of that, in many cases people are more likely to notice the hit on their main page than on less-promoted internal pages.
Since they said there is no actual "sandbox penalty" but instead it is a side-effect of other filters, that makes sense to me.
Nothing to do with new pages; my site has barely increased its page count in the last two years.
All pages used to rank; now only a handful (and the index page) rank where they should/used to.
My opinion: like the other well-documented bugs that are dogging G these days, this is just another symptom of a sick search engine.
Let Google sort it out; I don't think there is much we can do from our end.