Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Site half "sandboxed"?


code12

3:18 pm on Dec 6, 2005 (gmt 0)

10+ Year Member



Hello,

I have an old site which is not in the sandbox; when searching for its title tag in Google, the site ranks in the top 10.

Even some inner pages with "good" phrases are ranking well.

But most of the indexed pages are not ranking well, even for very, very uncompetitive keyword combinations.

What is going on?

It seems to me that the site has a maximum number of pages allowed to rank "normally", and all other pages rank poorly. If I check without the filter, the "ignored" pages rank very well, too.

Does anyone have a clue?
What would you call it?

Thank you!

RunnerD

4:44 am on Dec 7, 2005 (gmt 0)

10+ Year Member



I have had a similar problem with my site since Jagger.

Patrick Taylor

1:57 pm on Dec 7, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Since jagger, yes.

I have a six-year-old site on which pages added since June this year have been very hit and miss since mid-October. The page topics are mostly informational, and some appear in the top ten whereas others are inexplicably buried in the mid-nineties or even further down.

On another site, an older page which ranked 4 or 5 for over a year (for its title, h1, etc.) has dropped markedly except on ww*.google.com.au, where it is still at 4 or 5. Australian Google seems to understand the correct ranking for the page, but the others do not.

This is purely a guess, but my impression is that the use of exactly the same terms in the page URL, title, h1, and internal anchor text is tripping a filter.
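One way to check a site for the pattern guessed at above is to compare the terms in each page's URL slug, title, and h1. A minimal sketch, assuming pages are available as HTML strings (the sample page and URL below are invented for illustration):

```python
# Flag pages whose URL slug, <title>, and <h1> all use exactly the same
# terms -- the over-optimization pattern speculated about above.
from html.parser import HTMLParser


class TitleH1Parser(HTMLParser):
    """Collect the text of the <title> and first <h1> elements."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self._in = None

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._in = tag

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "title":
            self.title += data
        elif self._in == "h1":
            self.h1 += data


def terms(text):
    """Normalize text to a set of lowercase words."""
    return set(text.lower().replace("-", " ").replace(".html", "").split())


def looks_over_optimized(url, html):
    """True if the URL slug, title, and h1 all share the same term set."""
    parser = TitleH1Parser()
    parser.feed(html)
    slug = url.rstrip("/").rsplit("/", 1)[-1]
    return terms(slug) == terms(parser.title) == terms(parser.h1)


sample = ("<html><head><title>Blue Widgets</title></head>"
          "<body><h1>Blue Widgets</h1></body></html>")
print(looks_over_optimized("http://www.example.com/blue-widgets.html", sample))
```

Whether such a match actually trips a filter is, as the post says, purely a guess; varying one of the three elements is an easy experiment.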

code12

2:25 pm on Dec 7, 2005 (gmt 0)

10+ Year Member



I don't know if it's triggering a filter or not.

Because some pages are ranking very well and other pages are not, even though the optimization for the key phrases is the same.

It looks to me as if there is now a maximum count of allowed pages per site. You can increase this amount, but only slowly! All new pages over the "normal" amount will be sandboxed.

bluent

4:30 am on Dec 8, 2005 (gmt 0)

10+ Year Member



Same is the case with my site. My website is four years old, and before the Jagger update it was ranking pretty well. But now it is not ranking for any key phrase, which is so irritating. I have since moved my website from HTM to ASP; all pages are listed in Google except the index page. Can somebody tell me what is happening and why Google is not crawling my index page?

srinivas

5:10 am on Dec 8, 2005 (gmt 0)

10+ Year Member



I have a new site which was indexed 15 days after it was launched. Today I am checking it: the site has 5,000 content pages, but Google shows more than 45,000 pages. Does anyone have any clue? Please help me. Is my site sandboxed?
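A gap like 5,000 real pages versus a 45,000-page estimate often comes from duplicate URLs (session IDs, www vs. non-www, print versions) being indexed separately, rather than from extra content. A quick sanity check is to count the URLs you actually publish and see how many collapse together once query strings are stripped. A sketch, using an invented inline sitemap in place of a real sitemap.xml:

```python
# Count distinct URLs in a (sample) sitemap and spot near-duplicates that
# differ only by query string -- a common cause of inflated index counts.
import xml.etree.ElementTree as ET
from urllib.parse import urlsplit

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/page1.html</loc></url>
  <url><loc>http://www.example.com/page1.html?sessionid=42</loc></url>
  <url><loc>http://www.example.com/page2.html</loc></url>
</urlset>"""

# The sitemap schema lives in a namespace, so findall needs a prefix map.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP_XML)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

# Strip query strings so duplicates like ?sessionid=... collapse together.
canonical = {urlsplit(u)._replace(query="").geturl() for u in urls}

print(len(urls), "listed URLs,", len(canonical), "after stripping query strings")
```

If the canonical count is far below what Google reports, duplicate URL variants, not the sandbox, are the likelier explanation; the site: estimate itself is also only a rough number.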

ljgsites

5:42 am on Dec 8, 2005 (gmt 0)

10+ Year Member



Yeah...I'm experiencing the same thing. It's like a sandbox for new pages.

code12

4:17 pm on Dec 8, 2005 (gmt 0)

10+ Year Member



@ljgsites:

I think there is now a maximum number of "allowed pages", which is not allowed to increase too fast!

I have a new site with 200 pages, but just 4 were crawled by the right bot and came up in the index!

These 4 pages are in the same sub-folder! All other sub-folders were not crawled by the "right" googlebot.

Other pages are still crawled, but by the "Mozilla Googlebot", and are not finding their way into the index!

Even if the purpose of the Mozilla Googlebot is unknown, I can't understand its behavior, because it takes bandwidth to crawl many pages daily without indexing them.

Do other people also find the Mozilla bot in their logfiles, even if they are not using AdSense on their site?

Greetz!
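A quick way to answer the question above for your own site is to tally the Googlebot variants in your access log. A minimal sketch, assuming a combined-format log; the sample lines and IPs below are invented, and in practice you would read lines from your real logfile:

```python
# Count requests per Googlebot variant in a combined-format access log.
import re
from collections import Counter

# Made-up sample log lines standing in for a real access log file.
LOG_LINES = [
    '66.249.66.1 - - [08/Dec/2005:10:00:00 +0000] "GET /a.html HTTP/1.1" '
    '200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '66.249.66.2 - - [08/Dec/2005:10:01:00 +0000] "GET /b.html HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
    '+http://www.google.com/bot.html)"',
    '10.0.0.5 - - [08/Dec/2005:10:02:00 +0000] "GET /c.html HTTP/1.1" '
    '200 512 "-" "Mozilla/4.0"',
]


def classify(user_agent):
    """Label a UA string as classic Googlebot, Mozilla Googlebot, or other."""
    if "Googlebot" not in user_agent:
        return "other"
    if user_agent.startswith("Mozilla"):
        return "mozilla-googlebot"
    return "classic-googlebot"


counts = Counter()
for line in LOG_LINES:
    # The user agent is the last quoted field in a combined log line.
    match = re.search(r'"([^"]*)"$', line)
    if match:
        counts[classify(match.group(1))] += 1

print(dict(counts))
```

Matching on the user-agent string alone can be spoofed, so for a serious audit you would also verify the requesting IPs, but for tallying which variant crawls which folders this is enough.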

aeiouy

1:06 pm on Dec 9, 2005 (gmt 0)

10+ Year Member



I have one site where I compete for very different keywords on many different pages. Some pages rank where they should and others are sandboxed.

That leads me to assume the "sandbox" is not a site-wide penalty but a penalty on specific pages. It probably appears to impact entire sites because the part of the algorithm responsible likely uses factors that span pages across a site. On top of that, in many cases people are more likely to notice the hit on their main page than on less-promoted internal pages.

Since they said there is no actual "sandbox penalty" but instead it is a side-effect of other filters, that makes sense to me.

SEOPTI

5:28 pm on Dec 12, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There are definitely different steps of being "out of the sandbox" for a domain.

Like 30% out of the sandbox, 60% out, and 100% out.

cleanup

5:36 pm on Dec 12, 2005 (gmt 0)

10+ Year Member



My sites exhibit the same "partial sandbox" symptoms since Jagger.

It's nothing to do with new pages: my site has barely increased its page count in the last two years.

All pages used to rank; now only a handful (and the index page) rank where they should/used to.

In my opinion, like the other well-documented bugs that are dogging G these days, this is just another symptom of a sick search engine.

Let Google sort it out, I don't think there is much we can do from our end.

SEOPTI

5:55 pm on Dec 12, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



They don't need to sort it out; the partial sandbox syndrome is something they built.

cleanup

5:56 pm on Dec 12, 2005 (gmt 0)

10+ Year Member



Then they built it during Jagger, matey.

Erku

6:05 pm on Dec 12, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



How long does the sandbox last?

SEOPTI

7:48 pm on Dec 12, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You will need at least 200 backlinks from different domains and at least 3 months to get out.

tedster

1:49 am on Dec 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



SEOPTI, are you being tongue in cheek with that answer? I have popped sites out of the "sandbox" within three or four weeks with just a handful of (admittedly strong) links.