Sandboxed Sites - Back Together?
Do they come out together or one by one?
McMohan · msg:118288 · 10:09 am on Nov 20, 2004 (gmt 0)

Most of the new sites that I work with are still in the sandbox. I was just curious to know: do all the sandboxed sites come out of the sandbox during one major update, or one by one over the rolling updates?

That is to say, should one be checking regularly to see if the sites are out of the sandbox, or only when they know there is a major Google update? :)

Thanks

Mc

 

BeeDeeDubbleU · msg:118438 · 4:08 pm on Nov 24, 2004 (gmt 0)

The bottom line is that Google has dictated that, from now on, no site will rank well unless it behaves like a normal site would and unless it is well regarded by the Internet population.

Interesting theory.

But ...

... wait a minute!

How come "normal" sites that have been introduced since February are also missing?

Pimpernel · msg:118439 · 4:49 pm on Nov 24, 2004 (gmt 0)

"Normal" by the criteria that you apply to your old sites, which is the mistake everyone is making. The fact is that the old sites would be gone as well if google had the ability to downgrade them. But it doesn't have the data so it can't apply the algorithm. The new "normal" sites are the ones that are performing perfectly well in the SERPS and were created / registered since February. If you look hard enough you will find them.

Bottom line, for the large majority of new sites it is going to be a long hard haul to get from the bottom of the SERPS to the top, not like the old days when you could get ranked in a week.

wanna_learn · msg:118440 · 4:52 pm on Nov 24, 2004 (gmt 0)

Pimpernel,
Why don't you make it easier by giving examples of, say, just five such 'normal' sites performing well on competitive keywords?

airpal · msg:118441 · 5:05 pm on Nov 24, 2004 (gmt 0)

This is so ridiculous, there's no such thing as a "normal" or a "spammy" site! Does anybody else have more feedback regarding specific results they have seen, so we can try and solve the REAL sandbox matter once and for all?

BeeDeeDubbleU · msg:118442 · 7:00 pm on Nov 24, 2004 (gmt 0)

You want to know how Google ranks pages?

Have a look at [google.co.uk...]

Here's an excerpt ...
Google uses PageRank™ to examine the entire link structure of the web and determine which pages are most important. It then conducts hypertext-matching analysis to determine which pages are relevant to the specific search being conducted. By combining overall importance and query-specific relevance, Google is able to put the most relevant and reliable results first.

Oh yeah? So is Google saying that virtually no sites that have been introduced during the last nine months provide relevant or reliable information?

There is no mention of new sites or new pages being treated differently from those that are established. Isn't Google's mission to deliver SERPs that are all based on their algo, with all sites treated equally?

If this situation is deliberate then, if not actually lying, they are being very economical with the truth.

steveb · msg:118443 · 7:57 pm on Nov 24, 2004 (gmt 0)

"Just very difficult and seriously anti-spam."

Talk about backward. The spammiest tactics are what beat the sandbox (<<<<<giving up in the jargon wars).

Jane_Doe · msg:118444 · 7:59 pm on Nov 24, 2004 (gmt 0)

>>> It doesn't take more than nine months to build a new index. Does it?

I think with projects like that the hard part and really time consuming part is expanding fields in all of the places an index may be used - all the reports, temporary files, files that get transferred to external companies, screen layouts where the field is used etc. The more business partners you have who have to change all of their systems, screen layouts and reports to accept a new size field, the more complex the project becomes.

Namaste · msg:118445 · 8:03 pm on Nov 24, 2004 (gmt 0)

Pimpernel, how do you explain what I said earlier, that even new pages that don't fall within the keyword categories of an existing (well-listed) site are sandboxed?

airpal · msg:118446 · 9:05 pm on Nov 24, 2004 (gmt 0)

Pimpernel, how do you explain what I said earlier, that even new pages that don't fall within the keyword categories of an existing (well-listed) site are sandboxed?

You will be extremely hard-pressed to find somebody who will agree with you that new pages on old sites are sandboxed at all. I have launched numerous pages on an old site that were ranking very well within days. Those pages had hundreds of completely different "keyword categories".

UK_Web_Guy · msg:118447 · 9:18 pm on Nov 24, 2004 (gmt 0)

airpal

It's impossible to generalise, full stop. What you've experienced is one thing; what another experiences is totally different.

I've got sandboxing of new pages on old sites.

It's not just specific keywords that Google seems to be using to determine what is sandboxed and what isn't. No one has figured out what they are using yet, hence why this type of thread appears every few weeks and why these threads become so long.

DerekH · msg:118448 · 9:22 pm on Nov 24, 2004 (gmt 0)

I've tried to stay out of this "sandbox" debate, partly because other parts of the English-speaking globe have no idea what a sandbox is, and partly because it's been fun to watch people slagging each other off merely for the apparent sin of having different points of view.

Am I correct in assuming that the sandbox theory is either "in" or "out", according to one's preference? And that pages (or sites - I'm not sure which) - are either in this box-thingy or not?

Well, when I step back, I find that the whole concept of Google's PageRank, its SERPS, its toolbar and everything else is based on "better than" or "not as good as" - real quantifiable measures (whether you agree with the measure or not).

I personally find it rather sad that we're wasting time discussing the sandbox like it was some sort of portcullis - you're in or you're out of the Google Castle.

Do others really believe that the massive matrix calculations that define PR are then going to be adjusted by a coin-toss? In or out? Heads you win, tails you're on page 100?
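
For what it's worth, the matrix calculation in question produces a smooth spectrum of scores, not a yes/no flag. Here is a minimal power-iteration sketch on an invented four-page web (the textbook PageRank formulation only; nobody here knows Google's production code):

```python
# Toy power-iteration PageRank over an invented four-page web.
# Textbook formulation only; Google's production system is not public.

damping = 0.85

links = {            # page -> pages it links out to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

pages = list(links)
pr = {p: 1.0 / len(pages) for p in pages}  # start uniform

for _ in range(50):  # iterate until the scores settle
    pr = {
        p: (1 - damping) / len(pages)
           + damping * sum(pr[q] / len(links[q]) for q in pages if p in links[q])
        for p in pages
    }

for page, score in sorted(pr.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 4))  # every page gets a graded, real-valued score
```

Any "in or out" behaviour would have to be a separate filter bolted on top of scores like these.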

I've a Masters degree in Mathematics, but I'm finding that I'm turning into a philosopher in this debate, trying to understand what I see, rather than making up "in or out" theories that are quite, quite childish.
We would, I think, be better served by trying to make sense of the contrasting and contrary things we're seeing here, instead of heaping coals onto some vast fire.
The fact that we ARE seeing contrasting and contrary things is what we should be grasping - not that we don't think we ARE seeing them.

----
OK - I've had my rant and I feel better now.
Sorry about that - us Brits tend to keep our emotions bottled up far longer than is good for us...

Why not skip over this and read the next post instead....
DerekH

mark1615 · msg:118449 · 9:45 pm on Nov 24, 2004 (gmt 0)

Whether or not one agrees with Pimpernel's theory it appears internally consistent:

Pages are what is ranked, not sites.
Old sites cannot be subject to the same part of the algo that he thinks new sites are because the data wasn't kept prior to the algo change.
Links are tracked and aged.

One question on this though, a new page on an old site would still seem to be subject to the algo - no? Yet many people have experience that suggests this is not true.

Likewise the other commonly observed attribute that sites/pages in the so-called SB can still rank well for obscure 3+ word combinations. What does DerekH think about this with his background in higher mathematics?

And one other thought: the so-called anti-spam tactics employed by G are to fight a "problem" they largely invented. The basic premise of the G algo, we are led to believe, is that links are votes. Well, then webmasters go out and get links. And anchor text in links is important, but the fact is that truly natural anchor text is very often totally unrelated to the keyword and is thus devalued (we think) by G. So now, in response to webmasters' actions to get links, G (again, we think) takes action to combat aggressive linking, which causes this problem because, of course, new pages and new sites have new links. This has resulted in G becoming unarguably stale.

DerekH · msg:118450 · 10:24 pm on Nov 24, 2004 (gmt 0)

mark1615 wrote
Likewise the other commonly observed attribute that sites/pages in the so-called SB can still rank well for obscure 3+ word combinations. What does DerekH think about this with his background in higher mathematics?

Well, I'm not sure that this is anything more than the way the reverse or inverse index that looks things up is updated and made current.
After all, in addition to the algorithms that decide results, there is the data that is fed to those algorithms. With some pages on one of my sites indexed yesterday, and some not visited since last February, the spread of currency of the data is massive. Who can say what effect the age of the last visit to one of your competitor's pages has on the weight that page is ascribed?
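
To illustrate the currency problem: an inverted index maps terms to the documents containing them, and each posting is only as fresh as the last crawl of that document. A toy sketch (the structure and field names are invented for illustration; real engines store far more):

```python
from collections import defaultdict
from datetime import date

# term -> list of (doc_id, last_crawled) postings. Invented structure.
index = defaultdict(list)

def add_document(doc_id, text, last_crawled):
    # Record each unique term against the document and its crawl date.
    for term in set(text.lower().split()):
        index[term].append((doc_id, last_crawled))

# Two pages on one site, crawled nine months apart.
add_document("example/page1", "blue widgets for sale", date(2004, 2, 10))
add_document("example/page2", "blue widgets new models", date(2004, 11, 23))

# A single lookup mixes February's view of page1 with yesterday's page2.
for doc_id, crawled in index["widgets"]:
    print(doc_id, "as of", crawled)
```

A query served from such an index is silently comparing one page's February snapshot against another page's snapshot from yesterday.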

For a long long time I've seen my sites rank really well for one keyword and not for another, and yet for the pair to beat sites that beat me on both searches.
I don't regard that as anything more than "something" in my site doing well for an obscure combination of keywords, any more than I regard the fact that a site doing well for an obscure search means anything more important than the fact that other sites don't.

My god what a sentence that was!
What I meant is that it's easy to do well in an obscure search. That's what obscure means.

And what I didn't say was that I don't actually have a view one way or the other about the sandbox. Some of my pages have done well, some have been wiped out; but the last thing I think is that it's something quite so black and white.

Anyway - you shouldn't ask me to justify my rant <grin> - it was just something I needed to get off my chest...
DerekH

bak70 · msg:118451 · 10:27 pm on Nov 24, 2004 (gmt 0)

This might be controversial.

The sand box does not exist.

Google updates roughly every three months. I'm talking deep updates.

If you have enough SEO in time for the update, you move; if not, you stay.

I have 100s of sites between my partner and me. We have seen this happen many times.

If your site has not moved in 6 months, then you haven't done enough SEO or you are doing it wrong.

If you have time to complain on this board, chances are you haven't done enough.

Powdork · msg:118452 · 11:21 pm on Nov 24, 2004 (gmt 0)

THERE IS NO SANDBOX!

Google introduced new algorithms in February and these algorithms are tough, tough anti-spam algorithms. They are based on lots of factors like:

How quickly the links were amassed
Quality of links
How quickly pages were increased
Etc etc


You start off saying there is no sandbox, and then describe one of the popular theories as to its existence. If this terminology works better for you, that's fine. Wherever it says 'sandbox', just replace it with 'tough anti-spam algorithms' and you should be OK. AFAIK Google has always kept the date a link first appeared. This is not something new.

When you search for a restaurant by its name and city and Google does not return the restaurant's website even though it is indexed, just because the site's links aren't aged to perfection, it doesn't matter what it's called. It's a reason to leave Google.
People aren't leaving Google in droves because every other aspect of the search engine is far superior to the competition. But each day the 'tough anti spam algorithms' continue, this aspect becomes more noticeable.

Google updates roughly every three months. I'm talking deep updates.
bak70, if you think we've had a deep update in the last nine months, you may be in for a shocker soon. At least I hope so.

airpal · msg:118453 · 11:37 pm on Nov 24, 2004 (gmt 0)

Powdork, after reading your post, the million dollar question becomes:

If you add a new page to an old site (+2 years old), and get hundreds of inbound links (from external sites) to it (for a competitive search term), this new page would not rank well at all for many months because the incoming links have not aged yet, right?

This can only be proved or disproved by real-life results that people have had doing this recently. Anybody experience this problem with a new page on their old site?

Powdork · msg:118454 · 11:45 pm on Nov 24, 2004 (gmt 0)

Actually, the post doesn't reflect my true beliefs. Personally, I think they are full of it. There is plenty of anecdotal evidence that new pages on mature domains can rank very quickly and very competitively. This doesn't really disprove either theory, since the link-aging theory typically states that internal links are counted to their full extent immediately.
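
Nobody outside Google knows whether link aging exists, let alone its shape, but the theory being debated can at least be stated precisely. A purely hypothetical sketch, assuming external links ramp up to full weight over roughly nine months while internal links count immediately (every number here is invented):

```python
# Hypothetical link-aging model, mirroring the theory discussed in this
# thread. Pure speculation; not a known Google formula.

AGING_PERIOD_DAYS = 270  # the oft-cited "nine months", assumed

def link_weight(raw_weight, age_days, internal):
    # Internal links count in full immediately.
    if internal:
        return raw_weight
    # External links ramp up linearly until fully aged.
    return raw_weight * min(age_days / AGING_PERIOD_DAYS, 1.0)

# A new page on an old site: internal links carry full weight on day one.
print(link_weight(5.0, 0, internal=True))     # 5.0
# Hundreds of brand-new external links: heavily dampened at first.
print(link_weight(5.0, 30, internal=False))   # ~0.56
print(link_weight(5.0, 300, internal=False))  # 5.0, fully aged
```

Under this model, airpal's new page on an old site would rank on internal strength long before its hundreds of external links matured, which fits the anecdotes above without proving anything.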

RoySpencer · msg:118455 · 1:02 am on Nov 25, 2004 (gmt 0)

So far, I'll admit to believing in the "sandbox" penalty... tens of thousands of our pages that ranked #1-#30 or so are now ranked #200-#1000, or beyond. PageRank has been largely retained after the domain name change.

Explain to me how the following can be explained by anything other than the SB theory (I really am open to suggestions):

If I move a page from our domain with the new (possibly penalized) name to an older subdomain, it shoots up to #3 position and stays there. The PR of the linking pages is the same (PR6 index page => PR5 subpage => page in question). All outgoing links on the page were kept the same.

The only difference I can see is the newer versus older domain.

Oh, and DerekH, I'll see your M.S. and raise you a Ph.D. ;)

bak70 · msg:118456 · 1:34 am on Nov 25, 2004 (gmt 0)

Well, I think it's not OK to post specific searches, so I can't prove what I have written. But I can tell you this: what I'm doing has worked in the past, and it works now. I recently went after a new keyword for an older site, and within 3 months I'm on page 2 for it (a very competitive term). After the next update it will be on page one.

I also started a brand new site, and it should be on either the first or second page after the next update. (It's already on the first page of the MSN beta.)

I feel the reason some people do better in MSN is that it is updating more frequently than Google right now. I haven't seen a change in the way Google updates in about 2 years: pretty much every three months.

Again, some people might not notice these updates, because popular terms are dominated by people who just do SEO better. This results in the SERPs looking the same. I have all kinds of sites, so I see these smaller changes when they happen.

As far as restaurants not showing for their specific search: it will take a few updates, but if the site has some incoming links it will show up.

There was a very popular diet pill that didn't show up in the top ten for its name for almost a year. I took a look at it and noticed the site had little SEO work on it. So it took a while.

bak70 · msg:118457 · 1:36 am on Nov 25, 2004 (gmt 0)

Powdork, what kind of shock? I'm hoping for an update in the next few days. It happened last year and the year before, in November.

[webmasterworld.com...]

Powdork · msg:118458 · 2:04 am on Nov 25, 2004 (gmt 0)

I was just referring to the fact that if you felt major updates in the last 9 months, then you would find something like Florida or the cross-linking update of Nov 2002 rather traumatic. In my opinion, Brandy/Austin was the last major update, and that seemed to be a lessening of the filters applied during Florida. And 'if' the sandbox is a capacity issue and they fix it and then apply a new algo at the same time, it could make Florida look mild.

I took a look at it and noticed the site had little SEO work on it.
That's what makes me think the sandbox is not an effort to reduce spam. These are not all spammy sites being caught up. They are sites with little or no SEO, sites with quality SEO, and sites that are spammy. Their common denominator is that they are new. That, and the fact that they don't show up for a unique company name.

bak70 · msg:118459 · 2:17 am on Nov 25, 2004 (gmt 0)

From what I've seen, sites that do a lot of real SEO move every three months. Isn't it possible that the majority of people out there just don't know real SEO anymore? Think about it: if everyone could do it, it wouldn't be profitable. If I built a new site, did everything I used to do, and the site didn't move for 9 months, I wouldn't come on here and say there must be a sandbox or a penalty. I would say, uh oh, I need to learn the newest technique or I'm done making money with websites.

Insanity is doing the same thing over and over and expecting a different result.

Powdork · msg:118460 · 3:55 am on Nov 25, 2004 (gmt 0)

Think about it: if everyone could do it, it wouldn't be profitable.
Anyone can do it, just not with new domains.

I would say, uh oh, I need to learn the newest technique or I'm done making money with websites.
I'm not interested in the newest technique, just the development of quality content. I want this to be for the long run. I have a hard time believing Google wants to reward the newest technique either.

bak70 · msg:118461 · 4:08 am on Nov 25, 2004 (gmt 0)

Powdork,

You took it the wrong way. For me the old techniques still work fine, as they have since I started 3 years ago.

But if what I was doing stopped working, then I would look for what the new SEO criteria for Google are.

You can't make a blanket statement that new domains don't rank well (well, you can, but it's your opinion). New domains in highly competitive areas might not rank well. It just makes sense that it would take longer to rank in a highly competitive area.

But I bet I could get top-ten listings all day long for obscure keywords with a brand new domain.

Like ny widget dealer, or long island lawn care specialist.
lol

Namaste · msg:118462 · 4:26 am on Nov 25, 2004 (gmt 0)

As we progress on this thread, I have been running all kinds of searches on Google. More & more it does appear that Google has listed sites under keyword categories. My conclusion is drawn from 2 observations:
1. Pages of sites are ranking well when selected keywords are in use. One example: a leading gourmet foods website is ranking well whenever the words "gourmet" or "food" or "gifts" or "baskets" are used. This same site recently added pages for exotic plants, which have a PR of 5, but these pages are nowhere in the SERPs.

2. Google is making use of a Thesaurus-like function to display results. This happens when it is unable to find good results for the string typed in. For example (not actual), Indigo Widgets SLC returns results for Blue Widgets Salt Lake City. This indicates a category-like classification.
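
The fallback behaviour described in observation 2 is easy to model, whether or not Google actually works this way: if a query returns too few results, rewrite its terms through a synonym table and search again. A toy sketch, with the corpus, synonym table, and threshold all invented for illustration:

```python
# Toy thesaurus-style fallback, mirroring the observation above.
# Corpus, synonym table, and threshold are all invented.

SYNONYMS = {"indigo": "blue", "slc": "salt lake city"}

CORPUS = {
    "doc1": "blue widgets salt lake city dealer",
    "doc2": "blue widgets utah",
}

def search(query):
    # Crude literal matching: every query term must appear in the text.
    terms = query.lower().split()
    return [d for d, text in CORPUS.items() if all(t in text for t in terms)]

def search_with_fallback(query, min_results=1):
    results = search(query)
    if len(results) >= min_results:
        return results
    # Too few hits: canonicalise each term via the thesaurus and retry.
    expanded = " ".join(SYNONYMS.get(t, t) for t in query.lower().split())
    return search(expanded)

print(search("indigo widgets slc"))                # [] - no literal match
print(search_with_fallback("indigo widgets slc"))  # ['doc1'] via expansion
```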

Some conclusions can be drawn from this and other observations:
- There is some kind of "sandbox"
- The sandbox exists at one of the algo layers of Google
- The sandbox is most visible with new sites, but is also visible when adding new pages to a website of "unrelated" keywords
- How to overcome this sandbox is not really known. It seems that the "sandbox" is compulsory for most sites (possibly leading sites or newspaper sites are exempt)

What Google doesn't realise in all its wisdom is that this isn't going to deter SEO. Sure, it causes a strategic shift in the way we do SEO, but it doesn't stop it. I see action shifting to the acquisition of domains categorised under required keywords, and also to the formation of alliances between websites that rank under desired keywords.

It does appear to be a negative and undesirable step by Google, which appears more and more overzealous about an overrated threat: SEO.

dvduval · msg:118463 · 4:55 am on Nov 25, 2004 (gmt 0)

This is about the 20th thread about the sandbox with 100 or more posts. Clearly, webmasters are not happy about it, and clearly Google is not commenting about it. This can only hurt Google in the long run.

hdpt00 · msg:118464 · 5:10 am on Nov 25, 2004 (gmt 0)

This is about the 20th thread about the sandbox with 100 or more posts. Clearly, webmasters are not happy about it, and clearly Google is not commenting about it. This can only hurt Google in the long run.

That pretty much sums it up.

HayMeadows · msg:118465 · 5:17 am on Nov 25, 2004 (gmt 0)

If it ain't broke, don't fix it.

Vec_One · msg:118466 · 5:21 am on Nov 25, 2004 (gmt 0)

Google's mission is to organize the world's information and make it universally inaccessible.

prairie · msg:118467 · 5:23 am on Nov 25, 2004 (gmt 0)

To me it just seems like links are taking longer to have effect. Really old links from DMOZ and Yahoo are gold.

Pimpernel · msg:118468 · 10:25 am on Nov 25, 2004 (gmt 0)

Why don't you make it easier by giving examples of, say, just five such 'normal' sites performing well on competitive keywords?

I cannot give specific examples, and I am also not saying "I have lots of sites that have beaten the sandbox." I am expressing our hopefully well-thought-out and well-researched reasons why sites from February onwards are, in general, not performing in Google.

Pimpernel, how do you explain what I said earlier, that even new pages that don't fall within the keyword categories of an existing (well-listed) site are sandboxed?

I am not sure that I understand the question, but what I do know is that it is simple to get existing sites to perform in Google under new keywords, even new categories of keywords, and it is a whole different ball game with sites created since February. This entirely fits in with our theory and simply reflects the fact that PageRank flows down through a site, so new pages will benefit immediately from the existing PageRank of the site.

mark1615 - See my comments above. Sure, each web page is judged on its merits, but the large majority of a web page's rating comes from internal links (i.e. it is linked to from the home page of the site). So in reality, in most cases, you are actually looking at web sites rather than web pages when assessing rating. It is for this reason that new pages on an existing site have no problem ranking. The problem for new sites is that they can't get a good rating and therefore cannot pass that rating on to the individual web pages, which are the ones that perform.
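
This "rating flows down" argument can be put in rough numbers using the published 1998 PageRank formula, PR(A) = (1 - d) + d * sum(PR(T)/C(T)) over the pages T linking to A. A back-of-envelope sketch comparing a new page on an established site with a new site's page (all figures invented):

```python
# Back-of-envelope PageRank flow, using the published 1998 formula.
# All numbers are invented for illustration.

d = 0.85  # damping factor

def pr_from_links(inbound):
    # inbound: list of (linker_pagerank, linker_outlink_count) pairs
    return (1 - d) + d * sum(pr / out for pr, out in inbound)

# New page on an old site: an established home page (raw score 40,
# spread across 20 internal links) points at it from day one.
print(pr_from_links([(40.0, 20)]))  # 1.85 immediately

# New site's page: no inbound links at all yet.
print(pr_from_links([]))            # 0.15, the floor of the formula
```

On this reading, a new page on an established site inherits a usable score on day one, while a new site's page starts at the floor and must earn every inbound link.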

If I move a page from our domain with the new (possibly penalized) name to an older subdomain, it shoots up to #3 position and stays there. The PR of the linking pages is the same (PR6 index page => PR5 subpage => page in question). All outgoing links on the page were kept the same.

I think the answer to that is: don't believe PageRank is everything. The algorithms that are suppressing new web sites since February are anti-spam algorithms, not ranking algorithms per se. The above is entirely consistent with our theory that a new site must do far, far better in our traditional measurement terms to beat an old site.

To me it just seems like links are taking longer to have effect. Really old links from DMOZ and Yahoo are gold.

Right on the money! The simple fact is that with a lotta lotta hard work you can beat the "sandbox" effect, although it is highly questionable whether it is worth it. And that is exactly what Google wants to happen: we all give up because it is no longer worth it, and Google can revert to making its own decision about what the most relevant sites are, without any interference from us nuisances.

As regards the sandbox effect, someone posted a message saying call it what you like, the effect is the same. Well, I think we are talking about a fundamentally different thing here. There is no sandbox, because there are lots of sites launched since February that are doing perfectly well. "Sandbox" suggests that every site is affected, which simply is not the case. That is why I don't believe in the sandbox.
