Forum Moderators: open
Which is not to say this is a universal truth (as so many people here love to apply their tiny bit of experience as a universal &^%% truth).
No wonder steveb is sick of reading that crap. I'm sick of reading that crap.
Look at how AdSense takes weeks to deliver relevant ads, and how long it takes to find all of your links, for example.
2nd post
Which is not to say this is a universal truth (as so many people here love to apply their tiny bit of experience as a universal &^%% truth). No wonder steveb is sick of reading that crap. I'm sick of reading that crap.
In the absence of your second post's "Which is not to say this is a universal truth," your first post looks like the type of "universal truth" statement that steveb (and others) normally rail against.
Including "in my experience" or "I'm not saying it's a universal truth" seems to be all that's necessary to avoid being bludgeoned.
It is, however, annoying to add that to every post. I mean, shouldn't it be taken for granted that each post is merely that user's experience *unless* they say "I've found this to be the undisputed truth and I've confirmed it over and over and over"?
Sandbox is one of the best things Google ever did.
I think if there were no more new websites built from now until eternity - the world would not miss much.
Google is a better search engine for having the Sandbox than it is without it.
Which is not to say this is a universal truth (as so many people here love to apply their tiny bit of experience as a universal &^%% truth).
No wonder steveb is sick of reading that crap. I'm sick of reading that crap.
I said
Google will either understand your website immediately, or never. I should have said "Google will either understand your website immediately, or you may have some issues, IMO".
===make a site www.tadpole-vaulters.com, provide one link, number #1 in 10 days no problem===
Yes, but does it stay there?
Yes, it stays there.
Post 210 accurately reflects what I'm seeing too, as does the post about money keywords and Google maintaining their cash flow from AdSense sales. That has to be considered a major factor in this. As was correctly noted, Google needed to raise its income before the IPO to ensure high initial stock prices, especially because they sold the stock directly, which means the money went into their own pockets. And they need to keep that income high. Ignoring this element can't possibly result in a meaningful analysis.
It is, however, annoying to add that to every post. I mean, shouldn't it be taken for granted that each post is merely that user's experience *unless* they say "I've found this to be the undisputed truth and I've confirmed it over and over and over"?
When jdMorgan tells you how to fix some code in your site, you can take it as gospel.
But phenomena like 'sandboxing' are entirely different. When posters write things like "The sandbox DOES exist," or "This seems to confirm that the sandbox does not apply to new pages on established sites," (no offense meant to either poster), those who know differently sometimes post to help others from needlessly reacting.
During one of the major updates of the past 12 months, lots of posters insisted that H1 tags were the cause of sites being penalized. So a bunch of other posters said that they were off to remove those elements from their sites. Ugh.
OPINION: There is no sandbox. There are only algorithms, filters and penalties.
When jdMorgan tells you how to fix some code in your site, you can take it as gospel. Amen to that. Never found anyone more helpful on any forum. And to weave that back into the context of this thread, it was he who helped me implement the 301s to my new domain, as GoogleGuy had suggested we do for sites moving to a new domain. I am increasingly starting to believe that the sandbox could be summed up as 'Florida for new domains'.
Thanks Freedom.
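For anyone wondering what those domain-move 301s boil down to: every request to the old domain gets answered with status 301 (Moved Permanently) and a Location header pointing at the same path on the new domain, which tells crawlers the move is permanent. A minimal sketch in Python, with made-up domain names (these are placeholders, not anything from this thread):

```python
# Toy sketch of a domain-move 301: map a request path on the old
# domain to a permanent-redirect response pointing at the new one.
# Both host names below are invented for illustration.

OLD_HOST = "www.example-old.com"
NEW_HOST = "www.example-new.com"

def redirect_for(path):
    """Return (status, location) for a request arriving at the old domain."""
    # 301 = Moved Permanently: the signal that lets the old URL's
    # standing transfer to the new one over time.
    return 301, f"http://{NEW_HOST}{path}"

status, location = redirect_for("/widgets/index.html")
print(status, location)  # 301 http://www.example-new.com/widgets/index.html
```

In practice this is usually a one-line server rule (e.g. an Apache RedirectPermanent) rather than application code, but the response it produces is exactly this pair.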
OPINION: There is no sandbox. There are only algorithms, filters and penalties.
Those algorithms seem to have the appearance (to many webmasters) of now incorporating time delays, especially in competitive areas. A time delay would certainly make sense from an anti-spam perspective.
It's probably fair to call this aspect of their algorithm a sandbox, except that the term implies simplicity in the algorithms. More likely they are applying some fairly complicated algorithms which have certain emergent behaviours that we see as things like sandboxes.
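Just to illustrate that "emergent behaviour" point (this is pure speculation, not Google's actual algorithm): a single age-based dampening term bolted onto an ordinary link score is enough to produce sandbox-like behaviour from the outside. A minimal Python sketch, with the 180-day ramp picked arbitrarily:

```python
# Hypothetical sketch: an ordinary link-based score multiplied by an
# age dampener. Young sites get suppressed, the effect fades with age,
# and from the outside it looks like a "sandbox" period.

def dampener(age_days, ramp_days=180):
    """Scale from near 0 for brand-new sites up to 1.0 after ramp_days."""
    return min(1.0, age_days / ramp_days)

def score(link_score, age_days):
    return link_score * dampener(age_days)

new_site = score(link_score=100, age_days=10)    # heavily dampened
old_site = score(link_score=100, age_days=400)   # full score
print(new_site, old_site)
```

Nobody outside Google knows whether anything like this exists; the point is only that a small term inside a complicated formula can look like a discrete "sandbox" from the results pages.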
Agreed that this may be the appearance.
However, a question: If time delays were truly in place would they be completely random/variable, or would they be more consistent?
Some webmasters report emerging from the sandbox after a few months, some longer, some not even after eight months. Some new sites never see the sandbox.
Some new pages of old sites are affected, other new pages at older sites are not.
Does this sound like a time lag? Or, is it something that takes varying amounts of time to overcome...
There is no spoon. :-)
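One speculative way to square those inconsistent reports: if emergence depends on accumulated trust crossing a threshold, rather than on a clock, the apparent "delay" would naturally vary from site to site, and a slow-accumulating site would never emerge at all. A toy model, with the threshold and rates invented purely for illustration:

```python
# Speculative model: each site accumulates "trust" at its own rate and
# only ranks once that trust crosses a fixed threshold. The apparent
# "time lag" then differs per site, and a slow site may never emerge.

THRESHOLD = 100.0

def days_to_emerge(trust_per_day, horizon_days=240):
    """Return the day a site crosses THRESHOLD, or None within the horizon."""
    trust = 0.0
    for day in range(1, horizon_days + 1):
        trust += trust_per_day
        if trust >= THRESHOLD:
            return day
    return None

print(days_to_emerge(2.0))   # fast site: out in under two months
print(days_to_emerge(0.5))   # slow site: out after more than six months
print(days_to_emerge(0.1))   # very slow site: still "sandboxed" after 8 months
```

Under a model like this there is no time lag at all, just varying amounts of something to overcome, which matches the mixed reports better than a fixed timer would.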
1. Was the domain brand spanking new, never used before?
2. Did you get a dmoz or Y! listing out of the gate?
3. How many results are returned for the keywords?
4. Can you sticky me the URL and term?
1. Brand new, never used
2. No Dmoz or Yahoo listing
3. 3,690,000 results - no quotes, 1,470,000 - exact match with quotes around it.
4. Sorry can't sticky url.
I am not trying to start a webmaster strike here; I just want to point out that we need some respect.
Google is a company, it's a for profit venture, like all the other search engines out there that people use.
What we need is a real alternative to the for-profit search engine, whether it's open source or whatever. There needs to be a way to get information about the web to people that doesn't rely on something like Google needing to maximize its profits to jack up its IPO share price; something of the quality of Yahoo, for example, would be fine.
Sort of like the idea of public libraries in the United States: have the for-profit sector, but also have a non-profit, ideally open source, model that can focus on delivering just clean results, without commercial interests intervening in that engineering process.
While mozdex is trying, they don't have enough money to create the necessary server farms; that's the real bottleneck. Ideally a consortium of some type would form, like is happening with Linux, where all the major work happens under an umbrella organization, and groups can fund this organization out of self-interest.
Something like Open Source Development Labs (OSDL), funded by major Linux users like IBM, Sun, etc. They are creating a product that frees them from both restrictive Unix and Windows systems; that's in their current best interest.
Many major world governments have strong interests, or should, in having a search engine that is not privately held, to give their countries free and open access to the data on the internet. The same goes for many large corporations, as well as most real-world users. This is exactly the group that was able to override its restrictive individual interests while actually serving its larger self-interest with Linux.
Things like the 'sandbox', which are almost certainly commercially influenced, are damaging the web.
However, on the bright side: my sites have never gotten any Yahoo traffic at all, for some reason or other, but in the last week or two I'm finally starting to get Yahoo searches. So I'm not the only one noticing these problems. To me this shows that even though some people here are able to make Google work for them, ordinary users are starting to get turned off.
Lack of fresh results: it's absurd to call this a plus. Google built its reputation on having the most up-to-date spidering and site indexing on the internet, and it's total nonsense to now turn around and claim a six-month sandbox is somehow a good thing. Google is having problems; I don't really care what causes them. Saying this is a good thing is like saying a car that doesn't start x percent of the time is good because it keeps smog down.
It also sounds too easy to spam. If a SE is open source, anyone can spam it unless it's totally based on external links, and any algorithm so heavily weighted toward one factor would be useless.
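To illustrate the point about an algorithm heavily weighted toward one factor (the weights and numbers below are invented for the example): if inbound links carry nearly all the weight, a spammer only has to inflate that one number to outrank a genuinely better page.

```python
# Illustrative only: a ranking score that puts almost all its weight
# on a single factor (link count) can be beaten by inflating that factor.

def rank_score(links, content_quality, link_weight=0.95):
    # links and content_quality both on a 0..100 scale in this toy example
    return link_weight * links + (1 - link_weight) * content_quality

honest = rank_score(links=40, content_quality=95)   # good page, modest links
spammer = rank_score(links=90, content_quality=5)   # link-farm output
print(honest, spammer)  # the spam page wins on this formula
```

The quality term barely matters here, which is the fragility being described: make one factor dominant and gaming the engine reduces to gaming that factor.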
It also sounds too easy to spam.
That's pretty much the same argument people make against open source and security, and real-world results show the value of that argument. It's security through good programming vs. security through obscurity. Spammers trying to reverse engineer an algo (which I know they are able to do, judging by at least a few of the posters here) is about the same thing, vs. the algo being open and available for inspection, testing, and improvement, and not being subject to commercial pressures like increasing AdSense income. Linux seemed like a wacky idea from the land of the uber geeks not very long ago, until IBM/Novell/Sun and China/Finland/Germany/Peru etc. got involved; now it's not so funny any more. Same for Mozilla.
There's actually a significant national interest in not having a US-based group of search engines control all major access to the web, much as there is a major national interest in not running your country's national computing on proprietary OSes with built-in backdoors, put there on demand of the US government.
Having a bigger emphasis on human editing would definitely be a big plus. For example, if there is a trigger for new large sites, or small sites that suddenly become large, rather than apply some automated routine, just have somebody take a quick look at the site; it's usually easy to judge in a few seconds whether it's spam or not.
Anyway, it's just a matter of time, I think. After OSes, this is probably the next most important thing to have a viable non-private option for. To see how viable it is, look at how much Sergey and Brin needed to start up: it wasn't that much, 10,000 Linux boxes or so, a few smart programmers, a good chunk of bandwidth, nothing that would break even a decent-sized company's bank, let alone China's or Germany's.