Forum Moderators: open
That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)
Thanks
Mc
The bottom line is that Google has decided that, in future, no site will rank well unless it behaves the way a normal site behaves and unless it is well regarded by the Internet population.
Interesting theory.
But ...
... wait a minute!
How come "normal" sites that have been introduced since February are also missing?
Bottom line: for the large majority of new sites it is going to be a long, hard haul to get from the bottom of the SERPs to the top, not like the old days when you could get ranked in a week.
Have a look at [google.co.uk...]
Here's an excerpt ...
Google uses PageRank™ to examine the entire link structure of the web and determine which pages are most important. It then conducts hypertext-matching analysis to determine which pages are relevant to the specific search being conducted. By combining overall importance and query-specific relevance, Google is able to put the most relevant and reliable results first.
Oh yeah? So is Google saying that virtually no sites that have been introduced during the last nine months provide relevant or reliable information?
There is no mention of new sites or new pages being treated differently from those that are established. Isn't Google's mission to deliver SERPs that are all based on their algo and all sites being treated equally?
If this situation is deliberate then, if not actually lying, they are being very economical with the truth.
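For anyone curious what that PageRank blurb actually means in practice, here is a toy power-iteration sketch of the idea. To be clear, this is just an illustration of the published concept: the graph, damping factor and iteration count are made-up examples, not anything from Google's actual system.

```python
# Toy sketch of the PageRank idea: a page's importance is the sum of the
# importance passed to it by the pages that link to it, computed by
# repeated (power) iteration. Illustration only, not Google's real algo.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal importance
    for _ in range(iterations):
        # Every page keeps a small baseline share of importance.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # A page splits its current rank among its outlinks.
                share = rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += damping * share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# A made-up three-page web: A and C both link to B, B links back to both.
toy_web = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
ranks = pagerank(toy_web)
```

On this toy graph, B ends up with the highest rank because two pages link to it while A and C each get only one inbound link. Note that nothing in this calculation mentions the age of a page, which is exactly the point being argued in this thread.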
I think with projects like that, the hard and really time-consuming part is expanding the field in all of the places an index may be used: all the reports, temporary files, files that get transferred to external companies, screen layouts where the field appears, and so on. The more business partners you have who have to change all of their systems, screen layouts and reports to accept a new field size, the more complex the project becomes.
Pimpernel, how do you explain what I said earlier: that even new pages that don't fall within the keyword categories of an existing (well-listed) site are sandboxed?
You will be extremely hard-pressed to find somebody who will agree with you that new pages on old sites are sandboxed at all. I have launched numerous pages on an old site that were ranking very well within days. Those pages had hundreds of completely different "keyword categories".
It's impossible to generalise, full stop. What you've experienced is one thing; what another experiences is totally different.
I've got sandboxing of new pages on old sites.
It's not just specific keywords that Google seems to be using to determine what is sandboxed and what isn't. No one has figured out what they are using yet, which is why this type of thread appears every few weeks and why it gets so long.