1) It is quite right that Google gives importance to links that come from relevant web pages.
2) It's also true that the number of such backward links matters hugely in deciding the positioning of a site on the SERPs.
SANDBOXING is a phenomenon whereby Google keeps a check on sites that are new and grow at an abnormal rate in a very short period of time (in terms of backward links).
This clearly means that apart from the relevancy of links, the new Google algo is also considering the "AGE OF THE LINK" and the "PACE OF ACCUMULATING LINKS" for a webpage.
Any other comments?
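If an "age of the link" factor exists, one simple way to picture it is a weight that ramps up as the link ages, so brand-new links count for little. This is pure illustration; the linear ramp and the 90-day figure are invented for the sketch, not anything Google has disclosed:

```python
def link_weight(link_age_days: int, ramp_days: int = 90) -> float:
    """Hypothetical sketch: a backlink's weight ramps linearly
    from 0 to 1 over ramp_days, then stays at 1 thereafter."""
    return min(link_age_days / ramp_days, 1.0)

# A day-old link counts for almost nothing; a year-old link counts fully.
print(link_weight(1))    # ~0.011
print(link_weight(365))  # 1.0
```

Under a scheme like this, a flood of brand-new links would move rankings only gradually, which matches the "sandbox" behavior described above.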
This method would be a good way to slow the PR-rich SEOs who can put a site up and have it at a PR 7 in a week.
PR is supposed to be an indication of natural popularity / content relevance, and an SEO linking all of their sites to a new site does not mean the content is relevant to the average surfer.
The "reality" is that when a new site is made and goes live it takes a while for links to be acquired and calculated...
I think this is a good move and will help stabilize the SERPs for companies that have been online for a while, working hard to deliver "real" value to end users (the respective search engine's user base)...
Good move on Google's part...
Unfortunately, for the sandbox not to completely derail the idea of link popularity, it eventually has to go away entirely. If it didn't, catching up to long-established sites would be basically impossible, even for someone matching their effort (especially if those sites put more effort into link building).
So it's sort of a temporary fix, but as people realize this it just becomes another element which seasoned veterans learn to expect and work around, while newbies are astounded by the difficulty of penetrating the upper echelon of the SERPs.
Google will also have a statistical way of determining which sites are outliers (grow too fast).
I can see why it would want to sandbox the PR transfer to those outliers.
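A minimal sketch of how such an outlier test might work, purely for illustration; the z-score approach, the `daily_new_links` data, and the threshold of 3 are assumptions, not anything Google has confirmed:

```python
from statistics import mean, stdev

def is_growth_outlier(daily_new_links: list, threshold: float = 3.0) -> bool:
    """Flag a site whose latest daily link gain is a statistical
    outlier relative to its own recent history (simple z-score test)."""
    history, latest = daily_new_links[:-1], daily_new_links[-1]
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        # No variation in history: any deviation at all is an outlier.
        return latest != mu
    return (latest - mu) / sigma > threshold

# A site steadily gaining ~5 links/day that suddenly gains 200:
print(is_growth_outlier([4, 6, 5, 5, 4, 6, 200]))  # True
```

A per-site baseline like this is the simplest version; a real system would presumably compare against aggregate "typical" growth curves across many sites as well.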
No it won't. And how would this be at all helpful? What if I have a great, unique site whose popularity grows ABNORMALLY fast? Will Google put a red flag on it? I don't think so. There is no such thing as "pace of link accumulation"; you don't have any facts to support this. As for the "age of link" - yes, it takes time for links to start counting as they should.
Currently, I'm optimizing my new (2-month-old) site for a competitive blue widget keyword. It's getting links at an abnormally high speed, and Google likes it! I've made it to the second page already.
What if I have a great, unique site whose popularity grows ABNORMALLY fast? Will Google put a red flag on it?
There is no such thing as "pace of link accumulation"; you don't have any facts to support this.
This is only an assumption, and you know what they say about people who assume...
As a scientist, I must add that ALL science uses assumptions. We can't possibly understand the world or the universe in which we live, or solve all of the complex numerical equations that describe their physical processes, without using assumptions. For example, in algebra you may make assumptions about certain parameters in order to simplify an equation and isolate a single variable.

That said, there is nothing wrong with assuming and speculating; that is how progress is made. Google's algorithms use assumptions in order to make complex computations tractable. Therefore, the idea that Google may use link accumulation rates in its statistical analyses is perfectly legitimate. Google must make assumptions about "normalcy" based on large amounts of statistical data in order to analyze the web efficiently. If it deems that websites are accumulating links at an "abnormal" rate, it has made a scientific assumption, to the best of its ability, based on "typical" accumulation rates. Google may or may not be doing this, but the idea is valid.
[edited by: crobb305 at 12:22 am (utc) on June 16, 2004]
"Nor do you have anything to disprove it. We are all making educated guesses from the examples we have seen. It is very possible Google measures pace of link accumulation. If I were them, I would be using the data."
I will add this comment: this growth of links and popularity is more than just one item. There would be an algorithm set up to properly combine growth, newness, links, and size, as well as the total amount of significant site updates. This all plays into an X in the current Y algorithm calculations. None of us know the exact formula, but what is important is that you ask yourself, in all seriousness: what would you do if you were Google to stem the spam and the manipulations of great webmasters such as ourselves?
Keep asking this question to yourself.
I normally will never give hints about how to do things here, as I feel it just fuels the fire of SPAM. I would also like to say that many of us here will not give out ANY SEO techniques, as it only helps both the bad guys and the good guys do better than us. We place on the 1st page for the main SEO terms. It takes a lot of work, but you must be smart, not quick - quick ain't gonna do it. Repeat: stop now if you think you know it all, as you will be slapped for it at some point. If you are on the first page now and think "wow, I am smart," it ain't gonna last long. Do serious thinking and make content work better - not better for the algos.
PS - If you want serious long-term positions, you need a serious engineer, not a fast talker!
The question is what is a "natural" rate. It is dangerous to assume that a new site getting lots of links is getting them from an SEO. What if that site is getting lots of links because it is newsworthy?
What if my new site has a unique product or service, and AP writes an article that appears all over the place about it? And then word gets out through a media channel, and webmasters link to it because they value it as well?
In this example, it would make sense for Google to react positively to the "unnatural" rate of the new links.
I think the sandbox has more to do with the amount of time that it technically takes Google to digest links.
If I had the choice, I would be willing to have the example site grant described above sandboxed. There are numerous examples of how this would affect relevant sites, but if the site already has strong links it should not need to rely on SEO traffic for the first few months.
Oh, and BTW, what exactly is a blog? Is it something deploying FOAF? Is it something hosted on a common blog domain? Is it something with one or more links to services like a blogroll? Is it something updated daily? Permalinks? TalkBack?
Those are all explicit checks, which are high-maintenance and short-lived - IOW, expensive for Google. Google needs to see statistical trending and make decisions based on behaviors. However, it takes time and knowledge to build those models. That means for the short term they probably do use linear fact checks and decide based on such nonsense, which in turn generates data for the long-haul research.
Sooo... if your site runs the FOAF protocol, they decide you are a blog (short-term ID). Eventually, as they run your growth patterns against trend data, they will once again not care whether you are a blog or not - they will know you by your behavior profile.
(so yes, if you are planting 2000 backlinks overnight, might be a good idea to put some FOAF links in there ;-)
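The two-stage idea above (cheap explicit markers now, behavior profiles later) could be sketched like this. The marker list, the update-frequency heuristic, and the 0.5 threshold are all invented for illustration, not a known Google check:

```python
def looks_like_blog(page_html: str, daily_post_counts: list) -> bool:
    """Two-stage heuristic sketch: explicit markers first,
    behavioral trend second."""
    # Stage 1: linear fact checks - explicit, brittle markers
    # like those named in the post above.
    explicit_markers = ("foaf", "blogroll", "permalink", "trackback")
    if any(m in page_html.lower() for m in explicit_markers):
        return True
    # Stage 2: behavior profile - blogs tend to publish small
    # updates on most days rather than rare bulk changes.
    active_days = sum(1 for n in daily_post_counts if n > 0)
    return active_days / max(len(daily_post_counts), 1) > 0.5

# Explicit marker wins immediately, no history needed:
print(looks_like_blog("<link rel='meta' href='/foaf.rdf'>", []))  # True
```

Stage 1 is cheap but easy to game (hence the joke about planting FOAF links); stage 2 needs months of crawl history but classifies by what the site actually does.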