|Google considering "Age of Link" and "Pace of Link Accumulation"|
While the Google Sandbox has become a familiar phenomenon lately, I was just trying to interpret the purpose it serves and its implications for Google's algorithm.
1) It is quite right that Google gives importance to links that come from relevant web pages.
2) It's also true that the number of such backlinks matters hugely in deciding a site's position on the SERPs.
SANDBOXING is a phenomenon whereby Google keeps a check on sites that are new and grow at an abnormal rate in a very short period of time (in terms of backlinks).
This clearly suggests that, apart from the relevancy of links, the new Google algorithm is also considering the "AGE OF THE LINK" and the "PACE OF ACCUMULATING LINKS" for a webpage.
Any other comments?
Or is it just the age of the webpage?
Google will have calculated an average rate of link popularity growth for new sites.
It will also have a statistical determination of which sites are outliers (grow too fast).
I can see why it would want to sandbox the PR transfer to these outliers.
This method would be a good way to slow down the PR-rich SEOs who can put a site up and have it at PR 7 within a week.
PR is supposed to be an indication of natural popularity / content relevance, and an SEO linking all of their sites to a new site does not mean the content is relevant to the average surfer.
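For what it's worth, the kind of outlier detection being speculated about here can be sketched very simply. The following is purely illustrative (the site names, growth rates, and threshold are all made up, and nothing is known about what Google actually computes): flag any site whose backlink growth rate sits far above the average of its peers.

```python
import statistics

def growth_outliers(growth_rates, threshold=1.5):
    """Flag sites whose backlink growth rate is an outlier.

    growth_rates: site -> new backlinks per week (hypothetical numbers).
    A site is flagged when its z-score exceeds `threshold`.
    """
    rates = list(growth_rates.values())
    mean = statistics.mean(rates)
    stdev = statistics.stdev(rates)
    return {site for site, rate in growth_rates.items()
            if stdev > 0 and (rate - mean) / stdev > threshold}

sites = {"a.com": 12, "b.com": 9, "c.com": 11, "d.com": 10, "new-seo.com": 480}
print(growth_outliers(sites))  # only new-seo.com is flagged
```

A real system would need a more robust statistic than a plain z-score (a single huge outlier inflates the standard deviation), but the principle of comparing each site's growth against a population baseline is the same.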
I thought I was sandboxed, and then I changed to a different web host and my stats went up by about 100x.
Could be totally unrelated, but thought I'd add my thruppence.
This is what Google is doing...
They want to stem the tide of webmasters/SEO firms who are gaming the link relationship/popularity game with ready-made, inflated-PR sites that are only days or weeks old...
The "reality" is that when a new site is made and goes live it takes a while for links to be acquired and calculated...
I think this is a good move and will help to stabilize the SERPs for companies that have been online for a while, working hard to deliver "real" value to the end users (the respective search engine's user base)...
Good move on Google's part...
It's an interesting move and obviously solves some problems.
Unfortunately, for the sandbox not to completely derail the idea of link popularity, it eventually has to go away entirely. Otherwise, catching up to sites that have been around for a long time would be basically impossible, even with the same concerted effort they put in (especially if they put more effort into link building).
So it's sort of a temporary fix, but as people realize this it just becomes another element that seasoned veterans learn to expect and work around, while newbies are astounded by the difficulty of penetrating the upper echelon of the SERPs.
|It will also have a statistical determination of which sites are outliers (grow too fast). I can see why it would want to sandbox the PR transfer to these outliers.|
No it won't. And how would this be helpful? What if I have a great, unique site whose popularity grows ABNORMALLY fast? Will Google put a red flag on it? I don't think so. There is no such thing as pace of link accumulation; you don't have any facts to support it. As for the "age of link": yes, it takes time for links to start counting as they should.
Currently, I'm optimizing my new (two-month-old) site for a competitive blue widget keyword. It's getting links at an abnormally high speed, and Google likes it! I've made it to the second page already.
|What if I have a great, unique site whose popularity grows ABNORMALLY fast? Will Google put a red flag on it?|
Yes, that is what I am saying. Sites that are made and then have all of an SEO's links pointed at them, making them PR 7 in the first week, should be sandboxed, as it is usually these sites that plague the SERPs with spam.
|There is no such thing as pace of link accumulation; you don't have any facts to support it.|
Nor do you have anything to disprove it. We are all making educated guesses from the examples we have seen. It is very possible Google measures pace of link accumulation. If I were them, I would be using the data.
So where are those examples? This is only an assumption, and you know what they say about people who assume...
I hate to be the one to tell you this, but this whole job revolves around assumptions. As Google doesn't publicize its algorithm, we need to make assumptions, test those assumptions, and change them as we see fit.
|This is only an assumption, and you know what they say about people who assume...|
As a scientist, I must add that ALL science uses assumptions. We can't possibly understand the world or the universe in which we live, or solve all of the complex numerical equations that describe their physical processes, without using assumptions. For example, in algebra you may make assumptions about certain parameters in order to simplify an equation and isolate a single variable. That said, there is nothing wrong with assuming and speculating; that is how progress is made. Google's algorithms use assumptions in order to make complex computations tractable. Therefore, the idea that Google may use link accumulation rates in its statistical analyses is perfectly legitimate. Google must make assumptions about "normalcy" based on large amounts of statistical data in order to analyze the web efficiently. If it deems that websites are accumulating links at an "abnormal" rate, then it has made a scientific assumption to the best of its ability based on "typical" accumulation rates. It may or may not be doing this, but the idea is valid.
[edited by: crobb305 at 12:22 am (utc) on June 16, 2004]
I agree with this trimmer....
"Nor do you have anything to disprove it. We are all making educated guesses from the examples we have seen. It is very possible Google measures pace of link accumulation. If i was them, I would be using the data."
I will add this comment: this growth of links and popularity is more than just one item. There would be an algorithm set up to properly combine growth, newness, links, and size, as well as the total number of significant site updates. This all plays into an X in the current Y algorithm calculations. None of us knows the exact formula, but what is important is that you ask yourself, in all seriousness: what would you do if you were Google to stem the spam and the manipulations of great webmasters such as ourselves?
Keep asking yourself this question.
I normally never give hints about how to do things here, as I feel it just fuels the fire of spam. I would also like to say that many of us here will not give out ANY SEO techniques, as it only helps our bad guys and good guys do better than us. We place on the first page for the main SEO terms. It takes a lot of work, but you must be smart, not quick; quick ain't gonna do it. I repeat: stop now if you think you know it all, as you will be slapped for it at some point. If you are on the first page now and think "wow, I am smart," it ain't gonna last long. Do serious thinking and make content work better, not better for the algorithms.
PS - If you want serious long-term positions, you need a serious engineer, not a fast talker!
I agree with photonstudios on this, who, by the way, did offer his two-month-old site as evidence (for what it is worth).
The question is what is a "natural" rate. It is dangerous to assume that a new site getting lots of links is getting them from an SEO. What if that site is getting lots of links because it is newsworthy?
What if my new site has a unique product or service, and AP writes an article that appears all over the place about it? And then word gets out through a media channel, and webmasters link to it because they value it as well?
In this example, it would make sense for Google to react positively to the "unnatural" rate of the new links.
I think the sandbox has more to do with the amount of time that it technically takes Google to digest links.
Note: I have not claimed that Google uses this method.
I have said that I can see that it is
1. Technically Possible
2. A personally desirable methodology to remove spam
If I had the choice, I would be willing to have the example site provided by grant sandboxed. There are numerous examples of how this would affect relevant sites, but if a site already has strong links, it should not need to rely on SEO traffic for the first few months.
The growth of blogging has changed everything. Blogs can go from zero to 2000 backlinks in a blink, quite organically.
Oh, and BTW, what exactly is a blog? Is it something deploying FOAF? Is it something hosted on a common blog domain? Is it something with one or more links to services like blogroll? Is it something updated daily? Permalinks? TalkBack?
Those are all explicit checks, which are high-maintenance and short-lived, IOW expensive for Google. Google needs to see statistical trending and make decisions based on behaviors. However, it takes time and knowledge to build those models. That means, for the short term, they probably do use linear fact checks and decide based on such nonsense. Doing so generates data for the long-haul research.
Sooo... if your site runs a FOAF protocol, they decide you are a blog (short-term ID). Eventually, as they run your growth patterns against trend data, they will once again not care whether you are a blog or not; they will know you by your behavior profile.
(so yes, if you are planting 2000 backlinks overnight, might be a good idea to put some FOAF links in there ;-)
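To make the distinction concrete, a short-lived explicit check of the kind described above might look like the rule-based sketch below. All the feature names here are invented for illustration; nobody outside Google knows what signals, if any, it actually checks.

```python
def looks_like_blog(site):
    """Hypothetical explicit 'is this a blog?' check.

    site: dict of observed boolean features (all names are illustrative).
    Cheap rules like these are easy to apply but just as easy to game,
    which is why behavior-profile trending would eventually replace them.
    """
    signals = [
        site.get("has_foaf", False),       # publishes a FOAF file
        site.get("on_blog_host", False),   # hosted on a common blog domain
        site.get("has_permalinks", False),
        site.get("updated_daily", False),
    ]
    # Call it a blog if at least two explicit signals fire.
    return sum(signals) >= 2

print(looks_like_blog({"has_foaf": True, "has_permalinks": True}))  # True
```

The fragility is obvious: anyone who knows the rules can trip them deliberately (as the wink about planting FOAF links suggests), whereas a behavior profile built from growth-trend data is much harder to fake.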