Welcome to WebmasterWorld
Forum Moderators: martinibuster
I then took pains to demonstrate that Google's own website never condones reciprocal link exchanges as an acceptable activity, and that in *fact* Google never mentions reciprocal links at all. Although Google's website says it's good to gain links, Google never specifies to do it through reciprocal link exchanges. The only time Google mentions link exchanges is in the context of things a webmaster should not do.
Google's website does not state that reciprocal link exchanges are a legitimate webmaster activity. I'm addressing strictly what Google does or does not condone; I'm discussing Google's policy regarding link exchanges.
Now here is a quote from the recent Google Patent Filing:
A large spike in the quantity of back links may signal... attempts to spam a search engine (to obtain a higher ranking and, thus, better placement in search results) by exchanging links, purchasing links, or gaining links from documents without editorial discretion on making links...
So there you have it. Is exchanging links something Google frowns on?
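The patent's "large spike in the quantity of back links" can be made concrete with a toy detector. This is a speculative sketch, not Google's method: it just compares the latest period's new-link count to a trailing average, and both thresholds (`factor`, `min_links`) are invented for illustration.

```python
def is_spike(history, current, factor=3.0, min_links=50):
    """Flag a backlink count as a spike if it exceeds `factor` times
    the trailing average and a minimum absolute threshold.
    All thresholds are illustrative guesses, not Google's values."""
    if not history:
        return False
    baseline = sum(history) / len(history)
    return current >= min_links and current > factor * baseline

# A steady site gaining ~10 links/week, then 200 in one week:
weekly_new_links = [8, 12, 10, 9, 11]
print(is_spike(weekly_new_links, 200))  # spike
print(is_spike(weekly_new_links, 14))   # normal growth
```

The `min_links` floor matters: tripling from 2 links to 6 is noise, not a spike.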
Link development is kinda like that .... it is indeed all about balance.
I think you'll be seeing "natural" tossed around as the newest buzzword
Some of us have been singing that song for ages already.
If someone spent a fair bit of budget online... that might create a "large spike". Is he spamming?
Below is what happens :)
I think another important question is what is the result of building links too fast? Penalty? Slight drop in ranking? Large drop in ranking? Ratio of drop in ranking to how large the amount of links you got? Ban?
There is no penalty. There is no gain. Google automatically post-dates all these links, aka link devaluation. The gain might only get factored in after your links come out of a link sandbox, which at this moment is an unknown.
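The "post-dating" theory above can be modeled in a few lines. Everything here is speculative: the 270-day delay is a placeholder, since the poster says the real sandbox length (if one exists at all) is unknown.

```python
from datetime import date

SANDBOX_DAYS = 270  # placeholder; the real delay, if any, is unknown

def link_weight(discovered: date, today: date, full_value: float = 1.0) -> float:
    """A link contributes nothing until it ages out of the 'sandbox',
    then its full value. Purely a model of the forum theory, not a
    documented Google mechanism."""
    age = (today - discovered).days
    return full_value if age >= SANDBOX_DAYS else 0.0

today = date(2005, 6, 1)
print(link_weight(date(2004, 6, 1), today))  # year-old link counts
print(link_weight(date(2005, 5, 1), today))  # fresh link post-dated to zero
```

Under this model there is indeed "no penalty and no gain" for a burst of new links: they simply sit at zero weight until they age in.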
The emergence of the link-buying market has fuelled this. In discussion with some text link brokers, they warn that links will not give value for a certain period of time. I, for one, never buy text links; I think it's a rip-off, and unless you know what you are doing, you will get very, very little ROI!
Brett is going to release a new post :P
" A successful site in 36 months "
-12 months sandbox
-12 months link penalty
-welcome to ground zero.
In a "link exchange" the two links are likely to appear within hours, or days, of each other.
Given the delays in free link approval, the above could be just one of many possibilities.
I am hearing enough alarm bells from directory owners (not necessarily link directories) to think that directory links are deprecated until proven innocent. It should be easy for SEs to have automated signature recognition that would flag sites as directories, along with additional fields for manual tagging if noted as a link-selling site. There would be no need to check each new link against the global index to see if there is a reciprocal and it could affect many innocent sites, such as the ones mentioned in news stories in hundreds of cities. These would be picked up during a backlink update.
Only "respected" directories (long history, good reputation etc) might be exempt from such a filter.
Sites with a high directory:non-directory ratio could be regarded as candidates for filtering.
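The proposed filter can be sketched as two steps: a crude "signature" test for directory-like pages, then the directory:non-directory ratio check suggested above. The signature heuristic and the 0.6 cutoff are invented for illustration only.

```python
def looks_like_directory(page_links: int, page_words: int) -> bool:
    """Crude 'signature': many outbound links, little surrounding copy.
    Both thresholds are made up for this sketch."""
    return page_links >= 20 and page_words / max(page_links, 1) < 15

def candidate_for_filter(pages, ratio_cutoff=0.6):
    """Flag a site whose directory-like pages dominate its content.
    `pages` is a list of (outbound_links, word_count) per page."""
    directoryish = sum(1 for links, words in pages
                       if looks_like_directory(links, words))
    return directoryish / len(pages) > ratio_cutoff

# (links, words) per page: mostly link lists with thin copy
link_directory = [(50, 300), (80, 400), (40, 250), (5, 900)]
news_site = [(10, 1200), (8, 950), (30, 2000)]
print(candidate_for_filter(link_directory))  # True
print(candidate_for_filter(news_site))       # False
```

As the post notes, this needs no check against the global index for reciprocals; it runs on the site's own pages during a backlink update, with manual tagging reserved for confirmed link sellers.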
Similarly, an old site need not be more kosher than a new one. Some of the old but minor FFA sites might still be around. I haven't checked thoroughly, but the sites devoted to the recent tsunamis would be good ones to check because they would have a high proportion of reciprocal linking for the noblest of reasons.
If you wanted to detect unnatural link bloom, one way would be to look for an increase in links without an increase in search volume. The greater the difference the more spam like it looks. This would also help them know when to deliver fresher documents as opposed to stale ones.
<tin foil hat theory>You could break searches down into categories (think a Dewey-decimal-like system) and establish more accurate baseline measurements for each category. Searches for people like Britney or Paris would naturally be spiky; searches for euclidean geometry would be relatively flat.</tin foil hat theory>
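The bloom idea above, tin-foil category baselines included, could be sketched like this: score how far link growth outruns search-interest growth, then flag against a tolerance that spiky categories get to widen. Every function name and number here is invented for illustration.

```python
def bloom_score(link_growth: float, search_growth: float) -> float:
    """Ratio of link growth to search-interest growth over the same
    period; the larger the gap, the more 'unnatural' the link bloom
    looks (per the forum theory). Growth rates are fractions, e.g.
    9.0 means +900%."""
    return (1 + link_growth) / (1 + max(search_growth, 0.0))

def looks_spammy(link_growth, search_growth,
                 baseline_volatility=1.0, cutoff=3.0):
    """Spiky categories (celebrities) get a higher tolerance than flat
    ones (euclidean geometry) via `baseline_volatility`."""
    return bloom_score(link_growth, search_growth) > cutoff * baseline_volatility

# Links up 900% while searches are flat -> suspicious
print(looks_spammy(9.0, 0.0))
# Same link growth during a genuine surge in searches -> plausible,
# and a hint to serve fresher documents
print(looks_spammy(9.0, 8.0))
```

The second case also covers the freshness point: when searches and links rise together, the model reads it as real-world interest rather than spam.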
I think recips, or more specifically recip pages, are going to be "so 5 minutes ago". For example, if I put up some amazing content on my website and got a link from the New York Times, would I be devaluing it by putting up a link back to the Times website on my home page? Let's face it, from a spider's point of view a recip page is pretty easy to pick out. The format and link-to-content ratio is probably fairly consistent across different sites on the web. However, if the links were placed inside articles or even relevant blocks of copy, it would look less like a reciprocal link directory.
My two drachmas.
And for the record, I SEO many sites that have aggressive reciprocal linking strategies and all enjoy top 10 listings for HIGHLY competitive keywords. Until someone can PROVE that reciprocal linking doesn't work, then I will continue to use it to get high rankings.
So what are some rough guidelines we can put together?
I have been sending out 10 link exchange requests for each one of my sites every day for a little while. What this usually equates to is 2-3 new link partners for every 10 sent out.
This site has been around since late February. What are your thoughts?
All the Best,
So now all you have to do is get your site listed in DMOZ, and you get a very large spike of backlinks as many others try to replicate the directory. Also, get on the first page of MSN Search for your keyword; MSN offers an RSS feed for the top 10 sites returned. Get your site listed in the Yahoo Directory and get another spike. So a competitor can't do anything bad to your site, but another search engine position can? The funny part is that all the newly created links mentioned above will be part of pages that carry Ads by Goooooogle. It's like saying the only link useful to our searcher, one that contains good information, is the one we say it is; if you think otherwise, pay us 35 cents per click, and as long as you do, it'll stay useful.
If you build a site, let's say with a datafeed from "the largest online bookstore", use mod_rewrite and place the same page footer on each page of the site, you will end up with about 30,000 NEW pages from a NEW site spidered in less than a week.
So it seems to me that a high peak is this: several thousand links from one site to another.
As far as I understand, this situation is considered a peak.
Having said this, we can discuss whether you will receive a penalty or whether Google will simply ignore these links, which is what I guess happens.
Rate of acquisition above a "natural" acquisition rate based on a bell curve would be a red flag. The acquisition rate could be tempered by spikes or growing traffic on that term. It's all speculation, but not that hard to comprehend. Jumping from 0 to 2,000 links in a week is not a natural thing, unless the content is so pandemically viral that it explodes.
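The bell-curve red flag above can be sketched as a z-score test on weekly link counts, with the tolerance widened by traffic growth on the term. As the post says, this is all speculation; the cutoffs below are invented.

```python
from statistics import mean, stdev

def acquisition_red_flag(weekly_links, z_cutoff=3.0, traffic_growth=0.0):
    """Flag the latest week if it sits far out on the site's historical
    'bell curve' of link acquisition, tempered by growing traffic on
    the term. All thresholds are speculative."""
    history, latest = weekly_links[:-1], weekly_links[-1]
    mu, sigma = mean(history), stdev(history)
    z = (latest - mu) / max(sigma, 1e-9)
    # Genuine rising interest raises the tolerance before flagging
    return z > z_cutoff * (1 + traffic_growth)

jump = [10, 12, 9, 11, 10, 2000]  # a 0-to-2000-style jump in one week
print(acquisition_red_flag(jump))                       # flagged
print(acquisition_red_flag(jump, traffic_growth=1000))  # 'pandemically viral': tolerated
```

The same 2,000-link week is a red flag on a quiet term but survives when search traffic exploded alongside it, which is the tempering the poster describes.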
Reciprocal links from "unrelated" topics are not as helpful as those from related topics. Too many from unrelated topics become hurtful.
There are several "topical" linkfarms that still are very powerful at Y and MSN, but totally disregarded at G. Once again G is leading the pack in determining that not all links are created equal.
Lots of good arguments in this thread. I believe that at this stage, most of the Google algo is autogenerated by the algo itself - identifying and reporting statistical anomalies for human review, and incorporation into the algo.
As far as testing the algo's, the toolbar is a natural - on the fly.
IMHO, the goal of the algo is to build artificial intelligence that sees the value of a page the same way a human does; it's just that the computation is too large, and requires the algo to do most of the work itself, with feedback from the toolbar.
Content, old-fashioned on-page optimization, and backlinks from sites that approve of your content is the way to go. The shortcuts (buying, linkfarms, random reciprocals) are all short-term fixes, and just like every other "spammy" or blackhat tactic have short-term effects, and will eventually be discounted or penalized.
Long term and bulletproof, or short term and targeted for extinction.
IMO, SEO is more about compliance and good content building in service to the user than about discovering shortcuts and workarounds. Leave that to the fly-by-nighters.