Forum Moderators: martinibuster
Some of you have had bad neighbors at one time or another. Like the early riser who lops off branches with a chainsaw on the weekend, or the party monster who thinks nothing of blasting her stereo during a school night.
One of the most explicit rules handed down by search engines is the prohibition on linking to bad neighborhoods. Not bad neighbors, but the entire neighborhood. This implies that when you link to a website, you are linking to everyone else they are linking to.
Perhaps one of the troubling issues about defining a bad link neighbor is that a relevant site can be a bad neighbor, and some say that an irrelevant site can be a better neighbor. Oh my! What rules do we use to define the bad neighbors?
Does this include non-relevant sites?
Are certain websites just plain bad by virtue of their topic?
How do you define a bad neighbor?
[edited by: martinibuster at 9:31 am (utc) on Aug. 28, 2006]
In broad terms, a bunch of sites that are trying to 'game google' by mutual promotion, may be treated by Google as a bad neighborhood.
If you exchange links with one or more of those sites, then you may have joined the bad neighborhood.
The more 'suspect links', the greater the risk of being dragged down with them.
The less relevant the links, the more likely the links will be identified by Google, which can spot suspect linking patterns at a distance of 483 metres.
Treat all this with caution, however; Google's advice on linking is clear, and Matt Cutts' blog has provided many amusing examples (and screams and gnashing of teeth have provided yet more).
But no one (except Google) knows the details of the definition, nor exactly how detection works.
I have always promoted my new sites using my existing ones (Just as I'd use my bricks and mortar businesses to promote each other); I do it sparingly, as I do not want to bore my visitors with irrelevant stuff, but I've never had a problem.
From what I've seen of others, some things seem to be an invitation for trouble, such as a collection of site-wide links at the foot of every page (duh!).
I'm sure others will disagree with much of what I've said - especially those who have 'never had a problem'; they may be right, for all I know. But I'd read it as 'never had a problem - yet'. Google has tightened up, and all the signs are they will tighten further as they assess the response to their efforts so far.
[edited by: Quadrille at 9:14 am (utc) on Aug. 28, 2006]
50-plus domains, all linked, all showing zero PageRank, all online for a number of years, and boy did they look desolate.
Is this the kinda thing you mean?
I couldn't quite say why they were like this, all very similar sites
IMO the consequences of linking to a site that happens to be part of a links scheme and to one using hidden links to pharma sites may well be different.
hidden links to pharma sites...
Well, it doesn't have to be pharma sites, it can be hidden links to a travel site. But the only way to find those is to check the source code and look for a hidden div. Saw one yesterday.
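For anyone wanting to do that source-code check without eyeballing every page, here's a rough sketch in Python (the site name and HTML are made up; it only catches inline display:none / visibility:hidden styles, so hiding done via external CSS or JavaScript will slip through):

```python
import re

def find_hidden_links(html):
    """Return hrefs found inside divs whose inline style hides them.

    A quick-and-dirty check: only inline display:none or
    visibility:hidden styles are caught, and nested divs can
    confuse the non-greedy match. Good enough for a first pass.
    """
    hidden = []
    pattern = re.compile(
        r'<div[^>]*style="[^"]*(?:display\s*:\s*none|visibility\s*:\s*hidden)'
        r'[^"]*"[^>]*>(.*?)</div>',
        re.IGNORECASE | re.DOTALL,
    )
    for block in pattern.findall(html):
        hidden.extend(re.findall(r'href="([^"]+)"', block, re.IGNORECASE))
    return hidden

# Hypothetical page with a hidden travel link buried in it.
page = '''<p>Welcome to our site!</p>
<div style="display:none"><a href="http://example-travel-site.com/">cheap flights</a></div>'''
print(find_hidden_links(page))  # ['http://example-travel-site.com/']
```

If the list comes back non-empty for a prospective link partner, that's exactly the red flag being described here.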
What about linking to websites that are less than a year old?
Do you check the content for originality?
How about checking the whois? Is a hidden registration a red flag to you?
Would you link to a site if the domain is regged outside of your country?
I'd probably try a couple of snippets in G, but that's it...
Even more, the way I read the patent, a link that goes directly to a bad "seed site" would be the most trouble, but you might have some issues by linking to a bad "suburb", too. And the further away your links are from the bad core site and neighborhood, the less problematic.
If I'm right about this, then that whole fuss last year about hand evaluations at eval.google.com [webmasterworld.com] is playing into the determination of bad neighborhoods, as well as strong authoritative sites and their trusted neighborhoods.
Another comment: how about looking at the percentage of reciprocal inbound links as an indicator -- especially for well established domains. If a site approaches 100% reciprocals for IBLs, then I think it is a very questionable link partner.
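As a back-of-the-envelope version of that metric, something like this sketch would do (domain names are hypothetical; in practice the inbound set would come from a backlink report):

```python
def reciprocal_ratio(inbound, outbound):
    """Fraction of inbound-linking domains that we also link back to.

    inbound and outbound are sets of domains. A value near 1.0 means
    the link profile is almost entirely traded (reciprocal) links,
    which is the questionable pattern described above.
    """
    if not inbound:
        return 0.0
    return len(inbound & outbound) / len(inbound)

# Made-up example: 4 inbound linkers, 3 of which we link back to.
inbound = {"a.com", "b.com", "c.com", "d.com"}
outbound = {"a.com", "b.com", "c.com", "e.com"}
print(reciprocal_ratio(inbound, outbound))  # 0.75
```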
We found one because a user reported it, and then we reviewed all the links and found 2 more out of several hundred. There had been no negative effects on ranking or search engine traffic that we noticed. However, if there had been 30 links to bad neighborhoods out of a total of 50 outbounds -- then I think it could be a huge problem, even if they had just been innocently misled.
When I review a site to determine if it is worthy of me linking to it, the prime determinant of whether I should link to it is who they are linking to.
If I see that the site links to 100 different Bali hotels or 100 casino sites, and when checking those sites that they too link willy-nilly to anyone and everyone, then that is a bad neighbor.
yahoo.com has little relevance to most of my sites, but I'm very willing to link to Yahoo because Yahoo exercises discretion in who they link to, and so does Google.
TrustRank is very much a backwards iteration of PageRank, it's not who links to you that matters but who you link to. Can your links be trusted? Do you link to sites that link willy-nilly to all sorts of other sites without regard to who they link to?
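For what it's worth, the published TrustRank algorithm (Gyöngyi, Garcia-Molina & Pedersen) is a PageRank variant where the random jump lands only on a hand-picked set of trusted seed sites, so trust flows outward along links from those seeds. Here's a toy sketch of that idea with made-up site names -- not Google's actual implementation:

```python
def trustrank(links, seeds, damping=0.85, iters=50):
    """Trust-biased PageRank: teleportation jumps only to trusted
    seed pages, so trust propagates along links out of the seeds.

    links: dict mapping page -> list of pages it links to.
    """
    pages = set(links) | {q for outs in links.values() for q in outs} | set(seeds)
    trust = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    for _ in range(iters):
        # Base trust: only seed pages receive teleportation mass.
        new = {p: ((1 - damping) / len(seeds) if p in seeds else 0.0)
               for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * trust[p] / len(outs)
                for q in outs:
                    new[q] += share
        trust = new
    return trust

# Hypothetical web: a trusted hub linking to one site, plus an
# isolated pair of sites swapping links with each other.
links = {
    "trusted-hub": ["goodsite"],
    "goodsite": ["trusted-hub"],
    "spam1": ["spam2"],
    "spam2": ["spam1"],
}
trust = trustrank(links, seeds=["trusted-hub"])
print(trust["goodsite"] > trust["spam1"])  # True: no trust reaches the swap pair
```

The link-swapping pair never receives any trust at all, which is the "bad neighborhood" effect in miniature.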
In desperation for links people often swap links with sites that are also just as desperate for links. Another word for this is spam. If you do not exercise discretion in who you link to, then you (and your links) cannot be trusted. This incorporates you into a bad neighborhood.
Good neighbors are those that can be trusted. A bad neighborhood is where no one is trusted. If you'll link to any site then you can't be trusted, you are from a bad neighborhood.
This is not a complicated thing. It is all about discretion in linking.