Forum Moderators: open
Due to the nature of the site, it makes sense to deep link to three of our other sites from various sections / pages.
This would be one way linking not reciprocal.
We have our own server but I read time and time again that if you're going to get incoming links they should be from different IP ranges.
Obviously we can't divide the server in two, so this creates a bit of a problem as we require an additional hosting location.
Has anyone got any evidence to support the claim that incoming links are much more beneficial from different IP ranges?
Or any evidence to suggest that linking in this way could be damaging to us?
Many Thanks
I think to an established site these links would not cause a problem (although they might not give much benefit, either).
However with Google nowadays, I would think this sort of link structure might get you 'sandboxed' - it seems any 'unnatural' link structure gets you sandboxed.
I would suggest getting some backlinks from unique IPs (maybe from some directories, and backlinks from press releases), let them be awhile, and then implement these links. "For some odd reason" new sites' backlink structures are just touchy in Google.
As described, and on a stand-alone basis, I would doubt it.
*incoming links are much more beneficial from different IP ranges*
Apart from the accepted wisdom that incoming links are worth more than internal ones, the only possible benefit would be a smaller chance of the links being seen as "affiliated" as per the Hilltop/LocalRank computations.
That's somewhere down the line though, and would probably also depend on the overall linkage pattern.
Over the last month, I have been moving my sites out to web hosters on the net and my SERP results have improved.
I believe Hilltop is here and penalises (ignores) links from the same IP addresses (or C-class).
Apart from the accepted wisdom that incoming links are worth more than internal ones, the only possible benefit would be a smaller chance of the links being seen as "affiliated" as per the Hilltop/LocalRank computations.
Yet Google is filtering out massive amounts of links based on Hilltop/LocalRank, so I think this "possible benefit" is significant.
Some of my sites have been in press releases from PR8 and PR9 sites (because of the so-called unique content of my sites) and I am still at PR5 and 6, so...
You need to look at the permalink for your press release. If the main page of the press release site is PR8, and your release is on there for a day but then permanently moved to a PR2 "permalink", then you have a PR2 link. It's still a relevant link.
I'll take a PR6 from relevant links over a PR8 from unrelated bought links any day...
I am using a dedicated server that hosts many of my sites
and PR seems to have passed OK...
I wouldn't class "PR passing" as an indication that the link is stable. Google recognises many different types of links and I don't understand how anyone can judge if a link is passing PageRank unless they don't have any other links pointing to them.
However with Google nowadays, I would think this sort of link structure might get you 'sandboxed' - it seems any 'unnatural' link structure gets you sandboxed.
Eh? Has "sandboxed" become the code-word for anything that you can't think of a word for? Besides that, many companies have networks of sites that link to each other. Done right it can really benefit but never as much as distant IP addresses - (Note that's DISTANT, not DIFFERENT).
AFAICR
AFUBR
Could you please explain both of these for the benefit of those of us who don't spend our lives in chat-rooms and in forums?
Are you seeing this in a particular sector NDK?
Mesothelioma and other competitive SERPs.
Eh? Has "sandboxed" become the code-word for anything that you can't think of a word for? Besides that, many companies have networks of sites that link to each other. Done right it can really benefit but never as much as distant IP addresses - (Note that's DISTANT, not DIFFERENT).
I have an exact meaning, thank you. When I say "sandboxed" I mean a new domain with "unnatural" incoming links is seemingly penalized to the bottom of the SERPs (it is not greybarred, though). Older domains with these same "unnatural" links aren't penalized.
Plenty of sites on the same IP do link to each other (with no harm), but I am stating that the SEO benefits of these links will be filtered. They should yield other benefits if they make sense for your users. And for new (< 6 months old) domains, I think this will greatly increase your chance of being "sandboxed".
Does this make sense according to the Hilltop paper? No. But Google is not behaving according to the Hilltop paper. IMO they are about 2 years past that, and have added a whole bunch of "SEO filters" into the algo.
I have an exact meaning, thank you. When I say "sandboxed" I mean a new domain with "unnatural" incoming links is seemingly penalized to the bottom of the SERPs (it is not greybarred, though).
Brett and some Mods have all requested at some point that people stop using the term "sandboxed" to refer to the "ranking lag" experienced by some webmasters. Sandbox is the term used for experimental projects by both Google and MSN. I haven't got a clue why someone started using the word sandbox to describe what you are alluding to but it is an incorrect term. That's all I was saying, I wasn't commenting on your intelligence or anything ...
Webhound - are you in the UK? For some reason, every UK webmaster agrees with me that the commercial sectors are full of spam where the US ones say they are fine. This would go in line with the recent thread on German Google being "left out in the cold".
Brett and some Mods have all requested at some point that people stop using the term "sandboxed" to refer to the "ranking lag" experienced by some webmasters. Sandbox is the term used for experimental projects by both Google and MSN. I haven't got a clue why someone started using the word sandbox to describe what you are alluding to but it is an incorrect term.
I understand that the term "sandbox" "shouldn't" be used here - but it is the word that is commonly used and understood to be this 'time lag' phenomenon. If the word is commonly used, and commonly understood (which it is), then it becomes "correct" -- even if originally another word "should" have been chosen.
I have heard many people voice the same thing you just have, but until another word is commonly used to mean this 'time lag' phenomenon on new domains, "sandbox" is the best way to convey my meaning.
I also find it a bit, ermm, arrogant to try to change commonly used language - remember when Wired Magazine tried to change "Internet" to "internet"? Didn't work out so well...
That's all I was saying, I wasn't commenting on your intelligence or anything ...
Fair enough.
This past summer, I optimized two domains in the exact same way (heavy link building campaigns) - the 5-year-old one, with a two-year-old DMOZ link, is doing great. The brand new one is "sandboxed" (or, "it ranks towards the bottom of the SERPs without being greybarred, and the majority of its backlinks do not seem to count for anything").
Note that each site was legitimate, independent and had wholly unique content. The only difference was the age of the domain and the DMOZ link.
I guess my entire point here is, be careful with your link building. A DMOZ link or link from an authority site can give your site some "legitimacy" before doing anything "grey area", linkwise. And I would definitely be careful with backlinks' IPs with a new site. A long established site doesn't seem to need to be careful.
The categories I am referring to that are spammed up real bad are prescription medication, credit repair and contact lenses. All highly competitive categories.
We have worked really hard to build deep, useful sites for our visitors and link them in a way that we think wouldn't be hurting us, yet every month for the last 9 months we've lost Google positioning. Yet we see other sites with blog backlinks, template designs, template content, etc. ranking on the first page. It defies logic. So whatever SEO filter G has in place, it's broken... at least in the above categories.
I have two options available to me at the moment:
1) Host on a completely different network
2) Host on the same network but with different subnet and host IP addresses
I'd really like to go for option 2 as we're very happy with our current host.
Would different subnet and host IP addressing be enough to separate the sites?
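For what it's worth, when people in these discussions say "same C-class" they usually mean the first three octets of an IPv4 address (i.e. the same /24 block), which is what a filter could check cheaply. A minimal sketch of that comparison, assuming the /24 interpretation (the function name and sample addresses are illustrative, not anything Google has documented):

```python
import ipaddress

def same_c_class(ip_a: str, ip_b: str) -> bool:
    """Return True if two IPv4 addresses fall in the same /24 ("C-class") block."""
    # strict=False lets us pass a host address rather than a network address
    block_a = ipaddress.ip_network(f"{ip_a}/24", strict=False)
    return ipaddress.ip_address(ip_b) in block_a

# Same /24: only the last (host) octet differs
print(same_c_class("192.0.2.10", "192.0.2.200"))  # True
# Different /24: the third octet differs, even on the same provider
print(same_c_class("192.0.2.10", "192.0.3.10"))   # False
```

Under that interpretation, option 2 would be enough only if your host can actually allocate the sites into different /24 blocks, not just different host IPs within the same one.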