Forum Moderators: martinibuster
You can still run a risk, or at the very least, the search engines will not rank you as well. My personal opinion is that search engines are smart enough to value links from many different websites more highly than links from a limited set.
To create an effective network of sites, you will have to differentiate the backlinks for many of the websites, a big task for thirty. Closely interlinked websites have been getting whacked for years, and there have been many of them caught within the last few months.
I have a 'resources' page and also drop links in occasionally where appropriate for the visitor.
I am sure Google would expect people with multiple sites to promote them, and therefore allows some latitude.
If the links are numerous or exist simply to manipulate the SERPs, then....
Regards
Rod
about 30 sites (related topic but different content on each) that I would like to cross-link to each other in order to increase the link popularity of the sites?
So, are you trying to create an authority site on a main topic with these other 30 sites? I am curious what the driving motive of those 30 sites would be...
It sounds like something a competitor accomplished, and ranked well with for a while. They got caught, it appears.
<nickel's worth>
If you're going to go to the trouble, use 30 different hosts. It's got to be easier than developing independent backlinks for 30 sites - which is what I would focus on doing.</nickel's worth>
You can't get PR out of nothing; so you are going to have to get links from sites other than your own, anyway. Might as well get those outside links before you link to your own.
Same hosting company is OK.
There's probably no difference for the visitor if they offer unique content. Big difference for Google.
The problem happens when you have a closed network. To avoid this, it's important to cultivate varied backlinks for each of the sites, or as many as you can. It's all about the linking pattern and has very little, if anything, to do with different servers or ip ranges.
I recall a webmaster who built a huge network of hundreds upon hundreds of websites that were all interlinked one to the other. That network was a monster. But last spring they hit a brick wall and bit the dust.
Build your own directories. Build websites that are resources and will attract links. Create a network of websites with differentiated backlinks. Then you can point them to your websites for a lift.
But unless you have a huge network, you may still need to link dev your websites.
Somebody please enlighten me about the fundamental difference between me having a site on widgets linking to a site on widget painting that I also created, and me having a site on widgets linking to a site on widget painting created by Jane Doe?
If you think of links as references or recommendations... which is what Google does... it becomes conceptually easier.
Suppose you're looking for a qualified electrician. All other things being equal, whom would you hire?...
a) someone whose recommendations all come from himself ("hey, I'm great. Hire me.")?
b) someone whose mother and father and brother say he's great?
c) someone who has references only from businesses and people you've never heard of and no one else has either?
d) someone who has references from the highly respected state electrical contractors' association (which has lots of other references/links from building related organizations), and many references from prominent builders that frequently use electricians?
When the sites are on the same server, Google assumes they might be related. If they're on the same server and link to each other a lot and are about the same subject, it looks likely that they probably are related... and thus that their references or links might not be completely objective.
You'll likely get further in the long run if you structure your content and invest your efforts to maximise your chances of gaining outside links, not just for SEO reasons, but for traffic. That could easily mean more than one site, but I suspect it would be something less than thirty.
While G can appear relaxed about cross/interlinking, run of site reciprocals or even bought links, IMO it's a cumulative thing, and it can be hard to resist the temptation to take it that one step too far.
Not forgetting Y! can take a more serious view of it...
Seriously though - I just assume that Google has something at least this intelligent and then go about imagining how an artificial vs. natural linking pattern would look to it.
Would your network of sites look like this?:
[nasa.gov...]
Or would it look like this?:
[mapquest.com...]
In my imaginary system you'd gain points for links to outside authorities and dangling nodes and lose points for links that feed back more than x% into the system. Lose too many points and you fall on the "spam" side of the bell curve and your network gets nixed.
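Just to make the thought experiment concrete, here's a rough sketch in Python of that "x% feedback" idea. Everything here is hypothetical - the site names, the 70% cutoff, and the scoring rule are all made up for illustration, not anything Google has confirmed:

```python
# Hypothetical sketch of the "imaginary system" above: measure what fraction
# of a network's outbound links feed back into the network itself.
# Domains and the 70% threshold are invented purely for illustration.

def internal_link_ratio(network: set, links: list) -> float:
    """Fraction of the network's outbound links that point back into the network."""
    outbound = [(src, dst) for src, dst in links if src in network]
    if not outbound:
        return 0.0
    internal = [(src, dst) for src, dst in outbound if dst in network]
    return len(internal) / len(outbound)

# A toy 3-site network with one link out to an outside authority.
network = {"widgets.example", "widget-painting.example", "widget-repair.example"}
links = [
    ("widgets.example", "widget-painting.example"),
    ("widget-painting.example", "widgets.example"),
    ("widget-repair.example", "widgets.example"),
    ("widgets.example", "nasa.gov"),  # dangling node / outside authority
]

ratio = internal_link_ratio(network, links)
print(f"{ratio:.0%} of links feed back into the network")
if ratio > 0.70:  # the "x%" cutoff is purely a guess
    print("Looks like a closed network -> loses points in this toy model")
```

In this toy example 3 of 4 links stay inside the network, so it trips the (invented) threshold - which is exactly the closed-network pattern earlier posts say gets whacked, and why varied outside backlinks per site change the picture.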
Of course it might just be easier to snoop which sites you visit with the toolbar and then correlate that with C-classes, linking patterns, and Whois data - but where's the fun in that?
If you are going to do this make sure you:
* use different hosts.
* get 'outside' links to the sites for the first six months before you link them all to the primary.
I own a small web design and hosting company. On all sites we design and most that we host, a small link is placed back to our main website.
1) Is this a bad thing?
2) Should I place a link on EVERY page of a site we create or just the index page of every site that we create?
Thank you for your suggestions.
Khem