Forum Moderators: martinibuster
Options seem to be:
Use a robots.txt file to disallow spidering of your links page
Use JavaScript to cloak the links
Use a CGI "tracking" redirect to make it harder for spiders to follow the links
Use a robots meta tag on the links page to prevent spidering/following
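For illustration, the robots.txt and meta tag options above might look something like this (the links-page filename is an assumption, not something given in the thread):

```
# robots.txt — keep compliant spiders off the links page
User-agent: *
Disallow: /links.html
```

```html
<!-- In the <head> of the links page itself -->
<meta name="robots" content="noindex,nofollow">
```

Note that robots.txt only keeps the page from being crawled, while the meta tag asks engines not to index it or follow its links; the two are not interchangeable.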
...provided you prevent Google from seeing your own outbound links page(s), all your inbounds should start to look like one-way links.
A tad naughty, as it is not fair on the sites linking to you, of course!
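The CGI "tracking" option mentioned above usually means routing outbound links through a redirect script, so the links page contains no direct hrefs for a spider to follow. A minimal sketch in Python (the script name and `url` parameter are assumptions, not details from the thread):

```python
#!/usr/bin/env python3
# Illustrative CGI redirect: outbound links point at
# /cgi-bin/go.py?url=http%3A%2F%2Fpartner.example%2F instead of
# linking to the partner site directly.
import os
import urllib.parse

def redirect_response(query_string: str) -> str:
    """Build an HTTP 302 redirect to the 'url' query parameter."""
    params = urllib.parse.parse_qs(query_string)
    # Fall back to the site root if no target was supplied.
    target = params.get("url", ["/"])[0]
    return "Status: 302 Found\r\nLocation: %s\r\n\r\n" % target

if __name__ == "__main__":
    # A CGI server passes the query string via the environment.
    print(redirect_response(os.environ.get("QUERY_STRING", "")), end="")
```

In practice you would also want to validate the target against a whitelist so the script cannot be abused as an open redirect.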
I think this just isn't right, given the mutual benefit a link exchange partnership is supposed to offer.
But... how about when www.domain.com and directory.domain.com are on different IPs?
As Marcia said, penalization and devaluation are two different things.
Here's an idea: rather than trying to trick other webmasters, either don't use recip linking and focus your efforts on other types of links, or only recip link when it makes *sense* regardless of the engine algos.
That's exactly right. Focus on other types of links, and only do recips with related sites that offer great content you don't... period. The key is one-way links, and you can get them by paying for them or by writing articles and offering them to other sites in return for keeping the links to your site embedded in the article. That works very well!