In fact I believe in it so much that I recently knocked up a fully automated cross-linking package, which I have just put up on [snip url - see profile] (see links in the toolbar). After a few teething problems during URL spidering, it seems to be working fine.
Oh, and another thing... when someone places a link to your site, you have to register that page with the search engines to pump up your popularity as quickly as possible.
(edited by: paynt at 7:44 pm (utc) on Jan. 18, 2002)
At the same time, you yourself need to be wary of this kind of system. It is very simple to cloak one's page so that your automated spider sees the links but the user (or Google) does not. I did it for linkstoyou when that was still an effective means of pushing link popularity on new sites.
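To make ggrot's point concrete, here is a minimal sketch of the kind of user-agent cloaking he describes. The bot name `LinkCheckBot` and the URLs are hypothetical, purely for illustration: the server shows the reciprocal link to the partner's automated spider but hides it from everyone else, so an automated check alone proves nothing.

```python
# Hypothetical user-agent cloaking sketch: serve different HTML
# depending on who is asking. "LinkCheckBot" stands in for the
# partner's automated link-verification spider.

LINK_HTML = '<a href="http://example.com/partner">Partner site</a>'

def render_page(user_agent: str) -> str:
    """Return different HTML depending on the requesting user agent."""
    if "LinkCheckBot" in user_agent:  # the partner's automated spider
        return f"<html><body>{LINK_HTML}</body></html>"
    # Real users (and Google) never see the link.
    return "<html><body>No link here.</body></html>"

# The automated link-checker is satisfied...
assert "partner" in render_page("LinkCheckBot/1.0")
# ...but a browser (or Google's crawler) never sees the link.
assert "partner" not in render_page("Mozilla/5.0")
```

This is exactly why a purely automated reciprocal-link check can be fooled: the response depends on who is asking.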
I'm afraid ggrot's word of warning goes even a little further. Here's what we're told according to Google webmaster information [google.com]:
...setting up pages/links with the sole purpose of fooling search engines may result in permanent removal from our index. If you think your site may fall into this category, you might try 'cleaning up' the page and sending a re-inclusion request to firstname.lastname@example.org. We do not make any guarantees about if or when we will re-include your site.
Unfortunately, many have been around that bend before, with some very sorry consequences.
Thank you for the good intentions, but a good read around the board can be very convincing: do it the old-fashioned way and seek out quality, relevant linking partners. It may be time consuming, but it still works and doesn't carry the high risk factor with it.
The way I overcome this is by parsing the spidered HTML source for HTML comment tags, and I also monitor changes in pages that link to my site. I am automatically notified if a page changes, or if a possible fraudster is trying to link to me.
The last step in my software is also a manual check, since a) I want to be sure that the link to my page can be seen, and b) I won't accept links that are not related to my web site.
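The comment-tag check described above can be sketched with Python's standard-library HTML parser (the function names and URLs here are illustrative, not the actual software). The parser delivers comments separately from tags, so a link hidden inside `<!-- ... -->` never reaches `handle_starttag` and is not counted as a real link.

```python
# Rough sketch of a comment-aware reciprocal-link check.
# html.parser does not re-parse the contents of HTML comments,
# so commented-out links are automatically excluded.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every real (uncommented) <a> tag."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def link_present(html: str, my_url: str) -> bool:
    """True if my_url appears as a real, uncommented link in html."""
    collector = LinkCollector()
    collector.feed(html)
    return my_url in collector.hrefs

# A visible link counts; a link inside an HTML comment does not.
page = ('<a href="http://mysite.example">my site</a> '
        '<!-- <a href="http://hidden.example">hidden</a> -->')
assert link_present(page, "http://mysite.example")
assert not link_present(page, "http://hidden.example")
```

In practice one would feed this the spidered page source; a stored hash of each partner page could then drive the change notifications mentioned above.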
Actually, the purpose of writing the link software in the first place was that I was fed up with cross-linking with people and then visiting their site a few months down the line, only to find that the link isn't there anymore!
I understand and accept Google's comment about 'spam links', but I actually would like to provide a service to my users as well as increase traffic to my site. The only way this can be done is if I have an efficient way of managing the links.