Dirty Tricks

What tricks to look out for from "reciprocal" link partners.

     
9:02 pm on Oct 7, 2005 (gmt 0)
New User, 10+ Year Member, joined: Apr 5, 2005, posts: 36, votes: 0


I'd like your thoughts on how to identify any "dirty tricks" in reciprocal link exchange.

After checking a few of my reciprocal links today, I spotted several dirty tricks used by my partners that essentially turn my link into a one-way link to their site.

What do I check for to verify a potential link partner is exchanging links with me ethically?

Here are some examples (a scripted check for several of these is sketched after the list):

- noindex in the page's robots meta tag
- rel="nofollow" on the href of my link
- using JavaScript to mimic an href link
- using server settings to serve a 404 (Not Found) or 302 (Object Moved) to the search engine when it tries to crawl the page
- disallowing the links page, or the directory it's in, via robots.txt (not sure how to check this one)
- from reading other posts, it seems you can also cloak the links page (not sure how to check this either)
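
Here's a rough sketch of how the first few of these could be scripted in Python (assuming the requests and beautifulsoup4 packages; the URLs and domain names below are placeholders, not real partners):

    # Rough checks against a partner's links page: HTTP status, noindex,
    # nofollow on my link, and javascript-only links.
    # Assumes: pip install requests beautifulsoup4 -- URLs below are placeholders.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    MY_DOMAIN = "my-site.example"                       # the domain they agreed to link to
    LINKS_PAGE = "http://www.example.com/links.html"    # their reciprocal links page

    resp = requests.get(LINKS_PAGE, timeout=10)
    print("HTTP status:", resp.status_code)             # anything other than 200 is suspicious

    soup = BeautifulSoup(resp.text, "html.parser")

    # noindex in the robots meta tag
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    if robots_meta and "noindex" in robots_meta.get("content", "").lower():
        print("Links page is marked noindex")

    # is my link present, and is it nofollowed?
    for a in soup.find_all("a", href=True):
        href = urljoin(LINKS_PAGE, a["href"])
        if MY_DOMAIN in href:
            rel = " ".join(a.get("rel", [])).lower()
            print("Found my link:", href, "(nofollowed)" if "nofollow" in rel else "(plain link)")

    # javascript "links" that mimic a real href
    js_links = [
        a for a in soup.find_all("a")
        if a.get("href", "").lower().startswith("javascript:") or a.get("onclick")
    ]
    if js_links:
        print(len(js_links), "anchors use javascript: or onclick instead of a real href")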

Are there other methods I don't know about that I should check for? Is there an easy way to verify that reciprocal links are ethical, without having to scrutinize the HTML or server responses?

Thanks for your help!

9:27 pm on Oct 7, 2005 (gmt 0)
Senior Member (WebmasterWorld Senior Member, 10+ Year Member), joined: Nov 13, 2004, posts: 1425, votes: 0


Some will go so far as to mask their phony .php etc. links with an onmouseover handler that shows a realistic-looking URL in the browser's status bar. I wouldn't consider linking to an outfit like that.

The main thing is to look at the source code of the link-back page. -Larry
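
If eyeballing the source gets tedious, a rough way to flag that particular stunt is to look for anchors carrying mouse-event handlers, or hrefs that run through a local redirect script. A sketch in the same vein as the one above (the attribute names and filename patterns checked are just common examples, not an exhaustive list):

    # Flag anchors masked by mouse-event handlers, or whose href runs through
    # a local redirect script instead of linking out directly.
    import requests
    from bs4 import BeautifulSoup

    LINKS_PAGE = "http://www.example.com/links.php"     # placeholder

    soup = BeautifulSoup(requests.get(LINKS_PAGE, timeout=10).text, "html.parser")

    for a in soup.find_all("a", href=True):
        handlers = [attr for attr in ("onmouseover", "onmousedown", "onclick") if a.get(attr)]
        looks_like_redirect = any(p in a["href"].lower() for p in ("redirect", "out.php", "go.php"))
        if handlers or looks_like_redirect:
            print("Suspicious:", a["href"], "| handlers:", handlers, "| text:", a.get_text(strip=True))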

9:37 pm on Oct 7, 2005 (gmt 0)
Senior Member (WebmasterWorld Senior Member, 10+ Year Member), joined: Jan 28, 2003, posts: 1978, votes: 0


Some sites cloak their links pages: humans see your link, but the bots are served a different version of the page, one that doesn't contain it.

The only real way to catch this (sort of) is to analyze the backlinks of a site they're already linking to, and make sure those links show up and that site is getting credit for them.
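
A crude extra check is to request the links page twice, once with a browser user-agent and once with a spider's, and compare what comes back. It won't catch cloaking done by IP address, but it's cheap to try. Something like this (Python again; the user-agent strings and URLs are only examples):

    # Compare what a browser sees with what a spider-identifying request sees.
    # Note: cloaking done by IP address will still slip past this check.
    import requests

    LINKS_PAGE = "http://www.example.com/links.html"    # placeholder
    MY_DOMAIN = "my-site.example"

    # allow_redirects=False so a 302 aimed only at the spider shows up as a 302 here
    browser = requests.get(LINKS_PAGE, timeout=10, allow_redirects=False,
                           headers={"User-Agent": "Mozilla/5.0 (Windows; U; Windows NT 5.1)"})
    spider = requests.get(LINKS_PAGE, timeout=10, allow_redirects=False,
                          headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"})

    print("Browser view:", browser.status_code, "| my link present:", MY_DOMAIN in browser.text)
    print("Spider view: ", spider.status_code, "| my link present:", MY_DOMAIN in spider.text)
    # A 404 or 302 served only to the spider, or a link that vanishes in the
    # spider view, are exactly the tricks described earlier in the thread.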

8:57 pm on Oct 16, 2005 (gmt 0)
Senior Member (WebmasterWorld Senior Member, 10+ Year Member), joined: Apr 15, 2003, posts: 920, votes: 16


Forgive me for reviving a relatively old thread...

The worst dirty trick I've seen recently is people using their root URL with a query string as their links directory, as in http://www.example.com/?cat=widgets&page=1492

The Google Toolbar database apparently canonicalizes all such URLs and displays the root URL's PageRank for these pages, even though their actual PageRank value is... well... who knows? The pages never seem to be cached, and the sites that play this little game always seem to have thousands of pages, so it's hardly worth trying to find out whether these pseudo-link pages are even getting indexed. It just rubs me the wrong way when I see these scams.

9:21 pm on Oct 16, 2005 (gmt 0)
Preferred Member, 10+ Year Member, joined: Jan 26, 2003, posts: 371, votes: 0


disallowing the links page, or the directory it's in, via robots.txt (not sure how to check this one)

Simply view http://www.example.com/robots.txt and check if there is a "Disallow" statement for the links page (or the directory it is in).
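
If you'd rather script it, Python's standard library will even do the Disallow matching for you (a minimal sketch; the URLs are placeholders):

    # Ask the partner's robots.txt whether a crawler may fetch the links page.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")      # placeholder
    rp.read()

    print("Links page crawlable by Googlebot:",
          rp.can_fetch("Googlebot", "http://www.example.com/links.html"))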

9:25 pm on Oct 16, 2005 (gmt 0)
Preferred Member, 10+ Year Member, joined: Jan 26, 2003, posts: 371, votes: 0


Are there other methods I don't know about that I should check for?

Here's another one I just thought of: if other pages of the site link to the links page, make sure these links don't contain a rel="nofollow"! (I suspect that would give the links page a PR 0; that would obviously be a telltale sign...)
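
If you want to check that without clicking through the whole site, something along these lines would do it (same Python assumptions as earlier in the thread; the handful of pages inspected is just an example):

    # Check whether the partner's own pages link to the links page at all,
    # and whether those internal links carry rel="nofollow".
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    LINKS_PAGE = "http://www.example.com/links.html"                 # placeholder
    pages_to_check = ["http://www.example.com/",                     # placeholder sample of
                      "http://www.example.com/about.html"]           # the site's other pages

    for page in pages_to_check:
        soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
        for a in soup.find_all("a", href=True):
            if urljoin(page, a["href"]) == LINKS_PAGE:
                rel = " ".join(a.get("rel", [])).lower()
                print(page, "links to the links page",
                      "with rel=nofollow" if "nofollow" in rel else "with a normal link")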

 
