
Blocking Inbound Links


hgerman

9:33 pm on Mar 28, 2008 (gmt 0)




How do you stop another Website from linking to one of your pages?

Example: Website A places a link to Website B. The Webmaster from Website B doesn't want Website A's link and instead wants to link it to another page. Is there a code that Website B's Webmaster can write on his page to accomplish this?

g1smd

12:02 am on Mar 30, 2008 (gmt 0)




Set up a 301 redirect, but there will be no content visible at the old URL.

penders

1:12 am on Mar 30, 2008 (gmt 0)




You could check the referrer (in website B) - the referrer being the website the user has followed a link from...? If the referrer is website A then jump to another page (return the appropriate HTTP status code, like 301 - moved permanently - as g1smd suggests), display a smiley face or something?!

In JavaScript you could try examining the value of document.referrer; if server-side, then $_SERVER['HTTP_REFERER'] in PHP (note the single-R spelling, inherited from the HTTP Referer header). This is not 100% reliable however - the referrer is supplied by the client and is often empty or stripped.
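As a sketch, the client-side check could be factored into a small function like the one below (the function name and the blocked URL are placeholders, not anything from this thread):

```javascript
// Decide whether a visit should be bounced, based on the referrer string.
// "http://www.bad-url.com" stands in for the unwanted linking site (website A).
function shouldRedirect(referrer) {
    return referrer !== '' && referrer.indexOf('http://www.bad-url.com') !== -1;
}

// In the browser this would be driven by document.referrer, e.g.:
// if (shouldRedirect(document.referrer)) { window.location.href = 'elsewhere.html'; }
```

Keeping the test in a separate function makes it easy to change what happens on a match (redirect, show a different page, etc.) without touching the detection logic.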

How will this affect your SERPs?

hgerman

3:39 pm on Apr 3, 2008 (gmt 0)




I learned how to do it! See below:

<script type="text/javascript">
// If the visitor followed the link from the unwanted site, bounce them back.
if ((document.referrer != '') && (document.referrer.indexOf("http://www.bad-url.com") != -1)) {
    window.location.href = "http://www.bad-url.com";
}
</script>

The bad site that is linking to my Website still has my link on their Website, but when you click on it, the visitor stays on this Website and never comes to mine. I don't want traffic from this site and the script above solves this problem. Thanks to all. However, in the back of my mind, I wonder if I have to worry about any detrimental SEO side effects? Do I have to worry about slipping in the rankings due to this script?

g1smd

6:54 pm on Apr 3, 2008 (gmt 0)




Search engines generally can't see the javascript code, and it won't work for people with javascript turned off.

I have a bad feeling about it, but can't put a finger on it.

Lord Majestic

7:54 pm on Apr 3, 2008 (gmt 0)




Do the same redirect based on the referrer, but use a server-side script so that it works with the search engines too.

penders

11:32 pm on Apr 3, 2008 (gmt 0)




The bad site that is linking to my Website still has my link on their Website, but when you click on it, the visitor stays on this Website and never comes to mine.

I think, strictly speaking, since you are using JavaScript the visitor does in fact arrive at your site: your page (or at least part of it) is downloaded to their browser, and it is only at that stage that the visitor is redirected back again (by the browser).

If, like Lord Majestic suggests, you were able to use a server-side scripting language (such as PHP) to check the referrer then as soon as the request for your page is made (the visitor clicks on that link), you can direct them back again (with an appropriate HTTP Status Code, that search engines should acknowledge). Your page is not downloaded to their browser, no JavaScript required. Just a thought.

I wonder if I have to worry about any detrimental SEO side effects?

Well, as g1smd suggests, I don't think your (JavaScript) code will have any effect on a SE robot. They will follow the link and arrive at your page as if the code wasn't there. But is this a good thing? If you don't want this site to link to yours then presumably you don't want to be associated with the site in any way. If a SE robot is able to follow the link then maybe it will assume there is a connection?!

(If it were a server-side script then this would affect robots as well.)

g1smd

9:57 am on Apr 4, 2008 (gmt 0)




Robots don't "follow" links to other websites, and they do not send a referrer.

They work through a list of documents to download, and the downloaded documents are then sent on to another process which parses them. Among many things, that process extracts any URL references within the page, and simply adds those URLs to the list of documents to be pulled. The bot will then download them a few hours to days later.

penders

10:50 am on Apr 4, 2008 (gmt 0)




Robots ... do not send a referrer.

Ah, thanks for the clarity! So even a server-side script (based on the referrer) won't filter out robot traffic that originated from website A (that is, where the robot found the link on website A). Only user-navigated traffic can be acted upon in this way?

But (I am assuming) the robots still record where they found the links to website B? That would still feed into quality-of-inbound-links assessments, SERPs etc.