Forum Moderators: Robert Charlton & goodroi


Can a "bad site" with spammy links affect your other sites


graeme_p

11:00 am on Jan 5, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have a (low traffic, neglected) site with some spammy links on sub-pages that show revision history - i.e. I have removed the link from the content page, but it still shows in the history.

I have now disallowed these pages in robots.txt.

In the meantime some of these pages have been indexed by Google.

The related good site used to link to the "bad" site (until about two years ago), while the bad site still has links to the good site.

Could this site have had a negative impact on my other sites (same whois, same Webmaster Tools account etc., so clearly associated)? If so, what do I do about it?

adder

10:46 am on Jan 6, 2014 (gmt 0)

10+ Year Member Top Contributors Of The Month



low traffic, neglected

Based on this honest description, I'd say - kill it. What's the point of keeping something "bad" that can potentially affect a site that apparently matters to you?

To answer the question of whether owning a bad site can affect the ranking of your good site, you'd have to rely on anecdotal evidence. Although I've heard some webmasters say they've been penalised "across the board" (all the sites listed in GWT hit), I doubt Google pays its staff to go out looking for "bad" webmasters and penalising their whole portfolios.

The way this "across the board" action happens, I think, is that the webmasters leave footprints and often apply similar tactics to all their sites.

the bad site still has links to the good site

If it's a bad site, the link doesn't help you improve the ranking of the good site. So the question is why keep the link?

lucy24

4:57 pm on Jan 6, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Is this the same site you posted about elsewhere on the subject of removing backlinks?

If the page is already indexed, DON'T block it in robots.txt. Instead apply a noindex meta tag. If that is awkward to do internally-- as implied in the other thread-- you might have to resort to an X-Robots-Tag header, though this isn't the usual approach for HTML pages.
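For reference, the two approaches look something like this. The first snippet goes in each history page's HTML; the second is a server-side sketch assuming Apache with mod_headers, and the `FilesMatch` pattern is a made-up example - adjust it to whatever URLs your history pages actually use:

```
<!-- Option 1: in the <head> of each history page -->
<meta name="robots" content="noindex">

# Option 2: Apache config / .htaccess (assumes mod_headers;
# the filename pattern below is hypothetical)
<FilesMatch "^history-.*\.html$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Either way, the pages must remain crawlable (not disallowed in robots.txt) so Google can actually see the noindex and drop them from the index.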

graeme_p

1:36 pm on Jan 7, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@lucy, yes, and it's properly done now. The robots.txt was a temporary measure, and I was going to use the URL removal tool if I got really desperate.

@adder, because it has good content and potential. Some real people like it. The problem was deleted spam showing on history pages.

My sites are entirely dissimilar and I have not used similar tactics, so that should not be a problem.