Hey, I have an old website with 2000+ pages, all static HTML. I'm wondering if some of my problems with it (and there are many) are due to linking to penalized websites -- sites that were fine before but are now "bad." Is there any online or desktop utility that can go through your site and find them?
You can use all kinds of link checking apps, but I doubt that you can define a "bad neighborhood" in a way that automated detection will work. The big problem (as you may know) is those domains that expire and get picked up by a domain/link farm of some kind -- or by a completely different kind of site than was there when you linked.
This is the kind of checking I assign to a human being -- just going on a clicking rampage through the links. Anything that looks different from what the link description says gets the boot. The bad ones usually jump out at you, unless your original link collection was rather indiscriminate. It doesn't take too long, and the eyes-on part is important. You can catch bad links even before Google hits the domain in question, and it's better to be preemptive.
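That said, with 2000+ pages you can at least build the review list automatically instead of clicking through every page by hand. A minimal sketch in Python using only the standard library: it pulls the outbound (off-site) links from a page's HTML, and flags links whose final URL after redirects lands on a different host -- a common sign the domain expired and got picked up by a parking farm. The domain name and function names here are my own invention, not from any existing tool, and this only builds the list; the eyes-on judgment call is still yours.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def external_links(html_text, own_domain):
    """Return the outbound links in html_text that point off-site.

    own_domain is your site's registrable domain, e.g. "mysite.com";
    links to it (or its subdomains) and relative links are skipped.
    """
    parser = LinkExtractor()
    parser.feed(html_text)
    out = []
    for href in parser.links:
        host = urlparse(href).netloc.lower()
        if host and host != own_domain and not host.endswith("." + own_domain):
            out.append(href)
    return out

def domain_changed(original_url, final_url):
    """True if a link now resolves to a different host than you linked to --
    e.g. an expired domain redirecting into a link farm."""
    return urlparse(original_url).netloc.lower() != urlparse(final_url).netloc.lower()
```

To run it over a whole static site, walk the HTML files with `os.walk`, feed each file's contents to `external_links`, then fetch each unique link (e.g. `urllib.request.urlopen`, which follows redirects and exposes the final URL via the response's `geturl()`) and feed the before/after URLs to `domain_changed`. Dump the flagged links plus everything that 404s into a spreadsheet, and do the human clicking rampage only on that shortlist.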