Forum Moderators: phranque
If you mean broken links, then those are not a good thing to have anyway!
Not sure whether an intelligent search engine would apply a "lost cause" logic to a site that was full of 404's; I doubt it, but I can't say for certain that none of them do.
It could be considered a waste of time to continue spidering a site that serves up 404 after 404, so I certainly wouldn't risk it if I were you.
<anecdotal evidence>A local party site that I run was suddenly very much down in Google after the beginning-of-January update this year. I looked all over the site for clues - nothing that could be viewed as hidden text, cloaking, etc. The only things that were faulty in a major way were some pages where I had archived the 'events' and 'links' content from 1996 onward - of course a lot of the external links were dead by now. As a stopgap measure I included these pages as disallows in robots.txt - from the next update on, the site was in Google's good graces again.</anecdotal evidence>
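For reference, the stopgap was just a couple of Disallow lines; the paths here are made up, assuming the archives lived under an /archive/ directory:

```
# robots.txt - keep spiders away from the stale archive pages
User-agent: *
Disallow: /archive/events/
Disallow: /archive/links/
```

Crude, but it kept the link-rot out of the index until I could clean the pages up properly.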
If you move pages that are indexed then you have no choice but to wait for the spiders to come and update the index, but there's no excuse for broken links!
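If you do move a page, you can at least tell the spiders (and visitors) where it went with a permanent redirect instead of letting the old URL 404. A sketch for Apache's mod_alias, with hypothetical paths:

```
# .htaccess - 301 tells spiders the move is permanent
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

The spiders still have to come around and see the 301, but in the meantime nobody hits a dead end.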
> One 404 is one too many.
A pint for you if we meet, sir.
It is simply *so* much easier to use the "controls" built into HTTP and into web servers to "do it right the first time" and not have to deal with all the secondary effects!
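Part of "doing it right the first time" is catching broken internal links before a visitor or spider does. A minimal sketch of the idea in Python (the function names and the known-pages set are my own invention, not any particular tool):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def broken_internal_links(html, known_pages):
    """Return internal hrefs that don't match any page we know we serve."""
    parser = LinkExtractor()
    parser.feed(html)
    # Skip external and mailto links; only check paths on our own site.
    internal = [h for h in parser.links
                if not h.startswith(("http://", "https://", "mailto:"))]
    return [h for h in internal if h not in known_pages]
```

Feed it a page and the set of paths your server actually answers, and anything it returns is a 404 waiting to happen:

```python
page = '<a href="/about.html">About</a><a href="/gone.html">Old</a>'
broken_internal_links(page, {"/about.html"})  # -> ["/gone.html"]
```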
I will forgive 404's from "confused" and buggy search engine spiders, but if the 404 is my fault, I feel quite derelict in my duty to present a professional Web site to my visitors.
MHO/YMMV,
Jim