Andylew - 9:57 am on Jul 27, 2010 (gmt 0)
The question you always have to ask yourself is: would you find the site useful? If your honest answer is "not really", then it is unlikely to do well in the listings. Google certainly looks for specific 'bad' indicators which suggest the site has little use, such as 20 domains all pointing to each other.
However, the thing that always has to be considered is where the line is drawn. Google is an automated system and likely scores sites on the probability of their usefulness to a searcher. Say this is a score from 1 to 10 and the line is drawn at 5: any site that scores less than that doesn't feature - it is blocked by the spam filter. If the site gets through the spam filter, it is then scored by the ranking algo to determine its position.
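The two-stage model described above can be sketched in a few lines. All the scores, the threshold value, and the field names are invented for the illustration - this is just the filter-then-rank idea, not anything Google has published:

```python
# Toy sketch of a two-stage filter-then-rank pipeline
# (hypothetical scores and threshold; purely illustrative).

def passes_spam_filter(usefulness_score, threshold=5):
    """Stage 1: sites scoring below the usefulness threshold never feature."""
    return usefulness_score >= threshold

def rank_sites(sites, threshold=5):
    """Stage 2: surviving sites are ordered by a separate ranking score."""
    survivors = [s for s in sites if passes_spam_filter(s["usefulness"], threshold)]
    return sorted(survivors, key=lambda s: s["ranking_score"], reverse=True)

sites = [
    {"name": "site-a", "usefulness": 8, "ranking_score": 72},
    {"name": "site-b", "usefulness": 3, "ranking_score": 95},  # blocked by the filter
    {"name": "site-c", "usefulness": 6, "ranking_score": 80},
]
print([s["name"] for s in rank_sites(sites)])  # site-b never appears, however well it would rank
```

The point of the sketch is that a high ranking score is irrelevant if the site never clears the first gate - which is why eliminating the 'bad' factors matters more than ranking tweaks.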
This is automated, so yes, you could improve the sites, but you would need to eliminate the 'bad' factors. When it comes to automated sites, they are generally 'bad' - however, that doesn't mean all automated sites fall into this category. One site we run is fully automated, but the value isn't in the site, it is in the algo we developed for it (the site was created as a test bed), which can take a paragraph of text and rewrite it - essentially an AI which 'understands' what is being said. This in essence turns what would be a duplicate site, which gets penalised by Google, into a completely unique site. However, for the tens of thousands of hours and £££££ it has taken so far to develop this script, you could just hire some people to rewrite the site by hand......
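For contrast, the crudest possible form of automated rewriting is mechanical word substitution - nothing like the AI described above, and exactly the kind of output that still reads as duplicate content. The synonym table and function below are invented for the example:

```python
# Naive dictionary-based rewriter - a deliberately crude baseline,
# not the poster's system. Synonym table is invented for the example.

SYNONYMS = {
    "useful": "helpful",
    "site": "website",
    "unique": "original",
}

def rewrite(text):
    """Swap each word for a listed synonym; leave unknown words alone."""
    return " ".join(SYNONYMS.get(word.lower(), word) for word in text.split())

print(rewrite("a useful and unique site"))  # "a helpful and original website"
```

A rewriter this simple preserves sentence structure word for word, which is trivially detectable - hence the need for something that actually 'understands' the paragraph, or for human rewriters.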
The answer is: think unique, think useful, and don't try to outsmart Google, as they have made it virtually impossible.