Currently working with a company that handles approximately 150 small SEO clients in one niche. The company creates one page of new content per month for each site - it's all on the same topic, but it's well spun by human content writers. Given Panda, will this strategy be sustainable for effective long-term SEO? If not, what do you do with 150 clients who all need content?
I would be careful about claiming what is and is not Panda's doing. People are still researching Panda, and most of the guesses have been neither confirmed nor disproven.
150 sites on the same topic? This doesn't sound like a Panda issue. This sounds more like the age-old problem of dealing with duplicate content.
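If you want a rough sense of how close the spun pages actually end up, you can audit a sample of them with a quick similarity check. Here is a minimal sketch in Python (the 5-word shingle size, the 0.5 near-duplicate threshold, and the file names are my own assumptions for illustration, not anything Google has published):

```python
# Minimal sketch: estimate how similar two articles are using
# Jaccard similarity over word shingles. The shingle size and the
# 0.5 "near-duplicate" threshold are arbitrary assumptions for
# illustration only.
import re

def shingles(text, size=5):
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard_similarity(text_a, text_b, size=5):
    a, b = shingles(text_a, size), shingles(text_b, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

if __name__ == "__main__":
    article_one = open("site1-article.txt").read()   # hypothetical file names
    article_two = open("site2-article.txt").read()
    score = jaccard_similarity(article_one, article_two)
    print(f"Shingle overlap: {score:.2f}")
    if score > 0.5:
        print("These read as near-duplicates; rewrite one of them.")
```

If most pairs score high, the content is effectively duplicate no matter how carefully it was spun.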
I am guessing these are local sites. If that is the case, one simple solution is to add geographic references to each page of content. That will help, but it won't take care of the entire problem.
I would avoid spinning or re-using content. I would use many part-time writers (at least 50), give them bullet points of the key information and a brief description, and have them create original content. This will cost a little more money and time to manage such a large team, but it is far better than trying to respin one article 10 times.
Another solution for generating large amounts of unique content is to tap into user-generated content (UGC). There is a lot of unproven speculation that poor-quality content from UGC sources may trigger Panda. To be safe, you should invest in an editor to make sure the UGC has proper grammar, no typos, and is of sufficient length.
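If the UGC volume gets large, a small automated pre-screen can also throw out the obviously thin or spammy submissions before the editor spends time on them. This is just a rough sketch under my own assumptions (the 300-word minimum and the spam-phrase list are made up for illustration), not a substitute for the human review:

```python
# Minimal sketch of a UGC pre-screen: reject submissions that are
# obviously too thin or spammy before a human editor reviews them.
# The 300-word minimum and the spam phrases are assumptions made
# for illustration, not a published Panda threshold.
import re

MIN_WORDS = 300
SPAM_PHRASES = ("click here", "buy now", "work from home")

def prescreen(submission: str) -> tuple[bool, str]:
    words = re.findall(r"\w+", submission)
    if len(words) < MIN_WORDS:
        return False, f"too short ({len(words)} words, need {MIN_WORDS})"
    lowered = submission.lower()
    for phrase in SPAM_PHRASES:
        if phrase in lowered:
            return False, f"contains spam phrase: {phrase!r}"
    return True, "passed pre-screen; send to editor"

if __name__ == "__main__":
    ok, reason = prescreen("Great post, click here for more!")
    print(ok, "-", reason)
```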
I think it is definitely manageable for the right amount of money, as long as you are creative with your solutions. That said, you may find it easier to avoid getting into this situation again in the future.