I hold out hope that it does get sorted out. It's never boring webmastering! At times it seems futile but I'm confident that this will get addressed. I'm just not sure if there are enough people out there voicing the issue of losing out to scrapers.
I think nobody much cares about scraping until it reaches the point where you lose rankings and traffic because of it. It's probably never going to be perfect. I accept that.
I'm pretty passionate about this subject now. I'm still baffled that on this forum, so few people are aware of the Google doc you can fill out to report searches where a scraped page outranks the original page. How is it that this isn't a priority situation right now? Either I'm in a minority in losing rankings to a scrape job, or I'm actually more enlightened than most.
As webmasters we need to use whatever means there are to communicate that the algo is having issues. That doc is one such (and very rare) communication tool where we can collectively say there is a problem here. With enough examples submitted, there might be some extra effort or acknowledgement of possible issues. If nobody says anything, then there is no issue. I can also accept that I'm one out of 1,000,000 webmasters seeing my site tank in GooG but not in the other search engine. I just hope that more people consider this option first (investigate) before deciding to chase the pot of gold at the end of the Panda rainbow. Before blowing up your site, at least do a GooG vs. the other search engine comparison to see who is ranking for your own content. It's simple and may save you a LOT of stress and frustration.
Added: I simply don't buy into the theory that my site had/has an issue, that the issue caused a loss of authority, and thus the scraper outranks me. Let me hear officially from GooG that their algo is meant to work this way and that's a desired outcome. Sounds pretty f'up to me if you create something where scammers can flourish. Wow, that's scary, and no, I don't believe they would knowingly accept that as part of their outcome. How could that not be considered a failure? Um, the GooG QA team says site A is bad, we are Pandalizing them, but seeing scraper sites B, C and D take those rankings is an appropriate outcome? And so if site A can't figure out what's wrong, then the "theft" is "working as planned"? I can barely get my head around this thought. Which comes first, the chicken or the egg? Yeah.