tedster - 6:19 pm on Dec 2, 2011 (gmt 0)
Original content... I'd like to see how they will handle this war with scrapers and non-original content sites.
During his Pubcon keynote, Matt Cutts addressed this challenge in the Q&A period. His suggestion for sites with a big scraper problem was to use PubSubHubbub (PSHB) and send a "fat ping" to Google immediately when new content is published.
I have just such a client, so we implemented PSHB within a week - and we are now seeing a REMARKABLE improvement in rankings. We also put a delay on the RSS feed and went from a full feed to a partial feed.
Especially with so many syndicators and mash-ups adding legitimate value, Google's challenge of original authorship attribution is far from trivial. If a site's RSS feed goes live at publication time, then the scraped, syndicated or mashed-up version may well be cached earlier than the original. PSHB's fat ping capability seems to avoid that problem for the publisher.
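For anyone who wants to try this, the publisher side of the protocol is tiny. A minimal sketch, assuming the public Google-run hub at pubsubhubbub.appspot.com and a hypothetical feed URL - you notify the hub with a simple form-encoded POST, and the hub then fetches your feed and pushes the full new entries (the fat ping) out to subscribers:

```python
from urllib.parse import urlencode

# Public hub many publishers use (assumption - your feed may declare
# a different hub in its <link rel="hub"> element).
HUB_URL = "https://pubsubhubbub.appspot.com/"

def build_publish_ping(topic_url):
    """Build the form-encoded body for a hub.mode=publish notification.

    The hub responds to this POST by fetching topic_url and pushing the
    new entries to subscribers - the publisher never sends content directly.
    """
    return urlencode({"hub.mode": "publish", "hub.url": topic_url}).encode()

# To actually send it (requires network access):
# import urllib.request
# body = build_publish_ping("https://example.com/feed.atom")  # hypothetical feed
# urllib.request.urlopen(HUB_URL, data=body)
```

The nice part of this design is that the ping itself carries no content, so a scraper watching your feed learns nothing extra; only the hub's subscribers get the pushed entries.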