incrediBILL - 5:48 pm on Oct 5, 2012 (gmt 0)
To figure out the future you have to analyze some recent history to see where they're taking us.
The impact of crowd sourcing is already here in search suggestions.
I used to get a phenomenal amount of long tail traffic from random search patterns. The minute Google crowd-sourced the searches themselves and used the most common patterns as suggestions, a ton of traffic was lost, and the number of sites that could play narrowed as the variety of search strings shrank. You could see the impact easily in analytics as the keywords started to clump together and the total number of keywords being used diminished.
What this means is searchers are being funneled into specific patterns, and competition is being reduced because there's less room to play and only the most trusted sites are given a seat at that table. Most of the recent updates have been attempts to dump sites that they call 'low quality' but that I think are simply less trustworthy to searchers.
In the future you're going to need a trusted site just to show up in the top 10, with high end-user engagement and a low bounce rate, something sites like Yelp easily attain.
Here's the reality: Google knows a lot about everyone and every site, and they call the shots. With an all-inclusive strategy of AdSense and Google Analytics, they have a complete loop of everything everyone does, from the minute you start typing a search string, to which site you click on, and they're still monitoring right up to the minute you leave the destination site. They've said before that bounce rate isn't a factor in the SERPs, but if it isn't being used already (meaning they're lying to us to keep us using GA) it has to be coming, because it's the true crowd-sourcing metric. No amount of +1s, Likes, or reviews says "this is our favorite site" like actual time spent on the site. Figuring out how to engage visitors is going to be paramount to ranking in the future.
However, in mobile search the criteria might be just the opposite: how fast can you get visitors to what they need and service the intent of the request as efficiently as possible, without long-term engagement. That's why I say there may need to be two types of sites that address the specific needs of visitors based on the type of device they're using, not just a responsive design that addresses formatting.
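To make the two-site idea concrete, here's a minimal sketch of server-side device routing. Everything in it is illustrative and my own assumption, not anything Google prescribes: the user-agent regex is deliberately crude, and the version names are made up.

```python
import re

# Rough illustration only: route mobile visitors to a lean, task-focused
# version of the site and desktop visitors to the full engagement-oriented
# version. The regex and version names are hypothetical.
MOBILE_UA = re.compile(r"Mobile|Android|iPhone|iPad|BlackBerry", re.IGNORECASE)

def pick_site(user_agent):
    """Return which version of the site to serve for a given User-Agent string."""
    if MOBILE_UA.search(user_agent or ""):
        return "mobile-fast-answer"   # answer the query and get them out quickly
    return "desktop-engagement"       # richer content aimed at time on site

print(pick_site("Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X)"))
print(pick_site("Mozilla/5.0 (Windows NT 6.1; WOW64)"))
```

In practice you'd hang real templates and real device detection off a branch like this; the point is simply that the mobile path optimizes for speed-to-answer while the desktop path optimizes for engagement, rather than one responsive layout doing both.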
Regardless, the number of sites servicing requests for either desktop or mobile is already being funneled down into smaller subsets. Many websites will drop out altogether because they simply won't be able to compete as they're squeezed further and further out with every update. Many of those discussing this very topic probably won't survive and will be referred to in future threads as "collateral damage". ;)
We obviously can't predict the future or code sites specifically for what will come, but I'd suggest at a minimum that the next generation of SEO starts with building a strong social foothold today. The crowds are already deciding which search terms to use and dictating which sites to pick. If you wait until the future to adapt to what the herd wants, someone else will already have your spot and it will be too late. Not only will it be too late, I think when the smoke clears the popularity of those top 10 spots will be so entrenched that it will be nearly impossible for any newcomer to break in.
The future of SEO might require creating new branded words just to carve out a new top 10 list that didn't previously exist. Getting the masses to say something new, like Pinterest, Instagram or Twitter, is often easier than making other sites budge on existing keywords, and if it's your trademarked term you might get the #1 spot by default. Obviously not everyone has the ability to create such viral sites and terms, but if you can, it's certainly easier than trying to unseat Yelp or Angieslist from the top 10.