This might be good for Google in doubling its real estate assets.
Should webmasters have a strategy for this?
The thought went through my mind that Google's cluttering of page 1 results with directories, YouTube, wikis, and whatever else comes along in the future will push users onto the 2nd page for organic results.
Depends on what you're searching for. I haven't noticed any deterioration of Page 1 results in Google. Quite the contrary: They tend to be far better than they were a few years ago, when the Page 1 results from a search for "Hotel Whatsit" or "Widgetronic W-1000" would be packed with affiliate and dealer pages.
Also, believe it or not, some of us like seeing Wikipedia results on Google's SERPs. In terms of relevance, they're nearly always spot on, and they tend to be informative and useful.
they're nearly always spot on, and they tend to be informative and useful.
Does that include the plagiarized stuff, and the factual inaccuracies?
In the past couple of weeks I have had occasion to read a couple of dozen wiki articles. Time and again I have seen content literally cut and pasted from other sites - often, but not always, manufacturers' sites. I have also seen articles flagged as biased, and have come across any number of serious factual errors in areas where I have some knowledge.
The wiki experience is deteriorating. Hardly what I would consider "best of class" for many searches.
Time and again I have seen content literally cut and pasted from other sites - often, but not always, manufacturers' sites
Well, the editor of WIRED got caught plagiarizing from Wikipedia, so what goes around comes around. :-)
The real problem we, as webmasters, have to deal with is how to get "widgets" to the top of the 12M other "widget" sites offering THE EXACT SAME info. Hence SEO and all that other stuff.
Walk into any big-city central library, or a large university library, and try to find what you are looking for... that's a visualization of how far the web has EXCEEDED them, info-wise.
I generally find the best of what I'm looking for on Page Two... or Page One of Bing, which is a few petabytes shy of G's older database, a database that continually offers cached pages from dead websites more than 10 years old. So yes, in my opinion, jockeying for a spot on page two is the next "holy grail".
Google would be fine if they would simply boot every (useless) shopping/comparison site from their results.
I'm all for that. Trouble is, a lot of the useless shopping/comparison pages (and mislabeled "review" pages) are on otherwise legitimate sites like CNet and ZDNet that have useful, up-to-date information on other pages. Filtering out junk sites shouldn't be too difficult, but separating the wheat from the chaff on important mainstream sites is likely to be more complicated.
Can you imagine the SEO 2.0 sales pitches... "Ranking 11th is the new 1st! We can guarantee your site shows up at the top of the new valuable 2nd page... we'll also throw in some free internal no-follow link sculpting if you buy now!"
Ranking 11th is the new 1st!
Dang right--and two is twice as valuable as one!
As for the quality of the page one results, for this search I was bumped by an online magazine article on the Acme company from two years ago, a wiki article, and a couple of other low-quality results.
Is anyone actually seeing more traffic and/or income deriving from page 2 rankings? I can't say I've spotted such an effect so far.
Now that you mention it...
We have a site that sat at #11 for a plural single-word term (#3 for the singular) that we have been trying to get onto page 1.
It has been bouncing in and out of the #9 spot the past week or so, and traffic on the week is down.
Not enough data to tell, but I sure hope getting onto page one doesn't result in a drop in traffic :(
You have to realize that Google is stat-driven.
If something stays up after a year, it must be working and be what people want.
Yes and no. I think this is true for popular searches, where a lot of statistical query data decides what is relevant to users and what is not.
However, people type an enormous number of weird, random, and poorly worded queries into search engines. I think a lot of irrelevant search results are delivered to users every day because the queries they enter are NOT popular, but rare.
Google can't make any sort of solid statistical decisions about queries that only get searched 5 times per month.
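To make the sample-size point concrete, here is a minimal sketch (not Google's actual method - just standard statistics) showing how wide a 95% confidence interval on a click-through rate is at 5 impressions versus 50,000. The numbers are made up for illustration.

```python
import math

def wilson_interval(clicks, impressions, z=1.96):
    """95% Wilson score confidence interval for a click-through rate."""
    if impressions == 0:
        return (0.0, 1.0)
    p = clicks / impressions
    denom = 1 + z**2 / impressions
    center = (p + z**2 / (2 * impressions)) / denom
    margin = (z / denom) * math.sqrt(
        p * (1 - p) / impressions + z**2 / (4 * impressions**2)
    )
    return (max(0.0, center - margin), min(1.0, center + margin))

# Rare query: 2 clicks out of 5 impressions.
# The interval spans most of [0, 1] - the data says almost nothing.
print(wilson_interval(2, 5))

# Popular query: 20,000 clicks out of 50,000 impressions.
# The interval is razor thin around 0.4 - a solid signal.
print(wilson_interval(20000, 50000))
```

With five data points the engine genuinely cannot distinguish a great result from a mediocre one; with fifty thousand, it can.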
[edited by: Acrill at 12:51 am (utc) on June 30, 2009]
If something stays up after a year it must be working and what people want
Not so sure about that.
For example, whatever stats indicate that YouTube videos in the front spots are what people want - now look at those people. Mainly kiddies? Maybe the majority likes the current SERPs composition, but what if there is a strong, influential, or well-funded minority opposing it?
Or, on the other hand, take it as purely a money question: some of the lower-standard, spam-like AdSense ads work economically, too - but people hate them.
There are short-term and long-term perspectives to consider. Short-term financial gains may result in long-term user aversion; short-sighted marketing decisions may result in people clicking fewer ads or turning their backs on Google search. Especially on this point, you can't tell me Google knows everything about its visitors and does everything right. In fact, I see a weakness here.
For example, one of my page 5 results is just as popular.
Search volume, result placement, incoming links, traffic from these links, email referrals, etc... all come into play and it's hard to say exactly which one is the winner.
"Hey, we've been working on taking away every listing on page 1 so you all have to resort to buying Adwords. But what do you know, page 2 is getting popular".