Is there enough of an overlap between the search terms and the company name that Google would see this as a branded search?
Google often treats brandname type searches as an indication that the searcher wants to see multiple results from the same site... almost as if the user were doing a site: search.
|I have noticed a site with the first three listings on google for the same search term. |
This seems to be the "likeability" social factor Google wants! I can show you SERPs where the first 5 results come from the same site, PLUS a "see more results from example.com" link that usually adds another 4.
And it doesn't seem to matter just how bad or spammy these pages are; this is seemingly what Google believes quality looks like and wants us to adopt. Bye-bye, Google. I'm not producing garbage like that, and I'm pretty sure it's not what anyone in my industry is looking for, whether trade or retail purchasers.
Are you kidding? Feed in the right search terms and you'll get multiple listings for the same page. It just has to be dynamically generated so its address comes up in more than one form. One of my standard "test phrases" always starts out with two or three versions of an Amazon page selling the same book.
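As an aside, the "same page under more than one address" effect described above is exactly what URL normalization is meant to catch. Here is a minimal sketch of how duplicate addresses for one page can be collapsed to a single canonical form; the example URLs and the parameter names in `TRACKING_PARAMS` (like `ref_`) are hypothetical illustrations, not an actual list any search engine uses.

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical tracking parameters; real sites and engines use their own lists.
TRACKING_PARAMS = {"ref", "ref_", "tag", "utm_source", "utm_medium", "utm_campaign"}

def normalize(url):
    """Reduce a dynamically generated URL to one canonical form, so that
    variants of the same page compare equal."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.lower()
    if netloc.startswith("www."):
        netloc = netloc[4:]
    # Drop tracking parameters and sort the rest into a stable order.
    params = sorted(
        (k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS
    )
    path = path.rstrip("/") or "/"
    return urlunsplit((scheme.lower(), netloc, path, urlencode(params), ""))

# Two "different" addresses for the same product page:
a = normalize("http://www.example.com/dp/0123456789/?ref_=sr_1_1")
b = normalize("http://example.com/dp/0123456789")
print(a == b)  # True - both reduce to the same canonical URL
```

When normalization like this is incomplete (or the site never declares a canonical URL), each address variant can be indexed and ranked as a separate page, which is one way the same page ends up listed two or three times.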
A lot has to do with Google's perceived user intention for the specific search phrase. If they have data that suggests many such searches are "navigational" in intent - actually trying to find a specific website - then that website tends to get more results.
This can make competing for that phrase a more challenging task - even daunting in some cases.
The local scraper site in our niche has this benefit. They take up the top 3 spots for most searches in our niche.
Google needs to tweak it a bit, though...
Search for "phpbb hacks", for example (modifications for the open source phpBB forum software), and you get the same site in the first 4 positions. But those extra 3 positions are just duplicates of the 8 sitelinks they already have listed under position 1.
I feel sorry for the "actual" second site in the SERPs, because they have TWELVE links ahead of them which all point to the first site.
I've noticed this a lot in the keywords Panda deranked me for.
In several cases, different pages on the same site rank multiple times on the first page for the keyword. Sometimes those pages don't even really look any different, and sometimes only one of them is even relevant to the keyword.
Yes I have that in my niche too. The first and second spots are taken up by different pages of the same website. The first page is relevant to the query. The second page includes everything on the first page and some additional products that aren't really relevant to the search phrase. How is it a good user experience to dilute the relevancy of the search results in this way?
Futile though it may be, I have complained to Google about this. It might have been coincidence, but in a similar case with a different website, a triple listing was turned into a mere double (although the two pages in that case were both quite relevant to the query). On the plus side, I overtook both pages in one fell swoop and killed two birds with one stone.
The search term in question is a generic term. It has nothing to do with the specific website.
|I feel sorry for the "actual" second site in the SERPs, because they have TWELVE links ahead of them which all point to the first site. |
Really? I'm green with envy over the first site :). I'm reminded of the orangutan in that Disney movie adaptation of the Rudyard Kipling story singing "I-I-I want to be like you-hoo-hoo". That's me, for sure.
Here are some references to previous discussions on the topic. Some of what people are seeing is likely to be expected behavior... or it may be something new.
Showing more results from a domain [googlewebmastercentral.blogspot.com]
Google Webmaster Central Blog
Friday, August 20, 2010
|Today we’ve launched a change to our ranking algorithm that will make it much easier for users to find a large number of results from a single site. For queries that indicate a strong user interest in a particular domain... we’ll now show more results from the relevant site.... |
And, the first of many discussions we've had on this topic since Google's announcement....
Google To Show More Results From a Domain
Do you mean top 3 as in three equal entries without being indented?
Multiple pages with second and subsequent entries indented have been around for as long as I can remember.
With Bing I can find pages from my site scattered throughout the SERPs (e.g. pages at 1, 3, 7 and 12).
A few months back I had 2-3 listings from the same website in local Google. Now I have only one listing from that website, but I do have 2 listings from two different websites of mine for the same keyword.
This is a multi-part answer, as I am seeing several situations where three results are being returned together at the top. In response to the original post here, I had asked...
|Is there enough of an overlap between the search terms and the company name that Google would see this as a branded search? |
By "branded search", I meant are any of the keywords prominent in the brand or domain name of the ranking site? (I need to say that I have since seen the particular example ffctas was asking about, and IMO, it is a branded search).
The site returning three listings here is Widgets.com, and the search was [keyword widgets]. Even though "widgets" was used generically in the query, "widgets" is clearly a word that's part of the company and brand name. It's not like it's GoodWidgets.com... it's Widgets.com.
Beyond that, I see that Widgets.com does have text and image content in its category and product pages that appears to be more original and "deeper" than what's on the other competing widget sites. It's not great content, but for a catalog site it's not bad.
I don't think that three results in this case was automatic, though. On a single-word search for "widgets" in the above, Widgets.com ranks #3, so it's not a powerhouse... so it took some competitively good content along with a degree of brand identity/ keyword authority to produce the three results.
I'm recently seeing some new results that illustrate this point....
On a 3-word query that I've been monitoring for several years, there's an 800-lb gorilla Wikipedia-level site at the top which has just moved from showing two listings to three listings. The keywords here are not exactly branded, but the site has an entire section for which the 3-word query is a perfect fit, with a subdomain matching one of the queried keywords.
Moreover, and this is important, the content is definitive. So this might be an example of brand building over time. Even 800-lb gorillas have to work their way up from two bananas to three.
A PS to the above... something I considered including earlier but left out because the post was getting too long.... Google has been returning multiple listings for some sites and queries, IMO, as a way of testing user satisfaction and perhaps ultimately of shaping traffic.
I'm increasingly seeing more queries returning three results (none indented)... even long tail queries... and I'm fairly certain Google is testing these results.
For those who are seeing three listings, it would be interesting to monitor them for a while and see what happens.
< moved from another location >
Mod note: title for this post before I moved it here was "Multiple Results"
I find this a little hard to fathom. I backtracked a referral link from Google just now, a 5-word search term. This is a highly specific term, and the niche is highly competitive. The results are hardly what you would call "optimal" if you're considering the end user experience.
We have, in this order -
AdWords in the top 3
3 results from one domain as the next result
4 results from another domain next
results 8, 9, 10 are different and relevant
As a person searching the web, I don't think this is providing me with enough variety. It's forcing me to "look right" - Anyone else seeing this sort of thing?
[edited by: Robert_Charlton at 2:43 am (utc) on Jun 22, 2011]
kidder - I moved your new post because of an observation I'd made above (emphasis added), which is right in line with your question...
|I'm increasingly seeing more queries returning three results (none indented)... even long tail queries... and I'm fairly certain Google is testing these results. |
The three pages returned by the 800-lb gorilla site I noted above have been collapsed to just one... actually after just a few days. The three listings from the same site noted by the OP in this thread are still showing as 3 separate listings.
I do believe that Google is now testing user satisfaction for long-tail queries, much as they had previously tested for single-word queries. I suspect that this will further dent the content farms like eHow, which precisely targeted many long-tail variants and did very well at it, because there were in fact some factors in the Google algo which, when combined, made Google vulnerable to what eHow was exploiting.
Not all multiple results are due to exploitation, though, and I believe that Google is testing whether they are what searchers want.
I just fed in my own pet search phrase and found that google-us still has amazon in 1,2,3 position ... but google-ca gives them only one, where before it was a pretty close match. Are Canadians less tolerant of nonsense?
|I do believe that Google is now testing user satisfaction for long-tail queries, much as they had previously tested for single-word queries. I suspect that this will further dent the content farms like ehow, which were precisely targeting many long-tail variants and doing very well at it, because there were in fact some factors in the Google algo, which, when combined, made Google vulnerable to what ehow was exploiting. ...Not all multiple results are due to exploitation, though, and I believe that Google is testing whether they are what searchers want. |
eHow passed Panda 1.0 with flying colors. So did Squidoo, IIRC. Of the real content farms, it looks like the only ones caught were the greedy ones with 4-5-6-7-8 AdSense and Chitika ads. Embarrassed at having ruined small businesses while letting eHow increase its rankings, Google turned to the Chrome browser and asked the HackerNews crowd to weed out 'content farms.' That alone took down eHow, but it could very well have taken out Wikipedia, since they are not liked either.
My point is that a lot of times we give Google too much credit. It's possible that they will now look at the signals that a certain site, eHow for example, sends and then nuke all the sites that send similar ones, but that's almost like taking manual action (plus the nuking of many innocent sites.)
walkman - The process you describe is essentially what I mean when I said that Google has been testing user satisfaction. User dissatisfaction is an aspect of user satisfaction.
I've almost always felt that, in one way or another, Google was looking at user satisfaction. It's inherent in the concept of PageRank and linking, and it's evolved from there.
Google has also used direct human input in lots of ways. It's not new, and I don't think we should think less of Google for using it. The trick has been to make it scale, to minimize possibilities for manipulation, to cross-check it with other behavioral and algorithmic signals, and to make corrections. That's what Google has recently been doing.
I'm very much with you that it's too bad they weren't perfect the first time around, but on changes of this scale that's very hard to pull off.
On the question of how many results one site should have at the top, that's likely to depend on the particular search, and it may be impossible to determine without testing and refining the algorithm.
One of my SERPs for the last month:
The first three results are obviously AdWords
The next 7 results are all Places
The first 4 organic results are all one extremely spammy site
My site is #5 and receiving very little traffic - 14 places down the page now for the #5 organic result! My site is 100% clean and whitehat, 100% following the webmaster guidelines and best practices. Completely relevant, yet it's impossible to get any traffic unless I pay for AdWords.
As with all the other Google offerings and 'features', this is just another thinly veiled way of forcing businesses to use AdWords. Even if you achieve great rankings, there is no longer any traffic. This is clearly by design, and you can either go out of business or fund your AdWords account.
Stop being EVIL.
shazam, why don't you go for a Places page yourself? Clearly Google thinks this query phrase is a local search.
Without divulging too much, for obvious reasons this just isn't possible. There are some unethical folks in my industry who have hijacked unclaimed Places listings, and of course Google rewards this behavior. I refuse to operate my business in an unethical manner regardless of Google's preference for evil. I offer extremely relevant and useful information on entire areas; I do not and will not set up offices all around the world just to get listed on Places, nor will I hijack existing businesses that haven't claimed their address.
So for this and many other queries, my #2 organic SERP is 14 spots down! This is clearly by design and no mistake on Google's part. The big boys in this niche spend literally millions of dollars every single month on AdWords, and Google knows they can get away with pulling crap like this to force businesses to use AdWords.
So, years of building great sites and writing useful, relevant content is now completely worthless. A #2 organic SERP was a good accomplishment and meant nice traffic volume. Now, 14 spots down, that same SERP = barely staying afloat.
My only reliable and profitable traffic these days comes from type ins and bookmarks.
[edited by: tedster at 5:15 am (utc) on Jun 27, 2011]
shazam - I'm hoping that the testing Google is doing will eventually sort out which pages deserve to be featured in serps as local Places.
I'm making a guess here, but from what I'm seeing, this is the kind of "Universal" exposure that Google has been looking at for some time. I'm assuming that, to get enough data for local calibration, Google needs to leave these local "refinements" in place for a while... longer than many of us would like.
I don't think that Google does have a "preference for evil" as you put it. Rather, I think, they don't like being gamed... and I think that getting sufficient data volume for locally based searches might take a while.
|shazam - I'm hoping that the testing Google is doing will eventually sort out which pages deserve to be featured in serps as local Places. |
It's not going to be resolved. Natural organic results will never be converted to Places; it's a manual application process. Google will not abandon its plans to drive the organic results way down below the fold and keep all the prime real estate for Google properties or paid listings. Places isn't going anywhere, especially with the recent purchases Google has made.
Organic listings will soon no longer be a workable business plan. We will have a choice: either AdWords or no Google traffic for top-converting keywords. The writing is on the wall. I have another site that has been sitting at #1 and #2 (2 listings) but only gets a few hundred visits a day, since the top organic results are now way down below the fold. In the past I had substantially more traffic sitting at 5-8 than I now do at #1. They have been expanding not only the number of Places results but the height of each listing, to further drive the organic results down.
Back to the original topic, I apologize for getting sidetracked.
It's looking like the best way to get more traffic is to make spammy copies of all your key landing pages. Sad, but true. If you look around the SERPs and find the top listings all going to one domain, they are often just the same content spun with different keywords and layout. Same girl, different dress. It looks like spammy tricks are what Google wants in the organic SERPs. Once again it fits perfectly into the overall business model of driving more revenue. There's really no other logical reason for filling up the top spots with multiple results from the same domain.
In my opinion, the only valid reason for multiple results from a single domain is a 100% obvious branded search. Even in that case it's bad. The "site:" operator has worked fine for years, but of course that doesn't increase AdWords revenue.
|Google has also used direct human input in lots of ways. It's not new, and I don't think we should think less of Google for using it. The trick has been to make it scale, to minimize possibilities for manipulation, to cross-check it with other behavioral and algorithmic signals, and to make corrections. That's what Google has recently been doing. |
|I don't think that Google does have a "preference for evil" as you put it. Rather, I think, they don't like being gamed... |
The amount of power given to sites by Google's new multiple-listings feature - up to five results on one page, often at the top - makes it an obvious target for the unscrupulous webmaster. Therefore Google should be much more careful in its use of multiple listings.
I get the "brand" idea and don't take issue with it. It's when searches are for non-brands, just keywords, that Google is now, since Panda, defaulting to brands which use those keywords.
|The trick has been to make it scale, to minimize possibilities for manipulation. |
This would be easy to implement. Use a [+] sign, as on other result types, to expand the listings: one click to see five results from the same site.
I didn't like it when Google started giving more than one page from a site, even though at least the extra page was indented. I didn't mind the sitelinks. I don't like the spammy feel of three to five results for the same search string.
How did we ever cope with only ten results for one search with only one link for each website?
Google now looks like a Spam Engine.
Here's one example of clutter from redundancy (and a wasted link):
Search for: Chicago Sun Times
You get the home page, then sitelinks. The first sitelink is Sports.
The first regular result after that is Sports again.
What's the point of two links to the same page? It's spammy. I think they can leave the sitelinks and stop the extra links to the same page/same site.