| 7:53 pm on Jun 11, 2012 (gmt 0)|
The sheep must like it, or the shepherd wouldn't use it.
| 7:56 pm on Jun 11, 2012 (gmt 0)|
I think it's a plus for brands and a negative for competition and the rest of the internet
| 8:01 pm on Jun 11, 2012 (gmt 0)|
I can see one advantage, albeit in limited usage. For non-branded searches, host crowding is a tremendous advantage if a site has done enough SEO to show up in two positions on page one.
For example, if brandx was showing for a money term in position 1 and position 10, it now appears in positions one and two. This is a tremendous advantage for a "marginal" result (the position 10 result).
However that's the only reasonable reason I can see to remove it.
I agree that MC does seem to prefer it, and implies that nefarious SEOs would try to hack it to get around the system. In other words, the "hacker" would prefer a lack of host crowding. If he views that as hacking the results page, then clearly the current state of affairs is less desirable for the user experience.
A follow-on question may be warranted. Why would Google want this? It doesn't seem to be the preferred case for MC, it doesn't practically seem to increase result diversity, and there's an implication that host crowding was better for the user. So why do it, save for the edge case of "snapping" a second result up to the top?
| 8:04 pm on Jun 11, 2012 (gmt 0)|
> The sheep must like it, or the shepherd wouldn't use it.
If it makes the shepherd more money, of course they would use it. When you have a monopoly - it doesn't matter what the sheep think.
| 8:12 pm on Jun 11, 2012 (gmt 0)|
When I search for something on Google I often see links to only one or two websites in the original SERPs above the fold. The rest is advertising. And sometimes Google Shopping results.
Now what do people do in that case? Scroll down? Move to page 2?
No. They click the ads.
So from Google's perspective this makes sense - more money. Even if it means losing a few users to other search engines.
This is especially true for product searches. Often it's only the manufacturer above the fold - if you want to sell and want to be seen you have to advertise. This is a win-win for Google. Searchers still find their products - not in the SERPs but in the ads - it doesn't matter to them though. Satisfied users. Shops have to buy ads to be seen and pay money to Google.
So in my opinion showing multiple results for one site has only one reason: Block space in the SERPS, push other results down, increase ad revenue.
| 8:50 pm on Jun 11, 2012 (gmt 0)|
More comments on this thread [webmasterworld.com...]
| 10:14 pm on Jun 11, 2012 (gmt 0)|
Absolutely a poor experience, and that's coming from an avid Google user. In fact, it was the very first time that I looked at a SERP and thought "Ugh. Leaving." It doesn't happen often enough for me to switch to any other engine, but it was definitely a signal that things may be getting worse instead of better.
It's silly to seriously suggest, however, that any of these changes are made because they might increase ad revenue. As tedster wisely mentioned here [webmasterworld.com], "there's no direct dollar path from an organic search result to Google income." Web search is too much of a long-term asset, anyway.
I would therefore have to agree with ascensions: the "herd" has simply shown a preference for this. Sheep are odd, is all I can say to that.
| 11:29 pm on Jun 11, 2012 (gmt 0)|
|If it makes the shepherd more money, of course they would use it. When you have a monopoly - it doesn't matter what the sheep think. |
| 11:43 pm on Jun 11, 2012 (gmt 0)|
I'm not sure I'm seeing precisely the same thing that Brett is discussing.
The additional inline results I'm seeing this week appear to be search refinements that resemble what I'd call related inline Sitelinks. It looks almost as if these inline refinements are additional choices that searchers might look for if they backed out of a site and did a new search on Google.
In that regard, they're more query-specific than regular Sitelinks or inline Sitelinks that I've seen before. Different queries, of course, provide different kinds of choices.
I see them as ongoing tests. Google has been doing this for several years now, with the choices either based on user behavior data... or trying other variants to provide user behavior data. During the early days of Panda, e.g., I was seeing Google return pages that were algorithmically relevant but basically weak and destined to be culled.
Right now, I'm seeing 2 or 3 additional inline refinements on test searches I've just tried. This current bunch, I'm thinking, may be trying to correlate prior search behavior with user intention... that's what it looks like. Regular Sitelinks (up to 6) may be displayed as well in some cases, if it's obviously a navigational query.
While a general search on a brand name will return a lot of pages for a site, as soon as you start getting very specific, Google appears to reduce that number considerably. I've also been watching this since it first happened, and Google, if anything, appears to have reduced the number of pages given to a manufacturer. That said, if I were trying to sell Apple iPads online, I wouldn't be happy about the results Google is returning.
Like many dozens of other refinements that Google has tested, these may not stick. My feeling is that while they're all generally moving towards greater searcher satisfaction, they are also extra targets, providing extra exposure for Google to test whatever it's testing.
These inline Sitelinks are not the only way Google pushes extra clicks to the surface, but I'm sure they're one of the ways.
For more on some aspects of Google testing we've been discussing, I recommend checking out this discussion and the threads it links to...
Zombie Traffic and Traffic Shaping - Analysis
| 11:48 pm on Jun 11, 2012 (gmt 0)|
Who have they asked in that sample? Even if 80 percent of people slightly like host crowding, but 20 percent strongly dislike it: what does Google win with that outcome? By now, too often they are confronted with user reactions like that: most people don't mind or don't notice their "improvements", whereas a small but vocal fraction is upset to the point of bad PR and leaving the service.
One day this may break Google's neck. Mistakes of a monopolist who has gotten out of touch with his users.
| 11:48 pm on Jun 11, 2012 (gmt 0)|
>> I would therefore have to agree with ascensions: the "herd" has simply shown a preference for this
Then the 'engineers' at Google are idiots. They only became popular because geeks around the world flocked to the service and made it what it is today. I highly doubt anybody with an IQ above 80 finds Google search results that useful.
Once something better comes along, and it will, the real shepherds (not Google) will lead the sheep to something better.
If you look at the price of Google stock lately, the making money theory seems to make sense. Greed is blind.
I miss the 'do no evil' Google.
| 11:56 pm on Jun 11, 2012 (gmt 0)|
I guess I'm confused by what you mean by inline sitelinks.
Are you talking about the horizontal row of, say, four links underneath the snippet?
Or the four full search results all coming from the same domain?
Cause for my sites, I'm seeing BOTH. Four to a page, for the first few pages.
Here's what I mean:
1. Netmeg's Way Cool Site
blah blah blah about the way cool site and what is contained within
sitelink1 sitelink2 sitelink3 sitelink4
2. listing for sitelink1 page
blah blah blah snippet about this page
3. listing for sitelink2 page
blah blah blah snippet about this page too
4. listing for sitelink3 page
blah blah blah snippet about the third page
It does look broken.
[edited by: netmeg at 12:00 am (utc) on Jun 12, 2012]
| 11:57 pm on Jun 11, 2012 (gmt 0)|
|Matt gives quite a few comments about why host crowding is a plus. In fact - listen close - seems almost as if he prefers the old method like all of us do... |
Matt often contradicts himself, often tells us one thing and then does another. I've stopped listening to him. I prefer to listen to my peers and my own experience.
I don't know what they're up to with the new system of displaying results, but I know it's not to improve the user experience, that much is for sure.
I do wonder whether it's in any way related to the phenomenon of traffic throttling.
| 12:30 am on Jun 12, 2012 (gmt 0)|
Edit: Just read the other thread and it makes more sense as to why the above post was just moved here.
@Brinked, you're right, this is a discussion forum and as far as I'm concerned, Outland88 was doing just that, discussing. I feel your remarks were a little unjust and unfair.
Anyway, let's not turn this into a personal debate. Back on topic!
| 12:55 am on Jun 12, 2012 (gmt 0)|
Brinked, that's your opinion, but it's easy to skim stuff that you have no interest in. Your insults towards Outland88 are certainly not contributing to the thread either, are they?
SEO is full of tin foil hat theories, based on our own experiences etc. I personally haven't seen you post personal insults before and your contributions are usually good, so I'm a little saddened to see you post in this manner. Hopefully all of our posts will get deleted and everybody can continue to discuss the subject matter.
| 1:09 am on Jun 12, 2012 (gmt 0)|
@Netmeg, I too am seeing a mix of both, even within the same search.
I'm looking at a search right now that has one website in the first 4 spots, mine in the next two, and then the following result has 4 indented results.
| 2:06 am on Jun 12, 2012 (gmt 0)|
I agree, Brett, if I'm not liking the first result from a site I NEVER click on the 2nd, 3rd, 4th, 5th, 6th, 7th, 8th, 9th etc result from that same site. If the page didn't serve my needs why would a category page linking to that page or a completely unrelated page linking to that page do me any good?
For me this nail ranks right up there with personalized results (on shared computers it's annoying as all heck) and with the butt-load of crap ahead of spot #1.
Google isn't a pure search engine anymore; they are too busy trying to copy everything else that is popular online. Wake up, Google. The following provides a measure of relief by turning off personalized results, and at 100 results per page at least there is no page turning needed to get to the good sites, which are, unfortunately, almost always below the fold now.
Sign out of your Google account for it to work. https://www.google.com/webhp?complete=0&hl=en&num=100&pws=0
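If you want to build (or sanity-check) that kind of depersonalized search URL yourself, the query string can be assembled with the standard library instead of typing it by hand. A minimal sketch, assuming the parameters behave as described in the post above (`pws=0` no personalized results, `num=100` results per page, `complete=0` no autocomplete, `hl=en` interface language); Google may change or ignore these at any time:

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Parameters as described above (their behavior is Google's to change):
#   complete=0  - turn off autocomplete/instant suggestions
#   hl=en       - interface language
#   num=100     - 100 results per page
#   pws=0       - turn off personalized web search
params = {"complete": "0", "hl": "en", "num": "100", "pws": "0"}
url = "https://www.google.com/webhp?" + urlencode(params)
print(url)  # https://www.google.com/webhp?complete=0&hl=en&num=100&pws=0

# Round-trip check: parse the query string back into a dict of lists
parsed = parse_qs(urlparse(url).query)
assert parsed["pws"] == ["0"] and parsed["num"] == ["100"]
```

The same `urlencode` call takes care of escaping if you later add a `q=` search term containing spaces or special characters.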
| 3:22 am on Jun 12, 2012 (gmt 0)|
They've shown multiple pages from my site for a while now but this is the first time I've noticed they show the sitelinks and THEN the same pages. It's totally redundant, and I don't get it. They are the most popular pages on the site, but not so nice they need to be there twice, as it were.
| 3:47 am on Jun 12, 2012 (gmt 0)|
I'm seeing a lot of this sort of thing too .. being logged out makes it marginally better IMO.
I like Bing's sitelinks layout .. I can find actual addresses and streets in Bing Maps, whereas Google Maps only shows the general locations whilst searching for obscure out-of-the-way places ..
I do like Google's spellcheck quite a bit ..
Over the months, Bing has been whittling away at Google .. bit by tiny bit .. if this keeps up, there isn't going to be too much left for Google to do .. 'cept sit there and pine away over its loss to Bing.
This whole business of having to look at the same domain occupy the first 5 or 6 positions in the SERPs is probably one of the dumbest things I've ever seen .. all it does is get me to page two quicker, provided that I don't leave first.
Who would have ever thought that the big-ole-Google-killer would be Google itself?
| 3:54 am on Jun 12, 2012 (gmt 0)|
|They've shown multiple pages from my site for a while now but this is the first time I've noticed they show the sitelinks and THEN the same pages. |
They might be testing the click-throughs to see how the sitelinks compare to regular results. Yes, I do agree that click-throughs for sitelinks are always going to be lower, as most normal users would be blind to them. But still, they might be testing it. However, I would feel that it isn't Google's job to send users to every relevant page on a site for the user query. Google essentially has to focus on sending users to the best page on a domain for the query. It is the job of the site concerned to lead users to other relevant pages.
| 4:48 am on Jun 12, 2012 (gmt 0)|
Are most users blind to sitelinks? I love them, when they have the page I wanted - store locator pages are a good example. I often look up retailers just for that, and it's much easier to click the "store locator" sitelink than try to find the link on the retailer's site.
| 4:56 am on Jun 12, 2012 (gmt 0)|
Wow. This has to be the stupidest thing they've done yet. I just did a common search to find out what everyone is talking about. The same three sites appear multiple times on pages 2,3,4,5. Why would anyone want this?
G have well and truly lost the plot.
| 5:01 am on Jun 12, 2012 (gmt 0)|
(Note that this is from a very sloppy series of test searches, not what I'd call exhaustive or systematic).
This seems to be the rollout of something new, but also likely to be a full spectrum test. It varies according to type of query and type of site.
For ecommerce searches, I'm seeing the pattern I reported above... "2 or 3 additional inline refinements" on the top several results, showing departments within the site returned.
I'm assuming that what netmeg saw was on a search where we'd previously have expected Sitelinks or mini-sitelinks, and I tried searches where I know a site is dominant, and also navigational queries.
I'm not yet seeing the full monty that netmeg's seeing, but I am seeing a larger mixture of different types of refinements than I have before. That does smell like a test.
The pecking order makes some sense in terms of type of site and search vs what's likely to be most useful. The general pattern I'm seeing is that a search on which a site is fairly dominant will return serps corresponding to one of the following, in order of increasing importance...
- inline mini-sitelinks
- inline mini-sitelinks and an additional page as a second listing
- six sitelinks (in two columns) and an additional page as a second listing...
And, as you start including terms contained in the domain names, moving towards a navigational search...
- you continue getting the six Sitelinks, and the additional page below turns into a plus box offering a "more results from" link.
- then, inline mini-sitelinks are added to the above...
- and then, the ultimate that I've seen: with a navigational search for the New York Times, I'm seeing mini-sitelinks, each on its own line and each a top story headline, and, instead of a "more results from" link, I'm seeing a search box below the six Sitelinks in two columns.
Haven't seen a search box in the serps for a while. Also, I think I'm seeing all this change over time. I should note that I'm seeing these results whether I'm signed in or not.
On a site when I'm signed in on an account where I haven't yet enrolled in Google+, though, and have Web History turned off, I'm getting this additional message on some results (in place of the number of pages normally displayed)...
|80 personal results. This is a limited preview. Upgrade to Google+ |
The number varies, and the "xx personal results" is a link. I tried the link, and candidly... no axe to grind... I very much prefer the unpersonalized results. The personalized results, IMO, are a total filter bubble.
netmeg - You may have gotten a personalized narcissistic serp jackpot. ;) I'm curious how long your particular mix continues, and whether or not the results you're seeing are somewhere labeled as personalized.
| 5:07 am on Jun 12, 2012 (gmt 0)|
PS: Very rough tests using "set location" to vary location suggest that these refinements are likely to be location independent.
PPS: I'm also noticing on a test search that a current announcement (as in QDF) on a blog produces a single inline Sitelink that leads to a permalink of the announcement.
| 8:47 am on Jun 12, 2012 (gmt 0)|
personalized narcissistic serp jackpot is totally gonna be the name of my band
| 12:10 pm on Jun 12, 2012 (gmt 0)|
So these guys are smart?
I think that some advice to sites affected by the Panda fiasco was to use subdomains. Now this seems to be somewhat out of favour with the borG.
| 12:33 pm on Jun 12, 2012 (gmt 0)|
I often see forum sites in the serps ranking with 4 or 5 threads about the topic right under the main listing for that site. If I search for "blue widgets" there is actually a forum site ranking like that:
If that is what you are talking about, I for one, find this to be very helpful. Often there is a specific forum site about the topic I'm looking for and they have multiple threads where other sites don't even have one good thread. I've found myself clicking on all of the threads from the site with multiple listings in the serps to find the best thread on the forum site in question.
[edited by: engine at 3:44 pm (utc) on Aug 15, 2012]
| 12:59 pm on Jun 12, 2012 (gmt 0)|
Deadsea, the issue isn't with those results, it's with the new phenomenon of there being several individual results for the same website, one after the other. So for example, 4 results out of the 10, taken by a single site.
| 1:59 pm on Jun 12, 2012 (gmt 0)|
I've done about 50 searches this morning and I've only seen this once. The only example that I was able to find was "range rover". The serp has the expanded section from landrover.com followed by three additional listings for landrover.com before moving on to wikipedia.
Matt Cutts was talking about "HP" and another SEO site was talking about "bjs menu". Neither of those queries has extra listings from a single domain for me. It seems like this thread is making a mountain out of a molehill. Either that, or it's just not fully rolled out and I don't see it yet.
[edited by: engine at 3:44 pm (utc) on Aug 15, 2012]
| This 145 message thread spans 5 pages |