| 2:19 pm on Jun 12, 2012 (gmt 0)|
Search for brown leather Nokia theme
| 3:39 pm on Jun 12, 2012 (gmt 0)|
I just found a site with 9 (this is a record for me) listings above ours!
Now I understand why a couple of terms took a nose dive last week. We actually moved up in most places if you count 'sites', but a couple of the lower-ranking terms took a real nose dive if you count 'pages'.
This is why the higher-ranking phrases seemed to stay static or move a small amount up/down while the lower-ranking pages dived.
The recent Panda update made virtually no difference other than the host crowding issue.
Do we have to wait for the next Panda update for this to correct? It's going to be a painful month for some people... and marvellous for those with 9 listings!
| 4:00 pm on Jun 12, 2012 (gmt 0)|
New? It's been this way for months.
| 4:10 pm on Jun 12, 2012 (gmt 0)|
|Search for brown leather Nokia theme |
Wow, that's bizarre. Same domain over and over, and then another domain with sitelinks in the middle of the page. Never seen a sitelink domain in the middle.
There may well be some testing going on, but it definitely looks broken, too. I wonder if the way the results display is messed up?
| 4:43 pm on Jun 12, 2012 (gmt 0)|
|new? its been this way for months. |
Yep, it certainly has been in my sector; however, I'm now seeing it even worse, and G's really loving Chinese directory results.
| 5:14 pm on Jun 12, 2012 (gmt 0)|
Well, it's almost Google vacation time (pretty much all of July and August for the upper echelon at Google), so it's likely another problem that won't be acted on until September, if ever. Plus the stock price doesn't seem to be as heavily linked to Search as it once was.
| 5:41 pm on Jun 12, 2012 (gmt 0)|
Here is an absolutely priceless contradiction from Matt Cutts' blog...early 2011
|Seven of the top 10 results all came from one domain, and the urls look a little… well, let’s say fishy. In 1999 and early 2000, search engines would often return 50 results from the same domain in the search results. One nice change that Google introduced in February 2000 was “host crowding,” which only showed two results from each hostname (here’s what a hostname is). Suddenly, Google’s search results were much cleaner and more diverse! It was a really nice win–we even got email fan letters. |
I don't think Cutts has been behind the recent disaster of excessive duplicate-domain results. Amit Singhal probably is, or Larry Page made an executive decision on the matter.
I think it is so funny that Cutts brags about receiving fan mail for reducing duplicate domain links (which is a great thing), then reverses course with this week's most recent video, talking about how it can be appropriate for websites to have more than two listings on a search page (which is flat-out false).
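For anyone unfamiliar with the mechanism Cutts describes in that quote, the 2000-era "host crowding" filter can be sketched as a simple post-ranking pass that caps how many results each hostname may contribute. This is a minimal illustration of the idea only, not Google's actual implementation, and the URLs are made-up examples:

```python
from urllib.parse import urlsplit

def host_crowd(results, max_per_host=2):
    """Keep at most `max_per_host` results per hostname,
    preserving the original ranking order."""
    counts = {}
    kept = []
    for url in results:
        host = urlsplit(url).hostname
        counts[host] = counts.get(host, 0) + 1
        if counts[host] <= max_per_host:
            kept.append(url)
    return kept

# Hypothetical ranked results: four from one host, two from another.
ranked = [
    "http://example.com/a",
    "http://example.com/b",
    "http://example.com/c",
    "http://other.org/x",
    "http://example.com/d",
    "http://other.org/y",
]
print(host_crowd(ranked))
```

With the default cap of two, the third and fourth example.com results are dropped while the two other.org results survive, which is exactly the "much cleaner and more diverse" page the quote is talking about.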
| 5:46 pm on Jun 12, 2012 (gmt 0)|
People with short memories think/hope that everyone else has short memories too... Maybe Matt can do like Eric suggests, and change his name? :-)
| 5:49 pm on Jun 12, 2012 (gmt 0)|
heh .. the way it used to be? .. It appears that Google has gone back to the way it used to be :)
Be careful what you wish for
| 6:00 pm on Jun 12, 2012 (gmt 0)|
One that's new on me: for a product search, page 2 has 7 results for Amazon (that's not new), but each one of those separate results is a separate customer review. 2 lines of a review plus 188 links to different products on each of those pages.
When this domain crowding first hit, it was mostly focused on sites that were highly on-topic and strong for the search, but now it just seems to be any domain that has authority, whether or not it is strong in brand terms (or any other terms) for the search. Crazy.
| 6:06 pm on Jun 12, 2012 (gmt 0)|
@smithaa02 ha, thanks for the citation!
And yes, this is pretty bad. I see only about 3 domains on page 1 for lots of queries. But hey, if these sites are of such high quality, then why not show many, many pages from them?
| 6:11 pm on Jun 12, 2012 (gmt 0)|
My son loves mac and cheese, so I did a search for mac and cheese recipes.
1 site has the top 3 results for this search, and then that same site has 5 consecutive results on the 2nd page and 7 consecutive results on the 3rd page. This is unacceptable. Having mass results from 1 domain like this is not in the user's best interest.
You cannot even hunt for diversity, because the same site is listed on pages 2, 3, 4, etc., so what is the point? Not only that, there are only a handful of domains displaying in the top 5 pages.
These websites usually have a listing page of all their related recipes, so ideally this domain would have 1 result in Google, that listing page, so you can simply browse all the different recipes the site has rather than seeing them all on Google.
| 6:29 pm on Jun 12, 2012 (gmt 0)|
|I think it is so funny that Cutts brags about receiving fan mail for reducing duplicate domain links (which is a great thing) then reverses course with this weeks most recent video talking about how it can be appropriate for websites to have more than two listings on a search page (which is flat false). |
And that ends the speculation that maybe users like it the way it is right now. They didn't then, and they haven't gotten stupider.
| 6:31 pm on Jun 12, 2012 (gmt 0)|
What I am finding interesting is that people are simply skipping past the results - they do not seem to care - for now.
If there ever was a chance for Bing...
| 6:41 pm on Jun 12, 2012 (gmt 0)|
Who cares what Google does? It's their SERP; they can do what they please with it, and they do.
Until webmasters actually do something to encourage visitors to use another search engine, nothing is likely to change.
And it appears that most webmasters would rather whine about Google than do anything to change user behavior.
| 6:56 pm on Jun 12, 2012 (gmt 0)|
brinked cites an example that goes way beyond anything I'd encountered. I now understand the downside of the situation. While I don't want to open up the forum to discussing other specific searches, I think this search is noteworthy enough that we should consider it as an example of the problem...
[mac and cheese recipes]
|1 site has the top 3 results for this search, and then that same site has 5 consecutive results on the 2nd page and 7 consecutive results on the 3rd page. This is unacceptable. Having mass results from 1 domain like this is not in the users best interest. |
In the remaining positions, several other domains also have more than what I'd call a reasonable share. We're certainly not going to distort the results by mentioning the search terms here. ;)
IMO, this looks even worse than content farms.
| 7:13 pm on Jun 12, 2012 (gmt 0)|
What I don't quite understand is what's in it for Google to become just an extension of Amazon's navigation (or that of the other site reported by brinked).
In other words, why would it not be more beneficial for Google to pick the one most relevant of the 8 results they show, hand the searcher over to Amazon, and let THEM guide people further to other relevant products? Who would know best what's relevant: Google, or Amazon, who has been selling those products for years and has customers' behavioral data, sales history, customer reviews, etc.?
And, in any case, why give them (or anyone else in the same position) this extraordinary amount of screen real estate? The funny thing is that Amazon is probably the one website that needs free Google traffic the least. I cannot recall the last time I started ordering something from Amazon other than by typing "amazon.com" into the address bar of my browser.
(the 8 Amazon results on one SERP are the example from the other thread: [webmasterworld.com...] )
| 7:43 pm on Jun 12, 2012 (gmt 0)|
Thanks for not deleting my example, Robert. I don't like posting specific examples either, and I know WebmasterWorld discourages it for the most part. But since this specific search is something a normal, typical family would do, and isn't tied to the ecommerce industry like most of what people are complaining about, I figured I would take the risk and post it.
As webmasters, we like to complain about changes that affect our websites negatively. This is something that is, without a doubt, affecting my experience as a user, and I think that is what we need to emphasize here.
Besides being a webmaster, I am a regular Google search user who searches for a variety of different things on a day-to-day basis. That experience has taken a negative hit in the last 2 weeks. No, I am not going to pretend that it is forcing me to use Bing, and no, I am not going to exaggerate and claim I can no longer find what I am looking for. I still use Google and I still (eventually) find what I am looking for, after tweaking the search phrase a few times and digging deeper into the SERPs.
The example I mentioned is not even close to being the most extreme case I have encountered today. I only posted it because it is a neutral search, something I have no business in, and I doubt anyone here does.
Hopefully Google fixes this soon... I will not be switching to Bing any time soon, but for now it does make finding what I am looking for harder for the most part.
| 8:37 pm on Jun 12, 2012 (gmt 0)|
Very true; in fact, until I read your description it looked fine to me, compared to what I see in my niche in the UK. For one search I just did, we got pushed down 20 places due to these factors.
|The example I mentioned, is not even close to being the most extreme case |
| 8:54 pm on Jun 12, 2012 (gmt 0)|
These occurrences were pointed out in previous threads, and apparently some felt it was a conspiracy confined only to the Amazon domain. It also seems to be affecting different areas drastically at different times. I don't see it clearing up as quickly as some alluded to.
Personally, I see it as some type of reverse testing, where you're summing the game out to zero and reapplying the layers to see where the problems lie. Since webmasters seem to be reporting it from many areas, it might logically be a bug. In other words, any PhD trained in testing methodology is not going to risk that much flak, possibly coming from multiple directions, with a flawed method. The tricky thing with this problem is that when you run Panda or Penguin you may strike a whole new set of sites at various times, because the variables are ever-changing, especially if you're trying to create self-learning. I don't see it as that either. Is this the untouched core set?
Bottom line is you're dealing in guess-o-matics, and without transparency many could be rebuilding sites needlessly. In the old days the answer was to minus the situation out of the equation. In other words, fifty competitive search engines usually kept each other from going off on tangents that threatened all.
| 9:05 pm on Jun 12, 2012 (gmt 0)|
I just saw one that took up 16 or 17 positions over 3 pages, yet oddly I do not see the effect at all in many niches where I'd expect to see it. Test?
The Matt Cutts video, which Brett posted at the start of this thread, clearly lays out the positives and negatives of host crowding, along with the negatives of dropping it... and I feel at the end Matt hints that this might be a test. I can imagine a discussion in a Google meeting room where the approach was hashed out, and ultimately the opponents said, "OK, let's try it and see what kind of data we get."
The video is interesting to see again in light of the new results....
How does Google decide when to display multiple results from the same website?
Matt Cutts - June 11, 2012
| 9:14 pm on Jun 12, 2012 (gmt 0)|
I think the helicopter (or drone!) view is maybe "what makes the money" - if ad click rates increase as a result of multiple listings, someone somewhere will say that in a meeting.
This may be a time when it will be interesting to watch Google's response.
| 9:22 pm on Jun 12, 2012 (gmt 0)|
When did the terminology "host crowding" begin? Just looking at the phrase makes me think of a particular hosting company, not domains.
| 9:29 pm on Jun 12, 2012 (gmt 0)|
I wish I could post examples here. I'm seeing some incredibly messed up results for popular electronic product searches in Australia.
The common pattern is that no matter what page you look at, it's the same sites repeated over and over. So, page 1 is up to 70% the same sites as pages 2, 3, 4, 5, 6, etc...
In effect, the top 10 results now have the chance of spanning multiple pages.
Mr. Average User comes along... doesn't find what he wants on page 1. Skips to page 2, finds the same sites. Page 3, same thing. Choice has gone. He probably thinks they're paid results of some kind, as the same sites are being pushed on him over and over...
WTF? Does Google want to drive people away? It seems someone's literally gone insane with all these changes. The door for others to take over this space widens with every new stupid update they make.
The days of G being a leader are gone. The company has jumped the shark, keeps going for the money grab (Google Shopping), is literally forcing people to use its social network (G+, Places), and slowly but surely a range of other services (which work better, are easier, and have mobile integration) will replace the need for G in the general public.
Apple's iOS 6 Maps, Passbook, and Facebook integration are one example of many that further reduce the dependency on the not-so-big G.
Google has turned from a useful tool that helped find information into a commercial behemoth that is willing to sacrifice quality in order to make more $$$. There's no other explanation, IMHO, that fits anymore.
[edited by: anteck at 9:40 pm (utc) on Jun 12, 2012]
| 9:31 pm on Jun 12, 2012 (gmt 0)|
|In other words, why would it be beneficial for Google to not pick the one most relevant out of the 8 results they show and then simply hand it over to Amazon and let THEM guide people further to other relevant products. |
Well, actually, it would probably benefit Google if they could get people to keep bouncing back to Google for more Amazon results rather than turning that searcher over to Amazon for the rest of the day. But I don't think this is going to accomplish that. This is just annoying, and makes you want to use anybody else's search rather than Google's.
| 11:34 pm on Jun 12, 2012 (gmt 0)|
Maybe somebody should create a "small sites" search engine that excludes big sites like wikipedia, amazon, youtube and ehow from the results. Such a search engine would make it easier for people to explore less-visited parts of the web.
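One crude way to prototype that idea would be to post-filter another engine's results against a blocklist of big domains. A minimal sketch, using only the sites named in the post as the blocklist and made-up URLs as input; a real implementation would need proper registrable-domain matching:

```python
from urllib.parse import urlsplit

# Blocklist drawn from the sites named above; a real engine would need
# a much longer list (or a popularity threshold instead).
BIG_SITES = {"wikipedia.org", "amazon.com", "youtube.com", "ehow.com"}

def small_sites_only(results):
    """Drop any result whose hostname is (or is a subdomain of)
    a blocklisted big site, keeping the original order."""
    kept = []
    for url in results:
        host = urlsplit(url).hostname or ""
        # Match the bare domain and any subdomain (en.wikipedia.org, etc.)
        blocked = any(host == d or host.endswith("." + d) for d in BIG_SITES)
        if not blocked:
            kept.append(url)
    return kept
```

So `small_sites_only(["http://en.wikipedia.org/wiki/Macaroni", "http://smallfoodblog.net/mac-and-cheese"])` would keep only the small blog's URL, surfacing exactly the less-visited parts of the web the post is asking for.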
| 1:59 am on Jun 13, 2012 (gmt 0)|
|Maybe somebody should create a "small sites" search engine that excludes big sites like wikipedia, amazon, youtube and ehow from the results. Such a search engine would make it easier for people to explore less-visited parts of the web. |
um .. working on that
| 2:33 am on Jun 13, 2012 (gmt 0)|
Wow... I used the search term brinked suggested above. The first 16 results on Google (I have it set to show 50 results per page) were all from the same domain. Even the videos/news interspersed among the results were from the same domain.
For such a generic search term, those results are unacceptable. I can understand, to a degree, what Google is doing. Sometimes for very specific searches there simply may be no better results - so Google displays the best of the bunch from one domain. But for a generic term like "Mac and Cheese Recipes", the results shown are ridiculous.
| 5:03 am on Jun 13, 2012 (gmt 0)|
An interesting development with brinked's search, and I'm not sure whether to post it here or start a new thread, but it's apparently a part of the interface "refinements" we're seeing. It may only coincidentally correspond to a search that also showed many results from brand authority sites.
The [mac and cheese recipes] search now appears to trigger an interactive recipe builder, still a little buggy and, I get the sense, a work in progress. It may be nothing more than the Google "Recipes" channel switching on automatically. While I often use Google for cooking info, I've never gone into the "Recipes" channel before, so I can't say how it behaved previously.
Now, though, various food terms will trigger it, including words/phrases like "stir fry", "microwave", "egg salad", etc. Here's what I'm seeing when it triggers....
On the left, apparently by default (as if I'd selected "Recipes", but not exactly), there are three sections of selectors, with headings in red, for...
Any cook time
For brinked's search... "Ingredients" has 'Yes/No' check boxes for things like macaroni, various types of cheese, dry mustard, basil, breadcrumbs, milk, butter, etc. The list changes as you make your selections. As soon as you select one of the ingredients, "Recipes" in Google's channels list becomes highlighted in red.
And, at the bottom of the page, I'm seeing intermittently (apparently applying to recipes, not to the serps)...
The link with info for recipe publishers jumps you to...
Rich snippets - Recipes
Note that also this evening, it's been reported in the Updates thread that Google is not auto-selecting "Location" for you, and I'm seeing that too. Last word was that it was a bug in Google Local, but I was thinking that the entire left side Search Settings menu might currently be a work in progress.
Also, with regard to the above, I'm still seeing non-recipe searches showing multiple results.
| 7:59 am on Jun 13, 2012 (gmt 0)|
|I can imagine a discussion in a Google meeting room where the approach was hashed out, and ultimately the opponents said, "OK, let's try it and see what kind of data we get." |
You'd think, though, that a test with just a small sample of users could get them plenty of data to make that call. Everyone's seeing it.
So is this the Mac-and-Cheese update? I'm allergic to cheese, so it seems a good fit.
| 8:03 am on Jun 13, 2012 (gmt 0)|
The funny thing is, search for a product and you get 70% Amazon extreme domain crowding; search just for Amazon and you get a variety of sites.