| 8:56 am on Sep 17, 2012 (gmt 0)|
Seconding the TA results - seems worse than before too - "(town) hotel reviews" seems to be 10/10 TA results now for multiple locations.
Also, is anyone seeing a drop in regional relevancy? Searching for a tradesman in Liverpool (UK city) returns some .au results (an Australian business directory listing site). That's on Google.com - on Google.co.uk the top 10 is all UK results. Definitely wasn't seeing this before the weekend.
I've got a load of hotel clients on the books and I'm not really seeing any traffic impact, although our organic efforts have largely given review terms a miss.
| 8:58 am on Sep 17, 2012 (gmt 0)|
|@Robert Charlton TA continues crowding all over the place, no change here |
TA is, or at least was, a different issue from most of the multiple-results cases discussed. TA's large numbers involve multiple ccTLDs ranking different pages on the same query, with enough pages satisfying a search that they often get a lot of pages per search - but different ones for each ccTLD. Check out the links I suggested in the host crowding thread, read those threads, and run some searches and look at them carefully.
With the host "uncrowding" increasing the number of results per domain, TA may have gone from two to say three or four per domain, increasing their overall number... and they do have a lot of domains, all apparently individually active and well linked.
But most host "uncrowding" results don't involve multiple ccTLDs per company, so it is a different situation.
And again, the competition for many of their searches is often thin. Many competing sites I've checked have obviously bought or scraped reviews, with not much original content. There's probably some of that on TA too, but we don't want to get into particular sites.
| 9:07 am on Sep 17, 2012 (gmt 0)|
Whatever they fixed with this one, it looks like they've broken something else, as usual.
| 9:23 am on Sep 17, 2012 (gmt 0)|
In the particular TA search I looked at, there were 4 results from .com and 4 from .co.uk. In some sense this is understandable, but really Google can see these sites are interlinked and could certainly treat them as localized sites which is how they are intended. Return TA.co.uk to searchers in the UK and TA.com to searchers in the US. Is that really so hard for them to get their algorithm around?
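As an illustration only (not Google's actual mechanism), here's a minimal sketch of the ccTLD handling the post above is asking for: treat interlinked country sites as one logical entity and keep only the variant matching the searcher's country. The sibling map, domain names, and country codes are all hypothetical.

```python
# Sketch: collapse interlinked ccTLD siblings into one logical site,
# then keep only the variant matching the searcher's country.
# The domain groups and country mappings are hypothetical examples.

CCTLD_SIBLINGS = {
    "example.com": "example",
    "example.co.uk": "example",
    "example.com.au": "example",
}

PREFERRED = {  # searcher country -> preferred ccTLD suffix
    "UK": ".co.uk",
    "US": ".com",
    "AU": ".com.au",
}

def localize(results, country):
    """Drop sibling-domain duplicates, keeping the locale-matched variant."""
    suffix = PREFERRED.get(country, ".com")
    # Rank locale-matched domains first so they win the dedupe
    # (a real system would merge ranking signals instead of re-sorting).
    ordered = sorted(results, key=lambda d: not d.endswith(suffix))
    seen, out = set(), []
    for domain in ordered:
        entity = CCTLD_SIBLINGS.get(domain, domain)
        if entity not in seen:
            seen.add(entity)
            out.append(domain)
    return out

print(localize(["example.com", "example.co.uk", "other.org"], "UK"))
# -> ['example.co.uk', 'other.org']
```

The point of the toy: once Google knows the domains are siblings, showing one per entity per locale is a simple dedupe, which is why the "is it really so hard?" question gets asked.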
| 12:45 pm on Sep 17, 2012 (gmt 0)|
>That type of multi-result is not happening anymore.
Definitely not what I'm seeing locally.
>In my niche they did the opposite, now there is more from one domain than before.
Spot on. On some local language serps I'm seeing pages after pages with results from the same site.
| 12:51 pm on Sep 17, 2012 (gmt 0)|
I'm seeing a lot of scraping scum back at the top...not nice nor useful.
| 12:55 pm on Sep 17, 2012 (gmt 0)|
ditto... ehow just took some major jumps upwards... now sitting at #1 for many things, above the sites that its "writers" respun the content from.
| 5:28 pm on Sep 17, 2012 (gmt 0)|
|there is another issue: it makes a site look spammy and may destroy its brand in the longer term. people hate being spammed with the same domain over and over - they want diversity and choice, pretty basic stuff. |
as a webmaster, i definitely would assume black-hat tactics if i didn't know better.
This is a good point.
| 6:20 pm on Sep 17, 2012 (gmt 0)|
Have to say, things look a lot better in my niche in terms of diversity. Generally, the first 2-3 pages have a great mix of domains. After that, I see the same clumping as before, but really, what difference does it make what page 5 looks like?
I haven't experienced any noticeable gains or losses, but it does make me feel good to see results that make a little more sense to me.
| 7:06 pm on Sep 17, 2012 (gmt 0)|
Google pretends to fail, displaying worthless SERPs that drive revenue to AdWords "by mistake".
Truth be told, how many URLs to show from one domain is a setting; it has nothing to do with the algorithm. But since the lie will be repeated a multitude of times, because parroting Matt Cutts is default behavior, most will believe the googly "truth".
| 7:23 pm on Sep 17, 2012 (gmt 0)|
|ditto ..ehow just took some major jumps upwards |
Yep. Noticed one yesterday. Top four positions: two identical videos (one was 1 second longer) and two content articles with nearly identical titles.
Search term was about "measuring a widget"
| 7:51 pm on Sep 17, 2012 (gmt 0)|
|Truth be told, the display of how many urls to show from one domain is a setting, it has nothing to do with the algorithm but since the lie will be repeated a multitude of times because parroting Matt Cutts is default behavior most will believe the googly "truth". |
Because somehow YOU have inside access to the *real* truth...
| 8:57 pm on Sep 17, 2012 (gmt 0)|
Ehow's rise in this latest tweak can't be missed and it's not for a lack of quality choices as Tedster suggests.
It's becoming as big a blight on the internet as scrapers are.
The articles I have read are plagiarized - written and researched by Ehow writers from content already available online from the original sources.
That Google does not recognise them as such, but instead rewards them by placing four Ehow results above the original articles, is scurrilous.
Google is rewarding a factory of people rewriting someone else's content.
| 10:01 pm on Sep 17, 2012 (gmt 0)|
|Because somehow YOU have inside access to the *real* truth... |
I don't think one really needs inside access. All you need is some common sense and you may see the *light*. No, it isn't rain....
Your continued defense for the un-defensible is priceless... :))))
| 10:32 pm on Sep 17, 2012 (gmt 0)|
|...the display of how many urls to show from one domain... |
This is probably a very difficult problem. Google is a data-driven company, and clearly, the data has shown Google that the old maximum of two results per domain isn't always best for its users. Lots of considerations are involved in determining what is the best number, and Google is not pretending that there's an easy or an arbitrary answer.
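To make the "setting" vs. "algorithm" debate from the posts above concrete, here's a toy sketch of host crowding as a post-ranking filter with a configurable per-domain cap. This is purely illustrative - Google's real mechanism isn't public, and the domains, URLs, and cap values are made up.

```python
# Toy host-crowding filter: cap how many results a single domain may
# contribute to a results page. All domains and URLs are made-up examples.

def crowd_limit(results, max_per_domain=2):
    """Keep at most max_per_domain entries per domain, preserving rank order."""
    counts = {}
    kept = []
    for domain, url in results:
        counts[domain] = counts.get(domain, 0) + 1
        if counts[domain] <= max_per_domain:
            kept.append((domain, url))
    return kept

serp = [
    ("ta.example", "/hotel-1"),
    ("ta.example", "/hotel-2"),
    ("ta.example", "/hotel-3"),
    ("indie.example", "/reviews"),
]

print(crowd_limit(serp))                    # old-style cap of two per domain
print(crowd_limit(serp, max_per_domain=4))  # looser, "uncrowded" behavior
```

Even in this toy, the cap is just a parameter on a filter applied after ranking - which is why both "it's only a setting" and "finding the best value is a hard data-driven problem" can be true at the same time.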
A while back, in a discussion about pdfs, phranque posted a link to a Matt Cutts video discussing the difficulties for Google in deciding when to return a pdf in the serps. The video gave me some insight into Google's view of the serps in general, and I feel it gives some perspective on this current situation. Instead of 'web pages vs PDF documents', think 'number of results per domain'....
What are the best practices for PDF optimization?
GoogleWebmasterHelp - uploaded Aug 8, 2011
Matt says (starting at c:45 sec in)...
|...But there's a more interesting question underneath this question to me, which is, how do you rank web pages vs PDF documents?... Google's philosophy is to try to determine, as best we can, what's the utility of the next result? Is the user better returned by serving a PDF? Or are they better served by returning a web document? And it's a really hard problem.... |
|...It's an imperfect science. It's much more of an art than a science, because different people will have different philosophies. Some people don't like to get PDFs.... But we try our best to say... what's the best match for the user? What's going to give them the best value and help them out the most in terms of their information need? |
I think it's a gutsy move for Google to run the tests and sort this stuff out, even if it means that they risk looking bad for a while.
| 10:36 pm on Sep 17, 2012 (gmt 0)|
|I don't think one really need inside access. All you need is some common sense and you may see the *light*. No it isn't rain.... |
|Your continued defense for the un-defensible is priceless... :)))) |
Obviously you're new here.
| 1:03 am on Sep 18, 2012 (gmt 0)|
Not sure of the improvements from what I'm seeing in the verticals I monitor. RustyBrick has some screenshots here [seroundtable.com...] as well to reinforce that belief.
| 3:14 am on Sep 18, 2012 (gmt 0)|
|Also, is anyone seeing a drop in regional relevancy? Searched for a tradesman in Liverpool (UK city) returns some .au results (Australian business directory listing site). |
In Google Australia, "Newcastle" searches used to give us Newcastle, UK results for many years. Must have taken years for someone in Newcastle to complain, but now the top 50 or so results in my tests are very much Newcastle, NSW.
Yet, the algo can't shake off the possibility that an Australian searching for a (specific tradesman) in Newcastle just might, on the offchance, be looking for one in the UK.
For (specific profession), all 100 top results were from Australia. There is some oddity with the trades and some services such as pizza, where the UK results keep coming back. Yeah, as if we would order pizza from the UK. (Yes, I have read about some takeaways who are famous for international deliveries).
In some searches, I am still getting dozens of Yellow Pages results at the top.
| 10:37 am on Sep 18, 2012 (gmt 0)|
Does anyone see a correlation between this update bringing back host crowding instead of host spamming, and the valuation of backlinks from one host? (I'm not talking about footer links, e.g. the same link on all subpages, but different links pointing to one external host from one host.)
imho it looks as if these multiple links from one host are calculated differently w/ this update.
| 12:08 pm on Sep 18, 2012 (gmt 0)|
In my niche this change did NOT work as intended. Now the top 8 results are youtube videos followed by 2 youtube links. There is nothing but Google above the fold.
| 12:27 pm on Sep 18, 2012 (gmt 0)|
yes, youtube seems to be excluded from host crowding and is the new host spammer in the serps. It's the old game of google sending us to google again :/
| 6:18 pm on Sep 18, 2012 (gmt 0)|
But couldn't user engagement be promoted or affected by the repetitive cycle of the top results, even if not valuable, continually being clicked? Granted, time on page would diminish. So there should be some element that helps break that cycle or introduces fresh options into a user's search.
| 6:26 pm on Sep 18, 2012 (gmt 0)|
|...introduce fresh options... |
Welcome to Webmaster World lostnirvana.
|But couldn't user engagement be promoted or affected by the repetitive cycle of the top results, even if not valuable, continually being clicked. |
Very much so, thanks for that :)
| 7:47 pm on Sep 18, 2012 (gmt 0)|
|Not sure of the improvements from what I'm seeing in verticals i monitor. RustyBrick has some screenshots here [seroundtable.com...] as well to reinforce that belief. |
Not saying this update worked/didn't work, but it's not the best example imo since it's a navigational search.
| 2:10 am on Sep 20, 2012 (gmt 0)|
I also see relatively low domain diversity in SERPs that are either navigational or considered local by Google. If anything, the dial on what is "local" seems to be set pretty loosely, as many of these queries would be happily satisfied by online business that ship nationally or even internationally.
To say it another way, a lot of queries that I don't think of as local still clearly trigger local responses, with more domain cramming and little diversity. Maybe that "local" flag on the query makes more sense on mobile devices, but I'm also seeing these results on laptops and desktops.
| 5:49 pm on Sep 29, 2012 (gmt 0)|
I have a site that was negatively affected on the 14th - so does it mean that fewer URLs are now showing up for a query that used to show more? I don't understand.
| 8:02 pm on Oct 2, 2012 (gmt 0)|
< moved from another location >
I'm new here, so hello everyone.
What I've joined for and would like to discuss is why Google is showing multiple results from the same site in search results. I've read through most of this post [webmasterworld.com...] and wanted to add my thoughts.
I've come to the conclusion that Google is doing this on purpose, and I'll explain why.
Here goes (please let me know your thoughts):
Since I've noticed Google showing multiple results from the same site, I've also noticed the cost of my keywords going up - some as much as 500%. My stats have remained pretty even, but the cost keeps going up.
Now for the conspiracy theory.
Google is purposely showing multiple results from one site so they fill up the results with duplicates that sometimes make no sense. There are what seem to be repeated links, with duplicate content, etc. So where you used to be on page 1, maybe listed 2-9, now you are listed 50th or lower and nowhere near where you used to be.
I asked myself, why would Google do this? Why would they make the results so awful? Well, my theory is they are doing it to force people to click on their AdWords ads.
How, you ask? Well, think about it. If you see the same site over and over again in the search results, what is the only thing that's different? The AdWords ad. What's the normal person going to do? Click on the AdWords ad - because really that's the only different thing they're seeing in the results. Google makes money.
Now, why is the cost of my keywords going up? Well, aren't those keywords more popular now? I mean, if you want to be on the first page, you HAVE to purchase AdWords to be listed on it now, right? That drives up the prices of those keywords - Google makes money.
I am trying so hard to figure this out and I believe this is the only reasonable answer. I mean why would Google show results this bad? Why would they risk someone changing search engines and losing that person? I honestly think the average person isn't even going to notice.
I honestly think this was Google's plan from the beginning with the "updates". With these changes Google is blaming the penguins, or a "bug", but in the end it drives people to click on AdWords and forces companies to pay for the keywords they need to stay relevant.
Didn't Google have a record-setting quarter for click advertising? I forget which quarter, but it was up 67%! That's huge. Coincidence?
I would love to hear other people's opinions. I have been trying to get response on Google's forum, but nothing.
FYI, I've been in SEO and web design for 20 years and try to keep up with anything related to the business. The multiple results and horrible listings have been driving me crazy. I want things to change, but I have no idea how to do it, except try to understand it.
[edited by: tedster at 1:53 am (utc) on Oct 3, 2012]