Interesting. The item below goes beyond what Matt Cutts tweeted at the time:
|Improvements to Penguin. [launch codename "twref2", project codename "Page Quality"] This month we rolled out a couple minor tweaks to improve signals and refresh the data used by the penguin algorithm. |
The data refresh was all that he mentioned, not the "minor tweaks."
|Freshness algorithm simplifications. [launch codename "febofu", project codename "Freshness"] This month we rolled out a simplification to our freshness algorithms, which will make it easier to understand bugs and tune signals. |
A rollback. If Google rolled back most of their "Search" changes over the past couple of years, then their "Search" users might actually be happier and less "curious", as was recently reported by Amit Singhal. *sigh*
To be frank, I think the whole entire "Search" algorithm needs a simplification.
I use the word search in quotes because Google apparently believes that Google != search.
|To be frank, I think the whole entire "Search" algorithm needs a simplification. |
Yes, the search algo is just a tad over optimised, is it not?
[Google = over optimised]
Editorial comments aside (PLEASE! [webmasterworld.com]) if we want Google Search traffic, then we should pay attention to everything they're doing and take it into account. There's every reason to note, analyze, and discuss the directions Google feels are important.
Three of these algorithm changes are about improved handling of freshness, apparently including QDF (Query Deserves Freshness) tagging.
Two of these changes are specific to search results on tablet computers. How many of us are even thinking about different results for different types of devices?
One of the changes is labeled "Simplification of term-scoring algorithms." I'm not familiar with the phrase "term-scoring". Apparently it refers to multi-term queries and how much relative weight is given to each individual term - which I assume includes semantic phrases scored as a single term.
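To make the idea of per-term weighting concrete, here's a minimal sketch of one classic approach: give rarer terms (lower document frequency) more relative weight, roughly like IDF. This is purely an illustration of the concept - the function name, the numbers, and the formula are my own assumptions, not Google's actual term-scoring algorithm.

```python
import math

def term_weights(query_terms, doc_freq, num_docs):
    """Hypothetical per-term weighting for a multi-term query:
    rarer terms (lower document frequency) get more of the query's
    relative weight, IDF-style. Illustration only, not Google's algo."""
    raw = {t: math.log(num_docs / (1 + doc_freq.get(t, 0)))
           for t in query_terms}
    total = sum(raw.values()) or 1.0
    # Normalize so the weights across the query sum to 1
    return {t: w / total for t, w in raw.items()}

weights = term_weights(
    ["cheap", "tablet"],
    doc_freq={"cheap": 500_000, "tablet": 50_000},  # invented counts
    num_docs=1_000_000,
)
# "tablet" appears in fewer documents, so it carries more of the weight
```

"Simplification" here could mean collapsing several such weighting heuristics into one cleaner formula - but that's a guess.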
|Two of these changes are specific to search results on tablet computers |
Why deliver different results? My tablets have 10" screens like a netbook, and the normal results display perfectly; even my old Asus 7" netbook screen is fine with regular results.
Am I missing something?
They changed the layout somewhat with the Knowledge Graph; maybe they had to do some extra tweaks for tablets.
Tablets are mobile; I wonder if location related signals would get magnified for tablet results.
|tedster wrote: |
The data refresh was all that he mentioned, not the "minor tweaks."
Thinking back on it, didn't he also say it would affect 0.01% of queries? That's an odd thing to say for just a data refresh.
|Why deliver different results? My tablets have 10" screens like a netbook |
I'm guessing that the user requirements are a bit different and so search results get ranked differently. I'd love to know for sure, but I don't currently own a tablet to compare. However one is on order now - I can see that it's going to be a need for SEO.
|I'm guessing that the user requirements are a bit different and so search results get ranked differently |
To be honest, Google's standard Android results are awful, since they serve the mobile version by default, and in the UK it's nigh impossible to get to Google.com. If I want to see ANY non-UK results, I have to use google.com/pda - so guess what, Bing is my default choice, and it's easy to switch between countries.
Just why does Google persist in believing it knows what I WANT?
It's interesting that there are two noted under project "page quality" one of them being the penguin data refresh, and the other being the detection of hacked sites.
I'm still seeing quite a few hacked sites in the results for searches I track. They emerged during the first Penguin update, and many haven't gone away.
Project "Page Quality" still has a way to go, but I'm actually pleased to know they are working on site hacking. I just hope they don't think they are done.
I have noticed some encouraging improvements since yesterday. Many of those blatant "thin" MFAs are finally dropping out of the results. One site simply had page after page of photos, and then a staff of writers would post single- or two-word keyphrases as comments, hundreds of them! It was a ridiculously obvious effort at keyword loading, and it ranked highly for many weeks. If Google reads this, I give them a +1 for this apparent move.
I keep up with a few simple-minded websites, so I'm surely not in the league of you folks. I read webmasterworld with great interest and respect. Just curious: at this moment, can any one of you say you fully understand what G wishes, to best place your site in search results?
Fully understand? I doubt it.
Google is a tool without an owners manual.
|Simpler logic for serving results from diverse domains. [launch codename "hc1", project codename "Other Ranking Components"] We have algorithms to help return a diverse set of domains when relevant to the user query. This change simplifies the logic behind those algorithms. |
I'm increasingly seeing less domain diversity. Just now, page one consisted of six results from the same domain, with three more on page two. That can't be right.
Re: freshness, I've noticed two shifts in how Google was presenting blog article dates in search since Penguin - the second being a shift back to the old way of doing it.
|Why deliver different results? |
I always thought it would be "smart" of Google to deliver different results based on the user's browser, operating system, etc. Results would be based on the statistical behaviors observed of searchers using these particular devices, browser, etc. For example, we've all heard tablet users tend to shop, read news/entertainment. In light of what Google's goals seem to be, why not deliver more shopping, news, and entertainment results (as well as content optimized for the particular tablet/browser) to tablet users? Each user type would get a different type of search result. It would be a smarter "bubble".
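As a thought experiment, that kind of device-aware ranking could be as simple as a per-device multiplier on certain result categories. Everything in this sketch - the category labels, the boost numbers, the scoring scheme - is invented to illustrate the idea, not anything Google has confirmed.

```python
# Hypothetical per-device category boosts; all numbers are invented.
DEVICE_BOOSTS = {
    "tablet": {"shopping": 1.3, "news": 1.2, "entertainment": 1.2},
    "desktop": {},  # no adjustment for desktop users
}

def rerank(results, device):
    """Re-score each result by multiplying its base score with the
    device-specific boost for its category, then sort descending."""
    boosts = DEVICE_BOOSTS.get(device, {})
    return sorted(
        results,
        key=lambda r: r["score"] * boosts.get(r["category"], 1.0),
        reverse=True,
    )
```

Under a scheme like this, the same query really would produce a different page-one ordering on a tablet than on a desktop, even with identical base relevance scores.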
It certainly has no owners manual but it has a lot of tools.
|Google is a tool without an owners manual |
I suppose that I can understand the concept of providing different results for different machines, locations, interests .. but still ..
Given that Google's overall intelligence is barely that of an insect, I'll watch with great interest every time something falls out of Google's bucket of buggaboo in the search-related sense.
A bug can only be as intelligent as it is, though, and if search is to advance at any point in the future, Google is going to have to come up with even more advanced technologies. Search, at its very core, is going to have to change .. The guys at the Plex are eventually going to have to do away with the dated foundation for search drawn from the last century and buck up to writing something new .. fresh .. unheard of .. and stop tinkering with the filters and hooks bolted onto that dated technology.
Even still, I'll look in on the progress of our favorited GoogleBug .. hoping that someone at the Plex, or even someone outside of the Plex, can be bold enough to move all of us beyond the tinker-toy that search has become these days.
The hc1 update regarding domain diversification seems totally broken, at least in europe.
Or by stating "simplifies the logic" they actually mean not applying the algorithms to any listings beyond the top 15.
A friend of mine from Austria called and asked if she had a virus infecting her Google results, because she was seeing the same domain in every position from SERP 3 onward. I could reassure her that it's just Google that has the flu, not her PC.
|she had the same domain for all places on SERP 3 and up. |
Oh? .. sort of like here in the United States? .. Big G may indeed have the Flu me thinks
[edited by: tedster at 2:39 am (utc) on Jun 10, 2012]
[edit reason] removed screenshot - showed real domains [/edit]
- new golf feature
- new soccer feature
- other new features, all right on the search results pages
- new feature that eventually makes you obsolete, when they get around to your subject
- did Google just SEO their own "books" offering to make it competitive with websites too? (not their own books, of course).
You see where this is eventually headed, right? The writing is on the wall. Matt Cutts recently suggested webmasters should pay attention to "social signals" moving forward. Everyone assumed it's because those signals will be incorporated into search results, but it's just as possible that they want webmasters weaned off of Google as a primary source of traffic as they continue to bypass us when possible. We are, after all, their biggest critic. What does a monopoly do to critics again?
Harsh? Yes, and I don't believe Google to be evil, but honestly it's time to pay less attention to Google, not more as some suggest, because the symbiotic relationship is weakening with every algo and feature update of late.
Now that you know where I stand, and if you insist on paying more attention to Google, I think this was telling...
What do you suppose continuous functions means?
|This change replaces a number of thresholds used for identifying fresh documents with more continuous functions. |
The most frustrating thing for one of my websites is that despite working hard to wean it off Google, it's still being *indirectly* affected by these Google algorithm changes.
In terms of Google referrals, the drop I've experienced from these recent updates is only about 0.5% of my total site traffic (but you know, it's Google, so this could all change by tomorrow). But the entire site is still down some 7-9% in overall traffic because a lot of our referring websites have experienced their own Panda/Penguin disasters. These sites range from personal blogs to professional, well recognized websites (in our niche, anyway), although about half of our referrers have gained as well (unfortunately they don't provide significant traffic to my website).
It's also hard for some websites to rely less on Google. For example official websites for a product where people search for the exact name of the product in Google a lot - for some reason, some of these types of websites in my niche are being affected as well based on some stats I have. This is despite their "money term" being consistently (as in never otherwise) the number one result returned (this term is also their domain name, but people seem to still rely on Google to verify the authenticity of a website, as opposed to simply typing in domain names based on a best guess). I'm thinking maybe competing ads and maybe more enticing rich snippets from the second result down, could be the cause.
So unless absolutely everyone weans off Google, it's going to be hard to avoid being indirectly affected, although you can greatly minimize the damage, and further reduce it by promoting your own brand.
Sgt_Kickaxe, I couldn't agree more... but these days when I look up the meaning of a few internet slang terms or foreign words, Google presents the answer as if they own that content. I think they do something similar for these sports results. What I do most of the time is close that browser window as soon as I get the answer. I hope other users do the same and that all search engine users go ad blind soon.
No matter how much I look for reasons or recent updates to explain the drop in traffic, the fact is that searching for the exact article title (without quotes) ranks scraper sites better than my own. They are no. 1 and I'm not on the first page. This is not an indication of my site's rank, it's a reflection of Google's search quality.
@Sgt_Kickaxe, you're exactly right, but what do we do about it? I've gotten rid of all things Google from my sites, stopped using Google search and other products, and started focusing on social media. But one of my sites was still heavily dependent on Google for traffic, and they Penguinized it.
I've been trying to make my sites less dependent on Google by pushing them in social media, but Google is still how a lot of people find sites. Also, I made the depressing discovery that StumbleUpon traffic - my biggest source on one site - goes away if people stop finding your articles in Google. I'm not sure why that's the case, when getting Penguinized made very little difference to Pinterest and FB traffic.
|The most frustrating thing for one of my website is that despite working hard to wean it off Google, it's still being *indirectly* affected by these Google algorithm changes. |
That's just it. That's what I'm talking about.
There's only so much any of us can do. I still think there is some value in using alternatives to Adsense and Analytics - at least Google would have to work a little harder to get the stats those products feed them. There are some open source and extremely affordable stats programs out there, so I found dumping Analytics very easy (my new stats program is so much more insightful, too). I'm not sure about Adsense alternatives, because I suspect for some niches it's the highest paying option. There are definitely CPMs out there that will work for some of us - that's what I use - but again, I don't know if those really work for every niche the way Adsense does.
< restored post - removed in error >
I wonder what that might stand for?
tweak reference 2
t_____ word reference 2
t_____ w_____ referral 2
twitter reference 2
tidy whitey referendum 2
|What do you suppose continuous functions means? |
I'm sure they mean that in the mathematical sense of the curve being continuous, i.e., not a discontinuous curve such as a step function or impulse function.
In this case it means the "freshness vs. age (or whatever metric)" curve is a smooth curve.
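That threshold-vs-continuous distinction is easy to see side by side. Here's a minimal sketch: a hard cutoff where a document is "fresh" until day 30 and then falls off a cliff, versus a smooth exponential decay with a half-life. The 30-day numbers and the choice of exponential decay are my own assumptions for illustration, not anything from the release notes.

```python
def fresh_threshold(age_days, cutoff=30):
    """Old-style hard threshold: a document is either 'fresh' (1.0)
    or not (0.0), with a cliff at the cutoff."""
    return 1.0 if age_days <= cutoff else 0.0

def fresh_continuous(age_days, half_life=30):
    """Continuous replacement: freshness decays smoothly with age
    (exponential decay with a half-life), so a day-31 document scores
    only slightly lower than a day-29 one instead of dropping to zero."""
    return 0.5 ** (age_days / half_life)
```

With a threshold, tiny changes in age flip the score completely at the boundary; with a continuous function, scores degrade gradually, which is presumably easier to tune and debug, as the release note claims.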