koan... I think that is a great point. Those of us who initially reacted to this algorithm update by pressing Google to do its quality analysis on a page-by-page basis were not thinking about the incentives that would create.
If Google were somehow able to judge quality on a page-by-page basis, there would be no disincentive to publishing a lot of crap on a good domain. Now, in the new world we are in, there is a massive disincentive to adding mediocre-quality pages to a good domain, and that is going to help reduce the growth rate of pages Google has to crawl.
I wasn't accusing you of being evil; I was stating what I got out of the post. I know that people here, and webmasters in general, run the full spectrum in terms of their content, goals, and business practices, so it must be difficult to write something that can be useful for everybody.
|..us lab rats talking amongst ourselves about the lab and the experiment(s) helps them too..;-) |
I often wonder if the Google engineers are laughing their heads off listening to all our theories. :)
On the topic of "thin" pages -- one of the most useful pages on the internet has NO content: Google search home page...
What you state about competition hasn't been my experience. I suspect we are using the term "content" to mean different things. The simplest web page I ever composed would have taken a few hours, most have taken several days, a few have taken weeks, though that may be spread out over years with updates.
It has often taken many months for good content to rise to the top, but that rise has always been inevitable (if there was interest in the subject), driven by quality organic links. Exactly how Google has broken itself to discount those qualities is beyond me, but I would term it a mistake.
So for those of us who are going through our sites looking for thin pages, it sounds like blocking them with robots.txt will have the same effect as removing them.
What do you think?
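For what it's worth, the two mechanics behave differently, so it's worth being precise about which one you use. A minimal sketch, assuming the thin pages all live under a hypothetical /tag/ directory:

```
# robots.txt -- stops Googlebot from crawling the thin section
User-agent: Googlebot
Disallow: /tag/
```

Note that robots.txt only blocks crawling; URLs that are already indexed (or that keep gaining links) can linger in the index as URL-only listings. To actually drop a page from the index while still letting its links pass, a `<meta name="robots" content="noindex, follow">` tag on the page itself is the usual alternative.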
|On the topic of "thin" pages -- one of the most useful pages on the internet has NO content: Google search home page... |
But two words each appear 3 times, "google" and "search" (I'm not counting "igoogle") ..so the indicators as to "who" (branding) and "what" (function/subject) are unmistakable ..even to an algo (AI)..
|I often wonder if the Google engineers are laughing their heads off listening to all our theories. :) |
Having done a fair amount of lab research in an earlier phase of my life ..scientists don't spend much time laughing at the "squeaks" ..but they do try to interpret them ..as they do any other responses to stimuli ..negative or positive.
|I do think Google is giving up a little bottom line here and sacrificing their bottom line for quality. |
with respect ..that's not what I said it is for ..that is what they say it is for ..and what you say it is for ..
I think it is more in the nature of "maybe make a little less money now" to have "a better view of the subjects" ..like moving the visitors' drinks machine to let them better see the monkeys in the pen ..they will point out the ..most attractive monkeys to you sooner that way..
That, and pushers always give some away for free ..Google has known from the beginning ..keep the ads unobtrusive ..give stuff away for free ..watch the reactions ..lock the consumer in to the brand ..YouTube is the perfect example of the strategy ..a generation has grown up feeding the biggest UGC machine and the branding is all over it ..some folks never get further on the net daily than YouTube and/or Facebook ..like being addicted to "jackflash"..
Google is more interesting than Bing because it's not so "top down driven" in its experiments ..its engineers come out and talk about the whys ..and the whats ..you don't see that from Bing ..the bean counters (and the head bean counter) have "intimidated" much of the "let the engineers try things" out of Bing ..some interface innovation does happen certainly ..but for example the ad platform is still US/Anglo-centric ..and the "merger" has put back any chance of there being an international alternative to AdSense by a long time ..the bean counter boss throwing chairs when he doesn't like the figures doesn't help to encourage "blue sky" ..:(
I don't want any of them to "win" ..the more competition between them..the better for webmasters ..
I just find Google's experimental methodology the more interesting at the moment ..
I've mentioned Hari Seldon in relation to them more than once in past years here ..someone in a thread here recently made the connection again / spotted the similarities to the philosophy of the "shapers" of Asimov's Foundation series ..
Worth reading IMO ..gives some insight into some aspects of the plex way of thinking..and also into the way large groups of people / societies react also ..as webmasters that is what we are dealing with too..
Likewise (and I've said this before here too ;-) ..I suspect that the plex has a blend of many religions and philosophies represented within its people ..But they remind me most of the Jesuits ..much careful observation ..the better to subtly apply the levers ..in order to steer or create change ..or create stasis when it suits them ..getting rich doesn't conflict with that at all ..making the stockholders and others richer even smooths their path.
If one believes that all action..even inaction is political / philosophical ..then one must accept that the search engines are only acting as all other gatekeepers to information and exchange have acted through the ages ..
Which is why one must watch them closely ..and try not to depend on them for all one's daily bread ..or one gives them godlike power over one's life ..and as many here have learned recently ..an organisation which thinks that in the main it is only doing good for the many can be blind to how it is crushing the few ..especially its one-time disciples.
We helped make the Golem .. obviously it would have to have feet of clay..
[edited by: Leosghost at 1:17 am (utc) on Mar 8, 2011]
Hmm.. I built out a few local directories, and over the last few days my AdSense earnings appear to be higher on the main site that pulls the revenue; rankings/traffic are all in check. I would also describe these pages as "thin" business profile pages that are custom-optimized to hit local terms related to the industry each business is in.
Keen to see what other types of sites have been hit, as it appears for now that I haven't been.
Also got an email over the weekend suggesting I add more AdSense units to my secondary directory -- doubt I will do it for now.
"Keen to see what other type of sites have been hit as it appears for now, I haven't."
Go over to Webmaster Central; I am seeing some very high-end sites that have been nailed by the Panda. It makes me not feel so bad, as my sites are nothing compared to some of them.
If Google does this to them, no one should think they are above any algo change; your time will come.
My friend, your last comment is one of the most elaborate, intelligent things I've read on this forum. Thanks for the insights.
|If Google does this to them, no one should think they are above any algo change; your time will come. |
I'm sure it will .. however, a disaster recovery plan will have to be put in place when/if it does. :-)
I was not directing that statement at you; it was just a general statement I needed to pay attention to in the past. I hope your time never comes; personally, I do not wish this on anyone.
A disaster recovery plan is a must if you play with Google, I agree; if anything, this woke me up just a bit :)
They did a major algorithm shift just like this many years ago (one of my sites was caught up in it then) .. and it never has fully recovered to the point it once was. Changes like this get you thinking.
My comment wasn't sarcastic @kd454; I actually agree that my day will come again. ha.. just hope I have enough resources behind me to ride things out again. Should be ok..
Bigger sites will have the major problems. Agree here.
Has anyone else noticed the trend that sites which rank well now seem to have their ad blocks outside the primary content area (e.g. headers, sidebars, footer, etc..)? It is not 100% of the time; however, it does seem to be a trend. We were hit with a 35% decrease in traffic and have been working night and day to fix things (e.g. removing pages, tweaking the overall UI, changing the above-the-fold UI, taking suggestions from anyone who cares to voice them, etc...). We did have small ads appear inline with the content previously. It wasn't done in a misleading manner; I just hate the look of huge banners and ad blocks running across the site. I am grasping at straws as to what else to change. We've gone so far as to sacrifice earnings by 65% to make things right, and nothing... Assuming the changes are good for our end users, we are happy to change anything. Just wish we could find out what that is...
We would rather reach/help more end users than make a buck. On the whole, the algorithm does seem to be an improvement but it is far from 100% in some circumstances.
"In addition, it's important for webmasters to know that low quality content on part of a site can impact a site's ranking as a whole."
Hmm... what about UGC? What if you have a well-categorized niche classified portal but you have practically no control over the ads posted? Manually rewriting tens of thousands of ads, written by tens of thousands of members with different skills and knowledge, is practically not an option... What is under your control is the pages that group certain ads, for example "Widgets in Georgia", and this is where we used to rank well and lost rankings. So, if I want to rank for "Widgets in Georgia", should I move all the separate ads to a separate subdomain just to rank for the desired phrase? Obviously, those specific ads will be the less valuable content, as members will use titles and phrases like "I sell my widget in Atlanta...", "We sale blue widgets...", "Blue widget my sell..." Notice the various grammar errors, added intentionally...
"Has anyone else noticed the trend that sites which rank well now seem to have their ad blocks outside the primary content area (e.g. headers, sidebars, footer, etc..)?"
If you look at the sites listed on Webmaster Central and all the content farms, it seems the amount of ads and their placement play a BIG role in the trigger: ads above the fold, in-content ads .. the more you have, the bigger the hit.
Google AdSense reps advise you to put more ads in those areas, and the Google search quality team nails your site for it - this is enough to make your head spin.
I just received an email yesterday for another one-on-one with an AdSense rep; I am going to start removing ads, not adding more.
More than a single adsense block in most cases will kill your whole site, this is fact.
Well done adsense team .. keep on suggesting people should add more ads.
|More than a single adsense block in most cases will kill your whole site, this is fact. |
Not a single site I own with AdSense on it and more than one ad block was touched in the Farmer update.
[edited by: tedster at 10:00 am (utc) on Mar 8, 2011]
Excessive adsense ads + unknown + unknown = game over. It is a combination of factors where ads play a huge role.
[edited by: tedster at 10:01 am (utc) on Mar 8, 2011]
Take a look at your content management system / page creation script and see where it can be improved if these pages are being auto generated on the fly.
It would be no surprise if you are creating a lot of pages, even with unique content + AdSense, on a scale where these pages may be hitting a filter somewhere.
How the pages are being created and structured on the fly (custom titles and meta descriptions for each page?), internal linking structures, outbound links to more authoritative sources.. just a few things I would be carefully looking at.
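To make the "custom titles and meta descriptions" point concrete, here is a rough sketch of giving each auto-generated profile page its own head elements. The field names and template are hypothetical, not taken from anyone's actual site:

```python
# Minimal sketch: build a unique <title> and meta description per
# auto-generated business profile page, instead of boilerplate.
def page_head(biz):
    """biz: dict with hypothetical 'name', 'category', 'city', 'summary' keys."""
    title = f"{biz['name']} - {biz['category']} in {biz['city']}"
    desc = (f"{biz['name']} is a {biz['category'].lower()} "
            f"in {biz['city']}. {biz['summary'][:120]}")
    return (f"<title>{title}</title>\n"
            f'<meta name="description" content="{desc}">')

print(page_head({"name": "Acme Widgets", "category": "Widget Repair",
                 "city": "Atlanta", "summary": "Family-run shop since 1995."}))
```

The point is simply that every generated page should carry something page-specific in its head, so thousands of profiles don't all look like near-duplicates to a crawler.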
|I've mentioned Hari Seldon in relation to them more than once in past years here ..someone in a thread here recently made the connection again / spotted the similarities to the philosophy of the "shapers" of Asimov's Foundation series .. |
Wow, Leosghost, staggeringly awesome post! Asimov and Heisenberg (and well-understood no less)... well said on all counts!
This is something I've personally noticed in the course of webmastering, something I'm sure most of us have noticed to one extent or another: individual human beings are unpredictable, but en masse their behavior is remarkably consistent. It's like thermodynamics -- it's the law of large numbers.
It is likely that nobody knows more about the en masse behavior of humans than Google. Facebook may be catching up.
As for the idea about previews, I was thinking about the same thing, and have made several preview-enhancing improvements.
|More than a single adsense block in most cases will kill your whole site, this is fact. |
|LOL... That's just funny. |
That hasn't been my experience either.
I personally think that Google is once again leaving webmasters behind and puzzled.
What I do not understand is the following: They say they cannot reveal information about the update because it would be gamed. But what would be the consequence if everyone knew Google's definition of quality? Right, everyone would modify or create sites based on that standard, at least if they want to rank in Google.
The effect? Improved quality all over the board, or at least Google's definition of it.
The problem? People would say that Google's definition of quality is not the holy grail. First, we would be exactly where we began: content farms rank for many terms, scrapers outrank the sites where the content was originally published, and autoblogs flourish.
The update does not appear to be about "content" quality, where content is the text or other information that the visitor is searching for. The quality score they have added to the algorithm seems to be about unrelated elements: thin content pages written 10 years ago can have an influence on the rankings of a page that's the best there is on a topic. Many say that too many ads, ad placement, and who knows what else can have an influence on the rankings. All of those elements have nothing to do with the key question: is the page helping the visitor, or is it not?
When I search or research, I want to find answers. I do not care if a page has ten ads on it, if other parts of the site are sub-par, or if the design looks like it was last updated in the 90s. If the information is there, I can live with that.
Now that the algorithm has taken its toll, it can happen that I see sites at the top that look superb, with a low quantity of ads, that unfortunately do not give me the answers that I'm looking for. I'm not saying it is always the case, but it happens more often than before.
And the consequence is that I have to spend more time searching for the answer.
I agree vandread.
|I personally think that Google is once again leaving webmasters behind and puzzled. |
I believe that Google looks at webmasters like a farmer looks at the sun. The sun [webmasters], essential to growth, will always be there tomorrow.
And Google doesn't so much care about the quality of the wheat [sites] overall; they care about the factors they can control to make selling the grain [ads] the most profitable.
I guess the 'Farmer' update name put me in this mood.
Hi people, I found an article by someone who thinks he has identified who Panda is, and who has looked at some of the papers he wrote.
|One of the papers that Biswanath Panda and a number of other Googlers published for Google in 2009, described an experiment that Google performed on their advertising system, seeing if they could learn about the quality of ads and landing pages based upon bounce rates associated with clicks on those ads. |
The focus of the paper wasn’t so much upon the effectiveness of the ads in the experiment, but rather about the ability of the machine learning system to work on a very large set of data.
And here's the paper in question for those who want to wrap a wet towel around their heads and get to grips with it:
From the pdf...
PLANET: Massively Parallel Learning of Tree Ensembles with MapReduce
|For future work, our short term focus is to extend the functionality of PLANET in various ways to support more learning problems at Google. For example, we intend to support split metrics other than those based on variance. We also intend to investigate how intelligent sampling schemes might be used in conjunction with the scalability offered by PLANET. Other future plans include extending the implementation to handle multi-class classification and incremental learning. |
SEO = Semantic Engagement Optimization
The old SEO is Dead! :)
Thanks AlyssaS, great links to interject at this point in the discussion. This topic should go on for another 10 to 15 pages. :)
I think we should split this off. That PDF is a goldmine.
Even ignoring the maths, it contains such nuggets as
|We measure the performance of PLANET on the bounce rate prediction problem [22, 23]. A click on a sponsored search advertisement is called a bounce if the click is immediately followed by the user returning to the search engine. |
Ads with high bounce rates are indicative of poor user experience and provide a strong signal of advertisement quality.
So, there you go. And yes, I know: adverts and organics use different algos. But if an advert bounces as defined, the page almost certainly did NOT serve its purpose, which is less clear for organics (where a question might have been suitably answered, for example).
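The paper's bounce definition is easy to operationalize. A rough sketch under assumed inputs; the log format and the "immediately" threshold here are my guesses, not anything specified in the PDF:

```python
# Sketch of the bounce-rate signal described in the PLANET paper:
# a click is a "bounce" if the user is back on the search engine
# within a short threshold after clicking the ad.
BOUNCE_THRESHOLD_SECS = 5  # assumption: what counts as "immediately"

def bounce_rate(clicks, threshold=BOUNCE_THRESHOLD_SECS):
    """clicks: list of (click_time, return_time_or_None) pairs in seconds.
    return_time is None when the user never came back to the engine."""
    if not clicks:
        return 0.0
    bounces = sum(1 for click, ret in clicks
                  if ret is not None and ret - click <= threshold)
    return bounces / len(clicks)

# Three clicks: two quick returns (bounces), one long dwell.
print(bounce_rate([(0, 3), (100, 102), (200, 260)]))  # two of three bounce
```

A high value of this ratio is exactly the "poor user experience" signal the quoted passage describes; PLANET's contribution was learning to predict it at scale, not the metric itself.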
"Not a single site I own with adsense on them with more than one block were touched in the farmer update."
How about the placement of the ads? Any in-content or above article in the content area?
Let's not lose hope, people. Google is still learning about the algo, and hopefully they will update it by using the submitted sites as test cases. Even in the Florida update, many if not most sites came back within a month or so.
Googlebot is going nuts on my site again today. By the end of this week I should have all my 'bad' tag pages removed from Google (noindex after removing them via Webmaster Central), so I'll see if the 'bad' pages caused my site to go down.
Walkman, I'm very interested how this turns out for you.
I had a very close look at pages that lost rankings and it looks like the new update has a problem with pages with pictures and a small amount of text.
My website about "widgets" has thousands of pictures of widgets and used to rank on the first page for "widgets". This was logical because people come to the website to view pictures of widgets and share them on Twitter, Facebook, ... Now it is on the 4th page for that keyword. I have hundreds of pages with thumbnail galleries (+ some text to explain what that collection is about) and each thumbnail links to a page with the full size image of the widget + text (between 50 and 200 words). Google might see this as thin content.
Unfortunately, through my breadcrumbs, each page with a thumbnail gallery and each page with a full size image is linking back to the main pictures index page, using the anchor text "widgets". My guess is that Google thinks I'm trying to cheat because I'm linking back all the time from "thin content" with that anchor text. As a result I ended up on the 4th page.
I have many excellent first-page rankings in the SERPs for other keywords and other pages, but as soon as linking back from pages with pictures (and a small amount of text) is involved, I lose rankings.
Something interesting I just noticed, on the subject of previews:
Google picked up these images and has been displaying them in my previews.
About a day ago I added a couple of new pages. The previews for those pages (only) are displaying adsense units in the preview, and even highlighting some of the adsense text.
What have you seen with respect to adsense appearing in previews?