Forum Moderators: Robert Charlton & goodroi


Google Updates and SERP Changes - March 2011

         

Whitey

4:53 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



< continued from [webmasterworld.com...] >

< related Panda Farm Update [webmasterworld.com] >


I keep dropping mentions of this, but no take-up, so I did some digging for clues to support my theory that Chrome is passing back intelligence that could influence this new algo and future changes:

New Chrome extension: block sites from Google's web search results
Monday, February 14, 2011 | 12:00 PM

Today the Google web search team launched a new Chrome extension to block low-quality sites from appearing in Google’s web search results. Read more in the post below, cross-posted from the Official Google Blog. - Ed


[chrome.blogspot.com...]

Also - [webmasterworld.com...]

I think user behaviour data is being underestimated in this thread. Each website will have an in-depth profile being built that feeds into a potential quality assessment by Google. What say you?

[edited by: tedster at 8:15 pm (utc) on Mar 15, 2011]

walkman

4:44 am on Mar 16, 2011 (gmt 0)



@Walkman, you may be right. After all, John is a programmer, and programmers tend to speak in programmer's lingo, so the word "code" may have rolled off his tongue more naturally than "content".

That's what I think too. 99% of most sites is content, not code so ...

Also, if we analyze all his words, he also said this:
This is not limited to this particular algorithm update & your site

Meaning you can get out, like many did in previous algo changes.

spaceylacie

4:52 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I stumbled on this old thread


Falsepos, I must reply to your comments again because my site was in its heyday at that time (the thread you stumbled upon). The update did nothing but good for my site, and I was sitting pretty making 10k+ a month from 300 or so pages.

I still have that same site now, and it's along the lines of building a house foundation: there is, and always will be, only one way to do it properly... or maybe not. New construction materials are now available that could possibly make a foundation even stronger.

It's never been easy to be the best in a field, but it's much more difficult now, esp. compared to 2005.

dickbaker

4:54 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The problem I have with using Webmaster Tools for any ranking information is that the information is really, really off. Even if I filter for web only, US only, and allow any number of impressions, I get rankings that are far off from what I see in a Google search. What WMT shows at position 4.5 I'll probably find at 14.5 or even 44.5.

For position, the rank checker in Firefox is much more accurate.

browsee

5:13 am on Mar 16, 2011 (gmt 0)

10+ Year Member



John Mu also gave some suggestions on 404 vs. noindex.

[google.com...]

Worth a read...

spaceylacie

5:46 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You gotta know when to hold'em, know when to fold'em, know when to walk away, and know when to run. Recent G SERP updates have left lots of website owners wondering where they went wrong.

Consider this: it wasn't your fault, but it was also your fault at the same time. Having fun figuring it out yet?

rash001

5:49 am on Mar 16, 2011 (gmt 0)

10+ Year Member



Nice and useful topic. Though I am new here, I am thoroughly enjoying your posts.

TheMadScientist

7:12 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think people sometimes over-analyze things, like every word said by a Google employee. John most likely didn't ponder each word for hours; he just said them casually.

They're close to, if not, the most precise company in the world.

My guess is that also applies to their language. I'm almost positive they don't throw around 'filter' or 'penalty' as if they're the same thing when they're not. In fact, when we used to have a rep post here on occasion, I think they even used the word heuristic (which is what Google actually uses) rather than algorithm ... They're not the same.

There's a huge amount I've learned over the years based on the language they use. The fact they use heuristics, not necessarily algorithms, is probably not the least of those.

IMO the reason many sites don't do better is we often know too much and don't listen enough ... My .05¢

[edited by: TheMadScientist at 7:34 am (utc) on Mar 16, 2011]

tedster

7:16 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I agree - especially when you consider that John's comments were written, not spoken.

TheMadScientist

7:35 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hey rash001,

Glad you're enjoying our hair splitting...
Welcome to WebmasterWorld!

TheMadScientist

8:47 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



After all, John is a programmer, and programmers tend to speak in programmer's lingo, so the word "code" may have rolled off his tongue more naturally than "content"

And the fact he is a programmer gives more weight to the words he chooses, even 'loosely' or 'off the top of his head', imo ... IOW, they're likely very precise, because as a programmer you know $filter is not the same as $penalty and $code is not the same as $content ... When you code (a highly precise discipline) you cannot 'mince words' or 'slip' or 'generalize', and (for me anyway) quite a bit of that transfers to other areas.

99% of most sites is content, not code so ...

I actually disagree. View the source code on a few of your favorites ... My guess is they're fairly evenly split on many ... Even with HTML5 and off-page CSS & JS, I still struggle to get below approximately 25% to 30% of the page as HTML markup most of the time.
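If you want to check that split on your own pages, here's a rough sketch (Python, purely to illustrate): the URL is a placeholder, and counting "content" as visible text bytes over total page bytes is only one crude way to measure it.

# Rough markup-vs-content check: what share of a page's bytes is visible
# text once the tags are stripped? (One crude definition of "content".)
from html.parser import HTMLParser
from urllib.request import urlopen

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.skipping = 0      # inside <script>/<style>, ignore the data
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skipping += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skipping:
            self.skipping -= 1

    def handle_data(self, data):
        if not self.skipping:
            self.chunks.append(data)

page = urlopen("http://www.example.com/").read().decode("utf-8", "replace")  # placeholder URL
parser = TextExtractor()
parser.feed(page)
text = "".join(parser.chunks)

print("total bytes: ", len(page))
print("visible text:", len(text))
print("text share:   {:.0%}".format(len(text) / len(page)))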

The reason I pointed out what I did is:
Content would be a 'limiting' word in the context. Code is an 'inclusive' word in the same context. By limiting your train of thought or troubleshooting to content (limited) rather than code (inclusive of content), you could be missing something.

Another way of saying the preceding assuming the same context:
Content is part of the code; Code is not part of the content.
In the context one is inclusive, while the other is limiting.

This is a very difficult 'game' we play and imo the need for absolute precision and inclusive, rather than limiting, evaluation of sites and pages is going to continue to escalate.

AlyssaS

2:28 pm on Mar 16, 2011 (gmt 0)

10+ Year Member



Sometimes, even after recrawling parts of a site, our algorithms will need a bit of time to confirm that the site has really changed for good.


IMO the above bit refers to the new Panda bits of the algo, where they split test to check user reaction.

I think they've actually got several algorithms in play: the main one for basic ranking, and another for the top of page 1, where they test user behavior such as bounces back into the search results, time on page, and so on.

I am still seeing a lot of churn, where sites suddenly pop onto the first page for 24 hours and disappear again. I think they do this because traffic on page 2 is too sparse to yield useful user data, so they have to pop a site onto page 1 to get enough data to assess it. Then, if you get sent back into oblivion, it's because you've failed the page 1 test.
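Nobody outside Google knows which behavioral signals they actually use, but the kind of "bounce back to the SERP" measurement described above could look something like this sketch; the click-log format and the 30-second cutoff are invented for illustration.

# Hypothetical "pogo-stick" measure: how often a click on a result is followed
# by a quick return to the same SERP. The log format and threshold are made up.
from collections import defaultdict

clicks = [
    # (query, result_url, seconds until the user returned to the SERP, or None)
    ("blue widgets", "site-a.example/widgets", 8),
    ("blue widgets", "site-b.example/widgets", None),   # never came back
    ("blue widgets", "site-a.example/widgets", 12),
    ("blue widgets", "site-a.example/widgets", None),
]

QUICK_RETURN = 30  # seconds; anything faster counts as a "pogo-stick"

stats = defaultdict(lambda: [0, 0])          # url -> [pogo_sticks, total_clicks]
for _query, url, returned_after in clicks:
    stats[url][1] += 1
    if returned_after is not None and returned_after < QUICK_RETURN:
        stats[url][0] += 1

for url, (pogo, total) in stats.items():
    print("{}: {}/{} quick returns ({:.0%})".format(url, pogo, total, pogo / total))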

TheMadScientist

2:42 pm on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Good thoughts ... As far as algorithms go, 'several and then some' is probably getting close to correct, but not quite there yet ... Stop and think about just analyzing the HTML code and text positioning on a page, even just to detect what is header and footer (links + text) compared to body links + text. Then think about comparing a page to all other similar pages for duplicate and near-duplicate content, and about the different thresholds for 'duplicate and near-duplicate' you would need for 'information' pages versus 'shopping' pages, and from there ... most people will probably just end up with a large-scale headache.

<head><body>
<p><div>Text Here</p>
<div><img></div></div>
</body></html>

How do you process the above?
Properly formatted HTML is one thing, and even that is difficult. Processing HTML on a trillion-page scale, where you're likely to run into a mis-ordered tag (x 1,000,000) and have to make sense of it with a piece of code, is another story altogether...

Yes, that's a 'simple, you could just throw out the opening <p> tag' question, but telling a script to do that, and exactly which situations to do it in, is MUCH more difficult than looking at the code and thinking 'Oh, well, you just don't count "blah" because it's out of place' ... With a script, you don't even know it's out of place until you get to the next tag...
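To see how little a parser gives you for free, here is Python's lenient stdlib parser fed the broken snippet above. It reports the tag events in document order, but deciding that the stray <p> should be ignored is entirely up to your own code.

# Feed the deliberately broken snippet to Python's lenient stdlib parser.
# It reports tag events but does not repair the nesting for you.
from html.parser import HTMLParser

snippet = """<head><body>
<p><div>Text Here</p>
<div><img></div></div>
</body></html>"""

class EventDumper(HTMLParser):
    def handle_starttag(self, tag, attrs):
        print("start", tag)

    def handle_endtag(self, tag):
        print("end  ", tag)

    def handle_data(self, data):
        if data.strip():
            print("text ", data.strip())

EventDumper().feed(snippet)
# The events show <p> opened, <div> opened, then </p> closing a tag that is
# not the innermost one -- a script only learns the <p> was "out of place"
# when it reaches that later tag, exactly as described above.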

walkman

3:49 pm on Mar 16, 2011 (gmt 0)



Sometimes, even after recrawling parts of a site, our algorithms will need a bit of time to confirm that the site has really changed for good.


And sometimes not :). Interesting. It can either refer to what Alyssa said (the algo testing user response -- most likely), or maybe to waiting for a more general site re-evaluation/re-ranking, as opposed to just the spider seeing that the page is a pretty good page.

Tedster, I agree that Google is very precise, but I think that precision varies between a corporate press release and a posting on a support forum. No matter how we look at it, though, he conveyed the same message: improve your site and we will eventually re-rank it, as in previous algo changes.

browsee

3:51 pm on Mar 16, 2011 (gmt 0)

10+ Year Member



Just wondering, does using a div tag instead of a p tag cause an issue here? I use div tags (not p tags); Mahalo uses div tags, but eHow uses p tags.

TheMadScientist

3:55 pm on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It can either refer to what Alyssa said (the algo testing user response -- most likely), or maybe to waiting for a more general site re-evaluation/re-ranking, as opposed to just the spider seeing that the page is a pretty good page.

Or to multiple algorithms and the application of the information through those algorithms ... E.g., an additional link on a page may 'immediately' lower the value of the links already pointing to other pages within the PR system, but it may take time for the value to be 'awarded' to the page receiving the new link.

Sometimes, even after recrawling parts of a site, our algorithms will need a bit of time to confirm that the site has really changed for good.

Maybe something like: when we crawl a new link, we wait to see if the link will remain before we 'award' credit to the receiving page ... but when we see a new link on a page, the remaining links are immediately lowered in value. Could that be applied here?

Sometimes it does, and sometimes it doesn't.
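A toy version of that dilution idea, using the textbook assumption that a page's score is split evenly across its outlinks; Google's real weighting and timing aren't public, so this only illustrates the "existing links lose value immediately, the new target gets credit later" shape of it.

# Toy link-dilution sketch under the textbook "split the score evenly
# across outlinks" assumption (Google's real weighting is not public).
def share_per_outlink(page_score, outlink_count):
    return page_score / outlink_count

home_score = 1.0

before = share_per_outlink(home_score, 4)   # homepage links to 4 money pages
after = share_per_outlink(home_score, 5)    # a 5th link is added

print("share per link before: {:.3f}".format(before))   # 0.250
print("share per link after:  {:.3f}".format(after))    # 0.200

# Every existing link is worth less the moment the new one appears, while the
# page receiving the 5th link only sees its own score rise once the next
# scoring pass -- the "takes a bit of time" part -- actually runs.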

I think for the most part his statements are extremely precise, because if they aren't, even in a public forum rather than a press release, he (and they) are going to have to listen to how full of FUD they are from all the whiners who 'just don't get it' and would rather blame them for a statement than do something different, try to figure out how the statements could apply, or ask why they chose the words they did.

If you don't want to draw a distinction between code and content, or you think he was just using words loosely, because according to you 99% of most sites is content (what a gross misconception or imprecise exaggeration), then so be it. But personally, I pay attention to the words they use, because believe it or not, they are usually far more precise and exact in what they say than the people who post here.

I think one thing many people miss, amid the rampant generalizations, exaggerations and imprecision, is that they usually mean EXACTLY what they say. I mean EXACTLY, because unlike some, they would never say 99% of all sites are content. It's a completely false statement, and they would get torn apart if they were anywhere near that loose with their words ... Think about it for a minute ... Could anyone like Mu ever make that statement and not get hammered?

TheMadScientist

4:52 pm on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



BTW Wheel, if you think I'm picking on you for a single exaggerated statement you made in a public forum, try to imagine the scrutiny every statement Mu, Cutts, Singhal and the rest of them make is under, daily ... They all choose their words, even for a simple forum post, precisely, imo.

spaceylacie

5:17 pm on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



When you finish updating a page, you then upload the new source "code" you just created. I don't think he meant people should start running around changing all their <div>s to <p>s. I only use pure, solid hypertext markup language, and my site was hit. Every bit of code I wrote myself.

Though maybe I should go check my <p>s and <div>s to make sure G knows what is content and what is just background noise (header, footer, etc.). We are talking about a computer program here, not a real person looking at the page. But I still think his comment was meant to be inclusive, not exclusive. Source code, or code, means everything on the page: the content and everything around it. Are we having fun yet?

econman

6:00 pm on Mar 16, 2011 (gmt 0)

10+ Year Member



I agree it makes more sense to assume Mu was being very precise and careful in what he wrote. It's not like he needs to be in a hurry, or is forced to write something when he isn't good and ready to communicate something new.

And, there are some useful, new bits here -- which are consistent with, but more specific than, the usual "spend your time on creating great content rather than trying to trick us into ranking you higher than you deserve."

He has an incentive to choose his words carefully. There are personal career risks if he messes up. His company faces legal and public relations risks if he says something incorrect or imprecise in a public forum.

And, perhaps of even more immediate concern, he knows that every little bit of information Google releases publicly has the potential to make it easier for SEO's to "game" their system, making their job that much harder.

All of this can and will take time.


It's easy to think of reasons why it will take time.

One reason: wait to study user feedback data (from their toolbar, from Chrome, and from the observed ratio of clicks that are followed by a rapid return to the same SERP and a click on a different listing, versus clicks that are not followed by a rapid return, etc.).

Another possible reason: wait to see whether low-quality pages were removed, and whether other site changes were made temporarily or permanently.

Another possible reason: wait until they get around to crawling and analyzing the entire site after the various changes have been made, since they need to decide whether the ratio of good to low-quality content is now within the "acceptable" range. And the lowest-quality pages would logically be the ones that are normally given the lowest priority for crawling and studying.
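If that "acceptable ratio" guess is anywhere near right, a site owner can at least measure their own version of it; the 300-word cutoff below is an arbitrary stand-in for whatever Google's notion of "thin" actually is.

# Crude site audit: what share of pages fall under an arbitrary "thin content"
# word-count cutoff? The 300-word threshold is invented for illustration.
pages = {
    # url: visible word count (however you choose to extract it)
    "/guide/widgets":        1800,
    "/guide/widget-history": 1200,
    "/buy/widgets":           210,
    "/buy/widget-insurance":  180,
}

THIN_WORDS = 300

thin = [url for url, words in pages.items() if words < THIN_WORDS]
ratio = len(thin) / len(pages)

print("thin pages: {}/{} ({:.0%})".format(len(thin), len(pages), ratio))
for url in thin:
    print("  consider beefing up, noindexing, or removing:", url)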

maximillianos

6:07 pm on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Another possible reason: wait until they get around to crawling and analyzing the entire site after the various changes have been made, since they need to decide whether the ratio of good to low-quality content is now within the "acceptable" range. And the lowest-quality pages would logically be the ones that are normally given the lowest priority for crawling and studying.


This makes the most sense to me.

As a side note, our friends overseas kind of lucked out. They got a "heads up" from Google saying it is coming. And now they know what is coming and potentially what changes to make to avoid getting hit by it.

I wish we had some kind of warning. Oh well. So goes life on the internet. =)

econman

6:12 pm on Mar 16, 2011 (gmt 0)

10+ Year Member



...99% of most sites are content (what a gross misconception or imprecise exaggeration)


99% is probably an exaggeration -- but I'm not sure how one would go about measuring this anyway.

On a sitewide basis, the ratio could be drastically different than a measurement for any given page -- especially if a lot of the code is contained in CSS and JS files that apply across an entire site.

spaceylacie

7:30 pm on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



potentially what changes to make to avoid getting hit by it


Knowing that a hurricane is coming doesn't always mean that you can avoid your house being blown away. As a US-based website depending on US visitors, had I known what was coming, I don't know that I could have been better prepared. Has this algo update been figured out to a degree that I don't yet realize? From what I've studied, there are no quick fixes. Just build up your brand name so people will click on you... that's what I am learning.

walkman

8:31 pm on Mar 16, 2011 (gmt 0)



I agree with spaceylacie; plus, most people think that their site is fine and use the previous updates as proof.

maximillianos

8:33 pm on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think a considerable amount of information has come out from Google in the past week to give folks overseas a good heads-up on what is coming and what they can do to minimize their exposure.

If you know a hurricane is coming you can seek shelter, board up your house, get some supplies, maybe even leave town.

If you have no idea a hurricane is coming you are far worse off.

spaceylacie

9:32 pm on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This isn't the kind of hurricane you can seek shelter from or board up for, I don't think. It's leave town or die.

dickbaker

11:39 pm on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Over the last few days I've seen a pickup in orders from the online store on my site. It's odd, because all of my phrases for the online store are sitting somewhere beyond #30.

In looking at the log files, I see that Yahoo is bringing a few customers, which is understandable, since I rank on page one for most of the phrases that I was de-ranked for on Google.

What's interesting is that most of the orders are coming from searches on Google. The log files show Google organic. I don't see my site anywhere in the organic results, so I assume that the buyers are going to Google Shopping, where I still rank well.

If that's the case, then there's been a change. Google Shopping didn't bring much in the past. Could it be that users aren't seeing what they want in the regular results, and are turning to the shopping results?

Just a thought.

crobb305

12:18 am on Mar 17, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I just discovered something very interesting in my WMT data.

I have been looking at the pages that took the biggest hit leading up to my penalty. In Google Webmaster Tools, I am looking at a date range from the day before to the day after the penalty that dropped my traffic 60%.

The 6 thinnest pages of my site all fell 100 to 300 positions. My site is about 109 pages (almost all pages are great content and ad-free), and the thin, money-making pages are ALL linked from the top navigation (on every page). I already knew that I had probably gone overboard in linking to those pages from the homepage (in an attempt to direct more traffic to the money makers). The pages that took the biggest hits (200 to 300 positions) were the ones linked most frequently from the homepage (with varied anchor text).

HOWEVER, there is ONE money page, linked in the top nav, that GAINED (and still ranks to this day on page 1 for a variety of terms). That page was linked ONLY ONCE on the homepage (in the top nav, and nowhere else on the page). This page contains the same number of affiliate links (2) and the same length of content as the other money-making pages that were spanked, so it seems the algorithm may have appreciated the fact that it wasn't excessively linked to from a single page. This is just a theory, but it may be an important clue in my case (and perhaps for others here who try to direct traffic to money-making pages using several links from the homepage).

It might be worthwhile to make sure you aren't excessively linking to thin or borderline-thin pages. Google may have a tolerance for ads, but quality might be judged based on the number of times you link to a page (with a multitude of anchor-text variations). Again, just a theory, and based on it, I am rewriting some of my homepage code (in addition to adding fresh content to "thin" pages).
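Checking this theory on your own homepage is straightforward; the sketch below just counts how many times each URL is linked from a single page (the homepage URL is a placeholder), which of course says nothing about whether Google actually weighs it this way.

# Count how many times each URL is linked from a single page (e.g. the
# homepage), to spot pages you may be linking to excessively.
from collections import Counter
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.counts[href] += 1

page = urlopen("http://www.example.com/").read().decode("utf-8", "replace")  # placeholder URL
counter = LinkCounter()
counter.feed(page)

for href, n in counter.counts.most_common():
    if n > 1:                       # linked more than once from this one page
        print("{:3d}x  {}".format(n, href))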

**Sidenote to the Moderators: We have so many threads about recent Google updates and Panda that I can't keep up with which ones to comment in. I see that I have been commenting in a minimum of 3 different threads. Is there any way we can consolidate some of these? Or maybe it's my own confusion, for which there may be no cure. lol

tedster

1:45 am on Mar 17, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You have an interesting idea - weak content is not so big a negative as long as the site doesn't claim it to be something good. Worth chewing on.

browsee

2:06 am on Mar 17, 2011 (gmt 0)

10+ Year Member



@crobb305, interesting theory. Did you add noindex to your thin pages? Instead of removing links, I would put noindex on the thin pages.

crobb305

3:07 am on Mar 17, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



browsee,

I am considering a noindex on a couple of the thin pages, but I hate to sacrifice Bing/Yahoo traffic just to satisfy Google. Those pages rank just fine in Bing/Yahoo. Either way, I either disrupt a nice navigation structure (one that funnels traffic very effectively) or I deindex the pages in the other search engines.

walkman

3:32 am on Mar 17, 2011 (gmt 0)



"I am considering a noindex on a couple of the thin pages, but I hate to sacrifice Bing/Yahoo traffic just to satisfy Google."

No need to. Instead of 'robots', just target Googlebot in the noindex meta tag -- e.g. <meta name="googlebot" content="noindex"> -- and Bing/Yahoo will keep indexing the page.