Google Updates and SERP Changes - March 2011

Whitey

4:53 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



< continued from [webmasterworld.com...] >

< related Panda Farm Update [webmasterworld.com] >


I keep dropping mentions of this, but no take-up, so I did some digging for clues to support my theory that Chrome is passing back intelligence that could influence this new algo and future changes:

New Chrome extension: block sites from Google's web search results
Monday, February 14, 2011 | 12:00 PM

Today the Google web search team launched a new Chrome extension to block low-quality sites from appearing in Google’s web search results. Read more in the post below, cross-posted from the Official Google Blog. - Ed


[chrome.blogspot.com...]

Also - [webmasterworld.com...]

I think user behaviour data is being underestimated in this thread. Each website will have an in-depth profile building up that feeds into a potential quality assessment by Google. What say you?

[edited by: tedster at 8:15 pm (utc) on Mar 15, 2011]

TheMadScientist

2:15 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



But we got hit heavily on google.it in January

Browsee, you're way off ... Panda hadn't even rolled out in the US in January, let alone on google.it ... There's NO WAY the ranking changes Interista is talking about in January have anything to do with Panda rolling out.

browsee

2:21 pm on Mar 18, 2011 (gmt 0)

10+ Year Member



Sorry, Mad, my bad. Too much is going on in Panda world :)

TheMadScientist

2:22 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



lol ... It's cool, just wanted to make sure people didn't start a wild goose chase or go down a wrong path ... I understand Panda's a complete head-spinner by itself though, so it's easy to knee-jerk into 'Panda Mode'.

rowtc2

2:37 pm on Mar 18, 2011 (gmt 0)

10+ Year Member



I just noticed that when I click a result in the Google SERPs and then click Back in the browser, a "Block all results" option appears for that url/site. Google tracks what the user is doing.

outland88

4:24 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



When you roll out anything that dramatic, quite a few people likely saw it well beforehand because of testing. They just didn't know what they were seeing, because it's a head scratcher. I know I did, and I commented that I was seeing something totally different. As of March 12th, Google reversed its tactics and seems to be targeting the more highly trafficked keywords by slowly leaking it into categories. I'm seeing it as less and less about quality.

It's also highly unusual that you don't have a consensus on many issues. It's AdSense; no, it isn't AdSense. It's the position of AdSense; no, it isn't the positioning. They're targeting directories; no, they aren't targeting directories. I'd also say the exceptions list would have to be mighty large to create those irregularities.

TheMadScientist

4:26 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'd also say the exceptions list would have to be mighty large to create those irregularities.

So you think they're lying when they say they don't have one for Panda?

ken_b

4:33 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I just noticed that when I click a result in the Google SERPs and then click Back in the browser, a "Block all results" option appears for that url/site.

Thanks for that, I've been wondering where to find the "block" link.

[added] The Block link doesn't appear if you have JS turned off.

outland88

6:08 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



MS,
I'd say it's just difficult to explain why so many people are able to shoot holes in just about all the theories to date with legitimate counter-examples. There is no doubt many people will be able to raise individual page rankings by increasing quality (it was always that way). As for fully recovering, I have my doubts about the vast majority.

TheMadScientist

6:29 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Yeah, outland88, I really think it's a 'page picture' + 'site picture' = 'score' type situation, which makes it much tougher to 'pin down' because, as I've said before, the answer will be specific to a site and relative to related sites, since it's Google and everything is 'relative'.

Let me see if I can put what I'm thinking into some type of English:

I'll try to keep this as simple as I can, so I'll use two 10-page sites competing for exactly the same terms on a page-by-page basis, and show how they might be scored so 'thin' content pages outrank 'high textual quality' pages.

Each page is analyzed for 'quality' :: this includes 'algorithmic interpretation of visual presentation' (for lack of a better way to describe it) and Content. (Only those two, to keep it simple.)

The template on site 1 is the same site-wide and might score 30% for quality, but the content is substantial and scores between 80% and 95% for quality, with the average being 90%. Then it's factored together with the template (say it's weighted 2 to 1, content to 'interpretation of visual presentation'): you get 180 (weighted content score) + 30 (visual score) = 210 total / 3 = 70.

Then you have another site (site 2) that gets a 95% visual score for the template site-wide, but has content all over the place, with 3 pages at 90%, 2 at 20%, and we'll say it averages out to a 70% content score. So for the second site you get: 140 (weighted content score) + 95 (visual score) = 235 total / 3 = 78.3.

When you apply the average quality score site-wide, you end up with the 20% 'content quality' pages outranking the 90% 'content quality' pages because of the overall site quality score.

It would totally throw people for a loop, because it doesn't only take content or ads or a 'visual score' of some type into account ... It combines the scores to give an overall site score, so there's no 'one metric' people can focus on to fix the issue. On one site ads might be okay, on another thin content pages might rank, while on a third, one or the other might push the score low enough that the entire site tanks.

I hope what I'm trying to say makes a bit of sense.
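
For anyone who wants to plug in the numbers, here's a rough Python sketch of the model above. The 2:1 content-to-visual weighting and the scores come from my example; the function names and the 'site average becomes each page's effective score' step are my guesses at how it might be applied, nothing Google has confirmed:

def page_quality(content, visual, content_weight=2):
    # Weighted page score: content counts twice as much as the
    # 'algorithmic interpretation of visual presentation'.
    return (content_weight * content + visual) / (content_weight + 1)

def site_quality(content_scores, visual_score):
    # Score every page, then average site-wide; in this model that
    # average is what gets applied back to each page.
    pages = [page_quality(c, visual_score) for c in content_scores]
    return sum(pages) / len(pages)

# Site 1: weak template (30), strong content averaging 90 -> 70.0
site1 = site_quality([80, 85, 90, 90, 90, 90, 90, 95, 95, 95], 30)

# Site 2: strong template (95), uneven content averaging 70 -> 78.3
site2 = site_quality([90, 90, 90, 20, 20, 78, 78, 78, 78, 78], 95)

print(round(site1, 1), round(site2, 1))  # 70.0 78.3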

tedster

7:29 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm totally on board with that idea, TMS. One way I've been thinking of it is to tell clients that any place they know they've been cutting corners, just stop doing that and repair those existing cut corners ASAP.

TheMadScientist

7:45 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks, tedster, I'm glad someone can understand what I'm thinking, a bit anyway ... It's good to know, because I can 'see' how I think it works, but trying to explain it so other people can 'see' what I'm thinking is really difficult.

tedster

7:52 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Your idea is totally in line with the initial machine training data Google says they collected from their army of human reviewers. They didn't need over a year to add something like "ads=75% of initial screen" to the existing algo.

TheMadScientist

9:46 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It actually makes quite a bit of sense to do it that way, imo ... I would guess there's probably a page-by-page score calculated, then a site-wide average score calculated and applied to the pages as a dampening effect. And imo there's quite a bit more that goes into it than content and template, like site speed and writing level and on and on and on ... and of course, what's 'right' would be on a query-dependent basis.

Short Example: Score each page individually, calculate the site wide average, apply the site wide average to the score of each page ... Only probably way more complicated.

* Dampening effect might be more accurate as a 'balancing' effect used to 'equalize' the score across an entire site.
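
If it helps, the 'balancing' version is a small variation on the sketch I posted earlier; the 50/50 blend weight is purely a guess:

def balanced_scores(page_scores, blend=0.5):
    # Blend each page's own score with the site-wide average, so strong
    # pages get pulled down and weak pages get pulled up ('equalized').
    site_avg = sum(page_scores) / len(page_scores)
    return [(1 - blend) * p + blend * site_avg for p in page_scores]

# e.g. balanced_scores([90, 90, 20]) -> [78.3, 78.3, 43.3]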

outland88

10:31 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



MS, I understand fully what you are saying. Google has unfortunately/fortunately used the areas I traverse as test beds for years. In other words, they unleashed this thing full throttle in August in some areas. There was really no comparison level at that time, so most would have related it to what they were currently seeing or to past experience. It took me until about November to draw conclusions similar to yours. I advised three site owners to submit re-inclusion requests because I saw nothing wrong with their sites. I explained it was seemingly a low-risk proposition that would likely yield nothing but could save a lot of time. The requests yielded nothing, as I expected. Most later regained "some" rankings by reworking pages. Plus, as you've already learned, it's a hodgepodge of things.

Something else is in play that I can’t quite put my finger on though. It may be simpler than imagined. Nobody is wrong nor completely right.

You were a little perplexing, because I said to myself: how can he grasp so quickly what took me a few months? Is this a Google employee?

Most interesting to me were the counter-examples and the flip-flop Google did on March 12th. On that day I saw a bizarro-world that targeted high traffic areas, with many good sites totally vanishing. It did not touch long tails. That made me wonder if people are experiencing two distinct things. It showed me Google is either having problems evolving the algo or having problems with the current one for some reason. I expected Crobb to recover, but with any flip-flop there is collateral damage.

As for visual quality, there is a loose implication they do it in AdWords.

In the early going, Walkman brought up an excellent point about busting the algo by adding a few words to the page. That's a trick I've used in the past with success, and I love to tinker. I think Google has gone past that. I'm also sympathetic to the webmasters Google bestows gargantuan levels of work on while seemingly bypassing others doing the exact same thing.

tedster

11:48 pm on Mar 18, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



and of course, what's 'right' would be on a query-dependent basis.

And as I currently understand it, query taxonomies are related to (or matched with) document taxonomies - the buckets that Google generates to handle their "document classifiers".

As for visual quality, there is a loose implication they do it in AdWords.

There may well be some cross-fertilization with AdSense technologies for automating page quality scores. I'd also guess that AdWords click-fraud methodologies are in play to help clean the traffic data that organic results now use. However, it does seem like Google Suggest was pretty easy to game with automated traffic, at least until recent months. Suggestions now seem to be more and more geo-targeted, making them harder to game on a wide scale.

walkman

2:32 am on Mar 19, 2011 (gmt 0)



For some reason, the one piece of advice that Google keeps giving is: delete or redo the 'bad/thin' pages, as they will hurt your entire site, then wait for Google to index and later re-calculate.
[google.com...]

TheMadScientist

5:34 am on Mar 19, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You were a little perplexing, because I said to myself: how can he grasp so quickly what took me a few months? Is this a Google employee?

Thanks, but no ... I think it's probably because over the years I've read almost every, if not every, patent and patent application they've put out, and I keep asking myself, 'Where are they going?' rather than, 'Where are they now?' ... 'Scoring everything while scoring nothing' seems like it makes sense. By that I mean: if you take pages and sites into account as a whole, and don't score individual elements individually, it seems you would end up with 'less gameable' results, because to 'game the system' you actually have to create a 'quality' page and site, which is what they're looking for in the first place.

And as I currently understand it, query taxonomies are related to (or matched with) document taxonomies - the buckets that Google generates to handle their "document classifiers".

That's essentially what I meant, in less technical language ... A document (or a collection of documents, i.e. a site or possibly part of a site) imo likely gets classified, and is defined as 'quality' or 'not quality' based in part on that classification. So, using the NYT and Wikipedia as examples, the NYT would likely be classified as news, and 'quality news' would probably have a slightly different definition of 'quality' than Wikipedia, which would probably be classified as 'informational'.

You don't actually have to apply 'query dependent' rankings at the time of the query in all cases. All you have to do is select from the classification the query is determined to be related to; by scoring different classifications in a slightly different manner, you essentially have 'query dependent' scoring without having to do it on-the-fly.

Here's an 'English' version of what I'm trying to say:
How do you not 'tank Twitter' when comparing it to Wikipedia or Britannica? You can't in a 'single definition of quality' type system, so you have to define quality in different ways for different 'applications' (queries / sites) and then 'route' the queries to the 'matching' quality scoring category (document classification).

The preceding is why I think, for the most part, people are NOT going to find the 'reason' their site gained or lost ground to someone else's site. Google imo really has to use different definitions of quality for different queries to get close to 'right', so two sites may look and be built 'exactly the same way' but end up being scored differently.
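
Here's a rough Python sketch of what I mean by routing queries to a quality definition; the class names, signals, and weights are all invented just to show the shape of the idea:

QUALITY_WEIGHTS = {
    # Each classification weighs signals differently; there is no single
    # universal definition of 'quality'. All numbers are invented.
    "news":          {"freshness": 0.5, "depth": 0.2, "authority": 0.3},
    "informational": {"freshness": 0.1, "depth": 0.6, "authority": 0.3},
    "social":        {"freshness": 0.7, "depth": 0.1, "authority": 0.2},
}

def score_document(signals, doc_class):
    # Score a document under the quality definition of its classification.
    weights = QUALITY_WEIGHTS[doc_class]
    return sum(w * signals.get(name, 0.0) for name, w in weights.items())

# A Twitter-like page holds up when judged as 'social' (~0.75), even
# though it scores poorly under the 'informational' definition (~0.25).
tweet = {"freshness": 0.95, "depth": 0.05, "authority": 0.4}
print(score_document(tweet, "social"), score_document(tweet, "informational"))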

outland88

8:05 pm on Mar 19, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As to what quality is, Cutts and a few of the engineers do elaborate on what Google perceives that to be. They state, something to the effect, that original research, studies, and writings will see higher rankings. That is a very trackable phenomenon, especially with links. My sweeping generalization in November was that they were moving in the direction of professional journalism standards, especially in competitive areas. Branding would also gain more ground because of this. Bottom line: can many webmasters meet those quality standards even with considerable changes? No doubt they can increase rankings somewhat, but that front page of results is going to become even more rigged.

Also, their whole scoring system seems to have holes even larger than AdWords'. Quite the opposite, I think: Google was trying to make it a one-size-fits-all system, and that backfires when you attempt to truly ascertain quality.

crobb305

9:12 pm on Mar 19, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I am becoming increasingly perplexed by the penalty imposed on my site, as it is all too similar to an over-optimization penalty (OOP). Back in January, I made some tweaks to my homepage that included some new header tags. I had never used header tags on the homepage before (although I have used H3 on inner pages), and I wonder if their sudden appearance has tripped a filter. I added a main H1 to the top of the page, then four <h3> tags for some subheaders down the page. The thing that is eating at me is that the site survived a complete deep-crawl cycle after those changes were implemented.

It wasn't until March 10 that my site was penalized by 50%. The penalties seem to be more heavily applied to certain phrases, and most heavily to the 2-word phrase that is in the H1 tag (fell from page 1 to page 56). I simply can't imagine that G would penalize legitimate use of H tags (Matt C uses them on his blog), but maybe this is exactly what I am seeing. I may have been thrown off onto the Panda trail when it could be an OOP. If it is an OOP, how could 2 months go by before the penalty?

walkman

9:21 pm on Mar 19, 2011 (gmt 0)



"from page 1 to page 56"

That's odd, and unless it's a mistake, I'd say it's a penalty. It's unlikely that Google found 559 better pages in a week.

crobb305

9:35 pm on Mar 19, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



No doubt it's a penalty, but I am trying to figure out which road to take: OOP or Panda. Up until today, I have been treating it as if it were Panda (adding content, etc.). Now I am taking out the H tags on the homepage. Still, for an OOP to hit 2 months after the tweaks would seem really odd.

browsee

9:36 pm on Mar 19, 2011 (gmt 0)

10+ Year Member



I mentioned this before: my site is behind my Twitter page on the first page. Just checked the "Mahalo" keyword, and their site is now #3 behind an Angelfire site (really, Angelfire?). What's going on, Google? How low can you go?

It reminds me of Nero playing the fiddle while Rome was burning.

walkman

9:52 pm on Mar 19, 2011 (gmt 0)



Just checked "Mahalo" keyword and their site is now #3 behind angelfire site(really angelfire?). What's going on Google, how low can you go.


They are probably happy with Mahalo's rankings. Something tells me that Mahalo, Suite101, and some others are manually checked to see if the algo screwed them good enough.

browsee

3:07 am on Mar 20, 2011 (gmt 0)

10+ Year Member



The keyword 'suite101' is fine. Something is wrong with 'mahalo' and Google. They may have different levels of punishment, or Mahalo may be the scapegoat.

Forbin001

7:11 pm on Mar 21, 2011 (gmt 0)

10+ Year Member



I'm getting different results today from IE and Firefox. I'm not signed in to Gmail, and my web history is disabled in both browsers, yet the results are different. Anyone else getting this?

gouri

7:58 pm on Mar 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I know people have often asked if outbound links should be nofollow and various opinions have been given. After the Panda update, I was hoping to get members' opinions on what is the best way to approach outbound links.

For Facebook and Twitter outbound links in the footer of a website, is it better to nofollow or follow?

Also, would you say that it is better to nofollow outbound links in banners located in your content area that point to advertisers' websites? Some say that if you are linking out to a good website it might be better to use follow instead of nofollow, while others say that you will lose link juice, so it is better not to.

I would appreciate your thoughts.

TheMadScientist

8:15 pm on Mar 22, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hmmm ... a page I have on a site I don't even look at any more (super-low traffic lately, haven't touched it in a year, but the info is still accurate so it doesn't really need to be updated) exploded with traffic yesterday. I mean, from 2 to 8 visitors a day on average for the last month+ to 50+ yesterday ... I'm not sure what that means, but it must mean something to someone ... LOL ... Just thought I would share.

It's actually got a crazy wave pattern starting on the 15th ... Up to 20+ ... Up to 30+ ... Down to 20+ ... Steady at 20+ ... Down to < 10 ... Steady < 10 ... Up to 50+

NOTE: It's for a super small niche term, so 50+ unique visitors is a high number of visitors to it in a single day ... 20+ is doing fairly well imo.

[edited by: TheMadScientist at 8:26 pm (utc) on Mar 22, 2011]

walkman

8:26 pm on Mar 22, 2011 (gmt 0)



"For Facebook and Twitter outbound links in the footer of a website, is it better to nofollow or follow? "

Doesn't really matter: a nofollow link will bleed PR anyway, and FB and Twitter are safe, so no penalties from Google.
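
To illustrate the 'bleed' (toy numbers; this reflects Google's post-2009 change where nofollow links still count in the PageRank split, they just drop their share):

def pr_passed_per_followed_link(page_pr, total_links, damping=0.85):
    # PageRank is divided across ALL outgoing links, nofollowed ones
    # included; the nofollowed shares simply evaporate instead of
    # passing on -- that's the 'bleed'.
    return damping * page_pr / total_links

# A PR 10 page with 10 links: each followed link passes 0.85, whether
# or not some of the other links are nofollowed.
print(pr_passed_per_followed_link(10, 10))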

@TheMadScientist
Google has an old index (a week or so old) in my DC right now. We could see a dance soon.

falsepositive

12:23 am on Mar 23, 2011 (gmt 0)

10+ Year Member



Question on linking to commercial sites. Panda is sensitive to ads now, but what if I link to commercial sites as part of editorial content? How should I do it properly so it is not construed as an ad? Does anyone know the right way to link to a commercial site (not a paid link)?

Shatner

12:36 am on Mar 23, 2011 (gmt 0)

10+ Year Member



>>>Panda is sensitive to ads now

Who says? Nobody knows what Panda is sensitive to. Anyone who told you Panda is sensitive to ads is just making wild guesses.