
Google SEO News and Discussion Forum

Matt Cutts and Amit Singhal Share Insider Detail on Panda Update
tedster - msg:4276281 - 10:54 pm on Mar 3, 2011 (gmt 0)

Senior member g1smd pointed out this link in another thread - and it's a juicy one. The Panda That Hates Farms [wired.com]

Wired Magazine interviewed both Matt Cutts and Amit Singhal and in the process got some helpful insight into the Farm Update. I note that some of the speculation we've had at WebmasterWorld is confirmed:

Outside quality raters were involved at the beginning
...we used our standard evaluation system that we've developed, where we basically sent out documents to outside testers. Then we asked the raters questions like: "Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?"


Excessive ads were part of the early definition
There was an engineer who came up with a rigorous set of questions, everything from: "Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?"


The update is algorithmic, not manual
...we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons.
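Just to make that last quote concrete: a "classifier" in this sense is a model that is shown hand-labeled examples of good and bad sites and learns a mathematical boundary between them. Here is a toy sketch in Python of what that could look like; the features, numbers and library choice are entirely my own invention, not anything Google has confirmed.

    # Toy sketch of a "good site vs. low-quality site" classifier.
    # The features, numbers and model choice are invented for illustration;
    # this is NOT Google's actual system.
    from sklearn.linear_model import LogisticRegression

    # Each row: [ad_density, thousands_of_words_per_page, duplicate_ratio, rater_trust]
    sites = [
        [0.05, 1.2, 0.02, 0.9],   # e.g. a reference site the raters trusted
        [0.10, 0.9, 0.05, 0.8],
        [0.45, 0.25, 0.60, 0.2],  # e.g. a thin, ad-heavy page
        [0.55, 0.18, 0.75, 0.1],
    ]
    labels = [1, 1, 0, 0]         # 1 = "IRS/Wikipedia/NYT side", 0 = "low-quality side"

    clf = LogisticRegression().fit(sites, labels)

    # Score a new, unseen site on the same invented features
    print(clf.predict_proba([[0.30, 0.4, 0.30, 0.5]]))

The real thing obviously uses far more signals and data; the point is just that the output is a score computed by a model, not a penalty applied by hand.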

 

c41lum - msg:4277221 - 7:52 pm on Mar 5, 2011 (gmt 0)

Hi everyone, has anyone on here noticed the following scenario for news publishers?

1st. Dropped from Google News.
2nd. Wiped out by the farmer update.

If this scenario is happening in the US, it might help webmasters in countries where the 'farmer' update hasn't rolled out yet to spot a pattern and make changes before it hits.

I know lots of sites were dropped from Google News in November, December and January. It would be interesting to find out whether those same sites have been smashed by this latest update.

Planet13 - msg:4277223 - 7:53 pm on Mar 5, 2011 (gmt 0)

Did you see how Matt Cutts responded to Wired about Suite101, pure arrogance.


One man's arrogance is another man's truth.

I don't see why all these people think they were entitled to high rankings in google. To me, many of them seem to be ranking about WHERE THEY SHOULD HAVE BEEN RANKING ALL ALONG.

Most of those sites were nothing but snake oil.

They were able to game the system for a while, and enjoyed some success. They should be thankful for that.

It's kind of like the guy that dates a girl who is way out of his league. Six months down the road, she dumps him, and the guy is heartbroken. And he gets upset about being forsaken when he should be feeling good for being able to hoodwink a hottie for half a year.

browsee - msg:4277232 - 8:22 pm on Mar 5, 2011 (gmt 0)

@Pageonresults, fair enough. Check the ViewPoints CEO's reply in the comments. [google.com...]

dickbaker - msg:4277236 - 8:33 pm on Mar 5, 2011 (gmt 0)

I don't see why all these people think they were entitled to high rankings in google. To me, many of them seem to be ranking about WHERE THEY SHOULD HAVE BEEN RANKING ALL ALONG.


Well, for years--or even for over a decade--Google thought these sites deserved high rankings. If the sites weren't ranking well, and the owners wanted them to, the owners would have taken steps to make them rank well (which most of us did).

I wish the rules here allowed me to walk you through some of the results I'm tracking across different niches. I don't think you'd be so flippant with your remarks.

walkman - msg:4277256 - 9:28 pm on Mar 5, 2011 (gmt 0)

Sorry but the Viewpoints CEO makes a great point. Their site should not have been penalized, along with insiderpages and a lot of other sites.

gmb21 - msg:4277260 - 9:35 pm on Mar 5, 2011 (gmt 0)

Looking at some of the sites which did well out of this update (e.g. Britannica.com) and sites which Matt Cutts said are good quality (e.g. New York Times), they are quite heavy on ads. Most pages have a 728x90 leaderboard and a 300x250 rectangle above the fold, in addition to other ads on the page.

Most of the "articles" on Britannica are only 2 or 3 sentences surrounded by ads (this is because you have to subscribe to get the full article). If this update was about thin content or too much ad real estate, then I would have expected Britannica to fall in the rankings.

In the interview referred to at the start of this thread, Matt Cutts claims that the algorithm was based on initial human choices of what was a good site and what was not, so they would have been careful not to choose factors that would demote these "good" sites.

I think you look for signals that recreate that same intuition, that same experience that you have as an engineer and that users have. Whenever we look at the most blocked sites, it did match our intuition and experience, but the key is, you also have your experience of the sorts of sites that are going to be adding value for users versus not adding value for users. And we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons

browsee - msg:4277262 - 9:47 pm on Mar 5, 2011 (gmt 0)

My current take is that we have a new algo that IS misfiring in many cases - more than we've seen in any update I can remember. Google apparently knows that, too. They are asking for examples of false positives [webmasterworld.com] - a rather public admission that the new algo has problems.


@tedster, is this something new, or have they asked for feedback after algo changes before?

tedster - msg:4277265 - 9:49 pm on Mar 5, 2011 (gmt 0)

I do remember Google asking for feedback on algo changes many times going back years. But it was usually done through email with a code word in the Subject line. This is the first time I can recall a dedicated forum thread.

Simsi - msg:4277267 - 10:02 pm on Mar 5, 2011 (gmt 0)

Most of the "articles" on Britannica are only 2 or 3 sentences surrounded by ads (this is because you have to subscribe to get the full article). If this update was about thin content or too much ad real estate, then I would have expected Britannica to fall in the rankings.


But if I ran Google I would have an exceptions file which would contain Britannica, NYT and others. There are some sites that you simply have to have in your SERPs.

And Google doesn't have to try to be "fair" as such (although I think it is in its best interests) - it just needs to try and deliver the best results it can to searchers.

dickbaker - msg:4277270 - 10:09 pm on Mar 5, 2011 (gmt 0)

But if I ran Google I would have an exceptions file which would contain Britannica, NYT and others. There are some sites that you simply have to have in your SERPs.


Why would Britannica be needed in the SERPs when there's Wikipedia, which has more content and no ads, or other informational sites that don't charge for their information?

As for the New York Times, why is that news outlet necessary? There are a great many people who, for political reasons, refuse to read it, and there are news outlets of better quality.

browsee - msg:4277286 - 10:55 pm on Mar 5, 2011 (gmt 0)

Thanks tedster, I was just curious why G is asking for feedback in their forums. From what I can see, G knows that there is a big problem.

browsee - msg:4277290 - 11:13 pm on Mar 5, 2011 (gmt 0)

There is an excellent article on SearchEngineLand on what to do next.

[searchengineland.com...]

The NY Times also published an article on the Google algorithm. Worth a read.

[nytimes.com...]

mromero - msg:4277292 - 11:23 pm on Mar 5, 2011 (gmt 0)

I am reading so much on this peasant uprising I am getting a headache.

I hope this has not been mentioned before, but Ms. Fox has some good insights that I have been reading.

A quick look at the analytics charts for one site is showing me a lot of stuff I was not aware of before.

Some of it makes sense; a little of it I am still trying to figure out.

One thing that jumped out immediately: pages with less AdSense and zero direct ads shot up into the stratosphere.

More reading into the night......

ergophobe - msg:4277307 - 12:41 am on Mar 6, 2011 (gmt 0)

So rather than a failure or a mere PR stunt


I don't think it was a *mere* PR stunt. I meant more that a bad solution was rushed to market to stifle the increasing clamor. So it was a PR stunt in the sense that it was meant to do something about that clamor, and the mainstream press, who don't seem to know how to follow this sort of thing, accept it as "better" for now, even though it clearly needs a LOT of work.

In the meantime, though, they have raised expectations, the very thing Matt Cutts no doubt feared given his blog post about how the results are higher quality than ever. Expectations are now much, much higher than before, with the result that people see the quality declining.

I have never used the spam tool to report someone who outranked me in the SERPs no matter what trashy spam they were putting out there. Not my job to do Google's research for them and if I started reporting everything that outranked my sorry ass sites, I'd be at it for months. But when I saw within 2 hours sites that were the very thing the new algo was supposed to stop outrank my page (a throwaway page on EZA), I reported them. Not because I wanted to stop the spammers, but because I wanted to throw Google's new algo back in their face and show them how far from working it is. I have no doubt the spammers who are outranking me can autopilot their way to top rankings on another domain with no trouble and bear them no personal animus.

Yep, it's coming. Google called it "a new layer"


Yes and no. The "new layer" is meant to respond to those who were killed in the rankings but did not deserve to get killed. It will not correct the fact that the new algo tune has done nothing, as far as I can see, to get the lowest quality spam out of the index. From that perspective, this is a total joke and the new layer won't do anything at all as I understand it.

What I see out of this is "phase 2", aimed at the big boy content farms, has knocked some really big players down a notch. But phase 1, which was occasioned by the complaints of the Stack Exchange guys about blatant scrapers outranking original content, has seen no improvement at all. It may be worse than that. By knocking out the content farms that have a certain minimum standard, low though it may be (EZA, Squidoo), they have just created more first-page spots for the true bottom-feeder spammers.

tedster - msg:4277312 - 1:14 am on Mar 6, 2011 (gmt 0)

I agree - the Scraper update was a total wuss. That may even be a big factor in the Panda failures.

Planet13 - msg:4277314 - 1:38 am on Mar 6, 2011 (gmt 0)

As for the New York Times, why is that news outlet necessary? There are a great many people who, for political reasons, refuse to read it, and there are news outlets of better quality.


Uh... because the people who refuse to read it for "political reasons" are idiots?

They somehow think that one of the great capitalist traditions in the US is pushing a socialist agenda on the world, despite being full of ads for Macy's and other businesses.

The NYT is all about making money.

Don't believe me? Ask them if they will advertise your business for free, or if they will give you a free subscription.

Planet13 - msg:4277315 - 1:41 am on Mar 6, 2011 (gmt 0)

I wish the rules here allowed me to walk you through some of the results I'm tracking across different niches. I don't think you'd be so flippant with your remarks.


You can sticky mail them to me. Would love to see them.

But the ones I have seen plummet are overwhelmingly sites that the majority of people on these threads have rightly called spam for years now, and should have gone south a long time ago.

vanessafox - msg:4277317 - 1:52 am on Mar 6, 2011 (gmt 0)

To answer the question about when Google has asked for feedback before, they're always looking for feedback. As I noted in my Search Engine Land article, one of the primary reasons we created the webmaster trends analyst position was to look for feedback across the web on these types of changes. But Google also asks for feedback formally every so often.

Matt Cutts opens comments up for general feedback all the time, as you can see in this post:
[mattcutts.com...]

Note that Google used to ask for feedback via their contact form or Matt's blog before we created the webmaster discussion forums.

And Google solicited public feedback on both Big Daddy and Caffeine:
[mattcutts.com...]
[googlewebmastercentral.blogspot.com...]

walkman - msg:4277323 - 2:16 am on Mar 6, 2011 (gmt 0)

Gooooooglebot is going nuts on my site right now. Absolutely nuts, it might get 100% of the pages today if the trend continues (60% so far)
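(For anyone who wants to track the same thing, this is roughly how I'd estimate coverage from the raw access log. It's a quick sketch: the file names, the plain-text page list and the lack of reverse-DNS verification of the bot are all shortcuts on my part.)

    # Rough Googlebot crawl coverage for the day - a quick sketch, not a robust tool.
    # Assumes a combined-format access log and a plain-text list of every page path,
    # and skips reverse-DNS verification of the bot.
    import re

    crawled = set()
    with open("access.log") as log:
        for line in log:
            if "Googlebot" in line:
                m = re.search(r'"(?:GET|HEAD) (\S+)', line)
                if m:
                    crawled.add(m.group(1))

    with open("all_pages.txt") as f:         # hypothetical one-path-per-line page list
        all_pages = {line.strip() for line in f if line.strip()}

    covered = len(crawled & all_pages)
    print(f"Googlebot has fetched {covered} of {len(all_pages)} pages "
          f"({covered / len(all_pages):.0%})")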

browsee - msg:4277326 - 2:59 am on Mar 6, 2011 (gmt 0)

Search has become a laughing matter now. I am happy to see people participating in G's forum to express their issues. I just saw one example from a research scholar.

[google.com...]

Is it okay for the same company to buy and use domain names with multiple extensions seemingly to monopolize the first page of Google search results?


This is really a big issue for G. Apparently G gives preference to keywords in the domain name, and these scrapers are buying similar keyword domain names or creating subdomains and topping G's first page.

browsee - msg:4277331 - 3:26 am on Mar 6, 2011 (gmt 0)

@Walkman, same here. Crazy activity from Gbot.

dickbaker - msg:4277340 - 4:53 am on Mar 6, 2011 (gmt 0)

Browsee, the only problem I find with using SearchEngineLand's suggestion to utilize Webmaster Tools is that the query data is either flawed, dated, or includes non-US datacenters. Webmaster Tools is showing phrases for my site at #7 where I find them at #20 (or worse).

vanessafox - msg:4277350 - 5:33 am on Mar 6, 2011 (gmt 0)

dickbaker -- Filter the data to the US only, Web only, and change the date range to begin at 2/25 if you want to see US-based current rankings data.

(I'll edit the article to mention that.)

Note also that results have gotten fairly personalized, so the average ranking position Webmaster Tools shows you is the average across what every searcher has seen. I'm fairly certain it's accurate.

walkman - msg:4277352 - 5:41 am on Mar 6, 2011 (gmt 0)

@Vanessa

great article on Search Engine Land.

Assuming that we fix the on-page problems and Google indexes the page right away, how many days are we talking until rankings are restored? Two or three days, a week, or longer?

dickbaker - msg:4277369 - 6:50 am on Mar 6, 2011 (gmt 0)

Thank you, Vanessafox. I haven't gotten very in depth with my use of Webmaster Tools, so I didn't know those options were available.

Today I added more "gravitas" to a product (ecommerce product) page that had been hit hard, dropping from #6 to #84. A few hours later I received three orders for these $1000+ widgets. One order a day is very good for me; three in a day is very unusual; but three in two hours is something I've never seen before.

In looking at the log files, I saw that all three visits came from Google, using either "Acme widgets" or "Acme model XYZ" as the search phrase.

I don't see the page ranking anywhere for those results until I get to page 9, and I doubt very much that Google would turn things around that quickly. It's either an odd coincidence, or there's something else going on (perhaps a mention on a forum).
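In case it helps anyone dig through their own logs the same way: Google referrers carry the search phrase in the q= parameter, so a quick script like this (the file name and combined log format are assumptions on my part) will list the queries that sent visitors.

    # List the Google search phrases found in an access log's referrer fields (sketch).
    # File name and combined log format are assumptions.
    import re
    from urllib.parse import urlparse, parse_qs

    with open("access.log") as log:
        for line in log:
            for ref in re.findall(r'"([^"]*)"', line):   # all quoted fields in the line
                if "google." in ref and "q=" in ref:
                    query = parse_qs(urlparse(ref).query).get("q", [""])[0]
                    if query:
                        print(query)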

JohnRoy - msg:4277379 - 7:13 am on Mar 6, 2011 (gmt 0)

Gooooooglebot is going nuts on my site right now. Absolutely nuts, it might get 100% of the pages today if the trend continues (60% so far)
<Fiction>; since it was suggested here that Google now classifies page quality (in addition to page relevance), the bot was sent out again for this important mission :)

trakkerguy - msg:4277383 - 7:47 am on Mar 6, 2011 (gmt 0)

@johnroy - you forgot the </Fiction> :)

zoltan - msg:4277388 - 8:08 am on Mar 6, 2011 (gmt 0)

Vanessa, this article is great. The main problem is that I do not see any pattern in what lost more and what lost less. It really looks like the target of this downgrade is the entire domain, not just specific site sections. Except for the brand name, all other US keyword traffic seems to be affected.

Jane_Doe - msg:4277394 - 9:16 am on Mar 6, 2011 (gmt 0)

Good article Vanessa, thanks. I've been working away on one of my sites tonight and your suggestions helped.

Bigwebmaster - msg:4277463 - 4:19 pm on Mar 6, 2011 (gmt 0)

I've noticed something interesting and am curious whether anybody else has seen it. In Google Webmaster Tools, if you look at your crawl stats, ours have had a slight uptrend in the last 3 months, so nothing abnormal there: between 40,000 and 60,000 pages crawled per day. Time spent downloading a page is fairly flat at around 150 ms over the same period. The chart that stands out to me is "Kilobytes downloaded per day". Since mid-February that has tripled, from about 500,000 KB per day to 1,500,000 KB per day. I'm not sure how it could be related, but timing-wise it coincides with this update, and while the kilobytes downloaded per day are normally relatively flat, since mid-February the figure keeps going up and up.
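Doing the arithmetic on my own numbers: with pages crawled roughly flat and kilobytes tripled, the implied average download per page has gone from about 10 KB to about 30 KB, which would point to Googlebot pulling heavier pages or more resources per page rather than simply crawling more URLs.

    # Implied average download per crawled page, using the figures quoted above.
    pages_per_day = 50_000                     # midpoint of the 40,000-60,000 range
    kb_before, kb_after = 500_000, 1_500_000   # kilobytes downloaded per day

    print(f"before: ~{kb_before / pages_per_day:.0f} KB per page")   # ~10 KB
    print(f"after:  ~{kb_after / pages_per_day:.0f} KB per page")    # ~30 KB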

freejung - msg:4277477 - 4:56 pm on Mar 6, 2011 (gmt 0)

Wow! Take a gander at some of those sites that people are reporting losses for. Go ahead and drill down to the MFA pages. I visited more than a few, and all of them would have failed my initial sniff test. Some of those folks have definitely stretched the limits of AdSense.

I agree that many of these sites have too much ad space, but there are several notable exceptions that lead me to suspect that ad space is not the primary factor here.

The one common thing I'm seeing in these sites has to do with overall look-and-feel and is hard to define, but it's basically this: upon landing on one of the internal pages of these sites, it's not immediately obvious that you've found the answer to your query (even in cases where a good answer really is there). This has to do with layout, typesetting and placement, which, when badly done, create an overall impression of confusion.

My hypothesis, based on this, is that user interaction data of some kind is the primary factor here. People land on these pages and react negatively in some way that Google can measure as indicating "this page doesn't immediately appear to be what I want."
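If that hypothesis is right, one crude proxy any of us could compute from our own analytics is a "short click" rate: the share of search visitors who leave again within a few seconds. A toy sketch follows; the data layout and the 10-second threshold are invented, and nobody outside Google knows what signal, if any, they actually use.

    # Hypothetical "short click" rate: share of search visits lasting under 10 seconds.
    # Purely illustrative - the data layout and threshold are invented, and nobody
    # outside Google knows what signal, if any, they actually use.
    visits = [
        {"source": "google", "dwell_seconds": 4},
        {"source": "google", "dwell_seconds": 95},
        {"source": "google", "dwell_seconds": 7},
        {"source": "direct", "dwell_seconds": 30},
    ]

    search_visits = [v for v in visits if v["source"] == "google"]
    short = sum(1 for v in search_visits if v["dwell_seconds"] < 10)
    print(f"short-click rate: {short / len(search_visits):.0%}")     # 67% on this toy data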
