Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Google Updates and SERP Changes - March 2011


Whitey

4:53 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



< continued from [webmasterworld.com...] >

< related Panda Farm Update [webmasterworld.com] >


I keep dropping mentions of this, but no takeup, so I did some digging for clues to support my theory that Chrome is passing back intelligence that could influence this new algo and future changes:

New Chrome extension: block sites from Google's web search results
Monday, February 14, 2011 | 12:00 PM

Today the Google web search team launched a new Chrome extension to block low-quality sites from appearing in Google’s web search results. Read more in the post below, cross-posted from the Official Google Blog. - Ed


[chrome.blogspot.com...]

Also - [webmasterworld.com...]

I think user behaviour data is being underestimated in this thread. Each website will have an in-depth profile built up that feeds into a potential quality assessment by Google. What say you?

[edited by: tedster at 8:15 pm (utc) on Mar 15, 2011]

live3life

11:35 pm on Mar 11, 2011 (gmt 0)

10+ Year Member



I think the strength of the algorithm can only be fully tested when dealing with evolving sites: downgrade a site, then see how big a spike is required before the site is reevaluated. The algo can only be fully proven when it lets an evolving site break through into top ranking. Until then it could be accused of Panda thinking....... black and white.

Jane_Doe

2:17 am on Mar 12, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I don't think I should have to grovel and write a letter, oh please put me back in the SERPs where I used to be.... even though I think my site was dropped unfairly.


I don't think letters are going to work. In my case I am probably going to have to redesign my site - at least rewrite all of the pages that tanked. But it isn't the first time I have had to do that and probably won't be the last.

zerillos

10:21 am on Mar 12, 2011 (gmt 0)

10+ Year Member Top Contributors Of The Month



Today I lost rankings for yet another keyword. Up until today, the article ranking for this particular keyword was not scraped at all. Today I see two additional results when searching for phrases from that article. One of them is a page that doesn't even have the actual text on it, and the other result redirects to an Amazon product page (Kindle). The subject of my article has nothing to do with Amazon or Kindle.

spaceylacie

6:06 pm on Mar 12, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't think letters are going to work. In my case I am probably going to have to redesign my site - at least rewrite all of the pages that tanked.


Even though I am really tempted to just give up the endless battle, I've started doing the same thing. I am going through sentence by sentence to see what has been scraped and what hasn't; most of it has been stolen many times over. So I get docked for duplicate content. I've spent about 40 hours so far rewriting about 5 sections, with just 134 left to go, as all my pages tanked! Then, after all that work, I'm sure it's just a matter of time before the new copy gets scraped.
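For anyone doing the same sentence-by-sentence check, one way to speed it up is to generate quoted search phrases automatically and paste them into a search box one by one. A rough sketch of that idea; the sample page text and the 8-word phrase length are just assumptions:

```python
import re

def distinctive_phrases(text, words_per_phrase=8):
    """Split text into sentences and emit quoted search phrases of
    `words_per_phrase` words each, for pasting into a search box."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    phrases = []
    for sentence in sentences:
        words = sentence.split()
        if len(words) >= words_per_phrase:
            # Quote an 8-word slice; a long exact phrase in quotes is
            # unlikely to match anything but copies of the original.
            phrases.append('"' + ' '.join(words[:words_per_phrase]) + '"')
    return phrases

page = ("The quick brown fox jumps over the lazy dog every morning. "
        "Short one. "
        "Widget assembly requires careful alignment of the mounting bracket screws.")
for q in distinctive_phrases(page):
    print(q)
```

Sentences too short to be distinctive are skipped, so only phrases worth searching come out.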

falsepositive

7:12 pm on Mar 13, 2011 (gmt 0)

10+ Year Member



Spacey, what's the point of rewriting when it will be stolen again? I would have to rewrite my entire website if that were the case. That's 4 years' worth of work. I have seen way more than 50% of my work duplicated in some capacity. I am currently running a report to assess the duplications to find out the extent of scraping. I am planning to send this report to Google once I'm done. That's the best I can do. It will hopefully be more worth my time than rewriting articles that have taken me many years to build up.

spaceylacie

8:10 pm on Mar 13, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Falsepos, mine is over 10 years worth of work. It's been scraped so much it's ridiculous. Multiple pages of results that are not my site when searching for paragraphs in quotes. I don't know how long it would take before it's scraped again, but at least I should be fine for a few years.

My first thought was also to compile my case and grovel to G. Go ahead if that's what you want to do. It just makes me mad that the little guys are stuck writing pathetic letters that probably won't do any good anyway.

Jane_Doe

8:16 pm on Mar 13, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



So I get docked for duplicate content.


Sometimes the scraped articles just rank low because of other low-quality issues. The scrapers ranking higher is sometimes the effect, not the cause.

Rewriting will take the longest so I'm tackling other issues first, like removing or consolidating smaller pages and fixing broken links.

maximillianos

8:21 pm on Mar 13, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sadly I have tried the letter writing approach in the past. 4 years ago I filed a DMCA with Yahoo and Google regarding a "new" competitor in my field. They had copied over 10,000 pages from my site.

Neither search engine seemed to care at the time. To think I went through the trouble of outlining almost 10,000 distinct URLs from their site, matching each one to my site (I wrote a program to help). They were even all created and dated on the same day! It was an obvious case, but neither search engine cared to do anything about it. They just told me I should seek legal help. Our lawyer said it would cost $40k-$50k if we went to court. I don't know about most of you, but we don't have that kind of money.
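The matching program mentioned above could be quite simple. A minimal sketch of that kind of comparison using Python's difflib; the page texts here are hypothetical, and the 0.9 threshold is an assumption:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Ratio in [0, 1] of how much two page texts overlap."""
    return SequenceMatcher(None, a, b).ratio()

# Hypothetical page texts: your original and a suspected copy.
original = "Our guide to widget maintenance covers cleaning, oiling, and storage."
suspect  = "Our guide to widget maintenance covers cleaning, oiling, and storage tips."

score = similarity(original, suspect)
print(f"similarity: {score:.2f}")
if score > 0.9:
    print("likely copy -- log this URL pair for the DMCA report")
```

Run against every URL pair, anything scoring near 1.0 goes into the evidence list; near-duplicates with small edits still score high because the ratio is based on matching character runs, not exact equality.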

So here we are 4 years later. This "competitor" is now leading our niche after the Panda update. Mind you, I am not the only site they scraped. Every site in our niche that was scraped 4 years ago is now at the bottom of the SERPs or off the radar. The two scrapers in our niche are the top 2 results for every search in our niche.

Weird how Google got it so backwards. It is too late now to stop them. They have had years of masquerading as legit sites so they have built up legit links, etc.

There is nothing we can do but try to compete on what is now an unfair playing field, as we have been penalized and they have not.

tedster

8:26 pm on Mar 13, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There's hope. Here's Matt Cutts, speaking at SMX last week, as reported by Vanessa Fox:

Continued Algorithm Changes

Google is working to help original content rank better and may, for instance, experiment with swapping the position of the original source and the syndicated source when the syndicated version would ordinarily rank highest based on value signals to the page. And they are continuing to work on identifying scraped content.

[searchengineland.com...]

I also noticed this: "He said the focus this year is on making low quality content and content farms less visible in search results as well as helping original creators of content be more visible."

So it's not one or two updates and then on with other search factors. It's the search department's focus for the whole year.

spaceylacie

8:40 pm on Mar 13, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks, tedster, for that link and info. You must imagine how some of us feel who were around BEFORE Google; we helped them spider the web in the beginning, and they appreciated our input. Now many of us feel like we were used and then thrown to the gutter. I had heard that it's being worked on, but nothing as recent and specific as this. Thanks again for posting.

mrguy

8:47 pm on Mar 13, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's the search department's focus for the whole year.


Translated, that means be ready for many more changes and upheavals throughout the year.

tedster

8:53 pm on Mar 13, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



With the way the web scaled up, this whole area just got away from Google. There now are cross-purposes within the ranking algorithm, and each one on its own does make sense:

1. Original Source
- a journalist's focus or an academic concern, not traditionally a search engine matter

2. Freshness Factor
- the need for results to be current can outweigh the original author's intellectual property

3. User Preference
- The "big brand" source often gets the click, not the original source even if it appears

This 3-way conflict is a real rat's nest to sort out, even if the original source is clearly known. And knowing the original author is a major challenge, for technical reasons and also because of the massive scale involved.

TheMadScientist

1:53 pm on Mar 15, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Results look like they're back to what they were about a week ago for me this am ... Not exactly the same, but very close ... Anyone else seeing anything that looks reverted?

chrisv1963

2:11 pm on Mar 15, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Results look like they're back to what they were about a week ago for me this am ... Not exactly the same, but very close ... Anyone else seeing anything that looks reverted?


I see different sets of results. Do a search and then do the same search a couple of minutes later. You might see very different results.

I don't think it reverted. Maybe some tweaks, but nothing more than that. They are so "proud" of the mess they created and I don't think they will revert it.

TheMadScientist

2:21 pm on Mar 15, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Yeah, I don't think they're going to turn the dial all the way back to pre-panda, but to me it looks like they're back where they were not too long ago ... I tried refreshing and didn't get anything different, but that's been 'niche (or even search) specific' for quite a while it seems.

Jane_Doe

2:43 pm on Mar 15, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



As far as original content, sometimes they just put the higher PR site up first and call it a day.

I have had content that was online and indexed in Google for over a year copied by a higher PR, high trust rank nonprofit .org site and my site got the duplicate content penalty. The page stopped ranking for anything until I rewrote it.

[edited by: Jane_Doe at 3:27 pm (utc) on Mar 15, 2011]

TheMadScientist

2:47 pm on Mar 15, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think you hit the wrong thread Jane_Doe ... Been there, done that myself.

tedster

8:19 pm on Mar 15, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



With regard to refreshing the results and seeing them change, I don't see any change in the Google IP address when that happens. I know that IP does not equal datacenter these days, so it still might be a different datacenter, but something in Google's back end is getting pretty hard to fathom.

crobb305

10:39 pm on Mar 15, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



TheMadScientist,

I thought I saw my rankings revert, until I realized I was seeing that personalized search crap. My traffic data show no improvements (still -58% to -65%). Hopefully you're really seeing improvements, as that would be a positive sign. I think I asked before, but remind me -- are you making changes or waiting it out a bit?

C

falsepositive

11:43 pm on Mar 15, 2011 (gmt 0)

10+ Year Member



crobb305, forgive me for interjecting. I read on the G forums that John Mu is encouraging changes if you were hit. He says:

This is not limited to this particular algorithm update & your site, but I'd like to mention it regardless:
I can assure you that our algorithms are not one-way streets. As a website is updated, recrawled, reindexed, and with that, the site's signals reassessed, our algorithms will take those changes into account and treat the website accordingly. There are countless examples of that happening here in the forums, I see them regularly.

That process is usually not something that takes place overnight after a webmaster has uploaded a fresh copy of the code for the website. For example, it takes time for us to recrawl the pages, the bigger the site, the longer it will take. The better a site is structured (less duplicate content, no infinite URL spaces, etc), the faster we'll be able to recrawl parts of a site and take that content into consideration. Sometimes, even after recrawling parts of a site, our algorithms will need a bit of time to confirm that the site has really changed for good.

All of this can and will take time. Personally, I'd recommend not waiting to see if a single, small change will make a difference, our algorithms rarely have a "one-track-mind," they take many factors into account. Because of that, I'd recommend always continuing to work on your site, to improve it, expand it, to get feedback from your users and to take action on that feedback (happy users come back and recommend your site to their friends!). Even when you start to see changes, don't stop there -- make your site into the best resource of its kind.


So I'm taking that to mean: keep working to make sure your users are happy.
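John Mu's point about site structure (less duplicate content, no infinite URL spaces) speeding up recrawl can be made concrete: collapsing URL variants that serve identical content is one way to find and reduce accidental duplication. A minimal sketch; the set of parameter names treated as tracking junk is an assumption:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Assumed list of parameters that never change page content.
TRACKING = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "ref"}

def canonicalize(url):
    """Collapse URL variants that serve identical content: lowercase the
    host, drop tracking/session parameters, and sort what remains."""
    p = urlparse(url)
    params = sorted((k, v) for k, v in parse_qsl(p.query) if k not in TRACKING)
    return urlunparse((p.scheme, p.netloc.lower(), p.path.rstrip('/') or '/',
                       '', urlencode(params), ''))

urls = [
    "http://Example.com/widgets/?utm_source=feed",
    "http://example.com/widgets?sessionid=abc123",
    "http://example.com/widgets",
]
print({canonicalize(u) for u in urls})
```

All three variants collapse to a single canonical URL; run over a crawl log, any canonical URL reached by many raw URLs marks a duplicate-content trap of the kind John Mu describes.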

TheMadScientist

12:59 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Yeah, I don't think mine was personalization ... I was using a no-cookies, Google-only browser with 2 proxies in separate tabs and scripts off ... I actually checked back a bit later and they had reverted to the newer version, so I'm not sure what the deal was. Like tedster said, there are things that are really tough to figure out sometimes, so I try not to over-analyze it too much and move on to the things I can control.

Thanks for checking though ... I've only been watching a small set of terms and am knee-deep in a project, so I haven't had as much time for 'my stuff' as I normally would to get a better idea of whether something is a 'fluke' or more widespread.

@falsepositive I think you should feel free to make great interjections like that any time ... Really, thanks for sharing the info!

crobb305

1:04 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



make your site into the best resource of its kind.


This is just a rehash of the same tired "content is king" argument from 7 years ago. Some good sites that have focused on content have been hit here. I am proof of that. Granted, I have some affiliate links, but only on 4% of my pages, and I have tested ALL my content pages for duplication; mine are the only copies out there -- no duplication. We have been very aggressive over the years at issuing DMCA notices on all copyright violations. Yet we are penalized. They are just giving lip service: "Work on your sites... but give it time." (How long? A year? Didn't they say the same thing about the -50 penalties?)

TheMadScientist

1:14 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Crobb305, I know you're in a tough spot, but let me point out a couple of things that stuck out when I read the post, not only to you but for others. They might be 'nothing', but I think there could be something in the choice of words, and maybe it'll give you or someone else some ideas or a different way to look at things...

That process is usually not something that takes place overnight after a webmaster has uploaded a fresh copy of the code for the website.

A fresh copy of the code? Hmmm ... Why did he say 'code' not 'content'?

Sometimes, even after recrawling parts of a site, our algorithms will need a bit of time to confirm that the site has really changed for good.

Confirm that the site has really changed? Interesting...

All of this can and will take time.

And will take time? That's fairly definitive in terms of 'people are not going back to the top today from the fiddling they did yesterday, or maybe even last week.', imo.

He didn't say, 'it may take time' ... Indicating the converse 'it may not be long' would also be true ... He says can and will take time.

[edited by: TheMadScientist at 1:25 am (utc) on Mar 16, 2011]

trakkerguy

1:20 am on Mar 16, 2011 (gmt 0)

10+ Year Member



confirm that the site has really changed for good


Yes, that is interesting. And suggests a good reason for why they might have a delay before recovery (besides just covering their tracks).

Makes sense.

crobb305

1:28 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Points taken, MadScientist. It's easy to get frustrated and throw inanimate objects over this whole thing when I have worked VERY hard to build a valuable resource, fought duplication and content thieves, and spent many hours researching topics for my site, only to get spanked by an algorithm that wasn't ready. It has hurt a lot of sites, and they know it. But you raise interesting points about his chosen words, in particular the "fresh code". Are they looking at page upload dates from the server? Maybe he means "code" as all-encompassing (covering everything from grammar, spelling, dead links, and duplication to short/missing title/description tags, etc...)

TheMadScientist

1:43 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's easy to get frustrated and throw inanimate objects over this whole thing when I have worked VERY hard to build a valuable resource, fought duplication and content thieves, and spent many hours researching topics for my site, only to get spanked by an algorithm that wasn't ready.

Oh, I feel ya ... I didn't get tanked by the algo though ... I read a bunch of news articles about a site as if it were the first of its kind, when I was part of launching the same idea on a larger scale months ago ... Talk about instant frustration and a nearly insatiable desire to throw things ... I put MONTHS into building the site that's already there, and the one that launched was 'thrown together' (built, if I'm being PC) in a few days and launched incomplete ... It didn't even have terms of service or a privacy policy, but it was the 'news'?

It appears the people involved in the articles didn't even bother to try a simple search on Google or Bing to see if maybe the type of site they were reporting on had already been done with a much higher degree of professionalism than what they were reporting on ... Yes, I'm still heated, because building the one I worked on was a HUGE investment of time and effort and a simple search would have shown the 'news' site is WAY behind what's already being done.

### Moving on to the Best of My Ability ###

I think it goes even further than you're taking it with code ... Add: HTML structure (including p1r's favorite, semantic markup), layout, you name it, it's in there, imo ... I know people say 'we've tested and blah doesn't count any more', which imo is the case singularly, but I think (other than keywords) almost everything 'works together' to count ... How else do you even find 200+ variables to score a page?

tedster

2:03 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



confirm that the site has really changed for good

It only takes a few weeks of reading here to realize that people often create redirects and URL changes, only to back out of them within days. Then some will layer on yet another set of changes trying to repair their first errors.

I think the operative words in John Mueller's statement are "for good". Imagine rebuilding the foundation of a house several times within a few months. Because it's in the physical world, that wouldn't happen. But because it's "only code", I think many are far too casual about changing their website's foundation.

walkman

3:16 am on Mar 16, 2011 (gmt 0)



I think people sometimes over-analyze things, like every word said by a Google employee. John most likely didn't ponder each word for hours; he just said them casually. I read it as: "You need to make changes; we need to index them; you wait until we decide to remix the sauce again, and you can get out of this -- sometime, and only if you do the right thing as Google sees it."

My guess is one month.

falsepositive

4:06 am on Mar 16, 2011 (gmt 0)

10+ Year Member



Not sure if this was discussed here yet, but I'd like to ask... I stumbled on this old thread [webmasterworld.com ]. It seems this problem has existed since time immemorial -- throwing the baby out with the bathwater, scrapers winning, good sites that were scraped disappearing, etc. Deja vu? Does anyone remember what happened with this update? My site wasn't even born yet when this happened... I'm wondering what kind of resolution the many victims of collateral damage ever got.

crobb305

4:25 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



John most likely didn't ponder each word for hours; he just said them casually


@Walkman, you may be right. After all, John is a programmer, and programmers tend to speak in programmer's lingo, so the word "code" may have rolled off his tongue more naturally than "content". Just another way to look at it (again, I am over-analyzing, lol), and I still see MadScientist's points. I have been working since last night, following Google's suggestions, and finding some VERY interesting things in my WMT data for the pages that took the biggest hits. I may try to summarize my findings here soon. Since my site is small and static, it's fairly easy for me to track down recent changes and update each page readily. I can't imagine having a 1,000+ page site and having to do this!
This 366 message thread spans 13 pages: 366