Another long-time lurker here.
We operate a network of sites. These are all relatively old, original content white-hat sites with really very little professional SEO work done. In aggregate, they reach many millions of US uniques per month.
Anyways, with the recent changes, we have seen that a site is deemed either "good", "bad" or "neutral" in its entirety, and has had all of its pages' scores adjusted accordingly. Of course, this has different results on rank and traffic based upon strength, competition, etc.
The high-level view of what we have seen so far is:
1. 13% of our sites decreased dramatically.
2. 20% of our sites increased marginally.
3. The remainder had no noticeable changes.
The only thing that seems to correlate with being an "up" site instead of a down one is that nearly all of our up sites have keywords in their domain name. One down site has a good keyword in its domain name, and dropped on all terms except for the name keyword.
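For anyone who wants to replicate this correlation check against their own tracked phrases, here is a minimal sketch. The domain and phrases below are invented for illustration; a crude substring match is obviously not how Google tokenizes domains, it's just a quick way to eyeball the pattern.

```python
# Hypothetical check: which tracked phrases share a word with the
# site's domain name? Domain and phrases are made up for illustration.
def phrase_overlaps_domain(phrase, domain):
    """True if any word of the phrase appears in the bare domain name."""
    name = domain.split(".")[0].lower()
    return any(word in name for word in phrase.lower().split())

phrases = ["blue widgets", "widget repair", "garden tools"]
for p in phrases:
    print(p, "->", phrase_overlaps_domain(p, "bluewidgetshop.com"))
```

Run it over the phrases that dropped and the ones that held, and see whether the split lines up with the domain-keyword observation.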
I agree with you ismailman. Good post that explains the situation.
ismailman, agree, this is a sitewide score adjustment.
It's the weekend so I guess I need to go drink some beers. I see that some key phrases in my top twenty have recovered, while others that were untouched yesterday have been nobbled. But what my stats are telling me isn't what Google search is telling me, so it probably needs to settle. In most cases the changes seem to have resulted in me losing 6 or 7 places in the SERPs. In some cases the stuff between me and my old position is new but good, some is really old, some is just garbage. All appear to have over-optimized text, and about a quarter of them seem to be employing black-hat techniques like keyword stuffing that hasn't worked since 2002.
I'm not sure what Google means by copied content, as most of the penalized sites, including my own, don't have copied content. I can't be objective about the quality of my own sites, but some of those listed higher are appallingly bad.
If this is the worst of it, then I have to hope that the worst of these sites will be weeded out by the Personal Blocklist Chrome extension (whatever that is?), so I might move a few spots higher.
greenham13 - I did a quick look and it does appear that the new sites have at least one part of the key phrase in the domain name. It might explain why my top key phrase was nobbled but my number 2 phrase is still holding out.
@GeraniumV I'm one beer ahead of you.
I posted a new blog post this morning, and the original post on our blog is nowhere for the title in quotes, yet three sites that scraped the content could be found. Anybody know what's up with that?
What I've done:
Removed an entire 'thin' section (blocked in robots.txt; removal request in Webmaster Central)
Changed the titles to 'Widget My Domain Name' (my domain name is a keyword)
Removed a few mentions of 'Widget' in the page to thin it out
Hope and pray that as G indexes them, I'll cross the threshold. Any penalty/demotion from yesterday is algorithmic, and per Matt Cutts, it's solved as soon as G re-indexes and recalculates the rankings--assuming the problems are fixed.
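For reference, the "blocked in robots.txt" step above would look something like this. The /thin-articles/ path is invented for illustration, not taken from the poster's site:

```text
# Hypothetical robots.txt entry blocking a thin section before
# filing a removal request in Webmaster Central.
User-agent: *
Disallow: /thin-articles/
```

One caveat worth knowing: Disallow stops crawling, not indexing, which is exactly why the separate removal request in Webmaster Central is needed to get the already-indexed pages out.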
It occurred to me after much of my ranting on this thread to think long and hard about what Google is going after in this update and how they might do it.
They want to eliminate or dramatically reduce content farm rankings. They need something pretty sweeping to make this kind of adjustment. It doesn't look like it's based on things like poor grammar or even keyword repetition. In fact, there are reports that sites with both of these features have "bubbled up" above long-term authority sites.
The fallout is much too high, with sites that have unique, non-spammy content falling.
What I have seen almost universally in sites that have fallen is the following:
1) Lots of pages of content, sometimes generated over years.
2) A relatively small number of incoming links, or incoming links that are of low value/low quality.
Content farms survive essentially by generating thousands of pages of content on "authority" domains that rank well not because the individual pages have well-structured non-spam links, but because the domains themselves have "authority". While generating pages is relatively easy for an authority domain, generating high-quality incoming links is considerably more difficult--especially as Google gets better at identifying low-value links.
I personally think this is a new site-wide filter that devalues sites that fall too far afield from the link-quality-to-page-count ratio.
Of course, it's probably a little more complex than just "number of links" vs. "number of pages". Google could be looking at any number of factors when making this calculation.
If you take what Matt Cutts says seriously, Google wants to value well-researched, well-written articles that provide unique content and perspectives. These are the types of articles that (1) are not produced very quickly; (2) are updated as more information becomes available; (3) are generally good at drawing links naturally; and (4) would generally be more difficult to write "quickly".
A site that is committed to these principles would naturally have a high link-quality-to-page-count ratio.
Now, I'm not suggesting you go and delete all your content. I don't really know. But, I think this isn't necessarily a bad theory on what Google might be up to.
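The ratio theory above can be sketched as a toy heuristic. To be clear, this is purely speculative: the scoring, weights, and threshold are invented for illustration, and nobody outside Google knows the real math.

```python
# Speculative sketch of the "link quality vs. page count" theory.
# All weights and the threshold are invented, not Google's actual values.
def link_quality_score(link_weights):
    """Sum of per-link quality weights (0.0 = spam, 1.0 = strong editorial)."""
    return sum(link_weights)

def looks_like_a_farm(link_weights, page_count, threshold=0.05):
    """Flag a site whose aggregate link quality is tiny relative to its size."""
    ratio = link_quality_score(link_weights) / max(page_count, 1)
    return ratio < threshold

# A 10,000-page site with 50 mediocre links vs. a 200-page site
# with 80 decent links:
print(looks_like_a_farm([0.3] * 50, 10_000))  # ratio 0.0015 -> flagged
print(looks_like_a_farm([0.7] * 80, 200))     # ratio 0.28   -> not flagged
```

The point of the sketch is just that a farm can churn out pages far faster than it can earn quality links, so any ratio of this general shape would punish farms while sparing small, well-linked sites.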
I want to add, in regards to G not identifying my content when searching for the title in quotes: my page is indexed per the site: command and was crawled less than 24 hrs ago, but with the search it's not found.
At first I wasn't thinking sitewide or page-specific penalty as many have suggested; I really felt it was just a shocking change to the index, but I'm starting to rethink this.
Scraper sites show up with my content, and even though my page is cached it's not showing in a search. Really weird, stupid, frightening.
Reactions to the Google update on CNN are referencing Webmaster World, backdraft7, and DickBaker.
Read the Article [money.cnn.com]
Google gave a sweetheart deal to their corporate buddies in my field. If you're not as big as Wal-Mart in your field, you got toasted. It was a little-and-medium-guy ($20 million size) wipeout for my subject matter.
It was the little guys that made Google in the beginning; now googlesoft has stabbed them in the back.
Curious to know if the people in the revenue department saw a huge drop.
Ismailman, that's an interesting theory you present. Here's a question, though: as I've said, I had many, many phrases drop from the first page to nowhere. One phrase, for example, has ranked #2 to #6 since 2005. It's kept that ranking through every strange update Google has thrown at us.
If it were a matter of scoring, why would this page drop all the way from #5 to #49 for that phrase?
As I check phrases I've tracked over the years, the drops are anywhere from as little as 6 positions to as many as 40 or 50. These are all phrases that were first-page until yesterday.
Here's a question for everyone: is there anyone here whose site has been affected who does NOT use Google Analytics, or is everyone who's been affected a GA user?
Amazon, eBay, Walmart, Sears and Best Buy all saw big increases in ranking, but not as big as Etsy and Zazzle did. There are also a good number of small retailers in there without much content but with very user-friendly stores that got a huge benefit.
My sites that have GA were affected.
My sites that don't, were not affected.
I am not saying Google is penalizing everyone with poor backlinks. I am saying I am seeing a lot of sites get pushed off the first page who have poor backlink profiles, meaning their entire profile is poor or looks paid. Hopefully Google is simply devaluing these links and not imposing penalties. My feeling is, if Google feels a link is paid/spam etc., they should not count it.
FWIW, saw some activity on my statcounter from Google. Some time spent on my site a bit ago. It's possible that some manual reviews are being conducted. Could they be checking sites that have plummeted in ranking and reviewing their content?
The comment on backlinks: that can't be the case, because I have more than one site. The highest-quality site I have is large, with links from commercial sites in the space, media sites, and well-known forums (natural), and some of my work is even cited in printed books. I have smaller sites with limited content, maintained haphazardly, not up to date, in arguably the same topic areas, and with absolutely zero backlink profile. Yet those sites gained.
So this is not about backlinks. It's something else altogether. The only thing I know is that the bigger site is heavily copied.
[edited by: falsepositive at 10:40 pm (utc) on Feb 25, 2011]
brinked: I think this goes deeper than not counting bad links. Google has been working on that for a while. Google appears to still be counting many of the links, but what they are doing is saying that if your entire site doesn't meet a certain "trust" factor of some sort, all of your scores will be devalued. How this plays out in individual SERPs depends on just how far ahead you were of the competition without the filter in place.
Have never used Google Analytics or AdSense, have no scraped content, no paid links, and still see a significant percentage drop (40+%).
Why I don't buy the link theory for this update: the sites Google was supposed to target all have some of the best link profiles (Demand Media, Squidoo, etc.), so if they let the sites with great link profiles stay, their mission would have failed automatically.
I think link quality is only one factor in a much greater algo change. In my industry I have seen a lot of sites get pushed up the SERPs that have exact anchor text and the keyword in their domain. To me it looks like Google is making the results worse, not better.
walkman: they have a great link profile, but perhaps not considering their size. Size is a BIG factor here, I think.

Basically, in the past, if you had crap content sitting somewhere on your site (and by crap, I mean content that wasn't link-worthy), you basically were not penalized. Other pages could rise just fine in the rankings, even if you had hundreds of pages that weren't very good. I believe this new algo changes that.

My theory on the new rule: your link profile has to be strong enough to sustain your site's size, otherwise you plummet. This is a great way to go after the farmers. If a farmer can get a strong enough link profile to support the potentially thousands of pages that are produced, well then, they survive. But if they can't, if they build too many pages, then their entire sites suffer.

Now, I do think Google overdid it when they cranked up the juice on this filter, and hopefully we'll see some tweaks to how it is implemented. Again, it's just my guess, but it makes a lot of sense to me.
FredOPC, your theory sounds very interesting; I tend to agree with it based on the profile of our sites. Thanks.
An interesting thing is that HubPages literally got hit twice as hard as Squidoo with this update. I suspect it has something to do with both quality control on Squidoo and their internal link structure. If you just visit Squidoo and click around from one page to the next, it takes a while before you see really bad content. If you go on HubPages, within two clicks you'll be on a really blatant spam page with nothing but an Amazon product or something.
Just speculation about the reason behind it.
|Google is going to look pretty dumb if there is indeed a rollback, even if it's partial |
Google already looks dumb, very dumb ... They want to punish content farms, but are not able to detect the difference between content farms and unique quality content.
We may be trying to make sense of something that's nonsensical. If their goal was to reduce spammy, generated and/or copied content, there's no way that this algorithm, or should I call it massacre, is doing what it's intended to do. I'll go back to my earlier example of syndicated content (which sells for quite a pretty penny, I may add). Companies which take material and rewrite it are handily selling these rewritten articles to sometimes up to 15k sites at one time. But if you type in the exact title of one of these articles, guess what comes up almost exclusively on the first page of results now? Big players who all purchase the exact same articles and don't even change the title. Wait, they can't change the title, because that violates the terms of service. I'd love to hear an explanation of how a company as large as Google, with the level of engineers that they have, can miss this and yet destroy solid sites which wouldn't ever think of purchasing a link and/or purchasing content that's used on 15k sites, because it adds no value. So obviously this new "improvement" is not targeting non-unique material.
And if you're a big player it's fine to republish day after day the same material as 15k sites online ---with not a sentence that's unique? Furthermore, this content is "valuable" and is rewarded? Something's terribly wrong with this picture.
@jessica97 in my scenario it's lesser scraper sites, the ones that were supposed to be dealt with, that are showing up.
"Ismailman, that's an interesting theory you present. Here's a question, though: as I've said, I had many, many phrases drop from the first page to nowhere. One phrase, for example, has ranked #2 to #6 since 2005. It's kept that ranking through every strange update Google has thrown at us.
If it were a matter of scoring, why would this page drop all the way from #5 to #49 for that phrase?"
That's easily explained. Google uses 200+ factors to rank a page for a specific query. So essentially, when you do a search for X, they use the 200 factors and come up with a score for each page. Let's say URL 1 gets a score of 98, URL 2 a 75, etc. Well, if you're URL 2 and your 75 is now a 50, and the other URLs beneath you were clustered around 70-60, then you're going to drop a bunch of places. In another case, if your page has a score of 100 and the next page has a score of 70, and you get punched down to a score of 65, you'll only drop a spot or two.
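That arithmetic is easy to demonstrate. Here is a toy illustration, with all scores invented, showing why the same sitewide devaluation moves a page many spots or only a few depending on how the competitors' scores cluster:

```python
# Toy model: rank = 1 + number of competing pages scoring higher.
# All score values are invented to illustrate the argument above.
def rank_of(my_score, competitor_scores):
    """1-based rank when results are sorted by score, highest first."""
    return 1 + sum(1 for s in competitor_scores if s > my_score)

# Competitors clustered around 70-60: a drop from 75 to 50 falls past all of them.
tight = [70, 69, 68, 66, 64, 62, 61, 60]
print(rank_of(75, tight), "->", rank_of(50, tight))   # 1 -> 9

# Next competitor way down at 70: a drop from 100 to 65 barely moves you.
sparse = [70, 45, 40, 35]
print(rank_of(100, sparse), "->", rank_of(65, sparse))  # 1 -> 2
```

Same-sized score cut, wildly different rank movement, which would explain why some first-page phrases went to nowhere while others barely budged.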
It has nothing to do with non-unique material. They're apparently trying to figure out another way to deal with duplicate content, but that's a really major engineering challenge. Copied content still ranks and it has for quite some time.
When it comes to getting rid of the article sites, how-to sites and things like that, though, this has been very successful.
|I've got a 6 year old site, with at least 1000 or more words per page. Have been cited by media sites, commercial sites, blogs, books, etc. Yet, I was hit with a 50% traffic drop. |
I have the same profile, except we saw a slight (maybe 7%) increase in traffic... Really hard to say, because there is not enough data yet.
We run Adsense too, so it will be interesting to see how Google actually did the cut.
We should band together and keep a tally of how much the new Google update has cost our businesses... a daily tally... then send it to our congresspeople. Additionally, we should keep a tally of how many employees and contractors we have had to terminate as a result of this craziness. Finally, we should keep track of all the websites that scraped content from all of our unique, old websites and are now sitting high in the search results making money from AdSense... and bill Google for all the hours we put into our websites to make them attractive, useful and interesting to users... enough to where somebody scraped them to get some Google traffic for some AdSense dollars.
1. $8,000 US in revenue per day they don't correct this.
2. 10 employees will be getting pink slips on Monday if this is not corrected.
3. Gathering up all the hours it took to create the web pages that were scraped, which are now being listed on the first page of Google ahead of my site, is going to be a difficult task... (hell, I have time now... I'll report back to you soon...)
What has this cost you?
|That's easily explained. Google uses 200+ factors to rank a page for a specific query. So essentially, when you do a search for X, they use the 200 factors and come up with a score for each page. Let's say URL 1 gets a score of 98, URL 2 a 75, etc. Well, if you're URL 2 and your 75 is now a 50, and the other URLs beneath you were clustered around 70-60, then you're going to drop a bunch of places. In another case, if your page has a score of 100 and the next page has a score of 70, and you get punched down to a score of 65, you'll only drop a spot or two. |
I wish I could show you the pages from my site that dropped 40 positions, and show you what's now above those dropped pages. I'm 110% certain that you wouldn't think those now-better-ranking pages could have any kind of score at all, especially when you get to page three or four. My niche isn't filled with exceptional sites.
Here's another head-scratcher. In 2001 I created a small site for a retail store. The site is in my niche. I got the store's site ranked well for a number of phrases. One phrase has been in the #2 spot since 2001, and only recently moved to #3.
I stopped working on their site in 2004, and they haven't touched the site since then. It gets very, very few visitors.
For the phrase in question, their widgets page has exactly one inbound link, and that's from a site that's scraping the product photo.
I was #6 on that page for the same phrase, and my widgets page has at least 100 different sites linking to the page. My page went from #6 to #48.
If their page stays on Google's first page, that will throw a wrench into just about every theory that's been presented here so far.
SEO was always a game of countermeasures. It is too early for me to come up with any, but after careful investigation, I now see a drop of roughly 12% in traffic on our site. That correlates with the statement they issued. It also means around $1,000 less daily income for our company. Not funny, and not yet clear how to react to it. Let the weekend pass and then analyze, is my plan.
Again, like on MayDay, where does that traffic go? Some came up after MayDay and said: good, I went up by XY%... Where are those webmasters now? I would rather hear the positive stories and the reasons to rank than what NOT to do...