Google SEO News and Discussion Forum

Recovery from Google Changes on 16 Nov 2012
Sgt_Kickaxe
msg:4523367 · 9:41 pm on Nov 28, 2012 (gmt 0)

On Nov 16th Google made a change that impacted a good number of sites. Some asked Google if it was a Panda update; Google said it was one of the roughly 500 changes it makes per year, not a Panda/Penguin update (one of those is due out this week). You can read about some of the affected sites in the monthly SEO thread here - [webmasterworld.com...]

Specifics
- The change was not sitewide: it impacted individual pages while leaving others alone
- Each affected page had unique, on-topic content
- Very little else is definitive, but we did have pages to compare within the same domains

Comparing the affected pages, nothing about them stood out from an SEO perspective. Backlink profiles were varied but similar to those of the non-affected pages. On my own site the affected pages seemed to be more brand/transactional in nature, but some of those were untouched as well.

Recovery
- The one common denominator on all affected pages was medium-quality content. The pages I "beefed up" from Nov 17th to 21st have already recovered (as of yesterday).

What struck me is that this isn't the low-quality type of content you'd expect to see receive a downgrade. Content that was 100% unique, accurate, on topic and useful received the downgrade, but in every case there was either not enough useful information or it was worded a little too generically and would likely get a borderline rating from a human reviewer. Definitely not spam, but only of marginal usefulness - too narrow in scope or not comprehensive enough to solve a problem.

The speed with which these pages are bouncing back, and the fact that the pages I haven't "fixed" yet aren't, suggests it's not a penalty but rather a low quality downgrade that can be resolved easily.

I thought I'd share this since some are getting drastic with their changes. "Thin" content took a hit a long time ago; I'd say medium content of only minimal usefulness just did as well. Simply being "unique", "original" and "on topic" doesn't mean anything anymore.

Broad quality downgrades are not exclusive to Panda/Penguin!

 

hitchhiker
msg:4581128 · 8:48 pm on Jun 4, 2013 (gmt 0)

Google replied a second time (via John). Could anybody affected by this PLEASE comment on that thread?

We might have a legitimate chance of getting an answer to this.

I've asked that they (Google) simply *review that update, tell us if they're happy with the result they got in relation to UGC.*

ONLY if you were not involved in any anti-guideline activity, and were specifically hit by the Panda 21.5 ghost update around Nov 17th/18th, 2012.

[edited by: tedster at 3:51 am (utc) on Jun 10, 2013]

1script
msg:4581147 · 10:22 pm on Jun 4, 2013 (gmt 0)

LOL, getting him to reply for the second time in the same thread is already an achievement! I hope something good will come out of it.

rango
msg:4582307 · 11:05 am on Jun 8, 2013 (gmt 0)

Just chiming in here since we're in a similar boat. We weren't hit on Nov 16th, but since that date there has definitely been a steady downward trend in our traffic, to the point where we're at about 40% of the traffic we had this time last year.

Profile: 10+ year old UGC site with forums, photography, user blogs on subdomains, accommodation listings (possible problem area) and a wiki. Search traffic in Nov was at about 500,000 - now at 400,000 per month. Last May the traffic was at 800,000. The drops are hidden a bit by seasonality. I expect this coming Nov to be at about 200,000 assuming nothing else drastic changes.

The recommendation from John Mu to de-index is interesting. We did this last year on quite a few of our pages, and so far there is really no indication that it helped. Quite the contrary - I'd say our traffic has only worsened since then. In other words, tread carefully.

I think you want to be very careful about which pages you de-index. Blanket rules to deindex all profiles might be dangerous, since you could have valuable links pointing in there. I don't really think noindex,follow is effective as a way of keeping the link juice flowing: from what I can tell, once Google noindexes a page, they rarely look back. I've tried reindexing pages, and it's much, much harder than deindexing them in the first place. So maybe first analyze your backlink profile carefully, so you don't deindex any of the pages with good links pointing in.

1script
msg:4582349 · 5:04 pm on Jun 8, 2013 (gmt 0)

@rango: the way I understood the low-quality comments by JohnMu, and the entire discussion around the subject, is not so much that the low-quality pages don't rank well themselves - I assume Google already knew how to prevent them from ranking long before Panda 21.5.

I understood the problem as: a certain amount of low-quality UGC (perhaps as a percentage of the total indexed) brings THE ENTIRE SITE down, no matter what quality the other pages are. The path he suggested seems logical, but it could only work if Google ran the same type of update again after you've noindexed the low-quality UGC. I have a very dim view of our chances of ever seeing this one-off update run again. I don't believe it achieved the result they wanted, and therefore there would be no reason to run it again. For all we know, they may have simply abandoned the entire branch of anti-spam research that was aimed specifically at UGC sites.

I would add that if a UGC page has attracted ANY outside links whatsoever (in a natural way, of course), then it's likely by definition no longer low quality, no matter what the page actually says. So there would be no reason to noindex it in the first place.

hitchhiker
msg:4582357 · 5:55 pm on Jun 8, 2013 (gmt 0)

I've heard from a lot of webmasters now. It seems to have affected a great many sites that had, as of Nov 15th, been authorities - most of which had benefited from all the Panda/Penguin updates.

There's a hopeful consensus that it's a **misidentification** of some sort, muddied by the fact that many of us had not bothered with micro SEO for a long time (myself included).

I think we got too optimistic in the GoogleGuy days of "focus on content"!

Google didn't comment beyond the general guidelines. I'm trying another channel, I'll let you know how it goes.

I also learnt: complaining about your traffic dropping to 120k visits a day attracts a lot of trolling. We got treated like a bunch of whiny douches; they don't realise it took 10+ years of solid work to build.

rango
msg:4582421 · 12:35 am on Jun 9, 2013 (gmt 0)

@1script - yes, that's the way I understood it as well. We noindexed millions of pages that didn't need to be in the index. For example, we have ~2 million photos uploaded, and each naturally had its own page. Most of these are now noindexed, apart from the ~20,000 that have been manually featured over the years. Similar story with our forum - we noindexed all threads with fewer than 2 replies. We cut the number of pages in the index from several million down to a few hundred thousand. Maybe we need to cut even more for it to have a positive effect, because so far the only effect has been even worse rankings than before. I am very wary of noindexing much more, given the deleterious effects so far.

I think we need to be careful to only noindex pages that don't have any valuable inbound links. That means running lots of reports through SEO tools to really understand the thousands of inbound links we have and to ensure they aren't lost to 404'd pages and so on. Yes, it's doing things purely for SEO, but really that is the outcome of Google's changes over the past 2 years. Focusing just on your content / features, etc. does not work anymore.
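
A minimal sketch of the check rango describes, assuming Python, a backlink export CSV with a target_url column, and a plain-text list of candidate URLs (file names and column layout are hypothetical, not from any specific tool); only pages with no known inbound links survive the filter:

import csv

# Hypothetical inputs: a backlink export with a "target_url" column,
# and a plain-text list of pages being considered for noindexing.
BACKLINKS_CSV = "backlinks_export.csv"
CANDIDATES_TXT = "noindex_candidates.txt"

def linked_targets(path):
    # Collect every page on the site that has at least one inbound link.
    with open(path, newline="") as f:
        return {row["target_url"].strip() for row in csv.DictReader(f)}

def safe_to_noindex(candidates_path, backlinks_path):
    linked = linked_targets(backlinks_path)
    with open(candidates_path) as f:
        candidates = [line.strip() for line in f if line.strip()]
    # Keep only candidates that no external page links to.
    return [url for url in candidates if url not in linked]

if __name__ == "__main__":
    for url in safe_to_noindex(CANDIDATES_TXT, BACKLINKS_CSV):
        print(url)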

hitchhiker
msg:4582623 · 11:58 pm on Jun 9, 2013 (gmt 0)

"Focusing just on your content / features, etc. does not work anymore."

Indisputable now. I've been talking to so many of the people hit by this. We may have fallen for the 'content over SEO' line a little too easily.

Ironically, I recently saw a site with 0 backlinks, 0 content and 1 empty G+ profile hit the top 10 on an extremely competitive phrase: [webmasterworld.com...]

- Maybe that's what they meant? 0 SEO, 0 Content - we just got mixed up. /s (sorry, being smug now)

1script
msg:4582829 · 3:54 pm on Jun 10, 2013 (gmt 0)

Re: evolution of this thread.

I have been checking in here from time to time and always jump to the last message posted, for expediency. But it was quite illuminating to go back to the beginning of the thread and see how this all started.

One thing that sticks out is that Sgt_Kickaxe (the OP) hasn't posted here since Dec 24th, 2012. I wonder if he has a success story to tell by now.

Another thing is that it was originally thought the update targeted only individual pages. My own stats are inconclusive - I never had trophy KWs I could watch for a measurable drop. In my case most of the traffic loss came from a further reduction in the total number of long-tail KWs.

So, even though JohnMu's comments in the Google product forum imply that Panda 21.5 works like a penalty - noindex the bad pages and you may help the good ones - it looks like that does not really work in this case. If you noindex more pages, all you end up with is less Bing traffic; Google traffic to the other pages is not affected. Panda 21.5 appears to be simply an event in which the "good content" bar was raised significantly, and plenty of UGC, perhaps already borderline, fell below it.

Sgt_Kickaxe's initial assessment that the update destroyed traffic to "medium quality" content appears to be correct; it's just that "medium quality" is the new "low quality". Given that there's still plenty of low-quality content in the SERPs, I think Google simply took the easy road and labeled much of UGC "low quality". Since they've long had the ability to identify forums (the "x posts by y authors" note in SERPs has existed since at least 2009), they probably decided to give the low-quality (less text?) filter more weight there.

Images were initially thought to have played a part, but I think it was the lack of text on the image pages that mattered, not the images themselves. Perhaps at least part of this update was aimed at controlling the explosive growth of Pinterest, and the other UGC sites just became collateral damage? That might even explain why the update was not announced. How's that for a conspiracy theory?

Anything else I missed comparing the initial response to the update to what we know/suspect now? It would be awesome if some of the early posters in this thread chimed in again!

mamiakimo
msg:4582863 · 5:22 pm on Jun 10, 2013 (gmt 0)

From Nov 17th until now I have lost visitors every month - I've gone from 75K unique visitors per day down to 18K now in June.
Five days ago I moved my forum from forum.example.com to forum(s).example.com using a 301 redirect, for various reasons.

After 3 days my visitors increased to 33K per day.
I know the penalty will follow within 3 weeks and I will drop again.
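
For reference, the mechanics of a move like that are simple: every request to the old subdomain gets a permanent (301) redirect to the same path on the new host. A minimal sketch in Python (hostname and port are hypothetical; in practice this is normally done in the web server's config rather than in application code):

from http.server import BaseHTTPRequestHandler, HTTPServer

NEW_HOST = "forums.example.com"  # hypothetical new subdomain

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 = permanent redirect; keep the original path and query string
        # so every old URL maps to its counterpart on the new host.
        self.send_response(301)
        self.send_header("Location", "https://%s%s" % (NEW_HOST, self.path))
        self.end_headers()

if __name__ == "__main__":
    # Would be served wherever DNS for the old subdomain points.
    HTTPServer(("", 8080), RedirectHandler).serve_forever()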

Bigwebmaster
msg:4583014 · 4:23 am on Jun 11, 2013 (gmt 0)

That is interesting, mamiakimo. So your traffic doubled within five days of 301 redirecting to a new subdomain? It would be interesting to see whether that temporarily resolves things for anybody else - and how long it lasts.

rango
msg:4583015 · 4:29 am on Jun 11, 2013 (gmt 0)

I've heard quite a few reports that changing domains brought recovery from Panda/Penguin-related problems. A ridiculously drastic step if you're a largish site, though.

morpheus83
msg:4583017 · 5:08 am on Jun 11, 2013 (gmt 0)

Google does treat www.mywebsite.com and mywebsite.com as two separate websites, so that is something that can be tried.

n00b1
msg:4583021 · 7:17 am on Jun 11, 2013 (gmt 0)

Actually, there are algorithmic associations between the two, morpheus83. And indeed, manual penalties applied to one can also affect the other, even if it isn't redirecting. I know this from bitter experience - and from asking John Mueller along the way.

morpheus83
msg:4583022 · 7:34 am on Jun 11, 2013 (gmt 0)

@n00b1 I did try that and saw a minor bounce in traffic, but it is back to the post-Penguin/Panda levels now. I submitted a reconsideration request, and none of my websites has a manual penalty. They are purely algorithmic penalties.

hitchhiker
msg:4583040 · 9:00 am on Jun 11, 2013 (gmt 0)

@mamiakimo can you tell us about your forum? Did you do any heavy SEO? Was there any 'black hat' at any point? Did you benefit from the updates BEFORE 'Ghost 21.5'? How old is the site?

mamiakimo
msg:4583087 · 12:59 pm on Jun 11, 2013 (gmt 0)

@hitchhiker please read about my site here:
[webmasterworld.com...]

I need to mention something: before the 301 redirect I made a change - I deleted the duplicated titles from my threads.

When I used vBulletin in the past, it had an option that duplicated the thread's title in each post as "RE: TITLE".

When I moved to XenForo, it copied this title and added it to the beginning of each post in a <B></B> tag. Because each thread page holds 20 posts, the title was duplicated 20 times on each page, so keyword density was sometimes more than 20%.

I deleted all these duplicated titles from the 135K posts in my forum using a regex.

I know a Google penalty needs 2 to 3 weeks to follow a move, so I am not on the safe side yet.
I will keep you posted on any changes.
Excuse my bad English!
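
As a rough illustration of the cleanup mamiakimo describes - stripping a bolded, duplicated thread title from the start of every post body - a regex pass might look like the sketch below (the exact markup XenForo produced is an assumption):

import re

def strip_duplicated_title(post_html, thread_title):
    # Remove a leading '<b>RE: {title}</b>' block that the forum
    # software prepended to each post body (assumed format).
    pattern = re.compile(
        r"^\s*<b>\s*RE:\s*" + re.escape(thread_title) + r"\s*</b>\s*",
        re.IGNORECASE,
    )
    return pattern.sub("", post_html, count=1)

# Example: one post in a thread titled "Blue Widgets"
post = "<B>RE: Blue Widgets</B> I agree with the previous poster..."
print(strip_duplicated_title(post, "Blue Widgets"))
# -> "I agree with the previous poster..."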

bumpski
msg:4583984 · 9:03 pm on Jun 13, 2013 (gmt 0)

Recovery
- The one common denominator on all affected pages was medium-quality content. The pages I "beefed up" from Nov 17th to 21st have already recovered (as of yesterday).


I realize this is a pretty old thread, but I wondered, if Sgt_K is still around, whether he ever looked at "Search tools" > "All results" > "Reading level" before and after successfully improving his pages.

Some posts have indicated that having fewer pages in the "Advanced" reading-level category actually reduces the impact of Panda.

I've wondered whether Google has come up with approved "distributions" of reading levels, such that your site loses ranking if it falls outside those boundaries.
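
Google never documented how its reading-level buckets were computed, but classic readability formulas are one plausible proxy for experimenting with this idea. A sketch that buckets pages by Flesch reading-ease score - the cutoffs below are arbitrary guesses, not Google's:

import re

def count_syllables(word):
    # Crude heuristic: count vowel groups.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    # Standard Flesch formula:
    # 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

def bucket(score):
    # Arbitrary cutoffs for illustration only.
    if score >= 70:
        return "basic"
    if score >= 50:
        return "intermediate"
    return "advanced"

pages = {"/thread-1": "Short casual chat. Easy words here.",
         "/whitepaper": "Comprehensive multidimensional heterogeneity analyses follow."}
for url, text in pages.items():
    print(url, bucket(flesch_reading_ease(text)))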

1script
msg:4583990 · 9:37 pm on Jun 13, 2013 (gmt 0)

@bumpski: thanks for the tip about "reading levels". It never occurred to me that they could have an impact. I would assume most "normal" users don't know anything about reading levels either.

The results of a simple test I just did are rather intriguing: I find my page ranking at #6 when I select "all reading levels", then at #3 if I select "basic", and a *different* page from my site ranks at #97 if I select "intermediate" (which is what most results fall under). None of my pages rank in "advanced".

In other words, it would pay to rank high in "intermediate", which appears to be the default for most people - but how do you get there?

My page that ranks high in "basic" is not all that basic per se. It's got about 14K of non-template content, all very relevant to the query. Oh, and the funny part is that it ranks behind an EMPTY LinkedIn profile (#2) and a three-photo page (no text) from the manufacturer that should rank for the KW because of the brand (#1).

My page that ranks a disappointing #97 in "intermediate" has more non-template content (15K), but if anything it's LESS relevant than the "basic" one. That's a head-scratcher...

Do you think they push UGC down in those "reading levels"?

hitchhiker
msg:4583994 · 9:54 pm on Jun 13, 2013 (gmt 0)

Wow, you're onto something here.

I switched up to 'advanced' and the 'blank' site (mentioned elsewhere) ranked #2.
Our English grammar site (grammar and English experts/teachers) is nowhere to be seen.

UGC, in general, is casual grammar. I'm betting somebody did not think this through :(

It would follow from this initial observation (the blank site) that the filter works by finding mistakes (rather than by finding correct grammar). Therefore, the less content you have, the higher you'll be scored by it.

Is this being used as a signal?

1script
msg:4584001 · 10:30 pm on Jun 13, 2013 (gmt 0)

@hitchhiker: which set of results do you mostly rank in? I'm finding mine mostly in "basic", only sometimes in "intermediate", and never in "advanced".

Also, it looks like my initial assumption that most people default to "intermediate" was wrong. The default seems to be "all types" - or is it? Is there a definitive answer on that somewhere?

I also can't figure out what weights "basic", "intermediate" and "advanced" are given when they mix on one results page, if that even applies. Does ranking #5 in "intermediate" put you above #5 in "basic"?

hitchhiker
msg:4584003 · 10:55 pm on Jun 13, 2013 (gmt 0)

re: default - it seems to be set to basic plus a fraction. Maybe that's what they're dialing up gradually.

Beyond basic, most threads on my site disappear -> the blank site survives all the way.

I should point out that all forums seem to 'disappear' (out of the first 2 pages) beyond 'basic'. (A limited search, obviously, but there are none anywhere to be seen.)

1script
msg:4584004 · 11:17 pm on Jun 13, 2013 (gmt 0)

default - It would seem to be set on basic + a fraction.

Intermediate seems to have 60%+ of the results every time. Why would they intentionally eliminate two-thirds of the good results? It's safe to assume almost no one ever touches the "reading level" setting, so in the vast majority of searches it's on the default - surely that must be whatever produces the most results?

Hey, hitchhiker, that could be one of the 10 questions for Google you talked about in another thread: what is the default setting for reading levels?

rango
msg:4584010 · 12:04 am on Jun 14, 2013 (gmt 0)

I can't really make sense of the reading levels. I mapped them all out for a test query

33% Basic
47% Intermediate
20% Advanced

That adds up nicely to 100%, so results should only ever appear in one category, right?

So the top 10 results with no reading level set mapped this way:

1 -> Appears in Basic + Intermediate at no.1 in both
2 -> No appearance in any category on the first page
3 -> Basic no.2
4 -> No appearance
5 -> Basic no.3
6 -> Basic no.4
7 -> Intermediate no.2
8 -> Basic no.5
9 -> No appearance
10 -> No appearance
11 -> No appearance (yes, there were 11 results)

What I'm most intrigued by is how #s 2, 4, 9, 10 & 11 made an appearance in the main rankings without ranking in any of the reading-level-specific results.

Perhaps these are results where the content matters less and inbound links are what's putting them in the top 10?

Results will vary for different terms too of course. I'm sure others are more slanted towards intermediate, etc.

rango
msg:4584022 · 2:22 am on Jun 14, 2013 (gmt 0)

Another quick question for others in this situation. I'm investigating some of our analytics from the days prior to Panda updates. Often there is a traffic bump, and then, when the update really kicks in, traffic goes back to normal or worse. There's been some speculation that Google sends a flood of traffic to test a few of their metrics (bounce rate / time on site).

So I thought I'd look at those bumps and see which areas of the site are specifically being "tested" by Google on those days. In particular, our forums seem to come off badly in the metrics: time on site is only about 1 minute for traffic landing there (compared to more like 5 minutes in other areas), and pages/visit is also well below the site average.

What kind of time on site do you all see specifically for traffic coming in from Google?

This line of thinking does lend some credibility to the idea that culling pages with high bounce rates from the index can help overall rankings. You don't want to be dragging down the average, after all.
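
One way to run the comparison described above, assuming an analytics export as a CSV with landing_page, source, and session_seconds columns (names hypothetical): group Google-referred sessions by top-level site section and compare average time on site.

import csv
from collections import defaultdict

# Hypothetical export: columns landing_page, source, session_seconds
EXPORT = "sessions.csv"

def section_of(path):
    # Map a landing URL to a top-level section, e.g. /forums/t/123 -> forums
    parts = path.strip("/").split("/")
    return parts[0] if parts and parts[0] else "home"

totals = defaultdict(lambda: [0.0, 0])  # section -> [total_seconds, sessions]
with open(EXPORT, newline="") as f:
    for row in csv.DictReader(f):
        if row["source"] != "google":  # Google organic sessions only
            continue
        sec = section_of(row["landing_page"])
        totals[sec][0] += float(row["session_seconds"])
        totals[sec][1] += 1

for section, (seconds, n) in sorted(totals.items()):
    print("%s: %.0fs average over %d sessions" % (section, seconds / n, n))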

1script
msg:4584029 · 3:22 am on Jun 14, 2013 (gmt 0)

So I thought I'd look at those bumps and see which areas of the site are specifically being "tested" by Google on those days. In particular, our forums seem to come off badly in the metrics: time on site is only about 1 minute for traffic landing there (compared to more like 5 minutes in other areas), and pages/visit is also well below the site average.

Forums specifically have a hard time absorbing traffic bumps, not least because any meaningful engagement requires the visitor to register (if they haven't already). If they are still finding the forum via Google, it's almost guaranteed they aren't registered yet. So a forum site can only grow slowly over time, regardless of any sudden traffic increases. If Google's engineers spent any time at all analyzing the results of such stress tests, I think they would find that they don't work well on forums.


What kind of time on site do you all see specifically for traffic coming in from Google?

You should be proud of your time on site. Mine averages only 24 seconds. I have obsessed about it before, but it has been stable like that for years, all through the various Pandas (except 21.5, of course). Besides, I find I can get in and out of most sites these days in under 10 seconds myself. A 5-minute visit is incredibly long these days - what are they doing, falling asleep mid-page?

But Google's traffic is the worst among the major search engines, averaging under 20 seconds. The only traffic worse than Google's is from the large ISP-affiliated search engines - AOL, Comcast, Verizon and the like, each powered by a different major SE.

rango
msg:4584033 · 4:31 am on Jun 14, 2013 (gmt 0)

5-minute visits are typically on the more transactional pages - people are looking through listings, picking options, looking at photos, etc. before making a decision or leaving. It seems a good amount of time on site to me, but I don't think it's much more than average for that category. I think Google has a pretty good idea of how quick is too quick for a bounce back to the search results, and I imagine that number varies per query (or category of queries). I know from speaking to a friend who is a former Bing engineer that this kind of logic was definitely being worked on and implemented over there, as he was the one doing it. I don't see any reason G wouldn't do the same.

But of course, improving your time-on-site / bounce-rate stats is really quite tricky for some terms. I get traffic on some pretty random phrases where we are clearly not the type of site the searcher is after. Naturally these have a high bounce rate. I wonder whether I should manually noindex these somewhat non-ideal forum discussions to avoid them skewing things.

mamiakimo
msg:4584052 · 5:49 am on Jun 14, 2013 (gmt 0)

Google search sends people to your site but can't tell exactly what they do there or how long they stay.

To know that, Google would need code on your pages, like Google Analytics or similar services.

morpheus83
msg:4584054 · 5:56 am on Jun 14, 2013 (gmt 0)

I have been talking to a lot of webmasters affected by the November 16 update. A couple of points:
1) The drop in traffic has been nearly identical across most of them: the websites have lost close to 50-60% of their traffic since the update.
2) This is something I feel is very important: all of the websites, including mine, are more than 5 years old. While we concentrated on Google's mantra of writing good content, trusting that links and traffic would follow, we missed the scrapers. The older the site, the more its content gets scraped. Some of my articles have been copied hundreds of times over. Could it be that Google found so many identical copies of the content across websites that it dropped the authority of the originals?

We are in the process of identifying scrapers and asking them to remove the content, and disavowing them if they don't. I would be glad to share the results in the months to come.

Has anyone else tried this approach?
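
For the last step morpheus83 mentions, the disavow upload is just a plain-text file in Google's documented format: "#" starts a comment line, and "domain:example.com" disavows a whole host. A small sketch, with the input file name assumed:

# Build a disavow.txt in the format Google's disavow tool accepts.
SCRAPERS = "scraper_domains.txt"   # hypothetical: one domain per line

with open(SCRAPERS) as src, open("disavow.txt", "w") as out:
    out.write("# Scraper domains that did not respond to removal requests\n")
    for line in src:
        domain = line.strip()
        if domain:
            out.write("domain:%s\n" % domain)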

rango
msg:4584066 · 6:25 am on Jun 14, 2013 (gmt 0)

@mamiakimo they don't know your exact time on site, but if the user returns to Google and clicks on the next result below yours, they have a reasonable idea that your result wasn't what the user was after.

n00b1
msg:4584074 · 6:37 am on Jun 14, 2013 (gmt 0)

@rango

It isn't that simple. In some circumstances it's natural user behaviour to want information from multiple sources. If they're reading reviews, for example, they'll likely want more than one opinion. There are many instances where clicking back to the results page is perfectly natural and not at all an indicator that the user didn't find the first result useful.

John Mueller has said time and time again that that sort of signal isn't used by Google. I know that's hard to take in for the many here who are absolutely obsessed with user metrics, but it's just how it is.

rango
msg:4584127 · 10:03 am on Jun 14, 2013 (gmt 0)

@n00b1 Of course it's more complicated. I'm sure if they do something like this, it's applied differently for different kinds of queries.

The pre-update flood of traffic has been observed by quite a few people.

Explain to me why Google would send a flood of traffic prior to an update if they aren't using it for metrics. I'm not talking about them snooping on Analytics or using Chrome data - I'm talking about them using *their own* site metrics to work out searcher satisfaction.
