60% drop of Google traffic overnight

         

vladowsky

11:54 am on Nov 10, 2013 (gmt 0)

10+ Year Member



Hi everybody, I want to ask for opinions and help from the WebmasterWorld community.

My site was hit with some sort of duplicate content penalty on 31.10. It seems I ended up in the supplementary index for my top keywords (though not all of them), and I'm not sure what the cause is. My traffic declined by over 60%, but the pages are still indexed.

The theme of the site is alternative. I worked hard to provide good content, developed a few great scripts, and paid for some SEO services that included some link building. The site looks and feels professional, we have to show company info because of the law, we offer live services, there is a lot of good related supplementary content, etc.

However, the script produced several duplicate title tag errors in WMT because of a "?parameter" query string.
I fixed this by telling Google to ignore that parameter (via the URL Parameters setting), manually removed all the links that were indexed, and added nofollow and noindex to the script folder.
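
Just to illustrate how I did the noindex/nofollow part (the folder name and URLs are placeholders, not my real paths): each page served from the script folder now gets a robots meta tag in its <head>:

    <meta name="robots" content="noindex, nofollow">

The same thing can be done for a whole folder by dropping an .htaccess file into it with Header set X-Robots-Tag "noindex, nofollow" (needs mod_headers), if editing the script output isn't practical.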

I also added a few canonical tags to similar pages - the script creates different content but all on the same page, so I made a page for each category.
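
For example (example.com and the path are placeholders), each category page now points to its preferred version with a tag like this in the <head>:

    <link rel="canonical" href="http://www.example.com/category/widgets/">
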
On top of all that, I discovered that some competitors linked to us from beneath a 100% iframe, which is known to be a bad practice, so I disavowed those links too.
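
For the disavow I used the plain text format the Disavow Links tool expects - one URL or domain: entry per line, with # for comments (these are made-up examples, not the real domains):

    # sites linking to us from under the 100% iframe
    domain:iframe-scraper-example.com
    http://another-example.com/page-with-the-iframe.html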

Which of all of this would cause an algorithmic penalty (there is no message in WMT)? All of them combined? Are some errors more severe than others? Was it Panda, or Penguin, or what?
How long until recovery, based on my fixes?

<snip>

Thank you in advance

[edited by: aakk9999 at 12:10 pm (utc) on Nov 10, 2013]
[edit reason] Forum Charter [/edit]

turbocharged

2:10 pm on Nov 10, 2013 (gmt 0)



I worked hard to provide good content, developed a few great scripts, and paid for some SEO services that included some link building.

The links your SEO services built to your site are probably the problem, since Google does a good job of handling parameters IMO. Keep in mind that Google indexes "content" and ranks this "content" from "signals". It does not matter how great your content is: if the signals Google analyzes are garbage, then Google will put your site in the dumpster.

FranticFish

3:26 pm on Nov 10, 2013 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



developed a few great scripts and paid for some SEO services


If your scripts were distributed widely and included links back to your site, even on your brand name, that could be an issue. Google rep Matt Cutts has said that he wants people to 'nofollow' links in widgets [webmasterworld.com...] (they don't do that with their own ones but, hey, it's do as they say not as they do in Google's world).
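
If the widget's embed code carries a link back to you, the usual fix is adding rel="nofollow" to it - roughly like this (the URL and anchor text are placeholders):

    <a href="http://www.example.com/" rel="nofollow">Example Widget by Example Site</a>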

"some SEO services"
Have you had a link audit? There is a free tool here that could be a good starting point for this - [backlinks.webmasterworld.com...]

vladowsky

3:35 pm on Nov 10, 2013 (gmt 0)

10+ Year Member



Hi, thanks for the replies.

@turbocharged - indeed, but some signals might be messed up because of that parameter. I suspect that because I found some weird URLs (also in the duplicate title tags section) that Googlebot apparently creates to stress-test the site - it combines two links into one non-existent URL, but instead of a 404 it returned some weird broken mashup of a page.

@FranticFish - indeed, I have a widget and it is becoming popular. I will definitely add nofollow.

I checked the links with some tools; that is where I found the ones under the iframe and on sites that are not indexed, and I removed them... maybe that was not enough...

JD_Toims

4:05 pm on Nov 10, 2013 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My site was hit with some sort of duplicate content penalty on 31.10.

There's no such thing as a duplicate content penalty from Google. URLs with essentially the same content are grouped together and they show the one they think is algorithmically best.

On top of all that, I discovered that some competitors linked to us from beneath a 100% iframe, which is known to be a bad practice.

Since when is using an iFrame bad practice?

...paid for some SEO services that included some link building.

That's most likely the problem, imo.

vladowsky

4:19 pm on Nov 10, 2013 (gmt 0)

10+ Year Member



@JD_Toims - it sure feels like a penalty.
A 100% frame falls under the category of showing one thing to crawlers and another to people.
Without those services I was nowhere, despite great content. I used them for some time without any problems. Maybe they have a cumulative effect, or maybe only the newest links are the problem.

JD_Toims

4:23 pm on Nov 10, 2013 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A 100% frame falls under the category of showing one thing to crawlers and another to people.

No it doesn't -- iFrames and their content are available to GoogleBot. How Google decides to treat the content within an iFrame is up to Google -- If you view the source on the site of at least one major PubCon speaker who happens to also be one of the biggest names in the SEO business you'll find plenty of iFrames. They're not a bad practice or cloaking at all.

aakk9999

4:26 pm on Nov 10, 2013 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There's no such thing as a duplicate content penalty from Google. URLs with essentially the same content are grouped together and they show the one they think is algorithmically best.

It is not a penalty as such, but in my experience it has an impact on a site's ranking, and leaking a huge number of duplicate pages can tank the site.

<added>
How Google decides to treat the content within an iFrame is up to Google

We had an interesting case of how Google treated a frame, described in this thread (the page was using a frame rather than an iframe, though):

How is a page ranking with no content except for a frame?
http://www.webmasterworld.com/google/4602577.htm [webmasterworld.com]

</added>

JD_Toims

5:47 pm on Nov 10, 2013 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



...it has an impact on a site's ranking, and leaking a huge number of duplicate pages can tank the site.

And it can propel a site past another in the results -- Think duplicator out-ranking the originator scenarios or a site like CNN republishing an article and out-ranking the source -- One of the biggest obstacles to SEO is the FUD and misinformation spread by none other than the SEO community itself.

Having duplicate content is generally not a best practice, but it's also not a penalty. It's not even harmful in all situations, and in some situations it can be a benefit.

aakk9999

7:57 pm on Nov 10, 2013 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I agree it is not a penalty.

But there is a ranking side-effect from having duplicate pages. Now, if you have an authority site and you copy someone else's page and rank for the term instead of the originator - obviously, you have a winner, as said above:

And it can propel a site past another in the results -- Think duplicator out-ranking the originator scenarios or a site like CNN republishing an article and out-ranking the source


But from the originator's point of view, their page got filtered out due to the authority site duplicating it, hence they lost.

However, in my comment I was referring to duplicate pages within one's own site - perhaps I should have been clearer. My experience is that having duplicate pages in any significant quantity is harmful to the site's ranking.

Why? Not because "you have duplicates, Google will penalise you" (we already agreed it is not a penalty), but probably because having duplicates affects other factors such as PageRank flow, split link juice when both pages are linked internally and externally, the site's crawl budget, and perhaps some other factors. Otherwise, why bother at all with www/non-www (and other canonicalisation) if duplication has no adverse effect?
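
(For anyone reading along: the usual way to collapse the two hostnames is a site-wide 301 - a minimal Apache .htaccess sketch, with example.com standing in for the real domain:

    RewriteEngine On
    # send non-www requests to the www version with a permanent redirect
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The same idea applies the other way round if the non-www hostname is the preferred one.)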

JD_Toims

9:35 pm on Nov 10, 2013 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Definitely agree aakk9999 -- I was mainly trying to "bust the duplicate content penalty myth" that's become so "standard" to hear people talk about, even though duplicate content is not a penalty.

Otherwise, why bother at all with www/non-www (and other canonicalisation) if duplication has no adverse effect?

Also, haven't tested it recently, but Google has gotten much better about "canonicalizing on their own" when there's not a canonical set between the www/non-www variations, so it might not be as necessary as it was at one time.

I think the bigger issues from duplication are likely query_string based in situations where it could algorithmically look like someone put up 100 copies of the same page to increase the in-text links to other internal pages, but that's not really the duplication being "the core issue" as much as it is the appearance of an attempt to manipulate via duplication.
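
E.g. (made-up URLs) a pattern like this, where every variant serves the same page and the same internal links:

    http://www.example.com/widgets.php
    http://www.example.com/widgets.php?sort=price
    http://www.example.com/widgets.php?sort=price&view=list
    http://www.example.com/widgets.php?ref=sidebar

That's the kind of duplication that could start to look like an attempt at manipulation.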

JD_Toims

10:00 pm on Nov 10, 2013 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Interesting -- I just ran a little "let's see if there are any sites in the top 10 for a fairly competitive query that aren't canonicalized between www/non-www" check, figuring I might find one or two, since everyone knows it's critical to canonicalize the two variants these days -- I was definitely surprised and didn't bother looking further when I found #1, #2 and #4 were not only not canonicalized, the number 1 result had both www and non-www available via both http and https.

aakk9999

11:45 pm on Nov 10, 2013 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



^^^ Yes, interesting. My guess is that authoritative/strong sites can get away with it more easily. Mind you, what we do not know is whether, for example, #4 would outrank #3 if they fixed the canonicalisation problem.

It *may* depend on how much split linking exists for the same www/non-www page - perhaps if most inbound/outbound links point to either www or non-www, it is not an issue as there is not much juice being split. But why leave it to chance?

What we do know is that having non-duplicate pages is best practice. How much (self-)damage duplication causes (on a scale from minimal/almost none to seriously damaging site rankings) almost certainly differs from site to site.

the bigger issues from duplication are likely query_string based

I think so too. At least with www/non-www and http/https Google probably knows this is not intentional, whereas with extra parameters/parameters' order it may not be so sure (although I think they are by now wise enough to ignore session ID and similar well-recognised parameters such as utm_source etc).

JD_Toims

3:28 am on Nov 11, 2013 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



One of the "makes me wonder how important it really is today" [wrt www/non-www canonicalization] things I've read is that Google takes the pages listed in an XML Sitemap as a "suggestion" of the canonical version of a page -- When I read between the lines a bit, that tells me they've been looking for a way to decide "what's the canonical version" without being explicitly told for a while.
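
In other words, you list only the version you want treated as canonical - a minimal sitemap entry (example.com is a placeholder) would be:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/widget-category/</loc>
      </url>
    </urlset>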

My gut tells me, canonicalization of www/non-www or http/https is really not "overly important" any more [I'd almost say it's a non-factor these days], because the reality is, a site with both www/non-www being available for visitors really doesn't have a negative impact on their results or on the visitor experience at that site, because AFAIK visitors really don't care if it's www or non-www or if they could visit both. What they care about is finding the information they're looking for.

What actually could make Google's results worse is for them to "not just pick one" and "send the link weight/ranking factors" from the other version to the one they pick as if it were canonicalized, because by not doing that they "give an advantage" to a "search-engine proficient" site over other sites which might have better information for their visitors.

(although I think they are by now wise enough to ignore session ID and similar well-recognised parameters such as utm_source etc).

Yes, they've actually been ignoring query_string parameters containing ID and "common visitor identifiers" for years now.

vladowsky

1:07 pm on Nov 11, 2013 (gmt 0)

10+ Year Member



@aakk9999 & JD_Toims - thank you for the educational and constructive discussion on duplicate content.

I also remembered that I recently added a live chat script for customer support. Maybe such scripts could be the issue?
I researched the backlinks a bit; nothing too scary, however I have several sites where I put some banners without nofollow (in the sidebar). Maybe that was too many sitewide links?
The day before the traffic drop, my AdSense earnings were at an all-time high. Maybe that was a signal too?
Too many maybes, in my opinion, for the exact science of search...