Forum Moderators: Robert Charlton & goodroi
I have a software downloads site that was filtered about 3 months ago; traffic dropped by 80%. My PR went from 5 to 3 and many pages lost their PR. While trying to figure out what the problem was, I discovered that I have many duplicate software titles (17,000 of 70,000 were duplicates), mostly because of spamming techniques from publishers/affiliates.
I found a way to group the duplicates and pick one page from each group to keep, and I redirected the other pages in each group to that chosen page.
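For illustration, the grouping step can be sketched in Python; the title normalization rule and the keep-the-shortest-URL choice below are my assumptions, not necessarily what was actually done:

```python
import re
from collections import defaultdict

def normalize(title):
    # Crude normalization: lowercase and strip punctuation (an assumption;
    # real duplicate detection on software titles is usually fuzzier).
    return re.sub(r"[^a-z0-9 ]", "", title.lower()).strip()

def build_redirect_map(pages):
    """pages: {url: software_title}. Returns {duplicate_url: kept_url}."""
    groups = defaultdict(list)
    for url, title in pages.items():
        groups[normalize(title)].append(url)
    redirects = {}
    for urls in groups.values():
        kept = min(urls, key=len)  # keep the shortest URL (arbitrary choice)
        for url in urls:
            if url != kept:
                redirects[url] = kept  # this page gets redirected to `kept`
    return redirects
```

Every key in the returned map is a page that should answer with a redirect to its value.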
My idea to (hopefully) speed up traffic recovery is to give 1.0 priority in the sitemap to the redirected pages and almost 0 priority to the other pages, in order to force Googlebot to crawl the redirected pages first and realize that the duplicates have been eliminated.
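To make that concrete: in the sitemaps.org protocol, priority is a per-`<url>` value between 0.0 and 1.0, so the plan would look roughly like this (hostname and paths are placeholders):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- an old duplicate URL that now redirects: high priority so it gets recrawled soon -->
  <url>
    <loc>http://www.example.com/software/duplicate-title/</loc>
    <priority>1.0</priority>
  </url>
  <!-- a normal page: near-zero priority -->
  <url>
    <loc>http://www.example.com/software/some-title/</loc>
    <priority>0.1</priority>
  </url>
</urlset>
```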
My question is: am I doing the right thing in this situation? I am an experienced programmer (though not a web developer) but a bit of a newbie at SEO, so I would appreciate any advice.
You may have been affected by a human editorial opinion, so if things don't recover, definitely file a Reconsideration Request from your Webmaster Tools account.
I specifically used 301. Thanks for the advice.
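Assuming the site runs on Apache (a guess on my part), a per-URL 301 can be done with a plain Redirect directive in .htaccess; the paths here are placeholders:

```apache
# 301 one duplicate page to the page chosen to remain (example paths)
Redirect 301 /software/duplicate-title.html http://www.example.com/software/kept-title.html
```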
Does anybody have an idea how long it would take until I can see some results, or should I apply for reconsideration instead (although, is that necessary, since my site is not completely out of the index)? I've read around this forum, and the most optimistic estimate was 3-6 weeks...
And another question: could this filter have been applied because of an editor's decision rather than automatically? I read somewhere that only complete bans are handed out by editors, not filters.
Does anybody have any idea of how long it would take until I can see some results or otherwise apply for reconsideration
Google reps have commented that it can take "a few weeks" for new 301s to be stably factored in. Since you're introducing many at once, I'd assume that pushes the time frame to the max.
is this necessary since my site is not completely out of index
The request had its name changed from "Reinclusion" to "Reconsideration" for just this reason: sometimes a site is stuck with lowered rankings because of a "flag" that a human needs to remove.
However, several months is a long time. I'd suggest taking a hard look at the current url situation and making sure there are no other unresolved technical issues with your urls. Other possible issues:
1. www versus no www
2. https versus http
3. index.html versus folder root
4. query strings that are not required, including any user specific tracking in the url
5. switching the order of query string parameters
6. double slashes between directory names resolving as well as a single slash
7. if you use a rewrite scheme that keys off a number in the url and your urls are "fluffed up" with added keywords, make sure that an old keyword doesn't resolve to the same content just because the number is there
8. mixed case urls can be trouble
And there's definitely more. The rule is that no two urls should resolve to the same content. Two urls are different if they are not an EXACT match, character for character. So stress test your url patterns, and plug any holes you can.
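Several of the items in that list can be stress-tested (or enforced before serving a redirect) with a small canonicalizer. A sketch, where the tracking-parameter names are assumptions:

```python
import re
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameter names that should never create a "different" URL (assumed examples).
TRACKING_PARAMS = {"sessionid", "ref", "utm_source"}

def canonicalize(url):
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower()
    if not netloc.startswith("www."):           # item 1: www vs no-www
        netloc = "www." + netloc
    path = re.sub(r"/{2,}", "/", path.lower())  # items 6 and 8: double slashes, mixed case
    if path.endswith("/index.html"):            # item 3: index.html vs folder root
        path = path[: -len("index.html")]
    # items 4 and 5: drop tracking params, sort the rest for a stable order
    pairs = sorted((k, v) for k, v in parse_qsl(query)
                   if k.lower() not in TRACKING_PARAMS)
    # item 2: settle on one scheme (http assumed here)
    return urlunsplit(("http", netloc, path, urlencode(pairs), ""))
```

If `canonicalize(url) != url`, the server should 301 to the canonical form; two urls that canonicalize identically are duplicates by the character-for-character rule.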
I redirected all my sites with 301, and the site escaped the penalty one month later, but after about 3-6 months Google caught my site again, so I've been caught 3 times in 1 1/2 years. I think Google inherits the penalty from the old site onto the new site, or something like that. If you redirect 17,000 pages at once, you could be in trouble with Google again.
Building one often-cached link to each of the old pages that are now 301'ed can substantially decrease the time it takes for the redirect to catch and hold.
If those pages have RSS feeds, you might have a linking angle there, if it hasn't already been used - a way to build some links to those pages.