"Does anyone know if this thing can be bypassed by aquiring more/better incoming links?"
I doubt it, BUT links will help you once this is solved. Google thinks that your site (and mine) is a dupe and has penalized it. My site had hundreds more links than the redirecting one, and still...
I don't know if this is going to go away... it seems that Google has developed yet another penchant for freezing a site's exposure.
Some recent ranking rules:
* be old
* be unique
query: site:www.sprockets.com
results with &filter=1: 200 pages
results with &filter=0: 500 pages
query: site:www.sprockets.com +sprockets
results with &filter=1: 500 pages
results with &filter=0: 500 pages
Since the keyword "sprockets" is on each page of my site - you'd think the results would be the same.
I've tried similar tests on other sites and found varying results.
Not sure exactly what this means. It seems what Google considers duplicate content is dependent on the query?
Maybe Google thinks that all the pages on www.sprockets.com are so much "about" sprockets that the filter has less of an effect?
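If anyone wants to repeat this kind of test on their own domain, here's a rough sketch of a script for it. The &filter and &num parameters are the ones discussed in this thread; the rest is assumption: the "of about N" text it scrapes for is a guess at Google's current result-page HTML, and www.sprockets.com is just the example domain from above.

```python
import re
import urllib.parse
import urllib.request

def result_count(query, filter_value):
    # Build the query URL; filter=0 turns the duplicate filter off.
    params = urllib.parse.urlencode({"q": query, "filter": filter_value, "num": 100})
    req = urllib.request.Request(
        "http://www.google.com/search?" + params,
        headers={"User-Agent": "Mozilla/5.0"},  # bare urllib tends to get blocked
    )
    html = urllib.request.urlopen(req).read().decode("utf-8", "replace")
    # Guess: the page reports a total as "of about 1,230" -- not a stable API.
    match = re.search(r"of about ([\d,]+)", html)
    return int(match.group(1).replace(",", "")) if match else None

for query in ("site:www.sprockets.com", "site:www.sprockets.com +sprockets"):
    print(query)
    print("  filter=1:", result_count(query, 1))
    print("  filter=0:", result_count(query, 0))
```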
Can someone tell me why, for a major search term for my site, I am at #38 with &filter=0, and results #1 through #5 with &filter=0 are all the same site? I would think that my site would be pushed further out with the filter removed. Please explain, someone.
Folks are keeping quiet on this one. If anyone knows what &filter=0 does and how to fix sites that got whacked, please sticky me, as it's causing a lot of grief at the moment.
Sticky me too if anyone can help!
Has anyone managed to regain positions without &filter=0? If so, what, if anything, did you change?
I've tried a number of things including removing duplicate content without seeing any improvements.
"Sticky me too if anyone can help"
If anyone can help, please don't sticky: post it here for all.
A suggestion on this page [webmasterworld.com...]
Well, what I heard is that you need to replace your highly competitive keywords with nonsense ones and your site will reappear in the same spot. I haven't tried this yet.
Although I'm not convinced, if G did rely mainly on incoming anchor text and put a penalty in place for over-optimisation, maybe it would work. Has anyone tried it?
More and more of my stuff is getting nailed by this. Stuff with 100% original content as well...pisses me off
I'm starting to think this is more to do with losing important (authority?) links than duplicate content.
Another one of my sites has disappeared unless &filter=0 is added; this site has recently lost a link from the BBC website.
Has anyone else whose site ranks well with &filter=0 but not without it recently lost important links?
If you lose important links you will lose rank quite apart from &filter=0 issues.
"Another one of my sites has disappeared unless &filter=0 is added, this site has recently lost a link from the BBC website."
Coincidence. Search for a sentence from your index page and see what comes up.
Sorry you are right, false alarm.
Has anyone who is experiencing these problems found any pages coming back into G and ranking well without filter=0 being added?
In a last-ditch attempt, I have emailed Google today explaining what I am seeing and asking whether it is normal or not. Should I receive a response, I will of course post it here.
I'm not sure why you are all suddenly taking a look at &filter=0, as that functionality has been in the search results for 2 or 3 years now. It weeds out near duplicates and limits results to two pages per domain (moving one up and indenting it underneath the first result if the second result was somewhere on the same results page). You might also want to compare results with 10 results per page and with 100 results per page (that's with &num=100 in the URL).
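To make that "two pages per domain" behaviour concrete, here is a toy sketch of host crowding as I read the description above. It is emphatically not Google's actual code, and it simplifies one detail: it always pulls the second result up under the first, whereas the real thing only does that when both results land on the same results page.

```python
def host_crowd(results, max_per_domain=2):
    """Toy host-crowding filter: keep at most `max_per_domain` results
    per domain; the second is pulled up and indented under the first."""
    order = []   # domains in order of first appearance
    kept = {}    # domain -> URLs kept for that domain
    for domain, url in results:
        urls = kept.setdefault(domain, [])
        if len(urls) == 0:
            order.append(domain)
        if len(urls) < max_per_domain:
            urls.append(url)
        # else: further pages from this domain are filtered out
    crowded = []
    for domain in order:
        first, *rest = kept[domain]
        crowded.append((domain, first, False))   # normal listing
        for url in rest:                         # at most one when max is 2
            crowded.append((domain, url, True))  # indented under the first
    return crowded

serp = [("a.com", "/1"), ("b.com", "/x"), ("a.com", "/2"),
        ("a.com", "/3"), ("c.com", "/y")]
for rank, (domain, url, indented) in enumerate(host_crowd(serp), 1):
    print(f"{rank}. {'    ' if indented else ''}{domain}{url}")
```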
|I'm not sure why you are all suddenly taking a look at &filter=0 |
Because last month Google's algo shifted a bit, and some of us took some hits. Those lost rankings can be restored to their original positions by adding &filter=0.
For example, a keyword phrase was #1 for several years, but a month ago it fell in the rankings (not completely out, just a sizable drop). However, if &filter=0 is added, it is suddenly #1 again.
The standard two pages per domain (one indented) definition of &filter=0 doesn't seem to apply to this particular situation, so some of us are trying to figure out what ELSE it may be doing that would account for what we are seeing.
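For anyone wanting to compare for themselves, here is the sort of rough check I use to find a phrase's position with and without the filter. Like the counting script earlier in the thread, it guesses at the result-page markup (it assumes result links show up as plain absolute hrefs), so the position it reports is approximate at best, and the site and query below are placeholders.

```python
import re
import urllib.parse
import urllib.request

def rank_of(site, query, filter_value, num=100):
    # Fetch one page of up to `num` results with the given filter setting.
    params = urllib.parse.urlencode({"q": query, "num": num, "filter": filter_value})
    req = urllib.request.Request(
        "http://www.google.com/search?" + params,
        headers={"User-Agent": "Mozilla/5.0"},
    )
    html = urllib.request.urlopen(req).read().decode("utf-8", "replace")
    # Crude: every absolute href counts as a "result", so treat the
    # returned position as a rough rank, not an exact one.
    hits = re.findall(r'href="(https?://[^"]+)"', html)
    for position, url in enumerate(hits, 1):
        if site in url:
            return position
    return None

site = "www.example.com"          # placeholder: your domain
query = "your major search term"  # placeholder: your keyword phrase
print("with the filter: #", rank_of(site, query, 1))
print("with &filter=0:  #", rank_of(site, query, 0))
```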
Because it's hurting entire sites. See this: [google.com...]
Lately it seems that G has tightened the dupe filter and didn't care about fixing their bug. Also, with aggressive spidering, more 302 links are getting spidered. Now innocent sites are caught in the middle. This has been happening at least since September, and with every update we see (via postings here) a wave of sites getting caught. Maybe they'll be relaxed after the skiing trip and finally fix this.
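On the 302 point: if you suspect an inbound link is one of these redirects, a quick header check will show you. A minimal sketch using only the standard library; the tracker URL at the bottom is made up, so substitute a real link that points at your site.

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # don't follow redirects; we only want status + Location

opener = urllib.request.build_opener(NoRedirect)

def check(url):
    try:
        resp = opener.open(url)
        print(url, "->", resp.status, "(no redirect)")
    except urllib.error.HTTPError as err:
        # Returning None above makes urllib raise the 3xx as an error.
        if 300 <= err.code < 400:
            print(url, "->", err.code, "Location:", err.headers.get("Location"))
        else:
            print(url, "->", err.code)

check("http://tracker.example.com/out?id=123")  # hypothetical inbound link
```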
"I'm not sure why you are all suddenly taking a look at &filter=0, as that functionality has been in the search results for 2 or 3 years now. It weeds out near duplicates, and limits results to two pages per domain (moving one up and identing it underneath the first result if the second result was somewhere on the same results page). You might also want to compare results with 10 results per page and with 100 results per page (that's with &num=100 in the URL) too. "