I then saw in a site:domain.com search that another domain was suddenly there with a cache of my front page: there were 302 links, 0-second meta refreshes, and some real hijackers. One of the 302 links even had a disallow to keep Google from spidering it.
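For anyone who hasn't seen one, the meta refresh pages in these hijacks usually carry nothing but a zero-second refresh aimed at the target page (example.com here is just a placeholder for the hijacked site):

<meta http-equiv="refresh" content="0;url=http://www.example.com/">

Combined with a 302 that Google follows, that can leave the other domain's URL showing a cache of your page.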
So now I'm 100% sure the site is filtered because of duplicated content, and it still is because of all those old caches floating around, and maybe scrapers that copy too large a part of the site.
We can all see sites built on pure manipulated links still flourishing
This update is all about 100% non-reciprocal links
Conclusion: in order to beat the undeserving sites that are flourishing on pure manipulated links, I must create web pages on free hosts that feature non-reciprocal links to my network of sites. In other words, I must create pure manipulated links. Fight fire with fire.
I just did a search on the Google datacenter at 72.14.207.99 (the IP a ping of google.com defaulted to) and am now seeing the exact same type of behavior: getting 2 to 12 results, and this keeps changing with each refresh. Sometimes 2 results, sometimes 12. The rest of the 3,130,000 pages are listed as supplemental. Could things be on the move again? It is very strange behavior that I have had a few of you guys check out in the past. Looks like things are moving again, at least for what I keep tabs on...
Results 1 - 12 of about 3,130,000....In order to show you the most relevant results, we have omitted some entries very similar to the 12 already displayed.
If you like, you can repeat the search with the omitted results included.
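For what it's worth, that "repeat the search with the omitted results included" link just adds &filter=0 to the query string, so you can pull up the suppressed pages directly, e.g. (example.com standing in for your own site):

http://www.google.com/search?q=site%3Aexample.com&filter=0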
Whatever happened to ole BradStevens? This guy seems to be a day late and a dollar short... Where was this whiz kid on 9/22? He had hopes of saving the world from G's death grip.
Gosh, and here I thought all the analysis that took place in here since then amongst the smartest webmasters in the world actually meant something. I should have listened to Brad and not "YOU PEOPLE". Come back Brad, Brad, come back... I will never let go man, I promise.
So far I've tried 301ing anything onsite that could be seen as duplicate (www -> non-www, /index.php -> /, switching to absolute URLs, etc.) and thought that on some datacenters Jagger 3 had helped a bit. Then it just seemed to stall after a few pages finally got out of supplemental (not ranking for even the most obscure terms, though). My main page still shows as URL-only (and with the www, despite a 301 in place to non-www).
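For reference, the redirects amount to a few lines of mod_rewrite. This is only a rough sketch, assuming Apache with mod_rewrite in .htaccess and using example.com as a placeholder domain:

RewriteEngine On

# send www.example.com to example.com with a 301
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

# send direct requests for index.php back to the directory root with a 301
# (the THE_REQUEST check avoids a loop when index.php is the DirectoryIndex)
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^?\ ]*/)?index\.php[\ ?]
RewriteRule ^(.*/)?index\.php$ http://example.com/$1 [R=301,L]

Other server setups would need the equivalent done differently, so treat it as a sketch rather than a recipe.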
Finally, yesterday I tried Google Sitemaps (too early to tell if it will make any difference at all) in the hope it might give me a glimpse of what its crawling/indexing problems might be, as well as attempting to block any scrapers I've seen in my logs.
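The blocking part is also just a few lines in .htaccess, something along these lines, with the user-agent names and the IP as placeholders for whatever actually turns up in your own logs:

RewriteEngine On

# refuse requests from known copier user-agents (returns a 403)
RewriteCond %{HTTP_USER_AGENT} (HTTrack|WebCopier|EmailSiphon) [NC]
RewriteRule .* - [F,L]

# or shut out a persistent scraper by IP
Order Allow,Deny
Allow from all
Deny from 203.0.113.45

Of course that only catches scrapers that identify themselves or sit on a fixed IP.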
Has anyone successfully got out of this problem? What do you feel worked, or any advice at all?