Welcome to WebmasterWorld Guest from 22.214.171.124
Cross linking your own domains should trigger a filter too, because it means artificially increasing link popularity.
In other words they should still be indexed but GG would eventually filter them for competitive KW's.
Also...don't trust GG backlinks...use MSN, because it will give a better representation of a site's actual LP status.
I'm still waiting for my cloaking specialists in europe to be penalized though!
Other than that, I have submitted spam reports on two other cheaters: one using hidden text, the other domain hijacking, and they have been penalized (PR back to 0). So well done, Mr Google: they were cheating and the penalty came fast!
Hopefully this will clean up the SERPs. I would suggest submitting feedback to MSN and Yahoo as well. We deserve clean SERPs, no cheaters!
Just wondering whether anybody's had the same experience as me as a result of Jagger.
My site has both static and dynamic content, both were doing great for their relevant areas before the update started.
Static content (100+) pages, fine, no problems, stayed where they are, maybe moved one or two places, but nothing that is going to lose me sleep.
Dynamic content has gone...nowhere to be seen. That's 300,000 + pages that have just disappeared, still being crawled, but not ranked..[scratches head] I'm just hoping that when everything settles down they come back..they better or I might be in trouble..
I can only hope that 126.96.36.199 spreads and not the others.
Kinda weird month: started really low, with 1/3 of my traffic, which rose to extreme heights and is now dropping again. Not as bad at the moment, but who knows where the trend stops. This seriously sucks.
Maybe each Google employee working on that algorithm should have his/her income linked to its success, so they emotionally understand what they are doing.
For those who think Google should rely less on algo solutions and more on human judgement to weed out spam:
It's called the ODP, and they can't keep up with the demand for sites to be added, let alone weed out spam. Humans may make a better judgement as to a site's 'worthiness' or 'usefulness', but a computer can do it way faster, all day, every day.
Doing this means that not everyone will end up happy, but it also means that the SERPs will be more valid. However, the update is not yet over, and until it is, the newly formed Google SERPs will not be available.
The alternative choice would be to continue with a list of search results that could be manipulated by SEO's, scammers, and so on. At least Google is conscientiously trying to make it fair for everyone, and at the same time, hoping to provide a concrete base of good quality results.
It seems to me that the SERPs were getting out of control and Google is trying to fix it. My suggestion is that we all wait and see what the final outcome is, after the Jagger 3 update.
But one thing to remember is that when this update is over and you still have no traffic, it's going to be time to go buy some wood so that you can build a bridge and get over it. That's when you can start doing the work that will bring your uniques back to the level you're used to.
ska_demon, that is a crappy comparison.
Heh heh. Thanks
What I am trying to say is that there are so many sites out there that Google wants to index that they would suffer the same problems as the ODP if they resorted to human site selection and ranking. It wouldn't matter if they employed 10,000 people to sift through it all; Google would still be, as you put it, under-resourced due to the number of pages out there.
Our site has exactly the same problem as yours. Our site has been around for nearly seven years, reviewing music, films, theatre and opera. We used to do well until this update, with our relevant, original content appearing high up in Google's results.
Now, searching for an artist's name and the album title - plus our domain name! - lists sites linking to us or quoting us instead of our pages. Searching just for the artist and album leaves us nowhere.
Writing to Google about this produces an automatic response, thus:
Thank you for your note. While we're always working to include more
content in Google, sites can occasionally fall out of our search results.
Our spiders regularly crawl the web to rebuild our index, but keeping tabs
on billions of pages is tough work, and they may miss a few.
Please be assured that these changes are automated. It is certainly our
intent to represent the content of the internet fairly and accurately. Our
crawlers aren't bullies; they don't pick on particular sites.
We understand that these changes can be confusing. While we can't
guarantee that any page will consistently appear in our index or appear
with a particular rank, our Webmaster Guidelines, available at
[google.com...] offer helpful tips for
maintaining a crawler-friendly site. Following these recommendations will
increase the likelihood that your site will show up consistently in our search results.
The Google Team
All of which is lovely except I've been building websites for over a decade and am well aware of how to build and maintain a crawler-friendly site, which mine is - and they'd have seen that if they'd looked at it. We do not resort to "blackhat" techniques. We're a source for Google News even, but now can't be found on the majority of searches we should be.
In common with a lot of people on here, I have no idea what recourse I have. I'm supposed to sit tight and watch our traffic drop and drop and hope it might all come right in the end? It's been over a month since this catastrophe happened and it isn't coming right. And I don't seem to have any course of action open to me to redress the situation.
Needless to say, our results in MSN and Yahoo remain impeccable, but with an estimated 80% of searches in Europe (we're in the UK) taking place through Google according to their figures, that's scant comfort.
lost traffic >> google sucks
gained it >> google rules, I see less spam ;)
Have you ever noticed that the people that complain the most in an update thread aren't usually heard from in the next update thread?
I think they actually go out and diversify, having learned not to depend on something totally out of their control.
or they come back in Google :)
Personally, I held back diversifying because I was afraid of link penalty (I'm not kidding). I wanted to at least have some extra cash before taking the risk. It is ironic that I got hit by just that.
One of my big competitors in the computer hardware business recently dropped out of Google! I'm amazed. They used to be a PR5 with high-ranking product pages, and now it's all zero.
I can't find any sign of them in Google. Totally weird! Maybe it is temporary.
They had a very aggressive linking campaign I know that.
As I understand it, when you search from google.com you actually get results from a randomly selected Google 'datacentre'. Rather sensibly Google has many datacentres all delivering results.
The dc's mentioned in this thread are showing new results and are being used by Google as a test. They are also live but only deliver a fraction of all Google searches.
As the new results spread onto more DCs, the search results from google.com increasingly reflect the new index.
There are about 6 dcs showing new results of which this one: 188.8.131.52 is the mother (was the first and has what I think are final jagger2 serps).
To complicate things, we will get jagger3 next week where the serps will flux again!
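For anyone tracking the rollout, a rough way to gauge how far the new index has spread is to compare the top results for the same query on an old DC and on one of the test DCs. A minimal sketch of the comparison step, with made-up placeholder URLs (in practice you'd pull each DC's SERP for the query yourself):

```python
# Sketch: measure how similar two datacenters' top results are.
# The URL lists below are placeholders, not real SERP data.
def serp_overlap(old: list[str], new: list[str]) -> float:
    """Fraction of result URLs the two datacenters share."""
    return len(set(old) & set(new)) / max(len(old), 1)

old_dc = ["a.example", "b.example", "c.example", "d.example"]
new_dc = ["a.example", "x.example", "c.example", "y.example"]

print(serp_overlap(old_dc, new_dc))  # → 0.5
```

A low overlap for your key queries suggests the test DC is serving the new (Jagger) results rather than the old index.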
2. We are half way through mixing colors on 4 out of 40+ DCs, how do we know it's ugly?
3. Hand reviewing -- I have said it's priceless before; here you go again:
8 billion pages, reviewed at 10 pages per hour * 8 hours per day * 10,000 people = 800,000 pages per day; * 365 = 292,000,000 pages per year. So 8,058,044,651 / 292,000,000 = 27.59 years to get all the way through the index once.
4. Where is Clint when you need him?
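For what it's worth, the arithmetic in point 3 checks out. A quick sketch using the post's assumed rates (10 pages per hour, 8-hour days, 10,000 reviewers, an index of ~8.06 billion pages):

```python
# Back-of-envelope check of the hand-review estimate above.
PAGES_PER_HOUR = 10
HOURS_PER_DAY = 8
REVIEWERS = 10_000
INDEX_SIZE = 8_058_044_651  # pages in the index, per the post

pages_per_day = PAGES_PER_HOUR * HOURS_PER_DAY * REVIEWERS  # 800,000
pages_per_year = pages_per_day * 365                        # 292,000,000
years_for_full_pass = INDEX_SIZE / pages_per_year

print(f"{years_for_full_pass:.1f} years")  # → 27.6 years
```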
I wish people would read through earlier posts first...
The above is correct but irrelevant. If one site had 100,000 pages and Google trusted its creator, Google would need to do nothing to hand-review it. Google can just presume it's OK.
If it isn't, it will get picked up by its competitors' spam reports.
In fact, Google needs to do very little, since a lot of the crap will get picked up by spam reports. And even then, Google does not need to go through every page; it can just penalise the whole site.
Get used to the idea.. Human input in serps is here and it is here to stay.
A sensible strategy now would be to get on the good side of Google. Sitemaps, W3C-valid HTML, clean backlinks... these are all good ideas for a webmaster right now.
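On the Sitemaps point: the Sitemaps protocol is just an XML file listing your URLs. A minimal sketch of generating one with Python's standard library, using placeholder URLs:

```python
# Sketch: build a minimal Sitemaps-protocol XML file.
# The URLs below are placeholders for your own pages.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string for the given page URLs."""
    root = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc in urls:
        url_el = ET.SubElement(root, "url")
        ET.SubElement(url_el, "loc").text = loc
    return ET.tostring(root, encoding="unicode")

sitemap = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/about.html",
])
print(sitemap)
```

Upload the resulting file to your site root and submit it through Google Sitemaps so the crawler can discover your pages directly.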