No, I don't disagree. I think this is exactly where the update is headed. I also believe that latent semantic indexing within the algorithms is playing a big part in this update, namely with long-tail, more obscure key phrases. Providing instant results for searches on longer key phrases, the current trend, would take a great deal of computing power. G may be using the different DCs to see what their systems can withstand. Time is needed to do all this testing.
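For anyone curious what latent semantic indexing actually involves, here is a toy sketch in Python (made-up four-document corpus, numpy only; this illustrates the general LSI technique, not anything we actually know about Google's implementation): build a term-document matrix, run a truncated SVD, and compare documents in the reduced "concept" space.

# Toy LSI: term-document matrix -> truncated SVD -> similarity in concept space.
# Hypothetical mini-corpus for illustration only.
import numpy as np

docs = [
    "cheap flights to paris",
    "discount airfare paris france",
    "used car parts online",
    "buy auto parts cheap",
]
vocab = sorted({w for d in docs for w in d.split()})
# A[i, j] = count of term i in document j
A = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

# Truncated SVD: keep k latent "concept" dimensions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # each row: a document in concept space

def cos(a, b):
    # Cosine similarity between two document vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cos(doc_vecs[0], doc_vecs[1]))  # two travel docs: should be the more similar pair
print(cos(doc_vecs[0], doc_vecs[2]))  # travel vs auto parts: should score lower

The point is that after the SVD step, documents about the same topic can score as similar even when they share few exact words. That is exactly the kind of machinery that would help with long-tail phrases, and exactly the kind of thing that eats computing power at scale.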
My main site has almost exactly 1,000 pages. Google was still showing 976 pages as indexed. No apparent problems with dupe content or supplemental pages.
But the site disappeared from the SERPs. I stopped looking at position 600 or so for my main keyword phrase. A few days later it showed up on or around page 20 for most search phrases and worked its way up to page 8 slowly.
I'd like to think it was something I did to help it improve. But I didn't do anything worth mentioning. I've made the mistake before of changing too much too soon during updates and it took me months to recover not from the update, but from the changes I made.
Someone else mentioned semantics. Obviously it's a blend of all the above. Not so obvious is the blend itself.
For those who care, stop freaking out about updates, start concentrating on good links, good content, and a good site all around, and it's going to be all good.
I just hope that those who have dropped in the rankings while running a clean, quality site can weather the storm till the ship rights itself.
Does anyone disagree with the notion that this update is heavily skewed in favor of content sites versus e-commerce-driven ones?
I want to think that this update is about giving more weight to sites designed to appeal to the people who use them rather than just optimized for search engines, but we'll have to see where this goes. Wouldn't it be fantastic if people who built good websites, engaged in no link trading, and employed no SEO tactics beyond solid design and implementation floated to the top of the SERPs?
My point is that more than ever, we are constantly working to improve our algorithms and scoring. Some changes are hardly noticed at all.
And he also said -
And I wouldn’t be surprised if a second stage of the index rolls out around this time next week. I also wouldn’t be surprised if a third stage of the index rolls out the week after that.
"Could it be that the Jagger2 and 3 are those changes that are hardly noticed at all?" ;)
I've been hearing a lot about this canonical (am I saying it right?) Google problem and now I think I may be affected.
Basically when I search for my site in google this is what I get:
www.domain.com - fresh cache, title, backlinks, etc..
domain.com - very old cache (about 8 months old), no backlinks, many old deleted pages (as supplemental results)...
I'm using MonsterCommerce and unfortunately I've had no way of getting rid of their www.domain.com/index.asp duplicate home page.
What's my problem here? I'm desperate for some answers!
Thanks in advance and hope to see you at Pubcon!
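For anyone else wondering whether they have the same www/non-www problem, here is a minimal check in Python (example.com is a placeholder; swap in your own domain): it requests both hostnames and prints the status code and any redirect target. If both come back 200 instead of one 301-ing to the other, Google can treat them as two separate sites.

# Minimal canonical-hostname check; example.com is a placeholder domain.
import http.client

for host in ("example.com", "www.example.com"):
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", "/")
    resp = conn.getresponse()
    # A 301/302 status plus a Location header means this hostname redirects to the other.
    print(host, resp.status, resp.getheader("Location"))
    conn.close()

The usual fix is a server-side 301 from the non-preferred hostname (and from duplicate entry pages like /index.asp) to the preferred one, though with a hosted cart like MonsterCommerce you may need the host to set that up for you.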
Today is the day of stage II of the Jagger update, the Father of All Updates, nicknamed The Terminator.
And we are still waiting for GoogleGuy to keep us posted on what stage II is all about.
Many thanks in advance GoogleGuy!
P.S. I have a feeling that GG is reading this post right now. Right GG? :-)
It has resolved some of the spam issues I have been regularly telling Google about.
The site at #1 is still a major spammer (6 mirrors, 68,000 links all with the same anchor, in an industry where 1,000 links for a 10-year-old site is a whole lot).
Overall I give that dc a 7/10.
Can't see that much difference on that DC yet (but it is the 3rd push I am waiting for anyway).
But I guess if there is a significant difference for a lot of folks, that may be the start of the second push.
At the start of the first push some DCs seemed to be going a different way, but they did not hold. So confirmation from GG or MC would be good :).
Knocked two review/index sites that interlinked heavily out of the top 30.
Knocked one site that goes by many different names from #9 to nowhere.
Has not knocked the site that has been #1 for the past year from #1, even though it has 5 mirrors under different names and 68,000 bought backlinks.
I think this update had something to do with network interlinking spam.
What I don't understand is that I have gone from PR5 to PR6, but my rankings are just falling and I haven't seen any improvement.
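Worth remembering that toolbar PR and rankings are computed separately, so one can move without the other. As a toy illustration, here is the classic PageRank power iteration in Python on a made-up four-page link graph (illustrative only; the toolbar value is a coarse snapshot of something like this, and it is only one input to ranking):

# Toy PageRank power iteration over a hypothetical 4-page link graph.
import numpy as np

# links[i] = list of pages that page i links to
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n, d = 4, 0.85  # page count and damping factor

pr = np.full(n, 1.0 / n)  # start uniform
for _ in range(50):
    new = np.full(n, (1 - d) / n)  # teleportation share
    for page, outs in links.items():
        for out in outs:
            # Each page splits its current score evenly across its outlinks.
            new[out] += d * pr[page] / len(outs)
    pr = new

print(pr)  # page 2, linked from 0, 1, and 3, accumulates the most PageRank

So PR going up just says the link graph got kinder to you; the query-time ranking blends many other signals, and that is presumably where the drop is coming from.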
I read a very interesting recent interview with Matt Cutts by Aaron Wall, and one of the questions was:
When you guys roll out new algorithms, filters, and patches some good sites end up getting filtered out with the bad. Do you pre-test most of the algorithms prior to launching them? How do you know how strongly to apply filters? By default do you usually lean on one side or the other and then tweak your way back?
and Matt's answer was:
We always put algorithmic changes into our test harnesses to poke and prod in lots of different ways. But you also have to be adaptive. If someone in the outside world notices an issue after a launch that you didn't notice, it's important to take that feedback and act on it, and also to try to improve the testing procedure to cover that in the future.

We usually have a pretty strong sense of whether something will be a large-impact launch or not. But you can't completely avoid having a large impact with a launch. An example might be if you're replacing a large subsystem in the crawl-index-serve pipeline. We continually go back and improve or replace sections of our system. Sometimes the results can't be bit-for-bit compatible in output, so you have to do the best you can. Update Fritz in 2003 is the canonical example of that; you can't go from a batch-based search engine to an incrementally-updated search engine without some visible impact.

To answer your last question, I personally lean toward softer launches; webmasters never need any extra stress. But sometimes launches can't be made completely soft or invisible, as I mentioned.