| 1:10 am on Oct 26, 2005 (gmt 0)|
Getting more referrals from AskJeeves lately. They are advertising a lot.
| 1:38 am on Oct 26, 2005 (gmt 0)|
Yeah, I love the increase in Ask Jeeves and Yahoo hits; seems not everyone is sticking with Booble.
Is it possible that the new results are favoring URLs that get higher clickthroughs on the SERPs (via Google keeping tabs on clicks)?
I think this may be possible. Does anyone have a way to determine it?
| 1:47 am on Oct 26, 2005 (gmt 0)|
"does anyone disagree with the notion that this update is heavily skewed in favor of content sites vs e-commerce driven?"
No, I don't disagree. I think this is exactly where the update is headed. I also believe that latent semantic indexing within the algorithms is playing a big part in this update, namely for long tail, more obscure key phrases. A great deal of computing power would be needed to provide instant results for searches on longer key phrases, the current trend. G may be using the different DCs to see what their systems can withstand. Time is needed to do all this testing.
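For anyone curious what "latent semantic indexing" actually means mechanically, here is a minimal sketch: a toy term-document matrix reduced with a truncated SVD, then a query ranked against documents in the latent "concept" space. The terms and counts are entirely made up for illustration; nobody outside Google knows what, if any, LSI-style machinery is actually in the algo.

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = documents.
# The counts below are invented purely for illustration.
terms = ["cheap", "widgets", "blue", "gadgets"]
A = np.array([
    [2, 0, 1, 0],   # cheap
    [1, 2, 0, 0],   # widgets
    [0, 1, 2, 1],   # blue
    [0, 0, 1, 2],   # gadgets
], dtype=float)

# Truncated SVD: keep the k strongest latent "concepts".
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k, :]

def rank_docs(query_vec):
    """Fold a raw term-count query into the k-dim concept space,
    then rank documents by cosine similarity there."""
    q = np.asarray(query_vec, dtype=float) @ Uk / sk   # query in concept space
    docs = (np.diag(sk) @ Vtk).T                       # documents in concept space
    sims = docs @ q / (np.linalg.norm(docs, axis=1) * np.linalg.norm(q) + 1e-12)
    return np.argsort(-sims)                           # best-matching doc first

# Rank all four documents for a query containing only "widgets".
order = rank_docs([0, 1, 0, 0])
```

The point of the reduction is that documents can match a query on a shared concept even without sharing the literal keyword, which is exactly why it would matter for obscure long-tail phrases.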
| 2:24 am on Oct 26, 2005 (gmt 0)|
|define disappeared :) |
My main site has almost exactly 1,000 pages. Google was still showing 976 pages as indexed. No apparent problems with dupe content or supplemental pages.
But the site disappeared in the SERPs. I stopped looking at position 600 or so for my main keyword phrase. A few days later it showed up on or around page 20 for most search phrases and worked its way up to page 8 slowly.
I'd like to think it was something I did to help it improve. But I didn't do anything worth mentioning. I've made the mistake before of changing too much too soon during updates and it took me months to recover not from the update, but from the changes I made.
| 2:45 am on Oct 26, 2005 (gmt 0)|
>> it took me months to recover not from the update, but from the changes I made.
I did the same last fall. What a nightmare, but I did it to myself. Now I just removed a few keywords, nothing major.
| 2:50 am on Oct 26, 2005 (gmt 0)|
I see this "tweak" as a step towards the final goal - quantifying a quality site for a given query using an algo based on backlinks, titles, descriptions and, as I should have added in my previous post, content.
Someone else mentioned semantics. Obviously it's a blend of all the above. Not so obvious is the blend itself.
For those that care, stop freaking out about updates, start concentrating on good links, good content and a good site all around, and it's going to be all good.
I just hope those that have dropped in rankings while running a clean, quality site can weather the storm till the ship rights itself.
| 4:39 am on Oct 26, 2005 (gmt 0)|
|does anyone disagree with the notion that this update is heavily skewed in favor of content sites vs e-commerce driven? |
I have both kinds of sites and neither has seen much beyond increased ranking and more search terms bringing in visitors. I think this update is heavily skewed for something, but whatever that is, it's more complicated than big vs little or content vs e-commerce sites.
I want to think that this update is about giving more weight to sites designed to appeal to the people that use them rather than just optimized for search engines, but we'll have to see where this goes. Wouldn't it be fantastic if people that built good websites and engaged in no link trading or SEO tactics beyond solid design and implementation floated to the top of the SERPs?
| 5:06 am on Oct 26, 2005 (gmt 0)|
Matt said -
|My point is that more than ever, we are constantly working to improve our algorithms and scoring. Some changes are hardly noticed at all. |
And he also said -
|And I wouldn’t be surprised if a second stage of the index rolls out around this time next week. I also wouldn’t be surprised if a third stage of the index rolls out the week after that. |
"Could it be that the Jagger2 and 3 are those changes that are hardly noticed at all?" ;)
| 5:08 am on Oct 26, 2005 (gmt 0)|
I'm telling you, Jagger2 already happened. I'm amazed people don't notice it.
| 5:34 am on Oct 26, 2005 (gmt 0)|
Guys, I need serious help..
I've been hearing a lot about this cannonical (am I saying it right?) google problem and now I think I may be affected.
Basically when I search for my site in google this is what I get:
www.domain.com - fresh cache, title, backlinks, etc..
domain.com - very old cache (about 8 months old), no backlinks, many old deleted pages (as supplemental results)...
I'm using monstercommerce and unfortunately I've had no way of getting rid of their www.domain.com/index.asp duplicate home page.
What's my problem here? I'm desperate for some answers!
Thanks in advance and hope to see you at Pubcon!
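For what it's worth, the usual fix for the www vs non-www canonical problem is a server-side 301 redirect so only one hostname (and one home page URL) ever answers. As a sketch of what "one canonical form" means (domain.com and index.asp are just the poster's placeholders, and this is illustration only, not a substitute for the actual server-side redirect):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, prefer_www=True, default_doc="index.asp"):
    """Collapse www/non-www variants and the default-document duplicate
    (e.g. /index.asp vs /) to a single canonical URL. Purely illustrative:
    the real fix is a 301 redirect configured on the web server."""
    scheme, netloc, path, query, frag = urlsplit(url)
    host = netloc.lower()
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[4:]
    # /index.asp and / are the same home page; keep only the bare path.
    if path.rstrip("/").endswith("/" + default_doc):
        path = path[: -len(default_doc)].rstrip("/") + "/"
    if not path:
        path = "/"
    return urlunsplit((scheme, host, path, query, frag))
```

With this, `http://domain.com/index.asp`, `http://www.domain.com` and `http://www.domain.com/` all normalize to the same string, which is the state you want the engine to see via redirects.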
| 5:36 am on Oct 26, 2005 (gmt 0)|
Today I noticed a change in backlink counts for some of my websites.
No other changes seen at the moment.
| 5:36 am on Oct 26, 2005 (gmt 0)|
|Wouldn't it be fantastic if people that built good websites and engaged in no link trading or employed any SEO tactics beyond solid design and implementation floated to the top of the SERPs? |
Fantastic: "Based on or existing only in fantasy; unreal."
| 6:02 am on Oct 26, 2005 (gmt 0)|
188.8.131.52 looks interesting.
| 6:18 am on Oct 26, 2005 (gmt 0)|
|Fantastic: "Based on or existing only in fantasy; unreal." |
Perhaps, but I can't help but think that something like this is what Google strives for even with their need to turn a profit.
| 6:38 am on Oct 26, 2005 (gmt 0)|
184.108.40.206 has a different set of results, most looking very good compared to what we have now.
| 7:25 am on Oct 26, 2005 (gmt 0)|
220.127.116.11 has significantly updated results for me.
They look a bit like the main Google serps between Jagger1 and a couple of days ago.
I would bet one of the least favourite of my collection of vintage games consoles on that DC showing Jagger2 results.
| 7:28 am on Oct 26, 2005 (gmt 0)|
Same serps as above now on 18.104.22.168
I am now willing to bet a slightly spoiled Atari VCS2600
| 7:32 am on Oct 26, 2005 (gmt 0)|
|Same serps as above now on 22.214.171.124 |
Google updates DCs by their C-class. We should actually be mentioning 66.102.9.* is being updated.
| 7:41 am on Oct 26, 2005 (gmt 0)|
Good morning Folks!
Today is the day of stage II of Jagger Update, The Father of All Updates, nickname The Terminator.
And we are still waiting for GoogleGuy to keep us posted on what stage II is all about.
Many thanks in advance GoogleGuy!
P.S. I have a feeling that GG is reading this post right now. Right GG? :-)
| 7:46 am on Oct 26, 2005 (gmt 0)|
Bearing in mind that GG is going to read this at some point, I should say that I like 126.96.36.199.
It has resolved some of the spam issues I have been regularly telling Google about.
The site at #1 is still a major spammer (6 mirrors, 68,000 links all with the same anchor, in an industry where 1,000 links for a 10 year old site is a whole lot).
Overall I give that dc a 7/10.
| 7:46 am on Oct 26, 2005 (gmt 0)|
188.8.131.52* looks promising. My supplementals seem to have disappeared from the SERPs (but still visible with #*$!) and I am on page 5 (from 18) for my main keyword. Never been that far up (site is one year old).
[edited by: joergnw10 at 8:02 am (utc) on Oct. 26, 2005]
| 7:48 am on Oct 26, 2005 (gmt 0)|
Morning reseller. Guess GG is keeping up his words. Looks like Jagger2 is making its way from 66.102.9.* DCs.
| 7:52 am on Oct 26, 2005 (gmt 0)|
>>Morning reseller. Guess GG is keeping up his words. Looks like Jagger2 is making its way from 66.102.9.* DCs.<<
Good morning McMohan.
Question is: is part II of Jagger meant to deal with SPAM & SUPPLEMENTALS ISSUES?
If that's the case, whitehat webmasters would be in business again :-)
| 7:53 am on Oct 26, 2005 (gmt 0)|
Can't see that much difference on that DC yet (but it is the 3rd push I am waiting for anyway).
But I guess if there is a significant difference for a lot of folks, that may be the start of the second push.
At the start of the first push some DCs seemed to be going a different way, but they did not hold. So confirmation from GG or MC would be good :).
[edited by: Dayo_UK at 7:57 am (utc) on Oct. 26, 2005]
| 7:57 am on Oct 26, 2005 (gmt 0)|
184.108.40.206 has for me:
Knocked two review/index sites that interlinked heavily out of the top 30.
Knocked one site that has many different names from #9 to no-where.
Has not knocked the #1 for past year from #1 even though it has 5 mirrors under different names and 68000 bought backlinks.
I think this update had something to do with network interlinking spam.
| 7:59 am on Oct 26, 2005 (gmt 0)|
|Question is; Does part II of Jagger meant to deal with SPAM & SUPPLEMENTALS ISSUES? |
I guess Jagger2 is more of a correction to Jagger1, with inputs from webmasters, user search behaviour et al. Anyway, we will only know when GG writes in this forum or Matt updates his blog.
| 8:09 am on Oct 26, 2005 (gmt 0)|
I really hope [220.127.116.11...] isn't the way things are going, as my site has dropped from the first page on most keywords to about page 3 / 4 after over a year around the top of page one for most keywords.
What I don't understand is that I have gone from PR5 to a PR6 but my rankings are just falling and I haven't seen any improvement.
| 8:11 am on Oct 26, 2005 (gmt 0)|
Talking about Jagger The Terminator.
I have read a very interesting recent interview with Matt Cutts, by Aaron Wall. And one of the questions was:
When you guys roll out new algorithms, filters, and patches some good sites end up getting filtered out with the bad. Do you pre-test most of the algorithms prior to launching them? How do you know how strongly to apply filters? By default do you usually lean on one side or the other and then tweak your way back?
and Matt's answer was:
We always put algorithmic changes into our test harnesses to poke and prod in lots of different ways. But you also have to be adaptive. If someone in the outside world notices an issue after a launch that you didn't notice, it's important to take that feedback and act on it, and also to try to improve the testing procedure to cover that in the future. We usually have a pretty strong sense of whether something will be a large-impact launch or not. But you can't completely avoid having a large impact with a launch. An example might be if you're replacing a large subsystem in the crawl-index-serve pipeline. We continually go back and improve or replace sections of our system. Sometimes the results can't be bit-for-bit compatible in output, so you have to do the best you can. Update Fritz in 2003 is the canonical example of that; you can't go from a batch-based search engine to an incrementally-updated search engine without some visible impact. To answer your last question, I personally lean toward softer launches; webmasters never need any extra stress. But sometimes launches can't be made completely soft or invisible, as I mentioned.
| 8:15 am on Oct 26, 2005 (gmt 0)|
Am I behind the times, or is Google no longer saying on its results pages:
|"in" is a very common word and not used in the search, blah blah |
It seems stop words are now very important.
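The behaviour being described is classic stop-word dropping: very common words are stripped from the query before matching. A minimal sketch (the stop-word list here is a tiny invented sample, not Google's actual list):

```python
STOP_WORDS = {"in", "the", "a", "an", "of", "to"}  # illustrative sample only

def strip_stop_words(query):
    """Old-style stop-word handling: drop very common words from a
    query before matching. The observation above is that Google no
    longer appears to do this, i.e. "in" now counts in the match."""
    return [t for t in query.lower().split() if t not in STOP_WORDS]
```

Under the old behaviour, "hotels in London" and "hotels London" would reduce to the same term list; honouring the stop word makes them potentially different queries.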
| 8:20 am on Oct 26, 2005 (gmt 0)|
Can't say I can notice any real difference on either of those DCs yet. Not for the industry I watch, anyway. Still looking bleak for Bob.
| 8:24 am on Oct 26, 2005 (gmt 0)|
Looking at that DC again - and I don't know if others can confirm - it looks to me like stage 1 has not affected that DC...?
Very hard to tell, but what I was seeing develop out of the first stage certainly does not look like it has hit that DC.
Time will tell. For me, when people first started noticing changes on 18.104.22.168, they were nothing like what was displayed a few days after first noticed.