Forum Moderators: Robert Charlton & goodroi
AlexK, that's a different domain. But the point is very well taken. You've found a pretty obscure query (~295 results) that the keyword stuffing spammers like to target. I'll check this out in more detail.
[edited by: Brett_Tabke at 1:03 am (utc) on Nov. 12, 2005]
One thing I recently noticed is that subdomain cross-linking is not automatically penalized by Google unless there are thousands of crosslinks. Maybe this should be part of a future update, though I'm not sure how easily it could be done on the algo side.
Some sites use kw1.example.com, kw2.example.com, and so on, really a lot of subdomains. That's what I'm talking about; I think there should be a tighter filter on subdomains for allinurl counts.
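To illustrate the kind of check being suggested (purely a sketch of the idea, not how Google actually implements any filter), one could group URLs by their registrable domain and count distinct subdomains; domains with an unusually high count would warrant a closer look. The two-label-domain assumption is a deliberate simplification:

```python
from collections import defaultdict
from urllib.parse import urlparse

def subdomain_counts(urls):
    """Count distinct subdomain labels per registrable domain.

    Naive sketch: assumes registrable domains have exactly two labels
    (e.g. example.com), which is wrong for ccTLDs like .co.uk.
    """
    buckets = defaultdict(set)
    for url in urls:
        host = urlparse(url).hostname or ""
        parts = host.split(".")
        if len(parts) >= 3:
            root = ".".join(parts[-2:])
            buckets[root].add(".".join(parts[:-2]))
    return {root: len(subs) for root, subs in buckets.items()}

urls = ["http://kw1.example.com/", "http://kw2.example.com/", "http://example.com/"]
print(subdomain_counts(urls))  # → {'example.com': 2}
```

A real filter would also need a threshold and a proper public-suffix list, but the grouping step is the core of it.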
Whatever Jagger 1, 2, and 3 have done to sites across the internet, it's a real sorry scene.
But this is what I have to say.
I follow best optimization practices for all my sites: around 24 sites of my own and 50+ sites with my team, and none of the Jagger updates has affected my sites.
From the start, I mean since 2001 till date, I have used only ethical methods of optimization. I never wanted to go into reciprocal linking, as I knew it amounted to real spamming on the internet, but where I did use reciprocal links, I used them carefully. I am sorry if that is rude, but this is what I feel.
I would request everyone to maintain ethical means of optimization irrespective of search engines. Search engines have to recognize the best SEO practice on the internet for any given keyword. Well, we cannot just depend on one search engine. Sorry, Google, if I offended you, but if Google wants to survive, it has to serve the SEO community; otherwise I don't see small players like us supporting Google through any kind of Jagger or other revolutions from any search engine.
I am not worried about Jagger.
Thank you,
Marketing Agency
I think that the latest spell of updates from 'G' has one purpose in mind: to make it clear that no form of SEO is acceptable anymore. By chopping and changing the rules, they hope to make the [ethical] SEO practices of the past more trouble than they're worth.
My company has been hit hard in the last 4 months, leading to redundancies, but I'm starting to see that they could actually be trying to level the playing field between the SEOs and the regular web publishers.
Good for them!
I was hoping to see something better by the end of the week. The top 2 or 3 results look OK, but the rest of the listings in my industry are spam sites.
We have no spamming, no link exchanges, and we're not even competing for the bigger keywords. Then what is it that's affecting us? Even PageRank was reduced for some of the sites just before Jagger. Is the flux over, or is it still going on?
Www - Wwwwww www.com www www.com www.com www.www.com www.www.com wwwwww.com wwwwww.com ww.www.com ww.www.com wwwww.com wwwww.com www.www.com www.wwwcom wwwwwwcom ...
www.#*$!xx.com.281005.info/
How many listings can we report to Google, when there are any number of such sites? What strategy are they following to filter the sites? Either the sites are spam sites, "Coming Soon!" sites, Japanese sites, spam subdomains that redirect to Google.com, or some adult domain.
I can see the day when G's algo can dismiss all forms of meta, all forms of linking between sites that reside on the same servers, and any sign of textual irregularities.
To say that SEO is giving search engines what they want is not entirely true. As soon as anyone sways from their basic content, in order to fill search engine criteria, they are decreasing the visitor experience. I feel that 'G' is going "all experience".
I also think that 'G' sees the money made by SEO companies as theirs, and they're out to get it back. Making these SEO companies pop in and out of favour with search engines will eventually cement in the minds of customers that 'G' is the only one who can provide the top spots.
Cheers & all the Best
Before going to bed yesterday, I enjoyed listening to Matt Waddell's (Google Mobile Team) very nice song: Get lost and found on your phone.
Rumour has it that Matt Cutts (Google WebSpam Team) is also about to finish writing his own WebSpam song:
Get lost and found on Google Serps :-)
Well.... It seems that the dog ate the new serps of [66.102.9.104...] while I was in bed.
GoogleGuy
Weather report pleeeeeeeeeeeeeeeeease!
What have you done to my good kind friend The Father of All Updates Jagger3? Has he also been eaten by the dog :-)
Thanks a bunch.
Google's entire method of ranking sites (originally) was based on an internet that was unaware of the ranking criteria. SEO of any kind disrupts that method at its core.
Now, SEOs from big companies down to mom and pop site owners are constantly trying to break the code.
As a site owner, I constantly complain that I'm spending too much time chasing rankings instead of improving my site. I hate it. But in my market, if you don't chase the carrot you have no hope. I'm sure there are people at Google who feel a similar way about spammers (and, to a degree, SEOs). Instead of improving search overall, they must spend huge amounts of time dealing with people trying to cheat the system.
I've come to accept the fact that I'm one of those people trying to cheat the system. Not through cloaking or hardcore black hat stuff. But I do make minor changes to my site and partner for links in an attempt to rank better. That makes me an enemy of search engines. Make no mistake about it: SEOs (black hat or white) are the bad guys as far as SEs are concerned.
You quoted me [I have recovered almost all positions that I lost on September 22 and gained a few exciting extras. One difference though is that pre-September, my home page was usually the indexed result, whereas now I'm finding that my section indexes are rating high for their keywords.] and asked on what date I recovered from 9/22, and on which DCs I see my results.
The answer is that I didn't start to see any recovery until late last night, with results first appearing on 66.102.7.104 and then appearing on 66.102.9.147 this morning.
I've checked numerous dc's for consistency, and I've regained my number one spot for my main search term on many dc's. The only difference between the dc's for me is the number of search results, like so:
66.102.7.104 - 27,500,000
66.102.9.147 - 32,700,000
216.239.37.104 - 32,600,000
216.239.37.147 - 27,600,000
I lost ground on some minor terms but appear in the top five for the very first time on some major terms that I was never able to rank for before. It's lovely. :)
I have made site changes since September. Reduced navigation repetition to avoid duplication, ditto on some overused keywords. I've reported hijackers and used the Remove Url console with good, but not total, success. (There are still some persistent instant meta refresh hijackers I can't shake.) I also used the Remove Url tool to get rid of ancient (and non-existent) but indexed htm pages since replaced by shtml.
I put up a 301 redirect to eliminate the non-www pages that were appearing before. And I redesigned my home page, so that it now points almost exclusively to the separate section indexes of my site, instead of to the indexes plus several sample pages within them. That strategy seems to have resulted in better recognition of my separate section indexes, with profitable results.
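For anyone wanting to set up the same kind of non-www to www 301 redirect, a minimal Apache .htaccess sketch looks like this (assuming mod_rewrite is enabled; example.com is a placeholder, not the poster's actual domain):

```apache
RewriteEngine On
# Permanently redirect bare-domain requests to the www host,
# preserving the requested path.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag is what tells crawlers the move is permanent, so the non-www URLs eventually drop out of the index.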
That's my experience; hope it is helpful for someone.
Thanks for sharing. Much appreciated.
This success formula sounds very interesting!
"I lost ground on some minor terms but appear in the top five for the very first time on some major terms that I was never able to rank for before. It's lovely. :)
I have made site changes since September. Reduced navigation repetition to avoid duplication, ditto on some overused keywords. I've reported hijackers and used the Remove Url console with good, but not total, success. (There are still some persistent instant meta refresh hijackers I can't shake.) I also used the Remove Url tool to get rid of ancient (and non-existent) but indexed htm pages since replaced by shtml.
I put up a 301 redirect to eliminate the non-www pages that were appearing before. And I redesigned my home page, so that it now points almost exclusively to the separate section indexes of my site, instead of to the indexes plus several sample pages within them. That strategy seems to have resulted in better recognition of my separate section indexes, with profitable results."
>>This thread is 26 pages...WOW! Could someone post a summary for those of us who don't have 26 hours to read this... <<
Summary of thread "Part 3 Update Jagger":
It seems that the dog ate the new serps of [66.102.9.104...] :-)
Good whitehat SEO and user experience are not mutually exclusive. In fact they are often one and the same.
Logical page titles; accurate meta tags; descriptive <h1>, <h2>, <h3> tags; secondary navigation menus; breadcrumbs; site indexes; contextual links; plain-English, rational and easy-to-remember domain, directory, sub-directory and file names; and yes, even targeted link exchanges if properly implemented CAN all add greatly to the user experience as well as help your rankings.
As far as the most recent algo is concerned, my sites either moved up, or stayed about the same so I have no ax to grind with Jagger. In fact my personal site went from number one to number one AND number two. However, relevance at Google is fading fast.
I remember when you could run searches at Google without quotes. Today I ran a 3-word search without quotes and the #1 result included only two of the three words and they were not found consecutively. WTH is that?
Google's direction is obvious, discount what's actually on the page and rely almost exclusively on off-page factors so that they can't be gamed. Nice little theory but it doesn't work. Relevance takes a beating and the person being gamed is the mook using Google instead of Yahoo.
BTW--Ran the same three-word search without quotes at Yahoo and the results were spot on. Nine out of the top ten sites employed all three words consecutively.
If they are students of history, Larry and Sergey already know that "all glory is fleeting," and that eventually we all "jump the shark."
Apologies for straying from topic and to GG who was very helpful throughout the Jagger experience. Finally, congrats to all who moved up.
[edited by: Sparkys_Dad at 8:21 am (utc) on Nov. 8, 2005]
Now the 66.102.7.104 results are shown on 66.102.9.104!
Definitely: Google hates my page..
Now it has disappeared again from all the DCs where I could see it yesterday...
I'll have to find another job... :-(
My site is designed for the user. It offers high quality products that are exactly related to the search terms I covet. I have tons of original articles with useful advice related to my products.
But that is just plain not going to get you in the money in my market. Another site has a blog built into it where he "posts" public domain articles related to the keywords. His blog (not to mention his sites with over 10k affiliate pages) has almost zero original content and outranks many real sites and real blogs. He has 4 times as many articles posted as I do, and hasn't written one of them that I've seen. He's page 1.
Another site has 2 full time guys in India doing nothing but exchanging links with anyone and everyone. He's number 1 for almost every related term in my market.
So pardon me if I check keyword density and look for quality link partners. Yes, I'm making changes to my site to try to rank better. That alone makes me the enemy of SEs. White hat SEO is still anti-SE. SEs are not your friends or partners, even if you think you are within their current guidelines. When the algo changes, things land where they land, and the SE doesn't care who gets burned as long as they feel they've improved the experience for their users. That's their job, after all.
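For what it's worth, the keyword-density check mentioned above can be as simple as counting one term's share of total words. This is only an illustrative sketch (single-word terms, naive tokenization), not anyone's actual tool:

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    count = Counter(words)[keyword.lower()]
    return 100.0 * count / len(words)

sample = "Blue widgets for sale. Our blue widgets are the best widgets."
print(round(keyword_density(sample, "widgets"), 1))  # → 27.3
```

Whatever the "ideal" density was supposed to be at the time, the mechanics of measuring it are just word counting.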
No need to panic if you don't see your site/pages on the Jagger3 DCs. It's just the normal flux that both GG and Matt told us about in advance.
You may wish to wait and see. And I wouldn't be surprised if your site/pages come back as soon as the flux is over, within a week or so.
Good luck!
P.S. Within the next hour or so, GoogleGuy might be posting a few comments. Stay tuned.
As Reseller says, there is flux going on.
Speaking personally, and this IS only personally, I have seen exactly the same results for my keywords these past few days, regardless of database, and they all come back to .7.104
"He has 4 times as many articles posted as I do, hasn't written one of them that I've seen."
Well, you cannot control what others do. You can only do your best with your own site. We see, every day, companies trying to provide services based on robots, which miserably fail to understand human nature and human needs.
I do agree that it seems that Google must be spending a huge amount of time trying to outthink the cheats, and perhaps too little time on how to support truly content-rich sites.
But, at the end of the day, Google is now in it FOR THE MONEY. Anything that adversely affects their revenues will be shot down by their shareholders. Google now only cares about you and your needs IF it coincides with their own.
Not exactly "do no evil"...