They certainly have done no such thing, imo. The opposite in fact. A lot of sites that likely had penalties against them seem to have been, wrongly, looked upon too favorably by this update. Google needs to get far more aggressive with its penalties. One month slaps on the wrist aren't even worth mentioning.
Google has made tons of screwups and lost a lot of sites. It's ludicrous to call these "penalties" though. When you drop a glass on the floor, you did not penalize it.
Nice. :) We've got to take action on it, and we're also looking at priorities going forward.
Isn't it surprising that while Google is preparing to handle sophisticated spam techniques and overt SEOing, it is still not able to address some of the primitive, old-school spam such as hidden text and links?
[edited by: McMohan at 9:10 am (utc) on Oct. 27, 2005]
According to a Google patent I read, good sites will continue getting natural links pointing at them, which causes your rankings to go up.
If a good site goes bad, the natural backlinks will stop or slow down, causing your rankings to drop. Or causing a penalty?
If you had a few websites that were heavily cross-linked, you would be fine until you took those cross-links off your pages. That would cause a penalty, because you had a massive drop in links to your websites.
Or if you traded links heavily for a month with a couple hundred sites and then did nothing for a few months, your link growth would stop and you would get a penalty.
I also read that they keep track of your link (growth and decline) every month.
So if you traded links a lot for a month and then stopped, it would look bad: your rate of link growth was very high for a month and then dropped back to normal, which looks like below normal to Google because of the one month you traded links heavily.
Google is looking for sites that grow steadily over time. If you get 100 links this month, you should also get 100 links next month and every month after that. That will keep you from getting a penalty.
The whole idea behind that is to fight buying links. Or buying page rank.
The sandbox is there to keep a person from buying a PR7 link and ranking right away. Basically making you run out of money from buying page rank.
Keeping track of your link growth history helps them rank you according to how popular your site is right now.
If your site steadily receives new backlinks every month, you will steadily rank well in Google. If your site suddenly goes bad, people will drop your link and Google will lower your ranking.
Or if you bought links on high-PR pages and dropped those links after a month or two or three, you would get a penalty because you lost those links.
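As a rough illustration only: if Google really does record monthly backlink counts the way the patent speculation above suggests, a spike-then-collapse pattern (one month of heavy link trading or bought links, then a return to baseline) could be flagged with something like the following sketch. The threshold and the function names here are invented for illustration, not anything Google has published.

```python
# Hypothetical sketch of the "link velocity" idea discussed above:
# track monthly new-backlink counts and flag histories where one
# month spikes far above the typical month and then collapses back.
# spike_factor is an invented threshold, purely illustrative.

def looks_like_bought_links(monthly_links, spike_factor=3.0):
    """Flag a history whose peak month exceeds spike_factor times
    the median month, with every later month falling back below
    peak / spike_factor (i.e. a spike followed by a collapse)."""
    if len(monthly_links) < 3:
        return False
    ordered = sorted(monthly_links)
    median = ordered[len(ordered) // 2]
    peak = max(monthly_links)
    after_peak = monthly_links[monthly_links.index(peak) + 1:]
    return (median > 0
            and peak > spike_factor * median
            and all(m < peak / spike_factor for m in after_peak))

# Steady growth: no flag. One month of heavy link trading: flagged.
print(looks_like_bought_links([100, 100, 110, 105, 100]))  # False
print(looks_like_bought_links([100, 100, 900, 95, 100]))   # True
```

Under this toy model, the "get 100 links every month" advice above comes out clean, while the trade-heavily-then-stop pattern trips the flag.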
I have 18 websites and I have been testing google and how it ranks sites for over 5 years and this is how things are working from what I have seen in my own websites.
Google is right about "There is almost nothing your competitor can do to get you banned."
But there is one thing I have found that works to get a competitor banned. I don't think that sort of thing needs to be let out of the bag, though.
Reseller and Dayo - about the update: your, and maybe my, site troubles (mine are 302/hijacking/non-www) will first show an effect late this weekend or next week, so there is still hope. But once again, don't expect too much; they had a year to fix it and did nothing. Still, we can hope for the best, and it's also the first time we have got a real statement from Google, so maybe we will have a great Christmas.
If so, is there a way to get confirmation from you that a certain site got hit for a certain keyword? (It passed undamaged through all previous updates, including Florida, but Jagger2 seems to have killed me.)
is there a way to post you a question?
But we know now that Google are serious about it and hopefully serious about a fix. So I assume that if the fix is not right they will want feedback etc until they do get it right.
At the moment each update more and more sites get caught out (so it seems to me) - it is not just me being selfish (obv. I want my sites to return) - but going forward Google need to make sure they get this right.
IMO etc - however, we should probably talk about it this time next week or a bit later.
After checking my site's position on Google, I was wondering why I see different SERPs depending on whether I use BT or Freeserve as my ISP.
Ranks will depend on which DC your default Google is resolving to. I would suggest you search on 220.127.116.11, since those results are the new batch of ranks.
[added - OK, posted before I saw GG's post]
>>I have come to learn that the first changes in an update are usually the important ones and the flux that comes afterwards is just a little tweak here and there. Jagger seems to be no different from the rest so far. <<
I beg to differ. IMO, all parts of an update are important. Allegra and Bourbon are good examples. And then there's the "flux", which I still fear most ;-)
And fellow members who have been affected by Jagger1 would tell another story than yours, I guess ;-)
Back in the old days I would agree - SERPs appeared on a DC and those were pretty much what you got.
Dominic and Esmeralda were the first times, I think, that things really seemed to be different.
(Although with this update, MC did sort of indicate on his blog that these are three separate updates happening at the same time, so they could be thought of as one.)
Night GG - this is going to become a regular late-night thing to send you to sleep :)
[edited by: Dayo_UK at 9:30 am (utc) on Oct. 27, 2005]
Sure, I will report this blackhat/spammer again with "jagger2" as info.
Trust me when I say that I am talking about probably the biggest "cloaking pages" farm out there, but a penalty would also hurt all their clients, so I thought you guys did not want to penalize for that reason...
I'm going to report it right now. Once again, I will add "referencement" as info if that can help.
If 18.104.22.168 is indeed the way things are going then things aren't looking too good for me, lol!
I have a related question though. Google has spidered all of my PHP pages, as indicated by a site:www.domain.com search, yet my robots.txt has had Disallow: /*.php$ in it for months. I'm a bit concerned, as it has also spidered my PHP-converted-to-HTML pages, which it must see as duplicate content. Using the datacentre GG suggested, I notice that although our pages are still indexed, we've dropped majorly for some key search terms. If this is a result of the duplicate pages, what is the best way of remedying the situation if I've already taken steps to stop my PHP pages being crawled?
Thanks for any advice
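The rule in the post above uses Google's wildcard extensions to robots.txt (`*` and `$`), which the original robots exclusion standard did not define, so not every tool honours them. A minimal sketch for sanity-checking such a rule against your own URLs, assuming Google-style matching; the function name here is ours, not part of any real API:

```python
# Sketch of Google-style robots.txt path matching with * and $.
# Python's stdlib urllib.robotparser follows the original standard
# and does not understand these wildcards, so we translate the rule
# into a regex by hand. Illustrative only, not an official matcher.
import re

def rule_matches(rule, path):
    """True if a robots.txt path rule (with * and $) matches path.
    Rules match as prefixes unless anchored with a trailing $."""
    pattern = re.escape(rule).replace(r'\*', '.*')
    if pattern.endswith(r'\$'):
        pattern = pattern[:-2] + '$'   # $ anchors at end of the URL
    return re.match(pattern, path) is not None

print(rule_matches('/*.php$', '/index.php'))       # True: blocked
print(rule_matches('/*.php$', '/index.php?id=1'))  # False: query string trails
print(rule_matches('/*.php$', '/page.html'))       # False: allowed
```

Note also that a robots.txt disallow only stops future crawling; pages that are already indexed can linger in the index until they are removed via Google's URL removal tool or drop out naturally.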
I think Google's main goal is to keep you from getting bored.
After Florida I lost my index page for two months, but all the inner pages were untouched.
After Jagger I see the opposite: my index page is still top-ranked, but I lost all the inner pages.
When was Google wrong: the last 5 years, when all my inner pages were top-ranked, or after Jagger, when no inner pages can be found?
I see Google has lots of job vacancies. Well, if you can't beat them...join them! GoogleGuy, can I send you my CV?
I still think that June results were better...buying season or not.