Forum Moderators: open
Speaking from personal experience, I have a couple of "clean" sites that went down the tubes this last update. I also have a few "not-so-clean" sites that are doing fine--in fact, better than ever.
Nobody really seems to be able to figure out the real algo changes Google made during the Florida update. I personally suspect the keyword text in links pointing to your site has something to do with it, but it's not the whole story.
The fact is, so many clean sites are getting penalized now that SEOs don't know what to do... and when we don't know what to do, we experiment. Experimentation inevitably takes some of us down the road of "black-hat" optimization.
My point is that folks at Google may believe they have reduced the amount of spammy sites in their index, but my assertion is that this is only temporary. New, even spammier sites are going to be popping up like mushrooms!
There are so many holes in this that within 3 months the move to make AdWords more profitable (which is what the update was really about, don't kid yourself) will completely backfire.
But by then Google will have 2-10 billion in cash, so who will care? Everyone will get new office chairs and raises, and long-term employees (won't mention names) will most likely have a half million in stock. There will be such a high at the 'plex that no one will notice.
I have to add something else: Yahoo has been doing this for a long time. Looking at Yahoo's results (and the servers they get results from), Yahoo has already asked that the knob be turned back a bit. I'm almost positive they are still asking for the knob to be turned back even more, judging from the daily results I'm seeing (BTW, on Yahoo I'm now in the top 20, and beyond 120 in Google), as they are changing daily.
My point is, take it from another company's point of view: in the long run this update will hurt. Yahoo knows this.
I disagree with...
"you take your competitor who's having a field day because he didn't put keywords in the few links he had, and you give him a few new links with the keywords in them (on fresh botted links pages) and he's gone."
My competition has been doing this to me for months now, and I still rank #1 for my key phrase.
Take "prescription drugs": if that were your sample of the index, it would not be the case.
Take "really weird purple television sets": Google would not consider this to be a money word, so the result = not affected.
This is probably the case with your niche. I have about 8-10 keywords affected by this; there are another 100+ (non-money words by Google's standards) not affected at all.
We were not affected by the smaller keywords. The problem is that many niche markets train their consumers to use particular keywords. THOSE are the keywords that are getting slammed/filtered.
Local, state, and national associations, peers, vendors within these markets use this common terminology to find what they are looking for or to reference something specific. In many of the cases, the common public would never even put the three words together.
I can take getting knocked down a few spots. But to kick a site out of the top 1000 and suggest it is not relevant is just insulting. Especially when it is following the guidelines set forth by Google.
Dave
I'm willing to try and help anyone that does, but I'm not wasting time playing rhyming games :o)
Dave
As far as help goes, unless you have the red button....
cloaking is not "black hat".
It most certainly is in Google's eyes! Some get caught and some don't, but that doesn't make it not black hat.
The advice to cloak is very irresponsible, IMO.
Dave
...you give (your competitor) a few new links with the keywords in them... and he's gone. Already did this and it works.
Your statement is highly suspect and makes me doubt, discount, and ignore everything else you have to claim.
Said with much affection for the membership: I understand any frustration you may be going through, but let's stick to the facts and not start making things up.
Dave, you have a good head on your shoulders!
[edited by: coosblues at 7:34 am (utc) on Dec. 6, 2003]
deanril, Google doesn't take links into account within the database that fast. I tried explaining this to a client today in an un-Florida-related matter. The work I had provided for her last year produced unexpectedly quick results, which were accentuated by Yahoo's simultaneous switch to Google. She expected the same thing, regardless of prior consultations explaining that's not how it works.
Freshbot and link credits are parts of different processes that occur at different time intervals. There is no connection between the two.
Of course, I also had to explain to her why her wedding business (not one I had worked on), as well as 90% of the rest of the local wedding businesses, were removed from the results because they weren't relevant any more for local wedding business searches. The last thing she said was, "This can't be good for the searcher, can it?" I said, "No, that's why I think it will change."
So I lied.
I'll say this: there are fresh backlinks today.
Keywords in text WILL get you the boot for that keyword (or keywords).
Google has changed a lot, what applied in the past does not apply today.
[edited by: Brett_Tabke at 4:06 pm (utc) on Dec. 7, 2003]
[edit reason] please leave the personal stuff at the door when you visit webmasterworld. [/edit]
In addition, Google does indeed pay attention to the way websites link to other websites. Go to [google.com...] and enter this search phrase "miserable failure" without quotes. Click "I'm feeling lucky" if you like. At least at the moment I posted this message, if you clicked "I'm feeling lucky" you ended up on the bio for President George Bush on the official White House website, yet the words "miserable" and "failure" are nowhere in the source code for the page. Why does this happen? Because some pranksters decided to link to George Bush's bio using that key phrase many times on different websites and blogs.
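The mechanism GG describes can be sketched in miniature. This is a toy illustration (not Google's actual algorithm): each inbound link contributes its anchor text to the target page's term profile, so a page can come up for words that never appear in its own source code. The link data and URLs below are hypothetical.

```python
from collections import Counter, defaultdict

# Hypothetical inbound links: (anchor_text, target_url)
links = [
    ("miserable failure", "whitehouse.gov/president/gwbbio"),
    ("miserable failure", "whitehouse.gov/president/gwbbio"),
    ("miserable failure", "whitehouse.gov/president/gwbbio"),
    ("official bio",      "whitehouse.gov/president/gwbbio"),
    ("miserable failure", "example.com/other-page"),
]

# Build an index: anchor term -> Counter of target URLs, weighted by link count
anchor_index = defaultdict(Counter)
for anchor, target in links:
    for term in anchor.lower().split():
        anchor_index[term][target] += 1

def rank(query):
    """Score each URL by how many inbound-link anchor terms match the query."""
    scores = Counter()
    for term in query.lower().split():
        scores.update(anchor_index.get(term, Counter()))
    return [url for url, _ in scores.most_common()]

print(rank("miserable failure")[0])  # -> whitehouse.gov/president/gwbbio
```

The bio page itself contributes nothing here; it ranks purely on what other pages say when linking to it, which is exactly why the pranksters' campaign worked.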
I think it might be too soon to consider cloaking, as who knows exactly what you should be cloaking for at this stage? Yes, gbot, I know, but what should the content of those cloaked pages be, really? It's all good with random content, Markov chains and such, but it reminds me very much of the talk of "de-optimizing" pages. I'm not sure any of this is really good unless you know the exact effects of what you're doing, and really... who does at this moment?
Also, this change is ...special (for lack of a better word), so a few (even very obvious) "tech solutions" might have slipped under the radar in some areas whereas similar ones have been caught in other areas. And the changes are not over yet; perhaps they'll even be continuous for some time. Who knows, outside the 'plex?
Anyway, some controlled experiments might yield insights, and I'm always for insights, but I have my doubts about efficiency beyond luck and coincidence at this moment (although, of course, I might be wrong). Personally, I'm more on the "optimize your sites" path (a.k.a. whitehat or whatever), although I know a lot of you are very sceptical here, and I can't really explain why I see good results for some pages/keywords and not others - at least not as "general rules" or "one-size-fits-all".
One thought though: to do cloaking successfully you have to control your technology and exactly what you display to whom and on which URLs - I've seen dropped whitehat examples that just don't do this very well... a.k.a. the duplicate issue again.
/claus
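The "random content via Markov chains" idea mentioned above can be sketched minimally: learn word-to-word transitions from a seed text, then walk the chain to emit text that is statistically plausible but machine-generated. This is an illustration of the technique being discussed, not an endorsement of cloaking; the seed text and function names are my own.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the seed text."""
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=10, seed=0):
    """Walk the chain from a start word, picking a random successor each step."""
    random.seed(seed)
    word, output = start, [start]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break  # dead end: no word ever followed this one in the seed text
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

seed_text = "the quick brown fox jumps over the lazy dog and the quick dog sleeps"
chain = build_chain(seed_text)
print(generate(chain, "the", length=8))
```

Every adjacent word pair in the output occurred somewhere in the seed text, which is why such pages can look superficially like prose while carrying no real content; it is also why claus's caution applies, since nobody outside the 'plex knows how such pages are being scored right now.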
Ha GG.
Someone has just linked to 2 of my best sites (which have been untouched and have actually improved since Florida) on approx 20 of their sites, across over 50,000 of their pages, with the "same anchor text", and would you believe it, I don't show anywhere for those keyword searches. These links are only a week old and some are already showing up in this backlink update.
Methinks YOUR sample is too small. ;)
cabbie, about a week ago on the supporter's forum you were saying that Google had knocked out what the title called a "nest" of your sites (double digits, if I recall correctly), and you were wondering how someone could have tied them all together. I don't know what your sites are, but is it possible that your sample has confounding factors? If you'd like to send in examples, feel free, though.