Forum Moderators: open
The thread "searching your domain name post Florida" seems to indicate that all these front-page issues could be related to some kind of mess with domain names, i.e. using two or more domains... at least that's my personal interpretation.
In my case there is only 1 domain being penalized BUT the domain consists of exactly 3 words (which incidentally also describe the service) separated by hyphens.
It is NOT a question of an individual page being excluded but rather of an entire search being filtered.
If my service is offering blue widgets in Italy (also the domain name), the search will NOT return any of the 80 pages which ALL have this phrase in the title as well as the H1 and domain name. Adding 1 more word does NOT produce results (blue widgets anyword Italy). However, adding 2 words (for example: blue widgets anyword anyotherword Italy) DOES produce results as before the Florida update.
Perhaps the filter is activated when the search phrase becomes identical or close (in terms of keyword percentage) to the suspect keywords on the page.
If a search for "blue widgets anyword Italy" consists 75% of the suspect keywords (blue widgets Italy - 3 of its 4 words) and does NOT produce results, a search for "blue widgets anyword anyotherword Italy" will produce results because the percentage of suspect keywords in the search phrase is only 60% (3 of 5 words).
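The percentage theory above can be sketched in a few lines of code. This is purely a hypothetical model of the suspected filter - the 70% threshold is a guess placed between the observed 60% (passes) and 75% (filtered) cases, not a known Google value:

```python
# Hypothetical sketch of the "suspect keyword percentage" theory.
# The threshold is an assumption, not a known Google parameter.

def suspect_keyword_ratio(query, suspect_phrase):
    """Fraction of the query's words that belong to the suspect phrase."""
    query_words = query.lower().split()
    suspect_words = set(suspect_phrase.lower().split())
    hits = sum(1 for w in query_words if w in suspect_words)
    return hits / len(query_words)

def filter_triggered(query, suspect_phrase, threshold=0.7):
    # Guessed threshold: the filter fires when the overlap is "too high".
    return suspect_keyword_ratio(query, suspect_phrase) >= threshold

print(suspect_keyword_ratio("blue widgets anyword Italy",
                            "blue widgets Italy"))              # 0.75
print(suspect_keyword_ratio("blue widgets anyword anyotherword Italy",
                            "blue widgets Italy"))              # 0.6
```

With a threshold anywhere between those two ratios, the 4-word query would be filtered while the 5-word query would show the old results - matching the behavior described.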
Hmmm...I hope what I just said is clear.
Just a theory, but I feel very sure that it is a question of keywords and not pages which are the focus of Google's anti-spamming campaign.
Yep, but that's not what's new in this update; Google has always used DNS records (in fact they had a notoriously slow cache of them for years) - here's a thread from October 28: Quick Google DNS Update. [webmasterworld.com] - the other thread I refer to in the last post of that thread is now in the supporters forum.
Here's a few earlier ones, concerning how 301 redirects are followed:
August 21 (#2): Moving From One Webhost to Another... [webmasterworld.com]
August 25 (#37): Changing WebHost Service [webmasterworld.com]
Before mid-October they were pretty slow at recognizing DNS changes - somewhere around mid-August they became very fast at following 301 redirects, and around mid-October they became lightning fast at DNS changes as well. DNS changes were literally picked up by Googlebot the same or the following day.
Around both these times we also saw threads about missing index pages or "bad incoming links" - Googlebot picked up misconfigured redirects, and sometimes these could harm the original site. A bug - the problem was solved eventually, afaik.
September 26: Duplicate content - a google bug? [webmasterworld.com]
Also, I should say that the people at Google have been working on improving duplicate detection (at least on a small scale/testing). In the beginning it could have some unfortunate side effects, as it was being combined with the deepfreshbot's "fresh" emphasis, e.g. here:
October 9 (#5): Web Site Copying [webmasterworld.com]
Naming of deepfreshbot, june 13 (#43): [webmasterworld.com...]
And of course, sometimes there's a natural, but unexpected explanation to it all, as here:
October 4: Penalized site only with some keywords, is it possible? [webmasterworld.com]
I guess that the missing index page issue we see now is something like what we saw then. I believe it is a side effect of other factors in the update - it might be a (granted: somewhat successful) attempt at eliminating a certain kind of (duplicate) spam.
Now, I should say that on numerous occasions since around June, members of this forum (including myself) have been almost begging Google to do something about the spammy SERPs. We did get a message around May/June that we had to be patient - I can't seem to find it now, but this thread is good for getting a feeling for how these things work. It's generally not an individual "per site" thing; Google prefers a general algo that works across all sites:
June 12 (#4): Some Q&A answers [webmasterworld.com]
Post #5 of that thread is highly relevant now i'd say. Please note that post #4 says : "Those algorithms may take longer to get right..." so, my best guess is that someone is working on something somewhere.
/claus
AM I BEING PARANOID? Maybe I need to take a big vacation. LOL
"maybe I am being a little paranoid"
Relax - there are many of us in the exact same boat.
thousands of us!
don't know why, but it is really across the board.
For those untouched, I guess they are either doing something right, or Google just hasn't reached them yet.
I have noticed that the number of dropping segments (topic areas) continues to grow daily.
A buddy of mine lost his site today.
That post was not intended as a general one - it was just a reply to the post by Trawler. For some people these things might cause problems, but i don't think this affects all - quite a few members do not own/operate a lot of domains. (otoh, quite a lot do that as well)
Read post #65 by steveb here - what he mentions are real world pre-Florida SERPs, i've seen something quite similar myself (although i'm not sure it was the same industry): [webmasterworld.com...]
Such cases may sound extreme to the more, say, "whitehat" as well as to newbie webmasters, but the former "algo" in fact encouraged this behavior. I don't think it's easy to counter this without affecting a range of other sites that are similar in some way, although they are not the same, perhaps not even commercially interesting.
Now, something happened, and some instances of spam disappeared. Not all kinds of spam did. Some variations of the type steveb mentions are even there still; i'm still fighting a small three-domain "fortress" having six places in top 10 for a few sets of keywords. So, i think something will still need to be done against that particular flavour of spam.
Also, some perfectly legit sites seem to have suffered. And then a lot of other things happened at the same time. It's not easy to say exactly what is what here.
/claus
In my niche, when I do an advanced search for "widget insurance" using any of the following filters, I get the same or similar results, much like the pre-Florida SERPs.
intitle:widget insurance
allintitle:widget insurance
allintext:widget insurance
allinanchor:widget insurance
-fufufu widget insurance
+the widget insurance (or any other common word that might be used with widget insurance like "quote" for example)
An explanation of what all of these filters do can be found here
[google.com...]
The designers of the advanced search functions of Google clearly intended these filters to be an aid to users in finding better, more specific, results. In each case the results they produce are sites that have been dropped from the standard Google search.
FWIW The same results are also delivered under the Directory tab.
When I repeat this in a "category" which is unaffected by the Florida effect using the term "widget clubs" there is little difference between the unfiltered standard search results and the advanced search results.
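Sid's comparison can be scripted so the operator variants don't have to be typed by hand. A small sketch, assuming the standard Google search endpoint (`/search?q=`); "widget insurance" is just the placeholder phrase from the post:

```python
# Generate the advanced-operator query variants described above, as
# Google search URLs, for manually comparing filtered vs. unfiltered
# result pages. The operator list mirrors the post; the endpoint is
# the standard Google search URL.

from urllib.parse import quote_plus

def query_variants(phrase):
    """Return the plain query plus the operator-prefixed variants."""
    return [
        phrase,
        f"intitle:{phrase}",
        f"allintitle:{phrase}",
        f"allintext:{phrase}",
        f"allinanchor:{phrase}",
        f"-fufufu {phrase}",   # exclude a nonsense word
        f"+the {phrase}",      # force-include a common word
    ]

def search_url(query):
    return "https://www.google.com/search?q=" + quote_plus(query)

for q in query_variants("widget insurance"):
    print(search_url(q))
```

Opening each URL and eyeballing the top results against the unfiltered search is then enough to spot the pattern Sid describes.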
Still searching for answers.
Sid
For our particular 3-keyword phrase, pre-Florida SERPs showed sites that actually offered their own branded products matching the search terms.
The one thing almost all of those sites had in common was that they were all listed in the Google Directory. The only 3 sites (that offer a product described by the 3-keyword combination) that survived the Florida update are NOT in Google's directory.
Yet they have the same 3-keyword phrase that all of the pre-Florida sites used. They are optimised much the same way that everybody else was. Each site is unique in that they all offer a different competing product for the same industry. None of them use affiliates or have done any major cheating like doorway pages, etc...
Anybody else seeing this? Are any of the remaining sites listed under your specific Google Directory category? Are any of the remaining relevant sites listed there?
Not silly at all. Anchor text is still the #1 algo ingredient (or at least it is an important one). A site can rank just fine based solely on anchor text pointing at the site. This has been true before.
Words on a page matter more now, but Yahoo.com wouldn't need to have "yahoo" on its main page to rank first for a "yahoo" search. It would be first by a wide margin simply from its anchor text.
I haven't seen this, and if it is so then Google is seriously broken. If being in the Google Directory were considered at all, a listing should benefit a site in the SERPs. One thing the ODP tends to do well is not list duplicate content sites and doorway pages set up just for search engines. Penalizing sites simply for having an ODP listing would reduce the relevance of the SERPs.
Hissingsid
Here's an interesting thing which may add to this debate.
In my niche when I do an advanced search for "widget insurance" using any of the following filters I get the same or similar results which are much like pre-Florida SERPs.
intitle:widget insurance
allintitle:widget insurance
allintext:widget insurance
allinanchor:widget insurance
If I do all of the above searches for my main keywords then I come up at #1-3... pre-Florida results. No surprise there.
However, I did a search for
allinlinks: keyword1 keyword2
allinlinks: keyword1 keyword2 keyword 3
allinlinks: keyword1 keyword2 nonsenseword
allinlinks: keyword1 keyword2 unrelatedkeyword
I get the ole "Your search - allinlinks: keyword keyword - did not match any documents."
It seems like this allinlinks filter is not working. The only word I could find that showed any results at all was "google" - and it showed only 108 results! I have to imagine that there are more than 108 pages with links pointing to Google ;-)
I think this is significant. Does anyone else?