Forum Moderators: open
I did several things to make the site more user-oriented rather than Google-oriented, something I should have done from the start.
Excellent statement!
As for the deoptimisation, the keyword is now repeated about 5 times excluding the metas, down from about 20-25.
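For anyone wanting to take the same kind of count on their own pages, it can be scripted. This is only a rough sketch (the tag-stripping regex is naive and the sample page is invented), assuming "excluding the metas" means counting occurrences in visible body text only:

```python
import re

def keyword_count(html: str, keyword: str) -> int:
    """Count keyword occurrences in visible body text, ignoring the head
    (metas, title) and any script/style content. Naive regex stripping."""
    body = re.sub(r"(?is)<head\b.*?</head>", " ", html)
    body = re.sub(r"(?is)<(script|style)\b.*?</\1>", " ", body)
    text = re.sub(r"(?s)<[^>]+>", " ", body)
    return len(re.findall(re.escape(keyword), text, re.IGNORECASE))

# Made-up example page:
page = ("<html><head><meta name='description' content='widgets'></head>"
        "<body><h2>Widgets</h2><p>We sell widgets. Widgets ship fast.</p>"
        "</body></html>")
print(keyword_count(page, "widgets"))  # 3 (the meta occurrence is excluded)
```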
Now, my Title is still very similar to before, just slightly re-arranged, my Description is more descriptive, and I re-wrote much of the text to sound more natural. I also removed the header.
We are back in the top 5 for our money phrase and in the top 2 for several of our more popular phrases.
Am I happy I de-optimized? Yes. But truth be told, there are things on my site that would not be there if I was completely unaware of SEO. I want to be listed well on Google, so there are still aspects of my site that are there because Google says they should be.
I no longer have an H1 tag. I am not saying the H1 tag is the cause of the disappeared sites, though.
HTML is a structural markup language, and it makes sense to use "H" tags to identify headlines, section heads, subheads, etc. I can't imagine that Google would look askance at "H" tags except where they've been misused (e.g., for whole paragraphs or pages of body text).
I see a lot of what you see, but... I see keyword spamming works if you don't have that text in your inbound links. The moment your inbound links match repeated words/phrases, you get hit. If I get inbound links to pages already doing well without them, the pages then fall like a stone... pages with no inbound links but repeated words/phrases do well....
...the moment your inbound links match repeated words/phrases you get hit.
A lot of implications in that statement, and some things left unsaid. On a related note, could some folks have experienced an uptick from deoptimization simply because they unlinked the inbound text to repetitive text? The effect would be the same as Google Bombing, and Google bombing still works.
Setting a threshold at which repetitive anchors lose their effect would be a good way to cut down on aggressive optimization, but it would be ineffective against Google bombing, as there isn't matching text on the target page.
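If such a dampening filter exists, nobody outside Google knows its shape, but the threshold idea can be sketched as a toy scoring function. The threshold and dampening factor here are pure guesses for illustration, not anything observed:

```python
def anchor_score(matching_anchors: int, threshold: int = 10,
                 damp: float = 0.25) -> float:
    """Toy model of an anchor-text dampening filter (hypothetical numbers):
    identical anchor phrases count fully up to a threshold, and each one
    beyond it contributes only a damped fraction."""
    if matching_anchors <= threshold:
        return float(matching_anchors)
    excess = matching_anchors - threshold
    return threshold + excess * damp

print(anchor_score(5))   # 5.0  -> under the threshold, full credit
print(anchor_score(50))  # 20.0 -> 40 excess anchors worth only 0.25 each
```

Under a model like this, the 51st identical anchor is nearly worthless, while varied anchor text never hits the cap.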
Anyone disagree that an anchor text dampening filter is in place?
Sidenote: I didn't deoptimize anything, but my stuff was never heavily optimized anyway, so none of that seems to be affecting me whether a filter is in place or not. I'm interested in what others have experienced though.
pages with no inbound links but repeated words/phrases do well....
This would be contrary to Google's stated policy but I have seen it too. At times I think the inbound links are actually hurting my site. Most of my inbound links however are not text based but image based with alt tags. It's the 3-word phrase in the alt tag that is no longer showing up in the SERPs. I wonder if that has anything to do with it...
HTML is a structural markup language, and it makes sense to use "H" tags to identify headlines, section heads, subheads, etc. I can't imagine that Google would look askance at "H" tags except where they've been misused (e.g., for whole paragraphs or pages of body text).
True, but in the past I used it to repeat my keywords. The average Joe surfer who comes to my site has no idea what a <h1> tag is. My site makes it abundantly clear what it is about without using them.
You can have a relevant and interesting site without them, and I am not being punished by Google for leaving the header tags out.
True, but in the past I used it to repeat my keywords
You mean you used them incorrectly to try to boost your rankings? Spamming.
I resisted the urge to do large-scale "de-optimization". I am into laser changes and testing the results these days.
I have one domain with two sub-domains, all of which had top rankings for high-traffic terms pre-Florida. Post Florida, all three were down around 200-400 in the SERPs.
One subdomain has a two-keyword target: keyword1, keyword2. This phrase is on the page where you'd expect it to be and in the incoming anchor text from many different places.
After Florida hit, I decided to test the theory that the keyword in the title tag was "the problem". Such a theory seemed crazy to me, yet I thought I was seeing evidence of it in the SERPs. I changed only the title tag text of the above-mentioned subdomain, replacing keyword1 with an unrelated adjective. No other changes have been made to this page or the incoming anchor links. No other changes have been made to the site.
So what's the result? Well, the page whose title tag I removed keyword1 from is now at #514. The other subdomain that I left alone is faring better at #92. The main domain index page is back in the top 10, bouncing between #8-11.
Remember, I didn't change the two pages that are doing better.
My conclusion is that deoptimizing is not the right thing to do. I think the changes in the algo can be accounted for by things like stemming, semantics, and possibly an overdue PageRank update which have been recently theorized by a number of people.
I appreciate your post because it provides good information, and in particular you focused on title tags. If we consider applied semantics to be part of the new algo then we could look into word associations and see how adding them to title tags changes the SERPs.
I removed a series of tiny images which I thought might be considered "spam" (they were simply pointing to pages which opened in javascript and therefore would not otherwise have been indexed).
The results: No change.
If all the webmasters who were affected by the Florida update report on the specific changes or "deoptimization" techniques they employed, we might be able to collectively deduce or formulate some solid theories about the new algo. I think it's important to make one change at a time in order to be fairly sure of what is responsible for any changes in SERPs, much like you did by removing the keyword from the title tag. Obviously there are many variables at play here, but every piece of information helps us understand the process better.
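Even a crude change log keeps the one-change-at-a-time discipline honest. A minimal sketch (the dates, changes, and ranks below are invented for illustration, not anyone's real data):

```python
from datetime import date

# Hypothetical experiment log: one change per entry, with the rank
# observed before and after, so effects are never conflated.
experiments = [
    {"date": date(2003, 12, 1), "change": "removed keyword1 from title tag",
     "rank_before": 320, "rank_after": 514},
    {"date": date(2003, 12, 8), "change": "removed tiny image links",
     "rank_before": 514, "rank_after": 514},
]

for e in experiments:
    # Positive delta = moved up in the SERPs, negative = dropped.
    delta = e["rank_before"] - e["rank_after"]
    print(f"{e['date']}: {e['change']}: {delta:+d} positions")
```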
I've seen many pages with Google AdSense appearing in the top 20 which were not there before the Florida update.
Another possible pattern in good versus bad ranking post-Florida is the PR of your links pages. From what I've seen, if your links page(s) have a higher PR, this possibly helps boost your ranking; if, however, your PR is low (2 or below), and especially 0, this possibly contributes to your page ranking worse post-Florida.
This is just my opinion based on observations over the past 3 weeks or so.
I haven't been able to test and verify this yet, as it takes time to add quality incoming links to your site, but I believe that the type of links pointing at you, and not only the PR, will influence SERPs for your site.
For example, it may be better to have an incoming link from a site which is on theme and not necessarily high in PR than to have a high PR site pointed your way which has little to do with what Google perceives your theme to be.
For example, it may be better to have an incoming link from a site which is on theme and not necessarily high in PR than to have a high PR site pointed your way which has little to do with what Google perceives your theme to be.
Maybe, but are we talking about the theme of a site or the theme of a page? Let's say that FORBES or PC MAGAZINE includes your doughnuts site in the food category of its "best of the Web" or "top 100 sites" directory. The inbound link from FORBES or PC might not be "on theme" to the degree that a link from don-loves-doughnuts.org might be, but one might expect it to carry more weight (which it would, if PageRank were a major factor).
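One way to make that trade-off concrete is a toy weighting of PR against topical match. Everything here is invented for illustration (the 2^PR scaling, the theme multiplier); it is not Google's formula, just a way to state the question:

```python
def link_value(pr: int, on_theme: bool, theme_boost: float = 2.0) -> float:
    """Toy model: PR is widely believed to be roughly logarithmic, so
    treat each PR point as a doubling, and give on-theme links a
    hypothetical multiplier."""
    return (2 ** pr) * (theme_boost if on_theme else 1.0)

# Under these made-up numbers, the high-PR off-theme FORBES-style link
# still outweighs the on-theme PR4 doughnut-site link:
print(link_value(4, True))   # 32.0
print(link_value(6, False))  # 64.0
```

Change the assumed multiplier and the answer flips, which is exactly why the site-theme vs. page-PR question is hard to settle from the outside.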
WAS OUR SITE EVER OPTIMIZED?
Good quality text for READERS - never did any SE tweaking to the content. Guess that's why Google reinstated us to such a nice place.
REAL ESTATE SITES STILL GONE
We do have a few real estate clients, and it still appears like "G" has a penalty on ALL agents, etc. for geographic/real estate searches. This kinda stinks - wonder what's going to happen.
You are right. It seems like de-optimizing might have worked for other industries, but in the real estate world all attempts seem to fail. I tested de-opt., and my site briefly showed in the top 40, but it was short-lived. I noticed one of my competitors also reappeared, but they made no changes. However, both of us are gone, out of the top 1000. I'm going to test the stemming theories, but I don't have much hope.
I still notice that the filter is only applied to money terms, so this may be why people see their sites return. What is considered a money term may change with Google.
One thing that really bothers me with Google these days is searching for something and finding 8 of the top 10 listings being message board posts. Other results in the top 10 are also often a page of results from some other search engine I've never heard of. Speaking as Joe surfer instead of a webmaster, I'd say 40%-50% of the time I cannot find exactly what I'm looking for in Google search results anymore. Usually I end up clicking on a relevant AdWords advertisement instead, which brings us back to the theory of increasing AdWords revenue prior to the IPO (which, by the way, I hear is coming out at around $30.00 a share to start!). And the rich get richer.....