Forum Moderators: open
After the Florida update I thought of penning down a summary of it, apart from the cries.
Here's what I hate to see, but it's true:
Linking Strategy
------------------
You should not have one single keyphrase as anchor text for all backlinks; try to have around 4 to 5 keyphrases (40% using the main keyphrase, the next 30% some other, ...). Also try to link to different pages instead of just your homepage. Don't do reciprocal linking, you will be penalized for that.
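Just to illustrate the split described here, below is a rough Python sketch that tallies each anchor phrase's share of a backlink profile. The keyphrases and counts are invented for the example, not taken from any real site:

```python
from collections import Counter

def anchor_text_distribution(backlinks):
    """Given a list of anchor-text strings, return each phrase's
    share of the total backlinks as a percentage."""
    counts = Counter(backlinks)
    total = len(backlinks)
    return {phrase: 100.0 * n / total for phrase, n in counts.items()}

# Hypothetical backlink profile: 40% main keyphrase, 30% a variant, etc.
links = (["blue widgets"] * 4) + (["cheap blue widgets"] * 3) + \
        (["widget shop"] * 2) + (["great widget resource"] * 1)
print(anchor_text_distribution(links))
# "blue widgets" accounts for 40.0 (percent) here
```

Run it against your own list of inbound anchors to see whether one phrase dominates.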
H1 tags
---------------
You should not have more than one H1 tag, and don't follow the H1, H2, H3 sequence. Have H1 and H4 and that's all.
Alt tags
-----------------
I recommend you don't use them right now, the reason being that all the tricks of SEO should not be applied on the same page. So leave it, low priority.
Title
-------------
Keep the title and meta tags early in the page, but I will say don't use the title and H1 together.
Keyword density
--------------------
Between 3% and 10%.
Links + Title + 5% keyword density + nothing else = #1
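For what it's worth, "keyword density" is usually just occurrences of the phrase over total words. Here's a rough sketch of one common way to measure it; the sample sentence is invented, and density definitions vary (this one counts every word of each matched phrase):

```python
import re

def keyword_density(text, phrase):
    """Rough keyword density: words belonging to occurrences of the
    phrase, divided by the total word count, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words) if words else 0.0

sample = "Blue widgets are great. Our blue widgets ship fast. Widgets rule."
print(round(keyword_density(sample, "blue widgets"), 1))  # about 36.4
```

A page with that sample's density would be far above the 3-10% band suggested above.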
Thanks
Aji
I'm not sure about the H1 - H4 suggestion. H tag use is standard web design and good practice. As long as you don't repeat keywords till the cows come home, or use all H1 tags, you should be fine. I still believe that proper use of H tags can only improve your ranking, and that includes the use of H2 and H3.
Don't do reciprocal linking, you will be penalized for that.
Not true! I do this all the time with relevant sites, and so do many other sites I watch.
You should not have more than one H1 tag, and don't follow the H1, H2, H3 sequence. Have H1 and H4 and that's all.
Not true!
I recommend you don't use them right now, the reason being that all the tricks of SEO should not be applied on the same page. So leave it, low priority.
I use these extensively, as they help users.
Keep the title and meta tags early in the page, but I will say don't use the title and H1 together.
Not true!
between 3% to 10%.
Don't even think about keyword density, just write your text for humans to read.
Of course, all this is ONLY relevant if you want good placement.
Dave
[edited by: Dave_Hawley at 4:15 am (utc) on Dec. 2, 2003]
Sorry, I don't get your point?
....Oops, yes I do, sorry!
Dave
I like the start so far, but to tell you the truth, the only thing I have changed is my title.
I am using alt tags, H1 and H2, reciprocal links, and a KWD of around 20%. What I have found is that for my site's main target set of keywords, I am not in the SERPs at all.
BUT if you mix up the words, separate them into smaller combinations, etc., I'm on page 1 for all of them.
The only reason I changed my Title tag was to keep it from matching my H1 tag too closely. I just got crawled today so I'll let you know what happens in a day or so.
The general consensus seems to be that Google is targeting over-optimization of commercial keywords. So, the sensible question to ask is, "How does Google recognise this?"
The probable answer:
1. A dictionary of commercial search terms (adwords?)
2. Links, particularly reciprocal links, that have these keywords
3. A correlation between 2 and on-page optimization factors.
4. An absence of natural-seeming links.
Number 4 is the important one, I'd guess. 4 is what makes a page seem to be nothing but spam.
Personally, I wouldn't change sensible on-page optimization - H1 tags, URLs, etc. There is no reason why Google wouldn't want to be told what a page is about.
The solution? It's only a theory, but if I was heavily affected at the moment (which I'm not), I'd be looking for non-reciprocal links inserted into the middle of body text WITHOUT good anchor text. This means...
BLOGS.
Yep. People here hate blogs, but at the moment I'd be asking everybody I know who has a blog to link to me - not comment spam, and not permanent links, but the sort of link that comes in the middle of a post with anchor text like "really useful article" or "good info here" or "this is really funny". And probably not to home pages, but deep links.
Preferably from Blogger blogs. There's a reason Google bought them: natural-seeming links from real people, to indicate that a site is more than spam.
Just a thought, based on logic rather than any evidence.
Alt tags
-----------------
I recommend you don't use them right now, the reason being that all the tricks of SEO should not be applied on the same page. So leave it, low priority.
Alt tags are very useful HTML. Average users like them because they can mouse over and see additional link/image information, and used appropriately they can really add value to your site. Why would anyone avoid using them? I seriously doubt Google is applying any filter to alt tag usage (when used properly).
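If you want to see where alt text is missing on a page, a few lines of Python with the standard-library HTML parser will flag it. The sample page and filenames below are made up for illustration:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect <img> tags that are missing an alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "?"))

# Invented sample page: one image with alt text, one without.
page = '<p><img src="logo.gif" alt="Widget Co logo"><img src="deco.gif"></p>'
checker = AltChecker()
checker.feed(page)
print(checker.missing)  # prints ['deco.gif'] - the image needing alt text
```

Feed it your real markup and fix whatever it lists; that helps users whether or not it helps rankings.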
Alt tags
-----------------
I recommend you don't use them right now, the reason being that all the tricks of SEO should not be applied on the same page. So leave it, low priority.
Utter utter nonsense....
or did we just pick it up
I think some of us would like to *think* we "just pick it up", but to me it's quite clear that Google will always keep one step ahead of SEOs. Personally I see SEO as pointless and very superficial. I focus mainly on HO (human optimizing), sticking to basic HTML and constantly adding linked content.
The major thing that I am seeing is that you need fewer keywords on your page. But at the same time, how are you supposed to write a very informative page without mentioning your topic more than X number of times?
I do not *think* Google is overly concerned with keyword density. Just write text for humans to read and you will get the right ratio.
Dave
To answer this thread question - "What should be new linking strategy after florida?"
I think trying to understand the concepts of eigenvalues and eigenvectors, as well as the webgraph, can be valuable. I recall Mil2k, a senior member here, elaborated on subgraphs quite nicely several months ago. Although it might not be exactly the same phenomenon as Florida, it gives a good idea of the concept of how the filter can be triggered.
It is something like,
Water is good for health, Bacardi is good for relaxation, wine tastes good, apple juice is very nutritious.
Utter nonsense, how can you forget milk, a complete food?
But G says never take them all at the same time, or you will get a bellyache.
Everything is good, but still many are not there because they have overdone it.
Aji
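For anyone curious about the eigenvector talk above: the webgraph idea boils down to something like PageRank's power iteration, where the rank vector converges toward an eigenvector of the link matrix. Here is a toy sketch on an invented three-page graph; nothing in it reflects Google's actual implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power iteration: repeatedly redistribute rank along links
    until the rank vector settles (an eigenvector of the webgraph)."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Everyone gets a small "teleport" share, plus shares from inlinks.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            for q in outs:
                new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Invented three-page graph: A and C link to B, B links back to A.
graph = {"A": ["B"], "B": ["A"], "C": ["B"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # prints B - it collects the most inlink rank
```

The point of the analogy: the math describes how rank flows through link patterns, which is also where pattern-based filters could plausibly hook in.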
==============================
Analyzing around 5 sites might not be sufficient, Aji, unless you hit the right jackpot. If you dig further, you might be surprised to find more discrepancies and inconsistencies in the pattern, and you will be wondering why.
For example: "You should not have one single keyphrase as anchor text for all backlinks; try to have around 4 to 5 keyphrases (40% using the main keyphrase, the next 30% some other, ...). Also try to link to different pages instead of just your homepage."
I'm familiar with some sites that employed similar tactics to those you described above but could not escape the Florida wipeout.
What I have written is after analysing around 5 sites for some very popular key phrases.
Well, it's conclusive then, 5 out of 3 billion is ample ;)
Dave
Copyright © 2003 Client Name ¦ Created by Widget Web Design
..With the words 'Widget Web Design' as the link back to our site.
'Web Design' happens to be part of the company name, but are we supposed to change our linking strategy now because 'web design' is a commercial term?
Any ideas?
[edited by: Dave_Hawley at 8:34 am (utc) on Dec. 2, 2003]
Any ideas?
===============================
Does it hurt you now? If not, I don't think you should worry about it.
If you are still worried about it for the future, you might consider having just the plain text 'Widget Web Design' and then your company URL beside or below it. Just an idea, though.
We were also number nine or something for 'web hosting country name' and we've dropped off for that too.
Interestingly, we're still number one for 'web hosting city name'
*shrug*
I don't think anyone is getting "penalized" for linking structure. Here's my guess...
If you have 10 inbound links to a page, and 9 of those 10 links have the same keywords in the same order, you will only be given "credit" for 2 links.
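That guess is easy to model. Here is a purely speculative Python sketch that caps the credit given per identical anchor phrase; the cap of 2 and the sample anchors are assumptions for illustration, not anything Google has confirmed:

```python
from collections import Counter

def credited_links(anchor_texts, cap=2):
    """Speculative model: credit at most `cap` links per identical
    anchor text; links with varied anchors each count in full."""
    return sum(min(n, cap) for n in Counter(anchor_texts).values())

# 9 of 10 inbound links use the identical phrase, one is different:
# only 2 of the 9 duplicates count, plus the 1 varied anchor.
inbound = ["blue widgets"] * 9 + ["handy widget guide"]
print(credited_links(inbound))  # prints 3
```

Under a model like this, varied anchor text keeps all your links counting, which fits what people are reporting.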
I know of one site that was recently brought to mind because they contacted me to request a link. I said okay and was all set to give them a nice descriptive paragraph. All they wanted was one sentence with a 2-word keyphrase in it. I did some checking, and ALL of the links pointing to their site say the exact same thing. Before Florida they were #2 for the 2-word phrase; they are now at #50. It is not a spammy site, and for those who are wondering, the keyphrase IS once in the title, once in the description, and at a very low KWD.
Does anyone have any evidence that Google is actually discounting/penalising reciprocal links?
Aside from the many other good reasons for sticking with convention, Ink loves "H" tags - do you want to mess up your Ink listings as well as G? Your timing is perfect!
Alt tags are also going to be part of the big accessibility conversion certainly in the UK where the Disability Discrimination Act stipulates that sites should be accessible for all. Is Google going to therefore penalise such a vital component?
That's not to say that keyword stuffing should be allowed - you've just got to be sensible in applying your SEO.
If it's clear that repetition of anchor text is being penalised - and I don't think any of us know for sure that it is - then you change your text, implementing a broader keyword range than before.
One thing is for sure - LINKS ARE GOOD. They've always been good and they always will be, and you have to continue accumulating them. We all have to adjust our practices, and within 6 weeks we will know how to fine-tune our websites.
Until then let the debate continue but please - lets keep it real.
C
In my opinion, Google have sought to attack their problems at the root, rather than cutting branches off. The keyword is the root of any search. In theory this is a great idea; in practice there have been casualties.
If Google are worth their salt, they will fine-tune the filters to let the good sites through. If their only motivation is $$$$, then a lot of webmasters will need to find another way to make a living.
Those were some very specific words. Although these concepts could indeed give some insight into how Google works (as well as a lot of other things), I really don't think that you have to know any mathematics at all to produce a good (ranking) website - and you don't have to be a PhD either.
These are just tools for analysing complex sets of interdependencies. It sounds pretty advanced, and it is, but it will not really help you create that super site, as these tools are used for entirely different purposes.
Go create that super site instead. Don't be afraid of links coming in or links going out; specifically, don't try to control the free flow of linking power. Nobody really can do that 100% anyway in the real world, not even the big household brand names.
Dave_Hawley might be considered an "optimist" or even "SEO ignorant" by some for the things he said earlier, but that's an uninformed view. To me, what he says makes a lot of sense; you just have to dig a little deeper. SEO might not be SEO in all cases. And that is not the same as an SEO penalty.
What they do is assign weights to all kinds of things and then rank pages according to those weights. If your page has the highest sum of weights for some term or phrase, you will simply rank highest for that. And yes, weights can be set to zero as well as to any other number - and they keep changing a little now and then.
As there are a large number of such weights, you will sometimes see that pages that are not "SEO'ed" at all rank just as high as pages that are. Of course they do that for another reason, as there is not one specific way to the top, but many.
(Added: filtering, in turn, is an entirely different matter; unfortunately I have no time for that right now, sorry.)
The Florida update was not so much about your pages. It was mostly about Google's own pages - specifically that little thing known as the search box. What people type in there is now treated a little differently, and it's actually pretty impressive.
By all means, it was a minor tweak in terms of page ranking and weighting. In terms of focus and understanding of search patterns it was a major leap.
With regard to the math: this has never been an equation of second order, so having just one maximum is not really natural anyway.
/claus