After the Florida update I thought of penning down a summary of it, apart from the cries.
Here is what I hate to see, but it's true:
Linking Strategy
------------------
You should not have one single keyphrase as anchor text for all backlinks; try to have around 4 to 5 keyphrases (40% using the main keyphrase, the next 30% some other, ...). Also try to link to different pages instead of just your homepage. Do not do reciprocal linking; you will be penalized for that.
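For what it's worth, the percentages above are my guess, not a documented threshold. A quick way to audit your own anchor-text spread is a simple tally; the backlink list here is hypothetical:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Tally anchor texts and return each phrase's share of all backlinks, in percent."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {phrase: round(100 * n / total, 1) for phrase, n in counts.items()}

# Hypothetical backlink profile: one phrase dominating is the pattern to avoid.
backlinks = ["blue widgets"] * 8 + ["buy blue widgets"] + ["Acme Inc"]
print(anchor_distribution(backlinks))
# {'blue widgets': 80.0, 'buy blue widgets': 10.0, 'acme inc': 10.0}
```

If one phrase comes out near 100%, that is the profile I suspect the new filter is catching.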
H1 tags
---------------
You should not have more than one H1 tag; do not follow the H1, H2, H3 sequence. Have H1 and H4, and that's all.
Alt tags
-----------------
I recommend not using them right now, the reason being that not all SEO tricks should be applied on the same page. So leave it; low priority.
Title
-------------
Keep the title and meta tags early on, but I will say do not use the title and H1 together.
Keyword density
--------------------
Between 3% and 10%.
Links + title + 5% keyword density + nothing else = #1
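Nobody outside Google knows how density is actually measured; a minimal sketch of the usual hand calculation (occurrences of the phrase times its word count, over total words) looks like this, with a made-up page:

```python
import re

def keyword_density(text, phrase):
    """Percent of the page's words accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count every position where the phrase occurs as a contiguous word run.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100 * hits * n / len(words)

page = "cheap flights to cheap destinations with cheap flights daily"
print(round(keyword_density(page, "cheap flights"), 1))  # 44.4 -- way past any safe band
```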
Thanks
Aji
The Florida update was not a lot about your pages. It was mostly about Google's own pages - specifically that little thing known as the search box. What people type in there is now treated a little differently, and it's actually pretty impressive.

Why is it pretty impressive?

By all means, it was a minor tweak in terms of page ranking and weighting. In terms of focus and understanding of search patterns, it was a major leap.

Do you mean a commercial leap? I cannot post evidence of this, although it does exist at a website the mods would not want me linking to.
I'm not sure if Claus is suggesting that weights can be negative as well as positive or zero?
Personally, I don't see why not.
And a "penalty" might be a good name for a negative weight :)
CONTENT CONTENT CONTENT
Was it Brett who always says, "It's better to get 1 hit from each of 50 pages vs. 50 hits from one page"?
>Google now uses stemming technology. Thus, when appropriate, it will search not only for your search terms, but also for words that are similar to some or all of those terms. If you search for "pet lemur dietary needs", Google will also search for "pet lemur diet needs", and other related variations of your terms. Any variants of your terms that were searched for will be highlighted in the snippet of text accompanying each result.
So it is better to theme your pages rather than target them at keyword combinations. Maybe that's what GG meant when he said that someone searching for "cheeseburger" would most likely be interested in results about making one rather than buying one? So there is no penalty; your list of competitors just got much bigger.
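The stemming Google describes can be pictured with a toy suffix-stripper. This is only an illustration - the suffix list is made up, and nobody knows what stemmer Google runs (the Porter stemmer is the classic published one):

```python
def crude_stem(word):
    """Toy suffix-stripping stemmer, standing in for whatever Google actually uses."""
    for suffix in ("ational", "ically", "ingly", "ing", "ary", "ies", "ed", "s"):
        # Only strip when enough of a stem would remain.
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[:-len(suffix)]
    return word

# "dietary" and "diets" both collapse to the stem "diet", so a query
# containing one form can match pages written with the other.
print(crude_stem("dietary"), crude_stem("diets"), crude_stem("searching"))
# diet diet search
```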
I would think that by working for links on sites with themes similar to yours, you should be OK.
I do not completely understand why they have taken over that page yet, but I think I am getting close to an answer. What I have noticed is that I am showing up for a broader range of searches, some of which get six times the search traffic of the one I once targeted.
The problem is that my site is only 'related' to the search, not specifically targeted for the search. This is a good thing for name recognition though so I'm not complaining.
In the searches I make, I have not seen poor results; however, I take it on trust that many searches are producing daft results. I'm reasonably confident that I understand why now (in general terms). It would appear that, either as a result of a bug or by deliberate design, pages may be dynamically classed as spam depending on the search terms. This is simply not a strategy that can last - it's ludicrous - one of the most stupid ideas I've ever encountered.
For example, a vague search might put a relevant site at #1. A more precise search could see that relevant site classed as spam and therefore placed nowhere. Think about it for a moment. Does that sound sensible to anyone? GoogleGuy, does that sound sensible to you? You would need to be certifiably insane to think that was a good idea. So either Google will drop the idea/fix the bug or users will eventually stop using Google.
It therefore follows that making wholesale changes to sites and backlinks at this time is premature. It is natural that people will want to look for small tweaks to get back into the SERPS so here are my suggestions.
1) Change your titles slightly (but keep them sensible for users, other search engines and post-Florida Google). Perhaps use alternate spellings, or synonyms.
2) If you have H1 text at the top of the page, treat it as you would the title. Perhaps use CSS to create the same visual effect by other means.
3) If you have keywords stuffed at the bottom of your pages (or elsewhere) clean it up.
Beyond that you may be wasting your time or even worse - doing harm. Time would probably be better spent trying to get more backlinks (with neutral anchor text) and perhaps creating more content (on topic or off-topic). After all, it has been suggested that high pagerank is helpful in the current climate. Certainly, these should help in the long term.
Kaled.
It appears that all you need are several sites you can link to them from, with their keywords in the anchor text, and they will be dropped along with the rest of us who have keyword-rich anchor text.
I have a site that does not link out but has around 80 inbound links with anchor text, 30 of them showing in Google. This site has gone from #5 and #6 to nowhere on a competitive commercial phrase.
My new linking strategy -- forget about finding webmasters to link to your own sites; get them to link to your competitors instead! ;)
1) Change your titles slightly (but keep them sensible for users, other search engines and post-Florida Google). Perhaps use alternate spellings, or synonyms.
I have often wondered if Google compares various cached versions of a page to determine exactly how much tweaking is done to the title and other tags. Excessive tweaking may indicate the type of SEO that Google is trying to eliminate, whereas changing body text might be more indicative of changing/adding content. I have noticed that many of the sites that have suffered recently have done continual title tweaking. Some of the sites I still see at the top have had relatively consistent titles for quite some time (possibly indicative of a stable theme/concept). Just some food for thought. I doubt that now is a good time to be doing any tweaking (which may send a red flag to Google that SEO is being done).
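Purely speculative, but if Google did compare cached copies, the check could be as cheap as a string-similarity score between consecutive snapshots. A sketch with hypothetical titles (using Python's difflib, not anything Google has confirmed):

```python
from difflib import SequenceMatcher

def title_churn(snapshots):
    """Average dissimilarity (0 = identical, 1 = unrelated) between consecutive titles."""
    if len(snapshots) < 2:
        return 0.0
    diffs = [1 - SequenceMatcher(None, a, b).ratio()
             for a, b in zip(snapshots, snapshots[1:])]
    return sum(diffs) / len(diffs)

stable  = ["Acme Widgets - Handmade Widgets"] * 4
tweaked = ["Cheap Widgets", "Buy Widgets Online", "Widgets Sale!", "Best Widget Deals"]
print(title_churn(stable) < title_churn(tweaked))  # True -- the tweaked site churns more
```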
C
Mix up your anchor text, and exchange like a fool. When the next wave hits it will be nice to have links from all of those sites that fit your keyword category.
Kaled.
PS
This would require considerable CPU time that could be better used.
Tweaking does not equal SEO
I said EXCESSIVE tweaking. I see SEOs' sites changing page titles after every new fresh tag, trying to get higher/better rankings. Think about it... if Google is theming as some have speculated [webmasterworld.com], and a theme, by definition, is consistent, then changing titles daily MAY reveal inconsistency. It may be just one of many red flags that Google could use to cause other (consistent/non-tweaked) sites to rank higher.
I know that some tweaking would be expected depending on the forces that drive a particular industry, and ever-changing marketing strategy. It just seems like Google could decide that too much tweaking (any of the tags) is bad. Who knows LOL. It was just a thought. Everything here is so speculative, but certainly interesting to talk about. ;)
The funny thing about my "web design countryname" search is...
1) 4 out of the 10 sites at the top have descriptions drawn from ALT tags, which are stuffed heavily. It's just funny how the ALT tag is being considered for the top positions. The first two SERPs have the exact same descriptions, word for word, so possibly one is a spam copy of the other, but it's almost as if Google found the right combination of words and listed them at the top. The sites I am talking about are PR5 and 6; mine is 7, and I am listed on the 4th page.
2) Secondly, one site has 5 listings on the first page itself: the subdomain, the parent domain, an internal page, and 2 other spam/doorway pages. It's disheartening to see other people get away with that.
sparticus and marcia, as a web designer I am experiencing the same.
I started a thread over here [webmasterworld.com] concerning this topic.
- Chad
I am not giving definite advice in this uncertain realm; I was just putting down what I saw and what I have observed.
I know five sites for a key phrase, out of a million, is far too few to draw a solid conclusion, but I am sure of one thing: this exam has changed. As some of the geeks have mentioned about the weight:
1) 1% to a% ------- fail (you will not get good ranks)
2) a% to b% ------- optimum (you are there, close to #1 if not #1)
3) b% to 100% ----- again you fail (over-SEO tactics)
I am very, very sure about this; I have gone through enough sites to conclude it, though a% and b% are yet to be determined. The sites that are in the top ranks do not use all the SEO tricks. I will say they use only a few; some of them do not even have meta tags like keywords and description, and some do not have key phrases in their titles.
These are just a few attempts to work out the theories, but I know no one can be sure about this; all we can do is watch and guess.
Aji
This might also help to explain why big brands with their anchor text survived.
MQ
Linking Strategy
------------------
You should not have one single keyphrase as anchor text for all backlinks; try to have around 4 to 5 keyphrases (40% using the main keyphrase, the next 30% some other, ...). Also try to link to different pages instead of just your homepage. Do not do reciprocal linking; you will be penalized for that.
reciprocal linking: can't agree on that one
several keyphrases: agreed
But there's another thing I'd like to add.
The negative effects of Florida are much harsher with English/international keywords.
Our German keyphrases aren't affected at all. Maybe due to a dictionary, but most likely due to fewer occurrences.
Have you tried a serial number search recently? Just entering some product code? Searching for "review soundwidget 5.1"? A technical/specific one like "font-face" or/vs. "font-family"? Or something more general/topical like "apache url rewrite"? Or an ambiguous one like, say, "turkey christmas" (vs. "christmas turkey")?
It seems to me that even without entering special operators, the search box can often distinguish whether the terms should be interpreted very specifically or very broadly/topically (like suggesting alternatives). As for the broad results, think in terms of, say, 10 "pre-Florida related searches" (being performed, filtered and ordered) for every word in a "post-Florida" query, with no apparent loss in query time.
>> Do you mean a commercial leap?
No, I do mean a leap in better understanding of search behavior. I don't know jack about the economy of Google, but I think it's better than mine. Let's not go there, please.
>> weights can be negative as well as positive
Doesn't really matter - you can accomplish the same with entirely positive numbers, even without zero; all you have to do is rank something higher than something else (or you could even make the whole scale negative). It could be binary as well as decimal, octal, percentages, fractions, rules and equations, whatever... I really don't know; I suppose that's the "rocket science" part of Google's inner workings: optimizing for maximum efficiency.
Perhaps my definition of "penalty" is unclear... To me, a penalty is not when you rank badly. It is not even when you rank well, then you do something (or you do nothing, but the algo is changed), and suddenly you rank badly. A penalty would be if you ranked worse than you were supposed to, as dictated by the current rulesets/scores/whatever.
As in, say: two guys are bad spammers - okay, make that a boy and a girl, and make them wear nice white hats instead. Both do exactly the same things in terms of the factors that are considered for ranking, but one ranks significantly differently from the other.
However, I agree that "broad matching" can have some similarities to a "penalty", in that in the extreme two such sites could rank very differently - not caused by penalties, but due to "filtering" to obtain broad matches (for lack of better words). I don't think Google has shown us anything but a quick glimpse of this technology yet.
So, what to do about linking now?
AjiNIMC, if I did so, I'm sorry I caused this thread to go off topic. Here are my two cents at the moment:
So, how to beat the broad match?
That's a tough one. Basically, the same things that made you rank high before Florida will also make you rank high now... although in some cases it will be for another query. And, as plasma points out, the broad match database (if such a thing exists) has not been translated yet, it seems. It's mostly an English thing (as in language, not geography).
So, there's a choice: go for good ranking in the broad match, or keep your good ranking for an exact search. Then again, perhaps you can do both. For some competitive searches I've done post-Florida, I have seen clearly commercial sites in the top SERPs for standard "broad" searches - but still, one industry's "broad" is another's "specific".
Still, the next update might even be different once again....oh, sorry about the repetition.
Btw, I think the knob has been turned a little towards more exact matching lately - the new algo seems even less "different" now than a few days ago, but I might be wrong here; I've only done 10 searches or so today.
/claus
I think trying to understand the concepts of eigenvalues and eigenvectors, as well as the web graph, can be valuable.
I think you are talking about the Second eigenvalue of the Google matrix [webmasterworld.com]. That would help you understand artificial linkage patterns better, but IMHO that should have been applicable before Florida too.
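For anyone who wants to play with the idea: PageRank is the principal eigenvector of the Google matrix, found by power iteration, and the second eigenvalue governs how fast that iteration converges. A toy sketch on a made-up three-page graph (the real matrix covers billions of pages, and this ignores every refinement Google has added):

```python
import numpy as np

def pagerank(adj, d=0.85, iters=50):
    """Power iteration on the Google matrix of a tiny link graph.
    adj[i][j] = 1 means page i links to page j."""
    A = np.array(adj, dtype=float)
    n = len(A)
    out = A.sum(axis=1, keepdims=True)
    # Row-normalise; dangling pages (no outlinks) spread their rank evenly.
    M = np.where(out > 0, A / np.where(out == 0, 1, out), 1.0 / n)
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - d) / n + d * (r @ M)
    return r

# Pages 0 and 1 link reciprocally; page 2 links only to page 0.
ranks = pagerank([[0, 1, 0], [1, 0, 0], [1, 0, 0]])
print(ranks.argmax())  # 0 -- page 0 collects the most rank
```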
The triggering of the current filter can best be seen in commercial searches. Not all commercial searches are affected, but many are.
You should not have one single keyphrase as anchor text for all backlinks; try to have around 4 to 5 keyphrases (40% using the main keyphrase, the next 30% some other, ...). Also try to link to different pages instead of just your homepage.
I do not have much problem with that :)
Do not do reciprocal linking; you will be penalized for that.
I have only seen one paper by Krishna Bharat, which was about filtering based on reciprocal linking. But there are many sites using reciprocal linking that are perfectly all right, so I would like to disagree with this point.
I would not like to comment on the rest because, like many here, I believe that many SERPs are in a mess and will not stay this way if Google is concerned about its QC.
I made this observation because I noticed many of the top directories now listed for my phrases are just my link partners. Also, one of my client's competitors remained listed for a money term, much to the confusion of the client. At first I couldn't figure out why they were ranked #1, but then I noticed that their links actually went to other sites, giving them the appearance of being a directory.
I think Google might even take into account whether these outgoing links are located on the same ip.
It seems that Google is filtering sites out based on whether they lack outward links, rather than on page factors.
Trust me allanp73, out of 15 sites only three had outward links and the rest had,
I am on my way to do more on this topic but surely nothing can be 100% accurate.
Aji
SEO is NOT rocket science, and *in my opinion* it is something that has grown from SEO companies playing on site owners' fears. You have all seen the ads: "We will get you into the top ten on Google for your search term..." What a load of rubbish - there are only 10 positions in the top ten! Most SEO business is founded on fear, lies and ignorance.
It appears to me that many here are expecting Shakespeare from Jack and Jill. Forget all about SEO and focus almost entirely on writing good, easy-to-read, relevant text. This alone WILL have Google sit up and take notice of the page.
Rather than post here how bad the SERPs are, use that energy and time to write content for your site(s).
Despite what is written elsewhere, DO use the Title and description tags to give a brief concise highly relevant description that would entice a HUMAN not a robot.
Make a site map that uses HTML only.
Do have a H1 heading that again gives a brief concise highly relevant description that would entice a HUMAN not a robot.
Do have sub headings.
Do link to relevant sites if you think your visitors would be interested in the site.
Do have sites link to you. Do NOT worry about PR.
Google's ultimate aim is to return highly relevant results for humans. If you aim to do the same, you and Google are singing from the same song book.
Build your site for humans and they + the robots will find you; build your site for robots and only the robots will find you.
So the word of the day is humans :o)
Dave
Rather than post here how bad the SERPs are, use that energy and time to write content for your site(s). [DID THAT]
Despite what is written elsewhere, DO use the Title and description tags to give a brief concise highly relevant description that would entice a HUMAN not a robot. [DID THAT]
Make a site map that uses HTML only. [DID THAT]
Do have a H1 heading that again gives a brief concise highly relevant description that would entice a HUMAN not a robot. [DID THAT]
Do have sub headings. [DID THAT]
Do link to relevant sites if you think your visitors would be interested in the site. [DID THAT]
Do have sites link to you. Do NOT worry about PR. [DID THAT]
[RESULT: Went from #1 to nowhere]