Forum Moderators: open
To cut a long story short: while analyzing the top SERPs for my given keyword, I decided to test the top 3 guys out to see what they were doing.
Firstly, they all had ridiculously high keyword density, yet when I browsed their main sites I couldn't find these keywords for love nor money. When I used Lynx, I found keyword-packed pages with keyword density going all the way up to 18%.
Here is what they were doing:
SITE 1: Used frames to give the spider a keyword-packed page.
SITE 2: Used a redirect to "dummy.html", also keyword-packed.
SITE 3: Used a meta refresh to their main site.
What's the best way to do this? Is any of this considered spam?
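For anyone wanting to check their own pages, the density figure people quote in threads like this is usually just phrase occurrences over total word count, expressed as a percentage. Here's a rough sketch in Python; the tokenization rules and the sample page are my own assumptions for illustration, not any engine's actual algorithm:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Return the density of `phrase` in `text` as a percentage of total words.

    Density here = (words accounted for by phrase matches) / (total words) * 100.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count non-overlapping-agnostic matches of the phrase as a word sequence.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    # Each hit accounts for n words of the page.
    return 100.0 * hits * n / len(words)

# Hypothetical keyword-stuffed snippet: 3 hits of a 2-word phrase in 11 words.
page = "cheap widgets online - buy cheap widgets from the cheap widgets store"
print(round(keyword_density(page, "cheap widgets"), 1))  # -> 54.5
```

A page like the 18% example above would have roughly one phrase-word in every five or six words, which is why it reads so unnaturally to a human doing a hand check.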
All of it would fail a hand check. Your time is better spent creating quality content than stuffing pages. Seriously, it takes more time to screw around with that stuff than you'd ever gain. The true path is more content. More content = more opportunities for long-term referral generation. Stuffing/gimmicks/tricks = more opportunity for month-to-month crying over lost rankings.
It's a zen thing. Focus on the content and forget about Google. When you do that for a while, a strange thing happens - you get more referrals. You just can't get and keep quality long-term rankings without quality content.
I'm not talking about illicit actions, rather content/format variants. Find out what Google likes and give 'er!
Seriously, think about the inference. As many ways as possible...
This philosophical approach will also lead to robust pages that are not readily susceptible to "Google-Flux."
You've heard of semantics? Think 'metasemantics!'
Content keyword/keyphrase maximus. Learn creative linguistics.
Then, content, content, content.
Powerful SEO combination.
I was on the current #1-ranking SEO site on Google this afternoon for SEO-related keyword phrases. Their keyword density was off the charts: 34% in some cases. I did some further research to see if they were perhaps cloaking, and they did not seem to be.
Now, a month ago my assistant and I went through over 1000 pages (8 sites) that we work on daily and reworked the density of every page to fall within the acceptable range talked about around here. Before that we had never really given it much thought, but as we are currently working on a new site competing for one of the most heavily contested words and phrases, we needed an edge. In the past, pages with 30%+ density had been taking the #1 - #3 slots every time, no problem. Now that we have changed - during Dominic - we have in most cases dropped back as much as 10 pages. This is in places where, for the last 6 months, we did very well or outright dominated. So what gives? Does Google really not look very hard at density in the traditional way?
[edited by: aaronjf at 3:47 am (utc) on June 17, 2003]
It's sad to 'see' how poorly many pages are worded. Google LOVES descriptive language!
That's what I mean by metasemantics... "more" or "enhanced" semantic usage. This refers to an overt awareness of language constructs and "metasemantic" possibilities. In fact, this approach might seem foreign to keyword spammers. That's fine by me.
Squeaky clean, built for users, and using language effectively: that's the way to robust indexing and success.
"Meta" can be be defined alternately as beyond; transcending; more comprehensive. While "semantics" can be explained as the meaning or the interpretation of a word, sentence, or other language form.
Without the original topic evolving into a full-fledged discussion of keywords, let's just say that Google, and other engines, can work wonders for pages containing 'rich text.' It is very possible to optimize a page for multiple keyword/key phrase targets. Creative language is key.