it goes to page 1, then 2, then back to 1, and now it's on 3.
looking deeper into things, i get really frustrated.
one of our competitors is on page 1 for said keyword.
the page that is listed has the following:
- 70 mentions of the keyword in the meta tags
- 26 mentions of the keyword in alt tags
- 3 mentions in the title tag
- 117 mentions of the keyword in the on page text, 24 of them linked to other parts of the site
now i thought that creating a page like that would be frowned upon.
also, what i don't like is that the top 2 results for the keyword are youtube videos.
Also, position 3 in the serps has 106 mentions of the keyword on the page listed.
so the question is, how on earth do you win the battle when there are sites using what i have always thought to be bad seo practices?
youtube videos getting really good rankings.
yes they are good sites, i just read the webmaster tools again, and it's this point i think is relevant.
"Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?""
mentioning the keyword 106 times within a few paragraphs, i think, falls foul of this.
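Just to put a number on that complaint: keyword density is usually measured as the keyword's share of all words on the page. Here is a rough sketch (my own illustration, not from any particular SEO tool) of the arithmetic for a single-word keyword:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of all words on the page, as a percentage."""
    # Tokenise crudely on letters/digits/apostrophes, lowercased.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# 106 mentions in roughly 1,000 words of copy works out to about 10% density,
# which is far beyond what most people would call natural writing.
page = ("printer " * 106) + ("word " * 894)
print(round(keyword_density(page, "printer"), 1))  # → 10.6
```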
I was looking at a site yesterday with hundreds of keywords stuffed below the footer - it's number one for a lucrative and sizeable UK search term. Has been for years.
If their meta has 70 keywords in total, then it's a bit overdone. But in some cases it's acceptable: printers are a good example - you have 100s of defined titles: Color Printers, Laser Printers, etc. These add up.
The 24 anchor text links are not a big issue. The real question is: are their links relevant? Does the user get sent to the right unique content, or are they pointing all these links at one page? If the competitor sends users deeper into the site with relevant content, then it's fair.
As you're being told - non-HTML media (video, downloads: PDFs, Excel files, Word docs) all count. So videos are weighted. My advice - Don't just 'live with it' - get an action plan to include these items.
>So videos are weighted. My advice - Don't just 'live with it' - get an action plan to include these items.
what seogio said. videos are going to define next year's serps.
Quit worrying about high limits on keyword density.
In summary, bad SEO isn't bad SEO if they are ahead of you in the rankings. Instead of finding ways to report websites that aren't (perhaps only in your mind) following the webmaster guidelines, figure out what you aren't doing right. If you can't beat 'em, then join 'em. That means: get a blog, write 2-paragraph articles on each page, and you're going to get top rankings. It's really frustrating to hear a webmaster complaining about somebody else's website when in fact the target market may very much enjoy that website. See, not everyone looking at websites is a webmaster. We can't say what is nice, pretty or convenient. We see things differently. You see a mess of keywords, but I bet the consumer doesn't.
the point isn't whether or not keyword stuffing is the reason they are number one. The fact is that it should count against them and obviously doesn't.
....yes they are good sites, i just read the webmaster tools again, and it's this point i think is relevant.
If they're "good sites," it's possible that there are enough mitigating factors to outweigh the keyword stuffing. A site that passes the sniff test in other respects may be viewed more charitably than, say, a made-for-AdSense scraper site or a thin-affiliate site with 100,000 computer-generated pages and 10,000 reciprocal links from sites of no intrinsic value. Why? Because in "grey area" situations (such as the number of times a keyword is used on a page) it makes sense to look at the overall picture. That's what a human reviewer would do, and if a search engine's algorithm can replicate that kind of judgment through the use of different measurement factors and statistical probability, then good for the search engine.
Tricks are only tricks when they work. White-on-white text is unlikely to be a working trick. You might as well penalise
<meta name="robots" content="RankNumber1" />