Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Remove keywords from URL?

         

Tonearm

8:54 pm on Dec 2, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've been reading about Google applying penalties for having keywords in the URL. Is it no longer ideal to have keywords in the URL? Would you include them in new pages?

dailypress

9:21 pm on Dec 2, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I would. Google Maps seems to think so too, and it makes sense, right?

I can understand penalties for jamming keywords into Long-URLs-that-make-sites-look-like-spam, but otherwise I'm going to keep doing what I've been doing for years!

Where have you been reading this? A forum or blog, or a trustworthy source?

SEOPTI

9:52 pm on Dec 2, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, just read the patent about -950 reranking.

".. whether the occurency is in a title, in a URL, in the body, sidebar .. "

ogletree

10:00 pm on Dec 2, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google does not like anything done to excess. Design your site for users. It all comes down to getting more links and good content designed for users attracts links.

BeeDeeDubbleU

10:46 pm on Dec 2, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Yes, but good content designed for users is possible on a website with keywords in the domain name.

See Matt Cutts view here,
[google.com...]

TheMadScientist

10:48 pm on Dec 2, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Yes, just read the patent about -950 reranking.

".. whether the occurency is in a title, in a URL, in the body, sidebar .. "

I'm not sure whether you're trying to say a keyword in the URL is bad, but the patent referred to is actually about spam detection based on the actual frequency of a phrase and related phrases, and the ability to predict a 'standardized' frequency of the phrase and related phrases contained in a document.

In case I'm not saying exactly what I mean in English above: They're looking for the presence of a phrase and related phrases in a document within a 'threshold' of what's considered to be a 'normal' occurrence rate for the phrase and related phrases.
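To make the idea concrete, here is a toy sketch of "frequency within a threshold of a normal occurrence rate." This is not Google's actual algorithm; the expected rate and tolerance are invented numbers purely for illustration.

```python
# Illustrative sketch: flag a document when a phrase's observed frequency
# falls far outside the rate considered "normal" for that phrase.
# All numbers here are hypothetical.

def phrase_rate(text: str, phrase: str) -> float:
    """Occurrences of `phrase` per word of the document."""
    words = text.lower().split()
    if not words:
        return 0.0
    return text.lower().count(phrase.lower()) / len(words)

def looks_stuffed(text: str, phrase: str,
                  expected_rate: float = 0.05, tolerance: float = 3.0) -> bool:
    """True when the phrase appears far more often than its expected rate."""
    return phrase_rate(text, phrase) > expected_rate * tolerance

natural = ("Our shop sells widgets and many other garden tools, "
           "plus free shipping on orders over fifty dollars. ") * 5
stuffed = "widgets widgets buy widgets cheap widgets best widgets " * 5

print(looks_stuffed(natural, "widgets"))  # False
print(looks_stuffed(stuffed, "widgets"))  # True
```

The point is that a keyword merely being present (in a URL or anywhere else) is not what such a test detects; an abnormal rate of the phrase and its related phrases is.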

I keep laughing about this, because the only way I can see Google actually penalizing a site (or page) solely based on the presence of a keyword in the URL is like this:

Google Exec: How can we maintain our lead in searches conducted over the other search engines?

Google Engineer: We can penalize sites if they use the keyword in the URL.

Google Exec: What will that do?

Google Engineer: Well, if a person searches for a keyword or 'key phrase' and we've penalized all pages using that keyword or phrase in the URL (because it seems to indicate they have what the person was searching for), the person will have to search again, since they won't easily be able to find a page that's actually about the phrase or keyword... Then people will always conduct more searches when they visit our search engine!

Nearly every page from php.net and here on webmasterworld.com (just to mention a couple) would be penalized if the occurrence of a keyword or phrase in the URL invoked a penalty...

Tonearm

11:45 pm on Dec 2, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



+ it makes sense right?

That's the biggest thing for me. It makes sense to include information about what a page is about wherever possible.

zehrila

12:44 am on Dec 3, 2009 (gmt 0)

10+ Year Member



Adding a keyword to a URL can be helpful. Often the URL structure makes it obvious to the user what the page is about. For example, domain.com/laptops/asusxyz-widget.html, where widget is the keyword, explains the nature of the page; on the other hand, a URL without the keyword, such as domain.com/laptops/asusxyz.html, is a bit confusing. The page could be about anything!

I think it's the excessive use of keyword-based anchor text in internal linking that could raise flags. E.g., if your site is about widgets, your home page has 30 links to internal pages, and all 30 links have anchors like blue widgets, green widgets, gray widgets, black white widgets, etc., it might lead to an over-optimization penalty.
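One way to audit the pattern described above is simply to count how many of a page's link anchors repeat the same keyword. The HTML below and any "acceptable" ratio are made up; this only shows the measurement, not any known penalty threshold.

```python
# Count what fraction of a page's anchor texts contain a given keyword,
# using only the standard library. Example HTML is hypothetical.
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    """Collects the text content of every <a> element."""
    def __init__(self):
        super().__init__()
        self.in_a = False
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_a = True
            self.anchors.append("")

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_a = False

    def handle_data(self, data):
        if self.in_a:
            self.anchors[-1] += data

def keyword_anchor_ratio(html: str, keyword: str) -> float:
    p = AnchorCollector()
    p.feed(html)
    if not p.anchors:
        return 0.0
    hits = sum(keyword.lower() in a.lower() for a in p.anchors)
    return hits / len(p.anchors)

page = ('<a href="/1">Blue Widgets</a><a href="/2">Green Widgets</a>'
        '<a href="/3">Contact Us</a><a href="/4">Gray Widgets</a>')
print(keyword_anchor_ratio(page, "widgets"))  # 0.75
```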

Tonearm

12:59 am on Dec 3, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think it's the excessive use of keyword-based anchor text in internal linking that could raise flags. E.g., if your site is about widgets, your home page has 30 links to internal pages, and all 30 links have anchors like blue widgets, green widgets, gray widgets, black white widgets, etc., it might lead to an over-optimization penalty.

What about a page linking to products like this:

Big Red Widget
Small Blue Widget
Long Hairy Widget

It sounds like that would incur the same penalty you're talking about, but it really is necessary to have "Widget" in each of those. Big Red? Long Hairy?

zehrila

2:31 am on Dec 3, 2009 (gmt 0)

10+ Year Member



Tonearm: If you're talking about external links, then you probably don't have much choice over anchor text. People usually link to such pages using anchor text similar to the meta title of the targeted page.

If it's internal linking, I have seen sites doing perfectly fine with excessive use of a similar keyword for internal linking, such as "Big Red Widgets", "Small Blue Widgets", "Long Hairy Widgets", where Widgets is the keyword. To me it's over-optimization, a little bit of screaming, especially when all of your internal links have the keyword "Widgets" in their anchor text. You can probably use "Big Red" or "Long Hairy" for your anchor text, but on the particular landing page you can set a meta title which truly indicates the nature of your page, which is Big Red Widgets; you can also use it in the H1 tag (not that they are of too much value anymore) and somewhere in the text as well. Finally, get a few links to that particular page targeting your keyword. Someone correct me if I'm wrong.

[edited by: zehrila at 2:42 am (utc) on Dec. 3, 2009]

steveb

2:39 am on Dec 3, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"I've been reading about Google applying penalties for having keywords in the URL."

Wherever you read such nonsense, stop reading there.

tedster

3:45 am on Dec 3, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



One of Google's motivations for the new breadcrumb hierarchy in the SERP (replacing the URL) seems to be replacing overly long or uninformative URL structures. That approach would reward economical use of keywords in the file path.

Also, let's make sure we are all talking about the same thing here:

URL - the complete address for the page
Domain Name - the part of the URL up to the first single forward slash
File Path - the part of the URL that follows the first single forward slash

I say this because MANY people are pretty loose about which term they use, especially switching the first two around.
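Those definitions match how standard URL parsers split an address. A quick sketch with Python's stdlib (the example URL is made up):

```python
# Splitting a URL into the parts defined above, using the standard library.
from urllib.parse import urlparse

u = urlparse("https://www.example.com/widgets/blue/service.html")
print(u.netloc)  # www.example.com            (domain name)
print(u.path)    # /widgets/blue/service.html (file path)
```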

Tonearm

3:01 pm on Dec 3, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've been reading about Google applying penalties for having keywords in the URL.

Wherever you read such nonsense, stop reading there.

It's right here:

[webmasterworld.com...]

tedster says:

I'd say the point is don't use the same keyword in the anchor text for many different internal links. Some repetition is certainly natural, but too much can hurt you.

tedster

4:21 pm on Dec 3, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Tonearm, that post is about anchor text, not about the URL.

Tonearm

4:59 pm on Dec 3, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



tedster, you're right, I apologize. I got mixed up. My previous post should have gone like this:

I've been reading about Google applying penalties for having keywords in the URL.

Wherever you read such nonsense, stop reading there.

It's right here:

[webmasterworld.com...]

SEOPTI says:

You should worry about keyword rich internal navigation and keyword rich URLs. This is the perfect mix for a -950 re-ranking.

tedster

7:26 pm on Dec 3, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That's why it's called OVER-optimization penalty. The same keyword in the URL and the internal anchor text, and doing that for many pages on the site with the same single word showing up in many instances - that can be playing too close to the fire. It is a matter of degree.

BradleyT

7:28 pm on Dec 3, 2009 (gmt 0)

10+ Year Member



They probably meant keyword stuffed URLs.

zehrila

10:53 pm on Dec 3, 2009 (gmt 0)

10+ Year Member



Ted: Do you think it's fine to have URLs like /.../category/fluffy-Blue-Widgets.html with the anchor "Fluffy Blue", while the meta title could be "Download Fluffy Blue Widgets"?

TheMadScientist

12:34 am on Dec 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I personally think the best, easiest answer to this question is simply to review the sites listed in the SERPs. If they use the idea you're considering for your linking / naming convention, then IMO it's relatively safe. If you don't see the linking / naming convention you're thinking about, then IMO it's a bit more risky...

I think this is one of those fairly easy questions to answer just by observation, and it will probably turn out that SEOPTI's post is either being completely misunderstood, taken out of the context it was intended for, or simply incorrect. In any case, it's a single post and can easily be evaluated by searching and reviewing what the sites you see are doing.

steveb

12:46 am on Dec 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"It's right here"

And it's just FUD. Using keywords in URLs is friendly to users, search engines, and yourself. This forum is called webmasterworld.com/google/ not webmasterworld.com/aysts/ for the benefit of everyone. There is no Google downside to keywords in the URL, and plenty of positives for Google and humans. And more to the point, the most well-constructed, well-ranked websites do it this way.

TheMadScientist

12:49 am on Dec 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@ steveb

Shhhhhh... Don't go telling everyone how easy the question is to answer. Make them at least search a bit. K? Thanks! :) And don't even mention the bread crumbs, or the links you click to get into the forums, alright? I mean seriously, every forum about Google has the word Google in the link. That's a penalty right?

Hissingsid

10:23 am on Dec 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've read, and seen videos of, Matt Cutts saying that keywords in the URL give them clues about what the page is about and are a good thing. If you were designing a filing system, you would have a structure that goes from more general to more specific, with names that give you a clue about what is in the directories and what is in the files.

So /widgets/blue-widgets/widget-service.html seems to me to be a helpful structure and in my experience works very well on Google.

Keyword stuffing of URLs, anchors, navigation and content is a different matter IMO. I don't think there's an over-optimization penalty as such, but there is a keyword stuffing (stupidity) penalty. You can stupidly highlight your stupidity by stuffing keywords into all of the key areas of your site; then, when you stuff the content of a file, boom, you are stuffed!

Cheers

Sid

PS Some folks still think that Florida was about an over optimisation penalty. IMO this was a myth propagated by people who were not bright enough to realise what was really going on.

TheMadScientist

4:01 pm on Dec 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Yeah, Sid, if (when) people realize a search engine is really just a massive storage and retrieval system, things usually make more sense. (It's like your 'hard disk', only bigger.)

Think about what you would call folders & files on your computer and that's probably a good idea for URLs. That's all the Internet really is, so it stands to reason you should name things accordingly.

IMO this doesn't make sense:
widgets-blue/fuzzy-blue-widgets/super-fuzzy-blue-widgets-for-sale.html

Because IMO that's not anything most people would call the file or directories on their computer...

widgets/blue/fuzzy/super-fuzzy/for-sale.html
AND
widgets/blue/super-fuzzy/for-sale.html

Make quite a bit more sense to me, even though the word fuzzy is repeated in one.

Personally, I might shorten your URL a bit depending on the exact situation:
/widgets/blue/service.html

Of course I might leave it very similar to what you had, depending on the exact situation. ;)
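The directory-style naming convention discussed in the last few posts can be generated mechanically. A small sketch (the category names are examples only, not a recommendation for any particular site):

```python
# Build hierarchical, non-repetitive URL paths of the kind discussed above:
# one keyword per path segment, going from general to specific.
import re

def slugify(text: str) -> str:
    """Lowercase; collapse runs of non-alphanumerics into single hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def build_path(*segments: str) -> str:
    """Join slugified segments into a file-system-style URL path."""
    return "/" + "/".join(slugify(s) for s in segments) + ".html"

print(build_path("Widgets", "Blue", "Service"))
# /widgets/blue/service.html
```

Because each level of the hierarchy carries one idea, no segment needs to repeat the keywords of its parents.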

SEOPTI

5:31 pm on Dec 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Today a site which I use for testing purposes went into the -950 box. This definitely happened because of keywords in the URL. I was able to nail it down to the keywords in the URL after months of testing, believe it or not.

Hissingsid

5:46 pm on Dec 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I was able to nail it down to the keywords in the URL after months of testing, believe it or not.

;-)

TheMadScientist

6:00 pm on Dec 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Would you mind giving us an example of the URL(s)?

Also, how many URLs are using 'essentially the same' pattern both as a count and percentage of the site?

There seem to be quite a few sites in the index with a keyword or two in the URL(s) that are not penalized...

steveb

12:45 am on Dec 5, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"believe it or not"

Of course I don't believe it, especially since I'm on a page with keywords in its URL and it isn't penalized.

But now anyone else reading this will make up their own minds.

tedster

2:29 am on Dec 5, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I was able to nail it down to the keywords in the URL after months of testing, believe it or not.

The phrase-based spam detection patent [appft1.uspto.gov] mentions these factors, among others:

grammatical or format markers, for example by being in boldface, or underline, or as anchor text in a hyperlink, or in quotation marks... whether the occurrence is a title, bold, a heading, in a URL, in the body, in a sidebar, in a footer, in an advertisement, capitalized, or in some other type of HTML markup.

Based on that patent, and my experience with -950 sites that people brought to me - the threshold is triggered by a combination effect. That is, measuring single factors in isolation builds an assumption into the test that is no longer the way that the ranking algorithm is constructed.

This means working with a single factor can produce correlations that appear to be "the cause", but the full cause is actually having the combined score cross a threshold. This is further complicated by the fact that the threshold is regularly recalculated - so sometimes your site crosses the threshold and sometimes the threshold is what moved, not your site. And as a side note, I expect that Caffeine will allow more frequent threshold calculations.

Certainly some factors in the combined factor score are weighted more heavily than others. I'd say the heaviest weights probably do go to keywords in the anchor text, URL, and title.
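That combined-score idea can be sketched in a few lines. Every weight, score, and threshold below is invented for illustration; the point is only that no single factor trips the penalty on its own.

```python
# Toy model of a combined-factor threshold: a weighted sum of per-factor
# keyword-repetition scores, penalized only when the sum crosses a line.
# All weights and numbers here are hypothetical.

WEIGHTS = {"anchor_text": 0.4, "url": 0.3, "title": 0.2, "body": 0.1}

def combined_score(factor_scores: dict) -> float:
    """Weighted sum of per-factor scores, each assumed to be in [0, 1]."""
    return sum(WEIGHTS[f] * factor_scores.get(f, 0.0) for f in WEIGHTS)

def over_threshold(factor_scores: dict, threshold: float = 0.6) -> bool:
    return combined_score(factor_scores) > threshold

# Heavy keyword use in the URL alone does not cross the line...
print(over_threshold({"url": 1.0}))                      # False (score 0.3)
# ...but combined with heavy anchor-text use it does.
print(over_threshold({"url": 1.0, "anchor_text": 1.0}))  # True (score 0.7)
```

This also illustrates why single-factor testing can mislead: holding everything else constant, removing URL keywords can drop a site back under the threshold, even though the URL was never the sole cause.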

Sid, given all those moving parts, did your testing take at least some of them into account?

Hissingsid

10:11 am on Dec 5, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Tedster,

It was SEOPTI who said that, I was being ironic with my ;-) while thinking exactly what steveb said.

I have a hypothesis that goes back to Florida and all that talk about semantics (which I fully accept) and that is this:

If you have a page/site that is keyword stuffed in all of those areas mentioned in the patent you point to, AND the page/site is not semantically rich around that topic, then you will get a -950. BUT it is much harder to cross the threshold and get the penalty if your page is semantically rich on that topic.

Cheers

Sid

tedster

5:46 pm on Dec 5, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



it is much harder to go through the threshold and get the penalty if your page is semantically rich on that topic.

Yes! And that effect is a result of the phrase-based indexing methodology, too. If the semantic richness (co-occurring phrases) goes way too low or way too high, that's when it kicks in. The "way too high" is something that can nail mix-and-match scraper pages.

When those patents were first published, it seemed clear that copy writing should be much more natural than many SEOs were advocating. Even before that time I noticed that co-occurring phrases could boost some rankings. Those phrases also add a powerful new dimension to the long tail traffic.

[edited by: tedster at 7:13 pm (utc) on Dec. 5, 2009]

This 34-message thread spans 2 pages.