Forum Moderators: Robert Charlton & goodroi


Site Wide Content and Usage of Keywords

         

zehrila

1:39 pm on Oct 22, 2009 (gmt 0)

10+ Year Member



Sometimes you want more attention on specific pages from users and search engines; in that case you publish the most important or latest content on almost all pages of your site, either in the footer section or in the sidebar. I have been doing this ever since I started and see no issues, but given Google's changing behaviour, is it feasible to continue doing so?

The other question is about excessive usage of keywords within content. Let's say your site is about widgets and there are tons of widgets, e.g. blue widgets, yellow widgets, green widgets, "game name" widgets and so on. On my home page and my category pages I'm listing them as I mentioned above, e.g. "game name widget", so all the links on my home page and category pages contain such terms, and clicking them leads to the content pages. My question is: is this usage of "widget" at the end of each product listing dangerous? Those are my targeted keywords as well.

Robert Charlton

7:30 pm on Oct 22, 2009 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



My question is: is this usage of "widget" at the end of each product listing dangerous? Those are my targeted keywords as well.

Good question. I've seen this mentioned in some of the -950/end of result "penalty" discussions as a likely reason for pages getting filtered.

At the same time, I'm seeing red widget, blue widget, green widget menus used on many sites with apparently no problems at all.

I've wondered what the difference might be between pages where the repetitions cause problems and those where they don't. I'm also wondering whether perhaps Google might have relaxed its filters, as I'm seeing much less end of result discussion.

It's a question I asked way back in 2002, but which I've never seen satisfactorily answered....

Avoiding excessive repetition in global text links
"Widget" really belongs in every link, but it may be seen as spam
[webmasterworld.com...]

The discussion is now dated, but some considerations still apply.

I've tried to run tests on live sites where this was possible, but results were inconclusive. The concern for me is whether removing "widgets" adversely affects rankings of the level below the links... ie, the pages to which the links are pointing... vs whether keeping them in can create the appearance of overoptimization on the page containing the linking menus.

I would not excessively repeat "widgets" in, say, a breadcrumb trail... but this doesn't clarify the repetitions in menus.

zehrila

7:54 pm on Oct 22, 2009 (gmt 0)

10+ Year Member



It's amazing you could recall such an old thread; good memory. I have been using this ever since I started this site in 2006, which after two years reached up to 100k uniques a day; later it was slapped down to hell. I am now in the process of fixing it and trying to reduce signs of over-optimization. I have no issues removing "widget" from my targeted keyphrase, "keyword widget", on my home page, which contains dozens of similar links, but what I fear is that removing the word "widget" might reduce the relevance, and that keyword might never rank well again.

I have survived this for years, and I can see tons of sites doing this and doing perfectly fine. There are sites in my niche which use this strategy and sites which don't; the sites using this technique are ranking better.

[edited by: Robert_Charlton at 8:11 pm (utc) on Oct. 22, 2009]
[edit reason] exemplified specific [/edit]

TheMadScientist

9:27 pm on Oct 22, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Sometimes you want more attention on specific pages from users and search engines; in that case you publish the most important or latest content on almost all pages of your site, either in the footer section or in the sidebar. I have been doing this ever since I started and see no issues, but given Google's changing behaviour, is it feasible to continue doing so?

I think that's an interesting question, which I believe is addressed by 'boilerplate detection' when the content is static, but I'm not sure how it applies to content which (I'm guessing) changes frequently.

Personally, I've become a fan of the 'empty div' <div id="NewContent"></div> which is filled onLoad="showContent();" using JS... You could even have 'default content' which is static for SEs and those not running JS, but which is replaced with newer or updated content for those who are... This way, the content within the div would most likely (IMO) be discounted as 'boilerplate' and you shouldn't need to worry about it.
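A minimal sketch of the 'empty div' pattern described above; the element id matches the post, but the file paths, link text, and the hard-coded replacement are placeholders for illustration only:

```html
<!-- Default content: what search engines and no-JS visitors see -->
<div id="NewContent">
  <a href="/popular-widgets.html">Popular widgets</a>
</div>

<script type="text/javascript">
// Swap in fresher content for JS-enabled visitors once the page loads.
// In practice this markup would likely be generated server-side or
// fetched from your own backend; it is hard-coded here only as a sketch.
function showContent() {
  document.getElementById('NewContent').innerHTML =
    '<a href="/latest-widgets.html">Latest widgets</a>';
}
window.onload = showContent;
</script>
```

Because the replacement happens client-side, crawlers and no-JS visitors always see the stable default block, which is the part being suggested would be treated as boilerplate.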

zehrila

9:56 pm on Oct 22, 2009 (gmt 0)

10+ Year Member



TheMadScientist: Thanks for shedding some light on this. What I gathered from your post is that having sitewide content could trigger boilerplate detection. To support your argument, Matt Cutts said the same in one of his posts:
We think we do pretty well on detecting boilerplate (e.g. you’re not likely to run into any issues of duplicate content for header/footer type stuff)

But does this mean that sitewide content won't rank?

TheMadScientist

10:04 pm on Oct 22, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My understanding is it basically means: content that is determined to be within 'boilerplate' will not be evaluated as part of the page, for ranking purposes, when determining the point or focus of your page.

IOW: The content you use across a number of your pages does not change the focus of the pages it is present on... Basically, it doesn't count as content on those pages for ranking purposes.

It (IMO / experience) does *not* mean the page containing the full content will have its rankings diminished... It just means the 'boilerplate content' will not 'count' on the pages where it is detected, which is why I think filling the div with some 'standard content' for SEs and those without JS enabled might be a viable solution. IMO you want the content to be detected as 'boilerplate content', if I'm understanding correctly.

zehrila

10:19 pm on Oct 22, 2009 (gmt 0)

10+ Year Member



My intention is to divert users' attention towards content which is either fresh or popular; having such content buried under category listings will not help attract more users. Also, I thought search engines might think: okay, this page appears on a specific set of pages, it might be an important page, so rank it higher.

So, are sitewide affiliate links also detected as boilerplate?

TheMadScientist

10:33 pm on Oct 22, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My intention is to divert users' attention towards content which is either fresh or popular; having such content buried under category listings will not help attract more users. Also, I thought search engines might think: okay, this page appears on a specific set of pages, it might be an important page, so rank it higher.

I would probably be inclined to:

1.) Create a 'fresh & new' page which lists, say, the last ten 'content blocks' used within the 'boilerplate'.

2.) Link to the 'fresh & new' page with a static link preceding 'the boilerplate'. (This link needs to 'pass link juice', so it should be a stock-standard link.)

3.) Create a 'boilerplate' block on each page with static content for SEs & users without JS, but which updates dynamically for the 95% of people who don't fall into that category. (It can go anywhere on your page.)

4.) Add a link to the main navigation of your site pointing to the 'fresh & new' page, so you pass a bit of 'extra link juice'.

Basically, what I'm suggesting is to pass 'extra link juice' to the 'fresh & new' page you create; you are then passing that 'link juice' along to the pages it links to... You could even archive the 'fresh & new' page to 'keep the juice flowing'. (If you want a live working example of what I'm talking about with a 'fresh & new' page, check out [webmasterworld.com...] :)
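The steps above might look something like this in page markup; all paths, ids, and anchor text here are hypothetical, not anything from the original posts:

```html
<!-- Step 4: a plain, crawlable link in the main navigation -->
<ul id="MainNav">
  <li><a href="/fresh-and-new.html">Fresh &amp; New</a></li>
</ul>

<!-- Step 2: a static, stock-standard link preceding the boilerplate,
     so it passes link juice to the 'fresh & new' page -->
<a href="/fresh-and-new.html">See what's fresh &amp; new</a>

<!-- Step 3: the boilerplate block itself; a static default for SEs and
     no-JS visitors, updated dynamically (e.g. onload) for everyone else -->
<div id="FreshBlock">
  <a href="/fresh-and-new.html">Latest additions</a>
</div>
```

The 'fresh & new' page (step 1) would then be an ordinary static page listing the most recent content blocks, so the juice concentrated on it flows out to the pages it links to.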

So, are sitewide affiliate links also detected as boilerplate?

I don't know about this one, because it seems to be new information... If the link is within 'boilerplate text' it might be seen that way, but links seem to be treated a bit differently, especially site-wide affiliate links.

zehrila

8:10 pm on Oct 27, 2009 (gmt 0)

10+ Year Member



TheMadScientist: Besides what you said, I am intending to customize the specific set of text for each category so that it is relevant to that category and dynamic at the same time. I think I shall write a paragraph and use $keyword attributes specific to the content of that category to make it unique from the other categories. I am using this technique for my meta descriptions as well.

I think what you mentioned about having most of the boilerplate-type links nofollowed, channelling the PageRank to a specific page, and distributing PageRank from that point, is an interesting approach, and I am considering using it. That way I shall be able to present fresh content across a specific category, sitewide.

zehrila

8:29 pm on Oct 27, 2009 (gmt 0)

10+ Year Member



I've seen this mentioned in some of the -950/end of result "penalty" discussions as a likely reason for pages getting filtered.

At the same time, I'm seeing red widget, blue widget, green widget menus used on many sites with apparently no problems at all.

Robert: I am pondering the same issue. Given the situation I am going through, I would like to think that Google has double standards, but they probably have a valid reason for it. Until 2-3 years ago, I used to see sites excessively using "widgets" in their links, and they were doing perfectly fine. Now, in my niche, I cannot see any semi-big site (20-50k uniques a day) using, or let's say abusing, the term "widgets" as much; however, the cream of the crop is still using it excessively. Each of their links contains that term, yet they are doing perfectly fine; in fact, they rank better than anyone else in the niche. I have put it down to these factors:

1: It depends upon the authority of the site.
2: It depends upon the quantity and quality of backlinks you have.
3: Quality, frequency and recency of content.
4: Brand favoritism (which might be interrelated with authority).

The two sites I am talking of prove my point: they are multimillion-dollar sites, hence they have access to everything "quality", and their quality factor probably outweighs their spammy factor.

I think, if one is penalised, it is safe to drop the term, or at least reduce it to a level where it doesn't feel spammy; but at the same time, as you mentioned earlier, once you remove the term, it might just devalue the link, and that particular landing page might never rank as well.

My conclusion is that Google is focusing on simplicity. Things which were good back in the day are now taboo.