Forum Moderators: Robert Charlton & goodroi


Dupe content and Iframes

Are iframes frowned upon


mbucks

12:27 am on Aug 3, 2006 (gmt 0)

10+ Year Member



Hi all, my first post in this excellent forum. I have looked around, and though I've found similar discussions, none are both specific and recent.

A site we launched in February was progressively growing and being included in the Google index up until May, when all of a sudden many pages went supplemental and some were removed from the index completely.

Since then, the number of indexed pages may go up to about 400, then down to 25, and up and down again. When not fully indexed, the pages seem to be supplemental. This is on a site with about 1,500 pages. We've signed up for the Sitemaps program and have quite a few good links coming in. Our PR has since increased to 5.

The site is carefully structured for the user. A page consists of a hotel description with a series of 'Things to do' in the locality below it. The 'things to do' are short paragraphs, each with a CLICK HERE to read more on the specific suggestion. There are about five of these below an accommodation. If there's another hotel in the area, the Things to do list may be similar. But obviously the actual hotel description will always be unique.

I'm coming to the conclusion that we may be getting penalized for dupe content. We've considered removing the recommendation intros from the listings and replacing them with a CLICK HERE TO READ THINGS TO DO IN THIS AREA button, but this would be to the detriment of the users.

Sorry to be long winded.

Another option we've considered is to put the recommendation intro into Iframes placed as a list below the unique hotel descriptions.

Does anybody know if this would be frowned upon by Google? To me it's the best option for our users (short of leaving things as they are), but getting these pages back into the main index is the priority.
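
To make the idea concrete, the kind of markup we have in mind is roughly this (the hotel name and snippet URL are made-up examples):

```html
<!-- The unique hotel description stays in the page itself -->
<h1>Example Hotel, Example Town</h1>
<p>...roughly 300 words of unique description...</p>

<!-- The shared 'Things to do' snippets would move into an iframe,
     so they would no longer count as part of this page's content -->
<iframe src="/snippets/example-town.html" width="100%" height="400">
</iframe>
```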

mbucks

6:43 pm on Aug 3, 2006 (gmt 0)

10+ Year Member



Anybody have experience or theories on this? It got worse today. Almost no pages left. This is killing us! :(

Quadrille

10:26 pm on Aug 3, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Not sure why the variation; do all the pages have unique meta descriptions and titles?

As for iframes, the issue with them is that their contents are ignored when Google looks at the holding page, as they are (in Google's eyes) not part of that page.

So if the snippets are not unique content, iframes may actually help, since that shared content would otherwise reduce the page's proportion of unique content.

If they ARE unique, then it's important to keep them part of the page.

You do not say how 'big' the hotel description is.

Remember when considering unique content to look at the source code and assess what percentage of the page is unique - it's often a sobering experience!

mbucks

10:54 pm on Aug 3, 2006 (gmt 0)

10+ Year Member



Hi,

thanks for your thoughts. In reply:

The meta descriptions and titles are unique; they'll be the name of the accommodation, e.g. Hilton Hotel in London, Greater London.

So the page may be made up of a description of the hotel with average 300 words.

Then below this, you will have about 10 snippets of reviews. These are about 50 words each, taken from the full review pages. So no, they're not unique. You see, it's these that I'm thinking of putting into iframes, as the text is taken verbatim from the full reviews, each of which sits on its own page. So I'm thinking it would be better if Google didn't see it, and thus reduce my dupe content.

I take your point on the source code. This is the tricky bit. Do you think a pretty hefty Flash header on every page contributes to the duplicate content?

It's frustrating in that we get a lot of feedback saying how nice the site is to navigate, and it looks like we have to make some big changes to keep Google happy and to get out of supplemental.

mbucks

11:29 pm on Aug 3, 2006 (gmt 0)

10+ Year Member



A couple of extra details on what we're thinking about the iframe idea.

The recommendation snippets would be shown in here, and this page would be set to 'follow,noindex'. We don't want these pages themselves to be included in Google's index, as they'll have none of the main nav elements from the site; however, they will have links to other pages that should be included (the recommendations themselves).
We're thinking this would be better than simply blocking the page via robots.txt, as it would be beneficial to have Google picking up the links from these pages?
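
For reference, the snippet page loaded into the iframe would carry something like this in its head (the exact syntax below is the standard robots meta tag; the value order doesn't matter):

```html
<!-- Keep the snippet page itself out of the index, but let the
     links on it (to the full recommendations) still be followed -->
<meta name="robots" content="noindex,follow">
```

whereas a robots.txt Disallow would stop Google crawling the page at all, so the links on it would never be seen.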

mattg3

12:07 am on Aug 4, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I use an iframe to cloak the English preview version of a text (my pages) so that Google doesn't think I am scraping, or whatever the weird filters think. :\

How does Google treat 2 languages on a page?

Most German users can read some English so the snippet should be useful to them in case the German version isn't sufficient and they can have a look if they want to bother looking up the English version.

I guess Google's overzealous filters would think that this is duplicate content, while the actual use is a preview.

I assume Google's overambitious duplicate filter would kick in and penalise one of my sites for duplicate content if I take it out of the iframe.

Quadrille

1:19 am on Aug 4, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google does not read Flash - but it may be read as page bloat; maybe take it off the page as an iframe or JS?

300 words is more than many have; it would take a fair bit of shared copy to drown that, I'd have thought.

If you do not already use CSS, now's the time to start.

Do go through your code and take out anything not needed - you may be able to reduce the internal links, for example.

Quadrille

1:34 am on Aug 4, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



mattg:

I'm not 100% sure, but I'm pretty sure that Google does not translate anything while spidering; and even if it did, it's unlikely the result would be seen as the same as your translation, so no worries.

Have you ever seen a Google translation? ;)

mbucks

2:24 am on Aug 4, 2006 (gmt 0)

10+ Year Member



Ahhh Quadrille,

I'm glad you said that about internal links. It's one thing our no.1 developer suggested could contribute to the prob.

Now somehow, some pages do have 130 (ish) links. 99% internal.

Is it possible that too many internal links could cause this to happen? I understand a vast number of external links can devalue a page (we have a nofollow on every external link), but I didn't think internal links would be such a problem.
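
(For completeness, every external link on the site is marked up like this, with a made-up URL for illustration:)

```html
<a href="http://www.example.com/" rel="nofollow">Some external resource</a>
```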

Thanks for the other tips.

opifex

4:07 am on Aug 4, 2006 (gmt 0)

10+ Year Member



Too many internal links! And repetitive text for the links? I have a site suffering from that problem at the moment, but the SERPs aren't affected and it is a well-established destination directory. At one time I did have an SE problem where the SEs would direct to the wrong pages of the directory only because the keyword appeared in the link menu. I am in the process of fixing the problem with a .js menu, but had also considered putting the menu in an HTML page to display in an iframe.

Quadrille

9:24 am on Aug 4, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm not sure that it's links per se that are the problem; just that 99 links are a load of code, and are never going to be used in a million years!

Depending on the site design, keep links to other major sections, and some links within the current section.

So you'll need a 'navigation bar' of sorts for each section, linking within it and to its neighbors, rather than one standard across-the-site bar that has to include everything. Internal links (esp. HTML links) are vital - but they can be overdone, and *any* shared code needs to be whittled down. If you feel the links are indispensable - go for a sitemap!

Do use CSS - even if only to get rid of <font>. Every bit helps.
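
For example, instead of repeating presentational tags on every page, one shared stylesheet does the job (class and file names are just illustrative):

```html
<!-- Before: repeated on every page, bloating the source -->
<font face="Arial" size="2">Hotel description here...</font>

<!-- After: lean markup plus one cached, site-wide stylesheet -->
<link rel="stylesheet" href="/styles.css">
<p class="description">Hotel description here...</p>

<!-- in styles.css:
     .description { font-family: Arial, sans-serif; font-size: 85%; } -->
```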

It might be worth experimenting on your next new pages - having a text link to 'local' bits and extras, rather than snippets.

mbucks

11:05 am on Aug 4, 2006 (gmt 0)

10+ Year Member



Thanks for the tips.

A large portion of these links are contained within the relevant recommendation summaries shown at the bottom of pages. For example, a recommendation summary would have a link to the full recommendation itself, but also links to view other pages relating to it (e.g. a link to view all recommendations within the same town).

We think it is pretty likely that this is what's causing the problem, so we will try moving this into an iframe over the next couple of days and then see what effect it has.

Thanks again for the recommendations!

mbucks

12:16 pm on Aug 6, 2006 (gmt 0)

10+ Year Member



Has anybody seen this before?

It's all theories, but we're going to proceed and pull the similar content into iframes.

But now every page except the home page is supplemental. This includes things like the contact page, which is all unique apart from the footer links. So this really stumps us, and we wonder if anything else can make an entire site go supplemental?

Sooooo frustrating.