
Forum Moderators: Robert Charlton & goodroi


Duplicate Content & A Drop down Menu Causing Poor SERP Results?

3:04 pm on Nov 25, 2014 (gmt 0)

New User

joined:Nov 25, 2014
posts:11
votes: 0


Hi,

I've been searching high and low for the answer to this but have found nothing.

To explain, I have recently taken over an e-commerce site which is performing very badly in the SERPS.

The first major problem is that all of their product information has been copied and pasted directly from the manufacturer...word for word, letter for letter, for over 1,800 products.

There are no manual penalties against the site in WMT, but according to SERP tracking software both short and long tail terms are way back in row ZZ. I've been tracking 47 terms and only 2 are in the top 10 and top 30 (both of which have the brand name after the keyword). Searching just for the brand name we do at least come up first, which is where our meagre traffic is coming from (along with AdWords).

I figured...oh well, there's really only one thing to do, and that is to start adding content and rewriting the duplicate content in my own words.

So I've written content for category pages. Some have 2,000+ word buyers' guides below the products and short 100-200 word intro paragraphs above the fold, but most have a decent 200-300 words of text, as well as images, diagrams etc. You know...useful info written in my own words. I've also changed the meta titles as I've gone along to be a bit more descriptive.

In all I've probably written nearly 30,000 words of new content and I've only changed a small percentage of what will eventually need to be changed.

I've also used some short boilerplate text which pulls in some variables to make it unique. This text basically states how long the [product] will take to deliver, that other [category name]s are available, and that you can call us on 0800 etc. etc., in order to try and dilute some of the duplicate content found in the product description. This is around 90 words or so.
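That sort of variable-driven boilerplate can be sketched as a simple fill-in template. This is only an illustration of the general idea, not the site's actual implementation; the field names, wording and phone number are all made up:

```python
from string import Template

# Hypothetical boilerplate template; all field names and wording are illustrative
BOILERPLATE = Template(
    "The $product usually ships within $days working days. "
    "Other items in our $category range are also available, "
    "and you can call us free on $phone for advice."
)

def boilerplate_for(product, category, days, phone):
    """Fill the template so each page gets slightly different text."""
    return BOILERPLATE.substitute(
        product=product, category=category, days=days, phone=phone
    )

print(boilerplate_for("Acme Single Bed", "beds", 3, "0800 000 000"))
```

Because only the substituted variables change between pages, the surrounding ~90 words are still identical site-wide, which is why this only dilutes rather than removes the duplication.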

On a few products I've written really detailed product descriptions (800-1,000 words...nothing overwhelming when you actually read it), whilst on others I've written about 200-300 words or so. Neither approach seems to have made the slightest difference to search engine performance.

In fact, just testing one random page that I've updated now, I've put in the exact page title plus the brand name and we come up 9th, which is 8 places behind all of our competitors...even though I've searched for my own brand name. Hmm...

This gets me onto my two questions:

1. Is there a magical number of pages on my site that I will have to add unique and useful content to before things get better in the SERPs? What I'm really getting at is: should I expect to see results for the few pages that have been changed, or should I not be surprised that there has been no improvement because the whole site is being algorithmically punished, and should I just keep plugging away at creating unique text throughout the site until eventually things improve?

I use the Fetch as Google tool after I've changed each page and submit only that URL, to give Google the most up-to-date version of the page and to let it know there is new content there. My sitemap is up to date as well.

2. When viewing a cache:www.url.co.uk version of any page on my site and selecting 'text only', at the top of every single page are links to every category found in the dropdown menu. This means there are roughly 150 links before you even get to the main body content, and this is replicated across every page of the entire site. It also means that every landing page, category page and product page is linking to totally unrelated pages (because it links to everything). So, for example, a page about beds is linking to a page with the anchor text 'telephones'. Things like that.

Do you think that this dropdown menu is turning the site into some kind of impenetrable link nest where equity is passed around in a really unfocused way? How do you guys and girls handle navigation if a dropdown menu causes huge problems? I have breadcrumbs, but what else can I do? Or is my main worry the duplicate content, and should I not worry about the architecture of the site?

I don't have any internal duplicate issues, because of the way info is pulled from databases and the fact that there are no printer versions of pages, no pagination, no session IDs etc.

The reason I am most frustrated is that most of our competitors are also using the exact same manufacturer text that the guy who built the site used, and they have similar link profiles and similar Domain Authority to us, but they don't seem to be getting punished at all! Very frustrating.

Is there something obvious that I have missed out on? If you require more information I'll do my best to supply it.

Thanks!
5:25 pm on Nov 25, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member planet13 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 16, 2010
posts: 3828
votes: 31


Have you checked traffic patterns to see whether the site was affected by one of the Panda updates?

Have you done a Google "site:" operator search to make sure all products and pages are indexed ONLY once?

Have you checked the competition to see whether your site is more appealing? I mean, not just more / better written content, but offers a better value proposition?

And one more thing:

"...and they have similar link profiles to us and have similar Domain Authority etc..."

How sure are you about that?

If they have a link network out there, they might be blocking bots that check for backlinks (like moz or majestic or the other backlink checkers).
8:08 pm on Nov 25, 2014 (gmt 0)

New User

joined:Nov 25, 2014
posts:11
votes: 0


Firstly, thank you for your reply.

In answer to your first question, there aren't really any discernible patterns throughout the life of this site...it's never really performed great:

[s83.photobucket.com ]

The site: search seems to pretty much match up with the number of pages indexed in WMT, and everything seems normal. How would I check for duplicate listings in the SERPs themselves? What would that signify to you?

By better value do you mean price wise? Delivery costs etc? Or do you mean site design? The site isn't the best looking thing in the world but it's not the worst either. Others have pretty creaky old school designs and are ranking well.

By similar link profile: I haven't gone into it in much detail, but I've used Open Site Explorer and Ahrefs, gone through most of our backlinks, and we all have intersects from all the same places really. No one seems to be blowing us out of the water with super-powerful editorial links or anything. It's a niche where the pool of places to get relevant backlinks from seems fairly limited.

This is why I was thinking that they are using their link equity in a much better way than us because of the way they have set out their navigation.

I really appreciate your help! If you need any screenshots or anything, please let me know.

Thanks.
8:40 pm on Nov 25, 2014 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 7, 2006
posts: 1137
votes: 140


Do you think that this dropdown menu is turning the site into some kind of impenetrable link nest where equity is passed around in a really unfocused way?


Probably. I assume, from your description, that it is CSS-based, so that with CSS turned off you just get a huge list of links.

There was a recent thread on menu styles ([webmasterworld.com ]) that discussed some aspects of this, but also see [webmasterworld.com ], as my suggested method of blocking access to the JS in robots.txt may not be such a good idea any more.
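For reference, the kind of robots.txt block being discussed looks like the sketch below. The path is illustrative, and as noted above this approach is now generally discouraged, because Google has asked webmasters not to block the scripts and stylesheets it needs to render pages:

```
# Illustrative only -- blocking JS like this is now discouraged
User-agent: *
Disallow: /js/
```

Disallow rules are prefix matches, so this would stop compliant crawlers fetching anything under a /js/ folder, including an off-page menu script.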
9:48 am on Nov 26, 2014 (gmt 0)

New User

joined:Nov 25, 2014
posts:11
votes: 0


Thanks Wilburforce.

Having read both of those threads I'm now perhaps even more confused! As you point out, it seems like you're damned if you do and damned if you don't.

On the one hand I can leave the menu and have this horrible interlinking structure which might be punishing me, or I can try your method of blocking access in robots.txt, which might also end up punishing me! Surely there are millions of sites with sitewide dropdowns? There must be a definitive answer on this somewhere!

No one answered your question about whether CSS menus can be the cause of keyword-stuffing penalties (or other penalties) and I think that I would like to know this as well.

I guess it's impossible to know whether the reason the pages I've updated haven't nudged the needle yet is this menu, or whether it's because the whole site has never been trusted due to the duplicate content?

Is there an example of a good navigation system for SEO? I'm really not married to a dropdown at all and I have a developer who can change what I need.

What can I do to improve trust signals to tell the almighty G that I'm trying to improve the user experience here? Re-writing all this might take more than a year.

Thank you for any advice you can offer.
11:29 am on Nov 26, 2014 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 7, 2006
posts: 1137
votes: 140


@Les Zeppelin

Your navigation system needs to follow site-structure (don't focus on the menu without thinking about structure) and should be user-friendly (don't focus on SEO without thinking about the user), but ideally should not be hostile to bots or search-engines.

Drop-down menus are a relatively intuitive navigation system for the end-user (e.g. they have the same look and feel as Windows expanding menus), and are one fairly effective way to deal with a large number of pages without requiring the user to click through a lot of intermediate pages to find what they are looking for.

However, if your menu is (as, in HTML terms, a CSS menu is) just a big block of links, bots can't easily discern site structure, and they also see substantially more code relative to content, which has no advantages for SEO or anything else.

Off-page js is my own solution (I have now removed the block in robots.txt), as it keeps the page size down, allows you to make site-wide menu changes in a single file, and allows browsers to cache the js so they don't have to load the menu every time the user goes to another page. As far as I can see, unblocking the js folder has so far had no effect on SEO, and none of the search engines seem to treat the js links as part of the site's internal link structure (although the js folder has only been unblocked for a couple of weeks, so that may change).

I also recommend using a breadcrumb-style menu (whether on its own or in addition to any other menu style), as this clarifies site structure for bots, as well as (if you use a js menu) leaving a navigation structure in place for js-disabled browsers.
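A breadcrumb trail of the kind suggested is usually just a short list of links mirroring the category path from the homepage down to the current page. The URLs and labels here are made up for illustration:

```html
<!-- Illustrative breadcrumb; paths and labels are hypothetical -->
<nav class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/beds/">Beds</a> &gt;
  <a href="/beds/single/">Single Beds</a> &gt;
  <span>Acme Single Bed</span>
</nav>
```

Because each page links only to its ancestors, the breadcrumb exposes the site hierarchy to bots far more cleanly than a sitewide block of 150 menu links.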
12:08 pm on Nov 26, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13012
votes: 222


Site architecture matters bigtime for ecommerce sites. And there's also some standard stuff you want to check.

- Are there multiple URI paths to get to product and category pages?

- Are you implementing rel=canonical?

- Do you have *any* URL parameters? (stuff like ?sort= or ?order=)

- Is there an onsite search function? Are you blocking the results pages from the index?

- Sounds like the site never ranked well. Do some pages get more traffic than others?

- What about user engagement? Do users love your site, do they share it with others, do they come back regularly, does it generate any buzz with them?

- As far as your product descriptions go: when you rewrite them, are you really making them new and unique, or just pushing words around in a different order?

- And finally (and please don't take offense): is there a clear and obvious reason for the site to exist? The reason I ask this is not to be mean, but if you have that many products that so many other stores are carrying (whether or not the descriptions are exactly the same), and there are only ten results to a page in the search engines, you'd have to have a pretty new and unique business model to crack that nut.

Not to put too fine a point on it, I could put up a shoe site tomorrow with 3000 SKUs, but I'm not going to be giving Zappos a run for their money. I'd have to have a good strategy (and stellar execution) to even show up in the first TEN pages.

It often happens that poor SEO and site structure get blamed when actually it's a weak business model that's the real culprit.
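On the rel=canonical point from the checklist above: when several URLs (sorted, filtered or parameterised variants) reach the same product page, each variant can declare the preferred URL in its head. The URL below is hypothetical, just to show the shape of the tag:

```html
<!-- Placed in the <head> of every variant of the product page; URL is illustrative -->
<link rel="canonical" href="http://www.example.co.uk/beds/acme-single-bed/">
```

This tells search engines to consolidate indexing and link signals onto the one canonical URL instead of splitting them across duplicates.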
2:39 pm on Nov 26, 2014 (gmt 0)

Senior Member from FR 

WebmasterWorld Senior Member leosghost is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Feb 15, 2004
posts:7139
votes: 413


Not to put too fine a point on it, I could put up a shoe site tomorrow with 3000 SKUs, but I'm not going to be giving Zappos a run for their money. I'd have to have a good strategy (and stellar execution) to even show up in the first TEN pages.

It often happens that poor SEO and site structure get blamed when actually it's a weak business model that's the real culprit.

This ^^^..should be pinned to the top of almost every thread and forum here..
3:22 pm on Nov 26, 2014 (gmt 0)

New User

joined:Nov 25, 2014
posts:11
votes: 0


Thank you guys.

In reply to netmeg:

- Are there multiple URI paths to get to product and category pages?
Not sure...I'll have to ask my developer on this.

- Are you implementing rel=canonical?

No, but I don't think I need to for this site.

- Do you have *any* URL parameters? (stuff like ?sort= or ?order=)


Nope.

- Is there an onsite search function? Are you blocking the results pages from the index?


Yes we do, and no we aren't. I've asked my developer to update the robots.txt file accordingly.

- Sounds like the site never ranked well. Do some pages get more traffic than others?


No it hasn't, but I guess it never really had much hope, what with having very, very little original content and masses of duplicate content on it from day 1.

My top landing pages are for people who have been searching for branch locations...so it's things like weburl.co.uk/cardiff, weburl.co.uk/staines etc.

- What about user engagement? Do users love your site, do they share it with others, do they come back regularly, does it generate any buzz with them?


All time Bounce Rate = 36.42%
All time New Visitor percentage = 71.4%
Returning = 28.6%
Since June this year there have been about 160 separate sales, and of those there have been a handful of repeat customers.

I wouldn't say that there's much buzz. We have 500+ followers on Twitter and I tweet regularly but only get single figure clicks on the links I post out. Our products aren't exactly something to create a buzz about though to be fair.

- As far as your product descriptions go: when you rewrite them, are you really making them new and unique, or just pushing words around in a different order?


I'm making them as unique as humanly possible. I take what's existing, put it through a plagiarism checker, have a look at the nice 0% unique figure I always get...re-write it, adding as much content as I possibly can and making it easier to read, then put it back through the plagiarism checker and smile when it says 100% unique. Rinse and repeat.
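Plagiarism checkers of this kind typically work on overlapping word n-grams ("shingles"). A rough sketch of the idea, purely for illustration and not the actual tool being used here:

```python
def shingles(text, n=3):
    """Break text into overlapping lowercase word n-grams ('shingles')."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(original, rewrite, n=3):
    """Fraction of the rewrite's shingles that also appear in the original.

    0.0 means no shared n-word phrases; 1.0 means a verbatim copy.
    """
    a, b = shingles(original, n), shingles(rewrite, n)
    if not b:
        return 0.0
    return len(a & b) / len(b)

copied = "This bed frame is made from solid oak with a lacquered finish"
same = overlap(copied, copied)  # verbatim copy -> 1.0
fresh = overlap(copied, "A sturdy frame built from real oak wood, sealed with lacquer")
```

This is also why the "unavoidable key phrases" mentioned below aren't usually a problem: a handful of shared three-word phrases barely moves the score when all the surrounding text is genuinely new.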

Due to the nature of the products there are unavoidable key phrases, but when I take all the other words around them and totally re-write them, that shouldn't be a problem. One technique I use is to read all the key features and then try to write as much as I can from memory, then go through the text and adjust as necessary to make sure all the information is correct. I think they are as unique as they can be.

- And finally (and please don't take offense): is there a clear and obvious reason for the site to exist? The reason I ask this is not to be mean, but if you have that many products that so many other stores are carrying (whether or not the descriptions are exactly the same), and there are only ten results to a page in the search engines, you'd have to have a pretty new and unique business model to crack that nut.

Not to put too fine a point on it, I could put up a shoe site tomorrow with 3000 SKUs, but I'm not going to be giving Zappos a run for their money. I'd have to have a good strategy (and stellar execution) to even show up in the first TEN pages.

It often happens that poor SEO and site structure get blamed when actually it's a weak business model that's the real culprit.


No offence taken! It's not my website.

I guess my answer to the first part of this is to think about a niche that I know really well, which is skateboarding.

Across the UK there are dozens of skateshops all with their webshops selling the exact same products for pretty much the exact same prices.

Obviously from a user point of view there's no real reason why there should be 15 different shops selling the exact same skateboards for the exact same price. But more obviously from a business point of view it makes total sense that you will want to stand out against the crowd and be the one who gets the attention of that one user and get their custom.

Now if one of those shops takes their user experience to the next level they would deserve to outsell the others by being rewarded in the SERPS right?

That's what I'm trying to do here. My first goal is just to get vaguely competitive and to not be demoted to result number 100+ for every search!

And the thing is, I'm not really even competing with that many people. I used that 8 competitors figure earlier, but I was just being a bit facetious really, as there are only a couple of real rivals who sell the same catalogue of products as us.

I know the answer is not to bung a load of pages up and hope for the best, but if I strengthened each and every page with unique content I should get rewarded in the SERPs for at least some of those pages, right?!

It's a fair point that you make though and one I really need to think about.

Thank you.
3:46 pm on Nov 26, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13012
votes: 222


I know the answer is not to bung a load of pages up and hope for the best, but if I strengthened each and every page with unique content I should get rewarded in the SERPs for at least some of those pages, right?!


No, it doesn't work like that. Improving user experience is going to take a lot more than rewriting product descriptions. For example, one of my clients works in a B2B niche that is somewhat related to office supplies. In fact, where they used to rule their niche completely, now they have Amazon and Office Depot and other similar huge brands selling the same or similar products. Now, there's only so much you can say about a box of paper clips or pens (not real products here, but representative).

So how do you make a mark against that?

First of all, you use other channels to build your brand. This particular client uses direct mail (print catalogs), inside and outside salespeople, email marketing, some limited social media, print ads, banner ads, and of course AdWords and BingAds. As I keep saying Google doesn't want to make sites popular, they want to rank popular sites. So you have to make your site popular by whatever means available. For my own sites, I've done things as small as tee shirts and tote bags, and as large as sponsoring local charitable events.

No amount of rewriting of the (let's say) paper clip descriptions is going to help us sell more paper clips. Adding extra text just to fluff it out - that's one of the oldest tricks in the book, and it doesn't fly in 2014. (Great fluff test: read your text OUT LOUD to someone - if it doesn't sound right spoken, it won't look right written.)

You can also try lowering prices, but if your only point of value is price, you've already lost because there is always - ALWAYS - someone out there who will beat your price.

So you need to add value to the user experience by doing some things that your competition either isn't doing, can't do, or are doing but not talking about.

Maybe you can beef up the return policy, or offer free or discounted shipping. We found success by trying to help our customers pick out exactly the right product to suit their needs - if you are selling paper clips, instead of padding out product descriptions, consider putting up comparison charts to make it easy for the user to decide the best one for his needs or the best way to save money with bulk pricing (do you use 1000 paper clips a year, or 1,000,000?). Emphasize extreme customer service.

Solicit and add testimonials from satisfied buyers. Incorporate a ratings and/or review system.

Every year my developer and I sit down and try to think up things that would enhance the user experience with no consideration of time or money. Then we prioritize, and decide what we can actually do.

THEN WE PUBLICIZE IT. If you make changes, let your users know you've done it, don't just wait to see if they notice it. I always have a "What's New in 2014" type page where I tell people what I've been working on (and solicit suggestions) and a surprising number of people actually read it.

In other words, start giving off signals of a quality site and quality user experience - for the WHOLE site, and find a unique selling proposition.

(If your site search results pages are already indexed, blocking them in robots.txt probably won't do you any good. Better to slap a NOINDEX on the pages, either let them drop out or remove them in GWT, and THEN add them to robots.txt.)
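The sequence described above, in concrete terms (the /search path is purely illustrative). The ordering matters because a crawler that is blocked by robots.txt can never see the noindex:

```
Step 1 - add to the <head> of each search-results page, while it is
still crawlable, so bots can actually see the directive:

    <meta name="robots" content="noindex">

Step 2 - only once the pages have dropped out of the index (or been
removed in GWT), block crawling to save crawl budget:

    User-agent: *
    Disallow: /search
```

If you did step 2 first, the blocked URLs could linger in the index indefinitely, since the noindex would never be fetched.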

Hope this helps.

[edited by: netmeg at 4:06 pm (utc) on Nov 26, 2014]

3:57 pm on Nov 26, 2014 (gmt 0)

New User

joined:Nov 25, 2014
posts:11
votes: 0


Amazing advice.

Thank you very much Netmeg. I really appreciate that.