quote from Optimus:
Adam still needs to clarify whether drop-down navigation menus constitute boilerplate repetition, as this will obviously affect millions of sites. If it is duplication, then Google must tell us the threshold so navigation menus can be redesigned accordingly.
I finally gave in and minimized boilerplate repetition which included removing a dropdown. Result: Pages came out of supplemental but conversions are down, indicating to me that users are still too impatient to click around to other pages for pertinent information.
Does anyone know for sure that dropdowns can cause ranking problems or supplemental problems?
When did dropdowns become an issue? They are a common part of site architecture. If they are part of a common include, they will be treated as such.
This is why it is important that you have multiple navigation themes for larger sites. If you try to put everything under one umbrella, it will present issues from a variety of standpoints. Indexing being one of them.
From Adam Lasnik (in the thread linked above)...
Re: menus, particularly lengthy ones. And nav stuff overall. Again, not likely to be a problem unless the content on the pages is minimal or extremely similar overall.
If you have a page that is comprised mostly of includes that contain the same exact content and then a paragraph or two of unique content, that may present duplication issues. ;)
Well, I have almost a thousand product pages as landing pages, which fit pageoneresults' description well: the same navigation include on the left of ALL pages, and relatively thin content insofar as the pages contain only the (relatively short) product descriptions as unique content.
However, all pages are well indexed (though Google constantly refuses to assign PageRank to them, probably because I added half of them in one batch = unnatural growth).
I guess the difference from similar sites that did get into trouble is this: half of the data I used for these landing pages came from a catalogue from one of my importers, and it took me several days, if not weeks, to turn it into a format I could import into my database. The content, though thin, is unique in every respect; it contains my sweat, tears and swears.
And to me this is the critical hint in Adam's post, where he pointed out that in many cases hundreds of sites use the same CSV data from affiliate partners or Google AdSense. And indeed it is of no use to anyone if Google indexed all these copies.
I must say I am really amazed at how Google managed to decide that my site is somewhat different in CONTENT, though in FORM it is surely a candidate to be filtered the way Adam described.
So what others may learn from this is: don't sail close to thresholds you don't know anyway; Google has many other means to make a decision. If you find your boilerplate navigation or dropdown menus convenient for your visitors, leave them. If you placed them on your site only for link anchor text and search engines, skip them.
I can't say I agree with that. What better way to get search engines to understand what your site is about? It reinforces what your content is, just as the title does.
As far as dropdown menus go, even GoogleGuy says that they have problems reading links in JS, which is what 99% of dropdowns are built with. I would avoid them if possible.
But as said earlier, dropdowns are a very integral part of site architecture and I can't believe they can cause that problem. I would seriously look at what else you did that brought the pages out of supplemental.
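To make the distinction the posters are drawing concrete, here is a minimal sketch contrasting a link that only exists after script execution with the same link as plain HTML. The URL and link text are hypothetical; the point is that a 2007-era crawler that does not execute JS may never see the first link at all:

```html
<!-- Link emitted only by script: a crawler that does not run JS
     (the concern raised in this thread) may never discover it. -->
<script type="text/javascript">
  document.write('<a href="/widgets/">Widgets</a>');
</script>

<!-- The same link as plain HTML: always visible in the source,
     so any crawler can follow it. JS and CSS can still be layered
     on top to provide the dropdown behaviour. -->
<a href="/widgets/">Widgets</a>
```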
1. Strong internal linking does not need to make overly heavy use of a repeated phrase.
2. Other strong factors in a site's overall profile might mitigate one slightly troublesome area.
Just to confirm: Google does not read/score based on what's included via .js, correct? I don't care much about the links in there; they are linked enough from the front page and the main categories, which are less likely to have dupe issues.
In some ways, Google has moved beyond black/white or yes/no
That's cool, Ted, but for my site it has been all black and no for 4 straight months, while my site really isn't that spammy.
The question arises: does Adam Lasnik really know what's going on, algo-wise? Google wouldn't be the first company with departments that aren't communicating.
[edited by: tedster at 11:19 pm (utc) on Mar. 10, 2007]
In some ways, Google has moved beyond black/white or yes/no. They look at (and balance off between) many scores of factors.
Here's an example. I buy a few loans at Prosper. A few factors to consider when sizing up a borrower and figuring out how much you can trust them to pay back your loan:
- Credit rating
- Home owner
- Number of delinquencies in the past 12 months
- Number of delinquencies in the past 7 years
- Number of years employed
- Amount of loan
- Previous listings
- Bank account verified
- Number of recent inquiries
- Age of the borrower
- Performance rating of the group the borrower belongs to
A HighRisk credit grade isn't good, but by itself that won't disqualify a loan if other factors look decent. For example, 0 delinquencies might sound good, but it means nothing if the borrower has only 1 credit card and has only had it for a year.
It's never as cut and dried as saying, for example, that the age of a borrower doesn't matter; it can matter if a 90-year-old borrower gets sick 2 years into the loan and can't pay back the rest. It just matters less, in general, than the number of delinquencies or DTI %.
Same deal with Google. Scoring low on one factor is a strike against you, but it will not be the killing blow if Google likes other things about your site (e.g. CNN is linking to one of your pages, your site's been in Google's database for over 5 years, etc.).
In this case, I'd focus on generating unique, meaty content on every page and forget about tweaking nav menus. The most I'd do is move the menus below the content in the source using CSS. If you don't have the time or resources to generate content, you can always NOINDEX thin pages so you redirect more weight to pages that do have content, or encourage visitors to generate content for you. And if you happen to have enough sites linking to you, you may not even need much content.
[edited by: Halfdeck at 3:21 pm (utc) on Mar. 11, 2007]
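The two tweaks Halfdeck mentions can be sketched roughly as follows. The element IDs, widths, and placeholder text are hypothetical; the idea is that the robots meta tag keeps a thin page out of the index, and the menu can sit after the content in the HTML source while CSS positions it visually on the left:

```html
<!-- In the <head> of a thin page you don't want indexed
     (links on it are still followed): -->
<meta name="robots" content="noindex,follow">

<!-- Content first in source order, navigation menu second: -->
<div id="wrap" style="position: relative;">
  <div id="content" style="margin-left: 200px;">
    Unique product description here...
  </div>
  <div id="menu" style="position: absolute; top: 0; left: 0; width: 180px;">
    <a href="/widgets/">Widgets</a>
    <a href="/gadgets/">Gadgets</a>
  </div>
</div>
```

Because the menu is absolutely positioned, it renders to the left of the content even though it appears later in the source, which is the reordering trick being described.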
However, my menu is built from plain HTML links, with the mouseovers handled by CSS and a little JS, so it is easily read and indexed by the search engines.
We experimented by removing some, all or none across 3 sites and observing the behaviour.
I can report that, in our case, it made no difference at all, so the menus have been reinstated, since they improve the overall navigation experience for the user.
Adam or other knowledgeable members may chime in and say otherwise, but it looks OK to me.
[edited by: Whitey at 8:43 am (utc) on Mar. 26, 2007]
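A dropdown of the kind Whitey describes, built from plain HTML links with the submenu shown and hidden purely by CSS, might look like the following sketch. The URLs and IDs are hypothetical; note that browsers of that era (IE6) only supported :hover on links, which is why a small JS shim was commonly layered on top:

```html
<ul id="nav">
  <li><a href="/products/">Products</a>
    <!-- Submenu links are ordinary HTML, so crawlers can follow them
         even though they are hidden until hover. -->
    <ul>
      <li><a href="/products/widgets/">Widgets</a></li>
      <li><a href="/products/gadgets/">Gadgets</a></li>
    </ul>
  </li>
</ul>

<style>
  /* Hide submenus by default; reveal on hover of the parent item. */
  #nav li ul        { display: none; }
  #nav li:hover ul  { display: block; }
</style>
```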