| 4:16 pm on Mar 7, 2007 (gmt 0)|
I sure hope drop down boxes won't cause a problem with rankings and dups.
Does anyone know for sure that dropdowns can cause ranking problems or supplemental problems?
| 4:19 pm on Mar 7, 2007 (gmt 0)|
|Does anyone know for sure that dropdowns can cause ranking problems or supplemental problems? |
When did dropdowns become an issue? They are a common part of site architecture. If they are part of a common include, they will be treated as such.
This is why it is important that you have multiple navigation themes for larger sites. If you try to put everything under one umbrella, it will present issues from a variety of standpoints. Indexing being one of them.
From Adam Lasnik (in the thread linked above)...
|Re: menus, particularly lengthy ones. And nav stuff overall. Again, not likely to be a problem unless the content on the pages is minimal or extremely similar overall. |
If you have a page that is composed mostly of includes containing the exact same content, plus only a paragraph or two of unique content, that may present duplication issues. ;)
| 4:57 pm on Mar 7, 2007 (gmt 0)|
Uhm.. I can see this affecting image galleries and shopping sites, but that's hardly news anyway.
| 5:40 pm on Mar 7, 2007 (gmt 0)|
It depends on whether it's crawlable. The best way to tell is to check the source code and see whether Google can actually see the links.
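To make the "check the source" test concrete, here are the two patterns side by side (URLs invented for illustration). A form-style dropdown that navigates via script exposes no href at all, while a plain list of anchors does:

```html
<!-- NOT crawlable: no href anywhere; navigation happens only in script -->
<select onchange="window.location = this.value;">
  <option value="/widgets/">Widgets</option>
  <option value="/gadgets/">Gadgets</option>
</select>

<!-- Crawlable: plain anchors, which can still be styled to look like a menu -->
<ul id="nav">
  <li><a href="/widgets/">Widgets</a></li>
  <li><a href="/gadgets/">Gadgets</a></li>
</ul>
```

View source (or fetch the page with a text browser like Lynx) and look for the actual href values; if the URLs only exist inside JavaScript, assume the spider doesn't follow them.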
| 9:10 pm on Mar 7, 2007 (gmt 0)|
> I can see this affecting image galleries and shopping sites
Well, I have almost a thousand product pages as landing pages, which would fit pageoneresults' description well: the same navigation include on the left of ALL pages, and relatively thin content, insofar as the pages contain only the (relatively short) product descriptions as unique content.
However, all pages are well indexed (though Google constantly refuses to assign PageRank to them: probably because I added half of them in one bunch = unnatural growth).
I guess the difference from similar sites that did get into trouble is this: half of the data I used for these landing pages came from a catalogue from one of my importers, and it took me several days if not weeks to turn it into a format importable into my database. The content, though thin, is unique in every respect; it contains my sweat, tears and swears.
And to me this is the critical hint in Adam's post, where he pointed out that in many cases hundreds of sites use the same CSV data from affiliate partners or Google AdSense. And indeed it is of no use to anyone if Google indexed all these copies.
I must say I am really amazed how Google managed to decide that my site is somewhat different in CONTENT, even though in its FORM it is surely a candidate to be filtered the way Adam described.
So what others may learn from this is: don't sail close to thresholds you don't know anyway; Google has many other means to make a decision. If you find your boilerplate navigation or dropdown menus convenient for your visitors, keep them. If you placed them on your site only for the link anchor text and the search engines: skip them.
| 6:18 am on Mar 9, 2007 (gmt 0)|
>>>If you placed them on your site only because of link-anchor-text and search engines: skip them. <<<<
I can't say I agree with that. What better way to get search engines to understand what your site is about? It reinforces what your content is, just as the title does.
As far as drop down boxes go, even GoogleGuy says that they have problems reading links in JS, which is what 99% of dropdowns are. I would avoid them if possible.
But as said earlier, it is a very integral part of site architecture and I can't believe it can cause that problem. I would seriously look at what else you did that brought the pages out of supplemental.
| 8:32 pm on Mar 9, 2007 (gmt 0)|
So what are we to believe? The Lasnik post that suggests that Google isn't really that strict on internal linking, or the phrase-based proponents that suggest that site-wide repetition of key-phrases (as in menus) leads to the -950 penalty?
| 11:32 pm on Mar 9, 2007 (gmt 0)|
You can take some value from both, I'd say. In some ways, Google has moved beyond black/white or yes/no. They look at (and balance off between) many scores of factors.
1. Strong internal linking does not need to make overly heavy use of a repeated phrase.
2. Other strong factors in a site's overall profile might mitigate one slightly troublesome area.
| 7:34 pm on Mar 10, 2007 (gmt 0)|
On the dupe side: I just made my left side menu a .js include. It had the a-z index and categories, around 100 or so words in total, so it can add up on product pages that aren't as thick as others.
Just to confirm: Google does not read/score what's included via .js, correct? I don't care much about the links in there; they are linked enough from the front page and the main categories, which are less likely to have dupe issues.
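For what it's worth, the typical setup (filenames and ids invented here) looks something like this. Since Googlebot at this point generally doesn't execute scripts, links written out this way don't appear in the HTML it indexes:

```html
<!-- in every page template: the shared include -->
<script type="text/javascript" src="/menu.js"></script>

<!-- menu.js then writes the links at render time, e.g.: -->
<script type="text/javascript">
  document.write('<ul id="leftmenu">');
  document.write('<li><a href="/a-z/">A-Z Index</a></li>');
  document.write('<li><a href="/categories/">Categories</a></li>');
  document.write('</ul>');
</script>
```

Just keep the flip side in mind: any link you still want crawled shouldn't live only in that .js file.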
| 10:53 pm on Mar 10, 2007 (gmt 0)|
|In some ways, Google has moved beyond black/white or yes/no |
That's cool, Ted, but for my site it has been all black and no for 4 straight months, while my site really isn't that spammy.
The question arises: does Adam Lasnik really know what's going on, algo-wise? Google wouldn't be the first company with departments that aren't communicating.
[edited by: tedster at 11:19 pm (utc) on Mar. 10, 2007]
| 3:17 pm on Mar 11, 2007 (gmt 0)|
|In some ways, Google has moved beyond black/white or yes/no. They look at (and balance off between) many scores of factors. |
Here's an example. I buy a few loans at Prosper. A few factors to consider when sizing up a borrower and figuring out how much you can trust them to pay back your loan:
- Credit rating
- Home owner
- Number of delinquencies in the past 12 months
- Number of delinquencies in the past 7 years
- Number of years employed
- Amount of loan
- Previous listings
- Bank account verified
- Number of recent inquiries
- Age of the borrower
- Performance rating of the group a borrower belongs to
A HighRisk credit grade isn't good, but that by itself won't disqualify a loan, if other factors look decent. For example, 0 delinquencies might sound good, but that means nothing if the borrower only has 1 credit card and has only had it for a year.
It's never as cut and dried as, for example, saying the age of a borrower doesn't matter; it can matter if a 90-year-old borrower gets sick 2 years into the loan and can't pay back the rest. It just matters less, in general, than the number of delinquencies or DTI %.
Same deal with Google. Scoring low on one factor is a strike against you, but it will not be the killing blow if Google likes other stuff about your site (e.g. CNN is linking to one of your pages; your site's been in Google's database for over 5 years, etc).
In this case, I'd focus on generating unique, meaty content on every page and forget tweaking nav menus. The most I'd do is move the menus below the content in the source using CSS. If you don't have the time or resources to generate content, you can always NOINDEX those pages so you redirect more weight to pages that do have content, or encourage visitors to generate content for you. And if you happen to have enough sites linking in to you, you may not even need much content.
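A rough sketch of the "menu below content" idea (ids and measurements invented): the unique content comes first in the HTML source, and CSS positions the boilerplate menu back on the left so visitors see no difference. The noindex line is only for the thin pages you'd rather keep out:

```html
<head>
  <!-- on thin pages only: keep them out of the index but let the spider pass through -->
  <meta name="robots" content="noindex,follow">
  <style type="text/css">
    #wrap    { position: relative; }
    #content { margin-left: 180px; }  /* leave room for the menu column */
    #menu    { position: absolute; top: 0; left: 0; width: 170px; }
  </style>
</head>
<body>
  <div id="wrap">
    <div id="content">Unique product copy comes first in the source...</div>
    <div id="menu"><!-- boilerplate nav, last in source --></div>
  </div>
</body>
```

The visual layout is identical either way; the only thing that changes is what appears first when the page is read top to bottom.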
[edited by: Halfdeck at 3:21 pm (utc) on Mar. 11, 2007]
| 2:55 am on Mar 12, 2007 (gmt 0)|
I have a drop down menu with over 80 pages in it, along with my footer and several links on my home page, which adds up to over 100 links, and no problems so far.
However, my menu is built from plain HTML links, with the mouseovers handled by JS and CSS, so it is easily read and indexed by the search engines.
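For anyone wondering what that looks like, here's a minimal sketch of that style of menu (URLs invented): the dropdown items are ordinary HTML anchors, hidden and revealed with CSS, so there's nothing script-generated for a spider to miss. (IE6 doesn't support :hover on li elements, so in practice a small JS shim, like the Suckerfish one, handles the mouseover there.)

```html
<ul id="nav">
  <li><a href="/products/">Products</a>
    <!-- sub-menu is real markup with real hrefs, just hidden by default -->
    <ul class="sub">
      <li><a href="/products/widgets/">Widgets</a></li>
      <li><a href="/products/gadgets/">Gadgets</a></li>
    </ul>
  </li>
</ul>
<style type="text/css">
  #nav li ul.sub       { display: none; }
  #nav li:hover ul.sub { display: block; }
</style>
```

A spider reading the source sees every href whether or not the menu is "open", which is the whole point.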
| 5:18 am on Mar 13, 2007 (gmt 0)|
I've had problems in the past with dropdown boxes, not necessarily with Google, but with MSN and a few others. Instead of using my meta description as the page snippet, or content on the page as a snippet, MSN in particular saw the dropdown box as most important and used its list of words as my description. Not only did it look bad, it looked really spammy, although in its place on our site it was just simple navigation.
| 8:40 am on Mar 26, 2007 (gmt 0)|
I previously reported my concern that multiple drop down boxes could be a potential risk, since they produced repetitive text across the pages and the site.
We experimented by removing some, all or none across 3 sites and observing the behaviour.
I can report that, in our case, it made no difference at all, so they have been reinstated, since they better assist with the overall navigation experience for the user.
Adam or other good members may chime in and say otherwise, but it looks OK to me.
[edited by: Whitey at 8:43 am (utc) on Mar. 26, 2007]