Anyone here a Drupal user? Everything was rolling along fine until we noticed that page creation (using the Panels module specifically) produced an exorbitant number of nested <div> tags in the source code.
Basically, even the simplest content was wrapped in <div>s, or nested in <div>s, to the point that most of the code was, you guessed it, some form of div tag.
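To give a sense of the pattern I mean, here's a hypothetical before/after sketch (the class names are made up for illustration, not copied from our actual Panels output):

```html
<!-- Roughly what the page builder generates for one paragraph: -->
<div class="panel-display">
  <div class="panel-region">
    <div class="panel-pane">
      <div class="pane-content">
        <p>Hello world.</p>
      </div>
    </div>
  </div>
</div>

<!-- What I'd write by hand for the same content: -->
<p>Hello world.</p>
```

Multiply that four-levels-deep wrapper by every pane on the page and you can see how the source balloons.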
I know divs and styles work well together, but this was over the top, and it makes me wonder if it's detrimental to crawling/indexing. I can't help but feel that if I built the page using basic HTML, I could reduce the size to almost a third of what it is now AND the content would be cleaner and less diluted.
Anyone have experience with Drupal or opinions about this?
Point taken; however, I guess my question is more about whether Googlebot will take the additional time to keep digging down through excessive lines of code to get to my content.
If a good portion of my content is now relegated to line 350 instead of line 150 due to heavy <div> usage, that might be a concern.
I've always been under the assumption that bots will only devote a finite amount of resources to crawling a particular site - so having them spend that time peeling back and parsing countless <div>s seems counterproductive to getting the content exposed at all.
That's where my concern lies - I don't think it will differentiate between a single <div> block with content vs. the same content without a <div>. It's the cumulative code bloat that worries me.
Most times, SEs find the content regardless of how much layout markup (divs, etc.) is involved. If we're just talking about content shifting from above the fold to below it, I doubt there's any problem or need for concern.