Forum Moderators: Robert Charlton & goodroi
I am currently trying to write some code to hide certain DIV sections until a user puts their mouse over an image or link. The code I had been using set the DIVs to "display:none", which I found out was a very bad thing with Googlebot (my pages have disappeared from their listings and index).
The site in question is at: < Sorry, no personal urls >
You will note that currently, all DIVs are displayed when the page is loaded... What it USED to do was display the four images, the AIBuilt logo to the right, and upon mouseover of any image, display the relevant text.
Currently, mousing over an image will revert the displayed text to the way it should be, but I was trying to minimize the page clutter by having the initial load hide the text until an area was selected or moused over.
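The mouseover behavior described above amounts to a pair of tiny show/hide helpers. This is only a sketch; the function names are made up, and in the real page the elements would come from document.getElementById:

```javascript
// Minimal show/hide helpers for the mouseover behavior described above.
// "el" is any element with a .style object; the names here are hypothetical.
function showSection(el) {
  // Reveal a previously hidden DIV
  el.style.display = 'block';
}

function hideSection(el) {
  // Hide a DIV entirely -- this is the display:none pattern in question
  el.style.display = 'none';
}

// In the page, these would be wired up roughly like:
// <img src="thumb1.jpg"
//      onmouseover="showSection(document.getElementById('text1'))">
```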
Originally, I had the (text) sections in iFrames, which Google listed, but the problem there was that Google would refer people to the iFrame instead of the main document.
I was thinking about one of two options, but I don't think (or don't know) that they will fool GoogleBot.
(1) an onload handler that hides the text using display:none (though I keep reading elsewhere that GoogleBot will look for such things in the JavaScript and penalize).
(2) having the onload handler (or the DIVs' styles) set the height to 0px until a mouse is over an image, at which time it sets the height to auto for the appropriate DIV section.
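Option (2) could look something like the sketch below. One assumption worth flagging: a 0px height only truly collapses the section if overflow is also hidden, otherwise the content spills out and stays visible. Function names are illustrative, not from the actual site:

```javascript
// Option (2): collapse a section to zero height instead of display:none.
// overflow:hidden is required, or the text would still overflow and show.
function collapseSection(el) {
  el.style.height = '0px';
  el.style.overflow = 'hidden';
}

// Restore the section's natural height on mouseover.
function expandSection(el) {
  el.style.height = 'auto';
  el.style.overflow = 'visible';
}
```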
Is there a way of implementing this that will not drop my pages from the ranking/index? (one of the above, or any other method?)
Thanks in advance,
Robert
[edited by: tedster at 2:46 am (utc) on July 30, 2007]
What you've described is not at all against Google's guidelines - in fact, show/hide mouseover divs are common in many very widely used drop-down menus written with DHTML. As long as a basic and obvious user action switches the visibility of the divs, there should be no problem.
What makes you feel this code is the reason for your site's disappearance?
Thanks for the response and for checking the URL...
I don't think the current page layout is a problem.
The previous page layout is what I think got us de-listed... the de-listing happened shortly after we changed the site layout and were re-crawled.
All 4 DIV sections were previously set to <DIV style="display:none;"> on page load, and they would be un-hidden with the mouse-over. I just changed that to see if it would get us re-listed.
So my suspicion was that all the text content being hidden on initial page load is what dropped us from their listing and index. Previously, you could search for "TheURLinQuestion.COM" and come up with results, or search for "Specific Search Term" and we would be listed in the top 3. Since the change to hide the text until mouseover, neither brings up any results, and Google's Webmaster Tools shows that the site isn't in their database at all (which it previously was) - thus my suspicions.
The current layout, since all the text is un-hidden, is kind of cluttered, and I would prefer it to have just the logo or minimal text, which would then change on mouse-over. The current mouse-over functions will give you an idea of how the DIVs should be hidden and un-hidden... if you pretend that all but the DIV with the logo on white are hidden when you first come to the page, you'll have an idea of what the site looked like when our listing in their index disappeared.
Thanks,
Robert
[edited by: RobertMfromLI at 3:10 am (utc) on July 30, 2007]
(1) The original site had its content in frames. It was indexed, with each page listed under "SiteURL.COM" (ie: index.html, topbar.html, btmbar.html).
(2) The site was revised with no frames and no hidden DIVs. It was re-indexed, and a Google search would bring up index.html, etc. A search on specific terms from the site would list the index page in the top 3.
(3) The site was revised again, with all text hidden until mouseover (via <DIV... display:none...> in the HTML code itself). The site was visited again by GoogleBot, at which time...
(4)
(a) the index page was removed from the listings,
(b) the only listings for "SiteURL.com" are now the topbar and bottombar, which aren't used in the new site design,
(c) doing a site:URL search only returns the topbar and bottombar files, and no longer returns the index page or any other pages crawled by them since we used the <DIV...display:none> tags.
The previous code (right about when the index page disappeared from their listing) hid ALL text until a mouseover event, using the <DIV style="display:none;"> tags.
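For comparison, option (1) from earlier in the thread would move that hiding out of the markup entirely: the DIVs ship visible in the HTML, and a script hides them only after the page loads, so a user agent without JavaScript still receives visible text. A sketch under that assumption; the section IDs below are hypothetical stand-ins for the site's real DIV ids:

```javascript
// Hide the text sections from script after load, instead of inline
// <DIV style="display:none;"> in the served HTML.
// The section IDs here are made-up examples, not the site's real ids.
var SECTION_IDS = ['text1', 'text2', 'text3', 'text4'];

function hideAllSections(lookup) {
  // "lookup" maps an id to an element (in the page it would wrap
  // document.getElementById); passing it in keeps the sketch testable.
  for (var i = 0; i < SECTION_IDS.length; i++) {
    var el = lookup(SECTION_IDS[i]);
    if (el) el.style.display = 'none';
  }
}

// In the page:
// window.onload = function () {
//   hideAllSections(function (id) { return document.getElementById(id); });
// };
```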
Thanks,
Robert
[edited by: RobertMfromLI at 3:33 am (utc) on July 30, 2007]
What you are describing sounds like a one page website. If so, building more content and getting more inbound links may be all you need to set things straight -- but that's just my guess. I'd still recommend the Webmaster Tools account.
I have one but am waiting on verification. On verification attempts, Google is reporting a timeout (yet no request gets routed to the web server).
Router Log:
[INFO] Thu Mar 25 00:58:18 2004 Dropped TCP packet from MY_IP:80 to GOOGLEs_IP:58608 as unable to modify header options
The router's log shows an invalid response to GoogleBot that was blocked (thus the web traffic never even gets to the web server). I seem to be having the same problem with the w3 Validator as well.
So, now I am baffled as to what's going on. Perhaps the page is no longer in the listing because GoogleBot can no longer get there (and it's thought of as a dead link)...
Now I just have to figure out why the router is suddenly doing this. This seems to happen with certain scripts trying to access the server... don't understand why... GoogleBot, W3's Validator, and a couple others. Yet other ones I have tried online seem to get through successfully. Hadn't changed anything on the router... though, after this, I did upgrade the firmware, reset it, re-add the DMZ entry for the server, and still same results...
Well, I've started a thread on another forum that handles hardware issues like the one I'm running into (since this isn't the applicable place - unless it has something to do with changes in how GoogleBot connects in the last 3 weeks).
Thanks again,
Robert
Your guesses on both points are correct:
- I don't have NIS (or any other AV)
- I am not running Windows
:-(
As far as I can see (and it has been so long since I looked at anything to do with the TCP/IP protocols), GoogleBot tries initiating a connection, the server returns the connection request (handshake), and our router drops the packet like so:
Dropped TCP packet from 192.168.0.222:80 to [GoogleBot_IP]:33082 as unable to modify header options
Thus, no socket to port 80 is ever created on the server, and Google never gets a response to initiate the session/connection.
I've changed nothing on my end on the router. Though, after a week and a half of trying, I upgraded its firmware, reset it, turned off the firewall, and set the DMZ option to point to the server. Same results.
The other odd entries in the router are:
Blocked outgoing ICMP packet (ICMP type 3) from 192.168.0.222 to ?.?.?.?
I can consistently duplicate this issue with W3's HTML Validator as well (and a few other bot/check by script sites)... while many others connect fine, including Yahoo and MSN.
I am at a total loss... On Google's forums, there are a lot of complaints like this, and it seems someone from Google (when they happen to visit the thread) takes care of each issue (with the Google Webmaster Tools verification bot) manually - which of course will still leave GoogleBot unable to reach the sites.
My only guesses are maybe
(1) My ISP is filtering certain traffic
(2) My ISP started using a proxy server for our traffic - which happens to be breaking certain things [this, and an AIM client that connects via their Oscar server (it connects, loads buddy info, etc., but cannot send IMs to anyone - though AIM Express works fine)].
(3) Google changed something in their bots
(4) My router is having some issues (though everything else works fine and it passes all its tests - so I would exclude this one).
-Robert