Forum Moderators: open
I know the topic of image maps and SEs has been discussed ad nauseam, but I've found conflicting answers.
I've read (here at WebmasterWorld) that SEs don't like image maps
Rubbish! ;) There is no problem here; an image map, if used properly, can give a good boost:
<img height="160" width="319" src="keyword.jpg" border="0" usemap="#keyword" alt="keyword">
<map name="keyword">
  <area href="keyword9.html" title="keyword9" coords="108,110,153,156" shape="rect">
  <area href="keyword8.html" title="keyword8" coords="54,109,99,155" shape="rect">
  <area href="keyword7.html" title="keyword7" coords="0,110,46,158" shape="rect">
  <area href="keyword6.html" title="keyword6" coords="266,57,314,103" shape="rect">
  <area href="keyword5.html" title="keyword5" coords="214,57,259,102" shape="rect">
  <area href="keyword4.html" title="keyword4" coords="162,57,206,103" shape="rect">
  <area href="keyword3.html" title="keyword3" coords="109,57,154,103" shape="rect">
  <area href="keyword2.html" title="keyword2" coords="55,57,101,102" shape="rect">
  <area href="keyword1.html" title="keyword1" coords="215,3,260,51" shape="rect">
</map>
...the duplicate links will reduce page performance and are not required.
W3C Reference [w3.org]
I realize the alt attribute represents the image should the image not load, but
<area href="keyword.html" title="keyword" alt="keyword" coords="214,57,259,102" shape="rect">
seems a little like overkill in one element.
Or maybe it is better to use "alt" instead of "title": should the image not load, the link structure would still be apparent (without needing to mouse over).
Although the W3C documents the use of both, I have yet to see a reference where both are used at the same time.
I never put much thought into this before... but with PDAs on the rise, the "alt" is a bigger consideration.
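For what it's worth, HTML 4.01 does permit both attributes on the same area element (alt is required there; title is optional). A sketch of what using both might look like, with file names, coordinates, and labels purely illustrative:

```html
<map name="nav">
  <!-- alt: read by non-graphical UAs and bots, and shown if the image doesn't load -->
  <!-- title: shown as a tool tip by UAs that support it -->
  <area href="widgets.html" shape="rect" coords="0,0,50,50"
        alt="Widgets" title="Browse our widget pages">
  <area href="gadgets.html" shape="rect" coords="55,0,105,50"
        alt="Gadgets" title="Browse our gadget pages">
</map>
```

The two attributes serve different audiences, so carrying both isn't redundant in principle, even if few pages in the wild do it.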
Out of curiosity, can you tell me how the duplicate links in text form will reduce page performance?
I thought "Link Text" was a scoring zone.
A link in anchor text or in an image isn't an issue; two links to the same location is. The exception is when one is a JavaScript link, or contained within an object (Flash, Shockwave, or other applets), and you need the second so a bot can credit the link and pages do not appear "orphaned".
That isn't the case when using an image map: all of its links are readable.
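To illustrate the exception mentioned above (the link target and markup are hypothetical): a script-driven link a bot can't follow, paired with a plain second link so the destination page isn't orphaned:

```html
<!-- the bot can't follow this; the URL only exists inside the script -->
<a href="#" onclick="window.location='widgets.html'; return false;">Widgets</a>

<!-- a readable duplicate so widgets.html still gets crawled and credited -->
<a href="widgets.html">Widgets</a>
```

With an image map there is no such need, since the href on each area is plain, crawlable HTML.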
Duplicate links have two problems:
1. PageRank transfers are biased, so strategic use of PageRank to distribute evenly throughout a site is hampered, and
2. Main menu link hierarchies are tricky, particularly if they appear throughout your site. If the bot can read and follow both links on every page (to every main topical page), this could flag a spam filter.
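As a sketch of problem 2 (page names are hypothetical): the same main-menu destination ends up linked twice on every page, once from the image map at the top and again from text navigation at the bottom:

```html
<!-- top of every page: graphical main menu -->
<img src="menu.jpg" usemap="#mainmenu" alt="Main menu">
<map name="mainmenu">
  <area href="topic1.html" shape="rect" coords="0,0,50,50" alt="Topic 1">
</map>

<!-- bottom of every page: text menu repeating the same destination -->
<a href="topic1.html">Topic 1</a>
```

Multiply that pattern by every menu item on every page and the bot sees the whole main hierarchy duplicated site-wide.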
I use left or right navigation, which is toward the top of the page as you scroll down. I always use text navigation on the bottom also, so visitors won't have to scroll back up to navigate the sites. I've got double navigation like that all over the place on sites and there's never been any problem, either with load time or search engines.
We don't know if text in the alt or title attributes has the same value as keywords in text links, but it certainly has value if it makes navigation easy to understand and use and we do know that the location on a page where keywords appear has importance.
It's probably not 100% necessary, but it doesn't hurt unless it significantly compromises design. Another alternative is to put some of the links within the body of text, if it's fitting for the design of the page.
Wouldn't the tool tip negate the need for the alt?
The thread that Marcia referenced gets into some of those details. The main issue is UA support, with some putting "alt" in the tool-tip and some using "title".
The Bobby recommendation [cast.org] is to use the alt attribute, and they are pretty much top dog when it comes to accessibility recommendations.
Relative to keywords, there is not a lot of evidence that current search engine algos take title attributes into account, or at least not for every type of tag. I've been doing some informal experiments in this area, and I haven't satisfied myself one way or the other.
There was a thread in the Google forum [I can't find it right now] where someone felt they had conclusively shown that Google was NOT using the title attribute at all. But then again, Google changes every month, doesn't it ;)
>I always use text navigation on the bottom also, so visitors won't have to scroll back up to navigate the sites. I've got double navigation like that all over the place on sites and there's never been any problem, either with load time or search engines.
Smaller sites stay off of Google's radar.
Usability is another major concern but:
(I can't seem to find the original studies)
Shorter Web Pages are Preferred [members.aol.com]
about halfway down.
If I'm not mistaken, the tools at Search Engine World are not dealing with XHTML at this time. There's been a question asked about that, but posting in our Community forum (or checking for a previous post there) would confirm it.
>Smaller site stay off of Google's radar.
Either way, large or small, there are a lot of factors involved in optimization or page construction, so it's hard to state things as an absolute unless there's a definite indication, and even then, multiple factors would have to be looked at. There have been some small sites that were definitely not off Google's radar; in fact, it's highly likely that they were individually looked at in a human review.
I'm not 100% sure that we're talking about small and large sites here; I'd suspect that how competitive the market is has more of a bearing on safe practices with respect to coming under scrutiny. The issue of image maps and how search engines deal with them is a design issue, and that can vary from one search engine to another.
It might be another topic entirely, but whether search engines will frown on practices that conform to specifications, whether through human review or automatically by algorithm, is a whole other debate.
But I think I'll run with the duplicate links for now, until I get a chance to validate the page at W3C and run some additional tests. I'll wait and see whether googlebot actually chews through the site, and then decide what to do.