
Image Map and Text Links

will this hurt?


lorax

1:42 am on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have a website with the main navigation as an image map. Each of the sections has text links down the side, so once I'm in a section I'm not worried. I've read (here at WebmasterWorld) that SEs don't like image maps, so I placed a duplicate set of main nav links in text format at the foot of the home page. This ties all the pages together as far as the Sim Spider is concerned, but will the SEs penalize me for a duplicate set of nav links? Or is it a moot point because they can't see the first set? Or have they gotten better about image maps?

<added>I know the topic of image maps and SEs has been discussed ad nauseam, but I've found conflicting answers</added>

fathom

3:44 am on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I've read (here at WebmasterWorld) that SEs don't like image maps

Rubbish! ;)

There is no problem here. An image map, if used properly, can give a good boost:

<img height="160" width="319" src="keyword.jpg" border="0" usemap="#keyword" alt="keyword">

<map name="keyword">

<area href="keyword9.html" title="keyword9" coords="108,110,153,156" shape="rect">
<area href="keyword8.html" title="keyword8" coords="54,109,99,155" shape="rect">
<area href="keyword7.html" title="keyword7" coords="0,110,46,158" shape="rect">
<area href="keyword6.html" title="keyword6" coords="266,57,314,103" shape="rect">
<area href="keyword5.html" title="keyword5" coords="214,57,259,102" shape="rect">
<area href="keyword4.html" title="keyword4" coords="162,57,206,103" shape="rect">
<area href="keyword3.html" title="keyword3" coords="109,57,154,103" shape="rect">
<area href="keyword2.html" title="keyword2" coords="55,57,101,102" shape="rect">
<area href="keyword1.html" title="keyword1" coords="215,3,260,51" shape="rect"></map>

...the duplicate links will reduce page performance and are not required.

tedster

9:14 am on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Each area tag should also have its own alt attribute - very handy for both human accessibility and search engines.

W3C Reference [w3.org]
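For instance, fathom's map above could carry an alt on each area like this (keyword.jpg and the keywordN.html filenames are just the placeholders from that post, not real files):

```html
<img height="160" width="319" src="keyword.jpg" border="0" usemap="#keyword" alt="keyword">

<map name="keyword">
<area href="keyword1.html" title="keyword1" alt="keyword1" coords="215,3,260,51" shape="rect">
<area href="keyword2.html" title="keyword2" alt="keyword2" coords="55,57,101,102" shape="rect">
</map>
```

A browser that can't load the image can then still show a clickable label for each region.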

piskie

9:29 am on Nov 13, 2002 (gmt 0)

10+ Year Member



Fathom
For my curiosity, can you tell me how the duplicate links in text form will reduce page performance?

I thought "Link Text" was a scoring zone.

fathom

9:37 am on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



tedster, wouldn't the tool tip negate the need for the alt?

I realize the alt represents the image should the image not load, but

<area href="keyword.html" title="keyword" alt="keyword" coords="214,57,259,102" shape="rect">

seems a little overkill in one element.

Or maybe it is better to use "alt" instead of "title": should the image not load, the link structure would still be apparent (without needing to mouse-over).

Although W3C quotes the use of both, I have yet to see a reference where both are used (at the same time).

Never put much thought to this before... but with PDAs on the rise, the "alt" is a bigger consideration.

fathom

10:00 am on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



For my curiosity, can you tell me how the duplicate links in text form will reduce page performance?
I thought "Link Text" was a scoring zone.

The link in anchor text or an image isn't an issue; two links to the same location is. The exception is when one is a JavaScript link, or contained within an object (Flash, Shockwave, or other applets), and you need the second so a bot can credit the links and pages don't appear "orphaned".

That isn't the case with an image map; all the links are readable.

Duplicate links have two problems:

1. PageRank transfers are biased, so strategic use of PageRank to distribute evenly throughout a site is hampered, and

2. Main menu link hierarchies are tricky, particularly if they appear throughout your site. If the bot can read and follow both links on every page (to every main topical page) twice, this could trip a spam filter.

Marcia

10:03 am on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Recent short discussion
[webmasterworld.com...]

I use left or right navigation, which is toward the top of the page as you scroll down. I always use text navigation on the bottom also, so visitors won't have to scroll back up to navigate the sites. I've got double navigation like that all over the place on sites and there's never been any problem, either with load time or search engines.

We don't know if text in the alt or title attributes has the same value as keywords in text links, but it certainly has value if it makes navigation easy to understand and use and we do know that the location on a page where keywords appear has importance.

It's probably not 100% necessary, but it doesn't hurt unless it significantly compromises design. Another alternative is to put some of the links within the body of text, if it's fitting for the design of the page.

tedster

10:37 am on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



wouldn't the tool tip negate the need for the alt?

The thread that Marcia referenced gets into some of those details. The main issue is UA support, with some putting "alt" in the tool-tip and some using "title".

The Bobby recommendation [cast.org] is to use the alt attribute, and they are pretty much top dog when it comes to accessibility recommendations.

Relative to keywords, there is not a lot of evidence that current search engine algos take title attributes into account, or at least not for every type of tag. I've been doing some informal experiments in this area, and I haven't satisfied myself one way or the other.

There was a thread in the Google forum [I can't find it right now] where someone felt they had conclusively shown that Google was NOT using the title attribute at all. But then again, Google changes every month, doesn't it ;)

fathom

10:46 am on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I always use text navigation on the bottom also, so visitors won't have to scroll back up to navigate the sites. I've got double navigation like that all over the place on sites and there's never been any problem, either with load time or search engines.

Smaller sites stay off Google's radar.

Usability is another major concern but:

(Can't seem to find the original studies)

Shorter Web Pages are Preferred [members.aol.com]

about half way down.

lorax

2:17 pm on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks all.

fathom, why would the Sim Spider here at SEW not follow the links of the image map then? If I add the text links everything works out.

fathom

6:29 pm on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I would think it should.

Any chance they're JavaScript links?

lorax

6:37 pm on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Nope - straight map links.

<img src="/img/nav.gif" width="748" height="16" usemap="#name" alt="" />

<map name="somename" id="somename">
<area shape="rect" coords="0,0,154,15" href="/subdir/filename.html" alt="appropriate keyword desc" />
</map>

fathom

6:51 pm on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



XHTML, eh?

This is probably the problem. Sticky Brett.

fathom

6:59 pm on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Actually, a standard link has a closing tag, while in an image map the link is contained within the map element, so it's possible the " />" is not required:

<area shape="rect" coords="0,0,154,15" href="/subdir/filename.html" alt="appropriate keyword desc">
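For comparison, the two flavors side by side (reusing the attribute values from lorax's snippet; nothing about the coords or filenames is special). A sketch, on the assumption that an older parser that doesn't understand XHTML may stumble over the trailing slash:

```html
<!-- HTML 4.01 style: area is an empty element, no trailing slash -->
<area shape="rect" coords="0,0,154,15" href="/subdir/filename.html" alt="appropriate keyword desc">

<!-- XHTML 1.0 style: empty elements are self-closed with " />" -->
<area shape="rect" coords="0,0,154,15" href="/subdir/filename.html" alt="appropriate keyword desc" />
```

If the Sim Spider only speaks plain HTML, the first form is the safer bet.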

Marcia

10:50 pm on Nov 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



/>

If I'm not mistaken, I don't believe that the tools at Search Engine World are dealing with xhtml at this time. There's been a question asked about that, but posting in our Community forum (or checking for a previous post there) would confirm it.

>Smaller sites stay off Google's radar.

Either way, large or small, there are a lot of factors involved in optimization or page construction, so it's hard to state things as an absolute unless there's a definite indication, and even then, multiple factors would have to be looked at. There have been some small sites that were definitely not off Google's radar; in fact it's highly likely that they were subject to human review. It would be very surprising if they weren't individually looked at by Google.

I'm not 100% sure that we're talking about small and large sites here, but I'd suspect that how competitive the market is would have more of a bearing on safe practices in respect to coming under scrutiny. The issue of image maps and how search engines deal with them is the design issue, and that can vary from one search engine to another.

It might be another topic entirely, but the question of whether search engines will frown on practices that conform to specifications, either with human review or automatically by algorithm, could be a whole other topic for debate and discussion.

lorax

2:23 am on Nov 14, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



fathom & Marcia,
Thanks for your replies. We're talking about a small website of about 20 pages for a Solid Waste District. The old version of the site had a javascript navigation - which I immediately threw out. I tried to get them to use text links only but they're really stuck on their typeface of choice for the nav text. Seperate images were behaving badly (it's all tedster's fault for not helping me with the CSS - of course I should have posted a question too ;) )in the variety of browsers we targeted so an image map became the solution.

But I think I'll run with the duplicate links for now until I get a chance to validate the page at W3C and run some additional tests. I'll wait to see if googlebot actually chews through the site or not and then make a decision as to what to do.