

hidden sitemap... is that ok?

trying to find out if using CSS to hide a layer with links in it will attract a penalty

         

nippi

10:57 pm on May 8, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have a new site, with all navigation on the home page being javascript, so Google cannot follow the links.

I have added a site map, with a link to it, but this means all pages are a minimum of two clicks from home, and PR suffers as a result.

I have now placed the contents of the sitemap on the home page and hidden it with a style. This is NO attempt to trick search engines; all I want to do is get the main pages back to one click from home. Placing html links on the home page would ruin the design.

Is this likely to attract a penalty from Google?

Jesse_Smith

3:54 am on May 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Rumor has it that Google is testing out JavaScript links. If you have logs, look for Googlebot/Test. That's what shows up when it tries to get JavaScript stuff. Try not to have anything hidden. Heck, simply having hidden text can get you banned!

molsmonster

5:21 am on May 9, 2004 (gmt 0)

10+ Year Member



Usability trumps visual design.

nippi

8:18 am on May 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



OK on hidden text, but I am not sure whether hiding via CSS is spottable by Google.

I doubt I would fail a human check, as all I have done is place the sitemap on the home page and hide it. The sitemap is identical to the one people can get to from the sitemap link on the home page.

Can Google read CSS and spot that I am hiding a layer, and even if it does, do I get penalised?

ThomasB

9:27 am on May 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google can recognize and follow pretty much everything that looks like a URL, even if it's in JS. The GB/Test rumors are another reason why you shouldn't be too nervous about that. I'd simply add one visible link to a sitemap at the bottom, and GB will definitely see and follow it. Never hide anything unless you are 100% sure you know how to do it.

nippi

10:12 am on May 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If I link to the sitemap, that places all my pages 2 clicks from home and makes it look like my home page only has one link. As a result, all my inner pages are PR3, home page PR5. I'd expect them to be PR4.

I'm taking the risk.

SuzyUK

10:52 am on May 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Can Google read CSS and spot I am hiding a layer..

Not as far as anyone knows. There have been some reports of Gbot indexing CSS, but I've seen no proof yet, and even if it could, I would still doubt its ability to detect "hidden" things any better than it can within HTML. After all, most dropdown menus are hidden at the start :)

However, in saying that, why not make it a "usability aid"? Keep it hidden on the home page but offer the viewer (e.g. via a JS "toggle" button) the choice of making the div/layer visible as an aid to their surfing experience. It's no different from a dropdown menu really.

Suzy
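SuzyUK's "toggle" suggestion could be sketched roughly like this (a hypothetical example; the id, button label, and link targets are all invented, not from the thread):

```html
<!-- The sitemap layer starts hidden; a visible button lets the visitor
     reveal it, so the hidden content doubles as a usability aid. -->
<button onclick="toggleSitemap()">Show/hide site map</button>

<div id="sitemap" style="display: none">
  <a href="/products.htm">Products</a>
  <a href="/about.htm">About us</a>
  <!-- ...rest of the sitemap links... -->
</div>

<script type="text/javascript">
function toggleSitemap() {
  var el = document.getElementById("sitemap");
  // Flip between hidden and the element's default display.
  el.style.display = (el.style.display == "none") ? "" : "none";
}
</script>
```

With the div toggled by a user-visible control, the links are "hidden" only in the sense that a closed dropdown menu is hidden.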

the_nerd

11:05 am on May 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If I link to the sitemap, that places all my pages 2 clicks from home and makes it look like my home page only has one link. As a result, all my inner pages are PR3, home page PR5. I'd expect them to be PR4.
I'm taking the risk.

You'd take the risk of getting banned just to increase the PR of some of your inner pages? Sounds to me like someone requesting another card at blackjack - when the dealer has 18 and you have 19 ;)

mars9820

12:27 pm on May 9, 2004 (gmt 0)

10+ Year Member



I am pretty much with the others here.

I wouldn't hide any text on the page either. Sooner or later Google will be able to read/understand it, and you will find out the hard way, since you never know when that time will come.

It is like playing Russian roulette... keep pulling the trigger... you never know when the bullet is in the chamber.

I would use absolute URLs (http://www.widget.com/internalpage.htm) in the menus instead of relative URLs (/internalpage.htm). That way they appear in full in your source, and Googlebot will most likely catch them all correctly.
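The absolute-vs-relative idea above can be illustrated with a small sketch in plain JavaScript (the `toAbsolute` helper and the widget.com domain are illustrative, not anything from the thread):

```javascript
// Hypothetical helper: turn a root-relative path into an absolute URL,
// so the full address appears literally in the page source where a
// crawler can pick it up as plain text.
function toAbsolute(base, path) {
  // Leave already-absolute URLs alone.
  if (/^https?:\/\//.test(path)) {
    return path;
  }
  // Join base and root-relative path, avoiding a double slash.
  return base.replace(/\/+$/, "") + path;
}

// Example menu, mixing relative and absolute entries.
var base = "http://www.widget.com";
var paths = ["/internalpage.htm", "/products.htm", "http://www.widget.com/about.htm"];
var menu = [];
for (var i = 0; i < paths.length; i++) {
  menu.push(toAbsolute(base, paths[i]));
}
// menu now holds three fully qualified URLs.
```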

nippi

8:55 pm on May 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Then the point is... is ALL hidden text penalisable? There are heaps of instances where text is hidden except when the user does something on the page.

I know of one site where, when you mouse over a map, hidden divs appear with data for that region...

The site ranks well.

metrostang

9:47 pm on May 9, 2004 (gmt 0)

10+ Year Member



The site you mentioned shows the info to the viewer of the web site when moused over. What you have done is the equivalent of hiding from the viewer the keywords for which you want to rank highly, while showing them to the SE.

Using CSS to do this is used extensively by adult sites, and when asked about the practice, GoogleGuy stated that Google doesn't currently have a method of seeing it. He went on to say that even though Google can't catch it, he was sure that your competitors would, and that they would report it to Google.

I'm not quoting him exactly, but the message I got was don't do it.

trimmer80

10:23 pm on May 9, 2004 (gmt 0)

10+ Year Member



Flame me down if I am wrong, but...

You could:
1. Set up the full URL for your JavaScript links, e.g. [yourdomain.com...] instead of just /inner_page.html

and/or

Put the site map in a <noscript> </noscript> tag, thus helping any surfer who doesn't have JavaScript as well as letting the bot through.
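A minimal sketch of that noscript approach (the URLs, link text, and menu placeholder are invented for illustration):

```html
<!-- JavaScript menu for normal visitors... -->
<script type="text/javascript">
  /* ...existing JS navigation goes here... */
</script>

<!-- ...and plain links for surfers with JS disabled. -->
<noscript>
  <a href="/inner_page.html">Inner page</a>
  <a href="/sitemap.html">Site map</a>
</noscript>
```

Browsers with JavaScript enabled render nothing for the noscript block, while text browsers and visitors with scripting off see ordinary links; the plain URLs also sit in the page source for any bot reading the HTML.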

BigDave

12:04 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google can already automatically identify hidden text on a page, but not with Googlebot itself; it is most likely done with a customized browser.

If someone were to go to your page with this browser, it would signal that you have hidden text. Even worse, it would signal that you have a hidden link.

As long as it is only Googlebot that comes to your site, you are fine, but a human check would be able to spot the hidden link quickly.

nippi

12:37 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



But as I keep saying:

The hidden links are IDENTICAL to the javascript links. I could do the javascript links with layers instead, using some sort of show/hide function, but it is not as pretty as the javascript version.

All the anchor text is the same. All I want to do is get html links on the home page instead of non-Google-friendly javascript links.

Does <noscript> give me this advantage? I cannot believe it is given the same weight as links on a page.

I am happy for my competition to spot it and dob me in, as I don't believe Google would see it as doing anything wrong.

jimh009

1:29 am on May 10, 2004 (gmt 0)

10+ Year Member



Nippi,

You are playing with fire, in my opinion. Google doesn't like anything hidden. Sooner or later the bot will pick up on it and, when it does, there goes your site.

You want to potentially penalize your site just to get a 1-point PR boost on some of your interior pages?

BigDave

3:10 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



nippi,

You can have your beautiful page (which will not work for the approximately 13% of people who surf with JS off) and do as you please.

Or you can take into account that while Google may follow URLs that are in JS, they will not pass PageRank, and change your site to make it search engine friendly.

Or you can do what Google specifically tells you not to do (if you want to be in their index, that is) in the hope that it will work out. But do not come back here whining about having been dinged.

If Google traffic is important to you, you should take their rules into account when designing a site.

nippi

3:25 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Big Dave.

Design is client specification driven. If it was 100% my call, there would be no javascript links, but the client pays my bills.

Access is NOT an issue: if javascript is off... the links in the div SHOW. I'm using javascript to load a style sheet that hides the div. If there is no javascript, the hidden div shows at the bottom of the site, with a "javascript off - please use bottom navigation" message at the top of the screen.

I believe it would pass a human test for the reasons already given; it appears to be there ONLY for non-javascript users, and the link anchors are identical to the javascript text.

A PR1 boost is a tenfold increase, and that is worth a great deal as far as I am concerned.

Thanks for all the opinions; it appears I am fine to use it for now. I will not return and whine if it does not work out.
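The setup nippi describes (JavaScript loading a stylesheet that hides the fallback div) might look roughly like this; the file name, id, and links are invented, not taken from the site in question:

```html
<!-- With JS on, write in a stylesheet that hides the fallback
     navigation; with JS off, the div and its links stay visible. -->
<script type="text/javascript">
  // hide-fallback.css would contain: #fallback-nav { display: none; }
  document.write('<link rel="stylesheet" type="text/css" href="/hide-fallback.css">');
</script>

<div id="fallback-nav">
  <p>JavaScript off - please use the navigation below.</p>
  <a href="/products.htm">Products</a>
  <a href="/sitemap.htm">Site map</a>
</div>
```

Because the hiding rule only ever arrives via script, a no-JS browser (Lynx included) never loads it and gets working text links by default.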

BigDave

3:54 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Nippi,

Why did you bother asking here if you already made up your mind?

claus

5:50 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> NO attempt to trick search engines, all I want to do is get the main pages back to one click from home

This sentence doesn't make sense. One click from home - and then you hide the links so your visitors can't see them and make that one click? That's very bad usability.

If your design doesn't support what you want to do with the site, then the design is wrong. Change the design and make those links visible if one-click access is what you want - your visitors will probably like it a whole lot more than hidden links.

>> all navigation on the home page being javascript

That's very bad usability as well. Make sure you can read and navigate the site using a Lynx browser.

nippi

7:50 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Big Dave

>> Why did you bother asking here if you already made up your mind?

> Is this likely to attract a penalty from google?

a. This was my question.
b. I did not know the answer.
c. I was given the answer (answer = probably safe from Google, but carries some risk from a human check).
d. I said the bot check sounded fine, noted that I thought I would pass a human check too, and announced my decision to proceed.

Please explain where in this progression of events you think I gave any indication of having already made up my mind?

Claus

>> That's very bad usability as well. Make sure you can read and navigate the site using a Lynx browser.

What I am doing MAKES the site work in a Lynx browser, as the div with the text links in it comes into play and is usable when the javascript links are not.

HelenDev

9:20 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have added a site map, with a link to it, but this means all pages are a minimum of two clicks from home, and PR suffers as a result.

Sorry, going back to the beginning somewhat, but is it true that PR suffers from having a site map linked from every page?

If so, why? Why would G penalize people for making their site more user friendly? It doesn't make sense to me - can anyone explain?

H.

troels nybo nielsen

9:48 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Helen, internal distribution of PR is a very complicated matter, and I believe that ordinary webmasters like us should care very little about it and instead concentrate on user friendliness.

Putting a complete sitemap on each page is not necessarily very user friendly. There are people who claim that there should never be more than 7 links on a page. This may be a bit exaggerated, but there is no doubt that pages with many links can be confusing.

And Google might regard it as some kind of internal link spamming that might trigger a filter of some kind.

HelenDev

9:54 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



ordinary webmasters like us should care very little about it and instead concentrate on user friendliness

That's what I thought :)

Putting a complete sitemap on each page is not necessarily very user friendly

I wasn't suggesting actually putting the whole site map on every page (!), just a link in each page footer to one main site map, which will itself necessarily have a lot of links. I think it could be helpful to some people visiting our site.

troels nybo nielsen

9:56 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sorry for misunderstanding you. Your solution sounds fine.

nippi

10:55 am on May 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>SuzyUK

Sorry, missed your post

I will do this.

Thank you.

jimh009

10:58 am on May 10, 2004 (gmt 0)

10+ Year Member



>> Sorry, going back to the beginning somewhat, but, is this true that PR suffers from having a site map linked from every page?

Not that I can tell. I have a 3,000-page site that has a link to the site map on every page. I have never had any problems, PR or otherwise, from it.

Besides, it is good to have for your visitors, too.

erwig

12:24 am on May 12, 2004 (gmt 0)

10+ Year Member



How about putting the site map in <noscript> tags after the JavaScript links? That way, Google should be able to follow it even if it doesn't read the JavaScript links. The only users who would see the site map would be the ones with JavaScript disabled. You may sacrifice your design a little for that small percentage, but that seems to be the most straightforward and honest solution.

With best regards,

Christian

frances

2:11 am on May 12, 2004 (gmt 0)

10+ Year Member



This is going even further back to the beginning. But there are pages and pages of respectable sites (not mine) doing extremely well in the SERPs with reams of hidden text all directed at search engines. And the text isn't even always hidden with CSS. One of my competitors has paragraphs of rubbish and links to himself on his home page, hidden away by font color, and he ranks about sixth. He's certainly been doing it for a couple of years.

I'm not suggesting this as good practice. And I stress - I don't do it. And maybe he and all the others will get caught one day. But we shouldn't get over-paranoid here. It is a realistic option.

erwig

5:19 pm on May 12, 2004 (gmt 0)

10+ Year Member



I've noticed too that it seems to work for some sites. I have a competitor whose homepage is only an image map. I was surprised that he ranked quite highly on Google without any text on the page. (He only has a PR4 and only about three on-topic links.) Then I saw the keyword stuffing: simple, old-fashioned size-1 font in the same colour as the background. I thought that wouldn't work anymore.

The problem is that, even though it may work now, there's a chance Google will figure it out and not just move your site down in the rankings but actually penalize it. It will take you months to get the penalty removed, if at all.

I don't quite see the point of keyword stuffing with hidden text anyway. Why not just write some actual text for the visitors that includes your most important KW several times?

Keyword stuffing used to work in the early days when the search algos were very simple. Kind of like: Page A mentions the keyword 5 times, but page B mentions it 20 times, so I'll rank page B higher. To rank well in Google, you need a natural KW density, and you can achieve that by writing real text.

But as for the links, I would still hide them with <noscript>, and make your site usable for visitors with JavaScript disabled in the process.

Christian