Forum Moderators: open
I have added a site map, with a link to it, but this means all pages are a minimum of two clicks from home, and PR suffers as a result.
I have now placed the contents of the sitemap on the home page and hidden it with a style. This is NO attempt to trick search engines; all I want to do is get the main pages back to one click from home. Placing HTML links on the home page would ruin the design.
Is this likely to attract a penalty from Google?
I doubt I would fail a human check, as all I have done is place the sitemap on the home page and hide it; the sitemap is identical to the one people can reach from the sitemap link on the home page.
Can google read css and spot I am hiding a layer, and even if it does, do I get penalised?
Can google read css and spot I am hiding a layer..
Not as far as anyone knows. There have been some reports of Gbot indexing CSS, but I've seen no proof yet, and even if it could, I would still doubt its ability to detect "hidden" things any better than it can within HTML. After all, most dropdown menus are hidden at the start :)
Having said that, why not make it a "usability aid"? Keep it hidden on the home page, but offer the viewer (e.g. via a JS "toggle" button) the choice of making the div/layer visible as an aid to their surfing experience... no different from a dropdown menu, really.
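A minimal sketch of that toggle idea (the element id, function name, and markup are my own placeholders, not from this thread):

```javascript
// Sketch of a "usability aid" toggle for the sitemap layer.
// The function name and the stub element are assumptions for illustration.
function toggleSitemap(el) {
  // Flip between hidden and visible, just like a dropdown menu would.
  el.style.display = (el.style.display === 'none') ? 'block' : 'none';
  return el.style.display;
}

// In a real page this would be wired to a button, e.g.:
//   <a href="#" onclick="toggleSitemap(document.getElementById('sitemap')); return false;">
//     Show/hide site map</a>

// Stub element so the logic can be exercised outside a browser:
var sitemap = { style: { display: 'none' } };
toggleSitemap(sitemap); // now visible ("block")
toggleSitemap(sitemap); // hidden again ("none")
```

Since the visitor can reveal the layer at will, it behaves like any other collapsed menu rather than text hidden only from humans.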
Suzy
The link to the sitemap places all my pages two clicks from home and makes it look like my home page has only one link. As a result, all my inner pages are PR3 and the home page is PR5; I'd expect them to be PR4.
I'm taking the risk.
You take the risk of getting banned just to increase the PR of some of your inner pages? Sounds to me like someone requesting another card at blackjack - when the dealer has 18 and you have 19 ;)
I wouldn't hide any text on the page either. Sooner or later Google will be able to read/understand it, and you will find out the hard way, since you never know when that time will come.
It is like playing Russian roulette... keep pulling the trigger... you never know when the bullet is in the chamber.
I would use absolute URLs (http://www.widget.com/internalpage.htm) in the menus instead of relative URLs (/internalpage.htm); that way they will be fully present in your source, and Googlebot will most likely pick them all up correctly.
Using CSS to do this is practised extensively by adult sites, and when asked about the practice, GoogleGuy stated that Google doesn't currently have a method of seeing it. He went on to say that even though Google can't catch it, he was sure that your competitors would, and that they would report it to Google.
I'm not quoting him exactly, but the message I got was: don't do it.
You could:
1. Set up the full URL for your JavaScript links, e.g. [yourdomain.com...] instead of just /inner_page.html
and/or
2. Put the site map in a <noscript> </noscript> tag, thus helping any surfer who doesn't have JavaScript as well as letting the bot through.
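A rough sketch of the second suggestion (the script name, URLs, and page names are placeholders, not from the thread):

```html
<!-- JavaScript-driven menu for normal visitors (hypothetical file name) -->
<script type="text/javascript" src="menu.js"></script>

<!-- Plain HTML fallback for surfers (and bots) without JavaScript.
     The URLs below are placeholders for the real site map links. -->
<noscript>
  <ul>
    <li><a href="http://www.example.com/products.html">Products</a></li>
    <li><a href="http://www.example.com/services.html">Services</a></li>
    <li><a href="http://www.example.com/contact.html">Contact</a></li>
  </ul>
</noscript>
```

Browsers with JavaScript enabled ignore the <noscript> content entirely, so the fallback links never interfere with the scripted menu.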
If someone were to visit your page with such a browser, it would reveal that you have hidden text. Even worse, it would reveal that you have a hidden link.
As long as it is only Googlebot that comes to your site, you are fine, but a human check would spot the hidden link quickly.
The hidden links are IDENTICAL to the JavaScript links. I could do the JavaScript links with layers instead, with some sort of show/hide function, but it is not as pretty as the JavaScript version.
All the anchor text is the same. All I want to do is
(1) get HTML links on the home page instead of non-Google-friendly JavaScript links.
Does <noscript> give me this advantage? I cannot believe it is given the same weight as links on a page.
I am happy for my competition to spot it and dob me in, as I don't believe Google would see it as doing anything wrong.
You can have your beautiful page (which will not work for the approximately 13% of people who surf with JS off) and do as you please.
Or, you can take into account that while google may follow URLs that are in JS, they will not pass Pagerank, and change your site to make it search engine friendly.
Or you can do what Google specifically tells you not to do (if you want to be in their index, that is) in the hope that it will work out. But do not come back here whining about having been dinged.
If Google traffic is important to you, you should take their rules into account when designing a site.
Design is driven by the client's specification. If it were 100% my call, there would be no JavaScript links, but the client pays my bills.
Access is NOT an issue: if JavaScript is off, the links in the div SHOW. I'm using JavaScript to load a style sheet that hides the div. With no JavaScript, the hidden div shows at the bottom of the site, with a "JavaScript off - please use bottom navigation" message at the top of the screen.
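One plausible way to do what the poster describes: hide the fallback div by attaching a stylesheet from JavaScript, so the hiding rule only ever takes effect when JS is running. The file name, id, and function name here are my own placeholders:

```javascript
// Sketch: load a stylesheet from JavaScript so the fallback nav div is
// hidden only for visitors whose browsers execute scripts.
// "hide-fallback.css" and "#fallback-nav" are hypothetical names; the
// stylesheet would contain something like:  #fallback-nav { display: none; }
function hideFallbackNav(doc) {
  var link = doc.createElement('link');
  link.rel = 'stylesheet';
  link.type = 'text/css';
  link.href = 'hide-fallback.css';
  doc.getElementsByTagName('head')[0].appendChild(link);
  return link;
}

// In the page itself this would simply be called as:
//   hideFallbackNav(document);
```

With no JavaScript, the stylesheet is never attached, so the div full of plain HTML links stays visible, exactly as described above.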
I believe it would pass a human test for the reasons already given, and it appears to be there ONLY for non-JavaScript users, and the link anchor text is identical to the JavaScript text.
A one-point PR boost is a tenfold increase that is worth a great deal as far as I am concerned.
Thanks for all opinions, it appears I am fine to use it for now. I will not return and whine if it does not work out.
This sentence doesn't make sense. One click from home - and then you hide the links so your visitors can't see them and make that one click? That's very bad usability.
If your design doesn't support what you want to do with the site, then the design is wrong. Change the design and make those links visible if one-click access is what you want - your visitors will probably like it a whole lot more than hidden links.
>> all navigation on the home page being javascript
That's very bad usability as well. Make sure you can read and navigate the site using a Lynx browser.
>> Why did you bother asking here if you already made up your mind?
> Is this likely to attract a penalty from google?
a. This was my question.
b. I did not know the answer.
c. I was given the answer (answer = probably safe from Google, but carries some risk from a human check).
d. I advised that it was cool on the bot check, that I also thought I would pass a human check, and of my decision to proceed.
Please explain where in this progression of events you think I gave any indication of having already made up my mind?
Claus
>> That's very bad usability as well. Make sure you can read and navigate the site using a Lynx browser.
What I am doing MAKES the site work in a Lynx browser, as the div with the text links comes into play and is usable when the JavaScript links are not.
I have added a site map, with a link to it, but this means all pages are a minimum of two clicks from home, and PR suffers as a result.
Sorry to go back to the beginning somewhat, but is it true that PR suffers from having a site map linked from every page?
If so, why? Why would G penalize people for making their site more user friendly? It doesn't make sense to me - can anyone explain?
H.
Putting a complete sitemap on each page is not necessarily very user friendly. There are people who claim that there should never be more than 7 links on a page. This may be a bit exaggerated, but there is no doubt that pages with many links can be confusing.
And Google might regard it as some kind of internal link spamming that might trigger a filter of some kind.
Ordinary webmasters like us should care very little about it and instead concentrate on user friendliness.
That's what I thought :)
Putting a complete sitemap on each page is not necessarily very user friendly
I wasn't suggesting actually putting the whole site map on every page (!), just a link in each page's footer to one main site map, which will itself necessarily have a lot of links, but I think it could be helpful to some people visiting our site.
Not that I can tell. I have a 3,000-page site that has a link to the site map on every page. I've never had any problems, PR or otherwise, from it.
Besides, it is good to have for your visitors, too.
With best regards,
Christian
I'm not suggesting this as good practice. And I stress - I don't do it. And maybe he and all the others will get caught one day. But we shouldn't get over-paranoid here. It is a realistic option.
The problem is that, even though it may work now, there's a chance Google will figure it out and not just move your site down in the rankings, but actually penalize it. It will take you months to get the penalty removed if at all.
I don't quite see the point of keyword stuffing with hidden text anyway. Why not just write some actual text for the visitors that includes your most important KW several times?
Keyword stuffing used to work in the early days when the search algos were very simple. Kind of like: Page A mentions the keyword 5 times, but page B mentions it 20 times, so I'll rank page B higher. To rank well in Google, you need a natural KW density, and you can achieve that by writing real text.
But as for the links, I would still hide them with <noscript> and make your site usable for visitors with JavaScript disabled in the process.
Christian