-Squared
For Javascript, see [webmasterworld.com...] and the latter half of [webmasterworld.com...]
I don't think anyone is 100% sure whether PageRank is passed on to pages excluded by robots.txt, or whether it is preserved.
Either way, IMO it's a good thing to exclude pages like TOS and site-specific information, but not to JavaScript your outbound links :)
Depends on what you have available.
>On cloaking
Once you go to the dark side you will never return
What do you mean? I'm cloaking a left/right sidebar; I think left is more usual for humans. What's wrong with that?
I don't see myself cloaking content in the near future.
You should really be OK.
Cloaking is widely used for spamming SEs; once you have learnt how to cloak you may find the temptation too high.
Famous Last Words
"I'll Just cloak this page so that the SE can't see it........What the Hell I might as well show the SE a Content Rich Page and The Customers my Terms And Conditions it can't hurt"
DaveN
If I was you, I would use your terms and conditions/Privacy page in a positive way. You can add lots of information about your site and services in there (which can be found via a search if you're clever about it) and, of course, that makes 2 more pages for the Googlebot to chew on. Man, that Googlebot loves pages ;)
GoogleGuy once said -
*If you work really hard to boost your authority-like score while trying to minimize your hub-like score, that sets your site apart from most domains.*
Sounds like you wouldn't want to do that, so just let the PR flow, man!
IMO, JS links and robots.txt exclusion are both appropriate for getting a better PR distribution and they're not part of Google's "Don'ts". If a page is good for getting external inbound links I use JS links to link to it internally, if not, I exclude it by robots.txt.
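For anyone who hasn't seen it, a bare-bones illustration of the JS-link idea (the /terms.html target is just a made-up example, and this assumes the spider doesn't execute script - true of most bots at the moment, but no guarantee it stays that way):
[code]
<!-- normal anchor: spiders can follow it and PR flows through it -->
<a href="/terms.html">Terms &amp; Conditions</a>

<!-- script-driven link: works for visitors, but a spider that doesn't
     run JavaScript sees no crawlable href here -->
<a href="#" onclick="window.location.href='/terms.html'; return false;">Terms &amp; Conditions</a>
[/code]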
Way too broad of a statement. There is a huge difference between using IP delivery to cheat your way to the top and using it to properly control a particular bot.
When I go to www.google.co.uk I am served a different set of links than someone who is located in the U.K. This happens because Google has detected my IP and made the determination that certain links simply won't be relevant to me because I'm in the U.S.
Other high-profile sites detect UAs/IPs so that they can serve their pages without any advertising. It is extremely common. The fact that you check IPs before determining what configuration to use to deliver your content does not make you a spammer.
Using IP detection to deliver navigation that prevents Google from crawling content that will never provide any relevant information to a Google user is beneficial to both parties. And the intent behind using such a system is exactly the same as excluding the pages using a robots.txt file. The only difference is that using IP detection [b]works much better[/b].
With IP, I'm in control, and I don't have to trust that Google will actually stay out of my excluded content. I also don't have to display to the world a list of every page on my site that I don't want showing up in a search engine. Nor do I have to deal with the annoying habit Google has of returning excluded pages in their SERPs.
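Just to show the mechanics for anyone following along - a rough sketch, not necessarily how the poster actually does it; the user-agent test and file names are made up, and a careful setup would verify Googlebot with a reverse/forward DNS check rather than trusting the UA string:
[code]
// Sketch only: bots get a stripped-down navigation include,
// human visitors get the full one. All names here are hypothetical.
function navigationFor(userAgent) {
  var isKnownBot = /Googlebot|Slurp|msnbot/i.test(userAgent || "");
  // Bots never see links to TOS/privacy/session pages, so no PR is spent there;
  // visitors still get every link.
  return isKnownBot ? "nav-bots.html" : "nav-full.html";
}
[/code]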
If not serving a particular link improves PR flow to important pages with valuable content, while excluding pages via robots.txt does not, then that is a flaw in how Google deals with robots-excluded pages. If I exclude them, then Google shouldn't look at them, and the links pointing to those pages should not be used to calculate any PR scores.
>Way too broad of a statement. There is a huge difference between using IP delivery to cheat your way to the top and using it to properly control a particular bot.
Let's be honest here - cloaking is more often than not used to provide the search engine with content (not seen by the Web page visitor) to increase search engine placement. Google says "Don't Cloak" on their Webmaster page. So - Don't Cloak with Google...
Added: OK, so that is a little OT. Squared, look at ciml's post (#18) again - I think you could take that as an allaying of your worries. I agree that if you don't have external links on those pages and they only link to other pages on your site, the leakage shouldn't be a worry.
The log scale on the Toolbar makes a big difference. A 10% loss is about one twentieth of a notch on the Toolbar - the difference between 6.00 and about 5.95.
The effect of PR on ranking is applied in some similar manner, otherwise PR10 pages (which have many thousands of times the PR of PR5 pages) would rank top for all their words.
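Rough sketch of the arithmetic behind that, assuming the Toolbar value is roughly logarithmic in raw PR (the base is a guess - Google has never published one):
$$\text{Toolbar} \approx \log_B(\text{PR}), \qquad \log_B(0.9) = \frac{\ln 0.9}{\ln B} \approx -0.05 \ \text{when}\ B \approx 8$$
so a 10% raw-PR loss is about a twentieth of a notch, and a five-notch gap (PR10 vs PR5) is a factor of roughly $8^5 \approx 33{,}000$ - many thousands of times.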
Jane Doe, the Toolbar is showing guessed PageRank, not real PageRank.
>block them through robots.txt and see if that makes a difference.
As long as PageRank is a property of pages within the Google index, and not all pages, it should make a difference.
All this PR saving is just going to put you on the map as a PR miser in Google's little algos - now or in the future.
It's not that I am against it in theory - it is just the advantages are so small compared to the possible huge disadvantages.
The PR you save by doing this isn't going to help you much. The other things that will no longer be in the algo could hurt you.
It is not necessarily a BAD thing to give other sites PR.
Think very very carefully before you do something like this.
It doesn't quite work 100% that way. If your homepage is PR7, and assuming your other pages are PR6 and all linked to from the homepage and each other, there are now 19 PR6 interior pages linking to each other and back to the homepage. If you put the two pages, PP and TOS, in an /information/ directory and exclude that directory with robots.txt, that's 17 interior pages linked to from the homepage - 2 fewer, and 2 fewer PR6 pages linking to the rest of the site.
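For reference, the exclusion described there is only a couple of lines in robots.txt (directory name taken from the example above):
[code]
# robots.txt at the site root
User-agent: *
Disallow: /information/
[/code]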
Are all the pages linked to from the homepage and to each other now, or are some in directories? And how competitive is the category you're in? Are you competing against 3 million pages, a million, or 500K?
Added:
>It is not necessarily a BAD thing to give other sites PR.
Agreed. And if someone checks and catches it, it could cost you some good links and create some bad feelings. We don't have to, however, give away a disproportionately large amount of PageRank when we link back to people. There are honest ways to control that without looking like a miser.