
Google News Archive Forum

Plugging the page rank leak.
How do you link out without linking out?
netnerd

Msg#: 22642 posted 4:54 pm on Mar 15, 2004 (gmt 0)

I'm trying to design a directory site which won't leak lots of PageRank to other sites.

Previously, you could just do a redirect script with PHP or something, but if you have the URL in the link to the script, I have a sneaking suspicion Google treats this like a link.
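A minimal sketch of the kind of PHP redirect script being described - the go.php filename and the url parameter are only placeholder names, not anything from this thread:

<?php
// go.php - the link carries the destination as a query parameter, e.g.
// <a href="go.php?url=http%3A%2F%2Fexample.com%2F">Example</a>
$url = isset($_GET['url']) ? $_GET['url'] : '/';

// Only forward absolute http/https URLs; fall back to the homepage otherwise.
if (!preg_match('#^https?://#i', $url)) {
    $url = '/';
}

header('Location: ' . $url);  // sends a 302 redirect by default
exit;
?>

The worry here is that, because the destination URL is visible right there in the href, Google may simply treat it as an ordinary link to that URL.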

The other option would be to use JavaScript, but again it would appear Google now follows that too.

Any suggestions?

 

abates

Msg#: 22642 posted 8:42 pm on Mar 15, 2004 (gmt 0)

Perhaps you could try naming all of your pages "links.html"? :)

KevinC

Msg#: 22642 posted 8:51 pm on Mar 15, 2004 (gmt 0)

Well, that links.html theory is, how do you say... full of leaks ;)

I haven't seen any cases of JS links being spidered - anybody have evidence to the contrary?

veedub

Msg#: 22642 posted 9:34 pm on Mar 15, 2004 (gmt 0)

I have seen Google spidering JavaScript on several sites I work on.
I am not sure if Google actually understands the script, but it will try to follow pieces of URL strings like "directory/widgets.html" that you will most likely have somewhere in your script.

Also someone from Google officially said that they will "soon" be able to deal with JS. I read about it on the boards here somewhere.

GoogleGuy

Msg#: 22642 posted 9:36 pm on Mar 15, 2004 (gmt 0)

Personally, I'd put your main links page in a section that was excluded by robots.txt.

Just my two cents..
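A minimal robots.txt sketch of the exclusion GG describes, assuming the links pages live under a hypothetical /links/ directory:

User-agent: *
Disallow: /links/

Compliant spiders then won't fetch anything under /links/, so the outbound links on those pages won't be followed - though, as noted later in the thread, the excluded pages themselves also won't get indexed.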

veedub

Msg#: 22642 posted 9:40 pm on Mar 15, 2004 (gmt 0)

Maybe I should add that I first noticed JS being spidered and "hidden" pages getting indexed about a month ago. It must be a new "feature"...

KevinC

Msg#: 22642 posted 9:45 pm on Mar 15, 2004 (gmt 0)

Oh sure, you could exclude the directory - but then you're not going to get those pages indexed.

bakedjake

Msg#: 22642 posted 10:03 pm on Mar 15, 2004 (gmt 0)

Personally, I'd put your main links page in a section that was excluded by robots.txt.

GG - just to get this straight, you are advocating blocking robots access to a links page?

Why hide legitimate links from Google?

In fact, this seems like a bad idea - a site with a ton of incoming links but no outbound links could look very spammy...

jcoronella

Msg#: 22642 posted 10:12 pm on Mar 15, 2004 (gmt 0)

Personally, I think that if a website doesn't want to REALLY support a site by linking to it, but does so for other reasons, it is in Google's interest to know this. The whole PR system that Google has in place is a way for sites to 'vote' for other sites, and it makes sense that if you really don't think site A is very good, but you need to link to it, that vote shouldn't count. Nor would Google want it to count.

Sites that feel the need to 'hide' links are likely NOT highly regarded by Google - at least in theory - and you are just helping them do their job by not spreading your PR around.

Jake has a good point about sites with no outgoing links sending up the 'spammy' flag. There is also evidence that good on-topic outgoing links help your ranking.

steveb

Msg#: 22642 posted 10:45 pm on Mar 15, 2004 (gmt 0)

"GG - just to get this straight, you are advocating blocking robots access to a links page?"

Well no, he obviously didn't say that. He is just suggesting how to accomplish the non-leak asked about in the first post.

But to the question: you can't (or shouldn't) be able to have your cake and eat it too. If you want your pages indexed and you link out, that *is* a PR vote. It's the whole point of the thing.

yonnermark

Msg#: 22642 posted 10:47 pm on Mar 15, 2004 (gmt 0)

"GG - just to get this straight, you are advocating blocking robots access to a links page?"

Well no, he obviously didn't say that. He is just suggesting how to accomplish the non-leak asked about in the first post.

Same thing

steveb

Msg#: 22642 posted 10:52 pm on Mar 15, 2004 (gmt 0)

"Same thing"

Uh, no.

It wasn't a general suggestion. It was a specific answer.

netnerd

Msg#: 22642 posted 11:11 pm on Mar 15, 2004 (gmt 0)

Thanks GG

So really I would need to have a CGI forwarder page which was excluded in robots.txt?

What if I point all my links to a page excluded in robots.txt - does PageRank flow to this non-existent page? Or does it just not flow there because the page doesn't exist?

As for having my cake and eating it, I want to build a good directory of sites, and while it would be nice to give them votes, I want the site to be a success and keep PageRank. So I would rather have my cake than give so much of it away that I don't have any to eat at all!

troels nybo nielsen

Msg#: 22642 posted 11:17 pm on Mar 15, 2004 (gmt 0)

If I were Google and were tired of webmasters trying to benefit from the PR system while not contributing to it, what would I do?

I might start rewarding those webmasters who actually contribute to the distribution of PR. And how? Perhaps by rewarding websites that have easily spiderable outbound links to relevant pages on websites that they seem not to be related to?

That reminds me of something: Haven't I read quite a few people complaining about all those directory pages showing in SERPs lately?

Right now I'm busy with something else, but one of these days I'll be leaking some more PR by adding outbound links to my websites.


Stefan

Msg#: 22642 posted 11:21 pm on Mar 15, 2004 (gmt 0)

I'm trying to design a directory site which won't leak lots of PageRank to other sites.

How about you design a real site instead?

netnerd

Msg#: 22642 posted 11:22 pm on Mar 15, 2004 (gmt 0)

That is also a good point - hard to know how best to structure a site these days!

Stefan

Msg#: 22642 posted 11:33 pm on Mar 15, 2004 (gmt 0)

My apologies for that, Netnerd, but "Users" are finding too many directories lately. The SE's will inevitably clue into the fact that it just p*sses people off, and directories will suddenly be ripped out of the serps.

That aside, what you're asking is how you can link to info sites, because you have no original content, without even sharing some PR. I would suggest that not only is there an ethical parameter here, it has to be obvious that the info/content sites will always rise to the top, while directories that depend on linking to others' content are doomed to eventual failure.

My several cents.

netnerd

Msg#: 22642 posted 11:39 pm on Mar 15, 2004 (gmt 0)

When I said "that's a good point" I didn't mean yours, Stefan.

"How about you design a real site instead? "

I wouldn't expect that attitude from a senior member of this forum, even with a belated apology.

I personally find that directory sites, if well written, are very useful, particularly when they are written for a certain subject and review the merits of different sites. This is how I was intending to approach my new site.

The fact that I want to keep PageRank is because (as anyone who wants to optimise for search engines will no doubt agree) I want my site to rank well. In turn this will benefit both me and Google users.

rfgdxm1

Msg#: 22642 posted 11:40 pm on Mar 15, 2004 (gmt 0)

>Right now I'm busy with something else, but one of these days I'll be leaking some more PR by adding outbound links to my websites.

Heretic. Burn the witch! ;) I used to worry about this in the past. Not much anymore. I was actually planning on adding some more outbound links, on the theory it may in the long run garner more inbound links. I've noticed that this tends to be a successful strategy if thinking beyond the short term.

(Note: my sites are amateur. Thus I don't have much to worry about losing; unlike most here running commercial sites who are afraid visitors will go to other sites.)

Stefan

Msg#: 22642 posted 11:52 pm on Mar 15, 2004 (gmt 0)

I wouldn't expect that attitude from a senior member of this forum, even with a belated apology.

Ok, as a senior member, (I should have posted less often), I'll apologize again and also give you a tip:

G seems to be rewarding linking out since Florida, thus the ubiquity of directories in the serps. So, my senior member advice is to link lots, and link often, especially if you have a directory site. At the moment, forget about PR leak, it isn't a factor.

JayC

Msg#: 22642 posted 11:58 pm on Mar 15, 2004 (gmt 0)

[edit]Thread developments while I was typing made this response obsolete.[/edit]

yonnermark

Msg#: 22642 posted 12:02 am on Mar 16, 2004 (gmt 0)

In no way could that be interpreted as "advocating" a robots.txt exclusion for links pages

But it wasn't far away from being that. He didn't say "I wouldn't do that if I were you" or "just let the PR flow, what comes around goes around".

He gave no warning about negative effects of doing it... so in practice it was as good as condoning banning robots from the links.htm page.

bakedjake

Msg#: 22642 posted 12:05 am on Mar 16, 2004 (gmt 0)

steveb:

Well no, he obviously didn't say that. He is just suggesting how to accomplish the non-leak asked about in the first post.

I purposely took his quote out of context to clarify, and wanted to point out why it might be a bad idea to follow that advice, regardless of GG's post. (Said with total respect for GG)

netnerd:

The fact that I want to keep PageRank is because (as anyone who wants to optimise for search engines will no doubt agree) I want my site to rank well.

You're about to affirm the consequent. Don't do it.

PageRank is one of many, many variables used to rank pages. While PageRank leakage may cause a site to have lower rankings (everything else being the same - and I'm not convinced), is it really that relevant? Believe me, there are hundreds more important things to be doing first, before worrying about this.

If you can definitively show me evidence of PageRank leak and a drastic negative ranking effect as a result of it, please do.

Profile, don't speculate.

plumsauce

Msg#: 22642 posted 4:00 am on Mar 16, 2004 (gmt 0)


Sites, especially forums and pseudo-directories, that *routinely* mask outbound URLs automatically *devalue* their status in their readers' eyes.

Wouldn't it be wonderful if Google would also implement this little heuristic. Easy to program, not much needed in CPU cycles.

For an example of a forum that does this, you need look no further than the screen you are looking at right now.

At least blogs just link out without all these tricks.

+++

GoogleGuy

Msg#: 22642 posted 5:09 am on Mar 16, 2004 (gmt 0)

bakedjake/plumsauce/steveb, I tend to agree. I just suggested one way to do it. I'm not saying that it's a great idea. But since you brought it up: yup, good sites usually don't hoard PageRank. :)

netnerd

Msg#: 22642 posted 10:17 am on Mar 16, 2004 (gmt 0)

Ok guys - it's looking like it might be a better idea to link out more! Thanks GG, Stefan and everyone else for your tips.

troels nybo nielsen

Msg#: 22642 posted 11:09 am on Mar 16, 2004 (gmt 0)

This link [webmasterworld.com] might be of some help for decisions about linking strategies.

Tropical Island

Msg#: 22642 posted 12:23 pm on Mar 16, 2004 (gmt 0)

After having spent the last couple of hours creating links for various sites that requested them I came upon this thread and the comments of fathom in the other thread.

If we are being honest with each other and asking for links it would not be an honest move to prevent a reciprocal link from being indexed. I would immediately remove a site that I discovered was doing this.

I think GG's response was a technical answer to a question in the first post - not a recommendation to do it.

netnerd

Msg#: 22642 posted 12:30 pm on Mar 16, 2004 (gmt 0)

Yeah Tropical - but I wasn't going to ask for reciprocal links. Though I do think now, on reflection, that Google is using outbound links as part of its current ranking system.

DaveN

Msg#: 22642 posted 12:41 pm on Mar 16, 2004 (gmt 0)

Personally, I'd put your main links page in a section that was excluded by robots.txt.

If the webmaster who is linking to you finds out you are blocking the link, he will be well p**sed, and using GG's dirty tactic (lol) you can't do much about it. Better to run a links page with two types of links:

1. Straight HTML links, which will pass PR (don't be greedy - share the PR).
2. Bounce the other links through a CGI redirect which has a robots.txt deny on it - don't use a 301 or 302 (a rough sketch follows below).
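A rough sketch of DaveN's second type of link, using PHP in place of the CGI script he mentions (the out.php name and the url parameter are hypothetical). The script is blocked in robots.txt and returns an ordinary 200 page with a meta refresh, so no 301/302 status code is involved:

# robots.txt
User-agent: *
Disallow: /out.php

<?php
// out.php?url=... - bounce page for outbound links
$url = isset($_GET['url']) ? $_GET['url'] : '/';
if (!preg_match('#^https?://#i', $url)) {
    $url = '/';  // only pass through absolute http/https URLs
}
$escaped = htmlspecialchars($url, ENT_QUOTES);

// Plain 200 response with an instant meta refresh instead of a redirect header.
echo '<html><head>';
echo '<meta http-equiv="refresh" content="0;url=' . $escaped . '">';
echo '</head><body>';
echo '<a href="' . $escaped . '">Click here if you are not redirected.</a>';
echo '</body></html>';
?>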

DaveN
