Forum Moderators: Robert Charlton & goodroi


Why is pagerank not being passed on?

         

Munster

10:41 am on Oct 11, 2005 (gmt 0)

10+ Year Member



OK, for starters, I know PR is not the be-all and end-all, but to me it is a good indicator of whether your link structure is search-engine friendly. For some reason PageRank is not being passed on through my website. The homepage has a PR of 6, so in my mind the next level down should get 5, the next 4, and so on.

I can only imagine that it is the way the developer has built this top nav. The only thing I can see that is even remotely out of the ordinary is the following statement:

<body onload="window.setTimeout('cycle();',3000);(loadbanners());(urlswitch());">

Could this for some reason be blocking G's path through the navigation?

Everything else seems pretty standard. Can anyone tell me what it is used for, and whether anyone has experienced the same problem?

Cheers

Nial

Munster

8:37 am on Oct 12, 2005 (gmt 0)

10+ Year Member



I have another question.

A friend of mine had his website SEO'd by some fly-by-night SEO company, and they have done something very strange.

They have created default.asp as the homepage (with PR0) and put this code on it (nothing else):

<html>
<head>
<meta http-equiv="refresh" content="0;url=index.asp">
</head>
<body>
</body>
</html>

which refreshes to index.asp (PR4).

Why on earth would they do this? Will it do more harm than good?

Nial

complete

1:10 pm on Oct 12, 2005 (gmt 0)

10+ Year Member Top Contributors Of The Month



Visible PageRank updates only happen roughly once every four months, more or less, so you might not yet see the other pages' PageRank go up.

Munster

3:03 pm on Oct 12, 2005 (gmt 0)

10+ Year Member



Yeah, but the site is two years old, and as far as I know the PR has never been passed on.

Lorel

4:24 pm on Oct 12, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Meta refreshes can cause a site to be banned, so that "SEO" company may cause your friend's site to be lost in the SERPs for months.
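If a redirect from default.asp really is needed, the crawler-safe alternative to a meta refresh is a server-side 301. A minimal classic ASP sketch (the filenames come from the earlier post; www.example.com is just a placeholder, not the actual site):

```asp
<%
' default.asp - issue a permanent (301) redirect instead of a meta refresh
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/index.asp"
Response.End
%>
```

A 301 tells Google the move is permanent, which is the documented way to consolidate the two URLs rather than leaving a zero-second refresh for the spider to interpret.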

ASP scripts with session IDs can also keep PR from being passed. Here is GoogleGuy's comment on dynamic pages and session IDs:

So what's the problem with a session id, and why doesn't Googlebot crawl them? Well, we don't just have one machine for crawling. Instead, there are lots of bot machines fetching pages in parallel. For a really large site, it's easily possible to have many different machines at Google fetch a page from that site. The problem is that the web server would serve up a different session-id to each machine! That means that you'd get the exact same page multiple times--only the url would be different. It's things like that which keep some search engines from crawling dynamic pages, and especially pages with session-ids. Google can do some smart stuff looking for duplicates, and sometimes inferring about the url parameters, but in general it's best to play it safe and avoid session-ids whenever you can.
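To illustrate the duplicate-URL problem GoogleGuy describes, here is a rough sketch (not Google's actual logic) of collapsing session-id URLs; the parameter name "sid" is just an assumption for illustration:

```javascript
// Strip the session-id parameter; what is left identifies the page.
function canonicalize(url) {
    return url.replace(/([?&])sid=[^&]*(&?)/, function (match, sep, amp) {
        // keep the separator only if another parameter follows
        return amp ? sep : "";
    });
}

// Two bot machines fetch the same page and get different session ids:
var a = "http://www.example.com/page.asp?sid=AAA111&cat=2";
var b = "http://www.example.com/page.asp?sid=BBB222&cat=2";

canonicalize(a) === canonicalize(b); // true - same page, different URLs
```

Without that kind of normalization, every fetch looks like a brand-new URL, which is exactly why spiders play it safe and back away from session-id links.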

Google's Webmaster Technical Guidelines:

*Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
*Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.