Forum Moderators: martinibuster


PR 6 Link

Should I do the exchange?


Rizzo

3:33 am on Jun 11, 2004 (gmt 0)

10+ Year Member



Today a webmaster emailed me and said that he had three sites with a PR 6. He said he wanted to do a reciprocal link exchange with me. I have a PR 5. I looked through his sites and I can't find anything wrong with them, but the content is not related to mine. Should I go ahead and exchange the links, or should I pass because of the unrelated content?

manwah

4:01 am on Jun 11, 2004 (gmt 0)

10+ Year Member



Why not, if their link directory/page is well organized?

KevinC

4:14 am on Jun 11, 2004 (gmt 0)

10+ Year Member



Like manwah said, I would do the trade if it was going into a good hand-built theme directory, but not just a links page.

sem4u

7:37 am on Jun 11, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sounds okay to me. Just check the PR of the pages where your link will be placed.

graywolf

4:39 pm on Jun 11, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You know, I am always fascinated by PR hunters who don't know how to create fully meshed sites to balance and distribute PR.

rover

5:01 pm on Jun 11, 2004 (gmt 0)

10+ Year Member



What exactly is a fully meshed site structure?

graywolf

11:28 pm on Jun 11, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



To create a meshed website you need to interlink all of your pages. If you have a large website, say over 75 pages, you will need to establish some sort of hierarchy, but for smaller sites it's not always necessary. Let's create a sample website with the following pages:

Page A: home page
Page B: contact us
Page C: category page
Page D: product 1 page

You could create a basic link structure like this:

A links to B and B links to A
A links to C and C links to A
C links to D and D links to C


  A
 / \
B   C---D

The problem with this link structure is that you aren't sharing PR effectively between the pages. A better arrangement would look like this:
A links to B and B links to A
A links to C and C links to A
B links to C and C links to B
C links to D and D links to C

  A
 / \
B---C---D

The only change you've made is connecting B and C to each other. It's better, but D is still not getting all the PR it can. Let's bring D into the loop:

A links to B and B links to A
A links to C and C links to A
A links to D and D links to A
B links to C and C links to B
B links to D and D links to B
C links to D and D links to C


A --- B
|  X  |
C --- D

You've now created a fully meshed site that effectively distributes PR between the pages. This way it doesn't matter where the spider enters your website; every other page is one level away.
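
To make the difference concrete, here's a rough Python sketch (my own illustration, not something from the posts above; the page names A-D are just the placeholders from the example) that models the three structures as adjacency sets and counts how many clicks each page is from the home page A:

from collections import deque

# The three structures from the example above, as adjacency sets.
tree = {"A": {"B", "C"}, "B": {"A"}, "C": {"A", "D"}, "D": {"C"}}
partial = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"}, "D": {"C"}}
pages = {"A", "B", "C", "D"}
full = {p: pages - {p} for p in pages}  # every page links to every other

def clicks_from(start, links):
    """Breadth-first search: clicks needed to reach each page from start."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links[page]:
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

for name, links in [("tree", tree), ("partial", partial), ("full", full)]:
    print(name, clicks_from("A", links))
# tree and partial: D is 2 clicks from A; full: everything is 1 click away.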

Thanks to martinibuster for helping me straighten this out.

chakra

4:55 am on Jun 12, 2004 (gmt 0)

10+ Year Member



Graywolf - do you mean that the more links from your homepage to your internal pages, the better? What if I have 150 internal pages: should they all get a link from the homepage, or not?

Thank you in advance.

graywolf

11:20 am on Jun 12, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



According to G, they will only crawl 100 links per page. It's hard to say without seeing the site, but you are probably going to need some categories and subcategories.

What you are trying to do when you mesh a website is have all of the pages link to each other, or at least have the category pages all link to each other.

Example:

Main categories:
produce
snacks
frozen

Subcategories:
produce - apples, oranges, lemons
snacks - cookies, chips
frozen - tv dinners, ice cream

When you hit the produce main category page, it will link to the two other main category pages AND ALL of the produce subcategory pages. Additionally, each of the produce subcategory pages will link to ALL of the other produce subcategory pages AND each of the other main category pages.

What this does is make as many pages as possible available to the spider right away, without it having to climb through a hierarchy.
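
As a rough sketch of that rule (my own, in Python, with the grocery categories above as stand-in data and a made-up links_for helper), it could be written like this:

# The example categories above, as stand-in data.
site = {
    "produce": ["apples", "oranges", "lemons"],
    "snacks": ["cookies", "chips"],
    "frozen": ["tv dinners", "ice cream"],
}

def links_for(page):
    """Pages this page should link to, per the meshing rule above."""
    mains = list(site)
    if page in site:
        # A main category links to the other mains and all of its subs.
        return [m for m in mains if m != page] + site[page]
    # A subcategory links to its sibling subs and each main category
    # (linking back to the parent keeps everything one click away).
    parent = next(m for m, subs in site.items() if page in subs)
    return [s for s in site[parent] if s != page] + mains

print(links_for("produce"))
# ['snacks', 'frozen', 'apples', 'oranges', 'lemons']
print(links_for("apples"))
# ['oranges', 'lemons', 'produce', 'snacks', 'frozen']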

I have one website now that has just over 40 pages; all of the main sections, about 30 pages, are connected like this, and each of those pages has a PR5. The homepage has the same PR as the category pages and the links pages. It makes it a heck of a lot easier to get links if people see they are going on a PR5 page instead of the traditional links directory with a PR2 or PR3.

Red_Eagle

3:09 pm on Jun 12, 2004 (gmt 0)

10+ Year Member



I was reading some older posts about theme pyramids [webmasterworld.com]. Are those theories now null and void?

graywolf

4:02 pm on Jun 12, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Nope. If you look at Brett's post #4, you will see there are 4 subtopics, and each subtopic has:

low value - 1 page
medium value - 2 pages
high value - 4 pages
money pages - 8 pages

That's 15 pages x 4 topics = 60 pages
60 pages + 1 homepage = 61 pages
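
(A quick Python sanity check of that count, my own sketch rather than anything from Brett's post:)

# Tier names and page counts straight from the pyramid example above.
tiers = {"low value": 1, "medium value": 2, "high value": 4, "money pages": 8}
subtopics = 4

pages_per_subtopic = sum(tiers.values())    # 1 + 2 + 4 + 8 = 15
total = pages_per_subtopic * subtopics + 1  # 60 pages + 1 homepage
print(total, "pages, under 100 links:", total < 100)  # 61 pages, True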

Well below the 100-link limit. You need to establish the hierarchy visually, with font sizes, indents, or some other element.

Look at his quote:

Try to link to as much deep content on your index page as possible - yes I know - it is a tall order. However; spiders like first level content. If it is linked off your root page, then many se's will tend to follow at least those links or rank those linked pages higher.

He had it right back in 1999.

Red_Eagle

4:14 pm on Jun 12, 2004 (gmt 0)

10+ Year Member



Ah, I understand now. Thanks!

martinibuster

4:48 pm on Jun 12, 2004 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



The reason you have them linked from your homepage is that most of your inbound links point to your homepage.

However, if you have good content deep inside your website, there's no reason why you can't have some partners link straight into it; that way you are assured of being spidered from deep to deeper. This is a good way of overcoming homepage limitations, a way to "push" content to the top.

rfung

8:46 pm on Jun 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



graywolf: is there a place where G specified that they will only crawl 100 links per page?

My homepage has links to all the products I'm listing, and it's around 500... Do I need to break this down into 5-6 subpages with 100 products each?

I've never heard of this 100-link limitation; I've heard about the 100 KB page size, though...

martinibuster

9:30 pm on Jun 13, 2004 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Off Topic:
>>> is there a place where G specified that they will only crawl 100 links per page?

GoogleGuy has mentioned this on numerous occasions. Google clearly mentions it in their webmaster FAQs [google.com].

When learning about the search engines, it's a good habit to look in the search engine "about us" and "help" pages.

identity_00

4:49 pm on Jun 21, 2004 (gmt 0)

10+ Year Member



Too much thinking in this thread. Set up one-way links instead.

You said he has at least two sites with PR6, so have him pick one of his sites and link to you, then you link to his other site, thus creating two one-way links.

Make sure the PR6 is on the linking page.

Watcher of the Skies

5:19 pm on Jun 21, 2004 (gmt 0)

10+ Year Member



Nowhere on that page, and nowhere in these forums, does Google or GoogleGuy EVER specifically mention that only 100 links per page are crawled. Here is the quote from Google's FAQ:

"If the site map is larger than 100 or so links, you may want to break the site map into separate pages."

This is just a suggestion. There IS evidence, however (and there was a great thread on it six months to a year or so ago), that only the first 100K of a page will be spidered/followed.

It's SO tiring hearing people publish half-truths like they know what they're talking about, but hey, it's stuff like that that makes guys like me rich. ;)

fidibidabah

6:03 pm on Jun 21, 2004 (gmt 0)

10+ Year Member



Yeah, I have 8,345,678 links on one page which are crawled daily.

Really.

martinibuster

7:13 pm on Jun 21, 2004 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



This is just a suggestion. There IS evidence, however (and there was a great thread on it six months to a year or so ago), that only the first 100K of a page will be spidered/followed.

How much of a page's content gets indexed (measured in KB) and how many of its links get spidered are two different issues.

If Google and people from Google are recommending keeping links to under 100 on a page, I'm not going to play Edward G. Robinson to Google's Moses and insist on a detailed explanation of why they published those recommendations. I'm going to take the hint.

I'm not saying everyone should do it. It's up to you. Me, I'm going to take the hint.

neuron

8:29 pm on Jun 21, 2004 (gmt 0)

10+ Year Member



I believe the number of links crawled/indexed per page is 124, 125, or 128. This is not necessarily because of some crawling limitation of Googlebot, but more likely a PR-sharing issue.

Because most sites have standard menus on each page, the advice to limit links to 100 gives the page some breathing room so that you can have your normal menus on it. I usually have a dozen menu links and a couple of other buttons or text links on each page, each pointing to interior pages; thus I can easily add 100 additional links and they'll all get indexed without busting the real limit.

Because PR is divided among all pages linked to from a page, the limit was probably put in to keep from having to share PR with more than 100 pages; the calculation could get tedious with more links than that. Remember, PR calculation is an iterative process, one that is repeated until PR settles to a stable value, not necessarily a precise value. Why did they limit the PR iterative cycle to only acquire a stable value rather than a precise one? Because the calculation can get out of hand if you don't limit it, and they found that values stabilize after only a few iterations; additional iterations of hundreds of calculations did little to make the results more accurate. So, if they found they needed to limit the number of iterations in the calculation, I can reasonably assume they set the limit on the number of links per page to run PR calculations on for the same reasons.
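
To illustrate (a toy Python sketch of my own, not Google's actual implementation; the damping factor and tolerance are assumptions), a power iteration that stops once the values are stable rather than infinitely precise might look like this:

# The unmeshed four-page example from earlier in the thread.
links = {"A": ["B", "C"], "B": ["A"], "C": ["A", "D"], "D": ["C"]}

damping = 0.85
pr = {page: 1.0 / len(links) for page in links}  # start uniform

for iteration in range(100):
    new_pr = {}
    for page in links:
        # PR flowing in from every page that links here, split evenly
        # among each linking page's outbound links.
        inbound = sum(pr[src] / len(outs)
                      for src, outs in links.items() if page in outs)
        new_pr[page] = (1 - damping) / len(links) + damping * inbound
    # Stop once the values are stable, not infinitely precise.
    if max(abs(new_pr[p] - pr[p]) for p in pr) < 1e-4:
        pr = new_pr
        break
    pr = new_pr

print(iteration, {p: round(v, 3) for p, v in pr.items()})
# Converges in a few dozen iterations; the hub pages A and C end up
# with roughly twice the PR of the leaf pages B and D.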

Google only uses integers to rank pages. Pages in SERPs are ranked #1, #2, #3, #4, #5, and so on and so forth, not #1.02795, #2.8473, and #6.2240. So what is important is that stable relative values be calculated, not necessarily exact values.

I may have just gone off on a tangent, but two cents often are.

Watcher of the Skies

5:17 am on Jun 22, 2004 (gmt 0)

10+ Year Member



"I'm not going to play Edgar G. Robinson to Google's Moses and insist for a detailed explanation of why they published those recommendations."

:)

the_nerd

12:44 pm on Jun 28, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



According to G, they will only crawl 100 links per page. It's hard to say without seeing the site, but you are probably going to need some categories and subcategories.

They definitely crawl more links. I have a page that links to about 500 internal pages: some introductory text and then just about 500 links. Everything gets crawled, everything is in the index, everything is ranking.

OK, it wasn't planned to be so. The list grew because the underlying data grew, and I didn't have the time to break it down into a hierarchical structure.

But there is just one of those pages on the whole site, and:
it has been there for more than a year,
the PR of the page is rather high, and
they seem to like the pages; they all rank nicely.

dirkz

7:14 pm on Jul 3, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> They definitely crawl more links.

There can still be a difference between how many links they crawl and how many links get a PR distribution.

As MB said, it's really only a hint, but it sure makes sense, for the surfer too.