
Forum Moderators: Robert Charlton & goodroi


Page rank sculpting

8:43 pm on Jan 5, 2012 (gmt 0)

Preferred Member

10+ Year Member

joined:Apr 29, 2008
posts: 624
votes: 0


I am trying to sculpt PageRank (I know most of you are going to tell me it's not possible), but let me first explain what I think...

From what I understand, Google was created to rank documents, not websites that are looking to sell something (in other words, the good-looking websites as we know them today). That means there are two ways to rank on Google. The first is to get high-quality links, which means you either have to have an incredible product or need to already be famous.

Check who is on the first page of Google for the word "computers" and you will understand what I mean. (Did they do anything to rank there? Nothing. They got ranked because they are famous and everyone talks about them...)

If you are not famous, good luck ranking on Google for competitive terms! One way to do it is the Wikipedia way: creating millions of pages and having the decay factor add up one by one... (good luck with that one too if you need to make a living and pay your bills within the next ten years...)

The other solution, once you have the external link "green light" from Google, is to sculpt your internal PR! Years ago it was possible with nofollow and robots.txt, but Google is making it more difficult (I will let you figure out the reason...). They say it is to improve the rankings, and I would agree with that, but let's not forget that Google is also a business.

Having been in SEO for many years and ranked websites first on Google for moderately competitive keywords, I believe that I understand their algorithm "fairly well".
However, I have an issue with how to sculpt PR internally and have it flow from the homepage to subpages and back to the homepage to boost it. Everyone talks about flat site architecture, but from my understanding and tests this doesn't work, because it gives the same PR to all your pages... unless you manage to hide the menus and drop-downs from Googlebot on your subpages and only do internal linking on those.
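To see why a fully cross-linked "flat" structure tends to equalize internal PR, here is a toy sketch using the classic PageRank iteration (not Google's actual, non-public algorithm) on two hypothetical four-page sites: one where every page links to every other page (a global menu), and one where subpages link only back to the homepage. The page names and damping factor are illustrative assumptions.

```python
# Hypothetical sketch: classic PageRank iteration on two toy site graphs.
# It shows that a fully cross-linked ("flat") internal structure spreads
# PR evenly, while funneling subpage links back to the homepage
# concentrates PR there.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}          # start with a uniform score
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                continue
            share = damping * pr[p] / len(outs)  # equal share per outlink
            for target in outs:
                new[target] += share
        pr = new
    return pr

# Every page links to every other page (global menu on all pages).
flat = {
    "home": ["a", "b", "c"],
    "a": ["home", "b", "c"],
    "b": ["home", "a", "c"],
    "c": ["home", "a", "b"],
}

# Subpages link only back to the homepage.
funneled = {
    "home": ["a", "b", "c"],
    "a": ["home"],
    "b": ["home"],
    "c": ["home"],
}

print(pagerank(flat))      # every page ends up with an equal share
print(pagerank(funneled))  # the homepage accumulates a larger share
```

In the flat graph the symmetry forces every page to the same score, which matches the observation above; in the funneled graph the homepage absorbs most of the internal equity at the cost of the subpages.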

Hidden menus seem to be a technique that doesn't work anymore, because Googlebot ignores iframes and robots.txt within pages. Does anyone have any information on that? Is it still a valid technique or not?

From what I heard, using that technique isn't considered cloaking because of the robots.txt.

If this technique is not available anymore, what can you do to flow your PR? (To me, nothing, and this is why I would like some advice.) Does anyone have any idea how to sculpt internal PR positively and boost the homepage and the subpages while leaving the menus and drop-downs on all my pages (including subpages)?

PS: This website, WebmasterWorld, ranks well on Google for the keyword "webmaster", and as you will notice, it has menus and categories on the homepage, but once inside a category all the other categories are gone and you have to get back to the homepage to see them (using the breadcrumbs or your browser's back button).

The same technique is also used in Google Webmaster Central, which suggests this is the way it has to be done. But how can it be done on a website that sells something and has menus? (It would mean that to see all your products, your clients would have to come back to your homepage every time... not very practical, is it?)
9:08 pm on Jan 5, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Appreciate all the changes in how PR works today - especially how the "intelligent surfer" model has replaced the "random surfer" model behind the math of the original PageRank paper.

So think in terms of what pages you want to flow page rank to, not what pages you want to stop PR from flowing to. As you mentioned, Google has pretty well stopped that possibility.

This means internal links within the content, especially links within sentences, are going to send maximum power to their target pages. This is a major technique for flowing PageRank to where you most want it to go.
9:53 pm on Jan 5, 2012 (gmt 0)

Preferred Member

10+ Year Member

joined:Apr 29, 2008
posts: 624
votes: 0


Do you have an example of a website that flows PageRank well with the "intelligent surfer" model?
10:14 pm on Jan 5, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


We don't discuss specific websites here.
10:27 pm on Jan 5, 2012 (gmt 0)

Preferred Member

10+ Year Member

joined:Apr 29, 2008
posts: 624
votes: 0


OK, and just to make sure: the "intelligent surfer" model is different from the "reasonable surfer" model, is it?
10:43 pm on Jan 5, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Sorry, "reasonable surfer" is actually the right name - I messed up.
10:49 pm on Jan 5, 2012 (gmt 0)

Preferred Member

10+ Year Member

joined:Apr 29, 2008
posts: 624
votes: 0


From what I understand from reading about the intelligent surfer model compared to the original random surfer model, the intelligent one doesn't follow all the links on a website, only the ones related to the original search query. (Meaning, say I search for "green widget" on Google but on my website there is no link that says "green widget" - it won't be counted for ranking in Google.)

Then a PR calculation is done for every term, not for the entire site via all its links - only the related links, meaning the ones on pages that contain the term "green widget".

Is that the idea?
11:24 pm on Jan 5, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Sort of - but it's not all or nothing.

the intelligent one doesn't follow all the links on a website, only the ones related to the original search query

Really, it's a question of weighted likelihoods. In the original model, all links were equally weighted, each one "voting" an equal share of the URL's total link equity to its target URL. Now that's not even close. But I don't see links being weighted zero just because they are in a footer or a sidebar.
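The shift from equal shares to weighted likelihoods can be sketched as a PageRank iteration where each link carries a weight instead of an equal split. The weights below (5 for an in-content link, 1 for a footer link) are purely illustrative assumptions - the real weighting Google uses is not public - but the sketch shows the mechanism: a more prominent link passes a proportionally larger share of the page's equity.

```python
# Hypothetical sketch of link-weighted PageRank, approximating the
# "reasonable surfer" idea: each outlink gets a weight, and a page's
# equity is split in proportion to those weights rather than equally.

def weighted_pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to a list of (target, weight) pairs."""
    pages = set(links)
    for outs in links.values():
        pages.update(t for t, _ in outs)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            total = sum(w for _, w in outs)
            if total == 0:
                continue
            for target, w in outs:
                # share of equity proportional to this link's weight
                new[target] += damping * pr[p] * (w / total)
        pr = new
    return pr

# Assumed, illustrative weights: an in-content link (5) vs. a footer
# link (1) leaving the same homepage.
site = {
    "home": [("article", 5), ("privacy-policy", 1)],
    "article": [("home", 1)],
    "privacy-policy": [("home", 1)],
}

pr = weighted_pagerank(site)
print(pr)  # "article" receives a much larger share than "privacy-policy"
```

Note that the footer link's weight is small but nonzero, matching the point above: demoted links still pass something, just much less than prominent in-content links.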