
Themes, Link Popularity & Page Rank

Do we have proof that themed links really work?


mitzyb

3:15 pm on Aug 27, 2002 (gmt 0)

10+ Year Member



I believe that theming in link popularity is vital for Google, but if I were to try to prove this theory, I don't know how I would go about it.

Any ideas?

Also, does anyone think that we are becoming too obsessed with PageRank? How is PageRank defined, exactly?

Is it possible that there is more than one Google algo? The secret to ranking well in Google is still a mystery; it appears to be a mishmash of things that sometimes count and sometimes don't.

startup

3:37 pm on Aug 27, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I view themes and links as the next stage in the algo. As of now it doesn't have any effect that I can see. We know G will continue to upgrade their algo to stay ahead of any potential competitors.
PR is only part of the algo. Google states, "Then for each query, Google applies hypertext analysis using more than 100 variables to determine relevance." (Note this is from the appliance.) The 100 variables are what keep me experimenting.

mitzyb

3:41 pm on Aug 27, 2002 (gmt 0)

10+ Year Member



Can I ask where you got this info from? I'd just like to find some good reading material on this.

startup

3:44 pm on Aug 27, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What I quoted is really all it says.
[google.com...] (Google-Class Results).
Whenever I can't explain results, that statement keeps me looking for more answers.

caine

3:54 pm on Aug 27, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



startup,

Interesting find; I'd never seen that page before.

startup

4:36 pm on Aug 27, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I can't take credit for finding it, it was posted in these forums a long time ago. The most interesting part is "hypertext analysis".

brotherhood of LAN

4:47 pm on Aug 27, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



One thing worth bearing in mind is how much it costs Google and how their technology copes with updating the entire web in X number of days!

How impatient do we as webmasters get for a G update? Then compare that to the pace of link rot and worldwide spam combined :)

>>100 variables

Who wants to start listing them then ;)

Markus

5:42 pm on Aug 27, 2002 (gmt 0)

10+ Year Member



> The most interesting part is "hypertext analysis".

Hypertext analysis is nothing but checking HTML tags. Also, note the phrase "Then for each query": PR is calculated long before any query.
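For anyone asking how PR is defined: the published PageRank idea can be sketched as an iterative computation over the link graph, done offline, long before any query arrives. This is a toy sketch of the publicly described formula, not Google's actual implementation; the example graph, damping factor, and iteration count are all illustrative:

```python
# Toy PageRank sketch: scores are computed over the whole link graph,
# independent of any query. Everything here is illustrative.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}          # start with uniform scores
    for _ in range(iterations):
        new_pr = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = pr[page] / len(outlinks)   # split PR across outlinks
            for target in outlinks:
                new_pr[target] += damping * share
        pr = new_pr
    return pr

# "c" is linked from both "a" and "b", so it ends up with the top score
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph)
```

Nothing query-specific appears anywhere in the computation, which is Markus's point: the "100 variables" applied per query sit on top of a score like this, not inside it.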

muesli

6:08 pm on Aug 27, 2002 (gmt 0)

10+ Year Member




>>100 variables

Who wants to start listing them then :)

let's give it a try. i'd start with (unordered list!):

page rank
keyword in title
keyword in H1
keyword in URL
keyword density (total word count being considered)
keyword in links to page (anchor text)
keyword in bold / strong, etc.
keyword in other parts (full text, alt, title, meta description)
distance between keywords in all appearances (if search for 2+ keywords)
keyword position in all appearances (how far from top)
URL length

more guessing (that's how i'd do it):
======================================
absence of competitive keywords other than search term

all i can think of for now, feel free to complete.
muesli
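Nobody outside Google knows the real weights, but the kind of combination muesli's list implies could be sketched like this. Every weight, field name, and cap below is invented purely for illustration:

```python
# Purely hypothetical relevance scorer combining a few factors from the
# list above. The weights are made up, not Google's.
def toy_score(keyword, page):
    keyword = keyword.lower()
    score = 0.0
    if keyword in page.get("title", "").lower():
        score += 3.0                                  # keyword in title
    if keyword in page.get("h1", "").lower():
        score += 2.0                                  # keyword in H1
    words = page.get("body", "").lower().split()
    if words:
        density = words.count(keyword) / len(words)   # keyword density
        score += min(density, 0.05) * 20              # cap discourages stuffing
    # keyword in anchor text of inbound links
    score += 1.5 * sum(keyword in a.lower() for a in page.get("anchors", []))
    score += page.get("pagerank", 0.0)                # PR as just one input
    return score
```

A page with the keyword in its title, H1, body, and inbound anchors would outscore one with no matches, which is the general shape most people here seem to assume.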

brotherhood of LAN

6:38 pm on Aug 27, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Nice list for starters muesli!

Backing up to the original title a bit: you can only assume that a computer would have to determine any sort of "theme" from either a PageRank score or the scores of things on the page.

mitzyb... if you wanted to try to prove this, why not get some webspace at Geocities, put some googlewhacking words on the page, and point a couple of unrelated links at it?

When (if) it gets into the index, it shouldn't rank well for those googlewhacking words if themes matter. If it does rank well, then themes are probably not as important as you might have wanted.

/oops added
Make sure they are not literally "googlewhacks", otherwise they will be the only SERP returned and that wouldn't be very conclusive :)

startup

7:13 pm on Aug 27, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Markus,
PR is only part of what is being used to rank a page in the Serps.

"Is it possible that there is more than one Google algo?" Only one algo between updates. Changes to the algo during and after updates, yes. I believe this is done by adding different weights to parts of the "main" algo (let's see how much flak that one causes :)).

egomaniac

1:41 am on Aug 28, 2002 (gmt 0)

10+ Year Member



I agree with startup. There is no evidence that Google is currently using themes. However, if you are designing from scratch today, it only makes sense to use a theme design from day one.

In my opinion, Google ranking is primarily affected by keywords in the title tag, keywords in an H1 or H2 tag at the top of the page, and keywords in inbound links pointing at that page. Other on-page factors mentioned above are also definitely part of the algo.

I am an advocate of themes, and I think Google will eventually use them. But this is just a guess on anyone's part.

startup, I agree that there is more than one algo. I have been noticing during the last post-update month that my pages would cycle through 3 different ranking positions with three different total result counts for the search. This went on all month, and continues to go on today.

chiyo

2:47 am on Aug 28, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I agree with most here.. Like many I guess, I am designing assuming that theming WILL be important sooner or later.

It's one sensible answer to Google's PageRank spam problems, and it is already used in other engines.

Things move fast in the search engine world. The smart guys must be basing their design and optimization on what search engines will like 3, 6, 12, or more months down the track, rather than on what they liked the month before last, which is the latest evidence you ever have.

Beachboy

12:37 pm on Aug 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I wonder if Google takes into account in the calculation of PR or SERPs:

* How often new inbound links to a site are established, or

* The rate of change of inbound links, or

* How many new links in a "class" (such as PR5) are added in a period of time, like since the prior update ....

Things like that. It occurs to me that I have never seen a "rate of change" where linking is concerned in any discussion here, or anywhere, for that matter.

guezo2

4:49 pm on Aug 29, 2002 (gmt 0)

10+ Year Member



Does Google take into account
1/ the frequency of updates for one page
2/ the rate of new pages within the same site

Non-really-related question:
How does Google know when a page has "changed" since the last time googlebot crawled it? Maybe there is a way to compute the "similarity" between two pages, and Google considers two versions different when the similarity changes more than a given threshold...
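One plausible (purely hypothetical) way such a similarity test could work is Jaccard overlap of word shingles, flagging a change only when the overlap drops below some threshold. The function names, shingle size, and threshold here are all invented for illustration:

```python
# Hypothetical change detector: compare two versions of a page by the
# Jaccard overlap of their word shingles (k consecutive words).
def shingles(text, k=3):
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    if not sa and not sb:
        return 1.0                      # two empty pages count as identical
    return len(sa & sb) / len(sa | sb)

def has_changed(old, new, threshold=0.9):
    # only treat the page as "changed" when similarity falls below threshold
    return jaccard(old, new) < threshold
```

Under a scheme like this, fixing one typo on a long page would leave the similarity near 1.0 and be ignored, while a rewrite would push it toward 0 and register as a change.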

ciml

4:58 pm on Aug 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'd be surprised if the frequency of changes made a difference. In my experience they don't even affect crawl frequency.

gopi

8:43 pm on Aug 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



guezo2, I guess googlebot compares the last-modified timestamp to determine whether a page has been updated or not.

On Unix hosts, a simple "touch *" will change the modification date of all files.
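gopi's point cuts both ways: a timestamp can move without the content changing at all, which makes it an unreliable change signal on its own. A quick local demonstration using only the standard library (the file contents and time offset are arbitrary):

```python
# Demonstrate that a file's mtime (what a server reports as
# Last-Modified) can change while the content stays identical.
import os
import tempfile
import time

with tempfile.NamedTemporaryFile("w", delete=False, suffix=".html") as f:
    f.write("<html>unchanged content</html>")
    path = f.name

before = os.path.getmtime(path)
later = time.time() + 60
os.utime(path, (later, later))        # bump the timestamp, like `touch`
after = os.path.getmtime(path)

with open(path) as f:
    content = f.read()

os.remove(path)
```

The timestamp moves forward while the bytes are untouched, so a crawler relying on it alone would re-fetch (or wrongly flag as updated) a page that never changed.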

Slud

8:57 pm on Aug 29, 2002 (gmt 0)

10+ Year Member



Interesting point Beachboy.

Google does use an algorithm similar to the one you describe (i.e., "the #N ad should be getting an X% click-through rate for this keyword; if not, move it down").

So the idea has certainly bounced around the halls of the Googleplex.

Though I wish it weren't true, the ranking system does seem to factor in a page's history somehow. Quite accidentally, I made a new site and got it ranked #1 in a month for a $4 word at Overture. I still can't explain what happened, other than that there must be some kind of recency boost.

brotherhood of LAN

9:28 pm on Aug 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Muesli,

Brett has a page about file sizes [searchengineworld.com] also being taken into account. I guess that would make sense from a limited-storage point of view, and factors you mention like keyword density would differ dramatically between a 5k page and a 101k page. You could add that to the list.

Beachboy

9:41 pm on Aug 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Slud, thanks. Unfortunately, I am one of those very right-brained people who had to repeat high skool geometry and algebra. Math is almost beyond me. So I have to reason things out by being sensitive to observation, and I often construct arguments in analogies. In other words, I haven't a clue what that formulation you wrote means.

BUT. A rate of change can be equated to acceleration. Stomp on the gas in a powerful car and you nearly get tossed into the back seat. Is there a Google equivalent? Let's say 10 PR5 sites link to your site since the last update. We have the effect of those sites on your site, as everyone here already knows, but is there also an additional effect caused by the fact that 10 PR5 sites linked to you, whereas in the prior update period only three did? If there is such an effect, I would think it would dissipate in terms of positioning and/or PR in subsequent updates as the rate of "acceleration" slows.
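Beachboy's hypothetical "acceleration" is easy to state in code, for those of us who prefer it to formulas. This is pure speculation about what such a signal might look like, not anything Google has confirmed; the function name and numbers are invented:

```python
# Hypothetical "link acceleration": the growth rate of new inbound
# links between consecutive update periods.
def link_acceleration(links_prev_period, links_this_period):
    """Return the ratio of new links this period to the last period."""
    if links_prev_period == 0:
        return float(links_this_period)   # any growth from zero counts fully
    return links_this_period / links_prev_period

# Beachboy's example: 3 new PR5 links last period, 10 this period
rate = link_acceleration(3, 10)           # > 1 means links are accelerating
```

A value above 1 would mean the site is gaining links faster than before; as the pace levels off the ratio falls back toward 1, matching Beachboy's intuition that any boost would dissipate.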

Slud

9:54 pm on Aug 29, 2002 (gmt 0)

10+ Year Member



I didn't explain myself very well.

What I imagined was some kind of baseline for how many incoming links a site should have based on how old it is.

If a brand new (to Google) site has 50 incoming links, it's above what Google would expect. If a 3-year-old site has only 50 incoming links, it's below average.

Again, no evidence for any of this aside from a new site that did better than I would've thought.
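Slud's hypothetical baseline could be sketched like this. The linear growth rate is invented purely for illustration, and there's no evidence Google computes anything of the sort:

```python
# Hypothetical age-based baseline: how many inbound links a site "should"
# have, and how a site compares to that expectation.
def expected_links(age_months, links_per_month=5):
    # assume a simple linear baseline; the rate is made up
    return age_months * links_per_month

def link_ratio(actual_links, age_months):
    baseline = max(expected_links(age_months), 1)
    return actual_links / baseline        # > 1 above expectation, < 1 below

# Slud's example: 50 links looks strong on a 1-month-old site
# but weak on a 3-year-old (36-month) site
new_site = link_ratio(50, 1)
old_site = link_ratio(50, 36)
```

The same 50 links land on opposite sides of the baseline depending on age, which is exactly the asymmetry Slud describes.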

Beachboy

10:07 pm on Aug 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hmmmm. I wonder. Does anyone notice positioning slippage a month or two after the rate of inbound link addition slows? It will be interesting to see what happens to your site in a couple months, Slud, and thanks for the less technical explanation! :)