Forum Moderators: open


Google meets Heisenberg

         

deus777

3:18 am on May 31, 2003 (gmt 0)

10+ Year Member



as time passes it seems to me that the entire premise for google - using backlinks to determine ranking results - is totally flawed.

maybe some of the phds at google should brush up on heisenberg.

[google.com...]

it is simply not possible to make an objective ranking of sites using whatever fancy algorithm google likes without expecting commercial sites (and non-commercial sites) to manipulate the algorithm to boost rankings.

think about it: top spots convert to more traffic, more sales and dollars in the bank. and then for google to call such attempts "spam" or allude that there is something evil about this is the height of arrogance or naive stupidity.

no matter how fancy the algorithm, it can still be tested and worked. it is only a matter of time before google gives up and switches to paid positions.

deus777

3:25 am on May 31, 2003 (gmt 0)

10+ Year Member



backlinks in theory should only add a zero or positive value to your overall pagerank, yet our tests have shown that it is possible to construct a set of links where a very small negative value is passed to your pagerank.

obviously, replicating these structures can result in huge pagerank penalties for the target web site. this was tested and proved in reality. google was approached about it and the test sites were given PR0.

however google denies that it is possible to bomb someone's pagerank, despite experiments to the contrary.

if you know what the magic linking structure is you can basically bomb someone out of the rankings.

is google going to remove this ability before it becomes widely known?

BigDave

3:43 am on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Nothing personal, but I don't believe you, and you have given me no reason to believe you.

There are linking strategies that will get you a penalty, and that is the only way to pass negative pagerank.

Marcia

3:57 am on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



deus777, assuming from the content that they were related to the same topic I've combined all three of your threads into one.

I assume you feel there's a correlation between search engine rankings and Heisenberg's physics theories, so suppose you describe how you feel the two relate.

Heisenberg's Uncertainty Principle [zebu.uoregon.edu].

NovaW

3:57 am on May 31, 2003 (gmt 0)

10+ Year Member



heisenberg's uncertainty principle - by cataloging the web, google in fact changes the web - became much more noticeable once google provided pagerank in the toolbar. What is this but a tool for people to determine which links could be valuable? So the whole "links are votes" idea - once pure & a valid algo - is now corrupted by the impact of google & no longer so relevant.

I love the Pagerank indicator on the toolbar and i'd miss it if it was gone, but google - get rid of it! - it's spoiling the web!

Marcia

4:54 am on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



See now, I find the Page Rank scoring system perfectly compatible with the Second Law of Thermodynamics, which simply means that energy never completely dissipates or disappears, but loses some of its power after an iteration.

That's what happens with PR transmitted with links. The PR energy transmitted to the linked-to page is less than the PR energy of original linking page in varying degrees, and the amount is determined by variables that can be adjusted.
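That transfer-with-loss can be sketched with the textbook PageRank formulation: a page with rank PR and n outgoing links passes d * PR / n to each target, where d is the damping factor (0.85 in the original Brin/Page paper). The page and numbers below are invented for illustration:

```python
# Sketch of PR 'energy' passed along a single link, per the textbook
# PageRank formulation. The damping factor d < 1 guarantees each link
# transmits less rank than the source page holds.
D = 0.85  # damping factor from the original Brin/Page paper

def passed_pr(source_pr, outlinks, d=D):
    """PR transmitted per link; always less than source_pr."""
    return d * source_pr / outlinks

# Invented example: a PR 5.0 page with 10 outgoing links
# passes roughly 0.425 to each page it links to.
print(passed_pr(5.0, 10))
```

The "variables that can be adjusted" would include d itself and, in practice, how Google weights individual links.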

So from that aspect, PR seems to have harnessed the power of the Second Law and has a firm foundation in scientific theory.

When we do well we like it, and when we don't do well we don't like it. But there's no way we can argue it doesn't have rational foundations.

NovaW

5:02 am on May 31, 2003 (gmt 0)

10+ Year Member



No question - PageRank is a brilliant concept & it's what sets google apart - it's hard to manipulate. When PR was first rolled out - links were pure - nobody was swapping links or searching for links to increase their pagerank. Now the fact that google is the No.1 SE - means that linkage on the web is driven by pagerank instead of pagerank being an observer of links. --> H-uncertainty Principle - you change what you look at.

So the question is - why allow PR to distort the web? If nobody knows what the PR of another site is - they have to judge links on the value / content of the site (which afterall is the fundamental basis of googles 1 link = 1 vote). While they are at it - google should also remove the 'check the backlinks feature' - These toolbar features add absolutely nothing to the value of google to the searching public & just allow webmasters to exploit google. They fight spam on one hand & give out tools to help people spam on the other hand.

AthlonInside

5:48 am on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Should we also accuse WebmasterWorld of having such a wonderful forum, which helps to create more spammers with the knowledge they can gather here? :)

Nothing is perfect and everything has flaws. I just can't believe there could be a search engine that does well with porn sites, gambling sites or other industries where too many people are prepared to 'manipulate' their rankings.

Of course, few people will manipulate the rankings for SARS and many other fields, so Google will still be of interest to many people.

rfgdxm1

5:55 am on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>I love the Pagerank indicator on the toolbar and i'd miss it if it was gone, but google - get rid of it! - it's spoiling the web!

Biggest problem with this idea is that Google penalizes sites for linking to bad neighborhoods. If Google is going to punish sites for this, then it is reasonable for Google to provide some mechanism for determining what bad neighborhoods are. Another consequence of getting rid of the PR display is that it will make webmasters fearful of linking to other sites because they may be bad neighborhoods. When in doubt, don't link. This will also have an effect of spoiling the web. And, even without the PR display, we already know a lot that we could use to guess PR without it. An Alltheweb backlink search can reveal a lot even without the Google toolbar.

As for theories, Heisenberg is the wrong one. You need to look to anthropology. A problem that anthropologists are well aware of is that once people are aware of their presence, and that they are jotting down notes about them, this often causes them to alter their behavior. Webmasters know that Googlebot is observing, and thus alter what they do. This has been an issue since way back in the early days of search engines. People used high keyword density to do well in search engines. However, this sort of repetition in language is generally seen as a bad thing in ordinary documents. Webmasters create pages specifically with search engine bots as a consideration, while trying to make them look at least acceptable to human readers.

Pete_H

8:03 am on May 31, 2003 (gmt 0)

10+ Year Member



NovaW: I don't see what's bad about being able to check backlinks to a site or that it has no values to the general searching public.
When searching (way before I ever heard about SEO) I often used to check which sites linked to an interesting site because I figured they probably contained more info on the same subject (and often more links to more sites on the same subject).

Ranking a page by counting the number of links to it isn't 100% foolproof, but it's only one factor when calculating the relevancy of a page. It's also probably one of the most effective methods used to date. If you (or someone else) think you've got a better way, I'd be very interested to hear about it.

Why would google give up and switch to paid positions? Wouldn't that be even worse? People with the most resources/money would surely get the highest positions, independent of the relevancy of their site. Now they can spend resources/money on SEO techniques and rank better, but they're never sure they will. (unless they make sure their site really becomes relevant, which in turn is good for the searcher who finds the site, no?)

percentages

8:07 am on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>no matter how fancy the algorithm it can still be tested and worked. it is only a matter of time before googles gives up and switches to paid positions.

However fancy the algo, it will always be manipulated;)

Will they give up and switch to PFI/PPC, someday maybe, but I think we have a few years of free advertising left yet:)

Marcia

8:16 am on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>too many people are preparing to 'manipulate' their ranking.

In the type of industries mentioned where that is commonly done, the "manipulators" are basically all trying to out-manipulate each other. They're all in the same game, which isn't anything like what goes on outside with most other industries. They all have the same weapons and are fighting on common ground in the same battle. It's kill or get killed. They know it and the search engines are not so naive as to not know which industries they operate in.

Any webmaster who doesn't take the time to find out what the groundrules are in a particular industry shouldn't be doing sites, much less expecting to win. That makes about as much sense as getting into the middle of a herd of stampeding elephants and trying to fight them off with one stungun or a sling-shot. It just ain't gonna happen.

cornwall

8:51 am on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>I find the Page Rank scoring system perfectly compatible with the Second Law of Thermodynamics, which simply means that energy never completely dissipates or disappears, but loses some of its power after an iteration. <<

Interesting thought that.

The Second Law of Thermodynamics has also been summarized as "you can't get something for nothing."

Now that is my view of getting a high serp on Google ;)

doc_z

2:06 pm on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



maybe some of the phds at google should brush up on heisenberg... it is simply not possible to make an objective ranking of sites ...

Heisenberg's Uncertainty Principle doesn't say anything about a single variable (-> ranking). You need two operators which don't commute. Even in this case you can determine one of them separately as precisely as you want.

no matter how fancy the algorithm it can still be tested and worked. it is only a matter of time before googles gives up and switches to paid positions.

Why should paid positions give better results for the user? Of course, people try to manipulate the results. I think that's the same as in real life. But does this mean we should abolish laws because someone breaks them?

backlinks in theory should only add a zero or positive value to your overall pagerank, yet our test have shown that it is possible to construct a set of links where a very small negative value is passed to your pagerank. ....
this was tested and proved in reality.

According to the original algorithm, this isn't possible. Also, I have made PR tests and I've seen others make them. None of these tests showed such a behaviour. Of course, Google could have modified the original algorithm (and I saw some hints of that), but no test showed any signs of negative transferred PR. This doesn't disprove you, but I don't believe in this theory. (Who proved this and who verified it?)

jomaxx

2:32 pm on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google's various algorithms have an effect on the structure of the web. So what? Every other search engine, directory and authority site that webmasters want to get listed on ALSO has the same kind of effect. I'd like to see you describe a way of indexing the web that WOULDN'T affect the web somewhat if it became popular.

It seems to me that once again someone is sniping at Google simply because they are number one for the time being.

P.S. IMO the assertion that you can somehow link to another site and reduce its pagerank is completely false. And IMO the assertion that Google will switch to paid listings is unrelated to and unsupported by your argument.

Alcogooglic

4:14 pm on May 31, 2003 (gmt 0)

10+ Year Member



backlinks in theory should only add a zero or positive value to your overall pagerank

This is true only in theory, or in direct PR calculations. Actually, Google uses an indirect method to calculate PR, which is an iteration method. Everyone who uses iteration methods knows about their side effect - during the first several iterations, the intermediate values may be negative. In general, the computed PR may oscillate around the actual PR value, may become negative several times, and only after 100 (or about that many) iterations does the process converge to the actual PR. Google thinks that 100 iterations is enough. Actually, it's enough only for common link structures. However, someone (like deus777) may invent a very uncommon link structure which doesn't converge even after 100 iterations. Thus, negative transferred PR is possible as a side effect of the iteration method and may occur only in very special link structures.
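For reference, here is a minimal power-iteration PageRank on an invented 3-page web - a sketch of the kind of iterative calculation being discussed, not Google's actual implementation, and the graph and iteration count are made up:

```python
# Minimal power-iteration PageRank on a toy 3-page web.
# links[i] lists the pages that page i links to (invented graph).
D = 0.85  # damping factor

def pagerank(links, iterations=100):
    n = len(links)
    pr = [1.0 / n] * n                 # non-negative starting values
    for _ in range(iterations):
        new = [(1.0 - D) / n] * n      # baseline rank every page receives
        for src, targets in enumerate(links):
            if targets:
                share = D * pr[src] / len(targets)
                for t in targets:
                    new[t] += share    # each outlink passes an equal share
            else:
                for t in range(n):     # dangling page: spread rank evenly
                    new[t] += D * pr[src] / n
        pr = new
    return pr

# 0 -> 1, 0 -> 2, 1 -> 2, 2 -> 0: page 2 collects the most rank.
ranks = pagerank([[1, 2], [2], [0]])
print(ranks)
```

Note that with this textbook formulation and non-negative starting values, no intermediate rank ever goes negative - which is exactly the point doc_z makes below about the original algorithm.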

our test have shown that it is possible to construct a set of links where a very small negative value is passed to your pagerank.

deus777,
your set of links is rather primitive, because it transfers only 'a very small negative value'

doc_z

4:54 pm on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Everyone who uses iteration method knows about its side effect - during several first iterations, the intermediate values may be negative. In general, computed PR may oscillate around actual PR value, may become negative several times, and only after 100 (or about that) iterations the process converges to actual PR.

Yes, in general that's true. However, if you use the original algorithm (Jacobi iteration) and non-negative initial values, you cannot generate negative PR values.

But Google probably uses a different (faster) algorithm which can indeed lead to such effects. Anyhow, I would expect that these effects are small, since Google has plenty of time for these iterations.

deus777, if there is currently a way to decrease PR, why don't you publish it? You would force Google to change it (and one could easily try to verify it - assuming it were true).

Alcogooglic

5:47 pm on May 31, 2003 (gmt 0)

10+ Year Member



you can not generate negative PR values

these effects are small

Actually, the 'negative PR values' aren't necessary. If an algorithm leads to small positive PR values instead of large positive ones, that's enough. Unfortunately, Google cannot compare the results of the iterations with the answer, because the answer (all PRs) isn't known. There's no absolute and universal iteration method. There are always some special cases beyond the scope of an iteration method.

hutcheson

6:27 pm on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



One of the counterintuitive results of information theory is that sometimes adding random noise to a signal may actually improve its usefulness for certain kinds of processing.

It has been my opinion for some time now that Google should add a random perturbation -- different each month -- to each page's PR, and a similar random tweak in the keyword-weighting algorithm. I believe that would help further confound the spammers who focus on such algorithmic details rather than on generating text useful by humans; and as a result would on the average generate better results.
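hutcheson's suggestion could be sketched as a small score perturbation that is re-seeded each month; the magnitude, page names and seeding scheme below are all invented for illustration:

```python
import random

# Sketch of a monthly random perturbation of ranking scores.
# Re-seeding each month makes reverse-engineered weights go stale.
def perturbed_scores(scores, month_seed, magnitude=0.05):
    rng = random.Random(month_seed)  # same seed -> same tweak all month
    return {page: s * (1 + rng.uniform(-magnitude, magnitude))
            for page, s in scores.items()}

scores = {"page_a": 7.0, "page_b": 6.9}          # invented PR-like scores
may = perturbed_scores(scores, month_seed=200305)
june = perturbed_scores(scores, month_seed=200306)
# Pages with close scores can swap order from month to month,
# confounding anyone who tuned a page to a fixed algorithm.
```

The key design point is that the noise is deterministic within a month (so results stay stable for users) but unpredictable across months (so spammers can't tune to it).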

Google hires much better mathematicians than I am (all right then, yes, Google hires mathematicians, if you must put it that way) and I wouldn't be surprised if this factor drives some of the monthly tweaks.

doc_z

6:43 pm on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



(I know that this is just an academic discussion. I mostly agree with Alcogooglic.)

There's no absolute and universal iteration method. There are always some special cases beyond the scope of an iteration method.

No, there are iteration methods which yield the exact solution. For a damping factor 0 < d < 1 there exist algorithms that converge to the exact solution in at most n (the dimension) steps. Of course, in practice (with n = 3 billion) this is of no practical interest. (Also, the effects of rounding coming from numerical calculation errors were neglected.)

Google cannot compare the results of iterations

Since you solve a set of linear equations (0 < d < 1) of type M * PR = (1-d), you can compute the following norm: ‖M * PR_k - (1-d)‖, where PR_k is the PageRank vector after the k-th iteration. The solution is exact if this norm is zero.
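That residual check can be written out directly. Below, a toy 2-page web where each page links only to the other gives M = I - d*L with L the link matrix; the matrix and test vectors are invented for illustration:

```python
import math

# Residual norm ||M * PR_k - (1-d)|| as a convergence check for the
# linear-system form of PageRank described above.
D = 0.85

def residual_norm(M, pr_k, d=D):
    """Euclidean norm of M*PR_k - (1-d); zero means an exact solution."""
    n = len(M)
    r = [sum(M[i][j] * pr_k[j] for j in range(n)) - (1 - d)
         for i in range(n)]
    return math.sqrt(sum(x * x for x in r))

# Toy 2-page web, each page linking only to the other: M = I - d*L.
M = [[1.0, -D],
     [-D, 1.0]]
exact = [1.0, 1.0]                    # PR = 1 for both pages solves M*PR = 1-d
print(residual_norm(M, exact))        # zero residual: exact solution
print(residual_norm(M, [0.9, 1.1]))  # nonzero residual: keep iterating
```

This is how an iterative solver can know it has converged without knowing the answer in advance, which bears on Alcogooglic's objection above.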

Alcogooglic

8:07 pm on May 31, 2003 (gmt 0)

10+ Year Member



doc_z,

It seems to me you know too much for a simple Full Member.
Confess honestly - Are you actually a Senior Member already?
If yes, then I surrender at once. :)

webdoctor

8:32 pm on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It has been my opinion for some time now that Google should add a random perturbation -- different each month -- to each page's PR

hutcheson,

How do we know that Google aren't already doing this? :-)

webdoctor

rfgdxm1

10:58 pm on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>hutcheson,
>How do we know that Google aren't already doing this? :-)

We don't, and it wouldn't surprise me if they are. And this random perturbation may go beyond PR; in fact it might make sense for it to, because at the moment PR isn't all that big a part of the algo. Such as hutcheson's suggestion of "a similar random tweak in the keyword-weighting algorithm." This can also be extended to anchor text weighting, weighting of H1 tags, etc. I in fact suspect that Google may be doing this. I regularly watch a number of SERPs for non-commercial search terms where the webmasters of the sites that tend to come up top pay little attention to SEO. Some of the changes that I have seen could be explained by random elements added in. And also by Google randomly adjusting the weighting used for all sites for various algo factors. Such as: this month high KWD density is favored, and next month low KWD density but lots of backlinks with anchor text is favored. One can't SEO a page to the top without knowing which algo factors randomly get more weight next month.

doc_z

11:50 pm on May 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It seems to me you know too much for a simple Full Member.

Alcogooglic,

there was a life before I was a member of webmasterworld. Although I don't remember those days. :)
(As already said: I think you gave a reasonable and interesting explanation for a decrease of PR in msg#16 and I mostly agree with your point of view. Basically I just want to add some things - even if these things are more of academic than practical interest. There is no need to surrender.)

Such as hutcheson's suggestion of "a similar random tweak in the keyword-weighting algorithm." ... I in fact suspect that Google may be doing this.

Before the current update I didn't see any signs of a random component, but I see some evidence for it after the dominic update. However, the update isn't complete. Therefore one has to wait before drawing conclusions.

sanblasena

12:31 am on Jun 1, 2003 (gmt 0)

10+ Year Member



I for one would be very happy if they got rid of backlinks. It is so boring and such a waste of time for so many intelligent people to be sitting around doing this. I just got done - spent about an hour and a half - requesting 3 backlinks for my site. Don't know if I'll get them.

As I look at the results google is giving for my little corner of the world, it is very disappointing. I am getting pages from large companies that have an index page for this area, but no content. But, due to page ranking and back links they come in first.

I hope it changes and they put me number one!

Pat

deus777

1:33 am on Jun 1, 2003 (gmt 0)

10+ Year Member



doc_z,

I did say a small pagerank change. in other words, as was also found, a simple few-page test won't show any effect.

The original test involved a very large number of domains, with some very clever software designed to pass google's spam detectors. The experiment was set up a while ago to test google's spam detection.

the test is probably beyond the capabilities of most people.

What was found was that the linking structure trashed the target web page's ranking for the targeted keywords; once the linkage structure stopped pointing to the target webpage, the ranking was restored.

the only conclusion was that the pagerank algorithm was modified to pass a very small below-zero effect to a target webpage where a particular structure was in place to artificially increase pagerank.

maybe it was a bug or a deliberate penalty; more likely the latter.

the follow-on is that where any below-zero modifier effect is in place, it's just a matter of resources to multiply that effect to a noticeable level.

If someone has the resources to test it I can provide the complete details.

tedster

1:42 am on Jun 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Seems to me like the Google PR situation gets into the same territory as Kurt Gödel [time.com]'s work with incompleteness and undecidability in formal systems.

Gödel was a collaborator with Einstein, and he developed the theory of recursive functions, incompleteness and more.

What happens when information about Google's PR system is distributed to the field of all web authors? We get a high degree of self-reference and complexity within the system. PR on its own is not the issue - it's the combined field of PageRank itself PLUS the knowledge of PR in the very same field that PR was established to measure. PR gets bent back on itself in a fundamental way.

There's a kind of recursiveness or self-reference in that joined system, and that creates a formally unpredictable complexity. It inherently undermines Google's efforts at precision beyond a certain level.

And after all, "Google meets Gödel" has a nice alliteration to it, don't you think?

deus777

11:14 am on Jun 1, 2003 (gmt 0)

10+ Year Member



tedster,

you are spot on. when nobody knew how google worked, a number 1 rank was trivial. search results were good. now everybody knows the fundamentals, search results are mediocre and 100s of thousands of businesses waste time and money farming backlinks.

google's popularity is self-limiting: the more popular it is, the more businesses that have nothing to lose will try to work the algorithm. the more people try to alter the shape of the web because of google's view of it, the less effective google becomes. the premise of pagerank is flawed and its demise is inevitable.

godel, heisenberg - the fact remains you can't measure something objectively without altering it.

google cannot hire enough phds to outsmart potentially millions of eager businesses looking for a cheap source of traffic.

compared to even a year ago, I now look at the distribution of pagerank and it's not far off being meaningless. little sites get pagerank 7 without much drama, while sites that deserve pagerank 7, i.e. have valuable content, are 5 or 6, simply because they don't actively farm backlinks.

what really pisses me off: now, in yet another pointless effort to pretend we are not allowed to alter the shape of the web because google is looking at it, google yet again changes its algorithm and yet again I have to waste my time trying to work out what the new rules are to get back to a reasonable spot. pfffffft.

takagi

11:25 am on Jun 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



the only conclusion was that the pagerank alogrithm was modified to pass a very small below zero effect to a target webpage where a perticular structure was in place to artificially increase a pagerank.

What about theming of the links? Having extra links from pages not related to the topic could also have a negative influence on the ranking without lowering the PageRank.

doc_z

12:02 pm on Jun 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



the test is probably beyond the capabilites of most people

Don't worry about that :)

What was found was that the linking structure trashed the target web pages ranking for the targeted keywords, once the linkage structure stop pointing to the target webpage the ranking was restored.

the only conclusion was that the pagerank alogrithm was modified to pass a very small below zero effect to a target webpage where a perticular structure was in place to artificially increase a pagerank.

I think you have to distinguish between PR and ranking. A drop in ranking does not necessarily mean a drop in PR. The situation you described looks more like a ranking problem. However, even in this case it should be impossible to hurt your competitor's site. (This doesn't mean it is impossible.)

A determination of PR effects is more complicated for several reasons, e.g.:
- you don't see the real PR. You just see the ToolbarPR, i.e. an integer on a log scale. In particular, this is a problem for small PR effects.
- you cannot compare different situations in time, because other factors (damping factor, log scale, PR of incoming links) can/will change. (Therefore, an experiment of the kind "add incoming links, watch PR drop" has no meaning.) You always have to compare different situations at the same time.
- you have to ensure that there are no unknown incoming links
- iteration effects as described by Alcogooglic

I'm still interested in the details. If you don't want to discuss this in public you can sticky-mail it to me.

"Google meets Gödel" has a nice alliteration

Wouldn't 'Google goes Gödel' be the better alliteration?

What about theming of the links. Having extra links from pages not related to the topic, could also have a negative influence on the ranking without lowering the PageRank.

As already said, it should be impossible for incoming links to hurt ranking.