Forum Moderators: open
maybe some of the phds at google should brush up on heisenberg.
[google.com...]
it is simply not possible to make an objective ranking of sites using whatever fancy algorithm google likes without expecting commercial sites (and non-commercial sites) to manipulate the algorithm to boost rankings.
think about it, top spots convert to more traffic and more sales and dollars in the bank. and then for google to call such attempts "spam" or allude that there is something evil about this is the height of arrogance or naive stupidity.
no matter how fancy the algorithm it can still be tested and worked. it is only a matter of time before google gives up and switches to paid positions.
obviously replicating these structures can result in huge pagerank penalties for the target web site. this was tested and proved in reality. google was approached about it and the test sites were given PR0.
however google denies that it is possible to bomb someone's pagerank, despite experiments to the contrary.
if you know what the magic linking structure is you can basically bomb someone out of the rankings.
is google going to remove this ability before it becomes widely known?
I assume you feel there's a correlation between search engine rankings and Heisenberg's physics theories, so suppose you describe how you feel the two relate.
Heisenberg's Uncertainty Principle [zebu.uoregon.edu].
I love the Pagerank indicator on the toolbar and I'd miss it if it was gone, but google - get rid of it! - it's spoiling the web!
That's what happens with PR transmitted with links. The PR energy transmitted to the linked-to page is less than the PR energy of the original linking page, in varying degrees, and the amount is determined by variables that can be adjusted.
So from that aspect, PR seems to have harnessed the power of the Second Law and has a firm foundation in scientific theory.
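To make that "energy loss" concrete, here is a minimal sketch based on the published PageRank formula, where a link passes d * PR(T)/C(T) to its target; the damping factor and example numbers are assumptions for illustration, not Google's actual settings.

```python
# Minimal sketch of PR "energy" passed through a single link, using the
# published formula PR(A) = (1-d) + d * sum(PR(T)/C(T)).
# The damping factor and example numbers are assumptions, not Google's values.
d = 0.85   # damping factor from the original PageRank paper

def passed_pr(linking_page_pr, outgoing_link_count):
    """PR contribution one link passes to the page it points to."""
    return d * linking_page_pr / outgoing_link_count

# A PR-6 page with 20 outgoing links passes far less than its own PR through
# any single link: the "energy" is damped and split among all outgoing links.
print(passed_pr(6.0, 20))   # 0.255
```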
When we do well we like it, and when we don't do well we don't like it. But there's no way we can argue it doesn't have rational foundations.
So the question is - why allow PR to distort the web? If nobody knows what the PR of another site is - they have to judge links on the value / content of the site (which after all is the fundamental basis of google's 1 link = 1 vote). While they are at it - google should also remove the 'check the backlinks' feature - these toolbar features add absolutely nothing to the value of google to the searching public & just allow webmasters to exploit google. They fight spam on one hand & give out tools to help people spam on the other hand.
Nothing is perfect and everything has flaws. I just can't believe there could be a good search engine that does well with porn sites, gambling sites or other industries where too many people are preparing to 'manipulate' their ranking.
Of course, few people will manipulate the ranking for SARS and many other fields, so Google will still be of interest to many people.
Biggest problem with this idea is that Google penalizes sites for linking to bad neighborhoods. If Google is going to punish sites for this, then it is reasonable for Google to provide some mechanism for determining what bad neighborhoods are. Another consequence of getting rid of the PR display is that it will make webmasters fearful of linking to other sites because they may be bad neighborhoods. When in doubt, don't link. This will also have an effect of spoiling the web. And, even without the PR display, we already know a lot that we could use to guess PR without it. An Alltheweb backlink search can reveal a lot even without the Google toolbar.
As for theories, Heisenberg is the wrong one. You need to look to anthropology. A problem that anthropologists are well aware of is that once people are aware of their presence, and that they are jotting down notes about them, this often causes them to alter their behavior. Webmasters know that Googlebot is observing, and thus alter what they do. This has been an issue since way back in the early days of search engines. People used high keyword density to do well in search engines. However, this sort of repetition in language is generally seen as a bad thing in ordinary documents. Webmasters create pages specifically with search engine bots as a consideration, while trying to make them look at least acceptable to human readers.
Ranking a page by counting the number of links to it isn't 100% foolproof, but it's only one factor when calculating the relevancy of a page. It's also probably one of the most effective methods used to date. If you (or someone else) thinks they've got a better way I'd be very interested to hear about it.
Why would google give up and switch to paid positions? Wouldn't that be even worse? People with the most resources/money would surely get the highest positions, independent of the relevancy of their site. Now they can spend resources/money on SEO techniques and rank better, but they're never sure they will. (unless they make sure their site really becomes relevant, which in turn is good for the searchers who find the site, no?)
However fancy the algo, it will always be manipulated;)
Will they give up and switch to PFI/PPC, someday maybe, but I think we have a few years of free advertising left yet:)
In the type of industries mentioned where that is commonly done, the "manipulators" are basically all trying to out-manipulate each other. They're all in the same game, which isn't anything like what goes on outside with most other industries. They all have the same weapons and are fighting on common ground in the same battle. It's kill or get killed. They know it and the search engines are not so naive as to not know which industries they operate in.
Any webmaster who doesn't take the time to find out what the ground rules are in a particular industry shouldn't be doing sites, much less be expecting to win. That makes about as much sense as getting into the middle of a herd of stampeding elephants and trying to fight them off with one stun gun or a sling-shot. It just ain't gonna happen.
Interesting thought that.
The Second Law of Thermodynamics has also been summarized as "you can't get something for nothing."
Now that is my view of getting a high serp on Google ;)
maybe some of the phds at google should brush up on heisenberg... it is simply not possible to make an objective ranking of sites ...
Heisenberg's Uncertainty Principle doesn't say anything about a single variable (-> ranking). You need two operators which don't commute. Even in this case you can determine one of them separately as precisely as you want.
no matter how fancy the algorithm it can still be tested and worked. it is only a matter of time before google gives up and switches to paid positions.
Why should paid positions give better results for the user? Of course, people try to manipulate the results. I think that's the same as in real life. But does this mean we should abolish laws because someone breaks them?
backlinks in theory should only add a zero or positive value to your overall pagerank, yet our tests have shown that it is possible to construct a set of links where a very small negative value is passed to your pagerank. ....
this was tested and proved in reality.
According to the original algorithm, this isn't possible. Also, I have made PR tests and I have seen others making them. None of these tests showed such a behaviour. Of course, Google could have modified the original algorithm (and I saw some hints for that), but no test showed any signs of negative transferred PR. This doesn't disprove you, but I don't believe in this theory. (Who proved this and who verified it?)
It seems to me that once again someone is sniping at Google simply because they are number one for the time being.
P.S. IMO the assertion that you can somehow link to another site and reduce its pagerank is completely false. And IMO the assertion that Google will switch to paid listings is unrelated to and unsupported by your argument.
backlinks in theory should only add a zero or positive value to your overall pagerank
our tests have shown that it is possible to construct a set of links where a very small negative value is passed to your pagerank.
Everyone who uses an iteration method knows about its side effect - during the first several iterations, the intermediate values may be negative. In general, computed PR may oscillate around the actual PR value, may become negative several times, and only after about 100 iterations does the process converge to the actual PR.
Yes, in general that's true. However, if you use the original algorithm (Jacobi iteration) and non-negative initial values, you can not generate negative PR values.
But Google probably uses a different (faster) algorithm which can indeed lead to such effects. Anyhow, I would expect that these effects are small since Google has plenty of time for these iterations.
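For what it's worth, a toy version of the original Jacobi-style iteration makes the non-negativity point visible; the graph and damping factor below are made up for illustration and are not anyone's real data.

```python
# Toy Jacobi-style PageRank iteration on a made-up link graph. With
# 0 < d < 1 and non-negative starting values, every intermediate value is a
# sum of non-negative terms plus (1-d), so it can never become negative.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}  # page -> outlinks
d = 0.85                       # assumed damping factor
pr = {p: 1.0 for p in links}   # non-negative initial values

for _ in range(100):
    new_pr = {}
    for page in links:
        incoming = sum(pr[src] / len(out)
                       for src, out in links.items() if page in out)
        new_pr[page] = (1 - d) + d * incoming   # always >= (1 - d) > 0
    pr = new_pr

print(pr)   # values stay strictly positive at every iteration
```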
deus777, if there is currently a way to decrease PR, why don't you publish it? You would force Google to change it (and one could easily try to verify it - assuming that it would be true).
you can not generate negative PR values
…
these effects are small
It has been my opinion for some time now that Google should add a random perturbation -- different each month -- to each page's PR, and a similar random tweak in the keyword-weighting algorithm. I believe that would help further confound the spammers who focus on such algorithmic details rather than on generating text useful to humans; and as a result would on the average generate better results.
Google hires much better mathematicians than I am (all right then, yes, Google hires mathematicians, if you must put it that way) and I wouldn't be surprised if this factor doesn't drive some of the level of the monthly tweaks.
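If anyone wants to picture what such a monthly perturbation could look like, here is a rough sketch; every name and the size of the jitter are assumptions, and nothing here is Google's actual ranking code.

```python
# Rough sketch of a monthly random perturbation of ranking weights.
# All names and numbers are hypothetical; this is not Google's code.
import random

def perturbed_score(pagerank, keyword_score, month_seed):
    """Combine PR and a keyword score, jittering both weights slightly.

    Seeding by month keeps results stable within a month but different
    across months, so last month's reverse-engineered weights go stale.
    """
    rng = random.Random(month_seed)
    pr_weight = 1 + rng.uniform(-0.05, 0.05)   # +/-5% tweak (assumed size)
    kw_weight = 1 + rng.uniform(-0.05, 0.05)
    return pagerank * pr_weight + keyword_score * kw_weight

# The same page scores slightly differently in two different months.
print(perturbed_score(6.2, 3.1, month_seed=200305))
print(perturbed_score(6.2, 3.1, month_seed=200306))
```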
There's no absolute and universal iteration method. There are always some special cases beyond the scope of an iteration method.
No, there are iteration methods which yield the exact solution. For a damping factor of 0 < d < 1 there exist algorithms that converge to the exact solution in at most n (the dimension) steps. Of course, in practice (with n = 3 billion) this is not of practical interest. (Also, the effects of rounding coming from numerical calculation errors were neglected.)
Google cannot compare the results of iterations
Since you solve a set of linear equations (0 < d < 1) of the type M * PR = (1-d), you compute the following norm: ||M * PR_k - (1-d)||, where PR_k is the PageRank vector after the kth iteration. The solution is exact if this norm is zero.
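A small numeric sketch of that residual check, on a made-up three-page link matrix (not Google's data or code):

```python
# Residual check ||M * PR_k - (1-d)|| on a tiny made-up link matrix.
import numpy as np

d = 0.85
# L[i, j] = share of page j's PR passed to page i (1/outdegree if j links to i).
L = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0]])
n = L.shape[0]

# PR = (1-d) + d*L*PR  is equivalent to  M * PR = (1-d)  with  M = I - d*L.
M = np.eye(n) - d * L
b = np.full(n, 1.0 - d)

pr = np.ones(n)                    # start vector
for k in range(50):
    pr = b + d * (L @ pr)          # one Jacobi-style iteration
    residual = np.linalg.norm(M @ pr - b)

print(pr, residual)                # residual is ~0 once the iteration converges
```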
We don't know, and it wouldn't surprise me if they are. And, this random perturbation may go beyond PR; in fact it might make sense to do so because at the moment PR isn't all that big a part of the algo. Such as hutcheson's suggestion of "a similar random tweak in the keyword-weighting algorithm." This can also be extended to anchor text weighting, weighting of H1 tags, etc. I in fact suspect that Google may be doing this. I regularly watch a number of SERPs for non-commercial search terms where the webmasters of the sites that tend to come up top pay little attention to SEO. Some of the changes that I have seen could be explained by random elements added in. And also by Google randomly adjusting the weighting used for all sites for various algo factors. For example, this month high KWD density is favored, and next month low KWD density but lots of backlinks with anchor text is favored. One can't SEO a page to the top if they don't know which algo factors will randomly get more weight next month.
It seems to me you know too much for a simple Full Member.
Alcogooglic,
there was a life before I was a member of webmasterworld. Although I don't remember those days. :)
(As already said: I think you gave a reasonable and interesting explanation for a decrease of PR in msg#16 and I mostly agree with your point of view. Basically I just want to add some things - even if these things are more of academic than of practical interest. There is no need to surrender.)
Such as hutcheson's suggestion of "a similar random tweak in the keyword-weighting algorithm." ... I in fact suspect that Google may be doing this.
Before the current update I didn't see any signs of a random component, but I see some evidence for this after the dominic update. However, the update isn't complete. Therefore one has to wait to draw conclusions.
As I look at the results google is giving for my little corner of the world, it is very disappointing. I am getting pages from large companies that have an index page for this area, but no content. But, due to page ranking and back links they come in first.
I hope it changes and they put me number one!
Pat
I did say a small pagerank change. in other words, as was also found, a simple few-page test won't show any effect.
The original test involved a very large number of domains, with some very clever software designed to pass google's spam detectors. The experiment was set up a while ago to test google's spam detection.
the test is probably beyond the capabilities of most people.
What was found was that the linking structure trashed the target web page's ranking for the targeted keywords; once the linkage structure stopped pointing to the target webpage the ranking was restored.
the only conclusion was that the pagerank algorithm was modified to pass a very small below-zero effect to a target webpage where a particular structure was in place to artificially increase a pagerank.
maybe it was a bug or a deliberate penalty, more likely the latter.
the follow-on is that where any below-zero modifier effect is in place, it's just a matter of resources to multiply that effect to a noticeable level.
If someone has the resources to test it I can provide the complete details.
Gödel was a collaborator with Einstein, and he developed the theory of recursive functions, incompleteness and more.
What happens when information about Google's PR system is distributed to the field of all web authors? We get a high degree of self-reference and complexity within the system. PR on its own is not the issue - it's the combined field of PageRank itself PLUS the knowledge of PR in the very same field that PR was established to measure. PR gets bent back on itself in a fundamental way.
There's a kind of recursiveness or self-reference in that joined system, and that creates a formally unpredictable complexity. It inherently undermines Google's efforts at precision beyond a certain level.
And after all, "Google meets Gödel" has a nice alliteration to it, don't you think?
you are spot on. when nobody knew how google worked, a number 1 rank was trivial. search results were good. now everybody knows the fundamentals, search results are mediocre and 100s of thousands of businesses waste time and money farming backlinks.
google's popularity is self-limiting: the more popular it is, the more businesses that have nothing to lose will try to work the algorithm. the more people try to alter the shape of the web because of google's view of it, the less effective google becomes. the premise of pagerank is flawed and its demise is inevitable.
godel, heisenberg, the fact remains you can't measure something objectively without altering it.
google can not hire enough phds to outsmart potentially millions of eager businesses looking for a cheap source of traffic.
compared to even a year ago, I now look at the distribution of pagerank and it's not far off being meaningless. little sites get pagerank 7 without much drama, while sites that deserve pagerank 7, i.e. have valuable content, are 5 or 6, simply because they don't actively farm backlinks.
what really pisses me off is that now, in yet another pointless effort to pretend we are not allowed to alter the shape of the web because google is looking at it, google yet again changes its algorithm and yet again I have to waste my time trying to work out what the new rules are to get back to a reasonable spot. pfffffft.
the only conclusion was that the pagerank algorithm was modified to pass a very small below-zero effect to a target webpage where a particular structure was in place to artificially increase a pagerank.
What about theming of the links? Having extra links from pages not related to the topic could also have a negative influence on the ranking without lowering the PageRank.
the test is probably beyond the capabilities of most people
Don't worry about that :)
What was found was that the linking structure trashed the target web page's ranking for the targeted keywords; once the linkage structure stopped pointing to the target webpage the ranking was restored. the only conclusion was that the pagerank algorithm was modified to pass a very small below-zero effect to a target webpage where a particular structure was in place to artificially increase a pagerank.
I think you have to distinguish between PR and ranking. A drop in ranking does not necessarily mean a drop in PR. The situation which you described looks more like a ranking problem. However, even in this case it should be impossible to hurt your competitor's site. (This doesn't mean it is impossible.)
A determination of PR effects is more complicated for several reasons, e.g.:
- you don't see the real PR. You just see the ToolbarPR, i.e. an integer on a log scale (see the sketch after this list). In particular this is a problem for small PR effects.
- you cannot compare different situations in time, because other factors (damping factor, log scale, PR of incoming links) can/will change. (Therefore, an experiment of the kind "add incoming links and watch PR drop" has no meaning.) You always have to compare different situations at the same time.
- you have to ensure that there are no unknown incoming links
- iteration effects as described by Alcogooglic
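On the first point, here is a sketch of why a small PR effect vanishes in the toolbar display, assuming a purely hypothetical logarithmic mapping; the real scale and base Google uses are not public.

```python
# Why small PR effects are invisible on the toolbar: the 0-10 ToolbarPR is an
# integer on a log scale. The base and scaling used here are purely hypothetical.
import math

def toolbar_pr(real_pr, base=6.0):
    """Map a hypothetical internal PR value to a 0-10 toolbar integer."""
    if real_pr <= 0:
        return 0
    return max(0, min(10, int(math.log(real_pr, base)) + 1))

# Two pages whose internal PR differs by a few percent display the same
# toolbar value, so a "very small" negative PR effect can't be read off it.
print(toolbar_pr(100.0), toolbar_pr(97.0))   # both display as 3 here
```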
I'm still interested in details. If you don't want to discuss this in public you can sticky mail it to me.
"Google meets Gödel" has a nice alliteration
Wouldn't 'Google goes Gödel' be the better alliteration?
What about theming of the links? Having extra links from pages not related to the topic could also have a negative influence on the ranking without lowering the PageRank.