I was curious about the math too, so I read through the example at Cornell provided by graeme_p. It turns out you are both partly right. Without the damping factor, the calculation does converge for a strongly connected web graph with no dangling nodes. However, the damping factor is necessary to deal with disconnected pieces and dangling nodes, so they could not dispense with it entirely.
This makes sense intuitively if you think of the random surfer model. The damping factor represents the probability that the surfer will "jump" to a random page instead of following a link. Even if the surfer never jumps, the probability of finding the surfer on a page is still well-defined for a connected graph with no dangling nodes -- as the number of clicks goes to infinity, the probability of finding the surfer at any given page _at a particular time_ converges to its true value, which can be computed with linear algebra (it is the stationary distribution of the link matrix). Of course the surfer will visit every page an infinite number of times, which is why one might think the calculation would diverge -- but we are not counting the visits to a page, we are tracking the probability of finding the surfer there, and that probability converges.
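As a sanity check, here is a small power-iteration sketch on a toy graph of my own (not the Cornell example) showing convergence with no damping at all. Note the graph has to be strongly connected, free of dangling nodes, and aperiodic for the pure iteration to settle:

```python
import numpy as np

# Toy graph: 0 -> 1, 1 -> 2, 2 -> 0 and 2 -> 1.
# Strongly connected, no dangling nodes, and aperiodic
# (it has cycles of length 2 and 3).
links = {0: [1], 1: [2], 2: [0, 1]}
n = 3

# Column-stochastic link matrix: M[j, i] = 1/outdeg(i) if i links to j.
M = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        M[j, i] = 1.0 / len(outs)

# Pure random-surfer iteration, no jumping (i.e. damping factor = 1).
p = np.full(n, 1.0 / n)
for _ in range(200):
    p = M @ p

print(p.round(3))  # converges to the stationary vector [0.2, 0.4, 0.4]
```

The limit is exactly the eigenvector of M for eigenvalue 1, which is the "linear algebra" answer mentioned above.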
However, if there is a dangling node (a page with no outgoing links), the surfer gets stuck there, which breaks the calculation. The damping factor fixes this: it lets the surfer leave such a page by jumping to some random other page.
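A sketch of that fix on a hypothetical three-page graph (with the commonly quoted d = 0.85): treat a dangling page as if it linked to every page, then mix in the random jump:

```python
import numpy as np

# Hypothetical graph with a dangling node: 0 -> 1, 1 -> 2,
# and page 2 has no outgoing links at all.
n = 3
M = np.zeros((n, n))
M[1, 0] = 1.0   # 0 -> 1
M[2, 1] = 1.0   # 1 -> 2
# Column 2 is all zeros: without a fix, probability mass that
# reaches page 2 simply vanishes on the next step.

# Fix: a surfer on a dangling page jumps uniformly at random.
S = M.copy()
S[:, 2] = 1.0 / n

# Damped iteration: follow a link with probability d, jump otherwise.
d = 0.85
p = np.full(n, 1.0 / n)
for _ in range(200):
    p = d * (S @ p) + (1 - d) / n

print(p.round(3))
```

With the dangling column patched and d < 1, the iteration matrix is strictly positive, so the iteration converges to a unique fixed point regardless of the starting vector.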
None of this prevents the damping factor from being different for different kinds of links -- that makes the calculation more complicated, but it is still doable.
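For instance (with made-up per-link weights, just to show the shape of the computation): give each link its own follow-probability and spread the leftover mass as a uniform jump. The iteration matrix is then still column-stochastic and strictly positive, so power iteration still converges to a unique answer:

```python
import numpy as np

# Hypothetical per-link damping: "internal" links followed with
# total probability 0.9, "external" links with 0.6 (invented numbers).
n = 3
W = np.zeros((n, n))   # W[j, i] = prob. of following the link i -> j
W[1, 0] = 0.9          # 0 -> 1, internal
W[0, 1] = 0.6          # 1 -> 0, external (lower follow probability)
W[0, 2] = 0.45         # 2 -> 0 and 2 -> 1, both internal,
W[1, 2] = 0.45         #   so 0.9 is split between the two links

# Whatever probability is not spent following links becomes a
# uniform random jump, so every column still sums to 1.
G = W + (1.0 - W.sum(axis=0)) / n

p = np.full(n, 1.0 / n)
for _ in range(500):
    p = G @ p

print(p.round(3))
```

The per-link weights change the answer, of course, but not the machinery: it is the same fixed-point computation.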