Based on the criteria you've put forward, I guess it's reasonable to assume that with sufficient processing power and bandwidth there's no real problem, and with insufficient amounts of either, even a couple of sites would be too many...
So it was a bit of a non-question without more specifics.
Bandwidth and hardware are one side of the equation and, as DaveAtIFG says, may be a bit speculative, although the hosting company should be able to clarify technical limitations with some certainty.
We have two virtual hosting plans with V*rio - the recommendation in the manual is 25 "low traffic" sites for that particular plan. On one we run 7 "medium" traffic sites plus 12 "practically-zero-traffic" sites (we receive a warning that we're exceeding limits with each new addition, but quite honestly we haven't had any problems with access/bandwidth whatsoever). On the other we've got two domains, both "medium" traffic but isolated because it makes sense for them to sit remotely from the rest.
Our decision to set limits relates directly to the risk of getting penalised on a single IP, because that's how a given algorithm may be configured. It's a small risk, but the consequence could be very serious - we don't control the penalty mechanism or necessarily have any immediate recourse if it were to happen.
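As a quick sanity check on that shared-IP risk, you can see which of your domains resolve to the same address. A minimal sketch in Python using only the standard library (the domain names passed in are hypothetical placeholders, not ours):

```python
import socket
from collections import defaultdict

def group_by_ip(domains, resolve=socket.gethostbyname):
    """Group domain names by the IP address they resolve to.

    Domains that end up in the same group share an IP and would
    share any per-IP penalty. The resolver is injectable so the
    function can be exercised without live DNS.
    """
    groups = defaultdict(list)
    for domain in domains:
        try:
            groups[resolve(domain)].append(domain)
        except socket.gaierror:
            # Could not resolve this domain; bucket it separately.
            groups["unresolved"].append(domain)
    return dict(groups)

# Example with made-up names:
# group_by_ip(["site-a.example", "site-b.example", "site-c.example"])
```

Any group with more than one entry is a set of sites that stand or fall together on that IP.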
To cap it all, we have a third plan dedicated to our real pot of gold - one domain standing alone.
The approach may be a relic from when we perceived SEs as limiting sites and pages based on IP. It's probably pretty extreme, especially given the very broad acceptance of virtual hosting, but it may be another angle worth considering?