Forum Moderators: open
Now, imagine that, say, Microsoft were to produce software with a quality of 3.4 bugs per million lines of code.
Would we be better or worse off? I have no idea.
3.4 bugs per million lines of code
Would we be better or worse?
That really depends on the nature of the bugs. If those defects affect someone's ability to invade or change my system, then definitely worse. If the same defects only affected page rendering, that's a little better... unless your forte is writing CSS.
The big problem that M$ faces isn't so much writing buggy code as writing code that is later exploited due to an oversight. That's not really a bug, it's poor design. And with so many people working on different aspects of M$ code, it's easy to understand how poor design can creep into a project... people aren't communicating and/or they aren't thinking through the entire process.
Generally speaking, 3.4 defects per million is a pretty good target. Still, it depends on the product. Acetaminophen is being recalled because of metal found in the product. Certain products should have zero-defect tolerance.
FWIW, code that I write for my sites is tested before going into production. It wasn't always that way, but my defects have dropped to zero. That makes me happy.
production techniques that assure a quality of at most about 3.4 defective pieces per million.
Also known as Six Sigma - a series of processes designed by Bill Smith at Motorola in 1986.
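For anyone curious where the oddly specific 3.4 figure comes from: in the Six Sigma convention, a "six sigma" process is assumed to drift by 1.5 standard deviations over the long term, so the defect rate is the one-tail probability beyond 6.0 − 1.5 = 4.5 sigma on a normal distribution. Here's a minimal sketch of that calculation (the function name and the 1.5-sigma default are my own illustration, not anything from the thread):

```python
import math

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a process at a given
    sigma level, applying the conventional 1.5-sigma long-term shift."""
    z = sigma_level - shift                    # effective one-tail z-score
    tail = 0.5 * math.erfc(z / math.sqrt(2))   # P(Z > z) for a standard normal
    return tail * 1_000_000

print(round(dpmo(6.0), 1))  # the classic Six Sigma number: 3.4
```

Without the 1.5-sigma shift, a true six-sigma tail would be about 0.001 defects per million, which is why the quoted figure is 3.4 rather than something far smaller.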