When I was researching my book Does IT Matter?, I had an interesting discussion with the chief executive of Cognizant, a big software-outsourcing firm. He said that his company saw its Indian facilities as “software factories,” and it was applying rigorous automation and quality-control techniques to improve their productivity. Last week, on a trip to Helsinki, I heard a similar message from an executive of Fujitsu Services, the IT services arm of Japan’s Fujitsu. He said that Fujitsu is seeking to do for software writing what Toyota did for car manufacturing: apply industrial engineering techniques to strip waste, delay and error out of production processes.
It makes sense. With software forming an increasingly vital part of the infrastructure of the world’s economy, reliability, stability and security are paramount concerns. In the past, software companies routinely shipped bug-ridden programs, figuring they could patch and update them later. And their customers took what they were given. That’s history – or will be soon. In the future, the best business software companies will distinguish themselves by producing industrial-strength, bulletproof code – code that approaches Six Sigma standards.
Six Sigma is a popular quality-control method that improves a manufacturing process’s performance by meticulously measuring its results, identifying the root causes of any problems, and fixing them. It aims for an extremely low defect rate of 3.4 per million opportunities. That means you get a flawless output 99.99966% of the time. Some in the tech business have questioned the applicability of Six Sigma to software production. Robert V. Binder, of RBSC Corporation, argues in an influential white paper that software creation is fundamentally different from industrial production. He makes some excellent points, explaining, for instance, that defining a software “defect” is trickier than defining a defect in a physical component. But he, like other critics, seems to miss the bigger point. He argues that software production processes are “fuzzy”: “The behavior of a software ‘process’ is an amorphous blob compared to the constrained, limited, and highly predictable behavior of a die, a stamp, or a numerically controlled milling machine.” But it’s precisely the intent of a rigorous quality-control program to reduce the “fuzziness” in a production process and make it more constrained and predictable. Most manufacturing processes, it’s worth remembering, began as crafts performed by artisans. At one point, they all must have seemed fairly fuzzy.
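For readers who want to see where the 99.99966% figure comes from, the arithmetic behind the Six Sigma target is trivial to check (a sketch, using the standard 3.4-defects-per-million figure cited above):

```python
# The standard Six Sigma target: 3.4 defects per million opportunities.
DEFECTS_PER_MILLION = 3.4

defect_rate = DEFECTS_PER_MILLION / 1_000_000   # 0.0000034
yield_rate = 1 - defect_rate                    # fraction of flawless output

print(f"Defect rate: {defect_rate:.7f}")   # 0.0000034
print(f"Yield: {yield_rate * 100:.5f}%")   # 99.99966%
```

In other words, a Six Sigma process turns out a defective unit roughly three or four times in every million tries.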
Binder also argues that Six Sigma standards would be nearly impossible for software developers to achieve. He points out that hitting a Six Sigma quality level – 3.4 failures per million lines of code – “would require a software process roughly two orders of magnitude better” than NASA’s process for writing its space-shuttle avionics software. “It is hard to imagine how this could be attained,” he writes. But, again, that’s the essence of the Six Sigma approach: it establishes a seemingly unattainable quality standard in order to spur a complete rethinking of an established production process. Sometimes, that’s the only way to get an amorphous blob into shape.
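It is worth pausing on what Binder’s comparison implies. Taking his claim at face value, if 3.4 failures per million lines is “roughly two orders of magnitude better” than NASA’s shuttle avionics process, then the implied NASA rate is on the order of a few hundred failures per million lines. This is an inference from Binder’s wording, not a published NASA figure, but the back-of-envelope arithmetic looks like this:

```python
# Inference from Binder's claim, not a published NASA statistic:
# if Six Sigma (3.4 failures per million lines) is two orders of
# magnitude better than the shuttle avionics process, the implied
# shuttle rate is about 340 failures per million lines.
six_sigma_rate = 3.4                       # failures per million lines
implied_nasa_rate = six_sigma_rate * 100   # two orders of magnitude worse

print(implied_nasa_rate)                   # 340.0 per million lines
print(implied_nasa_rate / 1_000)           # 0.34 per thousand lines
```

The gap Binder describes, in other words, is not a matter of incremental tightening; it is a hundredfold leap, which is exactly why he calls it hard to imagine, and exactly why Six Sigma advocates would call it the right kind of target.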
You should have contacted Mr. Kohli, the former head of the Indian IT major TCS. You would have known this eight years ago. They also had factories, like the “Y2K Factory.”
Given the consolidation of data centers under utility computing…and that software quality will go up and approach Six Sigma quality standards…and that system component sales volumes will go down because of more centralized computing…one would also think that the price for quality, highly utilized system components (including software) will go up.