If you believe that corporations would be willing to make a little less money in order not to put the nation -- their nation -- at risk, you should read Richard Clarke's excellent, just-issued book, Cyber War.
As Clarke reports, prior to the 1990s, the Pentagon made extensive use of specialized software designed by in-house programmers and a few defense contractors. But under pressure from libertarian ideologues and business lobbyists, the Pentagon began to use commercial software instead -- in particular, Microsoft software. Microsoft, however, had built its low-cost brand on a principle of "one format for all," not on software tailored to specialized security needs. Problems soon arose, including, as Clarke recounts, a 1997 incident in which the USS Yorktown, a Ticonderoga-class cruiser whose ship operations were administered on computers running Windows NT, was rendered inoperable after Windows crashed. "When the Windows system crashed, as Windows often does," Clarke writes, "the cruiser became a floating i-brick, dead in the water." After this and a "legion of other failures of Windows-based systems," the Pentagon considered a shift to free, open-source operating systems like Linux. Because open-source code can be inspected and altered by the user, the government could modify the software itself, without interference from companies jealously guarding their designs; it also costs nothing to license.
Such a switch, though, would have been disastrous for Microsoft's lucrative dealings with the government. The company was already fiercely opposed to regulation of its products' security; it did not want the added delay and cost of improving its software to make it less vulnerable. If the government switched to open-source software, it could make those improvements itself, but doing so would deal a major blow to Microsoft's profits. So Microsoft moved to prevent the government from exploring any alternatives. It "went on the warpath," writes Clarke, threatening to "stop cooperating" with the government if it adopted an open-source platform. It made major campaign contributions and hired a small army of lobbyists. Clarke sums up their agenda: "don't regulate security in the software industry, don't let the Pentagon stop using our software no matter how many security flaws it has, and don't say anything about software production overseas or deals with China." (China, security experts feared, could plant logic bombs and malware in the software.)
Clarke reports that Microsoft insiders admitted that the company "really did not take security seriously," because "there was no real alternative to its software, and they were swimming in money from their profits."
Nothing has changed since these lines were written last year -- and a sitting-duck Navy cruiser is just one example of the security risks created by private corporations' consistent practice of putting profits ahead of protecting the nation.
In 2007 an unknown intruder infiltrated the networks of the Departments of State, Defense, and Commerce, and all of the military agencies. The intruder stole an amount of information roughly equivalent to the contents of the entire Library of Congress.
In addition to the theft of sensitive information, a successful cyber attack could shut down America's power supply for extended periods. In one test, the Department of Energy found that it could hack into the controls of a 27-ton power generator and remotely cause it to destroy itself. (The generators are run by private companies that are not subject to regulations requiring them to secure their sites, which can be accessed and controlled over an ordinary Internet connection.) The utility companies' resistance to cyber security regulation echoes the position of Microsoft, which has stymied the government's efforts even while expanding its business into government activities.
When it comes to cyber attacks, the bottom line for the public is increased defense. But again and again, private corporations have demonstrated that they have a very different bottom line in mind.