List of 25 Worst Programming Errors Ever Released

using hacked Web sites,” said Richard Wang, U.S. Manager at SophosLabs.

In the report, nine other errors fall under “Risky Resource Management,” and the seven final errors have been classified as “Porous Defense” issues. “Some of the errors in the list relate to organizational behavior and policy; for example, ‘CWE-250: Execution with Unnecessary Privileges.’ Implementing and enforcing procedures to change the insecure behavior of a network’s users and administrators can help reduce the risk of compromise,” Wang noted.
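The CWE-250 error Wang cites covers software that keeps elevated privileges longer than it needs them. The standard mitigation is to drop to an unprivileged account as soon as privileged setup is done. A minimal sketch of that pattern follows; the `nobody` uid/gid of 65534 and the function name are illustrative assumptions, not details from the article or the CWE entry.

```python
import os

def drop_privileges(uid=65534, gid=65534):
    """Drop root privileges to an unprivileged uid/gid (CWE-250 mitigation).

    Order matters: clear groups and set the gid while still root,
    because those calls will fail once the uid has been changed.
    Returns True if privileges were dropped, False if the process
    was not running as root to begin with.
    """
    if os.getuid() != 0:
        return False  # already unprivileged; nothing to drop
    os.setgroups([])  # clear supplementary groups first
    os.setgid(gid)    # then drop the group id
    os.setuid(uid)    # finally drop the user id (irreversible)
    return True
```

A server following this pattern would, for example, bind its privileged port first and call `drop_privileges()` before handling any untrusted input.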

The report found that most of the errors that have been identified are not well understood by programmers, who are generally not taught by university computer science programs how to avoid such mistakes. Also, organizations developing software for sale do not frequently test for the presence of these errors. “Ten years ago, computer crime was not a big thing. So nearly all the teachers and most of today’s programmers learned to code when they didn’t have to worry about hackers. Now they do, but their teachers don’t know how and they are uncomfortable teaching it,” Paller explained.

Haskins writes that the economic impact on an organization can be significant. A security breach can mean loss of intellectual property or loss of confidential data. Cleaning up after a data breach can be expensive, as is the process of notifying those who may have been affected. There are both financial and customer confidence costs, said Wang. As organizations struggle to cope with the constant onslaught of security vulnerabilities that are a result of these mistakes, billions of dollars are wasted patching errors and testing patches to clean up after an infection, Paller noted. “More billions of dollars [have been lost] to cybercrime,” he added.

There is hope, however. While these mistakes are prevalent, there are several remedies. First, universities should teach secure coding, and all current programmers should be tested for secure coding skills, with their gaps filled using the GIAC Secure Software Programmer Test, Paller said. Second, “test all software using automated source and binary code testing tools. Third, write contracts that require developers to fix the errors and take financial responsibility for the ones they miss. Once the software writers are responsible for losses from these errors, they will do 1 and 2,” he suggested.

There are several remedies, Wang agreed, but added that as with most security problems, there is “no silver bullet.” Organizations must ensure the software they use is developed with security in mind, whether the software is off-the-shelf or custom built. “If the software has access to your data or your network, then security flaws in that software could give hackers access to your data and your network. If your organization develops software, make sure that your developers are aware of these programming errors. Even if the software they are developing is not security-related, any software could potentially be used by hackers to obtain or increase their access to a network,” he explained.
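Wang’s point that any software touching your data can become an attack path is easiest to see with a concrete error from the report’s “insecure interaction” category: SQL injection (CWE-89), where attacker input is pasted into a query instead of bound as data. The sketch below uses an in-memory SQLite table as an illustration; the table, names, and payload are assumptions for demonstration, not details from the article.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    # CWE-89: attacker-controlled input concatenated into the SQL string
    return conn.execute(
        "SELECT secret FROM users WHERE name = '%s'" % name).fetchall()

def lookup_safe(name):
    # Parameterized query: the input is bound as data, never parsed as SQL
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

payload = "x' OR '1'='1"
assert lookup_unsafe(payload) == [("s3cret",)]  # injection leaks every row
assert lookup_safe(payload) == []               # bound parameter matches nothing
```

The fix costs nothing at runtime, which is why parameterized queries are the default recommendation for this class of error rather than input filtering.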

Haskins quotes experts as saying that organizations should ensure that the software they use is up-to-date and fully patched. This applies to applications as well as operating systems (OSes). As OS developers have improved their security patch distribution methods, hackers have targeted applications, which may be less frequently updated. Web browsers, another access point, provide conduits between a potentially dangerous Web site and any application that the browser can start on the desktop. “For example, flaws in media players, image viewers or document reading software could be exploited via a Web browser used to launch those applications,” Wang continued. “It is sadly the case that someone else’s programming error, whether in a Web site or application, can cost you. Organizations must understand the risks they face and have layered security solutions in place so that a single flaw does not expose the entire organization,” he concluded.