New IT Security Scoring System: But Will Anybody Use It?

The U.S. Department of Homeland Security, the MITRE Corporation, and the SANS Institute released on Monday their 2011 Top 25 Most Dangerous Software Errors list. This is the third release of the list since 2009.

Quoting from the press release at the MITRE website:

The 2011 Top 25 makes improvements to the 2010 list, but the spirit and goals remain the same. This year's Top 25 entries are prioritized using inputs from over 20 different organizations, who evaluated each weakness based on prevalence, importance, and likelihood of exploit. It uses the Common Weakness Scoring System (CWSS) to score and rank the final results. The Top 25 list covers a small set of the most effective "Monster Mitigations," which help developers to reduce or eliminate entire groups of the Top 25 weaknesses, as well as many of the hundreds of weaknesses that are documented by CWE [Common Weakness Enumeration].
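To make the abstract notion of a "weakness" concrete: SQL injection (CWE-89) topped the 2011 ranking. The minimal Python sketch below, using an illustrative in-memory database and hypothetical lookup functions, shows how the flaw works and how a standard parameterized query eliminates it:

```python
import sqlite3

# Build an in-memory database with one user table (illustrative data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def lookup_unsafe(name):
    # CWE-89: attacker-controlled input is spliced directly into the SQL text.
    query = "SELECT role FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def lookup_safe(name):
    # Mitigation: a parameterized query keeps user data out of the SQL grammar.
    return conn.execute("SELECT role FROM users WHERE name = ?", (name,)).fetchall()

# A classic injection payload turns the WHERE clause into a tautology.
payload = "' OR '1'='1"
print(lookup_unsafe(payload))  # returns every row: [('admin',), ('user',)]
print(lookup_safe(payload))    # returns no rows: []
```

The fix is a one-line change, which is exactly the point of the "Monster Mitigations": a handful of coding disciplines wipe out whole classes of the Top 25.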

The addition of an IT security scoring system is extremely useful in helping organizations evaluate whether they have big red signs in cyberspace that proclaim to hackers, "Attack me, please."

An article in ComputerWorld quotes Alan Paller, director of research at SANS, as saying, "Companies and not-for-profits that build or buy Web services and software do not have a reliable way to know whether the software they are using is protected against common attacks."

By having this scoring system, Paller says, "buyers and builders of software and services will be able to ask for assurance that the critical flaws have been eliminated and be able to verify that."

A story over at Government Executive, which equates the new IT security scoring system to an Energy Star rating, says, however, that "the scoring system unveiled Monday is not mandatory for agency or commercial software development or for federal procurement."

That, of course, makes one wonder whether anyone will use it, or worse, abuse it, as happened with the Energy Star ratings before they were recently tightened up.

I hope everyone uses the scoring system—there is no real excuse not to. As some of you may remember, I have been calling for the past couple of years (here and here) for a software "never event" list modeled on the medical never event list to be drawn up and supported by the IT community. As I wrote back in 2009,

The IT community should rally around these 25 errors and deem them our never events. And the government—and private enterprise—should start to refuse to pay for software that contains them. Some of the programming errors listed are in the same league as a doctor removing the wrong limb.

With this scoring system now available, I think it is time for IT organizations to step up to the plate: Take the test, fix any IT security gaps that may exist, and then publicly and proudly state that your organization has eliminated the 25 most dangerous software security errors.

I hope DHS and MITRE step up to the challenge—and the IEEE does, too.


Risk Factor

IEEE Spectrum's risk analysis blog, featuring daily news, updates and analysis on computing and IT projects, software and systems failures, successes and innovations, security threats, and more.

Contributor
Willie D. Jones