DHS, industry release plan for evaluating IT risk

Published 28 August 2009

New recommendations from DHS and industry give tech developers and managers a model for identifying and mitigating risk in IT solutions that serve companies in the critical infrastructure sector and other markets.

New recommendations released on Wednesday from DHS and a coalition of information technology firms provide tech developers and managers with a model for identifying and mitigating risk in IT solutions that serve companies in the critical infrastructure sector and other markets.

The IT Sector Baseline Risk Assessment evaluates threats to the IT sector's primary functions: providing IT products and services; providing incident management capabilities; providing domain name services; providing Web-based content, information and communication services; and providing Internet routing, access and connection services. The next version of the assessment will address identity management more closely.

“The threats are becoming more sophisticated, sustained and determined, and the vulnerabilities are increasing as we continue to connect new and different devices to the network,” said Robert Dix, vice president of government affairs and critical infrastructure protection at Juniper Networks and executive member of the IT Sector Coordinating Council, which developed the assessment with DHS. “The ramifications have an impact that we need to pay greater attention to. This [assessment] maps these threats and vulnerabilities to the critical functions of the IT sector, to identify where gaps in protection exist and inform mitigation strategies.”

Nextgov’s Jill R. Aitoro writes that the assessment was created as part of DHS’s National Infrastructure Protection Plan, first developed under the Bush administration, which called on each critical sector to devise, through public-private partnerships, plans that address its unique characteristics and risks and propose strategies to best mitigate those risks.

For example, risks associated with domain name services include a large-scale, man-made denial-of-service attack that prevents access to the computer network, according to the IT risk assessment; the mitigation strategy for that risk is to develop processes that ensure continuous monitoring and redundancy through backup capabilities. Risks associated with the manufacturing of products and services include weaknesses in the supply chain that allow viruses to be injected deliberately into software or firmware. The mitigation approach is to develop sourcing strategies that monitor the availability and quality of computer components and enforce timely response — including product recalls — when a security compromise is identified.

The second version of the assessment will include metrics to help determine whether the mitigation strategies in place are actually improving the risk profile. No date has been set for the release of the next version.

“We don’t just want another paperwork exercise; we want to measure whether we’re actually reducing risk,” Dix said.

Aitoro writes that some security specialists questioned whether the risk assessment goes far enough to force public and private organizations to eliminate computer vulnerabilities. “There are no actions required of industry or promised by industry — except to write more reports and plans,” said Alan Paller, director of the computer security training organization SANS Institute, who attended the early meetings to develop the assessment. “The industry groups were aware they could keep the government from asking industry to fix any of the security problems as long as they could point to a project where industry was going to help write a plan, or do an analysis of the problem.”

Phil Reitinger, deputy undersecretary for DHS’s National Protection and Programs Directorate, said the plan is a good start. “It’s a step forward, but it doesn’t mean the problem is solved,” he said in an interview with Nextgov. “The security environment is getting riskier; that makes it incumbent upon us to focus more on how we can make a difference — to move from output-based metrics to outcome-based metrics. That’s hard to do.”