From: GovInfoSecurity

Eric Chabrow, Executive Editor

Continuous monitoring for vulnerabilities on information systems is supposed to reduce paperwork for federal agencies, such as the check-box compliance forms used to show they meet the rules born of the Federal Information Security Management Act.

But does documenting processes involved with continuous monitoring represent unnecessary paperwork? Congressional auditors at the Government Accountability Office don’t think so, but State Department officials implementing a custom application called iPost think documentation, in some cases, is unnecessary and outdated.

GAO issued a 63-page audit Monday evaluating iPost, a complex, custom-made application designed to provide enhanced security monitoring for State’s extensive IT systems, which support more than 260 embassies and consulates worldwide as well as 6,000-plus facilities, mostly passport offices, in the United States.

In several recommendations aimed at improving iPost, GAO called on State to document processes to assure iPost functions properly. “Having procedures for validating data and reconciling the output in iPost will help ensure that incomplete or incorrect data is detected and corrected, and documenting these procedures will help ensure that they are consistently implemented,” Gregory Wilshusen, GAO director of information security issues, writes in the report.

But in a written response, State Department Chief Financial Officer James Millette explains why the department doesn’t fully agree. “Further documentation called for by the GAO would run counter to the department’s goal of avoiding an environment in which,” in words the CFO says he cited from Sen. Tom Carper in a 2009 interview with GovInfoSecurity.com, “‘too often we have agencies who manage what we call paper compliance rather than really addressing the security of their networks, we want to go beyond paper compliance’ and effectuate meaningful security enhancements.”

No doubt continuous monitoring can facilitate near real-time risk management and represents a significant change from the way information security activities have been conducted in the past. And the GAO audit credits State for being ahead of nearly all other federal agencies in moving toward continuous monitoring of its worldwide IT network, including the creation and deployment of iPost.

How iPost Works

A range of enterprise management and monitoring tools such as Active Directory (AD), Systems Management Server (SMS) and diagnostic scanning tools generate the source data for iPost.

These tools provide iPost with vulnerability, security compliance, anti-virus signature file and other system and network data. The custom tool posts the data to an iPost database, reformats and reconciles them, and then populates them into other iPost databases. The data are associated with a site or operational unit and integrated into a single user interface dashboard that facilitates departmental monitoring, mostly by local and enterprise IT administrators and their bosses.
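The report stops short of publishing iPost’s internals, but the flow it describes, with several feeds normalized into one schema, reconciled per host and grouped by site, can be sketched in rough terms. In the Python sketch below, every feed name, field and value is a hypothetical stand-in rather than State’s actual schema.

    # Rough sketch of the data flow GAO describes: monitoring feeds are
    # normalized into a common shape, reconciled on a canonical host name and
    # indexed by site for a dashboard-style view. All names are hypothetical.
    from collections import defaultdict

    def normalize(source_name, records):
        """Reformat one feed's records into a common host/site/attribute shape."""
        for rec in records:
            yield {
                "host": rec["hostname"].lower(),    # canonical host name for reconciliation
                "site": rec.get("site", "UNKNOWN"),
                "source": source_name,
                "attribute": rec["attribute"],      # e.g. "patch_missing", "cve"
                "value": rec["value"],
            }

    def build_site_view(feeds):
        """Merge all feeds and index the result by site, then by host."""
        site_view = defaultdict(lambda: defaultdict(list))
        for source_name, records in feeds.items():
            for rec in normalize(source_name, records):
                site_view[rec["site"]][rec["host"]].append(rec)
        return site_view

    feeds = {
        "active_directory": [{"hostname": "EMB-EXAMPLE-01", "site": "Example Post",
                              "attribute": "last_logon_days", "value": 2}],
        "sms_inventory": [{"hostname": "emb-example-01", "site": "Example Post",
                           "attribute": "patch_missing", "value": "example-patch-id"}],
        "vuln_scanner": [{"hostname": "emb-example-01", "site": "Example Post",
                          "attribute": "cve", "value": "example-cve-id"}],
    }
    site_view = build_site_view(feeds)

Reconciling on a lowercased host name is one plausible way to merge records that different tools report with different capitalization; the report itself does not say how iPost performs that reconciliation.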

Summary information generated by iPost furnishes an overview of the current status of hosts at a site. Detailed data on hosts within a site are also available through the application’s navigation. For example, as GAO notes, when looking at data about a specific patch, a user can see which hosts need that patch. Users can select a specific host within the scope of their control to view all the current data iPost has for that host, such as all identified vulnerabilities.
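A drill-down of that kind, from a patch to the hosts that still need it and from a host to everything currently recorded about it, amounts to simple queries over a per-site index. The helpers below are illustrative only and build on the hypothetical site_view structure sketched earlier.

    # Hypothetical drill-down helpers over the sketched site_view index.
    def hosts_needing_patch(site_view, site, patch_id):
        """List hosts at a site that still have an outstanding entry for patch_id."""
        return [
            host
            for host, records in site_view[site].items()
            if any(r["attribute"] == "patch_missing" and r["value"] == patch_id
                   for r in records)
        ]

    def host_detail(site_view, site, host):
        """Return every current record held for one host, such as all identified vulnerabilities."""
        return site_view[site].get(host.lower(), [])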

GAO lauded State for its use of iPost and its implementation of a risk-scoring program, though the auditors found the program identifies and prioritizes several, but not all, of the areas affecting information security risk.

Specifically, iPost’s risk scoring program addresses Windows hosts but not other IT assets on State’s major unclassified network; covers a set of 10 scoring components that includes many, but not all, information system controls that are intended to reduce risk; and assigns a score for each identified security weakness, although State could not demonstrate the extent to which scores are based on risk factors such as threat, impact or likelihood of occurrence that are specific to its computing environment. “As a result, the iPost risk scoring program … does not provide a complete view of the information security risks to the department.”
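GAO’s description implies a fairly mechanical roll-up: each identified weakness receives a score under one of roughly 10 scoring components, and those scores aggregate per host and per site. The sketch below shows only that aggregation; the component names and point values are invented for illustration, and, as GAO notes, State could not demonstrate how its real scores map to threat, impact or likelihood.

    # Illustrative score roll-up only; component names and weights are assumptions.
    COMPONENT_POINTS = {
        "cve": 6.0,               # vulnerability findings weigh most in this example
        "patch_missing": 3.0,
        "av_signature_age": 2.0,
    }

    def host_score(findings):
        """Sum the points for every weakness identified on a host."""
        return sum(COMPONENT_POINTS.get(f["attribute"], 1.0) for f in findings)

    def site_score(site_hosts):
        """Roll host scores up to a site total; a higher score means more outstanding risk."""
        return sum(host_score(findings) for findings in site_hosts.values())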

As part of the audit, GAO surveyed 40 State officials responsible for IT security and found they used iPost to identify, prioritize and fix Windows vulnerabilities that were reported in iPost and to implement other security improvements at their sites. More than half of the respondents said that assigning a numeric score to each vulnerability identified and each component was very or moderately helpful in their efforts to prioritize vulnerability mitigation.

Auditors determined that State has implemented several controls aimed at ensuring the timeliness, accuracy and completeness of iPost information. For instance, the audit shows, State employed automated tools and collection schedules that support the frequent collection of monitoring data, which helps to ensure the timeliness of iPost data. State also relies on users to report inaccurate or incomplete iPost data and scoring when they are identified, so they may be investigated and corrected as appropriate.
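The report does not detail how those collection schedules are enforced, but the kind of check they imply is straightforward: compare each feed’s last successful update against its expected interval and flag anything stale for investigation. The schedule values below are assumptions, not the department’s actual intervals.

    # Hypothetical freshness check: flag feeds whose data are older than their
    # scheduled collection interval so they can be investigated and corrected.
    from datetime import datetime, timedelta

    COLLECTION_SCHEDULE = {      # assumed intervals
        "active_directory": timedelta(hours=24),
        "sms_inventory": timedelta(days=7),
        "vuln_scanner": timedelta(days=7),
    }

    def stale_feeds(last_updated, now=None):
        """Return the feeds whose most recent update is older than its schedule allows."""
        now = now or datetime.utcnow()
        return [
            feed
            for feed, interval in COLLECTION_SCHEDULE.items()
            if now - last_updated.get(feed, datetime.min) > interval
        ]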

Still, Wilshusen writes, the timeliness, accuracy and completeness of iPost data aren’t always assured. “Several instances existed where iPost data were not updated as frequently as scheduled, inconsistent or incomplete,” he says. “As a result, State may not have reasonable assurance that data within iPost are accurate and complete with which to make risk management decisions.”

Despite the challenges, GAO says State has benefited from iPost. “iPost has resulted in improvements to the department’s information security by providing more extensive and timely information on vulnerabilities, while also creating an environment where officials are motivated to fix vulnerabilities based on department priorities,” Wilshusen says.