Performance Metrics for Cloud Computing Needed

Editor’s Note: The discussion of cloud computing performance metrics at FOSE echoes CRE’s statement before the Information Security and Privacy Advisory Board on the need to develop such metrics.

From: GCN

What cloud computing needs to take the next step

By Kathleen Hickey

The General Services Administration and the National Institute of Standards and Technology have developed compliance standards for agency acquisition of cloud computing technology, but they still need to develop performance metrics and interpret those standards for agency use, experts said at the FOSE 2012 conference in Washington.

“There is an appendix of [cloud computing federal government] standards for vendors to comply with,” said Katie Lewin, director of the federal cloud computing program in GSA’s Office of Citizen Services and Innovative Technologies. But “there needs to be more interpretation and discussion between cloud computing vendors and customers. You can’t just give standards to comply with. [Agencies need to know] what the government is looking for and how the controls will be interpreted.”

Ben Tomhave, principal consultant at LockPath, which provides governance, risk and compliance technology, agreed. “Not everyone understands the question, ‘What is security?’ ” he said.

What would help is a common language and clearly defined word meanings, said Robert Bohn, NIST’s cloud computing manager. “We need a taxonomy to communicate,” he added. “What is a cloud broker? What roles are there?” NIST is developing cloud computing definitions along with service-level agreements, he said.
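By way of illustration, NIST’s cloud computing reference architecture (SP 500-292) already names five actor roles: cloud consumer, provider, auditor, broker and carrier. The minimal Python sketch below shows how such a taxonomy gives everyone the same answer to “What is a cloud broker?”; it is a hypothetical rendering, not a NIST artifact, and the one-line descriptions are paraphrased.

    from enum import Enum

    class CloudActor(Enum):
        """Actor roles from NIST's cloud computing reference
        architecture (SP 500-292); descriptions paraphrased."""
        CONSUMER = "uses services from a cloud provider"
        PROVIDER = "makes services available to cloud consumers"
        AUDITOR = "independently assesses services and their security"
        BROKER = "manages use, performance and delivery of services, and negotiates between providers and consumers"
        CARRIER = "provides connectivity and transport between providers and consumers"

    # With a shared taxonomy, "What is a cloud broker?" is a lookup:
    print(CloudActor.BROKER.value)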

Additionally, there need to be performance metrics for cloud computing, another area under development at NIST, Bohn said. “We want to measure it like we measure [electrical] watts or water” utilization, he said.
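To make the utility analogy concrete: a power meter integrates watts drawn over time into watt-hours, and a cloud meter can integrate capacity drawn over time the same way. The sketch below is a hypothetical illustration of that idea, not a NIST metric; the field names and numbers are invented.

    from dataclasses import dataclass

    @dataclass
    class UsageSample:
        """One metering interval for a cloud resource."""
        hours: float        # length of the interval
        vcpus: float        # virtual CPUs drawn during the interval
        storage_gb: float   # storage held during the interval

    def metered_usage(samples):
        """Integrate draw over time, as a utility meter turns watts
        into watt-hours. Returns (vCPU-hours, GB-hours)."""
        vcpu_hours = sum(s.hours * s.vcpus for s in samples)
        gb_hours = sum(s.hours * s.storage_gb for s in samples)
        return vcpu_hours, gb_hours

    # Example: one day of a small workload (hypothetical numbers).
    day = [UsageSample(hours=8, vcpus=4, storage_gb=100),
           UsageSample(hours=16, vcpus=1, storage_gb=100)]
    print(metered_usage(day))  # (48.0, 2400.0): 48 vCPU-hours, 2400 GB-hours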

Because these pieces are missing, current practices are not “overly aligned” to risk management, Tomhave said. “The impact analysis is missing. What is the impact of not having patches if the server goes down? That should dovetail with vendor selection…. On the flip side, vendors have a tough time understanding expectations.”

Agencies should use a risk management process to decide what applications to move to the cloud, Tomhave said. Plus, government needs to identify “threshold controls,” Lewin said. “If [your agency] can’t reach those, perhaps you are not a good candidate for cloud computing.” As an example, she cited two-factor authentication.
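One way to read Lewin’s “threshold controls” is as a go/no-go gate evaluated before any workload moves: a system that cannot meet every threshold control is not a cloud candidate. The sketch below is a hypothetical illustration of that gate; two-factor authentication is the control Lewin cited, while the other control names are invented stand-ins.

    # Hypothetical threshold controls; two-factor authentication is the
    # example Lewin cited, the others are invented stand-ins.
    THRESHOLD_CONTROLS = {"two_factor_authentication",
                          "encryption_at_rest",
                          "audit_logging"}

    def missing_threshold_controls(system_controls):
        """Return the threshold controls a system fails to meet.
        An empty result means the system clears the gate."""
        return THRESHOLD_CONTROLS - set(system_controls)

    missing = missing_threshold_controls({"audit_logging",
                                          "encryption_at_rest"})
    if missing:
        print("Not a cloud candidate; missing:", sorted(missing))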

Bringing big data into the cloud is a next step for the federal government, Lewin and Bohn said. “Big data is ideal because you don’t have to replicate data. That is the next big application,” Lewin said. Similarly, Bohn said he would like to see “more inter-cloud activity — cloud-to-cloud” communications.

Another project under way is developing continuous monitoring parameters, Lewin said. While GSA has identified three controls for continuous monitoring, they probably will not be ready for implementation by June, when GSA’s Federal Risk and Authorization Management Program launches, she said. FedRAMP is a governmentwide program providing a standardized approach to security assessment, authorization and continuous monitoring for cloud products and services.

GSA is working with product and process development consultants on continuous monitoring parameters, Lewin said. What should be monitored is the first aspect under discussion, she said. “What feeds [should an agency] get? How should they process them?”
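Whatever form GSA settles on, a continuous-monitoring feed reduces to a common shape: a stream of per-control status reports that the consuming agency filters for failures needing action. The sketch below is a hypothetical rendering of that shape; the record fields are assumptions, not a FedRAMP schema, though the control IDs follow NIST SP 800-53 naming.

    # Hypothetical feed records: the field names are assumptions,
    # not a FedRAMP schema. Control IDs follow NIST SP 800-53.
    feed = [
        {"control": "AC-2", "system": "email-saas", "status": "pass"},
        {"control": "SI-2", "system": "email-saas", "status": "fail"},
        {"control": "AU-6", "system": "storage-iaas", "status": "pass"},
    ]

    def failing_controls(records):
        """Filter a monitoring feed down to items needing action."""
        return [r for r in records if r["status"] == "fail"]

    for finding in failing_controls(feed):
        print(f"ALERT {finding['system']}: control {finding['control']} failing")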

Tomhave said he anticipates the federal government automating compliance procedures to meet continuous monitoring and risk management needs, including monitoring mobile devices, within three to five years.

“The government cares a lot about” security, Lewin said. “There are a lot of people across the government looking at cloud-based security.”
