
May 30
UNCC Breach Highlights the Need to Think Differently about Cybersecurity

From: Network World

By John Linkous

The University of North Carolina at Charlotte (UNCC) recently disclosed that it has discovered more than 350,000 student, staff, and faculty records – including Social Security numbers – exposed to public access across multiple systems, in some cases for several years.

Looking at how UNCC reacted to this discovery – voluntarily notifying potentially affected parties and hiring a forensic analysis firm to determine whether, and when, an actual breach of the data occurred – it’s clear that the university is conducting a reasonable post-mortem on this incident. That’s more than can be said of other institutions that have experienced similar security issues. Of course, it doesn’t quiet the sigh of “here we go again…” that most of us in the information security industry heave when we read about yet another massive set of highly confidential data that has been exposed and may or may not have been exploited.

As with most data breaches, the devil is in the details. In this case, the problem appears to have been caused by misconfigured systems and incorrect access control settings. This was not a case of world-class cybercriminals building custom malware to exfiltrate data, or of everyone’s favorite acronym of the moment, “APTs”; it was simply an unintentional – but critical – misapplication of security controls.
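To make the failure mode concrete, here is a minimal sketch of the kind of check that catches this class of error: walk a public directory tree and flag world-readable files that appear to contain Social Security numbers. The web-root path and the SSN pattern are illustrative assumptions, not details from the UNCC incident.

    import os
    import re
    import stat

    WEB_ROOT = "/var/www/public"                    # hypothetical public directory
    SSN_RE = re.compile(rb"\b\d{3}-\d{2}-\d{4}\b")  # naive SSN pattern

    def find_exposed_ssn_files(root):
        """Yield world-readable files whose contents match the SSN pattern."""
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    if not os.stat(path).st_mode & stat.S_IROTH:
                        continue                    # skip files "other" cannot read
                    with open(path, "rb") as f:
                        if SSN_RE.search(f.read(1_000_000)):  # scan first 1 MB only
                            yield path
                except OSError:
                    continue                        # unreadable or vanished file

    if __name__ == "__main__":
        for exposed in find_exposed_ssn_files(WEB_ROOT):
            print("possible exposure:", exposed)

Nothing here requires a signature or an event stream; it is a pure inspection of state.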

In 2006, Gartner analyst John Pescatore said that system misconfigurations were the attack vector responsible for more than 65% of successful data breaches. Fast forward six years, and we’re unfortunately still in the same boat. So, with all that hindsight, why weren’t these security misconfigurations and ACL issues identified by UNCC? Don’t they have basic monitoring tools that should have picked up the error, even if it wasn’t spotted manually? Starting with the second question: UNCC has, I suspect, the traditional security and IT operations tools that most large organizations have – a SIEM, signature-based detection tools like antimalware and IDS/IPS, perhaps some flow-level and packet-level network traffic analysis tools, and the like. Their toolset is probably not too dissimilar from the setup most of us have within our own organizations.

So, if UNCC had all of the usual monitoring tools, why wasn’t the data leak picked up? The answer is easy – it probably was; it just wasn’t obvious. For years now, the focus of security has been on two things: signature-based attacks and event-based analysis. Unfortunately, what UNCC encountered was neither: there was no “attack,” and the problem manifested itself not as an event but as a piece of state data (i.e., how something is configured, not what it does). The traditional event/signature/network security toolset couldn’t see the problem and had no way of alerting UNCC personnel.
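What would a state-based check look like? A minimal sketch, assuming hypothetical paths and a locally stored baseline file: snapshot the permission bits on sensitive paths, then alert on any drift from a known-good baseline. Real configuration-audit tools do far more, but the shape of the check is the point.

    import json
    import os
    import stat

    SENSITIVE_PATHS = ["/srv/records", "/srv/records/students.csv"]  # assumed paths
    BASELINE_FILE = "perm_baseline.json"

    def snapshot(paths):
        """Record octal permission bits for each path that exists."""
        return {p: oct(stat.S_IMODE(os.stat(p).st_mode))
                for p in paths if os.path.exists(p)}

    def drift(baseline, current):
        """Return paths whose permissions differ from the recorded baseline."""
        return {p: (baseline.get(p), mode)
                for p, mode in current.items() if baseline.get(p) != mode}

    if __name__ == "__main__":
        current = snapshot(SENSITIVE_PATHS)
        if os.path.exists(BASELINE_FILE):
            with open(BASELINE_FILE) as f:
                baseline = json.load(f)
            for path, (was, now) in drift(baseline, current).items():
                print("state drift on %s: baseline %s, now %s" % (path, was, now))
        else:
            with open(BASELINE_FILE, "w") as f:
                json.dump(current, f)               # first run: record the baseline

Run on a schedule, a check like this raises a flag the moment an ACL silently opens up – exactly the signal an event-driven toolset never produces.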

Unfortunately, this is a serious problem common to information security infrastructures throughout the world. For far too long, we have ignored state-based data as part of our security monitoring profile. IT operations personnel use “thick” agent-based tools (think IBM/Tivoli, Microsoft SCCM, and others) to enforce configuration standards, but far too often those standards are not driven by information security. Instead, we wind up with a “barrel o’ data” that often excludes critical security information: we look at events and network traffic, but not asset state. We look at asset state, but only on hosts or only on network devices, not both. We collect all the right data, but have no way of correlating it to identify which abnormal traffic preceded an attack, or which events were precursors to an unauthorized configuration change. In short, we always seem to be missing that elusive center piece of the proverbial security puzzle.
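That missing center piece is correlation. A minimal sketch, using made-up record shapes and an assumed ten-minute window: for each configuration change, gather the events seen on the same host just before it.

    from datetime import datetime, timedelta

    WINDOW = timedelta(minutes=10)                  # assumed correlation window

    # Hypothetical normalized records, e.g., parsed from a SIEM export.
    events = [
        {"host": "web01", "time": datetime(2012, 5, 1, 9, 55), "msg": "admin login"},
        {"host": "web01", "time": datetime(2012, 5, 1, 9, 58), "msg": "ACL utility run"},
    ]
    state_changes = [
        {"host": "web01", "time": datetime(2012, 5, 1, 10, 0),
         "msg": "share ACL changed to Everyone:Read"},
    ]

    def correlate(events, changes, window=WINDOW):
        """Pair each state change with same-host events in the preceding window."""
        for change in changes:
            yield change, [e for e in events
                           if e["host"] == change["host"]
                           and change["time"] - window <= e["time"] <= change["time"]]

    for change, precursors in correlate(events, state_changes):
        print(change["msg"])
        for e in precursors:
            print("  precursor:", e["time"], e["msg"])

The join key here is trivial (host plus time window), but even that simple step turns three disconnected log entries into a narrative a human can act on.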

Unfortunately, this lack of complete visibility across the spectrum of security data – which today goes by several names, including continuous monitoring, “holistic security,” and situational awareness – is why most errors are picked up only during infrequent compliance audits, or after a data breach or other security issue has already occurred (as appears to have happened in this case). Regardless of the name we use for it, we in the security industry need to start thinking differently about security, by acquiring both the right tools and the right analytical mindset.

Today, effective information security isn’t simply about watching the “antivirus and IDS dashboards.” We need to correlate information from a variety of sources so that informed decisions can be made. Without this change in focus, our industry will continue to experience data breaches that, according to the 2012 Verizon Data Breach Investigations Report [3], could have been avoided in more than 90% of cases through simple or intermediate security controls – including the kind of configuration and access control issues encountered at UNCC.
