Discover Performance | HP Software's community for IT leaders // March 2014
Security in 2014: New strategies against stronger foes
Security is more than the perimeter, says HP Security Research’s Scott Lambert. Understanding how systems and devices access information, and what really needs protecting, is key.
Last year closed with dispiriting news: a massive security breach affecting such retailers as Target and Neiman Marcus showed that even as defense costs rise, cybercriminals are more successful than ever. But in 2014, there’s plenty we can do to offset the outlays of time and money that effective security requires—if the business will let us.
To help security best practices take hold in our organizations, the whole business has to get on board, explains Scott Lambert, director of threat research for HP Security Research. Discover Performance recently interviewed Lambert to find out how organizations can expect to be targeted by attackers, and why some of the biggest roadblocks to success will come from within.
Q: Enterprises are dealing more and more with targeted attacks. Which types are most prevalent?
Scott Lambert: There are a number of vectors adversaries are using to compromise systems. Primarily they’re operating through vulnerabilities in web browsers or their installed plug-ins, or through other client software running on a given computer.
Targeted attacks occur for various reasons, such as social-political issues or economic gain, and as part of both corporate and government espionage efforts where information is the dominant currency. Generally, enterprises find themselves a target for intellectual property theft, though there is a growing trend of enterprises being attacked to gather information in support of a different target altogether—for example, a subcontractor’s data stolen to gain information that will later be used to infiltrate a government agency’s systems.
In 2013, several software vendors reported through their patch release processes that previously unknown vulnerabilities, or zero days, were being used to carry out targeted attacks. Between December 2012 and December 2013, the vendors we track reported that at least 22 zero-day vulnerabilities were used in targeted attacks. Many of the attacks appeared to have leveraged either spear-phishing or watering-hole tactics to achieve an initial compromise.
While zero days continue to be a growing and very effective tool used in targeted attacks, a significant portion of system compromises still occur because of deployed software that is either unpatched or poorly configured. Unfortunately, our struggle to keep systems patched continues to provide an easy avenue by which adversaries compromise systems.
Q: Patching isn’t a new obligation for IT. Why aren’t we doing a good job at it?
SL: This is actually a pretty complicated question, and the reasons differ based on a given organization’s circumstances and maturity. Keeping software up to date requires several parties to fire on all cylinders—affected software vendors, IT departments, and even other software developers who use a component that needs to be updated. It’s easy to see how the complexity of the patching process increases when you consider each of these parties, let alone when you combine them.
Over the years, high-profile software vendors have worked to make the update process as seamless as possible. Microsoft has been a pioneer, for instance. But keeping software effectively patched isn’t a problem we can solve overnight, and there will always be room for improvement. For vendors, two keys are to establish a regular cadence that IT departments can set a schedule around, and to make updating as transparent as possible by enabling automatic updating.
For IT departments, the initial challenge is around visibility. It has gotten harder to know what to patch now that, in addition to "internal IT," there are systems and devices outside IT’s direct control—cloud services and infrastructure and employees’ devices—accessing sensitive data. Now add the challenges of scheduling changes to production systems with other affected parties, having a validation effort in place to mitigate the chances of bringing down critical infrastructure as a result of applying a patch, and the fact that large organizations will always have more patching to do than staff to accomplish the task. It’s not easy.
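The visibility problem Lambert describes — knowing what software is deployed and where it lags behind vendor patches — can be sketched as a simple inventory comparison. This is an illustrative sketch only, not an HP tool or a production scanner; all host names, package names, and version strings are hypothetical.

```python
# Hypothetical inventory gathered from endpoints:
# {host: {package: installed_version}}
inventory = {
    "web-01": {"openssl": "1.0.1e", "bash": "4.2"},
    "db-01": {"openssl": "1.0.1g", "java": "7u45"},
}

# Latest patched versions as published by the (hypothetical) vendors
latest = {"openssl": "1.0.1g", "bash": "4.3", "java": "7u51"}

def unpatched(inventory, latest):
    """Return (host, package, installed, current) for each lagging install."""
    findings = []
    for host, packages in inventory.items():
        for pkg, version in packages.items():
            # Flag anything whose installed version differs from the
            # latest known patched release
            if pkg in latest and version != latest[pkg]:
                findings.append((host, pkg, version, latest[pkg]))
    return findings

for host, pkg, have, want in unpatched(inventory, latest):
    print(f"{host}: {pkg} {have} -> {want}")
```

In practice this comparison is what patch-management and vulnerability-scanning products automate at scale; the hard parts Lambert names — cloud services and employee devices outside IT's control — are precisely the systems that never make it into the inventory in the first place.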
Unfortunately for defenders, there will always be more work to do than we have resources available. That makes understanding how you or others are being compromised, or are likely to be compromised, an imperative when prioritizing your efforts and fortifying your defenses.
Q: What types of threats are overhyped?
SL: Instead of worrying about classifying threats as overhyped, defenders should be focused on taking a more risk-based approach to security that takes into account their own company’s specific needs. Consider all threats in the context of your organization’s current and future IT infrastructure, its existing defenses, and what mechanisms adversaries are currently leveraging to carry out successful attacks. Only then can you judge whether a given threat is "overhyped," or whether it simply poses less risk to your organization than it might to another.
Defenders can’t control whether an attacker will attempt to use a given threat against their environment, but they can control whether their defenses are capable of thwarting a given threat. The real battle is to understand which threats pose the greatest risk to your organization, and to use that information to prioritize investments that address the gaps in your defenses.
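The risk-based prioritization Lambert describes is often reduced to a simple model: score each threat by the likelihood of exploitation in your specific environment times the business impact of the assets it touches, then rank. The sketch below is a minimal illustration of that idea; the threat names, likelihood values, and impact scores are all hypothetical, and a real program would draw them from vulnerability scans, asset registers, and threat intelligence.

```python
threats = [
    # (description, likelihood of exploitation 0-1, asset impact 1-10)
    ("unpatched browser plug-in on finance desktops", 0.8, 9),
    ("zero day in rarely used internal tool", 0.1, 3),
    ("spear-phishing against executives", 0.6, 8),
]

def prioritize(threats):
    """Rank threats by risk = likelihood x impact, highest risk first."""
    return sorted(threats, key=lambda t: t[1] * t[2], reverse=True)

for name, likelihood, impact in prioritize(threats):
    print(f"risk {likelihood * impact:4.1f}  {name}")
```

Note how the ranking can invert a severity-only view: a "critical" zero day in software almost nobody runs scores below a moderate vulnerability sitting on high-value assets — which is exactly the distinction between check-box patching and risk-based prioritization.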
Q: Can you elaborate on this?
SL: Here’s an example. I remember sitting in a weekly "war room" meeting at a particular company where the director of security operations was presenting metrics on the number of unpatched vulnerabilities in the environment and how quickly (or not) they were being patched by the IT organization. The CSO praised his IT team’s impressive responsiveness in patching what were labeled "critical and high-severity" vulnerabilities, but when he asked about the actual risk those prioritized vulnerabilities posed to the organization, he concluded that his team hadn’t done a good job of understanding the organization’s real exposure. In his view, his IT department was performing little more than "check-box" security.
Now, one could easily argue that "critical and high-severity" vulnerabilities should be patched first, considering the impact they can have on a given system and the software it is running. But this CSO considered the possibility that some of those vulnerabilities may have been overhyped in the context of his specific organization. He was asking his team to look at the bigger picture and make sure the company was making the best investments with the resources available.
Q: In our recent conversation with HP’s Art Gilliland, he made clear that we need to stop worrying so much about people getting in, and spend more resources protecting sensitive information. Is that intellectual shift a difficult one for enterprise security?
SL: It is one of many difficult challenges in front of us, but it’s certainly not a new one. We in the security industry have never said to focus on prevention alone. Historically, the industry has called for a defense-in-depth approach, in which detection and remediation are even more important for ensuring business continuity and minimizing losses.
Take a look at any of the recent publicly disclosed breaches, and it’s not hard to see that this is a balance with which defenders continue to struggle. A target must be able to detect and remediate a compromise within minutes or seconds—not hours, days, or years. Unfortunately, deploying solutions that address that part of the equation is non-trivial and expensive, which is one of many reasons you see imbalance between prevention and other forms of defense. And in recent years, the transformation of IT to a more service-oriented entity—one in which sensitive information simultaneously resides both within the perimeter and increasingly outside on private and/or public cloud services and infrastructure—has further forced the issue.
The adversary will always seek out data or information of interest, regardless of where it resides. As data and information move into the cloud, attackers are adjusting accordingly. In 2012, the industry saw malware targeting users of cloud-based payroll services, and we saw malware leveraged to sabotage internal database stores. Two types of incidents reported in 2013 are telling examples of what is to come in 2014 and beyond—namely, malware variants and specific attacks on SaaS offerings to support reconnaissance and exfiltration of sensitive data, as well as outright data destruction. In April 2013, industry researchers saw a targeted attack leveraging a vulnerability in a Microsoft SaaS offering to gain access to an organization’s Microsoft Office site (for example, a hosted SharePoint site); similarly, we saw reports of new Zeus variants targeting users of Salesforce.com for the purpose of data exfiltration. Meanwhile, in September 2013, the industry witnessed an insidious outbreak of the Cryptolocker ransomware, which takes data hostage in a manner that leaves defenders with two choices: pay the ransom, or hope that their backups mitigate the data loss.
Q: What strategies will get us there?
SL: First off, it can’t just be the security team that’s involved. The corporate culture must support your security initiatives. Success hinges on a "we’re all in" attitude. If that attitude is not something your management is instilling across the organization, you’ll end up failing.
Also, security teams need to take a step back from the traditional security controls and ask themselves how they would know that they’ve been breached. How would they know that the sensitive information they are charged with protecting has been compromised or illegally accessed or exfiltrated? Answering these questions will undoubtedly expose gaps. Which of those gaps should be addressed first is best decided by following a risk-based approach to security, as I described before.
Lastly, security teams need to embrace learning from other teams in both similar and dissimilar organizations across the globe. Cybercriminals are quite efficient at sharing intelligence among themselves about what works. It’s time defenders start collaborating and taking back ground by sharing intelligence that is both illustrative and actionable—the latter being the most critical. Thankfully, a number of industry efforts are underway to better facilitate this sharing, and to help ensure it delivers on the intended value proposition.