Back to Basics to Address Evolving Cyber Threats
The final entry in this four-part series maps closely to the objectives of National Cyber Security Awareness Month, focusing on what is actually required to protect systems effectively against cyberattacks. Concerns about evolving cyber threats – a major theme of this article series – have the industry talking about the topic more than ever. Yet although the threat landscape has changed dramatically, many system vulnerabilities have remained the same over the past 15 years. That is not to say the number of attackers, or the volume of sensitive information stored in these systems, has not grown significantly. The problems organizations face are not necessarily new vulnerabilities, but long-standing ones that are only now receiving the attention they require.
A prime example is the recent JPMorgan Chase data breach, which affected 76 million households and seven million small businesses. Although this successful attack is widely reported to have come from an organized cybercrime ring based in Russia – a threat scenario that was probably unthinkable 15 years ago – it is a case study in a problem that has plagued the industry for years: how to detect and block a persistent attack. The same scenario has played out again and again, from Target and Home Depot to Albertsons and Dairy Queen, leading many to ask how organizations can address these cyber threats.
Going Back to the Basics
The previous article in this series discussed the advancement of cyber tools and technologies – all the shiny new objects organizations have at their disposal. Yes, cyber technology is progressing for the better, and the market is in much better shape. That said, organizations are often more concerned with buying and integrating the latest and greatest into their security posture. In actuality, however, taking a step back to address the basics might serve them more effectively.
Heartland Payment Systems Chairman and CEO Robert Carr has been outspoken on this point in light of the recent retail and financial institution breaches. In 2008, his organization suffered what was, at the time, the largest breach in history, with 130 million debit and credit card accounts accessed. After the breach, Carr went back to the basics, implementing end-to-end encryption and tokenization across Heartland's security infrastructure. These technologies are not new, and they are certainly not considered “sexy,” but they are strong, stable solutions that are often overlooked in favor of the “next big thing.”
Organizations need to take a step back and look at their infrastructure and assets, prioritizing vulnerabilities by potential impact so that the worst problems are fixed first. Instead of turning to the latest emerging technology, they should ensure they have the foundational tools in place to account for 100 percent of their systems inventory, manage configurations, scan regularly for vulnerabilities and flag the most significant weaknesses for immediate remediation.
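The impact-first triage described above can be sketched in a few lines. This is a minimal, hypothetical illustration – the `Finding` record and the CVSS-style impact scores are invented for the example, not drawn from any particular scanner's output:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    host: str       # asset from the systems inventory
    vuln_id: str    # identifier reported by the vulnerability scan
    impact: float   # severity score, e.g. a CVSS base score (0.0-10.0)

def prioritize(findings, top_n=3):
    """Return the highest-impact findings first, so the worst
    problems are queued for remediation before anything else."""
    return sorted(findings, key=lambda f: f.impact, reverse=True)[:top_n]

# Illustrative scan results only
findings = [
    Finding("db01", "CVE-A", 9.8),
    Finding("web02", "CVE-B", 5.3),
    Finding("mail01", "CVE-C", 7.5),
    Finding("web03", "CVE-D", 4.0),
]

for f in prioritize(findings):
    print(f.host, f.vuln_id, f.impact)
```

In practice the scoring would come from a scanner and be weighted by asset criticality, but the principle is the same: rank by impact, remediate from the top.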
Consider Continuous Monitoring
Although going back to the basics is a priority, organizations should also consider integrating a continuous monitoring approach. Continuous monitoring – which was made mainstream by the Department of Homeland Security (DHS) with its Continuous Diagnostics and Mitigation (CDM) program – is a process of conducting ongoing, real-time checks for compliance and risk, providing an accurate, near real-time state of network security. The key to continuous monitoring is implementing the tools and processes to understand and manage the hardware and software inventory of the enterprise while scanning routinely to identify and remediate vulnerabilities.
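The core of the monitoring loop described above – knowing the inventory and routinely checking each host against its approved configuration – can be sketched simply. Everything here is a hypothetical illustration: the host names, the configuration keys and the idea of fingerprinting a config with a hash are assumptions for the example, not any specific CDM tooling:

```python
import hashlib

def fingerprint(config: dict) -> str:
    """Hash a host's observed configuration so drift is cheap to detect."""
    canonical = repr(sorted(config.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()

def check_drift(baseline: dict, observed: dict) -> list:
    """Compare each host's observed config against the approved baseline;
    return the hosts that have drifted or are missing from inventory."""
    drifted = []
    for host, config in observed.items():
        if host not in baseline:
            drifted.append((host, "unknown host - not in inventory"))
        elif fingerprint(config) != baseline[host]:
            drifted.append((host, "configuration drift"))
    return drifted

# Approved baseline: host -> fingerprint of its last-known-good config
baseline = {"web01": fingerprint({"ssh": "v2", "tls": "1.2"})}

# What the latest scan observes
observed = {
    "web01": {"ssh": "v2", "tls": "1.0"},  # downgraded TLS -> drift
    "db99": {"ssh": "v1"},                 # never inventoried
}

for host, reason in check_drift(baseline, observed):
    print(host, reason)
```

A real continuous-monitoring pipeline runs a check like this on a schedule and feeds the flagged hosts into remediation, but the two failure modes shown – unmanaged assets and configuration drift – are exactly what the ongoing checks are meant to surface.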
The government is leading the way with the CDM initiative. CDM, and its $6 billion contract vehicle, set the stage to fortify federal “.gov” networks – and the often classified, sensitive and personal data that resides on them. The commercial market is looking to adopt a similar approach, but is challenged by dynamic, constantly changing network environments driven by acquisitions, mergers, expansions, contractions and the like. It is important to move beyond purely compliance-focused reporting toward combating threats consistently and proactively, enumerating the worst problems and prioritizing their remediation based on impact.
No matter how comprehensively an organization believes its cyber systems are designed to protect against new and emerging threats, continuous monitoring enables it to stay a step ahead by closing the most problematic attack vectors.
A lot has transpired in cybersecurity technology, mostly for the better. These advancements, however, have sometimes led organizations astray in securing their infrastructure and operations. It is still important to pursue new approaches, such as continuous monitoring, and emerging technologies, but it is just as critical to take that step back and conduct the due diligence to validate that foundational capabilities are in place.
Previous articles in this series: