We have been following the same cybersecurity approach, more or less, for over a decade. Yet nearly everyone agrees that the problem continues to grow worse. Perhaps we are not on the right course. Maybe we are operating on false assumptions. The following list (to be continued in next month’s column) is meant to promote a dialogue about what, in my view, are widely held cybersecurity myths.
According to frequent headlines in the press, cybersecurity is an issue that has seized the attention of corporate boards and the executives who report to them. The reality is probably more nuanced. Although the largest companies in some sectors are engaged in extensive risk-management efforts, the response across the broader middle market remains uneven at best, says Matthew F. Prewitt, a partner with the law firm Schiff Hardin in Chicago, where he chairs the data security and privacy team and co-chairs the trade secrets and employee mobility team.
In 2009, Heartland Payment Systems announced that it had suffered a devastating breach: 134 million credit cards were exposed through SQL injection attacks used to install spyware on Heartland’s data systems. The company processes payments for debit, prepaid and credit cards, as well as online payments, checks and payroll services.
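To make the attack vector concrete, the following is a minimal, hypothetical sketch of why SQL injection works and how parameterized queries defeat it. The table, column names and payload are invented for illustration and have nothing to do with Heartland’s actual systems.

```python
import sqlite3

# Set up a toy in-memory database (purely illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cards (holder TEXT, number TEXT)")
conn.execute("INSERT INTO cards VALUES ('Alice', '4111-1111-1111-1111')")

user_input = "x' OR '1'='1"  # classic injection payload

# Vulnerable: concatenating input into the SQL string lets the payload
# rewrite the query's logic, returning every row instead of none.
vulnerable = conn.execute(
    "SELECT holder FROM cards WHERE holder = '" + user_input + "'"
).fetchall()

# Safe: the driver binds the input as a literal value, so the payload is
# compared as plain text and matches nothing.
safe = conn.execute(
    "SELECT holder FROM cards WHERE holder = ?", (user_input,)
).fetchall()

print(vulnerable)  # [('Alice',)] -- injection succeeded
print(safe)        # [] -- payload treated as data, not SQL
```

The difference is a single habit: never build queries by string concatenation with untrusted input; always use the driver’s parameter binding.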
Ask most corporate executives to define cybersecurity and their initial thoughts turn to data privacy. That’s for good reason. Companies are bleeding corporate trade secrets and personally identifiable information at such an alarming rate that confidentiality issues and related compliance concerns can’t help but dominate the cybersecurity agenda. Yet, ask cybersecurity professionals what keeps them up at night, and the topic invariably turns to data deletion, tampering with control systems, and the potential to cause physical harm over the Internet. These concerns fall into categories that are distinct from protecting data confidentiality. Instead, they demonstrate the importance of maintaining an enterprise focus on the integrity and availability of your company’s most essential data, systems and services.
Looking at the cyber technology market over the past 15 years, it is evident that the catalyst for cyber evolution was Y2K. Before the Y2K frenzy, “cybersecurity” was subsumed within the systems engineering function, and external threats consisted largely of hackers seeking free computing resources, with very little focus on information/data access or network destruction.
From an executive-level perspective, the greatest shift in cybersecurity relates to focus and responsibility – moving from strictly an “IT issue” to a business function. Look no further than the Target breach and the subsequent resignations of the company’s CEO and CIO to see how cybersecurity has escalated to the C-suite. This was unprecedented 15 years ago, when IT’s primary cybersecurity role was information assurance. So why has the philosophy changed?
Establishing command and control empowers professionals to properly assess risks and determine which threats pose the greatest danger and must be treated as high security priorities. That authority also requires them to identify potential threats that may be considered “acceptable risks” to the organization – meaning they are worth keeping an eye on, but don’t warrant a significant security investment.
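The triage described above is often formalized as a simple likelihood-times-impact score with a threshold separating high-priority threats from acceptable ones. The sketch below is a hypothetical illustration; the threat names, 1–5 scales and threshold are invented for the example, not a prescribed methodology.

```python
# Each threat gets a (likelihood, impact) pair on an assumed 1-5 scale.
threats = {
    "ransomware on payment systems": (4, 5),
    "phishing of finance staff":     (5, 3),
    "defacement of marketing site":  (2, 1),
}

THRESHOLD = 10  # scores at or above this warrant security investment


def triage(threats, threshold=THRESHOLD):
    """Split threats into high-priority risks and 'acceptable' ones to monitor."""
    high, acceptable = [], []
    for name, (likelihood, impact) in threats.items():
        bucket = high if likelihood * impact >= threshold else acceptable
        bucket.append(name)
    return high, acceptable


high, acceptable = triage(threats)
print("invest:", high)        # scores 20 and 15 clear the threshold
print("monitor:", acceptable) # score 2 is an acceptable risk
```

The point is not the arithmetic but the governance: someone with authority must own the scales and the threshold, or every threat defaults to “high priority.”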