If there’s one constant in cybersecurity, it’s that security leaders embrace new opportunities — and confront unprecedented challenges — with each new year. Emerging technologies, new attack vectors and even geopolitical crises shape the day-to-day evolution of the chief information security officer’s job. 

In 2024, CISOs will have no shortage of the new to embrace. Security leaders must contend with the application of artificial intelligence in both cyberattacks and defense; a U.S. election cycle with a major focus on election security and fraud prevention; a new SEC disclosure rule requiring publicly traded companies to report material incidents; and a global geopolitical environment fraught with conflict.

All of these factors bring risk, and even fear, to a company's employees, its customers, and its customers' customers. In 2024, CISOs must strive to build trust and confidence in the face of these uncertainties.

Crypto comeback

The recent approval of Bitcoin ETFs has spurred significant gains in cryptocurrency values, lifting the market out of its 2022 trough. In 2021, when cryptomania gripped the globe, crypto-related theft and fraud skyrocketed. Once again, security leaders can expect new malware campaigns that plant crypto miners, account takeovers at cloud service providers, and crypto-related “smash and grabs” in which wallets left without the right protections are emptied. And the international groups fighting ransomware will stay very busy.

Add in the proliferation of cloud-based GPUs driven by the expansion of generative AI, and a combination of factors emerges that will create significant security challenges at the intersection of cryptocurrency and AI, especially because GPUs are among the most efficient hardware for mining and AI-produced content is often traded in crypto.

AI: friend or foe?

From a cybersecurity and fraud perspective, AI has created an increasingly tricky environment. On a positive note, AI will likely complement security defenses by accelerating threat detection and response. On the other hand, AI can also be misappropriated to create even more sophisticated attack vectors, particularly in social engineering and fraud.

Fraudsters with access to cheap AI will perfect their grifts in English or other target languages and become more successful at exploiting vulnerable populations such as the elderly, the poor, and internet users in developing countries. This will create big questions we must wrestle with as a society: Is maintaining the safe use of AI the responsibility of its creators? How expensive will it be to maintain safety, and who will oversee it? Who pays for damaging fraud schemes?

Similar to other challenges in harmful content creation and content risk management, this will force CISOs to carefully navigate the risks and rewards that come from AI implementation. 

Geopolitical collateral damage

At the outset of each international conflict in recent years — primarily Russia’s invasion of Ukraine and the Israel-Gaza conflict — security leaders observed a surge in cyberattacks. Some of that was directly related to the wars, but much was driven by activist groups looking to influence political or military outcomes. 

It’s a trap for CISOs to assume that they have nothing to worry about if they aren’t part of a governmental or military supply chain.

Malware is often poorly targeted, or cast intentionally wide in an attempt to reach its final target, with little consideration for potential collateral damage. Ongoing conflicts, or new ones that emerge in 2024, will continue to pressure CISOs to consider threat vectors beyond the immediate risk posture of their business.

Election-year challenges

It’s been eight years since misinformation campaigns tainted the 2016 U.S. elections. And while the same tactics may not be directly employed this year, the 2024 election cycle will likely come with its own set of challenges. Are content platforms like ‘X’ ready to take on the responsibility of monitoring and managing disinformation or misinformation that could have a direct effect on an election? 

And that’s not all. Objectionable content, employee communications and customer relationships have all become part of the election-year business narrative, and generative AI could amplify these challenges. Think that’s not a CISO’s concern? That’s worth reconsidering. Employee and customer trust are often a direct outcome of a CISO’s work, and where cloud platforms are involved, trust has a much wider reach.

SEC material incident filings

In December 2023, the SEC gave CISOs something new to sweat over: newly implemented rules that require publicly traded companies to report material cybersecurity incidents within four business days of determining that an incident is material.

That means CISOs must be able to help determine materiality and all of the nuances that come with that one powerful word. That requires a strong understanding of how a business operates, markets itself and records and collects revenue. 

One outcome is that CISOs may increasingly find themselves working side-by-side with Chief Revenue Officers and Chief Marketing Officers in addition to their technology and finance counterparts. To be successful, a public company CISO must be able to translate the deep technical decisions they make into business outcomes around the customer lifecycle or product development.

Counteracting FUD with trust

If there’s one thing the internet is very good at, it’s spreading fear, uncertainty and doubt at an alarming rate. CISOs have a role in counteracting this force. In the face of a constant drumbeat of negativity, security leaders must exercise their strengths to drive positive change and outcomes.

While the risk posture may look different in 2024, it is no more dire than at other points in modern history. CISOs must step into the role of a truly cross-functional and cultural leader within their companies to strengthen trust with employees and customers.