High-profile events, including the SolarWinds and Colonial Pipeline attacks, have catapulted cybersecurity and information technology into the limelight, capturing the attention of organizations and government alike. In fact, preventing potentially devastating cyberattacks — whether aimed at financial gain, intellectual property, or nation-state secrets — has become a top priority for the Biden administration.

The Cybersecurity and Infrastructure Security Agency (CISA), for example, has urged U.S. organizations to put their ‘Shields Up,’ and the White House requested a budget increase of $10.9 billion for cybersecurity-related spending in 2023. Cybersecurity has gone from a box-ticking exercise to an integral part of every organization.

Now the question is — how does higher education support our next generation of cyber warriors? It’s a much-needed profession, one that must combine a breadth of general IT skills with specific security training, but we have a long way to go in attracting and supporting students interested in this career path. Higher education institutions also need to be better equipped to defend themselves. Recently, Lincoln College in Illinois shuttered its doors after 157 years due to the combined toll of the pandemic and a ransomware attack.

Why Cybersecurity Education Is Needed 

A cyberattack occurs every 39 seconds. That statistic alone might answer the question of why cybersecurity education is needed. But it’s not just the sheer volume of attacks; it’s the impact they are having. Last year, a cyberattack shut down the largest fuel pipeline in the U.S. and caused fuel shortages across the entire East Coast.

But cybersecurity and IT education can’t only be looked at through the lens of supply and demand to fill these jobs for the public and private sectors. It’s also up to our nation’s college students to seek degrees related to this field. Generally speaking, most students research the fields they are interested in to see whether an education in that area is a worthwhile investment; whether the field is growing and whether it pays well are factors that come into play.

It won’t take much research on cybersecurity to find that, yes, the field is growing much faster than average — 33% between 2020 and 2030, according to the U.S. Department of Labor. And yes, it pays well, with a median pay of $103,590 per year. STEM degrees generally seem to follow this growth, with the number of degrees awarded continuing to rise each year.

With the market and student interest in the right place, the responsibility for cybersecurity education then falls to the higher education institutions themselves. Many schools do offer degrees that prepare our nation’s youth to become software developers and engineers, but few properly prepare students for the cybersecurity component of this field. 

The Current State of Cybersecurity and IT in Higher Ed

Now the question is — how are U.S. higher education institutions preparing students for jobs in a field that is expanding so rapidly? Unfortunately, not enough colleges and universities are focusing on it. As recently as 2019, only 3% of U.S. bachelor’s degree graduates had any cybersecurity-related skills. The key is to understand why this percentage is so low and what we as an industry can do about it.

Until recently, cybersecurity wasn’t the pressing concern it has quickly become. Add to this that cybersecurity is constantly evolving, with new tools and practices emerging frequently, and higher education institutions face a real challenge in developing a cybersecurity curriculum and preparing the next generation of IT professionals.

Tactics to Prepare the Next Generation of IT 

As colleges update their curricula and add cybersecurity training and degree programs, here are some tactics they should implement to truly and properly prepare students to be professionals in this field:

  1. Adjusting Academia: As we teach students how to write and deploy code, we need to teach them how to do so securely. So much of the code being pushed today is insecure, laying the groundwork for frequent attacks and breaches. A huge step toward improving this is coding securely from the start, which should be an essential piece of any software-related degree today. Moreover, cybersecurity should be an element of any business degree — everyone needs to know the basics. 
  2. Hands-On Experience: Colleges and universities can put together lab environments where students learn to collect data, conduct forensics, and remediate flaws. These labs need to offer real cybersecurity tools and solutions, from those that scan for vulnerabilities to those that automate patching. Nothing can prepare a student like real-world experience. Cybersecurity labs that leverage real tools and real flaws are needed so students can take what they learn in the classroom and apply it. 
  3. Partner with Leading Cyber Firms: There are hundreds of cybersecurity firms in the U.S. alone. Colleges and universities can look to partner with these organizations, providing internship/college credit experiences that give students a real look at a day in the life and further their hands-on experience.
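The secure-coding point above is often taught with a classic illustration: SQL injection. As a minimal sketch (using Python’s standard-library `sqlite3` module and a hypothetical `users` table invented for this example), the snippet below contrasts a vulnerable query built by string interpolation with a parameterized query, the kind of side-by-side comparison a curriculum or lab might walk students through:

```python
import sqlite3

# Hypothetical in-memory database with a small users table, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user_unsafe(name):
    # Vulnerable: attacker-controlled input is spliced directly into the SQL string.
    query = f"SELECT name, role FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # Safer: the '?' placeholder makes the driver bind the value as data, never as SQL.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = ?", (name,)
    ).fetchall()

# A classic injection payload that turns the WHERE clause into a tautology.
payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # leaks every row in the table
print(find_user_safe(payload))    # returns nothing: no user literally has that name
```

The unsafe version returns the entire table when fed the payload, while the parameterized version correctly returns no rows — a concrete demonstration of why “code securely from the start” belongs in any software-related degree.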

The state of cybersecurity in our country is changing rapidly. The need to prioritize cybersecurity emerged quickly, and it will take some time before we have the supply of professionals, methodologies, and tools to meet this challenge head on. A critical part of improving cybersecurity in the U.S. falls to higher education institutions, which must give the next generation of IT and cybersecurity professionals the education and training necessary to secure our increasingly connected world.