Why Computer Science Programs Don’t Require Cybersecurity Classes
Last week, security firm CloudPassage issued a report revealing that most of the top-ranked computer science programs in the United States don’t require a cybersecurity course. The news has attracted considerable attention as a sign of universities’ inadequate commitment to training a cybersecurity workforce for the future. “The threat of hacking seems to lurk around every corner, but American universities may not be doing enough to prepare the next generation of cyberdefenders,” wrote Andrea Peterson in the Washington Post.
But that criticism is largely unwarranted, both because computer security is often a module in courses on other computer science topics, such as networking, programming, and operating systems, and because, despite the frenzy of efforts to educate cybersecurity experts, we still know very little about how best to teach them—or even what they need to know.
That hasn’t stopped the United States from committing a lot of money to the problem. The $19 billion cybersecurity initiative President Obama proposed earlier this year included $62 million for a new CyberCorps Reserve program that would train and provide scholarships to people who take cybersecurity jobs with the federal government. It’s the latest in a string of efforts by the United States and other countries to recruit and train more cybersecurity workers, with projections suggesting there could be as many as 1.5 million unfilled jobs in the field by 2020. Specialized cybersecurity degree and scholarship programs are cropping up everywhere—including at my own university, the Rochester Institute of Technology—and growing rapidly in size, filled with students who want to acquire a relevant and employable skillset.
But what is that skillset? What are the things that every person training to be a cybersecurity professional over the course of the next decade should know? There’s surprisingly little consensus around this question—or perhaps it isn’t surprising at all, given how new the phenomenon of cybersecurity training programs is and how rapidly the field is changing.
Of course, people in all professions—from doctors to lawyers to engineers—have to deal with new discoveries and developments in their field. But most established fields have a fairly uniform set of expectations around what constitutes the critical, foundational skills and modes of thinking that you will learn regardless of where you go to school to become a surgeon, corporate lawyer, or mechanical engineer. Even computer science, a relatively recent addition to the academic world, has developed a reasonably standardized core curriculum that students are expected to learn—and that teachers know they should be teaching. These days, you wouldn’t receive a degree in computer science without some background in algorithms, programming, systems, and networks—and many of your core courses on those topics would likely include some material on the relevant security issues. That approach makes sense: securing networks is, in many ways, a different skillset from writing secure code or designing secure devices, so security is often best learned in the context of each topic.
Still, it’s possible that requiring a security-specific course could be valuable—after all, I teach cybersecurity courses, so I’m certainly not opposed to them on principle. But before we pour too many more millions of dollars and thousands of students into these programs, shouldn’t we have some clearer sense of what they should be learning about security, specifically? Do they need to be trained in software security? Hardware security? Penetration testing? Anomaly detection? Security economics and metrics? Policy and international conflict?
This is surely a challenge faced by every new and emerging field—defining the essential body of knowledge, breaking it down into units and courses and curricula, casting envious looks at the well-established divisions and sequences honed for centuries in fields like mathematics, chemistry, or foreign languages. What makes cybersecurity different is that there has been a very short grace period for figuring out how it should be taught and what the workforce of the future should know. Computer security is perceived as such a devastating and dangerous threat—with the potential to bring down the power grid, cripple national security, cost the economy billions, and destroy any notion of personal privacy—that we’re going from nothing to trying to train thousands upon thousands of workers almost overnight. Even as those of us who teach it are wrestling with these questions and developing tentative curricula, training programs for people of all ages and expertise levels are multiplying around the world, built on very different models.
In Israel, a training program called Magshimim offers after-school training in cybersecurity to middle- and high-school students from groups “in the periphery” that are underrepresented in the IDF’s Intelligence Corps. In the U.K., an elaborate suite of somewhat bizarre computer games and online puzzles called the Cyber Security Challenge is used to cultivate and identify talent in the field. In China, the People’s Liberation Army runs its own Information Engineering University in Zhengzhou, and earlier this year Chinese philanthropist Henry Cheng announced a donation of about $45 million to the China Internet Development Foundation to support additional cybersecurity training. And in addition to the newly proposed CyberCorps Reserve, the United States already has a CyberCorps scholarship program, as well as a cadre of other initiatives run out of the National Initiative for Cybersecurity Careers and Studies.
At a moment when almost every area of cybersecurity policy is contentious—from law enforcement’s access to encrypted smartphones to the appropriate channels and protections for threat information sharing—one thing many governments seem able to agree on is the need for more cybersecurity professionals. Terrified that their countries will get left behind in the rush to hire security workers in both the public and private sectors, policymakers are funneling more and more money into cybersecurity workforce training and capacity building.
Everyone is certain that they need a much larger cybersecurity workforce than they currently have, but no one is sure what that workforce will need to know, or how to go about teaching them. Figuring that out will take time—time for trial and error, disagreement, and careful assessment and comparison of the different models currently being piloted around the world. Over that time we may also discover that we’ve been training people the wrong way, if graduates end up lacking the skills they need to protect against whatever prove to be the most viable and imminent threats posed by computer technology. Ideally, of course, that period of figuring out what works and what doesn’t for workforce training would not coincide with trying to train more than 1 million people to fill cybersecurity jobs, but the ramp-up in numbers seems inevitable at this point.
In light of that, it would be nice to see policymakers in the United States and elsewhere focus a little more attention—and funding—on assessing and comparing cybersecurity education programs, as well as the quality of the students they churn out. The frenzy over training a sufficient quantity of cybersecurity workers may come at the expense of the quality of that training, and the desire to profit from providing that training may lead to too much competition in a space that would be better served by collaboration and cooperation right now.
Developing high-quality cybersecurity training programs will require us to be smart about drawing on existing expertise and experience being garnered—sometimes painfully—in the private sector, the military, the civilian branches of government, and overseas. We need to pay attention to content and take advantage of all the hard lessons we’re still learning about what online threats can look like and how they can—and cannot—be defended against. We need to try to be both creative and rigorous not only in designing training programs but also in assessing them and the ability of their graduates to protect us against the threats of the future.
Our future cybersecurity workforce will only be as good as our training programs, and right now no one—not even those of us who teach in them—really has a clear sense of how effective those programs are. And how could we? For the most part, we’re making it up as we go along.
This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture.
http://www.slate.com/articles/technology/future_tense/2016/04/why_computer_science_programs_don_t_require_cybersecurity_classes.html