RSAConference | By: Bob Ackerman | Tuesday February 20, 2018
Nearly 85 U.S. universities today offer undergraduate and/or graduate degrees in cybersecurity, up from a tiny handful a decade ago. Some of these are among the top universities in the country, such as Dartmouth, Johns Hopkins and New York University. There are also about a dozen two-year college programs, as well as about 25 accredited online cybersecurity degree programs.
And yet the cybersecurity talent shortage continues to grow unabated and, worse yet, surveys of cybersecurity industry players consistently show that less than 25% of new cyber specialists are prepared to work without extensive hands-on training.
The reason is not simply that cybersecurity keeps changing, even though it does. Rather, it is that all the growth in degreed cyber programs – impressive as it is – does not come close to filling a workforce gap expected to widen to 1.5 million jobs globally by next year. At least one report says that demand for cybersecurity professionals has grown three and a half times faster than the average for other IT sectors.
As a result, folks with backgrounds as network engineers, systems administrators and programmers take a few cyber courses and are offered jobs in a bid to fill the gap. They don’t know as much as they should. Yes, they learn on the job – but by then they become attractive to other prospective employers. Nearly half of cybersecurity pros are approached by headhunters every week, surveys show.
The solution? Part of the answer is to recast university computer science programs, which are more diversified and hence attract more students, to require cybersecurity courses. After all, cybersecurity impacts virtually all IT systems. Most computer science programs today don’t mandate even one cybersecurity course, partly because of disagreements about what students need to know. This must be resolved.
The best trained cyber pros are and will remain those with computer science degrees who take cyber courses, or those with four-year cybersecurity degrees, who have a professional internship under their belt and can step into jobs in intrusion detection, security software development and attack mitigation with minimal help. Most important, they have the broad, well-developed backgrounds, including problem solving and engagement skills, required to grow and master new areas of security as the industry keeps evolving at a non-stop pace.
This will not happen overnight, of course. So here are existing efforts that should be aggressively expanded to help combat the cybersecurity talent shortfall near-term:
- Begin building a pool of talent at an early stage. Probably the best such example is the Pathways in Technology Early College High School (P-TECH) in Brooklyn. Backed by companies such as IBM and championed by the New York City Department of Education, P-TECH students take four years of high school with a focus on science, technology (including cybersecurity), engineering and math while pursuing a two-year associate’s degree at New York City College of Technology. Graduates who apply for a job at IBM are first in line for an entry-level tech position.
The goal is 100% completion of an associate degree within six years. Hundreds of students attend the program annually. Students only need to live in the area and be interested in science or math. The program has been expanded to more than 60 schools globally.
- Create more “new collar” jobs. IBM is also addressing the cyber talent shortage by creating what it calls “new collar” jobs. These prioritize skills, knowledge and willingness to learn over degrees. New collar employees pick up the necessary skills through on-the-job training, industry certifications and community college courses. They represent 20% of IBM cybersecurity hires since 2015.
- Build more cyber boot camps and improve them. These intensive programs train people in key cyber skills and help them land jobs. Camps include SecureSet Academy in Denver, Open Cloud Academy in San Antonio and Evolve Security Academy in Chicago. In addition, an interesting hybrid between a boot camp and a community college program is the City Colleges of Chicago, which offers a free cybersecurity training program for the military and civilians.
These programs could be better, however. A key problem is that they don’t adequately prepare students for senior cybersecurity positions.
- Incentivize more colleges and universities to compete in cyber contests. They would compete in regional competitions and ultimately strive to reach the annual National Collegiate Cyber Defense Competition in San Antonio. A fair number of universities already compete and periodically win, at least regionally, including the University of Maryland, Arizona State University, Brigham Young University, Rochester Institute of Technology and the University of Washington. But the more competitors, the better. New talent is typically motivated by exciting challenges. A student who starts out as a hacker may become a sophisticated problem solver.
- Encourage more technology giants with cybersecurity units to team up on some fronts to improve cybersecurity efficiency. Cisco and IBM, two of the top 10 acquirers of cyber startups, recently announced a partnership to help their customers rationalize a surfeit of cybersecurity tools. A 2017 Cisco survey of 3,000 security officers found that 65% of their organizations use between six and 50 different security products. Managing such complexity challenges over-stretched security teams and can lead to potential security gaps. Ultimately, Cisco Security Group and IBM Security’s relationship is focused on helping enterprises reduce the time required to detect and mitigate threats.
- Intensify the push into AI to automate security. Given the interminable talent shortage, speedy adoption of AI is required. If computers can do the basic work, humans can focus on decision-making and incident management. One impressive partnership between computers and humans comes out of MIT’s Computer Science and Artificial Intelligence Lab, whose system reviews data from tens of millions of logs daily and singles out anything suspicious. Human analysts then weed out false positives and confirm legitimate threats. The system, which learns from its mistakes, has been able to detect most attacks without human help.
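The analyst-in-the-loop approach described above can be sketched as a simple feedback cycle: a detector flags suspicious log activity, analysts label the flags, and the labels tune the detector. The sketch below is a minimal illustration under stated assumptions, not MIT’s actual system; the z-score detector, the threshold-nudging rule and all function names are invented for this example.

```python
import statistics

def zscore_flags(counts, threshold):
    """Flag indices whose event count deviates from the mean by more than
    `threshold` population standard deviations (a toy anomaly detector)."""
    mean = statistics.mean(counts)
    sd = statistics.pstdev(counts) or 1.0  # avoid division by zero on flat data
    return {i for i, c in enumerate(counts) if abs(c - mean) / sd > threshold}

def analyst_feedback(threshold, flagged, confirmed_attacks, step=0.25):
    """Nudge the detection threshold based on analyst labels: loosen it when
    attacks were missed, tighten it when flags were false positives."""
    missed = confirmed_attacks - flagged           # real attacks the detector missed
    false_positives = flagged - confirmed_attacks  # benign events it flagged
    if missed:
        return max(0.5, threshold - step)  # become more sensitive
    if false_positives:
        return threshold + step            # become less noisy
    return threshold

# Hourly login counts; the final hour is a brute-force spike.
logins = [10, 12, 11, 9, 10, 95]
flagged = zscore_flags(logins, threshold=2.0)  # flags only the spike at index 5
threshold = analyst_feedback(2.0, flagged, confirmed_attacks={5})
```

The point of the loop is the division of labor the article describes: the machine does the high-volume triage, and scarce human analysts spend their time only on the flagged residue, with their judgments fed back to improve the machine.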
Near-term, business, government and communities should augment these efforts by improving the skills of available talent. Enterprises need to implement ongoing skill-development programs to help existing cyber pros stay sharp as hacking techniques continue to evolve.
Mid- and longer-term, universities need to partner with corporations and governments to create a formal template – probably requiring creation of a cybersecurity educational entity – to determine how to best train cyber professionals. They must require that computer science majors in particular and select other STEM-related majors enroll in some basic cybersecurity courses. Accompanying cybersecurity work-study programs should be created and aggressively marketed.
Most established fields have a relatively standardized core curriculum focused on developing critical and foundational skills deemed necessary for their practitioners. This includes doctors, lawyers, accountants and engineers, among others. Why should cybersecurity pros be any exception? The U.S. must have more, better-trained cyber talent. There is no other option.