Even the good news is bad news.
While Joshua Corman didn’t use that exact line in his opening keynote at SOURCE Boston this week, that was a pervasive, and sobering, theme.
Corman, a founder of I Am The Cavalry and director of the Cyber Statecraft Initiative at the Atlantic Council, said he was there to tell some “uncomfortable truths” about the state of cybersecurity – among them that “the critical infrastructure of our space is too big to fail, and it’s failing.”
He said the current statistics are depressing enough – that the database of CVEs (Common Vulnerabilities and Exposures), “which is the predicate for all of our intrusion detection,” holds only about 80 percent of those in existence, and that there is security “coverage” – blocking or detection technology – for only 60 percent of that number. “So you’re at 60 percent of 80 percent,” he said. “At best, you’re getting about 50 percent coverage of the knowns. When you make a risk decision, you’re doing it with a 50 percent blind spot.
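The coverage math Corman quoted can be checked with a quick back-of-the-envelope calculation, using his figures (which are his estimates, not independently verified):

```python
# Corman's figures: the CVE database catalogs roughly 80 percent of known
# vulnerabilities, and blocking/detection tooling covers roughly 60 percent
# of the vulnerabilities that are catalogued.
cve_catalog_share = 0.80   # share of known vulnerabilities with a CVE entry
tooling_coverage = 0.60    # share of CVE entries with detection/blocking coverage

# Effective coverage of the "knowns" is the product of the two.
effective_coverage = cve_catalog_share * tooling_coverage
print(f"{effective_coverage:.0%}")  # prints "48%" – "about 50 percent"
```

Strictly, 60 percent of 80 percent is 48 percent, which Corman rounds to his “about 50 percent coverage of the knowns.”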
“This is a too big to fail thing. It’s like our bridges and tunnels collapsing,” he said, adding that there is even less coverage for industrial control systems (ICS) and medical devices.
“And it’s about to get a lot worse,” he said.
In the cases of ICS and medical devices, the risks go well beyond identity theft, compromised credit card data or loss of privacy. “There is no coverage of vulnerabilities that could have a kinetic effect,” he said, citing the February 2016 ransomware attack on Hollywood Presbyterian Medical Center, which took down the hospital’s computer systems for a week until it paid a $17,000 ransom. That, he said, was the result of “a single Java flaw in a single library.”
The potential life-and-death implications, he said, are obvious. “Can you imagine coming to the emergency room and being told you have to go somewhere else, when seconds count? Or if you had surgery scheduled and it had to be canceled?”
Corman, who is serving on a healthcare cybersecurity task force created under the Cybersecurity Information Sharing Act (CISA), said he initiated discussion of a real-world example of how crucial cybersecurity is in that industry.
“I asked why the death count from the Boston Marathon bombing was so low,” he said. “The answer was that it was blocks from the best medical facilities in the world.
“But what if you combined a cyber attack with a physical attack?” he said, adding that when he raised the question, “everybody in the room turned white.”
The point, he said, is that healthcare cybersecurity “is in critical condition” for reasons that have been documented for more than a decade:
- A severe lack of security talent. “The reality is that 75 percent to 85 percent of healthcare organizations lack a single qualified security person,” he said. “Nobody to apply patches or receive threat intelligence. Which defies logic.”
- Legacy equipment. “Windows XP (which Microsoft no longer supports) is usually the best-case scenario,” he said. “We’re in a bad way, defending really old, difficult to manage systems.”
- Premature overconnectivity. “This is the government’s fault,” he said. “The intent was good, but the problem is there is no threat modeling or secure design. These things were never designed to be connected to anything and are connected to everything.”
- Vulnerabilities that impact patient care.
- An epidemic of known vulnerabilities. “One single medical technology was found to have more than 400 of them,” he said.
These problems are solvable, he said, “but we’re not focused on them.” He said he fears the needed focus won’t come until there are more high-profile incidents – much as water pollution wasn’t taken seriously until the Cuyahoga River in Ohio caught fire repeatedly decades ago.
There are some encouraging signs from government, he said, noting the two-year exemption to the Digital Millennium Copyright Act (DMCA) that allows (with rigorous restrictions) the unlocking of copyrighted software for research purposes.
He also cited the Food and Drug Administration’s recent publication of postmarket cybersecurity guidance for medical devices, which calls for flaws to be acknowledged within 30 days and patched within 60 days.
Some regulation, he agreed, could conceivably put a drag on innovation. “Some of these things I hate,” he said, “but the tradeoff is between doing something or nothing.”
And doing nothing, he said, could lead to something more serious than websites going down due to a distributed denial-of-service attack.
“The consequences of failure involve death,” he said.