It may seem strange in a week when Megaupload’s owners were arrested and SOPA / PROTECT IP went under, but cybersecurity is the most important Internet issue out there. Examples? Chinese corporate espionage. Cyberweapons like Stuxnet. Anonymous DDoSing everyone from the Department of Justice to the RIAA. The Net is full of holes, and there are a lot of folks expert at slipping through them.
I argue in a forthcoming paper, Conundrum, that cybersecurity can only be understood as an information problem. Conundrum posits that, if we’re worried about ensuring access to critical information on-line, we should make the Net less efficient – building in redundancy. But for cybersecurity, information is like the porridge in Goldilocks: you can’t have too much or too little. For example, there was recent panic that a water pump burnout in Illinois was the work of cyberterrorists. It turned out that it was actually the work of a contractor for the utility who happened to be vacationing in Russia. (This is what you get for actually answering your pager.)
The “too little” problem can be described via two examples. First, prior to the attacks of September 11, 2001, the government had information about some of the hijackers, but was impeded by lack of information-sharing and by IT systems that made such sharing difficult. Second, denial of service attacks prevent Internet users from reaching sites they seek – a tactic perfected by Anonymous. The problem is the same: needed information is unavailable. I think the solution, as described in Conundrum, is:
increasing the inefficiency with which information is stored. The positive aspects of both access to and alteration of data emphasize the need to ensure that authorized users can reach, and modify, information. This is more likely to occur when users can reach data at multiple locations, both because it increases attackers’ difficulty in blocking their attempts, and because it provides fallback options if a given copy is not available. In short, data should reside in many places.
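The storage principle in the excerpt can be sketched in a few lines of code: an authorized user who can reach any one of several copies of the data is far harder to cut off than one who depends on a single server. This is only an illustration of the idea, not a prescription; the mirror names and data below are hypothetical.

```python
# Minimal sketch of redundant storage with fallback access: try each copy
# of the data in turn, and succeed as long as any one copy is reachable.

def fetch_with_fallback(mirrors):
    """Try each (name, fetch) pair in order; return the first reachable copy."""
    failures = []
    for name, fetch in mirrors:
        try:
            return fetch()
        except IOError as exc:  # this mirror is down, blocked, or under attack
            failures.append((name, str(exc)))
    raise IOError("all %d mirrors failed: %r" % (len(failures), failures))

def blocked():
    # Simulates a copy made unreachable, e.g., by a denial-of-service attack.
    raise IOError("connection refused")

# Hypothetical mirrors: two are blocked, one survives.
mirrors = [
    ("mirror-a.example", blocked),
    ("mirror-b.example", blocked),
    ("mirror-c.example", lambda: "critical records"),
]

print(fetch_with_fallback(mirrors))  # prints "critical records"
```

The inefficiency the paper argues for is visible here: keeping three copies costs more than keeping one, but an attacker must now block every copy, not just the first.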
But there is also the “too much” problem, exemplified by the water pump fiasco. After 9/11, the federal government, including the Department of Homeland Security, began a massive information-sharing effort, such as through Fusion Centers. The difficulty is that the Fusion Centers, and other DHS projects, are simply firehosing information onto the companies that constitute “critical infrastructure.” Much of this information is repetitive or simply wrong – as with the water pump report. Bad information can be worse than none at all: it distracts critical infrastructure operators, breeds mistrust, and consumes scarce security resources. The pendulum has swung too far the other way: from undersharing to oversharing. Finding the “just right” solution is impossible; this is a dynamic environment with constantly changing threats. But the government hasn’t yet made the effort to synthesize and analyze information before sounding the alarm. It must, or we will pay the price of either false alarms, or missed ones.
(A side note: I don’t put much stock in which federal agency takes the lead on cybersecurity – there are proposals for the Department of Defense, or the Department of Energy, among others – but why has the Obama administration delegated responsibility to DHS? Having the TSA set Internet policy hardly seems sensible. Beware of Web-based snow globes!)
Cross-posted at Concurring Opinions.