It’s the season for “cyberthreat” information sharing proposals. There’s the White House plan, announced in January. There’s the Cybersecurity Information Sharing Act, or CISA, which passed out of the Senate Intelligence Committee on a 14-1 vote earlier this month. And yesterday the House introduced the Protecting Cyber Networks Act.
Which one should you support? The answer is “None”. Let me show my work. It’s a two-step process:
Step One: Does the bill waive liability if entities share protected personal information? You’ll know because the waiver language will read something like, “notwithstanding any other law…” If there’s a liability waiver (and all three have them), go to step two.
Step Two: Does the bill narrowly define which categories of protected personal information nevertheless may be shared, and specify that such sharing must be necessary for mitigating the security risk? If not, then the bill is bad and must be voted down.
Every one of these three proposals throws industry a bone by waiving liability for violating even our very inadequate privacy rules. And none of these three proposals narrowly and specifically identifies the categories of information that Congress wants to allow to be shared, despite privacy rules. If it doesn’t do this, the bill is, as Senator Ron Wyden aptly put it following his no vote on CISA, “not a cybersecurity bill – it's a surveillance bill by another name.” Don’t we have enough domestic surveillance already?
Information sharing on its own isn’t going to solve the network security problem. But it is a factor, and it’s a relatively easy start. I’ve spent my career advocating against rules that inhibit security information sharing, and even wrote a law review article about it in 2005. Pretty much everyone agrees that vulnerability information sharing is a good idea. So what does Congress need to do to get it right?
First we need to be clear about what we are talking about sharing. We are talking about sharing vulnerability information: software flaws, virus signatures, threat signatures--stuff that system administrators need to know to check and protect their systems from attacks that others have identified or suffered.
Here are real world examples of vulnerability information, the kind of information we want to share more of. This is how you can tell if you are infected with a malicious keystroke logger. And this is for an exploit kit infection spread through websites.
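To make concrete how a defender actually uses a bulletin like this, here is a minimal sketch of checking a machine against published indicators of compromise. The hash and file path below are invented placeholders for illustration, not real indicators from any actual bulletin:

```python
import hashlib
import os

# Hypothetical indicators of compromise (IOCs) of the kind a security
# bulletin publishes: known-bad file hashes and install paths. These
# particular values are placeholders, not real malware indicators.
KNOWN_BAD_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}
SUSPICIOUS_PATHS = ["C:\\Users\\Public\\kl_svc.exe"]  # placeholder path

def sha256_of(path):
    """Hash a file so it can be compared against the published bad hashes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(paths):
    """Return the subset of existing paths whose contents match a known-bad hash."""
    hits = []
    for p in paths:
        if os.path.exists(p) and sha256_of(p) in KNOWN_BAD_SHA256:
            hits.append(p)
    return hits
```

Note what the check consumes: file hashes and paths. Nothing in it requires anyone’s personal information.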
Congress should take a close look at these and other bulletins. You’ll note that there is no personally identifying information in the notifications, and no information protected by privacy laws like the Wiretap Act, the Electronic Communications Privacy Act (ECPA), the Family Educational Rights and Privacy Act (FERPA), or any other statute.
The threat signatures sometimes do, as with the exploit kit, list domains and IP addresses used to host the malware. That information helps potential victims block malicious incoming connections. But those IP addresses are not protected from sharing by ECPA. ECPA doesn’t stop IP addresses from being shared with private entities. And ECPA only protects (lightly) IP addresses of subscribers to or customers of certain publicly offered services. The malware delivery server will very rarely fit that definition. I’m not saying there are never going to be situations where something that fits the definition of vulnerability information is legally protected from voluntary sharing. I’m saying that is by far—by far—the exception and not the rule.
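To see why these domains and addresses are useful to share, consider what a potential victim does with them: blocking the listed infrastructure is essentially a blocklist lookup at the network edge. A minimal sketch, with an invented placeholder domain and a reserved documentation address range standing in for real indicators:

```python
import ipaddress

# Hypothetical indicators from an exploit-kit bulletin: the domains and
# IP ranges used to host or deliver the malware. Placeholders only.
BLOCKED_DOMAINS = {"malware-delivery.example"}
BLOCKED_NETWORKS = [ipaddress.ip_network("192.0.2.0/24")]  # TEST-NET-1 (reserved)

def should_block(host=None, ip=None):
    """Decide whether a connection matches the bulletin's blocklist."""
    if host and host.lower() in BLOCKED_DOMAINS:
        return True
    if ip and any(ipaddress.ip_address(ip) in net for net in BLOCKED_NETWORKS):
        return True
    return False
```

Again, the data being shared identifies the attacker’s infrastructure, not a subscriber to a public service.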
Now let’s be clear about what we are NOT talking about passing legislation for sharing. We are not talking about providing threat information: evidence that my machines were attacked, evidence about who may have attacked me, government access to my network to help me catch the attacker. We already have laws regulating how victims can report crimes and how the government can investigate those crimes. Those laws are called ECPA, the Wiretap Act, Rule 41 and the Fourth Amendment. Those laws are replete with rules about when legal process is and is not needed for investigating particular threats and specific attacks. Those rules have provisions for sharing in emergencies, to protect the rights and property of the provider, and more. We don’t need to waive those rules to promote digital security. The opposite is true. The rules need to be stronger to protect online privacy and security, absolutely not weaker.
Why does the government keep pushing for information sharing legislation? I think the government’s problem is likely real. Entities with vulnerability information are not sharing it frequently enough with the government. And when the government asks them why they don’t share, they say, oh, because we’d like liability protection. After all, what even slightly regulated corporation doesn’t want liability protection?
But that’s not why they aren’t sharing. If that were the reason, they would be sharing, because there are no laws that would create liability for most of the kinds of data we want to make more widely available.
My guess is that some sectors of commercial actors don’t see that it is worth their while to share with the government. I’ve been told that the government doesn’t share back. Silicon Valley engineers have wondered aloud what value the Department of Homeland Security has to offer them in their efforts to secure their employers’ services. It’s not like DHS is setting a great security example for anyone to follow. And there’s a very serious trust issue. Any company has to think at least twice about sharing how it is vulnerable with a government that hoards security vulnerabilities and exploits them to conduct massive surveillance.
Meanwhile, companies are sharing vulnerability data with each other, in all kinds of ways, commercial and voluntary. More vulnerability information sharing would be a good thing to have. But we need not sacrifice what little privacy we have on the altar of government involvement. Congress should reject all three of these proposals and go back to the drawing board, starting with two questions: What, exactly, do you want private parties and commercial entities to share with DHS? And what security benefits, exactly, can DHS offer the public in exchange for this data? We can both share security information and protect privacy.