Wednesday, February 03, 2016

Lawfare thinks it can redefine π, and backdoors

There is a gulf between how people believe law works (from watching TV shows like Law and Order) and how law actually works. You lawyer people know what I'm talking about. It's laughable.

The same is true of cyber: there's a gulf between how people think it works and how it actually works.

The authors of this Lawfare blogpost think they've come up with a clever method to get their way in the crypto-backdoor debate: make carriers like AT&T responsible only for the what ("deliver interpretable signal in response to lawful wiretap order") without defining the how (crypto backdoors, etc.). The pressure would come in the form of removing the liability protections carriers currently enjoy that shield them from responsibility for what customers transmit across their networks. Or as the post paraphrases the proposal:
Don’t expect us to protect you from liability for third-party conduct if you actively design your systems to frustrate government efforts to monitor that third-party conduct.
The post is proud of its own smarts, as if its authors have figured out how to outwit mathematicians and redefine pi (π). But their solution is nonsense, based on a hopelessly naive understanding of how the Internet works. It appears all they know about the Internet is what they learned from watching CSI:Cyber.

The Internet is end-to-end. End-to-end is the technology shift that made the Internet happen, as compared to alternative directions cyberspace might have taken.

What that means is that AT&T doesn't encrypt traffic. Apple's iPhones don't encrypt traffic. Instead, it's the app installed on the phone that does the encryption. Neither AT&T nor Apple can stop encryption from happening.

You think that because most people use iMessage or Snapchat, all you have to do is turn the screws on those companies to force them to comply with backdoors. That won't work, because the bad guys will stop using those apps and install different encrypted apps, like Signal. You imagine that it's just a game of whack-a-mole, and that eventually you'll pressure all apps into compliance. But Signal is open-source. If it disappeared tomorrow, I'd still have a copy of the source, which I could compile into my own app called Xignal, and I'd continue making encrypted phone calls with my own app. Even if no source existed today, I could write my own within a couple of months. Indeed, writing an encrypted chat app is a typical homework assignment colleges might give computer science students. (You people still haven't come to grips with the fact that in cyberspace, we are living with the equivalent of physicists able to whip up a-bombs in their basements.)
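To make the homework-assignment point concrete, here is a toy sketch using nothing beyond the Python standard library: a Diffie-Hellman-style key agreement followed by a hash-based keystream cipher. Everything here is illustrative and deliberately insecure (toy parameters, no authentication, homemade cipher); the point is only that the end-to-end structure, where the two endpoints agree on a key the carrier never sees, takes a page of code, not a corporation.

```python
import hashlib
import secrets

# Toy end-to-end encryption sketch. NOT secure -- illustrative only.
# The carrier in the middle relays public values and ciphertext, but
# never possesses the key, which is the whole end-to-end point.

P = 2**127 - 1  # a Mersenne prime; real systems use vetted groups
G = 3

def keygen():
    """Generate a private exponent and the public value to transmit."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    """Both endpoints compute the same shared secret, then hash it."""
    s = pow(other_pub, priv, P)
    return hashlib.sha256(s.to_bytes(16, "big")).digest()

def keystream(key, n):
    """Expand the key into n pseudorandom bytes via hashed counters."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key, plaintext):
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream cipher: same operation both ways

# Alice and Bob each generate a keypair and exchange only public values.
a_priv, a_pub = keygen()
b_priv, b_pub = keygen()
k_alice = shared_key(a_priv, b_pub)
k_bob = shared_key(b_priv, a_pub)
assert k_alice == k_bob  # both ends derive the same key

ct = encrypt(k_alice, b"meet at noon")
print(decrypt(k_bob, ct))  # b'meet at noon'
```

A real app would swap in a vetted library for every primitive here, but nothing about the architecture changes: the key material lives only at the endpoints, so there is no middle party to serve a backdoor order on.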

Running arbitrary software is a loose end that will defeat every solution you can come up with. It's math. The only way to fix the "going dark" problem is to ban software code. But you can't do that without destroying the economy and converting the country into a dystopic, Orwellian police state.

You think that those of us who oppose crypto backdoors are hippies with a knee-jerk rejection of any government technological mandate. That's not true. The populists at the EFF love technological mandates that go in their favor, such as net-neutrality mandates or bans on exporting viruses to evil regimes (though they've recently walked that one back).

Instead, we reject this specific technological mandate, because we know cyber. We know it won't work. We can see that you'll never solve your "going dark" problem, but in trying to, you'll cause a constant erosion of both the economic utility of the Internet and our own civil liberties.

I apologize for the tone of this piece, for saying you are stupid about cyber, but that's what it always comes down to. The author of that piece has impressive Washington D.C. think-tanky credentials, but misfires on the basic end-to-end problem. And all think-tanky pieces on this debate are going to go the same way, because as soon as they bring technologists in to consult on the problem, their desired op-eds become stillborn before anybody sees them.

Note: I get the π analogy from a tweet by @quinnorton; I don't know who came up with the analogy originally.


John Thacker said...

Counterpoint: While it's easy (and true) to say that people will be able to continue to use encrypted apps and write their own, the intelligence agencies are pretty happy to play whack-a-mole. Their reasoning goes like this:

1) Most people won't really care about security enough, and for various reasons (network effects of what their friends use, not wanting to be that weirdo who demands that people install a new app to talk) will use the largest and most popular chat program, whose makers will generally be able to be pressured. Just look at the Paris attackers using SMS, of all things.

2) Perhaps we won't be able to successfully pressure all the big companies, but since we won't let the ones who cooperate talk, people won't know which big programs to avoid anyway.

3) While it is a typical assignment to create encrypted chat, it's also incredibly common to make security or crypto errors when making such an app and many of the homebrew things will be broken anyway, or difficult to use (ugh, PGP/GPG), or difficult to use properly.

4) The small percentage of people who do go out of their way to use rare unbroken apps can be treated as perspective criminals / surveillance targets, because the very fact of going out of your way creates a sort of probable cause. That will justify scooping up that traffic (which will be small enough that we can handle it) and looking at it in depth later. If the metadata leaks any identifying info (and we can get metadata from the cellular network metadata, etc., even if the application is encrypted end-to-end), we can use that to surveil the same people when they're using less secure methods.

It's a stronger argument than I'd like. Most of us like to point out that what they're proposing is logically impossible, but they don't care about that so much as the practical implications. The Paris attackers using SMS actually boosts their argument from the practical point of view.

streaky said...

"Just look at the Paris attackers using SMS, of all things."

Yes, let's look at that.

Yeah, a known-backdoored, weak (intentionally weakened) piece of crypto tech that GCHQ has all the keys for. No forthcoming intel. Yeah, they're too scared to use the data they collect to actually make people safer - and they harm privacy in the process.

Hang on, why are we even talking about this again?

"people won't know which big programs to avoid anyway"

- they'll trust none of them, and everybody will move to OSS with reproducible binaries. Wait, why are we doing this again - to make commercial software impossible? Seems sensible.

"The small percentage of people who do go out of their way to use rare unbroken apps can be treated as perspective criminals / surveillance targets"

Except this stuff is being designed into technical standards like HTTP/2, which de facto *requires* crypto, and everybody is using OpenSSL et al.

There's no small number of users here; what they've done is pushed everybody toward more secure communications where before everybody was complacent. Good if you're a luser, bad if you're in the legitimate intelligence game.

Eats Wombats said...

I think you mean prospective.

mcpjim said...

The crypto backdoor debate is not really a technical problem: it's actually a legal one.
Unfortunately, easily-ridiculed solutions like Lawfare's don't help my case. (Withhold immunity and increase legal fees for non-compliance? That'll show 'em. #FellowPlease.)

Authorities and data companies can certainly find ways to split decryption keys and passwords, securing them in different places. (Two keys works for nuclear sub commanders, right?) One can foresee a White House task force promoting this solution, in which neither party can decrypt w/o the other, and only a court order can legally bring the two together.
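The split-key idea the comment imagines is, in its simplest 2-of-2 form, trivial to sketch; the hard part of escrow is everything around the math, not the math. A minimal illustration, assuming Python and an illustrative 32-byte key (the "gov" and "vendor" share names are hypothetical): each share alone is uniformly random and reveals nothing, and XORing both recovers the key.

```python
import secrets

def split_key(key):
    """Split a key into two shares; each alone is uniformly random."""
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(k ^ a for k, a in zip(key, share_a))
    return share_a, share_b

def combine(share_a, share_b):
    """Recombine the two shares (e.g., only under a court order)."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = secrets.token_bytes(32)              # the decryption key to escrow
gov_share, vendor_share = split_key(key)   # held by different parties
assert combine(gov_share, vendor_share) == key
```

This is one-time-pad-style secret sharing: information-theoretically, neither party learns anything alone. Which is exactly why the real debate is legal and procedural, about who holds the shares and when they may be combined, rather than about whether splitting is possible.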

But should we even be talking about HOW to accomplish gov't-only access, or whether it can be done? Just assume it can. Instead we should be objecting to the entire premise of permitting gov't access to private communications at all--by court order or otherwise.

Encryption is nothing more than a private room, a quiet restaurant booth, a bench in the park. A place where things are said that aren't intended to be heard by others. If you ban encryption, you may as well ban conversations in bedrooms, bathrooms, doctor's offices, and priest confessionals. After all, things said in those places could be dangerous, right? To argue against encryption is to demand that no private conversation ever take place, even in your own bedroom, unless you record it for later production to the government, if requested. The cell records and metadata currently provided by the telcos to the government are the equivalent of FedEx opening every envelope and package you send, photocopying the contents, and sending the copies to a government database for storage and/or review.

Maybe it's just no longer believable when technologists say something cannot be done. I prefer to triumph in the moral/legal arena, where we try to win with reasons why it shouldn't be done.

Doc said...

The matter of redefining pi is an old joke about the "Indiana Pi Bill" (Wikipedia article).