From Theory to Practice: Chatting in Secret while we're all being watched
July 15, 2015 2:23 PM

Micah Lee at The Intercept provides a deep and wide introduction to encryption (with a clever but helpful Romeo & Juliet framing device), then brings us all the way through the doorframe, past merely thinking or talking about it: Chatting in Secret while we're all being watched.
If you’re in a hurry, you can skip directly to where I explain, step by step, how to set this up for Mac OS X, Windows, Linux and Android. Then, when you have time, come back and read the important caveats preceding those instructions.
....
When Juliet and Romeo are both anonymously logged into secret identity accounts and are having an OTR-encrypted conversation, they’re almost there. Depending on how Juliet made first contact, a close look at Romeo’s email or social media accounts might reveal the username of Juliet’s secret identity — she had to tell it to him somehow, after all. It could be possible for investigators to work from there to uncover Romeo’s secret identity as well.
posted by infinite intimation (19 comments total) 25 users marked this as a favorite
 
For a huge variety of reasons, I would simplify this a lot.

1. Buy an iPod Touch/iPad (without a SIM card, i.e. wifi only)
2. Install Signal
3. If you need Tor, set it up at the router level, or use a VPN/Tor network profile. I think this is possible, but I'm not positive.
4. Use Signal.

Your attack surface is much smaller.

To elaborate, you almost definitely don't want to use a desktop or laptop computer for a few main reasons:
1) You cannot guarantee it's in a good state to start with, and if it's exfiltrating your keys you're hosed. See Halvar Flake's talk here.
2) Those instructions are reasonably complicated, and unless you understand them really, really well (as in, well enough to not need this guide) you can probably be socially engineered into revealing things you shouldn't.
3) Your attack surface is massive with a modern desktop: browsers, kernels, firmware, etc. On an iDevice your attack surface is reduced, and the code-signing bits buy you very valuable security. Historical evidence and economic incentives mean that iOS devices will be the most expensive to break into (in large part because of that code signing). This is not the case with desktops.
4) To not leak information between your online identities you basically need two separate computers. There is a lot of work being done on browser fingerprinting, even in incognito mode. Tails does alleviate some of this (due to its volatile-only storage), but it does not address persistent firmware attacks (and if you're worried about this style of attack, you should be worried about firmware-level attacks). If you're going to have separate devices anyway, why not get one that will be harder to exploit?
posted by yeahwhatever at 2:43 PM on July 15, 2015 [3 favorites]


> Your attack surface is much smaller.

If you're truly paranoid, you'd be a fool to trust a third party service like Signal. If I had the budget of the NSA, I'd be (anonymously) producing all sorts of chat services that have encryption that is "guaranteed unbreakable by the NSA".

(And I'd be publishing auditable details of the encryption, which would be provably strong. Who cares how strong your padlock is if your door hinges are held together by a piece of string.)
posted by RedOrGreen at 2:52 PM on July 15, 2015 [2 favorites]


I heard the stage magician Penn (or maybe Teller) explain that a lot of performances work because the audience doesn't expect that anyone would put that much work into faking a trick. But they do. Similarly, we don't expect that the NSA would put that much work into intercepting data. But they do. We have leaked documents that tell us that the NSA has been intercepting routers, modifying them, and forwarding them on. And there are indications that the NSA has compromised the firmware of hard drives at the manufacturer, which makes some threats both persistent and mostly undetectable. And nothing's too costly, either in terms of money or risk: we know they've bugged the phones of foreign allies' leaders, despite the very great diplomatic risks of exposure.

So I think it's very prudent to assume that everything has been compromised. They might not be listening in on it, they might not even be collecting it, but the holes are there. And it won't even be the NSA that exploits them: a hole is a hole, and China (e.g.) is smart enough to find them. So what communications privacy exists in a world where keyboards, monitors, CPUs, hard drives, routers, and the Internet itself are all independently compromised?
posted by Joe in Australia at 3:42 PM on July 15, 2015 [5 favorites]


If you're truly paranoid you won't communicate electronically.

Here is The Intercept's Director of Security talking about the Signal-on-an-iPhone route as of this morning.
Here is the Grugq (author of PORTAL and co-author of COMSEC: Beyond Encryption) saying the same thing.

Basically, the point I was making wasn't about the communication medium (though I still think Signal is best, due to its open-source, end-to-end, peer-reviewed crypto) but about the software running on the endpoint. Leaked docs show repeatedly that, as Shamir observed, cryptosystems are bypassed rather than broken. This means that protection of the keys becomes the hard part.

If you don't want to use Signal, I think you're making a poor choice, but whatever you do, at least do it from a locked-down endpoint, which is definitely not a normal Wintel, Mac, or Linux box.
posted by yeahwhatever at 3:46 PM on July 15, 2015 [1 favorite]


If you're truly paranoid, you'd be a fool to trust a third party service like Signal
...
Who cares how strong your padlock is if your door hinges are held together by a piece of string


The difference between Signal's encrypted communications and something like TLS, which encrypts browser-to-service traffic (so the service owner can still view the content of the messages, e.g. "private" messaging on Facebook), is that the messages passed around by Signal are encrypted on the device, with the final recipient's key, before they're ever sent out.

So the padlock/door/house analogy doesn't really work here. Signal works because once your message is encrypted, you could hand it directly to the NSA and say "good luck have fun", so it doesn't matter if the messages are being relayed through some NSA-controlled machine, either owned by Signal the organization or anyone else. Signal (the org, or alternatively Whisper Systems on the Android side) cannot decrypt your message, and neither can the NSA.
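To make that concrete, here is a minimal sketch of the general idea, assuming the PyNaCl library. It is not Signal's actual protocol (Signal layers an authenticated ratchet on top of this kind of public-key box); it only illustrates the encrypt-on-your-own-device-before-sending part.

# Minimal sketch of end-to-end public-key encryption with PyNaCl
# (pip install pynacl). Illustrative only; not Signal's real protocol.
from nacl.public import PrivateKey, Box

# Each party generates a keypair on their own device; only the public
# halves ever leave the device.
romeo_key = PrivateKey.generate()
juliet_key = PrivateKey.generate()

# Juliet encrypts on her device using her private key and Romeo's public key.
sending_box = Box(juliet_key, romeo_key.public_key)
ciphertext = sending_box.encrypt(b"meet me at the balcony")

# Whoever relays `ciphertext` (the server, the NSA) sees only random-looking
# bytes. Only Romeo, holding his private key, can open it.
receiving_box = Box(romeo_key, juliet_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet me at the balcony"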

And the important part, the client-side bit, is open source (as is the Android counterpart, TextSecure, which speaks the same language as Signal), so third parties can verify that the encryption is being done properly.
posted by unknownmosquito at 5:42 PM on July 15, 2015


Joe in Australia: You trade one-time pads ahead of time in person (possibly by snail mail? Doesn't the postal service have a lot of legal protections on it?), then encode your messages by hand and text the result in the clear, so as not to raise flags that the message is worth looking at.
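If you wanted to sanity-check the hand ciphering in code, the principle is just combining each byte of the message with the corresponding pad byte. A rough sketch in Python (the mod-26 pencil-and-paper version works the same way):

# One-time-pad encryption over bytes (XOR). The pad must be truly random,
# at least as long as the message, and never reused.
import secrets

def otp(data: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(data), "pad must cover the whole message"
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"meet at dawn"
pad = secrets.token_bytes(len(message))   # exchanged in person beforehand
ciphertext = otp(message, pad)            # this is what gets texted
assert otp(ciphertext, pad) == message    # XOR is its own inverse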
posted by Canageek at 5:53 PM on July 15, 2015


The postal service in the USA has legal protections on it. They'll be different in other countries, and I don't know whether those protections apply to US mail that's temporarily outside the USA for whatever reason. But anyway, what you're proposing doesn't allow random strangers to exchange messages; it doesn't allow anything other than brief text messages; it has no recourse against pads captured *after* transit or indeed ones captured illegally while in transit. And the consequences to the government of an illegal capture would be, at worst, that the information would not be admissible in court.

Also, it turns out that it's really hard to make good one-time pads or to get people to use them correctly. If a pad is reused even once, modern computers make it trivial to break messages encrypted with that pad. You can read lots of good stuff about one-time pads here. The bit about breaking them can be found under the heading "Breaking a Reused One-time pad".
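The reuse failure is easy to demonstrate: XOR two ciphertexts that share a pad and the pad cancels out, leaving the XOR of the two plaintexts, which crib-dragging (guessing likely words in one message) can then pick apart. A quick sketch in Python:

# Why pad reuse is fatal: XORing two ciphertexts made with the SAME pad
# cancels the pad, leaving plaintext1 XOR plaintext2.
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

pad = secrets.token_bytes(32)
c1 = xor(b"attack at dawn", pad)    # first message: fine on its own
c2 = xor(b"retreat at once", pad)   # second message reuses the pad: mistake

# The pad drops out entirely; an analyst who guesses a likely word in one
# message can recover the matching bytes of the other.
assert xor(c1, c2) == xor(b"attack at dawn", b"retreat at once")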
posted by Joe in Australia at 6:24 PM on July 15, 2015


> The padlock/door/house analogy doesn't really work here. Signal works because once your message is encrypted, you could hand it directly to the NSA and say "good luck have fun", so it doesn't matter if the messages are being relayed through some NSA-controlled machine, either owned by Signal the organization or anyone else.

See, that's exactly what I meant by the door / padlock / hinges analogy. The route you're describing is an incredibly secure padlock - audited and provably secure, sure. The problem is that the door hinges aren't checked to the same level. Are you sure that Signal is not also making a plaintext copy that it is passing on directly to the NSA? Are you sure that the device firmware hasn't been compromised and isn't logging every keystroke when the Signal app is invoked? Are you sure that the baseband processor on the phone doesn't record and transmit everything at all times using a back channel on the mobile network? What is being harvested at the NSA's giant storage facility in Utah?

> The audience doesn't expect that anyone would put that much work into faking a trick. But they do.

Exactly.
posted by RedOrGreen at 7:48 PM on July 15, 2015


It makes me sad - weary, fuck-it-all, blanket-over-my-head sad - that any of this is even necessary.

we don't expect that the NSA would put that much work into intercepting data. But they do.


And/or they're just storing everything - everything - until brute forcing it is trivial / cheap. One of my nightmares is that somebody discovers a clever mathematical trick that renders RSA-type algorithms worthless, or perhaps gets some sort of quantum computer thingy, and then a giant computer owned by TLAs sifts back through petabytes of historical data.

"Mr Johnson, we notice that you were looking at horse porn in 2015."
"It's 2064. I was fifteen years old then. We were fooling around at a party."
"You'd best make sure you vote Republicrat in the next election, or your employer might find out. You'd fail Civil Decency provisions in the USAFUKYEA Act. You'd lose your job."
"But my vote is encrypted!"
"Oh, sure, totally."
posted by obiwanwasabi at 8:06 PM on July 15, 2015 [1 favorite]


Yes, I think even smart people like Bruce Schneier are too blasé about the store-until-it's-easy technique. That being said, our entire IT infrastructure is so compromised that it's a bit precious to pretend that any encryption will defeat an attacker with state-level resources and jurisdiction over both endpoints.
posted by Joe in Australia at 8:59 PM on July 15, 2015 [2 favorites]


state-level resources

Yep. Like this $5 wrench, and legislated protection from criminal or civil liability in the course of carrying out intelligence-related activities.

There are clauses about weapons and violence against the person? So there are. Let's see who believes you.
posted by obiwanwasabi at 9:09 PM on July 15, 2015 [2 favorites]


I've spent a lot of time thinking about OTP encryption, and back-of-the-enveloped a few ideas for secure key generation, distribution and use. In theory it's quite easy to imagine having a black box with two USB slots, a (well-researched!) hardware RNG and a big red button marked FILL: put two identical blank thumbdrives in, press the button and wait. Give one to a friend. Have another black box with a modem in and a USB slot; plug your key into it. Your friend has an identical box. Plug a headset into the box, plug the modem into a voice-grade telecommunications system (VOIP is fine). Call your friend. Use the OTP to scramble the audio bitstream; as the key is consumed, delete it from the USB device. You need some in-clear handshaking and pointer management, but nothing that can be usefully spoofed or otherwise used by third parties on the line.
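The consume-and-destroy bookkeeping is simple enough to sketch in Python. The file names here are made up, and, as noted below, overwriting flash storage is not a reliable way to destroy key material, so treat this as illustrative only:

# Toy sketch: XOR each audio chunk against the next unused slice of the pad
# file, then overwrite that slice and advance an offset pointer.
KEY_FILE = "otp.key"        # hypothetical pad file written by the FILL box
STATE_FILE = "otp.offset"   # how many pad bytes have been consumed so far

def read_offset() -> int:
    try:
        with open(STATE_FILE) as f:
            return int(f.read())
    except FileNotFoundError:
        return 0

def scramble_chunk(chunk: bytes) -> bytes:
    offset = read_offset()
    with open(KEY_FILE, "r+b") as key:
        key.seek(offset)
        pad = key.read(len(chunk))
        if len(pad) < len(chunk):
            raise RuntimeError("key material exhausted")
        key.seek(offset)
        key.write(b"\x00" * len(chunk))   # best-effort destruction of used key
    with open(STATE_FILE, "w") as f:
        f.write(str(offset + len(chunk)))
    return bytes(c ^ p for c, p in zip(chunk, pad))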

Assuming you keep physical security of the storage devices until you can destroy them (it is impossible to reliably delete stuff from most USB storage devices, or to be sure that there aren't mapped-out sectors with old key in them, so don't reuse the storage devices), you're vulnerable only if you or your friend is within audible range of a monitoring device, so that what you say or hear can be intercepted. Ditto if you're using keyboards and screens and there's a camera about.

I think that's the only way to reduce the attack surface to the practicable minimum, assuming you have built the black boxes safely (use the oldest possible components scavenged from old kit; hand-write the code in assembler on old computers running CP/M, and blow it into old EPROMs).

Won't save you from traffic analysis, of course. But I think it gets you about as close to the safety of non-electronic communications as you or anyone can get, and the technology is both cheap and uncomplicated. It's Z80-CPU levels of smart, and the key is that you only use modern tech where it's essential, which is the USB interface/storage side, and that is always behind uncompromisable layers, away from the communications medium.

Not paranoid enough to do this, but if I had to, this is what I'd do. Basically SIGSALY upgraded just enough to be personal...
posted by Devonian at 7:14 AM on July 16, 2015


Do NOT buy an iOS device, yeahwhatever. iOS gets privacy enhancing technologies far behind Android, if ever, thanks partially to Apple's anti-GPL policy. And Tor still doesn't work right there.

Signal has good crypto, as it's a port/clone of Android's TextSecure, but both expose all your metadata. TextSecure will remain superior to Signal, due to Tor, side-loading on Android, etc.

Also, iOS devices are all effectively back/bug doored according to published Snowden materials. Android is usually back/bug doored too, but less so if you install Replicant OS, or maybe even CyanogenMod. And iOS devices are more expensive to replace.

If you're concerned about attack surface, then only the Linux kernel and Tor should see your internet connection. Darwin will have far more zero days than Linux. And Tor should be pretty tough. You can even run xmpp-client if you want a minimal Jabber client with mandatory OTR that's written in a slightly safer language (Go).
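For the "only Tor sees your connection" part, the client has to be forced through Tor's local SOCKS port with remote DNS resolution, otherwise hostname lookups leak the destination. A sketch, assuming the PySocks library and a Tor daemon listening on 127.0.0.1:9050 (the server name below is hypothetical):

# Route a client socket through Tor's SOCKS5 listener (pip install pysocks).
import socks

s = socks.socksocket()
# rdns=True resolves the hostname through Tor rather than the local
# resolver, so DNS queries don't leak where you're connecting.
s.set_proxy(socks.SOCKS5, "127.0.0.1", 9050, rdns=True)
s.connect(("jabber.example.org", 5222))   # hypothetical XMPP server
s.sendall(b"...")                         # whatever the protocol requires
s.close()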

Now OTR is technically inferior to the Axolotl ratchet used by TextSecure, but Micah Lee is absolutely correct that Jabber with a temporary account and OTR over Tor beats anything not using Tor at protecting your metadata. Use Pond if you want both metadata protection and the long term ratchet.
posted by jeffburdges at 12:09 PM on July 16, 2015


It's worth mentioning Ricochet too; not as good as Pond, of course, but far faster.

Just a word of warning: Jabber+OTR does not encrypt file transfers! Pond does! And OnionShare by Micah Lee ain't bad, assuming they've started using authenticated hidden services.
posted by jeffburdges at 12:47 PM on July 16, 2015


Maybe we should take this to mefi mail?

iOS gets privacy enhancing technologies far behind Android

Name some of these. The most recent privacy enhancing technology I can think of is randomizing Wi-Fi MACs for non-paired networks, which iOS got before Android.

And iOS devices are more expensive to replace.

iPod Touches can be had for $200, which is close enough to comparable Android devices; that was the point of one of the discussions linked above.

Also, iOS devices are all effectively back/bug doored according to published Snowden materials.

Source? The docs I've seen indicate that, with shipping interdiction, the NSA has access to iOS 6 devices. If you buy a modern device from a brick-and-mortar store, that's not an issue. Assuming we're thinking of the same Snowden docs, public research also had the ability to root iOS devices with physical access at that time.

If you're concerned about attack surface, then only the Linux kernel and Tor should see your internet connection. Darwin will have far more zero days than Linux.

We disagree here. First, the XNU kernels used on OS X and iOS are not the same, and while I'll accept that Linux is probably more secure than OS X currently, I'd say this is not the case for mobile devices. The iOS secure boot chain, as well as the introduction of features like the ARM equivalents of SMAP/SMEP (which Linux has not yet enabled, AFAIK), means that XNU is likely better, in my opinion.

The main thing I'm basing this on is exploit prices. While Linux might be more secure than XNU, the cost of exploiting an XNU device is almost certainly higher (see every leaked exploit catalog). Cost is the best way I'm currently aware of to rank these things.

but Micah Lee is absolutely correct that Jabber with a temporary account and OTR over Tor beats anything not using Tor at protecting your metadata

Yep, agreed. However, it's a tradeoff. I think in many cases it probably makes more sense to lose metadata protection in exchange for content protection.

Use Pond if you want both metadata protection and the long term ratchet.

Please don't suggest people use Pond. I'm very excited about Pond as well, but here is a quote from the Pond homepage, by the Pond author:

Dear God, please don't use Pond for anything real yet. I've hammered out nearly 20K lines of code that have never been reviewed. Unless you're looking to experiment you should go use something that actually works.
posted by yeahwhatever at 12:55 PM on July 16, 2015


iOS devices are all effectively back/bug doored according to published Snowden materials. [...] Darwin will have far more zero days than Linux.

I don't know of any real evidence backing either of these assertions. I'd appreciate a pointer to the specific bits of the "published Snowden materials" because as far as I know these are both incorrect. (I'm willing to learn otherwise and have a sad about it.)
posted by RedOrGreen at 5:59 PM on July 16, 2015


"The NSA claims in their QUANTUMTHEORY documents that every attempt to implant iOS will always succeed."

You're wrong about metadata, yeahwhatever. It's the metadata of how often you talk with your grandmother that determines whether some government stooge chooses to apply pressure against you using her, not what you say to her.

Pond hasn't been reviewed well enough yet, but that applies to iOS as a whole. If Adam scares you off Pond, then use TextSecure over Tor, same ratchet. TextSecure merely lacks a constant traffic profile and file transfers.

There are numerous problems with gauging security based on exploit prices, like exploit longevity, deployment base, deployment demographics, and ecosystem diversity. And side effects of open- vs. closed-source development, corporate or community policy, etc., play a huge role.

iOS exploits are probably expensive because iOS users are relatively rich and the devices are numerous. And the exploits won't be found by someone else looking at the code. Yet, intelligence agencies do have teams that create exploits that always work.

Android exploits are probably less useful overall due to the ecosystem diversity. Any given stock Android device would contain vulnerable crapware installed by the manufacturer and/or carrier, but those exploits cost less for numerous reasons. If you install CyanogenMod or Replicant OS or whatever then it becomes even messier.

posted by jeffburdges at 3:28 AM on July 17, 2015


Jeff, regarding the slide shown by Jacob Appelbaum: it's from the leaked NSA Tailored Access Operations (TAO) ANT catalog. The entire page for DROPOUTJEEP is online. In particular, note that the NSA says that installation of DROPOUTJEEP requires "close access methods." Most security researchers have interpreted this to mean that, with physical access to an iPhone, the NSA could install their sophisticated trojan (perhaps by jailbreaking the device, or even just side-installing the trojan and clicking "yes" to a bunch of security alerts). I think it is best to assume that any device, no matter what OS it is running, will not be safe from the NSA if they can get physical access to it.
posted by RichardP at 4:00 PM on July 19, 2015


That slide is from 2008. I seem to recall that there was a way of bypassing the iPhone's lock screen back then which required physical access. If that vulnerability is gone, then the DROPOUTJEEP technique may be obsolete. On the other hand, we know that the NSA has subverted SIM security; with physical access to a phone, I suspect that you could insert a cloned SIM which would give you all the functionality of DROPOUTJEEP and more.
posted by Joe in Australia at 4:46 PM on July 19, 2015



