
In the documentary film Citizenfour, Edward Snowden says the following about the documents:

I'm comfortable in my technical ability to protect [documents]. I mean you could literally shoot me or torture me and I could not disclose the password, even if I wanted to. I have the sophistication to do that.

What technologies or methods exist that would enable the scenario Edward Snowden is referring to, in which he can create a protected file whose password he cannot disclose even if he wanted to?

Xander
QBR8ZIKvyJ
  • As pointed out in an answer below, the type of attack Snowden envisages is called “rubber-hose cryptanalysis” (a euphemism for torture, see https://en.wikipedia.org/wiki/Rubber-hose_cryptanalysis). You may want to edit the title to be more specific. – MatteoS Mar 04 '15 at 17:04
  • KeePass generates passwords so long and complex, I could not possibly memorize all of them. I rarely even see my passwords. It does have a master password though. If he is kidnapped, he would not have access to the passwords for a specific document or site, but he could be coerced to provide the master password. – Chloe Mar 05 '15 at 02:40
  • Relevant to this discussion: http://xkcd.com/538/ – Sobrique Mar 05 '15 at 10:49
  • Any system where only one person is needed to get access is vulnerable to the rubber hose: you can torture someone into going through the steps of decryption. – ratchet freak Mar 05 '15 at 16:57
  • Maybe the encryption key automatically changes every n hours if he doesn't perform some task. – Typewriter Mar 05 '15 at 18:33
  • another relevant comic http://static.fjcdn.com/pictures/Korean_ca37a1_872460.jpg – jhocking Mar 05 '15 at 21:27
  • Notice that we see a YubiKey, shown e.g. at about 36 minutes in the scene where Snowden disconnects the hotel room phone. I would be surprised if that one is not involved in some manner. – user Mar 06 '15 at 18:59
  • Relevant to recent news story: "Quebec resident Alain Philippon to fight charge for not giving up phone password at airport" http://www.cbc.ca/news/canada/nova-scotia/quebec-resident-alain-philippon-to-fight-charge-for-not-giving-up-phone-password-at-airport-1.2982236 – Aaron Hall Mar 06 '15 at 19:51
  • Perhaps Snowden is bluffing in the hope that it will prevent the rubber hose from coming to visit him. – dotancohen Mar 08 '15 at 13:48
  • There is an actual file system with plausible deniability, linked from the Rubber-hose cryptanalysis Wikipedia page, itself also called Rubberhose: https://en.wikipedia.org/wiki/Rubberhose_(file_system). This answers the question in the title, but not the one in the post body. – zovits Mar 10 '15 at 08:36
  • I'm not sure I'd WANT such an undisclosable password. The people who are torturing you won't believe that you can't give up the password, so they'll just go on torturing you. You'd have to be REALLY sure that the secret you were trying to protect was worth being tortured to death for. Just a thought. – Out of Band Feb 04 '17 at 12:45
  • I will add a simple answer that I don't have enough reputation on this site to add: Simply make the conditions for opening the file such that your family can verify, in a blockchain or other timestamped means, that you have lived a very nice life for a significant enough time and were not tortured. In addition, if you claim even once that you've been tortured, the file disappears forever. – Gregory Magarshak Jun 12 '17 at 14:20

11 Answers


Shamir's Secret Sharing is one method for this. The file is encrypted (for example with a public/private keypair), and the private key is then split into shares that are distributed to several different people. Assuming the individual shares and the original private key are destroyed after distribution, it would require a quorum of people acting together to reconstruct the key and decrypt the data.
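
As an illustration of the idea (not something from the answer itself, and not an implementation anyone should deploy), here is a minimal sketch of Shamir's k-of-n sharing over a prime field in Python. The prime and the 3-of-5 parameters are arbitrary example values; real use should rely on a vetted tool such as ssss.

```python
# Minimal sketch of Shamir's k-of-n secret sharing over a prime field.
# Illustrative only; production systems should use an audited implementation.
import secrets

PRIME = 2**127 - 1  # a Mersenne prime, comfortably larger than the secret

def split(secret: int, k: int, n: int):
    """Split `secret` into n shares; any k of them can reconstruct it."""
    # Random polynomial of degree k-1 with the secret as the constant term.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME  # Python 3.8+
    return secret

key = secrets.randbelow(PRIME)         # e.g. a symmetric file-encryption key
shares = split(key, k=3, n=5)          # give one share to each of five people
assert reconstruct(shares[:3]) == key  # any three shares recover the key
```

With a 3-of-5 split, an attacker has to coerce at least three shareholders; any two shares on their own reveal nothing about the key.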

Jeff Ferland
  • Or a quorum of people to be tortured. It makes that attack harder, not impossible. – PlasmaHH Mar 05 '15 at 09:07
  • @PlasmaHH If you consider rounding up and torturing everyone with whom you shared the partial key doable for that adversary, then there is no defence against it getting access to your data, other than to make it impossible to decrypt by yourself as well. If you can decrypt it and they have unlimited power over you, they can decrypt it too. – March Ho Mar 05 '15 at 09:35
  • @MarchHo: I don't see why it should be much more doable to torture a single individual than a bunch of individuals sharing the key. If you commit to torture then I don't see any limits on what you are willing to do. – PlasmaHH Mar 05 '15 at 09:45
  • @PlasmaHH Willing and able are completely different. The US government is willing to torture all of the Taliban and ISIS commanders, but is unable to do so. – March Ho Mar 05 '15 at 10:57
  • Low efficacy of torture is a serious problem here. – Palec Mar 05 '15 at 22:25
  • @PlasmaHH: if the keyholders are distributed across multiple jurisdictions, it may not be feasible for a single attacker to compel them (through whatever means) before one of their number discovers the attack and destroys his key part. – eggyal Mar 06 '15 at 00:12
  • @Palec Your comment makes no sense in this context. The key is an easily verifiable fact. The only questionable part of the attack scheme is how you can reliably identify the keyholders, which is only a concern if you care about minimizing the use of torture. – Aron Mar 06 '15 at 01:29
  • @Aron: The key itself is easily verifiable, but people usually don’t remember their keys. They store them somewhere. And getting the key from a remote location is not as fast and easy. – Palec Mar 06 '15 at 07:53
  • @Palec the only physical encryption key I own and do not carry on my person comes from my bank. The only reason I do not carry it on my person is because it is too friggin large. I used to carry my RSA token everywhere. Besides, most people don't use real keys, they use passwords, which are inflated to key length. – Aron Mar 06 '15 at 07:57
  • @Aron So the torturers will know you’re probably lying. But that’s what they should expect anyway – see the Q&A on Skeptics I linked. Even passwords can be stored and forgotten. AFAIK that is best practice when it comes to root passwords. – Palec Mar 06 '15 at 08:05
  • Doesn't this also have the problem of needing everyone to be responsive in order to use the password? This sounds like it could get very complicated to use on a normal basis, especially if the shareholders are across multiple timezones. – David says Reinstate Monica Mar 06 '15 at 12:40
  • That solution reminds me of Fidelius Charm from Harry Potter. – Tomáš Zato Apr 15 '16 at 08:02
  • In more practical terms, bitcoin multisignature wallets are a widely deployed application of the same threshold idea as Shamir's Secret Sharing Scheme (SSSS). A simpler implementation is thus storing the password in question inside a multisig wallet, or having its private key (or a section thereof) be the password. – Gaia May 02 '16 at 19:06

All of our answers are speculation, of course, but I suspect that the most likely way the documents are protected is by following Bruce Schneier's advice regarding laptop security through airports:

Step One: Before you board your plane, add another key to your whole-disk encryption (it'll probably mean adding another "user") -- and make it random. By "random," I mean really random: Pound the keyboard for a while, like a monkey trying to write Shakespeare. Don't make it memorable. Don't even try to memorize it.

Technically, this key doesn't directly encrypt your hard drive. Instead, it encrypts the key that is used to encrypt your hard drive -- that's how the software allows multiple users.

So now there are two different users named with two different keys: the one you normally use, and some random one you just invented.

Step Two: Send that new random key to someone you trust. Make sure the trusted recipient has it, and make sure it works. You won't be able to recover your hard drive without it.

Step Three: Burn, shred, delete or otherwise destroy all copies of that new random key. Forget it. If it was sufficiently random and non-memorable, this should be easy.

Step Four: Board your plane normally and use your computer for the whole flight.

Step Five: Before you land, delete the key you normally use.

At this point, you will not be able to boot your computer. The only key remaining is the one you forgot in Step Three. There's no need to lie to the customs official; you can even show him a copy of this article if he doesn't believe you.

Step Six: When you're safely through customs, get that random key back from your confidant, boot your computer and re-add the key you normally use to access your hard drive.

And that's it.

This is by no means a magic get-through-customs-easily card. Your computer might be impounded, and you might be taken to court and compelled to reveal who has the random key.

To be even more secure, Snowden himself may not know who has the backup key--as the associate he gave it to may have passed it along elsewhere. Also, it is likely that the person who received the backup key from Snowden is in a different country than any likely attacker and is doing his or her best to stay very safe.
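
For readers who want to see what the steps above look like on a Linux machine, here is a rough sketch for a LUKS-encrypted disk using the standard cryptsetup tool. The device path, the key length, and the assumption that the everyday passphrase can simply be removed with luksRemoveKey are illustrative choices of mine, not details from Schneier's article or the film; adapt them to the actual volume and run as root.

```python
# Sketch of the add-a-random-key / forget-it / delete-your-own-key procedure
# for a LUKS volume. /dev/sda3 is a hypothetical device path.
import os
import secrets
import subprocess
import tempfile

DEVICE = "/dev/sda3"  # assumed LUKS-encrypted volume

# Step One: generate a long random key that you will never try to memorize.
random_key = secrets.token_urlsafe(48)

# Add it as an extra key slot. cryptsetup prompts for an existing passphrase
# to authorize the change and reads the new key from the file we pass.
# (In practice keep this file off persistent storage, e.g. on a tmpfs.)
with tempfile.NamedTemporaryFile("w", delete=False) as f:
    f.write(random_key)
    keyfile = f.name
subprocess.run(["cryptsetup", "luksAddKey", DEVICE, keyfile], check=True)

# Step Two: hand random_key to your confidant (out of band) and verify it
# unlocks the volume, e.g. with: cryptsetup open --test-passphrase /dev/sda3
# Step Three: destroy every local copy of the random key.
os.unlink(keyfile)
del random_key

# Step Five: before landing, remove the passphrase you normally use.
# cryptsetup prompts for that passphrase and deletes its key slot, leaving
# only the key you no longer know.
subprocess.run(["cryptsetup", "luksRemoveKey", DEVICE], check=True)
```

Step Six is then another luksAddKey run once you are safely through customs, this time authorized by the random key retrieved from your confidant.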

EDIT: In response to the below comment, I decided to add the following advice:

Create a dummy operating system on a partition at the beginning of the laptop's hard drive. The encrypted operating system with the sensitive information lives on the partition that follows. Configure the laptop's bootloader to boot the dummy operating system by default, without your intervention.

TrueCrypt had a similar hidden operating system feature where the TrueCrypt bootloader would accept two different passwords, giving access to two different operating systems. The hidden operating system was concealed with a bit of clever steganography.

We can do something similar in Linux and LUKS, but without the steganography, by doing the following:

  1. Installing Linux twice--on two partitions.
  2. Encrypting both of them with LUKS.
  3. Configuring the bootloader (probably GRUB2) to boot the first Linux installation, and removing the entries for the second installation.
  4. Whenever you want to boot your second, secret installation, boot your laptop and reach the GRUB screen. Modify the bootloader entry (temporarily) directly from the boot screen to point to the second partition.

Step four is not very user-friendly. We could get rid of it by making a separate bootloader entry for our secret operating system, but then anybody who looked at the boot screen could tell that there are two operating systems on the machine. An investigator can still tell either way, but now they must examine the laptop's hard drive with a partition-editing tool.

James Mishra
  • If your laptop is not bootable, they might not allow you to take it aboard. – Kevin Krumwiede Mar 04 '15 at 23:45
  • My LG G2 phone has a UI like that. One pattern/pin takes you to guest mode and another pattern/pin leads to your normal account. Once a user is taken to guest mode, that user has no idea he is in guest mode (assuming you didn't uncheck any of the normal pre-approved apps on your phone). And even if you do pre-approve all the apps to work in guest mode, a user in guest mode won't have access to any of the data held from your normal account. – Stephan Branczyk Mar 05 '15 at 12:10
  • What about giving out a "self-destruct key" as is envisioned in different disk level encryption schemes. You just give it to them and they destroy all the data. There is a flaw somewhere probably, maybe cloning the disk would make this irrelevant, as they still have another copy to try it again. – WalyKu Mar 05 '15 at 18:36
  • @Kurtovic The disk is guaranteed to be mirrored by any government-level attacker, simply because that is what a sensible person would do. Either way, attempting to destroy the ciphertext when detained may incur legal penalties. – James Mishra Mar 05 '15 at 21:11
  • STEP TWO Solve the key sharing problem. STEP THREE PROFIT. – Aron Mar 06 '15 at 01:30
  • @KevinKrumwiede Dead devices in general are not permitted on some flights. – Ryan Kennedy Mar 09 '15 at 14:33
  • A dummy operating system seems like a good alternative to the dead device option. It would at least get you through a basic check of your laptop. "Yep, it boots. Here's Windows. Nothing interesting here." – Rick Chatham Jul 09 '15 at 19:52

Here is an original technique I have come up with that can survive a rubber-hose attack:

  1. Get a stack of cash, about 50 one-dollar bills. Maybe mix some fives and tens in with them.
  2. Shuffle them into a random order
  3. Derive a password from the serial numbers, for example by taking the two least significant digits from each bill in order to form a 100-digit number (a short sketch of this step follows at the end of this answer)
  4. Use this password as your encryption key
  5. Keep your cash in a neat stack next to your computer.
  6. If government agents raid your home, they will take the cash. With a bit of luck, it'll simply vanish into a policeman's pocket, never to be seen again. But if you get an honest cop who checks it into evidence, it's still going to be sorted and counted; it is very unlikely to maintain its order through that process, especially if your stack contains different denominations.
  7. At this point, your password is well and truly gone. You might vaguely remember a few digits of your password, but no torture can force you to disclose it. Especially because of your policy of changing the password every time you started to remember it.
  8. (Optional) You don't have to actually do any of this. Your password can be your dog's name, as long as you're willing to stick to the story that you did steps 1-5, and maybe keep a stack of a few bills next to your computer.

You don't have to use cash, either; your password could be embedded in the order of the books on your bookshelf, or some other ephemeral thing that will most likely be destroyed by government agents in a search. But cash has the advantage of being much more likely to vanish completely in a search.
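
Purely as an illustration of step 3, here is a small Python sketch (referred to above). The serial numbers are invented examples, and the final hashing step is my own addition to turn the digit string into a fixed-length key; the answer itself just uses the digits directly.

```python
# Hypothetical sketch: derive a key from the order of a stack of bills.
# The serial numbers here are made up, listed in shuffled stack order.
import hashlib

serials = [
    "B73318854C",
    "L02997465F",
    "K55820019A",
    # ... one entry per bill in the stack
]

# Step 3: the two least significant digits of each serial, concatenated in stack order.
digits = "".join("".join(ch for ch in s if ch.isdigit())[-2:] for s in serials)

# Extra step (my addition): hash the digit string into a uniform 256-bit value
# suitable for feeding to an encryption tool as a passphrase.
key = hashlib.sha256(digits.encode()).hexdigest()
print(digits, key)
```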

Aric TenEyck
  • This sounds like a great way to lose your password forever, without the help of any would-be attackers. – David says Reinstate Monica Mar 06 '15 at 12:38
  • I like it. Although if this becomes commonplace, police searching the house will counter it by photographing everything they encounter during a search. – S.L. Barth is on codidact.com Mar 06 '15 at 14:44
  • "Hey dude, I borrowed $20 from your drawer. Whoa! Why do you look so mad? Don't worry, I'll pay you back on Friday..." – Digital Chris Mar 06 '15 at 17:30
  • "Everything" potentially covers a lot of photography... – Aric TenEyck Mar 06 '15 at 18:25
  • Clever! I like it. I have doubts about how well it would work however, at least in the U.S. where cash is not as likely to disappear, but end up preserved in an evidence locker. Then, even if the bills do end up out of order (and I agree that this is quite likely) once they get the scheme out of you, they still have a very limited set of candidate passwords to work with. Still, a neat idea. – Xander Mar 08 '15 at 23:31
  • +1 for thinking out of the box, but this would definitely need some optimizing before implementation. – Mast Mar 09 '15 at 09:07
  • If the bills are out of order, there are 50! keys (assuming you made sure beforehand that there were no duplicates) that would need to be tried in a brute-force attack. This is about equal to 10^64, or 2^214. – Aric TenEyck Mar 09 '15 at 13:43
  • You can do the same with a stack of poker cards, except that it's much less likely to be disturbed. Maybe if you set it on a surface that is likely to be moved... – Out of Band Feb 04 '17 at 12:43
  • The difference is that, "The stack of cash I had on my desk never made it into evidence" is much more plausible than, "The deck of cards I had on my desk never made it into evidence". – Aric TenEyck Feb 07 '17 at 16:07
  • Combining this method with the Bruce Schneier technique seems like a really good solution. You have two keys you don't have memorized, so you can always phone a friend if your cash pile gets messed up by accident. And if the friend also uses a (different) cash pile, oh, the possibilities are endless! So much plausible deniability! – Rocky Feb 14 '17 at 00:26

He might be referring to neuroscience-based cryptographic primitives such as those outlined in the following paper: https://www.usenix.org/system/files/conference/usenixsecurity12/sec12-final25.pdf

Basically, you can protect against "rubber hose attacks", as they call them (torturing the password out of somebody), by training the user via some sort of game or app that subconsciously plants the password (a combination of moves to accomplish a goal, for example) using implicit learning. The user can use the password, but cannot recall it outside of the situation of playing the game.

Think, for instance, of the moves you memorize in Pac-Man or Mario when you play the same level over and over... you get "good" at it because you are practicing the same movements repeatedly to accomplish the goal, even though if I sat you down with a game controller in an empty room (no game, no screen), you wouldn't (easily) be able to replicate the pattern.

I'm not sure if that is what Snowden is referring to, but it's one possibility.

armani
  • Could you not be tortured to (1) reveal how you decrypt (playing the game) and then (2) be forced to play it? I suppose, under torture, you could argue that you can't play the game the same way due to the stress? – Alex Kuhl Mar 04 '15 at 16:30
  • The paper discusses that the user is not consciously aware that they have the password, nor might they even realize what the authentication mechanism is, especially under stress. Furthermore, check the "Basic Coercion Threat Model" section of that paper's Section 5.1 where they show the situation you describe as being impractical (for a 5-minute test it would take "about one year of nonstop testing per user which will either interfere with the user’s learned password rendering the user useless to the attacker, or alert security administrators.") – armani Mar 04 '15 at 16:42
  • The part you're citing does not solve the problem I mention. They define their "basic threat model" where an attacker intercepts some number of users, gets info out of them, and then tries to impersonate them at some physically secure location where an alarm is raised after a failed login attempt. The scenario here is different: the encryption is local to one computer and the attacker can force the actual user to play the game, no impersonation required. This is mentioned here http://arstechnica.com/security/2012/07/guitar-hero-crypto-blunts-rubber-hose-attacks/ and in the comments. – Alex Kuhl Mar 04 '15 at 19:11
  • You grabbed that part from their Intro, whereas in Section 5.1 they describe how it is solved, but they do apply certain unreasonable constraints such as the user being physically at some location to authenticate. Either way, this nitpicking doesn't answer the OP's question, and I believe implicit learning techniques might be what Snowden is referring to. – armani Mar 04 '15 at 21:20
  • @AlexKuhl, the game analogy is great (thanks Armani) but the key is not necessarily how you play the game but rather the game itself. For instance, I can demonstrate how to swim but I cannot explain (in significant detail) how I learned how to swim. The process of learning (or the game) is the key -or rather contains the key, because a user will subconsciously inherit the item being taught and it is that subconscious level where the 'password' is stored in your brain. Bear in mind that this paper is a mere conceptual approach and not necessarily a ready to implement design... – Matthew Peters Mar 04 '15 at 21:53
  • imagines a game of "Missile Command" having to be played before being allowed to launch missiles at the enemy... ironically you protect the cities too well and the enemy blows you up before you complete the game. – DoubleDouble Mar 05 '15 at 19:35