mirror of https://0xacab.org/anarsec/anarsec.guide.git (synced 2025-07-31 18:38:50 -04:00)
standarize em dashes
This commit is contained in:
parent d6f4ad9d2e
commit 5d9796b043
12 changed files with 98 additions and 98 deletions
@@ -14,7 +14,7 @@ a4="tails-best-a4.pdf"
letter="tails-best-letter.pdf"
+++
-This text describes some additional precautions you can take that are relevant to an anarchist [threat model](/glossary#threat-model) - operational security for Tails. Not all anarchist threat models are the same, and only you can decide which mitigations are worth putting into practice for your activities, but we aim to provide advice that is appropriate for high-risk activities. The [No Trace Project Threat Library](https://www.notrace.how/threat-library/) is another great resource for thinking through your threat model and appropriate mitigations. If you are new to Tails, start with [Tails for Anarchists](/posts/tails/).
+This text describes some additional precautions you can take that are relevant to an anarchist [threat model](/glossary#threat-model) — operational security for Tails. Not all anarchist threat models are the same, and only you can decide which mitigations are worth putting into practice for your activities, but we aim to provide advice that is appropriate for high-risk activities. The [No Trace Project Threat Library](https://www.notrace.how/threat-library/) is another great resource for thinking through your threat model and appropriate mitigations. If you are new to Tails, start with [Tails for Anarchists](/posts/tails/).
<!-- more -->
@@ -67,7 +67,7 @@ You can mitigate the techniques available to powerful adversaries by **not using
### Internet not tied to your identity
-"Mobile Wi-Fi" devices exist which give you Internet access through the mobile network (via SIM cards) - these are a bad idea. The unique identification number of your SIM card (IMSI) and the unique serial number of your adapter (IMEI) are also transmitted to the mobile operator every time you connect, allowing identification and geographic localization. The adapter works like a mobile phone! If you do not want different research sessions to be associated with each other, do not use the same device or SIM card more than once!
+"Mobile Wi-Fi" devices exist which give you Internet access through the mobile network (via SIM cards) — these are a bad idea. The unique identification number of your SIM card (IMSI) and the unique serial number of your adapter (IMEI) are also transmitted to the mobile operator every time you connect, allowing identification and geographic localization. The adapter works like a mobile phone! If you do not want different research sessions to be associated with each other, do not use the same device or SIM card more than once!
To use internet not tied to your identity, you have two options: Wi-Fi from a public space (like going to a cafe without CCTV cameras), or by using a Wi-Fi antenna through a window from a private space. The latter option is preferable for any computer activity that takes a prolonged amount of time because the main police priority will be to seize the computer while it is unencrypted, and this is much harder for them to achieve in a private space. In a public space, there is also more of a risk of cameras seeing you type your password. However, using a Wi-Fi antenna is also more technical (guide coming soon).
@@ -75,24 +75,24 @@ When using Wi-Fi in a public space, keep the following operational security cons
* Do not get into a routine of using the same cafes repeatedly if you can avoid it.
* If you have to buy a coffee to get the Wi-Fi password, pay in cash!
* Position yourself with your back against a wall so that no one can "shoulder surf" to see your screen, and ideally install a [privacy screen](/posts/tails/#privacy-screen) on your laptop.
-* Maintain situational awareness and be ready to pull out the Tails USB to shut down the computer at a moment's notice. It is very difficult to maintain adequate situational awareness while staying focused on your Tails session - consider asking a trusted friend to hang out who can dedicate themselves to keeping an eye on your surroundings. If the Tails USB is removed, Tails will shut down and [overwrite the RAM with random data](https://tails.net/doc/advanced_topics/cold_boot_attacks/index.en.html). Any LUKS USBs that were unlocked in the Tails session will now be encrypted again. Note that [Tails warns](https://tails.net/doc/first_steps/shutdown/index.en.html) "Only physically remove the USB stick in case of emergency as doing so can sometimes break the file system of the Persistent Storage."
-* One person in charge of a darknet marketplace had his Tails computer seized while distracted by a fake fight next to him. Similar tactics have been used [in other police operations](https://dys2p.com/en/2023-05-luks-security.html#attacks). If his Tails USB had been attached to a belt with a short piece of fishing line, the police would most likely have lost all evidence when the Tails USB was pulled out. A more technical equivalent is [BusKill](https://www.buskill.in/tails/) - however, we only recommend buying this [in person](https://www.buskill.in/leipzig-proxystore/) or [3D printing it](https://www.buskill.in/3d-print-2023-08/). This is because any mail can be [intercepted](https://docs.buskill.in/buskill-app/en/stable/faq.html#q-what-about-interdiction) and altered, making the hardware [malicious](https://en.wikipedia.org/wiki/BadUSB).
+* Maintain situational awareness and be ready to pull out the Tails USB to shut down the computer at a moment's notice. It is very difficult to maintain adequate situational awareness while staying focused on your Tails session — consider asking a trusted friend to hang out who can dedicate themselves to keeping an eye on your surroundings. If the Tails USB is removed, Tails will shut down and [overwrite the RAM with random data](https://tails.net/doc/advanced_topics/cold_boot_attacks/index.en.html). Any LUKS USBs that were unlocked in the Tails session will now be encrypted again. Note that [Tails warns](https://tails.net/doc/first_steps/shutdown/index.en.html) "Only physically remove the USB stick in case of emergency as doing so can sometimes break the file system of the Persistent Storage."
+* One person in charge of a darknet marketplace had his Tails computer seized while distracted by a fake fight next to him. Similar tactics have been used [in other police operations](https://dys2p.com/en/2023-05-luks-security.html#attacks). If his Tails USB had been attached to a belt with a short piece of fishing line, the police would most likely have lost all evidence when the Tails USB was pulled out. A more technical equivalent is [BusKill](https://www.buskill.in/tails/) — however, we only recommend buying this [in person](https://www.buskill.in/leipzig-proxystore/) or [3D printing it](https://www.buskill.in/3d-print-2023-08/). This is because any mail can be [intercepted](https://docs.buskill.in/buskill-app/en/stable/faq.html#q-what-about-interdiction) and altered, making the hardware [malicious](https://en.wikipedia.org/wiki/BadUSB).
* If coffee shops without CCTV cameras are few and far between, you can try accessing a coffee shop's Wi-Fi from outside, out of view of the cameras.
### Non-Targeted and Targeted Correlation Attacks
As described in the quotation above, a global adversary (i.e. the NSA) may be capable of breaking Tor through a correlation attack. If this happens, the Internet address you used in a coffee shop without CCTV cameras will only lead to your general area (e.g. your city) because it is not associated with you. Of course, this is less true if you use the location routinely. Correlation attacks are even less feasible against connections to an .onion address because you never leave the Tor network, so there is no "end" to correlate with through network traffic analysis (if the server location is unknown to the adversary). It is worth emphasizing that "End-to-end correlation attacks have been studied in research papers, but we don't know of any actual use to deanonymize Tor users."
-What we will term a "targeted" correlation attack is possible by a non-global adversary (i.e. local law enforcement), if you are already in their sights and a target of [physical surveillance](https://www.notrace.how/threat-library/techniques/physical-surveillance/covert.html) and/or [digital surveillance](https://www.notrace.how/threat-library/techniques/targeted-digital-surveillance.html). This is a subtype of correlation attack where the presumed target is already known, thus making the attack easier to achieve because it vastly reduces the amount of data to filter through for correlation. A non-targeted correlation attack used to deanonymize a Tor user is unprecedented in current evidence used in court, although [a "targeted" correlation attack has been used](https://medium.com/beyond-install-tor-signal/case-file-jeremy-hammond-514facc780b8) as corroborating evidence - a suspect had already been identified, which allowed investigators to correlate their local footprint with specific online activity. Specifically, they correlated Tor network traffic coming from the suspect's house with the times their anonymous alias was online in chatrooms.
+What we will term a "targeted" correlation attack is possible by a non-global adversary (i.e. local law enforcement), if you are already in their sights and a target of [physical surveillance](https://www.notrace.how/threat-library/techniques/physical-surveillance/covert.html) and/or [digital surveillance](https://www.notrace.how/threat-library/techniques/targeted-digital-surveillance.html). This is a subtype of correlation attack where the presumed target is already known, thus making the attack easier to achieve because it vastly reduces the amount of data to filter through for correlation. A non-targeted correlation attack used to deanonymize a Tor user is unprecedented in current evidence used in court, although [a "targeted" correlation attack has been used](https://medium.com/beyond-install-tor-signal/case-file-jeremy-hammond-514facc780b8) as corroborating evidence — a suspect had already been identified, which allowed investigators to correlate their local footprint with specific online activity. Specifically, they correlated Tor network traffic coming from the suspect's house with the times their anonymous alias was online in chatrooms.

-To explain how this works, it helps if you have a basic understanding of what Tor information is visible to various third parties - see the EFF's [interactive graphic](https://www.eff.org/pages/tor-and-https). For a non-targeted correlation attack, the investigator will need to **start from after Tor's exit node**: take the specific online activity coming from the exit node and try to correlate it with an enormous amount of global data that is entering Tor entry nodes. However, if a suspect is already identified, the investigator can instead do a "targeted" correlation attack and **start from before Tor's entry node**: take the data entering the entry node (via **the suspect's physical or digital footprint**) and try to correlate it with **specific online activity** coming from an exit node.
+To explain how this works, it helps if you have a basic understanding of what Tor information is visible to various third parties — see the EFF's [interactive graphic](https://www.eff.org/pages/tor-and-https). For a non-targeted correlation attack, the investigator will need to **start from after Tor's exit node**: take the specific online activity coming from the exit node and try to correlate it with an enormous amount of global data that is entering Tor entry nodes. However, if a suspect is already identified, the investigator can instead do a "targeted" correlation attack and **start from before Tor's entry node**: take the data entering the entry node (via **the suspect's physical or digital footprint**) and try to correlate it with **specific online activity** coming from an exit node.
A more sophisticated analysis of the specific online activity would involve logging the connections to the server for detailed comparison, and a simple analysis would be something that is publicly visible to anyone (such as when your alias is online in a chatroom, or when a post is published to a website). For your physical footprint, a surveillance operation can note that you go to a cafe regularly, then try to correlate this with online activity they suspect you of (for example, if they suspect you are a website moderator, they can try to correlate these time windows with web moderator activity). For your digital footprint, if you are using Internet from home, an investigator can log all your Tor traffic and then try to correlate it with specific online activity.
To mitigate the risk of "targeted" correlation attacks:
* If you only need to use the Internet briefly to submit a communique, you can **do [surveillance detection](https://www.notrace.how/threat-library/mitigations/surveillance-detection.html) and [anti-surveillance](https://www.notrace.how/threat-library/mitigations/anti-surveillance.html) before going to a coffee shop**, just like you would prior to a direct action.
-* For projects like moderating a website or hacking that require daily Internet access, it is not realistic to find a new Wi-Fi location every day. In that case, the ideal mitigation is to **use a Wi-Fi antenna from indoors** - a physical surveillance effort won't see you entering a cafe, and a digital surveillance effort won't see anything on your home Internet.
+* For projects like moderating a website or hacking that require daily Internet access, it is not realistic to find a new Wi-Fi location every day. In that case, the ideal mitigation is to **use a Wi-Fi antenna from indoors** — a physical surveillance effort won't see you entering a cafe, and a digital surveillance effort won't see anything on your home Internet.
* If a Wi-Fi antenna is too technical for you, you may even want to **use your home internet** for some projects that require frequent internet access. This contradicts the previous advice to not use your personal Wi-Fi. It's a trade-off: using Tor from home avoids creating a physical footprint that is so easy to observe, at the expense of creating a digital footprint which is more technical to observe, and may be harder to draw meaningful conclusions from (especially if you intentionally [make correlation attacks more difficult](/posts/tails/#make-correlation-attacks-more-difficult)). In our view, the main risk of using your home internet is not that the adversary is able to break Tor through a correlation attack, but that the adversary is able to hack your system, such as through [phishing](#phishing-awareness), which [enables them to bypass Tor](/posts/qubes/#when-to-use-tails-vs-qubes-os).
* If you want to submit a report-back the morning after a riot, or a communique shortly after an action (times when there may be a higher risk of targeted surveillance), consider waiting and at least taking surveillance detection and anti-surveillance measures beforehand. In 2010, the morning after a bank arson in Canada, police surveilled a suspect as he traveled from his home to an Internet cafe, and watched him post the communique and then bury the laptop in the woods. More recently, investigators physically surveilling [an anarchist in France](https://www.notrace.how/resources/#quelques-premiers-elements-du-dossier-d-enquete-contre-ivan) installed a hidden camera to monitor access to an Internet cafe near the comrade's home and requested CCTV footage for the day an arson communique was sent.
@@ -144,7 +144,7 @@ Our adversaries have two attack vectors to compromise BIOS, firmware, hardware,
* **Wi-Fi that is unrelated to your identity**. We recommend using Wi-Fi that is unrelated to your identity (i.e. not at your home or work) not only to mitigate deanonymization, but also to mitigate remote hacking. It is best to never use the dedicated Tails laptop on your home Wi-Fi. This makes the laptop much less accessible to a remote attacker than a laptop that is constantly connected to your home Wi-Fi. If an attacker is targeting you, they need a point to start, and your home Wi-Fi is a pretty good place to start.
* **Remove the hard drive**—it's easier than it sounds. If you buy the laptop, you can ask the store to do it and potentially save some money. If you search on youtube for "remove hard drive" for your specific laptop model, there will probably be an instructional video. Make sure you remove the laptop battery and unplug the power cord first. We remove the hard drive to completely eliminate the hard drive firmware, which has been known to be [compromised to install persistent malware](https://www.wired.com/2015/02/nsa-firmware-hacking/). A hard drive is part of the attack surface and is unnecessary on a live system like Tails that runs off a USB.
-* Consider **removing the Bluetooth interface, camera, and microphone** while you're at it, although this is more involved—you'll need the user manual for your laptop model. The camera can at least be "disabled" by putting a sticker over it. The microphone is often connected to the motherboard via a plug - in this case just unplug it. If this is not obvious, or if there is no connector because the cable is soldered directly to the motherboard, or if the connector is needed for other purposes, cut the microphone cable with a pair of pliers. The same method can be used to permanently disable the camera if you don't trust the sticker method. It is also possible to use Tails on a dedicated "offline" computer by removing the network card as well. Some laptops have switches on the case that can be used to disable the wireless interfaces, but for an "offline" computer it is preferable to actually remove the network card.
+* Consider **removing the Bluetooth interface, camera, and microphone** while you're at it, although this is more involved—you'll need the user manual for your laptop model. The camera can at least be "disabled" by putting a sticker over it. The microphone is often connected to the motherboard via a plug — in this case just unplug it. If this is not obvious, or if there is no connector because the cable is soldered directly to the motherboard, or if the connector is needed for other purposes, cut the microphone cable with a pair of pliers. The same method can be used to permanently disable the camera if you don't trust the sticker method. It is also possible to use Tails on a dedicated "offline" computer by removing the network card as well. Some laptops have switches on the case that can be used to disable the wireless interfaces, but for an "offline" computer it is preferable to actually remove the network card.
* **Replace the BIOS with [HEADS](https://osresearch.net/)**. A [video](https://invidious.sethforprivacy.com/watch?v=sNYsfUNegEA) demonstrates an attack on the BIOS firmware against a Tails user, allowing the security researcher to steal GPG keys and emails. Unfortunately, the BIOS cannot be removed like the hard drive. It is needed to turn on the laptop, so it must be replaced with [open-source](/glossary#open-source) firmware. This is an advanced process because it requires opening the computer and using special tools. Most anarchists will not be able to do this themselves, but hopefully there is a trusted person in your networks who can set it up for you. The project is called HEADS because it's the other side of Tails—where Tails secures software, HEADS secures firmware. It has a similar purpose to the [Verified Boot](https://www.privacyguides.org/en/os/android-overview/#verified-boot) found in GrapheneOS, which establishes a full chain of trust from the hardware. HEADS has [limited compatibility](https://osresearch.net/Prerequisites#supported-devices), so keep that in mind when buying your laptop if you plan to install it—we recommend the ThinkPad X230 because it's less involved to install than other models. The CPUs of this generation are capable of effectively removing the [Intel Management Engine](https://en.wikipedia.org/wiki/Intel_Management_Engine#Assertions_that_ME_is_a_backdoor) when flashing HEADS, but this is not the case with later generations of CPUs on newer computers. [Coreboot](https://www.coreboot.org/users.html), the project on which HEADS is based, is compatible with a wider range of laptop models but has less security. HEADS can be configured to [verify the integrity and authenticity of your Tails USB](https://osresearch.net/InstallingOS/#generic-os-installation), preventing it from booting if it has been tampered with. HEADS protects against physical and remote classes of attacks!
@@ -168,16 +168,16 @@ If your Tails USB stick has a write-protect switch like the [Kanguru FlashTrust]
On a USB with a write-protect switch, you will not be able to make any changes to the Tails USB when the switch is locked. If you can make changes, so can malware. While it would be ideal to leave the switch locked all the time, we recommend two cases where the switch must be unlocked:
-1) **For a dedicated upgrade session.** If you need to upgrade Tails, you can do so in a dedicated session with the switch unlocked - this is necessary because the upgrade needs to be written to the Tails USB. Once you are done, you should restart Tails with the switch locked.
-2) **If you decide to use Persistent Storage, for occasional configuration sessions.** [Persistent Storage](/posts/tails/#optional-create-and-configure-persistent-storage) is a Tails feature that allows data to persist between sessions that would otherwise be amnesiac on the Tails USB itself. Because it requires writing to the Tails USB to persist data, it is generally impractical to use with a write-protect switch. However, it may be acceptable to disable the switch for occasional Persistent Storage configuration sessions, such as installing additional software. For example, in an 'unlocked' session, you enable additional software for persistence and install Scribus, selecting to install it every session. Then, in a 'locked' session, you actually use Scribus - none of the files you work on are saved to the Tails USB because it is 'locked'. Note that in this scenario, the USB switch will need to be locked to the read-only position *after* the Welcome Screen, because Tails will not load the Persistent Storage otherwise. The Persistent Storage feature is not possible with the `toram` boot or with a DVD.
+1) **For a dedicated upgrade session.** If you need to upgrade Tails, you can do so in a dedicated session with the switch unlocked — this is necessary because the upgrade needs to be written to the Tails USB. Once you are done, you should restart Tails with the switch locked.
+2) **If you decide to use Persistent Storage, for occasional configuration sessions.** [Persistent Storage](/posts/tails/#optional-create-and-configure-persistent-storage) is a Tails feature that allows data to persist between sessions that would otherwise be amnesiac on the Tails USB itself. Because it requires writing to the Tails USB to persist data, it is generally impractical to use with a write-protect switch. However, it may be acceptable to disable the switch for occasional Persistent Storage configuration sessions, such as installing additional software. For example, in an 'unlocked' session, you enable additional software for persistence and install Scribus, selecting to install it every session. Then, in a 'locked' session, you actually use Scribus — none of the files you work on are saved to the Tails USB because it is 'locked'. Note that in this scenario, the USB switch will need to be locked to the read-only position *after* the Welcome Screen, because Tails will not load the Persistent Storage otherwise. The Persistent Storage feature is not possible with the `toram` boot or with a DVD.
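For reference, the `toram` option mentioned above is passed at boot time. A minimal sketch, assuming the standard Tails boot menu (the exact key can vary between Tails versions and BIOS/UEFI firmware, so check the Tails documentation for your setup):

```
# At the Tails boot menu, edit the boot entry (commonly Tab under SYSLINUX/BIOS,
# or "e" under GRUB/UEFI), append the option below to the kernel command line,
# and boot. Tails is then copied entirely into RAM, so the USB can be removed
# after startup -- at the cost of losing access to Persistent Storage.
toram
```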
Where can we store personal data for use between Tails sessions if the write-protect switch prevents us from using Persistent Storage? We recommend storing personal data on a second LUKS USB. This "personal data" USB should not look identical to your Tails USB to avoid confusion. To create this separate USB, see [How to create an encrypted USB](/posts/tails/#how-to-create-an-encrypted-usb). If you are reading this from a country like the UK, where not providing encryption passwords can land you in jail, this second drive should be an HDD containing a [Veracrypt Hidden Volume](https://www.veracrypt.fr/en/Hidden%20Volume.html) (SSD and USB drives are [not suitable for Hidden Volumes](https://www.veracrypt.fr/en/Trim%20Operation.html)).

-Compartmentalization is an approach that neatly separates different identities by using separate Tails sessions for separate activities - in Tails session #1 you do activities related to moderating a website, and in Tails session #2 you do activities related to researching for an action. This approach also comes into play with your "personal data" USBs. If the files you save could be used to link your activities together, use a different "personal data" USB for each activity. For a "personal data" USB that stores very sensitive files (such as the text of a communique), it is best to reformat and then destroy the USB once you no longer need the files (see [Really delete data from a USB drive](/posts/tails/#really-delete-data-from-a-usb)). This is another reason to use a separate USB for any files that need to be saved - you don't accumulate the forensic history of all your files on your Tails Persistent Storage, and you can easily destroy USBs as needed.
+Compartmentalization is an approach that neatly separates different identities by using separate Tails sessions for separate activities — in Tails session #1 you do activities related to moderating a website, and in Tails session #2 you do activities related to researching for an action. This approach also comes into play with your "personal data" USBs. If the files you save could be used to link your activities together, use a different "personal data" USB for each activity. For a "personal data" USB that stores very sensitive files (such as the text of a communique), it is best to reformat and then destroy the USB once you no longer need the files (see [Really delete data from a USB drive](/posts/tails/#really-delete-data-from-a-usb)). This is another reason to use a separate USB for any files that need to be saved — you don't accumulate the forensic history of all your files on your Tails Persistent Storage, and you can easily destroy USBs as needed.
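For the reformatting step before physical destruction, one rough approach is to overwrite the whole device with random data. A sketch only (see the linked guide for the recommended procedure); `/dev/sdX` is again a placeholder to verify with `lsblk` first:

```
# Overwrite the entire USB before destroying it (sketch). Wear leveling on flash
# media means an overwrite alone is not a guarantee, which is why the advice above
# is to physically destroy the USB as well.
sudo dd if=/dev/urandom of=/dev/sdX bs=1M status=progress conv=fsync
```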
-Finally, a note about email - if you already use Tails and encrypted email, you may be familiar with Thunderbird's Persistent Storage feature. This feature allows you to store your Thunderbird email account details, as well as your inbox and PGP keys, on a Tails USB. With a "personal data" USB, Thunderbird won't automatically open your accounts. We recommend that you do one of the following:
+Finally, a note about email — if you already use Tails and encrypted email, you may be familiar with Thunderbird's Persistent Storage feature. This feature allows you to store your Thunderbird email account details, as well as your inbox and PGP keys, on a Tails USB. With a "personal data" USB, Thunderbird won't automatically open your accounts. We recommend that you do one of the following:
- Create new Thunderbird email accounts in each session. PGP keys can be stored on the separate 'personal data' USB like any other file, and imported when needed. This has the advantage that if law enforcement manages to bypass LUKS, they still don't have your inbox without knowing your email password.
- Keep the Thunderbird data folder on the "personal data" USB. After logging in to Thunderbird, use the Files browser (Applications → Accessories → Files) and enable the "Show hidden files" setting. Navigate to Home, then copy the folder called `.thunderbird` to your "personal data" USB. In each future session, after you have unlocked the 'personal data' USB and before you start Thunderbird, copy the `.thunderbird` folder to Home (which is running in RAM, so doesn't require the write-protect switch to be unlocked).
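The same copy can be done from a terminal. A minimal sketch, assuming the "personal data" USB is labeled `personal-data` (unlocked volumes appear under `/media/amnesia/` in Tails; adjust the path to match yours):

```
# Start of session: restore the saved Thunderbird profile into the amnesiac Home.
cp -a /media/amnesia/personal-data/.thunderbird ~/

# Before shutdown (optional): copy changes such as new mail or imported keys back
# to the "personal data" USB so they survive the session.
cp -a ~/.thunderbird /media/amnesia/personal-data/
```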
@@ -194,7 +194,7 @@ If its not possible to find a USB with a write-protect switch, you can alternati
>In the terminology used by KeePassXC, a [*password*](/glossary/#password) is a random sequence of characters (letters, numbers and other symbols), while a [*passphrase*](/glossary/#passphrase) is a random sequence of words.
-Never reuse a password/passphrase for multiple things ("password recycling") - KeePassXC makes it easy to store unique passwords that are dedicated to one purpose. [LUKS](/glossary/#luks) encryption **is only effective when the device is powered off** - when the device is powered on, the password can be retrieved from memory. Any encryption can be [brute-force attacked](/glossary#brute-force-attack) with [massive amounts of cloud computing](https://blog.elcomsoft.com/2020/08/breaking-luks-encryption/). The newer version of LUKS (LUKS2 using Argon2id) is [less vulnerable to brute-force attacks](https://mjg59.dreamwidth.org/66429.html); this is the default as of Tails 6.0 and Qubes OS 4.1. If you'd like to learn more about this change, we recommend [Systemli's overview](https://www.systemli.org/en/2023/04/30/is-linux-hard-disk-encryption-hacked/) or [dys2p's](https://dys2p.com/en/2023-05-luks-security.html).
+Never reuse a password/passphrase for multiple things ("password recycling") — KeePassXC makes it easy to store unique passwords that are dedicated to one purpose. [LUKS](/glossary/#luks) encryption **is only effective when the device is powered off** — when the device is powered on, the password can be retrieved from memory. Any encryption can be [brute-force attacked](/glossary#brute-force-attack) with [massive amounts of cloud computing](https://blog.elcomsoft.com/2020/08/breaking-luks-encryption/). The newer version of LUKS (LUKS2 using Argon2id) is [less vulnerable to brute-force attacks](https://mjg59.dreamwidth.org/66429.html); this is the default as of Tails 6.0 and Qubes OS 4.1. If you'd like to learn more about this change, we recommend [Systemli's overview](https://www.systemli.org/en/2023/04/30/is-linux-hard-disk-encryption-hacked/) or [dys2p's](https://dys2p.com/en/2023-05-luks-security.html).
Password strength is measured in "[bits of entropy](https://en.wikipedia.org/wiki/Password_strength#Entropy_as_a_measure_of_password_strength)". Your passwords/passphrases should ideally have an entropy of about 128 bits (diceware passphrases of **ten words**, or passwords of **21 random characters**, including uppercase, lowercase, numbers, and symbols) and shouldn't have less than 90 bits of entropy (seven words).
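To check these numbers yourself: the standard diceware list contains 7776 words, so each word adds log2(7776), roughly 12.9 bits. A quick sketch of the arithmetic:

```
# Entropy of a diceware passphrase drawn from the standard 7776-word list.
awk 'BEGIN {
  bits_per_word = log(7776) / log(2)             # ~12.9 bits per word
  printf "10 words: %.0f bits\n", 10 * bits_per_word
  printf " 7 words: %.0f bits\n",  7 * bits_per_word
}'
# Prints roughly 129 bits for ten words and 90 bits for seven words.
```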
@@ -243,7 +243,7 @@ Every time you use the filesystem, mount it like this:
`gocryptfs cipher plain`
-You will be prompted for the password. Note that the order is important - `cipher` is the first argument and `plain` is the second.
+You will be prompted for the password. Note that the order is important — `cipher` is the first argument and `plain` is the second.
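Put together, a gocryptfs session looks roughly like this. A sketch, assuming the directory names `cipher` and `plain` used above; the `-init` step is only done once, when the container is first created:

```
# One-time setup: turn an empty directory into an encrypted gocryptfs container.
mkdir cipher plain
gocryptfs -init cipher   # choose a strong password; note the printed master key

# Each session: mount the encrypted "cipher" directory onto the cleartext "plain" view.
gocryptfs cipher plain   # prompts for the password

# Work on files inside "plain"; everything is stored encrypted inside "cipher".
```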
You can now add files to your mounted, decrypted container in the 'plain' folder. When you unmount the filesystem, the container will be encrypted. To do this:
@@ -310,7 +310,7 @@ First, some clarification. [PGP and GPG](/glossary/#gnupg-openpgp) are terms tha
GPG is a classic example of [public-key cryptography](/glossary/#public-key-cryptography). GPG provides cryptographic functions for [encrypting](/glossary/#encryption), decrypting, and signing files; our concern here is digitally signing files. The Tails team [digitally signs](/glossary/#digital-signatures) their .img releases. GPG gives us a way to verify that the file has actually been "signed" by the developers, which allows us to trust that it hasn't been tampered with.
-Now you need to understand the basics of public-key cryptography. [This Computerphile video](https://invidious.sethforprivacy.com/watch?v=GSIDS_lvRv4) has a great overview with visual aids. To summarize, a **secret/private** key is used to **sign** messages, and only the user who has that key can do so. Each **private** key has a corresponding **public** key - this is called a **key pair**. The public key is shared with everyone and is used to verify the signature. Confused? Watch the video!
+Now you need to understand the basics of public-key cryptography. [This Computerphile video](https://invidious.sethforprivacy.com/watch?v=GSIDS_lvRv4) has a great overview with visual aids. To summarize, a **secret/private** key is used to **sign** messages, and only the user who has that key can do so. Each **private** key has a corresponding **public** key — this is called a **key pair**. The public key is shared with everyone and is used to verify the signature. Confused? Watch the video!
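For example, verifying a Tails image with GPG follows this pattern. A sketch: the filenames are illustrative, and both the signing key and the `.sig` file are downloaded from the Tails website alongside the image.

```
# Import the Tails developers' public signing key.
gpg --import tails-signing.key

# Verify the image against its detached signature. "Good signature" means the
# .img was signed by the corresponding private key and has not been tampered with.
gpg --verify tails-amd64-6.0.img.sig tails-amd64-6.0.img
```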
