mirror of
https://0xacab.org/anarsec/anarsec.guide.git
synced 2025-06-24 14:30:35 -04:00
tails best syntax
This commit is contained in:
parent e55ed5ad34
commit fde97523e1
1 changed file with 84 additions and 84 deletions
As mentioned in our [recommendations](/recommendations/#computers), Tails is an [operating system](/glossary#operating-system-os) that is unparalleled for sensitive computer use that requires leaving no forensic trace (writing and sending communiques, research for actions, etc.). Tails runs from a USB drive and is [designed](https://tails.boum.org/about/index.en.html) to leave no trace of your activity on your computer, and to force all Internet connections through the [Tor network](/glossary#tor-network). If you are new to Tails, start with [Tails for Anarchists](/posts/tails/).
This text describes some additional precautions you can take that are relevant to an anarchist [threat model](/glossary#threat-model). Not all anarchist threat models are the same, and only you can decide which mitigations are worth putting into practice for your activities, but we aim to provide advice that is appropriate for high-risk activities. The [CSRC Threat Library](https://www.csrc.link/threat-library/) is another great resource for thinking through your threat model and appropriate mitigations.
<!-- more -->
This first issue can be mitigated by **cleaning metadata from files before sharing them**.
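To illustrate what "cleaning metadata" means mechanically, here is a minimal sketch in Python. It strips the metadata parts from a `.docx` file, which is just a zip archive whose `docProps/` entries hold the author, title, and revision history. This is only a conceptual demo — in practice use a dedicated tool such as mat2, which also handles many other formats and fixes up the package manifest.

```python
import io
import zipfile

def strip_docx_metadata(data):
    """Rewrite a .docx (a zip archive), dropping the docProps/ entries
    that carry author, title, and revision metadata. A real cleaner
    like mat2 also repairs the package manifest afterwards."""
    src = zipfile.ZipFile(io.BytesIO(data))
    out = io.BytesIO()
    with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as dst:
        for name in src.namelist():
            if name.startswith("docProps/"):
                continue  # skip the metadata parts entirely
            dst.writestr(name, src.read(name))
    return out.getvalue()

# Build a toy document: body content plus a metadata part naming an author.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("word/document.xml", "<w:document>hello</w:document>")
    z.writestr("docProps/core.xml", "<coreProperties>author: Alice</coreProperties>")

names = zipfile.ZipFile(io.BytesIO(strip_docx_metadata(buf.getvalue()))).namelist()
print(names)  # the docProps/ metadata part is gone; the body remains
```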
### Using Tails for more than one purpose at a time
This second issue can be mitigated by what's called **"compartmentalization"**:
* [Compartmentalization](https://www.csrc.link/threat-library/mitigations/compartmentalization.html) means keeping different activities or projects separate. If you use Tails sessions for more than one purpose at a time, an adversary could link your different activities together. For example, if you log into different accounts on the same website in a single Tails session, the website could determine that the accounts are being used by the same person. This is because websites can tell when two accounts are using the same Tor circuit.
* To prevent an adversary from linking your activities while using Tails, restart Tails between different activities. For example, restart Tails between checking different project emails.
* Tails is amnesiac by default, so to save any data from a Tails session, you must save it to a USB. If the files you save could be used to link your activities together, use a different encrypted ([LUKS](/glossary#luks)) USB stick for each activity. For example, use one Tails USB stick for moderating a website and another for researching actions. Tails has a feature called Persistent Storage, but we do not recommend using it for data storage, which is explained [below](#using-a-write-protect-switch).
## Limitations of the [Tor network](/glossary#tor-network)
This first issue is mitigated by [**Tor bridges**](https://tails.boum.org/doc/anonymous_internet/tor/index.en.html#bridges):
* Tor Bridges are secret Tor relays that hide your connection to the Tor network. However, this is only necessary where connections to Tor are blocked, such as in heavily censored countries, by some public networks, or by some parental control software. This is because Tor and Tails don't protect you by making you look like any other Internet user, but by making all Tor and Tails users look the same. It becomes impossible to tell who is who among them.
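In Tails, bridges are entered through the Tor Connection assistant, but for illustration, this is roughly what a bridge looks like when configured directly in Tor's `torrc` file. The address, fingerprint, and `cert` value below are placeholders, not a real bridge — you would obtain a real bridge line from the Tor Project.

```
# Tell Tor to reach the network only through bridges.
UseBridges 1

# A hypothetical obfs4 bridge line (placeholder values):
Bridge obfs4 192.0.2.10:443 0123456789ABCDEF0123456789ABCDEF01234567 cert=EXAMPLECERTSTRING iat-mode=0

# The pluggable transport that makes the connection look like random noise.
ClientTransportPlugin obfs4 exec /usr/bin/obfs4proxy
```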
### Protecting against determined, skilled attackers
> A powerful adversary, who could analyze the timing and shape of the traffic entering and exiting the Tor network, might be able to deanonymize Tor users. These attacks are called *end-to-end correlation* attacks, because the attacker has to observe both ends of a Tor circuit at the same time. [...] End-to-end correlation attacks have been studied in research papers, but we don't know of any actual use to deanonymize Tor users.
This second issue is mitigated by **not using an Internet connection that could deanonymize you**, and by **prioritizing .onion links when available**:
* Wi-Fi adapters that work through SIM cards are a bad idea. The unique identification number of your SIM card (IMSI) and the unique serial number of your adapter (IMEI) are also transmitted to the mobile operator every time you connect, allowing identification and geographic localization. The adapter works like a mobile phone! If you do not want different research sessions to be associated with each other, do not use the same adapter or SIM card more than once!
* There are several opsec considerations to keep in mind when using Wi-Fi in a cafe without CCTV cameras.
* See [below](#appendix-2-location-location-location) for more information on choosing a location.
* Do not get into a routine of using the same cafes repeatedly if you can avoid it.
* If you have to buy a coffee to get the Wi-Fi password, pay in cash!
* Position yourself with your back against a wall so that no one can "shoulder surf" to see your screen, and ideally install a privacy screen on your laptop.
* Maintain situational awareness and be ready to pull out the Tails USB to shut down the computer at a moment's notice. One person in charge of a darknet marketplace had his Tails computer seized while distracted by a fake fight next to him. Similar tactics have been used [in other police operations](https://dys2p.com/en/2023-05-luks-security.html#attacks). If his Tails USB had been attached to a belt with a short piece of fishing line, the police would most likely have lost all evidence when the Tails USB was pulled out - note that [Tails warns](https://tails.boum.org/doc/first_steps/shutdown/index.en.html) "Only physically remove the USB stick in case of emergency as doing so can sometimes break the file system of the Persistent Storage." A more technical equivalent is [BusKill](https://docs.buskill.in/buskill-app/en/stable/introduction/what.html) - we don't recommend buying it through the mail, which can be [intercepted](https://docs.buskill.in/buskill-app/en/stable/faq.html#q-what-about-interdiction) to make the hardware [malicious](https://en.wikipedia.org/wiki/BadUSB). If the Tails USB is removed, Tails will shut down and [overwrite the RAM with random data](https://tails.boum.org/doc/advanced_topics/cold_boot_attacks/index.en.html). Any LUKS USBs that were unlocked in the Tails session will now be encrypted again. If maintaining situational awareness seems unrealistic, consider asking a trusted friend to hang out who can dedicate themselves to it.
* If coffee shops without CCTV cameras are few and far between, you can try accessing a coffee shop's Wi-Fi from outside, out of view of the cameras. Some external Wi-Fi adapters can pick up signals from further away, as discussed [below](#appendix-2-location-location-location).
* If a determined adversary breaks Tor through a [correlation attack](https://anonymousplanet.org/guide.html#your-anonymized-torvpn-traffic), the Internet address you used in a coffee shop without CCTV cameras will only lead to your general area (e.g. your city) because it is not associated with you. Of course, this is less true if you use it routinely. A correlation attack used to deanonymize a Tor user is unprecedented in current evidence used in court, although [it has been used](https://medium.com/beyond-install-tor-signal/case-file-jeremy-hammond-514facc780b8) as corroborating evidence once a suspect has already been identified to correlate with. Correlation attacks are even less feasible against connections to an .onion address because you never leave the Tor network, so there is no "end" to correlate with.
* However, a more likely low-tech "correlation attack" is possible by local law enforcement, based on your identity rather than your anonymous Internet activity, if you are already in their sights and a target of [physical surveillance](https://www.csrc.link/threat-library/techniques/physical-surveillance/covert.html). For example, if a surveillance operation notices that you go to a cafe regularly, and an anarchist website is always updated during those windows, this pattern may indicate that you are moderating that website. An undercover may even be able to catch a glimpse of your screen.
* Possible mitigations in this scenario include **doing [surveillance detection](https://www.csrc.link/threat-library/mitigations/surveillance-detection.html) and [anti-surveillance](https://www.csrc.link/threat-library/mitigations/anti-surveillance.html) before going to a coffee shop**, and changing Wi-Fi locations regularly, but this may not be particularly realistic for projects like moderating a website that require daily Internet access. Alternatively, mitigations include **using a Wi-Fi antenna from indoors** (guide coming soon), **scheduling posts to be published later** (WordPress has this feature), or possibly even **using Tor from your home Internet** for some projects. This contradicts the previous advice, but using Tor from home avoids creating a movement profile that is so easy to physically observe (as opposed to a network traffic profile, which is more technical to observe and may be harder to draw meaningful conclusions from).
* If you want to submit a report-back the morning after a riot, or a communique shortly after an action (times when there may be a higher risk of targeted surveillance), consider waiting, and at a minimum take surveillance detection and anti-surveillance measures beforehand. In 2010, the morning after a bank arson in Canada, police surveilled a suspect as he travelled from his home to an Internet cafe, and watched him post the communique and then bury the laptop in the woods. More recently, investigators physically surveilling [an anarchist in France](https://www.csrc.link/#quelques-premiers-elements-du-dossier-d-enquete-contre-ivan) installed a hidden camera to monitor access to an Internet cafe near the comrade's home and requested CCTV footage for the day an arson communique was sent.
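The "pull the USB at a moment's notice" advice above is what BusKill automates: a tether that triggers an emergency action the instant a device disappears. As a conceptual sketch only (not BusKill's actual implementation, which hooks USB hotplug events), here is a minimal Python dead-man's switch that polls for a marker path and fires a callback when it vanishes; the demo uses a temporary file standing in for the attached stick.

```python
import os
import tempfile
import time

def watch_usb(marker_path, on_removal, poll_s=0.05, timeout_s=2.0):
    """Poll for a path that exists only while the USB stick is attached
    (e.g. a file on the mounted stick). The moment it disappears, fire
    the emergency action and stop watching."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if not os.path.exists(marker_path):
            on_removal()  # in a real setup: trigger an immediate poweroff
            return True
        time.sleep(poll_s)
    return False  # watch window ended with the stick still present

# Demo: a temporary file stands in for the attached stick.
events = []
with tempfile.TemporaryDirectory() as d:
    marker = os.path.join(d, "usb-present")
    open(marker, "w").close()
    os.remove(marker)  # simulate the stick being yanked out
    tripped = watch_usb(marker, lambda: events.append("shutdown"))

print(tripped, events)
```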
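To make the end-to-end correlation threat concrete, here is a toy simulation (an illustration, not a real attack tool): traffic entering the network reappears at the exit after a roughly constant latency, so an observer watching both ends can match flows by timing alone, while unrelated traffic matches far less often.

```python
import random

def correlate(entry_times, exit_times, latency, tolerance):
    """Fraction of entry events with a matching exit event ~latency
    seconds later -- the core of an end-to-end correlation attack
    on a low-latency network like Tor."""
    matched = 0
    for t in entry_times:
        if any(abs((x - t) - latency) <= tolerance for x in exit_times):
            matched += 1
    return matched / len(entry_times)

random.seed(0)
# A user's traffic entering the Tor network over one minute...
entry = sorted(random.uniform(0, 60) for _ in range(40))
# ...reappears at the exit ~0.5 s later with a little jitter.
exit_user = [t + 0.5 + random.gauss(0, 0.02) for t in entry]
# Unrelated traffic from some other user, uniformly spread.
exit_other = sorted(random.uniform(0, 60) for _ in range(40))

print(correlate(entry, exit_user, latency=0.5, tolerance=0.1))   # near 1.0
print(correlate(entry, exit_other, latency=0.5, tolerance=0.1))  # much lower
```

This also shows why .onion connections are harder to attack this way: with no exit to the regular Internet, there is no second observation point to feed into `correlate`.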
## Reducing risks when using untrusted computers
### Installing from an infected computer
This first issue is mitigated by **using a computer you trust to install Tails**:
* According to our [recommendations](/recommendations/#computers), this would ideally be a [Qubes OS](/posts/qubes/) system, as it is much harder to infect than a normal Linux computer. If you have a trusted friend with a Tails USB stick that has been installed with Qubes OS (and who uses these best practices), you could [clone it](/posts/tails/#installation) instead of installing it yourself.
* Use the "Terminal" installation method ["Debian or Ubuntu using the command line and GnuPG"](https://tails.boum.org/install/expert/index.en.html), as it more thoroughly verifies the integrity of the download using [GPG](/glossary/#gnupg-openpgp). If using the [command line](/glossary/#command-line-interface-cli) is over your head, ask a friend to walk you through it, or first learn the basics of the command line and GnuPG with [Linux Essentials](/posts/linux/).
* Once installed, do not plug your Tails USB stick (or any [LUKS](/glossary/#luks) USBs used during Tails sessions) into a computer that is running another operating system; if the computer is infected, the infection can [spread to the USB](https://en.wikipedia.org/wiki/BadUSB).
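The expert install method verifies the download with GnuPG signatures, which is stronger than a bare checksum because it also proves *who* published the image. Purely to illustrate the weaker half of that idea — comparing a computed digest against a known-good value — here is a small Python sketch; the file name and contents below are stand-ins, not a real Tails image.

```python
import hashlib
import tempfile

def sha256_file(path, chunk_size=1 << 20):
    """Stream a (potentially multi-GB) disk image through SHA-256
    in chunks, so the whole file never has to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo on a stand-in file. With a real download you would compare the
# result against a digest obtained over a trusted channel -- and for
# Tails, verify the GPG signature rather than relying on a hash alone.
with tempfile.NamedTemporaryFile(delete=False, suffix=".img") as f:
    f.write(b"not a real Tails image")
    image_path = f.name

digest = sha256_file(image_path)
print(digest)
```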
### Running Tails on a computer with a compromised BIOS, firmware, or hardware
This second issue requires several mitigations. Let's start with a few definitions.
* *Hardware* is the physical computer you are using.
* *Firmware* is the software that's embedded in a piece of hardware; you can simply think of it as "software for hardware". It can be found in several different places (hard drives, USB drives, graphics processor, etc.).
* *BIOS* is the specific firmware that is responsible for booting your computer when you press the power button—this is a great place for [malware](/glossary/#malware) to hide because it is undetectable by the operating system.
Our adversaries have two attack vectors to compromise BIOS, firmware, hardware, or software: [remote attacks](/glossary#remote-attacks) (via the Internet) and [physical attacks](/glossary/#physical-attacks) (via physical access). Not everyone will need to apply all of the advice below. For example, if Tails is only being used for anonymous web browsing and written correspondence, some of this may be overkill. However, if Tails is used to take responsibility for actions that are highly criminalized, a more thorough approach is likely relevant.
#### To mitigate against physical attacks:
> Your computer might be compromised if its physical components have been altered. For example, if a keylogger has been physically installed on your computer, your passwords, personal information, and other data typed on your keyboard could be stored and accessed by someone else, even if you are using Tails.
* First, **get a fresh computer**. A laptop from a random refurbished computer store is unlikely [to already be compromised](https://arstechnica.com/tech-policy/2014/05/photos-of-an-nsa-upgrade-factory-show-cisco-router-getting-implant/). Buy your computer with cash so it cannot be traced back to you, and in person because mail can be intercepted—a used [T Series](https://www.thinkwiki.org/wiki/Category:T_Series) or [X Series](https://www.thinkwiki.org/wiki/Category:X_Series) Thinkpad from a refurbished computer store is a cheap and reliable option. It is best to use Tails with a dedicated laptop, which prevents the adversary from targeting the hardware through a less secure operating system or through your normal non-anonymous activities. Another reason to have a dedicated laptop is that if something in Tails breaks, any information that leaks and exposes the laptop won't automatically be tied to you and your daily computer activities.

* **Make the laptop's screws tamper-evident, store it in a tamper-evident manner, and monitor for break-ins**. With these precautions in place, you'll be able to detect any future physical attacks. See the [Making Your Electronics Tamper-Evident](/posts/tamper/) tutorial to adapt your laptop's screws, use some form of intrusion detection, and store your laptop so you'll know if it's been physically accessed. Store any external devices you’ll be using with the laptop in the same way (USB, external hard drive, mouse, keyboard). When physical attack vectors are mitigated, an adversary can only use remote attacks.
#### To mitigate against remote attacks:
* **Anonymous Wi-Fi**. Using anonymous Wi-Fi is recommended not only to mitigate deanonymization, but also to mitigate remote hacking. It is best to never use the dedicated Tails laptop on your home Wi-Fi. This makes the laptop much less accessible to a remote attacker than a laptop that is constantly connected to your home Wi-Fi. If an attacker is targeting you, they need a starting point, and your home Wi-Fi is a pretty good one.
* **Remove the hard drive**—it's easier than it sounds. When you buy the laptop, you can ask the store to do it and potentially save some money. If you search on YouTube for "remove hard drive" for your specific laptop model, there will probably be an instructional video. Make sure you remove the laptop battery and unplug the power cord first. We remove the hard drive to completely eliminate the hard drive firmware, which has been known to be [compromised to install persistent malware](https://www.wired.com/2015/02/nsa-firmware-hacking/). A hard drive is part of the attack surface and is unnecessary on a live system like Tails that runs off a USB.
* Consider **removing the Bluetooth interface, camera, and microphone** while you're at it, although this is more involved—you'll need the user manual for your laptop model. The camera can at least be "disabled" by putting a sticker over it. The microphone is often connected to the motherboard via a plug - in this case just unplug it. If this is not obvious, or if there is no connector because the cable is soldered directly to the motherboard, or if the connector is needed for other purposes, cut the microphone cable with a pair of pliers. The same method can be used to permanently disable the camera if you don't trust the sticker method. It is also possible to use Tails on a dedicated "offline" computer by removing the network card as well. Some laptops have switches on the case that can be used to disable the wireless interfaces, but for an "offline" computer it is preferable to actually remove the network card.
* **Replace the BIOS with [HEADS](https://osresearch.net/)**. A [video](https://invidious.sethforprivacy.com/watch?v=sNYsfUNegEA) demonstrates a remote attack on the BIOS firmware against a Tails user, allowing the security researcher to steal GPG keys and emails. Unfortunately, the BIOS cannot be removed like the hard drive. It is needed to turn on the laptop, so it must be replaced with [open-source](/glossary#open-source) firmware. This is an advanced process because it requires opening the computer and using special tools. Most anarchists will not be able to do this themselves, but hopefully there is a trusted person in your networks who can set it up for you. The project is called HEADS because it's the other side of Tails—where Tails secures software, HEADS secures firmware. It has a similar purpose to the [Verified Boot](https://www.privacyguides.org/en/os/android-overview/#verified-boot) found in GrapheneOS, which establishes a full chain of trust starting from the hardware. HEADS has [limited compatibility](https://osresearch.net/Prerequisites#supported-devices), so keep that in mind when buying your laptop if you plan to install it—we recommend the ThinkPad X230 because installation is less involved than on other models. The CPUs of this generation can have the [Intel Management Engine](https://en.wikipedia.org/wiki/Intel_Management_Engine#Assertions_that_ME_is_a_backdoor) effectively removed in the process of flashing HEADS, but this is not the case with later generations of CPUs on newer computers. [Coreboot](https://www.coreboot.org/users.html), the project on which HEADS is based, is compatible with a wider range of laptop models but provides weaker security. HEADS can be configured to [verify the integrity of your Tails USB](https://osresearch.net/InstallingOS/#generic-os-installation), preventing it from booting if it has been tampered with. HEADS protects against both physical and remote classes of attacks!
* **Use USBs with secure firmware**, such as the [Kanguru FlashTrust](https://www.kanguru.com/products/kanguru-flashtrust-secure-firmware-usb-3-0-flash-drive), which has [retailers worldwide](https://www.kanguru.com/pages/where-to-buy), so that the USB will [stop working](https://www.kanguru.com/blogs/gurublog/15235873-prevent-badusb-usb-firmware-protection-from-kanguru) if the firmware is compromised.

![Flash drive](flashdrive.jpg)

> What's a *write-protect* switch? When you insert a normal USB into a computer, the computer does *read* and *write* operations with it, and a *write* operation can change the data. Some special USBs developed for malware analysis have a physical switch that can lock the USB, so that data can be *read* from it, but no new data can be *written* to it.

If your Tails USB stick has a write-protect switch and secure firmware, such as the [Kanguru FlashTrust](https://www.kanguru.com/products/kanguru-flashtrust-secure-firmware-usb-3-0-flash-drive), you are protected from the USB's firmware being compromised during a Tails session. When the switch is locked, you are also protected from the Tails software being compromised. This is critical. Compromising your Tails USB stick would require being able to write to it. This means that even if a Tails session is infected with malware, Tails itself is immutable, so the compromise cannot "take root" and would not be present during your next Tails session. If you are unable to obtain such a USB, you have two options.

1) [Burn Tails to a new DVD-R/DVD+R](https://tails.boum.org/install/dvd/index.en.html) (write once) for each new version of Tails - it should not be labeled "DVD+RW" or "DVD+RAM" so that the DVD cannot be rewritten.
2) Boot Tails with the `toram` option, which loads Tails completely into memory. Using the `toram` option depends on whether your Tails USB boots with [SYSLINUX or GRUB](https://tails.boum.org/doc/advanced_topics/boot_options/index.en.html).
* For SYSLINUX, when the boot screen appears, press Tab, and type a space. Type `toram` and press Enter.
* For GRUB, when the boot screen appears, press `e` and use the keyboard arrows to move to the end of the line that starts with `linux`. The line is probably wrapped and displayed on multiple lines, but it is a single configuration line. Type `toram` and press F10 or Ctrl+X.
* Once you are on the Tails desktop, you can eject the USB that Tails is on before you do anything else (whether it is connecting to the Internet or plugging in another USB).

On a USB with a write-protect switch, you will not be able to make any changes to the Tails USB while the switch is enabled. If you could make changes, so could malware. While ideally the switch would stay enabled all the time, there are two cases where disabling it is acceptable:

1) **For a dedicated upgrade session.** If you need to upgrade Tails, you can do so in a dedicated session with the switch disabled - this is necessary because the upgrade needs to be written to the Tails USB. Once you are done, you should restart Tails with the switch enabled.
2) **If you decide to use Persistent Storage, for occasional configuration sessions.** [Persistent Storage](/posts/tails/#optional-create-and-configure-persistent-storage) is a Tails feature that allows data to persist between sessions that would otherwise be amnesiac on the Tails USB itself. Because it requires writing to the Tails USB to persist data, it is generally impractical to use with a write-protect switch. However, it may be acceptable to disable the switch for occasional Persistent Storage configuration sessions, such as installing additional software. For example, in an 'unlocked' session, you enable additional software for persistence and install Scribus, selecting to install it every session. Then, in a 'locked' session, you actually use Scribus - none of the files you work on are saved to the Tails USB because it is 'locked'. The Persistent Storage feature is not possible with the `toram` boot or with a DVD.

Where can we store personal data for use between Tails sessions if the write-protect switch prevents us from using Persistent Storage? We recommend storing personal data on a second LUKS USB. This "personal data" USB should not look identical to your Tails USB to avoid confusion. To create this separate USB, see [How to create an encrypted USB](/posts/tails/#how-to-create-an-encrypted-usb). If you are reading this from a country like the UK, where not providing encryption passwords can land you in jail, this second drive should be an HDD containing a [Veracrypt Hidden Volume](https://www.veracrypt.fr/en/Hidden%20Volume.html) (SSD and USB drives are [not suitable for Hidden Volumes](https://www.veracrypt.fr/en/Trim%20Operation.html)).

![Multiple USBs](usbs.jpg)

Compartmentalization is an approach that neatly separates different identities - in Tails session #1 you do activities related to moderating a website, and in Tails session #2 you do activities related to research for an action. This approach also comes into play with your "personal data" USBs. If the files you save could be used to link your activities together, use a different "personal data" USB for each activity. For a "personal data" USB that stores very sensitive files (such as the text of a communique), it is best to reformat and then destroy the USB once you no longer need the files (see [Really delete data from a USB drive](/posts/tails/#really-delete-data-from-a-usb)). This is another reason to use a separate USB for any files that need to be saved - you don't accumulate the forensic history of all your files on your Tails Persistent Storage, and you can easily destroy USBs as needed.

Finally, a note about email - if you already use Tails and encrypted email ([even though it is not very secure](/posts/e2ee/#pgp-email)), you may be familiar with the Thunderbird Persistent Storage feature. This feature allows you to store your Thunderbird email account details, as well as your inbox and PGP keys, on a Tails USB. With a "personal data" USB, Thunderbird won't automatically open your accounts. We recommend that you do one of the following:

- Create new Thunderbird email accounts in each session. PGP keys can be stored on the separate "personal data" USB like any other file, and imported when needed. This has the advantage that if law enforcement manages to bypass LUKS, they still don't have your inbox without knowing your email password.
- Keep the Thunderbird data folder on the "personal data" USB. After logging in to Thunderbird, use the Files browser (Applications → Accessories → Files) and enable the "Show hidden files" setting. Navigate to Home, then copy the folder called `.thunderbird` to your "personal data" USB. In each future session, after you have unlocked the "personal data" USB and before you start Thunderbird, copy the `.thunderbird` folder to Home.
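
The copy-and-restore steps above can be sketched in Terminal. The sketch below runs against temporary directories so it can be tried safely anywhere; on Tails, the real locations would be Home (`~/.thunderbird`) and your unlocked USB under `/media/amnesia/` (the exact mount point depends on your USB's label, so treat these paths as examples):

```shell
# Stand-ins for Home and the unlocked "personal data" USB:
home_dir="$(mktemp -d)"
usb="$(mktemp -d)"
mkdir -p "$home_dir/.thunderbird/profile.default"

# End of a session: copy the Thunderbird profile folder to the USB.
cp -r "$home_dir/.thunderbird" "$usb/"

# Start of the next session: restore it to Home before launching Thunderbird.
rm -rf "$home_dir/.thunderbird"
cp -r "$usb/.thunderbird" "$home_dir/"
ls -d "$home_dir/.thunderbird"
```

The direction matters: copy *to* the USB at the end of a session, and *from* the USB at the start of the next one.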

Another reason to avoid using Persistent Storage features is that many of them persist user data to the Tails USB. If your Tails session is compromised, the data you access during that session can be used to tie your activities together. If there is user data on the Tails USB, such as an email inbox, compartmentalization of Tails sessions is no longer possible. To achieve compartmentalization with Persistent Storage enabled, you would need a dedicated Tails USB for each identity, and updating them all every month is a lot of work.

# Encryption

## Passwords

[Encryption](/glossary#encryption) is a blessing—it's the only thing standing in the way of our adversary reading all our data, if it's used well. The first step in securing your encryption is to make sure that you use very good passwords—most passwords don't need to be memorized because they are stored in a password manager called KeePassXC, so they can be completely random. To learn how to use KeePassXC, see [Password Manager](/posts/tails/#password-manager-keepassxc).

>In the terminology used by KeePassXC, a [*password*](/glossary/#password) is a random sequence of characters (letters, numbers and other symbols), while a [*passphrase*](/glossary/#passphrase) is a random sequence of words.

Never reuse a password/passphrase for multiple things ("password recycling") - KeePassXC makes it easy to store unique passwords that are dedicated to one purpose. [LUKS](/glossary/#luks) encryption **is only effective when the device is powered off** - when the device is powered on, the password can be retrieved from memory. Any encryption can be [brute-force attacked](/glossary#brute-force-attack) with [massive amounts of cloud computing](https://blog.elcomsoft.com/2020/08/breaking-luks-encryption/). The newer version of LUKS (LUKS2 using Argon2id) is [less vulnerable to brute-force attacks](https://mjg59.dreamwidth.org/66429.html); this is the default as of Tails 6.0 ([forthcoming](https://gitlab.tails.boum.org/tails/tails/-/issues/19733)) and Qubes OS 4.1. If you'd like to learn more about this change, we recommend the overviews by [Systemli](https://www.systemli.org/en/2023/04/30/is-linux-hard-disk-encryption-hacked/) and [dys2p](https://dys2p.com/en/2023-05-luks-security.html).
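
If you want to check which LUKS version and key-derivation function one of your own devices uses, `cryptsetup luksDump` reports it. The device path below is only an example and the command needs root; the sketch is guarded so it simply reports when the tool or device is not present:

```shell
# Example device path; on your system it might be /dev/sdb1, /dev/sdc1, etc.
device="/dev/sdb1"
if command -v cryptsetup >/dev/null && [ -b "$device" ]; then
  # "Version: 2" plus "PBKDF: argon2id" indicates the newer, stronger LUKS2 defaults.
  luks_info="$(sudo cryptsetup luksDump "$device" | grep -E 'Version|PBKDF')"
else
  luks_info="SKIP: cryptsetup or $device not available"
fi
echo "$luks_info"
```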

Password strength is measured in "[bits of entropy](https://en.wikipedia.org/wiki/Password_strength#Entropy_as_a_measure_of_password_strength)". Your passwords/passphrases should ideally have an entropy of about 128 bits (diceware passphrases of **ten words**, or passwords of **21 random characters**, including uppercase, lowercase, numbers, and symbols) and shouldn't have less than 90 bits of entropy (seven words).
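
These numbers follow from the formula *entropy = count × log2(pool size)*, where the pool is 7,776 words for a standard diceware list or roughly 94 printable ASCII characters for a random password. A quick sketch:

```shell
# bits of entropy = count * log2(pool size); log2(x) = ln(x)/ln(2)
entropy_10_words="$(awk 'BEGIN { printf "%.0f", 10 * log(7776)/log(2) }')"
entropy_7_words="$(awk 'BEGIN { printf "%.0f",   7 * log(7776)/log(2) }')"
entropy_21_chars="$(awk 'BEGIN { printf "%.0f", 21 * log(94)/log(2) }')"
echo "10 diceware words: $entropy_10_words bits"   # ~129 bits, clears the 128-bit goal
echo "7 diceware words:  $entropy_7_words bits"    # ~90 bits, the recommended floor
echo "21 random chars:   $entropy_21_chars bits"   # ~138 bits
```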

![KeePassXC](keepassxc.png)

Our recommendations are:

1) Memorize diceware passphrases of 7-10 words for everything that is not stored in a KeePassXC database.
2) Generate passwords of 21 random characters for everything that can be stored in a KeePassXC database. Maintain an offsite backup of your KeePassXC database(s) in case it is ever corrupted or seized.
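
Both kinds of secrets can be generated with KeePassXC's command-line tool; the `keepassxc-cli` subcommands and options below are an assumption based on recent versions, and the sketch is guarded so it degrades gracefully where the tool is not installed:

```shell
if command -v keepassxc-cli >/dev/null; then
  # A diceware passphrase, for secrets you have to memorize:
  passphrase="$(keepassxc-cli diceware --words 10)"
  # A 21-character random password, for secrets kept in the database:
  password="$(keepassxc-cli generate --length 21 --lower --upper --numeric --special)"
else
  passphrase="SKIP: keepassxc-cli not installed"
  password="$passphrase"
fi
echo "$passphrase"
echo "$password"
```

In practice you would generate the password directly in the KeePassXC entry dialog (the dice icon), but the command line is handy for generating a diceware passphrase you intend to memorize without saving it anywhere.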

> **Tip**
>
> Diceware passphrases can be easy to forget if you have several to keep track of, especially if you use them infrequently. To reduce the risk of forgetting a diceware passphrase, you can create a KeePassXC file with all "memorized" passphrases in it. Store this on a LUKS USB, and hide that USB somewhere offsite where it won't be recovered in a police raid. You should be able to reconstruct both the LUKS and KeePassXC passphrases if a lot of time has passed. One strategy is to use a memorable sentence from a book - this reduction in password entropy is acceptable if the USB is highly unlikely to ever be recovered due to its storage location. That way, if you ever really forget a "memorized" passphrase, you can access that offsite backup. As with all important backups, you should have at least two.

For Tails, you need to memorize two passphrases:

1) The [LUKS](/glossary/#luks) "personal data" USB passphrase, where your KeePassXC file is stored.
2) The KeePassXC passphrase

If you are using Persistent Storage, that is another passphrase you will have to enter on the Welcome Screen at boot time, but it can be the same as the first one.

## Encrypted containers

[LUKS](/glossary#luks) is great, but defense-in-depth can't hurt. If the police seize your USB in a house raid, they will try a [variety of tactics to bypass the authentication](https://www.csrc.link/threat-library/techniques/targeted-digital-surveillance/authentication-bypass.html), so a second layer of defense with a different encryption implementation can be useful for highly sensitive data.

[Gocryptfs](https://nuetzlich.net/gocryptfs/) is an encrypted container program that is [available for Debian](https://packages.debian.org/bullseye/gocryptfs) and can be easily installed as [additional software](/posts/tails/#optional-create-and-configure-persistent-storage). If you don't want to reinstall it every session, you will need to [configure Additional Software in Persistent Storage](#using-a-write-protect-switch).

To use gocryptfs, you will need to use Terminal (the [command line](/glossary#command-line-interface-cli)).

On your "personal data" LUKS USB, use the file manager to create two folders and name them `cipher` and `plain`. Right-click in the white space of your file manager and select 'Open Terminal Here'. This will allow you to be in the correct location when Terminal opens, instead of having to know how to navigate using the `cd` command.

In Terminal, list the folders you have, and it should output the two you just created, among others:

`ls`

The first time you use Gocryptfs, create a Gocryptfs filesystem:

`gocryptfs -init cipher`

You will be prompted for a password. Create a new entry in your KeePassXC file and generate a password using the Generate Password feature (the dice icon). Then copy the password and paste it into the terminal (Edit → Paste or Ctrl+Shift+V). It will output a master key—save it in the KeePassXC entry.

Every time you use the filesystem, mount it like this:

`gocryptfs cipher plain`

You will be prompted for the password. Note that the order is important - `cipher` is the first argument and `plain` is the second.

You can now add files to your mounted, decrypted container in the `plain` folder. When you unmount the filesystem, the container will be encrypted. To do this:

`fusermount -u plain`

Now `plain` is just an empty folder again. Before storing important files in the container, you should run a test to make sure everything works as expected, especially if you are unfamiliar with the command line interface.
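
One way to run such a test is a round trip in a throwaway directory. The sketch below assumes gocryptfs accepts the password on standard input when run non-interactively, and it is guarded so it skips cleanly where gocryptfs is not installed:

```shell
# Round-trip test in a throwaway directory (nothing touches your real USB).
if command -v gocryptfs >/dev/null; then
  work="$(mktemp -d)"
  mkdir "$work/cipher" "$work/plain"
  echo "test-password" | gocryptfs -init -q "$work/cipher"         # create the filesystem
  echo "test-password" | gocryptfs -q "$work/cipher" "$work/plain" # mount it
  echo "secret" > "$work/plain/note.txt"  # write a file through the decrypted view
  fusermount -u "$work/plain"             # unmount: 'plain' is empty again
  ls "$work/cipher"                       # only scrambled file names remain
  gocryptfs_test="done"
else
  gocryptfs_test="SKIP: gocryptfs not installed"
fi
echo "$gocryptfs_test"
```

If `ls cipher` shows only `gocryptfs.conf` and scrambled names, and the file reappears in `plain` after mounting again with the same password, the container is working.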

## Encrypted Communication

PGP email is the most established form of encrypted communication on Tails in the anarchist space. Unfortunately, PGP does not have [forward secrecy](/glossary#forward-secrecy)—that is, a single secret (your private key) can decrypt all messages, rather than just a single message, which is the standard in encrypted messaging today. It is the opposite of "metadata protecting", and has [several other shortcomings](/posts/e2ee/#pgp-email).
For [synchronous](/glossary/#synchronous-communication) messaging—when you are both online at the same time—we recommend [Cwtch](/posts/e2ee/#cwtch) for encrypted communication on Tails.
For [asynchronous](/glossary/#asynchronous-communication) messaging—when you are not both online at the same time—we recommend [Element](/posts/e2ee/#element-matrix). Which server you use is also important; [Systemli](https://www.systemli.org/en/service/matrix/) and [Anarchy Planet](https://anarchyplanet.org/chat.html) are reputable hosts.
For more information on both options, see [Encrypted Messaging For Anarchists](/posts/e2ee/).
# Phishing Awareness
Finally, consider how an adversary would conduct a [remote attack](/glossary/#remote-attacks) targeting you or your project; the answer is most likely ["phishing"](/glossary/#phishing). *Phishing* is when an adversary crafts an email (or text, message in an application, etc.) to trick you into revealing information, gain access to your account, or introduce malware to your machine. [*Spear phishing*](/glossary/#spear-phishing) is when the adversary has done some reconnaissance and uses information they already know about you to tailor their phishing attack.
You have probably heard the advice to be skeptical about clicking on links and opening attachments—this is why. To make matters worse, the "from" field in emails can be spoofed to fool you—[PGP signing](/posts/e2ee/#pgp-email) mitigates this by proving that the email actually comes from who you expect.
Sometimes the goal of phishing is to deliver a "payload" that calls back to the adversary—it is the [initial access](https://attack.mitre.org/tactics/TA0001/) entry point to infect your machine with malware. A payload can be embedded in a file and run when the file is opened. In the case of a link, a payload can be delivered via malicious JavaScript in the website, allowing the payload to be executed on your computer. Tor is supposed to protect your location (IP address), but now the adversary has a way to further their attack; [make the infection persistent](https://attack.mitre.org/tactics/TA0003/), [install a screen or key logger](https://attack.mitre.org/tactics/TA0009/), [exfiltrate your data](https://attack.mitre.org/tactics/TA0010/), etc. The reason Tails does not have a default Administration password (it must be set on the session's Welcome Screen if needed) is to make it more difficult to [escalate privileges](https://attack.mitre.org/tactics/TA0004/), which would be necessary to bypass Tor.
## Attachments
For untrusted attachments, you would ideally **sanitize all files sent to you before opening them** with a program like [Dangerzone](https://dangerzone.rocks/), which takes potentially dangerous PDFs, office documents, or images and converts them into safe PDFs. Unfortunately, Dangerzone is [not yet readily available in Tails](https://gitlab.tails.boum.org/tails/tails/-/issues/18135). An inferior option is to **open untrusted files in a dedicated ['offline mode'](https://tails.boum.org/doc/first_steps/welcome_screen/index.en.html#index3h2) session**, so that if they're malicious they can't call home, and you shut down immediately afterward, minimizing their chance of persistence. Tails prevents deanonymization through phishing by forcing all internet connections through the Tor network. However, this is still vulnerable to [0-day exploits](/glossary#zero-day-exploit) that nation-state actors have. For example, the FBI and Facebook worked together to develop a 0-day exploit against Tails [that deanonymized a user](https://www.vice.com/en/article/v7gd9b/facebook-helped-fbi-hack-child-predator-buster-hernandez) after he opened a video attachment from his home Wi-Fi.
## Links
With untrusted links, there are two things to protect: your anonymity and your information. Unless the adversary has a 0-day exploit on the Tor Browser or Tails, your anonymity should be protected **if you don't enter any identifying information into the website**. Your information can only be protected **by your behavior**—phishing awareness allows you to think critically about whether this could be a phishing attack and act accordingly.
Investigate untrusted links before you click by **manually copying and pasting the address into your browser**—do not click through a hyperlink as the text can be used to mislead you about where you are going. **Never follow a shortened link** (e.g. a site like bit.ly that takes long web addresses and makes a short one) because it cannot be verified before redirection. [Unshorten.me](https://unshorten.me/) can reveal any shortened link.

Also, **don’t follow links to domains you don't recognize**. When in doubt, search for the domain with the domain name in quotation marks using a privacy-preserving search engine (such as DuckDuckGo) to see if it’s a legitimate website. This isn’t a 100% solution, but it’s a good precaution to take.
Finally, if you click on any link in an email and are asked to log in, be aware that this is a common endgame for phishing campaigns. **Do not do it**. Instead, manually go to the website of the service you are trying to sign in to and sign in there. That way, you’ll know you’re logging in to the right site because you’ve typed in the address for it, rather than having to trust the link in the email. For example, you might type your password at mailriseup.net instead of mail.riseup.net (this is called "typo-squatting").
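
To illustrate why the full hostname matters, the addresses below (all hypothetical examples) show how a link that starts with a familiar name can still belong to an entirely different domain. The `hostname_of` helper is just a rough sketch using `sed`, not a robust URL parser:

```shell
# Extract the host a browser would actually connect to:
# strip the scheme (e.g. "https://"), then everything from the first "/" or ":".
hostname_of() { printf '%s\n' "$1" | sed -E 's#^[a-z]+://##; s#[/:].*$##'; }

hostname_of "https://mail.riseup.net/login"           # mail.riseup.net (legitimate)
hostname_of "https://mailriseup.net/login"            # mailriseup.net (typo-squat)
hostname_of "https://mail.riseup.net.evil.example/x"  # mail.riseup.net.evil.example
```

The third example is the sneakiest: the address *begins* with "mail.riseup.net", but the browser connects to "evil.example" because only the last parts of the hostname determine the domain.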
You may want to open untrusted links in a dedicated Tails session, without unlocking Persistent Storage or attaching personal data USBs.
# To Conclude
Using Tails without any of this advice is still a vast improvement over many other options. Given that anarchists regularly entrust their freedom to Tails, such as sending communiques, taking these extra precautions can further strengthen your trust in this operating system.
# Appendix: Deanonymization of your WLAN (Wi-Fi) adapter despite Tails?