From 849cff732096d042a10ea80a13a59df55cc0f3e5 Mon Sep 17 00:00:00 2001 From: anarsec Date: Thu, 10 Aug 2023 23:39:05 +0000 Subject: [PATCH] qubes best feedback --- content/posts/tails-best/index.md | 88 ++++++++++++++++--------------- 1 file changed, 45 insertions(+), 43 deletions(-) diff --git a/content/posts/tails-best/index.md b/content/posts/tails-best/index.md index 6f3680f..0bc16cf 100644 --- a/content/posts/tails-best/index.md +++ b/content/posts/tails-best/index.md @@ -33,19 +33,19 @@ Let's start by looking at the [Tails Warnings page](https://tails.boum.org/doc/a > 1. Sharing files with [metadata](/glossary#metadata), such as date, time, location, and device information > 2. Using Tails for more than one purpose at a time -### Sharing files with metadata +### 1. Sharing files with metadata -This first issue can be mitigated by **cleaning metadata from files before sharing them**: +You can mitigate this first issue by **cleaning metadata from files before sharing them**: * To learn how, see [Removing Identifying Metadata From Files](/posts/metadata/). -### Using Tails for more than one purpose at a time +### 2. Using Tails for more than one purpose at a time -This second issue can be mitigated by what's called **"compartmentalization"**: +You can mitigate this second issue by what's called **"compartmentalization"**: * [Compartmentalization](https://www.csrc.link/threat-library/mitigations/compartmentalization.html) means keeping different activities or projects separate. If you use Tails sessions for more than one purpose at a time, an adversary could link your different activities together. For example, if you log into different accounts on the same website in a single Tails session, the website could determine that the accounts are being used by the same person. This is because websites can tell when two accounts are using the same Tor circuit. * To prevent an adversary from linking your activities while using Tails, restart Tails between different activities. For example, restart Tails between checking different project emails. -* Tails is amnesiac by default, so to save any data from a Tails session, you must save it to a USB. If the files you save could be used to link your activities together, use a different encrypted ([LUKS](/glossary#luks)) USB stick for each activity. For example, use one Tails USB stick for moderating a website and another for researching actions. Tails has a feature called Persistent Storage, but we do not recommend using it for data storage, which is explained [below](#using-a-write-protect-switch). +* Tails is amnesiac by default, so to save any data from a Tails session, you must save it to a USB. If the files you save could be used to link your activities together, use a different encrypted ([LUKS](/glossary#luks)) USB stick for each activity. For example, use one Tails USB stick for moderating a website and another for researching actions. Tails has a feature called Persistent Storage, but we do not recommend using it for data storage, explained [below](#using-a-write-protect-switch). ## Limitations of the [Tor network](/glossary#tor-network) @@ -56,30 +56,32 @@ This second issue can be mitigated by what's called **"compartmentalization"**: > 1. Hiding that you are using Tor and Tails > 2. Protecting your online communications from determined, skilled attackers -### Hiding that you are using Tor and Tails +### 1. 
Hiding that you are using Tor and Tails -This first issue is mitigated by [**Tor bridges**](https://tails.boum.org/doc/anonymous_internet/tor/index.en.html#bridges): +You can mitigate this first issue by [**Tor bridges**](https://tails.boum.org/doc/anonymous_internet/tor/index.en.html#bridges): * Tor Bridges are secret Tor relays that hide your connection to the Tor network. However, this is only necessary where connections to Tor are blocked, such as in heavily censored countries, by some public networks, or by some parental control software. This is because Tor and Tails don't protect you by making you look like any other Internet user, but by making all Tor and Tails users look the same. It becomes impossible to tell who is who among them. -### Protecting against determined, skilled attackers +### 2. Protecting against determined, skilled attackers +An *end-to-end correlation* attack is a theoretical way that a global adversary could break Tor's anonymity: > A powerful adversary, who could analyze the timing and shape of the traffic entering and exiting the Tor network, might be able to deanonymize Tor users. These attacks are called *end-to-end correlation* attacks, because the attacker has to observe both ends of a Tor circuit at the same time. [...] End-to-end correlation attacks have been studied in research papers, but we don't know of any actual use to deanonymize Tor users. -This second issue is mitigated by **not using an Internet connection that could deanonymize you**, and by **prioritizing .onion links when available**: +You can mitigate this second issue by **not using an Internet connection that is tied to your identity**, and by **prioritizing .onion links when available**: -* Wi-Fi adapters that work through SIM cards are a bad idea. The unique identification number of your SIM card (IMSI) and the unique serial number of your adapter (IMEI) are also transmitted to the mobile operator every time you connect, allowing identification and geographic localization. The adapter works like a mobile phone! If you do not want different research sessions to be associated with each other, do not use the same adapter or SIM card more than once! -* There are several opsec considerations to keep in mind when using Wi-Fi in a cafe without CCTV cameras. +* Wi-Fi adapters that work through the mobile network (via SIM cards) are a bad idea. The unique identification number of your SIM card (IMSI) and the unique serial number of your adapter (IMEI) are also transmitted to the mobile operator every time you connect, allowing identification and geographic localization. The adapter works like a mobile phone! If you do not want different research sessions to be associated with each other, do not use the same adapter or SIM card more than once! +* Use an Internet connection that isn't connected to you, such as in a cafe without CCTV cameras. There are several opsec considerations to keep in mind when using Wi-Fi in a public space like this. * See [below](#appendix-2-location-location-location) for more information on choosing a location. * Do not get into a routine of using the same cafes repeatedly if you can avoid it. * If you have to buy a coffee to get the Wi-Fi password, pay in cash! * Position yourself with your back against a wall so that no one can "shoulder surf" to see your screen, and ideally install a privacy screen on your laptop. - * Maintain situational awareness and be ready to pull out the Tails USB to shut down the computer at a moment's notice. 
One person in charge of a darknet marketplace had his Tails computer seized while distracted by a fake fight next to him. Similar tactics have been used [in other police operations](https://dys2p.com/en/2023-05-luks-security.html#attacks). If his Tails USB had been attached to a belt with a short piece of fishing line, the police would most likely have lost all evidence when the Tails USB was pulled out - note that [Tails warns](https://tails.boum.org/doc/first_steps/shutdown/index.en.html) "Only physically remove the USB stick in case of emergency as doing so can sometimes break the file system of the Persistent Storage." A more technical equivalent is [BusKill](https://docs.buskill.in/buskill-app/en/stable/introduction/what.html) - we don't recommend buying it through the mail, which can be [intercepted](https://docs.buskill.in/buskill-app/en/stable/faq.html#q-what-about-interdiction) to make the hardware [malicious](https://en.wikipedia.org/wiki/BadUSB)). If the Tails USB is removed, Tails will shut down and [overwrite the RAM with random data](https://tails.boum.org/doc/advanced_topics/cold_boot_attacks/index.en.html). Any LUKS USBs that were unlocked in the Tails session will now be encrypted again. If maintaining situational awareness seems unrealistic, consider asking a trusted friend to hang out who can dedicate themselves to it. + * Maintain situational awareness and be ready to pull out the Tails USB to shut down the computer at a moment's notice. One person in charge of a darknet marketplace had his Tails computer seized while distracted by a fake fight next to him. Similar tactics have been used [in other police operations](https://dys2p.com/en/2023-05-luks-security.html#attacks). If his Tails USB had been attached to a belt with a short piece of fishing line, the police would most likely have lost all evidence when the Tails USB was pulled out - note that [Tails warns](https://tails.boum.org/doc/first_steps/shutdown/index.en.html) "Only physically remove the USB stick in case of emergency as doing so can sometimes break the file system of the Persistent Storage." A more technical equivalent is [BusKill](https://docs.buskill.in/buskill-app/en/stable/introduction/what.html) - however, we only recommend buying this in person, such as at a conference. Any mail can be [intercepted](https://docs.buskill.in/buskill-app/en/stable/faq.html#q-what-about-interdiction) and altered, making it [malicious](https://en.wikipedia.org/wiki/BadUSB). If the Tails USB is removed, Tails will shut down and [overwrite the RAM with random data](https://tails.boum.org/doc/advanced_topics/cold_boot_attacks/index.en.html). Any LUKS USBs that were unlocked in the Tails session will now be encrypted again. If maintaining situational awareness seems unrealistic, consider asking a trusted friend to hang out who can dedicate themselves to keeping an eye on your surroundings. * If coffee shops without CCTV cameras are few and far between, you can try accessing a coffee shop's Wi-Fi from outside, out of view of the cameras. Some external Wi-Fi adapters can pick up signals from further away, as discussed [below](#appendix-2-location-location-location). -* If a determined adversary breaks Tor through a [correlation attack](https://anonymousplanet.org/guide.html#your-anonymized-torvpn-traffic), the Internet address you used in a coffee shop without CCTV cameras will only lead to your general area (e.g. your city) because it is not associated with you. Of course, this is less true if you use it routinely. 
A correlation attack used to deanonymize a Tor user is unprecedented in current evidence used in court, although [it has been used](https://medium.com/beyond-install-tor-signal/case-file-jeremy-hammond-514facc780b8) as corroborating evidence once a suspect has already been identified to correlate with. Correlation attacks are even less feasible against connections to an .onion address because you never leave the Tor network, so there is no "end" to correlate with. -* However, a more likely low-tech "correlation attack" is possible by local law enforcement, based on your identity rather than your anonymous Internet activity, if you are already in their sights and a target of [physical surveillance](https://www.csrc.link/threat-library/techniques/physical-surveillance/covert.html). For example, if a surveillance operation notices that you go to a cafe regularly, and an anarchist website is always updated during those windows, this pattern may indicate that you are moderating that website. An undercover may even be able to catch a glimpse of your screen. - * Possible mitigations in this scenario include **doing [surveillance detection](https://www.csrc.link/threat-library/mitigations/surveillance-detection.html) and [anti-surveillance](https://www.csrc.link/threat-library/mitigations/anti-surveillance.html) before going to a coffee shop**, and changing Wi-Fi locations regularly, but this may not be particularly realistic for projects like moderating a website that require daily Internet access. Alternatively, mitigations include **using a Wi-Fi antenna from indoors** (guide coming soon), **scheduling posts to be published later** (WordPress has this feature), or possibly even **using Tor from your home internet** for some projects. This contradicts the previous advice, but using Tor from home avoids creating a movement profile that is so easy to physically observe (as opposed to a network traffic profile which is more technical to observe, and may be harder to draw meaningful conclusions from). - * If you want to submit a report-back the morning after a riot, or a communique shortly after an action (times when there may be a higher risk of targeted surveillance), consider waiting and at least taking surveillance detection and anti-surveillance measures beforehand. In 2010, the morning after a bank arson in Canada, police surveilled a suspect as he travelled from his home to an Internet cafe, watched him post the communique, and then buried the laptop in the woods. More recently, investigators physically surveiling [an anarchist in France](https://www.csrc.link/#quelques-premiers-elements-du-dossier-d-enquete-contre-ivan) installed a hidden camera to monitor access to an Internet cafe near the comrade's home and requested CCTV footage for the day an arson communique was sent. +* As described in the quotation above, a global adversary (i.e. the NSA) may be capable of breaking Tor through a [correlation attack](https://anonymousplanet.org/guide.html#your-anonymized-torvpn-traffic). If this happens, the Internet address you used in a coffee shop without CCTV cameras will only lead to your general area (e.g. your city) because it is not associated with you. Of course, this is less true if you use it routinely. Correlation attacks are even less feasible against connections to an .onion address because you never leave the Tor network, so there is no "end" to correlate with through network traffic analysis. 
+* What we will term a "reverse correlation attack" is possible by a non-global adversary (i.e. local law enforcement), if you are already in their sights and a target of [physical surveillance](https://www.csrc.link/threat-library/techniques/physical-surveillance/covert.html) and/or [digital surveillance](https://www.csrc.link/threat-library/techniques/targeted-digital-surveillance.html). A correlation attack used to deanonymize a Tor user is unprecedented in current evidence used in court, although [a "reverse correlation attack" has been used](https://medium.com/beyond-install-tor-signal/case-file-jeremy-hammond-514facc780b8) as corroborating evidence - a suspect had already been identified, which allowed investigators to correlate their local footprint with specific online activity. Specifically, they correlated Tor network traffic coming from the suspect's house with the times their anonymous alias was online in chatrooms. To explain how this works, it helps if you have a basic understanding of what Tor information is visible to various third parties - see the EFF's [interactive graphic](https://www.eff.org/pages/tor-and-https). For a normal correlation attack, the investigator will need to start from after Tor's exit node: try to correlate the user's online activities to an enormous amount of global data. However, if a suspect is already identified, the investigator can then do a "reverse correlation attack" and start from before Tor's entry node: **try to correlate the suspect's physical or digital footprint to specific online activity**. For your physical footprint, a surveillance operation can note that you go to a cafe regularly, then try to correlate this with online activity they suspect you of (for example, if they suspect you are a website moderator, they can try to correlate your visits with when articles are posted). For your digital footprint, if you are using the Internet from home, an investigator can log all your Tor traffic and then try to correlate it with when articles are posted to this anarchist website. A deliberately simplified sketch of this kind of timing correlation is included below. +  * Possible mitigations in this scenario include **doing [surveillance detection](https://www.csrc.link/threat-library/mitigations/surveillance-detection.html) and [anti-surveillance](https://www.csrc.link/threat-library/mitigations/anti-surveillance.html) before going to a coffee shop**, and changing Wi-Fi locations regularly. For projects like moderating a website that require daily Internet access, this may not be particularly realistic. In that case, the ideal mitigation is to **use a Wi-Fi antenna from indoors** (guide coming soon) - a physical surveillance effort won't see you entering a cafe, and a digital surveillance effort won't see anything on your home Internet. If this is too technical for you, you may even want to **use your home Internet** for some projects that require very frequent Internet access. This contradicts the previous advice to not use your personal Wi-Fi. It's a trade-off: using Tor from home avoids creating a physical footprint that is so easy to observe, at the expense of creating a digital footprint which is more technical to observe, and may be harder to draw meaningful conclusions from (especially if you intentionally [make correlation attacks more difficult](/posts/tails/#make-correlation-attacks-more-difficult)).
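To make the timing-correlation idea above concrete, here is a deliberately simplified sketch. It is not how real investigations are run - it only shows that once someone holds two timelines (when you were observably online and when the anonymous activity happened), comparing them is trivial. The file names, the timestamp format, and the assumption that each timestamp appears at most once per file are all hypothetical.

```bash
# Toy illustration only, not a real forensic tool.
# posts.txt       - hypothetical list of times articles appeared on a website
# connections.txt - hypothetical list of times a suspect's connection was active on Tor
# Each file holds one "YYYY-MM-DD HH:MM" timestamp per line, with no internal duplicates.
# Printing the timestamps that appear in BOTH timelines is a one-liner:
sort posts.txt connections.txt | uniq -d
```

Real correlation works statistically on approximate timing rather than exact matches, but the takeaway is the same as in the bullet above: the less overlap there is between your observable footprint and the anonymous online activity, the less there is to correlate.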
+  * If you want to submit a report-back the morning after a riot, or a communique shortly after an action (times when there may be a higher risk of targeted surveillance), consider waiting and at least taking surveillance detection and anti-surveillance measures beforehand. In 2010, the morning after a bank arson in Canada, police surveilled a suspect as he traveled from his home to an Internet cafe, and watched him post the communique and then bury the laptop in the woods. More recently, investigators physically surveilling [an anarchist in France](https://www.csrc.link/#quelques-premiers-elements-du-dossier-d-enquete-contre-ivan) installed a hidden camera to monitor access to an Internet cafe near the comrade's home and requested CCTV footage for the day an arson communique was sent. +* To summarize: For highly sensitive activities, use the Internet from a random cafe, preceded by surveillance detection just like you would prior to a direct action. For activities that require frequent Internet access such that the random cafe model isn't sustainable, it's best to use a Wi-Fi antenna positioned behind a window to access Wi-Fi from a few kilometers away. If this is too technical for you, using your home Wi-Fi is an option, but requires putting faith in it being difficult to break Tor with a correlation attack, and it being difficult to draw meaningful conclusions from your home's Tor traffic through a "reverse correlation attack". ## Reducing risks when using untrusted computers @@ -90,15 +92,15 @@ This second issue is mitigated by **not using an Internet connection that could > 1. Installing from an infected computer > 2. Running Tails on a computer with a compromised BIOS, firmware, or hardware -### Installing from an infected computer +### 1. Installing from an infected computer -This first issue is mitigated by **using a computer you trust to install Tails**: +You can mitigate this first issue by **using a computer you trust to install Tails**: * According to our [recommendations](/recommendations/#computers), this would ideally be a [Qubes OS](/posts/qubes/) system, as it is much harder to infect than a normal Linux computer. If you have a trusted friend with a Tails USB stick that has been installed with Qubes OS (and who uses these best practices), you could [clone it](/posts/tails/#installation) instead of installing it yourself. * Use the "Terminal" installation method ["Debian or Ubuntu using the command line and GnuPG"](https://tails.boum.org/install/expert/index.en.html), as it more thoroughly verifies the integrity of the download using [GPG](/glossary/#gnupg-openpgp). If using the [command line](/glossary/#command-line-interface-cli) is over your head, ask a friend to walk you through it, or first learn the basics of the command line and GnuPG with [Linux Essentials](/posts/linux/). -* Once installed, do not plug your Tails USB stick (or any [LUKS](/glossary/#luks) USBs used during Tails sessions) into a computer that is running another operating system; if the computer is infected, the infection can [spread to the USB](https://en.wikipedia.org/wiki/BadUSB). +* Once installed, do not plug your Tails USB stick (or any [LUKS](/glossary/#luks) USBs used during Tails sessions) into any other computer while it is running a non-Tails operating system; if the computer is infected, the infection can [spread to the USB](https://en.wikipedia.org/wiki/BadUSB). -### Running Tails on a computer with a compromised BIOS, firmware, or hardware +### 2. 
Running Tails on a computer with a compromised BIOS, firmware, or hardware This second issue requires several mitigations. Let's start with a few definitions. @@ -106,7 +108,7 @@ This second issue requires several mitigatio * *Firmware* is the software that's embedded in a piece of hardware; you can simply think of it as "software for hardware". It can be found in several different places (hard drives, USB drives, graphics processor, etc.). * *BIOS* is the specific firmware that is responsible for booting your computer when you press the power button—this is a great place for [malware](/glossary/#malware) to hide because it is undetectable by the operating system. -Our adversaries have two attack vectors to compromise BIOS, firmware, hardware, or software: [remote attacks](/glossary#remote-attacks) (via the Internet) and [physical attacks](/glossary/#physical-attacks) (via physical access). Not everyone will need to apply all of the advice below. For example, if Tails is only being used for anonymous web browsing and writen correspondence, some of this may be overkill. However, if Tails is used to take responsibility for actions that are highly criminalized, a more thorough approach is likely relevant. +Our adversaries have two attack vectors to compromise BIOS, firmware, hardware, or software: [remote attacks](/glossary#remote-attacks) (via the Internet) and [physical attacks](/glossary/#physical-attacks) (via physical access). Not everyone will need to apply all of the advice below. For example, if you're only using Tails for anonymous web browsing and written correspondence, some of this may be overkill. However, if you're using Tails to take responsibility for actions that are highly criminalized, a more thorough approach is likely relevant. #### To mitigate against physical attacks: @@ -120,13 +122,13 @@ Our adversaries have two attack vectors to compromise BIOS, firmware, hardware, #### To mitigate against remote attacks: -* **Anonymous Wi-Fi**. Using anonymous Wi-Fi is recommended not only to mitigate deanonymization, but also to mitigate remote hacking. It is best to never use the dedicated Tails laptop on your home Wi-Fi. This makes the laptop much less accessible to a remote attacker than a laptop that is constantly connected to your home Wi-Fi. If an attacker is targeting you, they need a point to start, and your home Wi-Fi is a pretty good place to start. +* **Wi-Fi that is unrelated to your identity**. We recommend using Wi-Fi that is unrelated to your identity (i.e. not at your home or work) not only to mitigate deanonymization, but also to mitigate remote hacking. It is best to never use the dedicated Tails laptop on your home Wi-Fi. This makes the laptop much less accessible to a remote attacker than a laptop that is constantly connected to your home Wi-Fi. If an attacker is targeting you, they need a point to start, and your home Wi-Fi is a pretty good place to start. * **Remove the hard drive**—it's easier than it sounds. If you buy the laptop, you can ask the store to do it and potentially save some money. If you search on youtube for "remove hard drive" for your specific laptop model, there will probably be an instructional video. Make sure you remove the laptop battery and unplug the power cord first. We remove the hard drive to completely eliminate the hard drive firmware, which has been known to be [compromised to install persistent malware](https://www.wired.com/2015/02/nsa-firmware-hacking/). 
A hard drive is part of the attack surface and is unnecessary on a live system like Tails that runs off a USB. * Consider **removing the Bluetooth interface, camera, and microphone** while you're at it, although this is more involved—you'll need the user manual for your laptop model. The camera can at least be "disabled" by putting a sticker over it. The microphone is often connected to the motherboard via a plug - in this case just unplug it. If this is not obvious, or if there is no connector because the cable is soldered directly to the motherboard, or if the connector is needed for other purposes, cut the microphone cable with a pair of pliers. The same method can be used to permanently disable the camera if you don't trust the sticker method. It is also possible to use Tails on a dedicated "offline" computer by removing the network card as well. Some laptops have switches on the case that can be used to disable the wireless interfaces, but for an "offline" computer it is preferable to actually remove the network card. -* **Replace the BIOS with [HEADS](https://osresearch.net/)**. A [video](https://invidious.sethforprivacy.com/watch?v=sNYsfUNegEA) demonstrates a remote attack on the BIOS firmware against a Tails user, allowing the security researcher to steal GPG keys and emails. Unfortunately, the BIOS cannot be removed like the hard drive. It is needed to turn on the laptop, so it must be replaced with [open-source](/glossary#open-source) firmware. This is an advanced process because it requires opening the computer and using special tools. Most anarchists will not be able to do this themselves, but hopefully there is a trusted person in your networks who can set it up for you. The project is called HEADS because it's the other side of Tails—where Tails secures software, HEADS secures firmware. It has a similar purpose to the [Verified Boot](https://www.privacyguides.org/en/os/android-overview/#verified-boot) found in GrapheneOS, which establishes a full chain of trust from the hardware. HEADS has [limited compatibility](https://osresearch.net/Prerequisites#supported-devices), so keep that in mind when buying your laptop if you plan to install it—we recommend the ThinkPad X230 because it's less involved to install than other models. The CPUs of this generation are capable of effectively remoting the [Intel Management Engine](https://en.wikipedia.org/wiki/Intel_Management_Engine#Assertions_that_ME_is_a_backdoor) when flashing HEADS, but this is not the case with later generations of CPUs on newer computers. [Coreboot](https://www.coreboot.org/users.html), the project on which HEADS is based, is compatible with a wider range of laptop models but has less security. HEADS can be configured to [verify the integrity of your Tails USB](https://osresearch.net/InstallingOS/#generic-os-installation), preventing it from booting if it has been tampered with. HEADS protects against physical and remote classes of attacks! +* **Replace the BIOS with [HEADS](https://osresearch.net/)**. A [video](https://invidious.sethforprivacy.com/watch?v=sNYsfUNegEA) demonstrates a remote attack on the BIOS firmware against a Tails user, allowing the security researcher to steal GPG keys and emails. Unfortunately, the BIOS cannot be removed like the hard drive. It is needed to turn on the laptop, so it must be replaced with [open-source](/glossary#open-source) firmware. This is an advanced process because it requires opening the computer and using special tools. 
Most anarchists will not be able to do this themselves, but hopefully there is a trusted person in your networks who can set it up for you. The project is called HEADS because it's the other side of Tails—where Tails secures software, HEADS secures firmware. It has a similar purpose to the [Verified Boot](https://www.privacyguides.org/en/os/android-overview/#verified-boot) found in GrapheneOS, which establishes a full chain of trust from the hardware. HEADS has [limited compatibility](https://osresearch.net/Prerequisites#supported-devices), so keep that in mind when buying your laptop if you plan to install it—we recommend the ThinkPad X230 because it's less involved to install than other models. The CPUs of this generation are capable of effectively removing the [Intel Management Engine](https://en.wikipedia.org/wiki/Intel_Management_Engine#Assertions_that_ME_is_a_backdoor) when flashing HEADS, but this is not the case with later generations of CPUs on newer computers. [Coreboot](https://www.coreboot.org/users.html), the project on which HEADS is based, is compatible with a wider range of laptop models but has less security. HEADS can be configured to [verify the integrity of your Tails USB](https://osresearch.net/InstallingOS/#generic-os-installation), preventing it from booting if it has been tampered with. HEADS protects against physical and remote classes of attacks! -* **Use USBs with secure firmware**, such as the [Kanguru FlashTrust](https://www.kanguru.com/products/kanguru-flashtrust-secure-firmware-usb-3-0-flash-drive), which has [retailers worldwide](https://www.kanguru.com/pages/where-to-buy), so that the USB will [stop working](https://www.kanguru.com/blogs/gurublog/15235873-prevent-badusb-usb-firmware-protection-from-kanguru) if the firmware is compromised. +* **Use USBs with secure firmware**, such as the [Kanguru FlashTrust](https://www.kanguru.com/products/kanguru-flashtrust-secure-firmware-usb-3-0-flash-drive), so that the USB will [stop working](https://www.kanguru.com/blogs/gurublog/15235873-prevent-badusb-usb-firmware-protection-from-kanguru) if the firmware is compromised. Kanguru has [retailers worldwide](https://www.kanguru.com/pages/where-to-buy), allowing you to buy them in person to avoid the risk of mail interception. ![](flashtrust.webp) @@ -134,39 +136,39 @@ Our adversaries have two attack vectors to compromise BIOS, firmware, hardware, # Using A Write-Protect Switch -> What's a *write-protect* switch? When you insert a normal USB into a computer, the computer does *read* and *write* operations with it, and a *write* operation can change the data. Some special USBs developed for malware analysis have a physical switch that can lock the USB, so that data can be *read* from it, but no new data can be *written* to it. +> What's a *write-protect* switch? When you insert a normal USB into a computer, the computer does *read* and *write* operations with it, and a *write* operation can change the data on the USB. Some special USBs developed for malware analysis have a physical switch that can lock the USB, so that data can be *read* from it, but no new data can be *written* to it. -If your Tails USB stick has a write-protect switch and secure firmware, such as [Kanguru FlashTrust](https://www.kanguru.com/products/kanguru-flashtrust-secure-firmware-usb-3-0-flash-drive), you are protected from compromising the USB firmware during a Tails session. If the switch is locked, you are also protected from compromising the Tails software. This is critical. 
Compromising your Tails USB stick would require being able to write to it. This means that even if a Tails session is infected with malware, Tails itself is immutable, so the compromise cannot "take root" and would not be present during your next Tails session. If you are unable to obtain such a USB, you have two options. +If your Tails USB stick has a write-protect switch and secure firmware, such as [Kanguru FlashTrust](https://www.kanguru.com/products/kanguru-flashtrust-secure-firmware-usb-3-0-flash-drive), you are protected from compromising the USB firmware during a Tails session. If the switch is locked, you are also protected from compromising the Tails software. This is critical. To compromise your Tails USB stick, an attacker would need to be able to write to it. This means that even if a Tails session is infected with malware, Tails itself is immutable, so the compromise cannot "take root" and would not carry over to subsequent Tails sessions. If you are unable to obtain such a USB, you have two options. -1) [Burn Tails to a new DVD-R/DVD+R](https://tails.boum.org/install/dvd/index.en.html) (write once) for each new version of Tails - it should not be labeled "DVD+RW" or "DVD+RAM" so that the DVD cannot be rewritten. +1) [Burn Tails to a new DVD-R/DVD+R](https://tails.boum.org/install/dvd/index.en.html) (write once) for each new version of Tails. Don't use DVDs labeled "DVD+RW" or "DVD+RAM", which can be rewritten. 2) Boot Tails with the `toram` option, which loads Tails completely into memory. Using the `toram` option depends on whether your Tails USB boots with [SYSLINUX or GRUB](https://tails.boum.org/doc/advanced_topics/boot_options/index.en.html). * For SYSLINUX, when the boot screen appears, press Tab, and type a space. Type `toram` and press Enter. * For GRUB, when the boot screen appears, press `e` and use the keyboard arrows to move to the end of the line that starts with `linux`. The line is probably wrapped and displayed on multiple lines, but it is a single configuration line. Type `toram` and press F10 or Ctrl+X. - * Once you are on the Tails desktop, you can eject the USB that Tails is on before you do anything else (whether it is connecting to the Internet or plugging in another USB). + * You can eject the Tails USB at the beginning of your session before you do anything else (whether it is connecting to the Internet or plugging in another USB) and then still use it like normal. -On a USB with a write-protect switch, you will not be able to make any changes to the Tails USB when the switch is enabled. If you could make changes, so could malware. While it would be ideal to leave the switch on all the time, we recommend two cases where the switch must be turned off: +On a USB with a write-protect switch, you will not be able to make any changes to the Tails USB when the switch is locked. If you can make changes, so can malware. While it would be ideal to leave the switch locked all the time, we recommend two cases where the switch must be unlocked: -1) **For a dedicated upgrade session.** If you need to upgrade Tails, you can do so in a dedicated session with the switch disabled - this is necessary because the upgrade needs to be written to the Tails USB. Once you are done, you should restart Tails with the switch enabled. +1) **For a dedicated upgrade session.** If you need to upgrade Tails, you can do so in a dedicated session with the switch unlocked - this is necessary because the upgrade needs to be written to the Tails USB. 
Once you are done, you should restart Tails with the switch locked. 2) **If you decide to use Persistent Storage, for occasional configuration sessions.** [Persistent Storage](/posts/tails/#optional-create-and-configure-persistent-storage) is a Tails feature that allows data to persist between sessions that would otherwise be amnesiac on the Tails USB itself. Because it requires writing to the Tails USB to persist data, it is generally impractical to use with a write-protect switch. However, it may be acceptable to disable the switch for occasional Persistent Storage configuration sessions, such as installing additional software. For example, in an 'unlocked' session, you enable additional software for persistence and install Scribus, selecting to install it every session. Then, in a 'locked' session, you actually use Scribus - none of the files you work on are saved to the Tails USB because it is 'locked'. The Persistent Storage feature is not possible with the `toram` boot or with a DVD. Where can we store personal data for use between Tails sessions if the write-protect switch prevents us from using Persistent Storage? We recommend storing personal data on a second LUKS USB. This "personal data" USB should not look identical to your Tails USB to avoid confusion. To create this separate USB, see [How to create an encrypted USB](/posts/tails/#how-to-create-an-encrypted-usb). If you are reading this from a country like the UK, where not providing encryption passwords can land you in jail, this second drive should be an HDD containing a [Veracrypt Hidden Volume](https://www.veracrypt.fr/en/Hidden%20Volume.html) (SSD and USB drives are [not suitable for Hidden Volumes](https://www.veracrypt.fr/en/Trim%20Operation.html)). ![](luks.png) -Compartmentalization is an approach that neatly separates different identities - in Tails session #1 you do activities related to moderating a website, and in Tails session #2 you do activities related to researching for an action. This approach also comes into play with your "personal data" USBs. If the files you save could be used to link your activities together, use a different "personal data" USB for each activity. For a "personal data" USB that stores very sensitive files (such as the text of a communique), it is best to reformat and then destroy the USB once you no longer need the files (see [Really delete data from a USB drive](/posts/tails/#really-delete-data-from-a-usb)). This is another reason to use a separate USB for any files that need to be saved - you don't accumulate the forensic history of all your files on your Tails Persistent Storage, and you can easily destroy USBs as needed. +Compartmentalization is an approach that neatly separates different identities by using separate Tails sessions for separate activities - in Tails session #1 you do activities related to moderating a website, and in Tails session #2 you do activities related to researching for an action. This approach also comes into play with your "personal data" USBs. If the files you save could be used to link your activities together, use a different "personal data" USB for each activity. For a "personal data" USB that stores very sensitive files (such as the text of a communique), it is best to reformat and then destroy the USB once you no longer need the files (see [Really delete data from a USB drive](/posts/tails/#really-delete-data-from-a-usb)). 
This is another reason to use a separate USB for any files that need to be saved - you don't accumulate the forensic history of all your files on your Tails Persistent Storage, and you can easily destroy USBs as needed. Finally, a note about email - if you already use Tails and encrypted email ([even though it is not very secure](/posts/e2ee/#pgp-email)), you may be familiar with Thunderbird's Persistent Storage feature. This feature allows you to store your Thunderbird email account details, as well as your inbox and PGP keys, on a Tails USB. With a "personal data" USB, Thunderbird won't automatically open your accounts. We recommend that you do one of the following: - Create new Thunderbird email accounts in each session. PGP keys can be stored on the separate 'personal data' USB like any other file, and imported when needed. This has the advantage that if law enforcement manages to bypass LUKS, they still don't have your inbox without knowing your email password. -- Keep the Thunderbird data folder on the "personal data" USB. After logging in to Thunderbird, use the Files browser (Applications → Accessories → Files) and enable the "Show hidden files" setting. Navigate to Home, then copy the folder called `.thunderbird` to your "personal data" USB. In each future session, after you have unlocked the 'personal data' USB and before you start Thunderbird, copy the `.thunderbird` folder to Home. +- Keep the Thunderbird data folder on the "personal data" USB. After logging in to Thunderbird, use the Files browser (Applications → Accessories → Files) and enable the "Show hidden files" setting. Navigate to Home, then copy the folder called `.thunderbird` to your "personal data" USB. In each future session, after you have unlocked the 'personal data' USB and before you start Thunderbird, copy the `.thunderbird` folder to Home (which is running in RAM, so doesn't require the write-protect switch to be unlocked). -Another reason to avoid using Persistent Storage features is that many of them persist user data to the Tails USB. If your Tails session is compromised, the data you access during that session can be used to tie your activities together. If there is user data on the Tails USB, such as an email inbox, compartmentalization of Tails sessions is no longer possible. To achieve compartmentalization with Persistent Storage enabled, you would need a dedicated Tails USB for each identity, and updating them all every month is a lot of work. +Another reason to avoid using Persistent Storage features is that many of them persist user data to the Tails USB. If your Tails session is compromised, the data you access during that session can be used to tie your activities together. If there is user data on the Tails USB, such as an email inbox, compartmentalization of Tails sessions is no longer possible. To achieve compartmentalization with Persistent Storage enabled, you would need a dedicated Tails USB for each identity, and updating them all every month would be a lot of work. # Encryption ## Passwords -[Encryption](/glossary#encryption) is a blessing—it's the only thing standing in the way of our adversary reading all our data, if it's used well. The first step in securing your encryption is to make sure that you use very good passwords—most passwords don't need to be memorized because they are stored in a password manager called KeePassXC, so they can be completely random. To learn how to use KeePassXC, see [Password Manager](/posts/tails/#password-manager-keepassxc). 
+[Encryption](/glossary#encryption) is a blessing—it's the only thing standing in the way of our adversaries reading all our data, if it's used well. The first step in securing your encryption is to make sure that you use very good passwords—most passwords don't need to be memorized because they are stored in a password manager called KeePassXC, so they can be completely random. To learn how to use KeePassXC, see [Password Manager](/posts/tails/#password-manager-keepassxc). >In the terminology used by KeePassXC, a [*password*](/glossary/#password) is a random sequence of characters (letters, numbers and other symbols), while a [*passphrase*](/glossary/#passphrase) is a random sequence of words. @@ -205,7 +207,7 @@ To use gocryptfs, you will need to use Terminal (the [command line](/glossary#co On your Personal Data LUKS USB, use the file manager to create two folders and name them `cipher` and `plain`. Right click in the white space of your file manager and select 'Open Terminal Here'. This will allow you to be in the correct location when Terminal opens, instead of having to know how to navigate using the `cd` command. -In Terminal, list the folders you have, and it should output the two you just created, among others: +In Terminal, use the `ls` command to list the folders you have, and it should output the two you just created, among others: `ls` @@ -231,7 +233,7 @@ Now plain is just an empty folder again. Before storing important files in the c PGP email is the most established form of encrypted communication on Tails in the anarchist space. Unfortunately, PGP does not have [forward secrecy](/glossary#forward-secrecy)—that is, a single secret (your private key) can decrypt all messages, rather than just a single message, which is the standard in encrypted messaging today. It is the opposite of "metadata protecting", and has [several other shortcomings](/posts/e2ee/#pgp-email). -For [synchronous](/glossary/#synchronous-communication) messaging—when you are both online at the same time—we recommend [Cwtch](/posts/e2ee/#cwtch) for encrypted communication on Tails. +For [synchronous](/glossary/#synchronous-communication) messaging—when you are both online at the same time—we recommend [Cwtch](/posts/e2ee/#cwtch) on Tails. For [asynchronous](/glossary/#asynchronous-communication) messaging—when you are not both online at the same time—we recommend [Element](/posts/e2ee/#element-matrix). Which server you use is also important; [Systemli](https://www.systemli.org/en/service/matrix/) and [Anarchy Planet](https://anarchyplanet.org/chat.html) are reputable hosts. @@ -247,19 +249,19 @@ Sometimes the goal of phishing is to deliver a "payload" that calls back to the ## Attachments -For untrusted attachments, you would ideally **sanitize all files sent to you before opening them** with a program like [Dangerzone](https://dangerzone.rocks/), which takes potentially dangerous PDFs, office documents, or images and converts them into safe PDFs. Unfortunately, Dangerzone is [not yet readily available in Tails](https://gitlab.tails.boum.org/tails/tails/-/issues/18135). An inferior option is to **open untrusted files in a dedicated ['offline mode'](https://tails.boum.org/doc/first_steps/welcome_screen/index.en.html#index3h2) session**, so that if they're malicious they can't call home, and you shut down immediately afterward, minimizing their chance of persistence. Tails prevents deanonymization through phishing by forcing all internet connections through the Tor network. 
However, this is still vulnerable to [0-day exploits](/glossary#zero-day-exploit) that nation-state actors have. For example, the FBI and Facebook worked together to develop a 0-day exploit against Tails [that deanonymized a user](https://www.vice.com/en/article/v7gd9b/facebook-helped-fbi-hack-child-predator-buster-hernandez) after he opened a video attachment from his home Wi-Fi. +For untrusted attachments, you would ideally **sanitize all files sent to you before opening them** with a program like [Dangerzone](https://dangerzone.rocks/), which takes potentially dangerous PDFs, office documents, or images and converts them into safe PDFs. Unfortunately, Dangerzone is [not yet readily available in Tails](https://gitlab.tails.boum.org/tails/tails/-/issues/18135). An inferior option is to **open untrusted files in a dedicated ['offline mode'](https://tails.boum.org/doc/first_steps/welcome_screen/index.en.html#index3h2) session**, so that if they're malicious they can't call home, and shut the session down immediately afterward, minimizing their chance of persistence. Tails prevents deanonymization through phishing by forcing all internet connections through the Tor network. However, this is still vulnerable to [0-day exploits](/glossary#zero-day-exploit) that nation-state actors have access to. For example, the FBI and Facebook worked together to develop a 0-day exploit against Tails [that deanonymized a user](https://www.vice.com/en/article/v7gd9b/facebook-helped-fbi-hack-child-predator-buster-hernandez) after he opened a video attachment from his home Wi-Fi. ## Links -With untrusted links, there are two things to protect: your anonymity and your information. Unless the adversary has a 0-day exploit on the Tor Browser or Tails, your anonymity should be protected **if you don't enter any identifying information into the website**. Your information can only be protected **by your behavior**—phishing awareness allows you to think critically about whether this could be a phishing attack and act accordingly. +With untrusted links, there are two things you must protect: your anonymity and your information. Unless the adversary has a 0-day exploit on the Tor Browser or Tails, your anonymity should be protected **if you don't enter any identifying information into the website**. Your information can only be protected **by your behavior**—phishing awareness allows you to think critically about whether this could be a phishing attack and act accordingly. -Investigate untrusted links before you click by **manually copying and pasting the address into your browser**—do not click through a hyperlink as the text can be used to mislead you about where you are going. **Never follow a shortened link** (e.g. a site like bit.ly that takes long web addresses and makes a short one) because it cannot be verified before redirection. [Unshorten.me](https://unshorten.me/) can reveal any shortened link. +Investigate untrusted links before you click by **manually copying and pasting the address into your browser**—do not click through a hyperlink as the text can be used to mislead you about where you are going. **Never follow a shortened link** (e.g. a site like bit.ly that takes long web addresses and makes a short one) because it cannot be verified before redirection. [Unshorten.me](https://unshorten.me/) can reveal shortened links. ![](duckduck.cleaned.png) -Also, **don’t follow links to domains you don't recognize**. 
When in doubt, search for the domain with the domain name in quotation marks using a privacy-preserving search engine (such as DuckDuckGo) to see if it’s a legitimate website. This isn’t a 100% solution, but it’s a good precaution to take. +Also, **don’t follow links to domains you don't recognize**. When in doubt, search for the domain with the domain name in quotation marks using a privacy-preserving search engine (such as DuckDuckGo) to see if it’s a legitimate website. This isn’t a surefire solution, but it’s a good precaution to take. -Finally, if you click on any link in an email and are asked to log in, be aware that this is a common endgame for phishing campaigns. **Do not do it**. Instead, manually go to the website of the service you are trying to sign in to and sign in there. That way, you’ll know you’re logging in to the right site because you’ve typed in the address for it, rather than having to trust the link in the email. For example, you might type your password at mailriseup.net instead of mail.riseup.net (this is called "typo-squatting"). +Finally, if you click on any link in an email and are asked to log in, be aware that this is a common endgame for phishing campaigns. **Do not do it**. Instead, manually go to the website of the service you are trying to access and sign in there. That way, you’ll know you’re logging in to the right site because you’ve typed in the address yourself, rather than having to trust the link in the email. For example, you might type your password at mailriseup.net instead of mail.riseup.net (this is called "typo-squatting"). Similarly, a "[homograph attack](https://www.theguardian.com/technology/2017/apr/19/phishing-url-trick-hackers)" substitutes Cyrillic letters for normal letters, which is even harder to visually recognize. You may want to open untrusted links in a dedicated Tails session without unlocked Persistent Storage or attaching "personal data" USBs. @@ -281,7 +283,7 @@ In September 2019, our collective published a short statement ("[Security warnin **The problem:** WLAN adapters send manufacturer-specific information with the data transfer. This information can enable a unique assignment despite a MAC address spoofed by the MAC changer. **This affects both internal WLAN adapters that are installed in your laptop in the form of a network card, as well as external WLAN adapters connected via USB**. The technical details are explained below. This fingerprinting is not conclusive forensic evidence. In combination with other evidence, however, it could result in a legally constructed 'unique' assignment: which computer was responsible for a certain Internet publication. -**A concrete example**: Due to previous police surveillance, a café in your city is suspected of being used for the publication of communiques. The café operator has allowed himself to be bribed or coerced by the cops into configuring his (commercially available) Internet router in such a way that it logs all of the data packets of all computers seeking contact. If the presence of various computers in this café was 'recorded' at the same time as an explosive Indymedia publication, this could be used for further investigations, despite the fact that the content of the data packets only shows that the data was anonymized using Tor. If your computer was logged (despite a spoofed MAC address) and if the fingerprint of your WLAN adapter turns up again elsewhere (by chance, or through targeted investigations - e.g. 
during a house raid) and can be proven as belonging to you, a prosecutor could try to use this as evidence of you submitting the Indymedia publication. +**A concrete example**: Due to previous police surveillance, a café in your city is suspected of being used for the publication of communiques. The café operator has allowed himself to be bribed or coerced by the cops into configuring his (commercially available) Internet router in such a way that it logs all of the data packets of all computers seeking contact. If the presence of various laptops in this café was 'recorded' at the same time as an explosive Indymedia publication, this could be used for further investigations, despite the fact that the content of the data packets only shows that the data was anonymized using Tor. If your computer was logged (despite a spoofed MAC address) and if the fingerprint of your WLAN adapter turns up again elsewhere (by chance, or through targeted investigations - e.g. during a house raid) and can be proven as belonging to you, a prosecutor could try to use this as evidence of you submitting the Indymedia publication. **Recommendation**: Until there is a (stable) solution for the "WLAN fingerprinting" problem, you should remove the internal WLAN adapter for particularly sensitive research and publications and use a (cheap) external USB WLAN adapter and **dispose of it after use**. We also advise you to use WLAN adapters that can be controlled by the Tails operating system without manufacturer-specific firmware (e.g. WLAN adapters with Qualcomm's Atheros chip that use the ath9k driver). @@ -353,4 +355,4 @@ Hacking is really a way of life. If you are truly committed to your cause, you s [^8]: Recognizable by the green fields in the column "Non-free firmware required." -[^9]: AnarSec note: This guide is not taking into account the possibility of physical surveillance. We would not recommend using a car, due to how it can easily be [tracked with a GPS device](https://www.csrc.link/threat-library/techniques/covert-surveillance-devices/location.html). +[^9]: AnarSec note: Keep in mind that a car can easily be [tracked with a GPS device](https://www.csrc.link/threat-library/techniques/covert-surveillance-devices/location.html).
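A practical footnote to the WLAN fingerprinting appendix above: before relying on an external adapter, you can check which driver the kernel bound to it and whether any firmware was loaded for it. The sketch below is an assumption-laden example for a Tails terminal - the interface name (`wlan1`), the presence of `lsusb`, and the use of `sudo` (which requires setting an administration password in the Welcome Screen) all depend on your setup.

```bash
# Identify the external adapter and the driver it uses (rough sketch).
ip link                                          # list network interfaces; find the adapter's name
lsusb                                            # vendor/product ID of the USB Wi-Fi adapter
readlink -f /sys/class/net/wlan1/device/driver   # driver in use, e.g. a path ending in "ath9k_htc"
sudo dmesg | grep -i firmware                    # messages about any firmware blobs loaded for it
```

If the driver is in the ath9k family, that matches the recommendation above; either way, the appendix's advice to treat the adapter as disposable for particularly sensitive publications still applies.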