Disclaimer: I am not a security professional, and this is not security advice. I’m just an old school Direct Action activist and indymediatista, with some information to add to the mix. Add a grain of salt and stir well.[1]
Given everything that’s happening around the world recently, some folks have reason to worry they might be facing targeted spying, by people with decidedly malicious intentions[2]. To keep themselves and their communities safe, they need to be able to have reliably private conversations right now. If that’s you, I strongly suggest you don’t do that with the “private message” features tacked onto most social media apps, including fediverse ones like Mastodon (I’ll come back to why in a bit).
Instead, I encourage you to check out these decentralised chat projects, where all private messages are encrypted using E2EE (End-to-End Encryption). All three of these projects have apps for all major OS.
Delta.Chat: private text messaging, media-sharing, private group chats, etc, using your email account (encrypted using Autocrypt, based on OpenPGP).
Snikket: All of the above, plus public groups, and voice/video calling, using the Jabber protocols (Modern XMPP, encrypted using OMEMO, based on the Signal protocol).
Element: All of the above using the Matrix protocol (encrypted using Olm and Megolm, also based on the Signal protocol), plus groups and their messages are stored on every participant’s server, not just the one it was started on.
I’m pretty confident, all else being equal, that any of these options give your messages better protection than sending them in any fediverse app. That goes even for Hubzilla and family, although they’re more reliably private than most. But there may be other options that are better for your particular situation (or “threat model” as the boffins say). It’s worth thinking this through with other people, so you can share the load and check each other’s work.
Each of the apps offered by the projects listed above, like all software, has its pros and cons.
For example, Delta Chat is the simplest to set up and use, because you’ve already got an email address, right? But its out-of-the-box features are limited compared to most modern chat apps. There are also some concerns about the robustness of using Autocrypt for encryption, because of the stoush between proponents of a refreshed OpenPGP, and partisans of LibrePGP.
Delta Chat can send messages to anyone with an email address, but they’re sent in plaintext if the recipient isn’t using Delta Chat on their end. There’s a risk this could lead to people accidentally sending sensitive messages unencrypted, which is the main reason Signal developers gave for removing the option to send plaintext SMS with their apps. Pros and cons.
If you’re willing to set up a dedicated chat account (or already have one), and so are the people you need to message privately, then Snikket and Element could be for you. Element doesn’t handle one-to-one chats as smoothly as Snikket, but has a lot more options for group chats. Pros and cons.
Keep in mind that whatever apps you choose to use, how secure they are depends to some degree on the security of the underlying OS you run them on, and in turn on the device that OS is running on. For example, there’s not much point in military-grade encryption if someone can turn on desktop sharing on your device without your knowledge or consent, digitally peeking over your shoulder while you’re reading your encrypted messages.
Similarly, even the most secure app depends on being used the right way. No encryption, no matter how robust and thoroughly audited, can protect you from people actually peeking over your shoulder to read your private messages. So your privacy-protection practices need to be holistic to be effective.
Indeed, I suspect the most common way cops and spooks intercept private messages isn’t breaking the security of apps. It’s not even rubber hose cryptography. They just get an informant into the group who are sending the messages. The main purpose of making reliably-private messaging easily available, and commonly used, is to help you avoid coming to their attention in the first place. So ease of use, and availability on a wide range of common devices, are just as important in real world use as technical robustness.
As I said at the outset, I am not a security professional. I’ve linked to the best information I can find so far about the security considerations of each app, and I leave you to do your own due diligence. Again, grain of salt! But publishing this would be a waste of your time and mine if I didn’t have reason to believe you’d be more informed after reading it. So what background am I speaking from here?
I’ve been involved in protest politics and direct action activism since the 1990s, on and off. In the 2000s I was writing guides for activists on how to encrypt email with GnuPG, and helping to run activist infrastructure projects, including Aotearoa Indymedia, where I published that guide. I’ve studied security culture as a whole (“operational security” or “OpSec” in spook jargon), not just digital security. Not out of hobbyist curiosity, but because it was necessary to keep people I cared about safe.
I know the limits of my security knowledge and choose what I say about it very carefully. But I’m human, and I will sometimes make mistakes. Check my work, and feel free to send detailed and referenced feedback so I can improve it.
Observant readers will notice that although I seem to approve of using the Signal Protocol for message encryption, the Signal service isn’t on my list of suggestions. Predictably, I was challenged on this when I posted the first rough versions of this rant on the fediverse, and then on the SocialHub developers portal to make it more discoverable. So why don’t I recommend Signal? To explain this, I need to talk about a couple of my basic security assumptions.
When an online service is proprietary, no security promise they make can be independently audited. As it happens, being centralised is one of the things that makes a service proprietary, because no one can independently confirm what the server is doing. Even if they publish code and claim it’s what they use in production.
Terms of Service that allow connections only from apps supplied by the vendor make it a proprietary service too, even if they publish full source code for them. At least in the absence of fully reproducible builds, where it can be independently confirmed that the apps they distribute are compiled from the source code they publish.
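To make the reproducible-builds idea concrete, here’s a minimal sketch (the file names are hypothetical, and real-world verification involves rebuilding in a pinned environment first) of the final step: checking that the app you compiled yourself is bit-identical to the one the vendor distributes.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def same_artifact(local_build: str, vendor_build: str) -> bool:
    """True only if the two binaries are byte-for-byte identical --
    the test a fully reproducible build is supposed to pass."""
    return sha256_of(local_build) == sha256_of(vendor_build)
```

If this check passes, anyone can confirm the distributed app really was compiled from the published source; if the build isn’t reproducible, there’s no way to tell.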
If a service ticks both of these boxes, it’s even harder to independently confirm that it keeps its promises, without breaking the ToS in some way. If this comes under “bypassing any kind of digital lock”, it’s illegal to check that they keep their promises, and anyone trying to do it risks imprisonment.
Signal ticks both boxes. It’s centralised and the ToS forbids connections from anyone not using the apps they supply, which means that, like WhatsApp, their security promises are expected to be taken on trust. Which makes their promise of E2EE totally worthless, because the whole point of encryption being “End-To-End” is that you don’t have to trust the service or its operators for your messages to be secure.
Even if its promises weren’t worthless (from a security point of view), Signal is a SPoF (Single Point of Failure) and a juicy target for state-level actors. Since there’s no way to confirm it’s not a honeypot, it’s safest to presume it is. So with all due respect, anyone holding up Signal as the gold standard for secure communication (and this is surprisingly common) is not seeing the forest for the trees.
I’m planning a piece where I deep dive into all my concerns about Signal, what they’re based on, and how they could be tested (without risking jail time). In the meantime, if you’re curious, have a fossick around in the notes about Signal on my old CoActivate wiki. But keep in mind they haven’t been updated since 2019, which is why I’m planning an update. Watch this space, but as always, treat my future blog promises with as much scepticism as I hope you’ll now apply to Signal.
Anyway, since I’m hesitant to recommend Signal for anything you wouldn’t say during a private conversation in a crowded cafe, it’s not surprising that I recommend using private messages on the fediverse with the same level of caution. Despite being confident in the respect for privacy among the developers of fediverse apps (Free Code ones, not Meta’s Threats).
To explain why I say this, we need to talk about why most new social software starts as a public-only network. The original OStatus fediverse was, and BlueSky’s emerging ATmosphere network still is. For now, even data about who’s blocking who on BlueSky is public (which it isn’t on the fediverse).
Ensuring privacy in networked software is hard. It’s much easier to build software for public discussions than for private messages. For a start, there’s a whole lot less to build, especially when your software is intended to work in a decentralised network. But more importantly, there’s a whole galaxy of security and Trust & Safety headaches you don’t have to deal with.
Posting to a web forum is traditionally understood to be making a public statement[3]. It’s fine to quote it elsewhere, link to it, index it for search, and so on.
A system for private messages between forum members can be as simple as just forwarding to their email address. Which is pretty much the only private data you need to secure. Your interface needs to make it crystal clear when it’s posting publicly vs. privately, and your back-end needs to reliably keep the promises it makes. But there’s a hard limit to how badly your forum software can accidentally violate people’s privacy.
Designing for federating public comments between forums is a bit trickier than for traditional, centralised ones. But in theory, you’re only sending public comments - intended to be shared with the world - from one server to another. Unless the part of your software that forwards private messages by email starts posting them over the federation protocol you’re using (which would be a critical bug!), there aren’t too many other ways it can go wrong privacy-wise.
If you start hosting private messages on the server itself, suddenly there’s a lot of ways it can go sideways. You need to make sure those messages are securely stored. You need to make sure the sender and intended recipients can access them, and only them. You need to balance making them easy to access with a range of browser apps, OS, and devices, with the need to keep them opaque to unintended recipients.
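To make that access-control burden concrete, here’s a minimal sketch (the names and structure are my own invention, not taken from any fediverse codebase) of the kind of check a server must get right every single time a stored private message is fetched:

```python
from dataclasses import dataclass

@dataclass
class PrivateMessage:
    sender: str
    recipients: set   # usernames allowed to read this message
    body: str

def fetch_message(msg: PrivateMessage, requesting_user: str) -> str:
    """Only the sender and the intended recipients may read the body.
    Everyone else -- other local users, other servers, search
    indexers -- gets an error, never the message itself."""
    if requesting_user == msg.sender or requesting_user in msg.recipients:
        return msg.body
    raise PermissionError(f"{requesting_user} may not read this message")
```

Every code path that touches stored messages (web interface, API, search, backups) has to go through a check like this, which is exactly the attack surface that public-only software never has to build or defend.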
This gets even trickier if you start federating private messages with other servers. If you encrypt them, especially using E2EE (End-to-End Encryption), there are fewer ways those messages can end up in the wrong hands, at least in theory. But it adds a bunch of complicated code to write and maintain, and a whole new range of ways it can all go wrong[4].
In summary, it’s great that existing fediverse software supports some level of quiet interaction. As long as your server admins are trustworthy and competent, posts using features like the ‘Only People Mentioned’ scope on Mastodon, or even the encrypted private messages in Hubzilla, are probably private. Probably.
But building a privacy-respecting replacement for something like FarceBook, so people can safely share the most intimate details of their lives with specific people, and only them? That’s a huge project. It’s been tried in Friendica, Hubzilla and other apps in that branch of the fediverse, but their encryption is unaudited[5], and they’re all hamstrung by confusing interfaces. The underlying tech is impressive, but when people are posting private stuff, the controls need to be even easier to understand than for public posting. Especially when you can do both in the same app.
Building a community-controlled FarceBook replacement for the fediverse is possible. But it can’t be done properly just by bolting stuff onto existing fediverse software, much as I wish it could. It needs to be approached as a first-principles design project, addressing both security and usability, and both protocol extensions and easy-to-use interfaces[6].
The design approach of projects like Bonfire Social is a big step in the right direction. As is the SocialCG’s work on using MLS with ActivityPub, so we can E2EE private messages in the fediverse. People from a few different projects are working together on ways of making federated private groups work independently of the server where they’re started. There’s a lot of great work being done, and lots of older work we can take inspiration and techniques from.
But if we want to do federated privacy right in the verse, we need to take our time.
Image:
"Nano Rhino whispers secrets" by mpclemens, licensed under CC BY 2.0.
1. I intend to update this piece from time to time, as new information comes to light, to keep it as accurate and useful as possible.
2. Rather than just the passive mass surveillance we’re all under all the time, so DataFarmers can make money by “nudging” our spending and voting decisions.
3. Some people now seem to believe that public is not public, but that’s a grizzle for another day.
4. See the chronic “Unable to Decrypt” errors that have plagued encrypted chats in the Matrix network, which we’re promised will be fixed by Matrix 2.0.
5. As far as I know; I’d be keen to learn otherwise.
6. Again, keep an eye out for a post dedicated to this topic, updating one I posted a few years back on CoActivate.