The problem with relying on apps and services
In 2018 (I think?), I started a website called The New Oil. I didn’t want to start that website. I felt – and still feel, sometimes – that I didn’t know enough. I was (and in my opinion, still am) a nobody; I’m not a sysadmin or a former intelligence officer, and I’m not a hacker of any kind. I’m just a normal guy trying to parse the mountains of information out there and make sense of it all. But the problem I saw at the time, and the one I set out to address, was that nobody was explaining anything. There were lots of sites that offered great lists of privacy tools, but few of them offered context. Those sites – at the time – failed to explain the difference between ProtonMail and Tutanota, for example. They just listed both as good choices for encrypted email. They also didn’t explain things like “both participants have to be using the same service or else you won’t get the full benefit.” After a lot of hesitation, I finally decided to launch my site and simply be transparent that I was not an expert, but that I was doing my best. Fortunately, this seems to have been well received.
Just to be clear, what I’m about to say is not meant to blame those sites. I don’t think they started any kind of problem or perpetuated one. Those sites laid the groundwork for my own site and what I do today, and they did it very well. I would not be where I am – either as The New Oil or as a person who has taken back much of my own privacy – without their hard work. But I think those sites were, at the time, a reflection of the audience.
A common theme – particularly among newbies – is the idea that an app will fix your problems. Want privacy? Download Brave. Or use Signal. Switch to ProtonMail. Maybe get a reputable VPN. Boom! Now you’re private! I don’t know whether to call this a misconception, a misunderstanding, or simply ignorance. Of course, the more experienced members of the privacy community know this isn’t true (just as we know that one size does not fit all), but it’s something I see come up again and again. Recently someone reached out to me saying that a phone had been taken from them, and they wanted my advice. I got the distinct impression that they wanted me to recommend some kind of app. They kept listing the apps they already had, and even after I told them that in my opinion they were in a good spot, they repeated the question. It was as if they were waiting for me to say “oh, you should swap That app for This app, and don’t forget to download This Other app, too.” But in reality, this person didn’t need apps. They needed OPSEC – operational security.
In the privacy community, it’s common knowledge that metadata matters. We generally define metadata as “data about the data.” A classic example: someone calls the Suicide Hotline from the Golden Gate Bridge at 3 am. You have one number calling another (both of which can be looked up in records, even if you don’t attach names to them right away), the location of the call, and the time. Do you really need the content of that phone call to make an educated guess about what was discussed?
The biggest problem we often face in the privacy community is that we suggest apps without suggesting context. Signal will not solve your problems. Signal is a great app – they do their best not to log metadata. However, your phone still logs everything you do at the carrier level. Signal will protect the content of your calls, and Signal themselves will not log who you called or when, but consider this: Signal uses my phone’s address book. I only have about a dozen numbers in there (any non-Signal contacts are stored in the appropriate MySudo identity for compartmentalization). One of them is my partner, who I live with. This can easily be inferred from the fact that both of our phones sit in the same location for about 12 hours per day – usually overnight. So if my carrier sees that I open Signal from the grocery store, and they can see all the contacts on my device, the natural assumption is that I’m calling my partner. Sure, I might be calling my mom about something unrelated or taking a call from my brother, but the partner is a safe bet. If they want to go the extra step, they can easily check whether my partner’s phone (which they’ve identified because of our living situation and device patterns) also activated Signal at that time.
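This kind of inference needs no message content at all; it falls out of simple set operations on metadata. Here is a toy Python sketch of the idea. Everything in it is fabricated for illustration – the record layout, the thresholds, and the function names are my own, not any real carrier’s schema – but it shows how overnight co-location can suggest who lives together, and how same-hour app activity can suggest who is on the other end of a call:

```python
from collections import defaultdict

# Toy carrier-style metadata: (device_id, hour_index, cell_tower_id, app_seen).
# All records are fabricated for illustration; real carrier data is far richer.
RECORDS = [
    # Devices "a" and "b" share the "home" tower every night (hours 0-5, 7 nights).
    *[("a", day * 24 + h, "home", None) for day in range(7) for h in range(6)],
    *[("b", day * 24 + h, "home", None) for day in range(7) for h in range(6)],
    # Device "c" spends its nights elsewhere.
    *[("c", day * 24 + h, "other", None) for day in range(7) for h in range(6)],
    # During the day, "a" shows Signal traffic from a grocery store...
    ("a", 10, "grocery", "signal"),
    # ...and "b" shows Signal traffic in the same hour, while "c" does not.
    ("b", 10, "home", "signal"),
    ("c", 10, "other", None),
]

def overnight_cotenants(records, device, min_shared_hours=30):
    """Guess who lives with `device`: count the night hours (00:00-05:59)
    other devices spend on the same tower, and keep those above a threshold."""
    night = lambda hour: hour % 24 < 6
    mine = {(t, tower) for d, t, tower, _ in records if d == device and night(t)}
    shared = defaultdict(int)
    for d, t, tower, _ in records:
        if d != device and night(t) and (t, tower) in mine:
            shared[d] += 1
    return {d for d, n in shared.items() if n >= min_shared_hours}

def likely_callee(records, device, hour, app="signal"):
    """When `device` uses `app` in a given hour, other devices that also used
    `app` in that same hour are plausible other ends of the conversation."""
    return {d for d, t, _, a in records if d != device and t == hour and a == app}

print(overnight_cotenants(RECORDS, "a"))  # -> {'b'}
print(likely_callee(RECORDS, "a", 10))    # -> {'b'}
```

Note that neither function ever looks at message or call content: tower IDs, timestamps, and a per-app traffic flag are enough to single out one likely contact.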
Does this matter? It depends. I don’t care if my carrier knows I called my partner to let her know that the grocery store is out of mango yogurt and to ask which flavor she wants instead. But it is important that I’m aware of that risk and know how to mitigate it. For example, there are certain sensitive searches that I simply will not do on my phone. I don’t mind using my phone to check where the nearest fast food place is for lunch. I will not use my phone to – for example – check the price of a new coffee mug from Wikileaks. That can wait until I’m on a desktop device, where I have more control and privacy through things like virtual machines.
This is where the topic starts venturing into other territory, like how to threat model. It also raises serious questions about where to draw the line in your own defensive posture and why you even bother. “Why do you use Signal at all if you don’t care that somebody knows you called your partner about yogurt?” There are lots of reasons I’m not going to defend here, because they would be off topic. The point is that we need to make sure we’re not expecting services to save us. Email, as some people know, was never meant to be secure. No matter how perfect you make the encryption, or where you base the servers, or this, that, and the other, it will never be a perfect solution for the most sensitive information, because it was never meant to be.

There’s nothing wrong with getting excited about new solutions. Privacy and security are a constant arms race: one side develops a new encryption scheme, the other side cracks it. The first side responds with onion routing, and the other side develops timing-correlation attacks. (These are, of course, gross oversimplifications.) It’s good to be aware of the new tools coming out and the problems they aim to solve, but it’s just as important to know where those tools fall short. Recently I did an interview with Session, and one of the questions I asked was “where does Session fall short in ways that you don’t plan to fix?” No tool is built to do everything, and any tool that tries will inevitably suck. It’s important to know what Session doesn’t do so you can be aware of any cracks in your digital armor. The same goes for any app or service. We shouldn’t rely on the service itself; rather, we should understand what problem we’re aiming to solve, how that particular service can be used to solve it, and what it doesn’t solve. Of course, it’s not your job to educate everyone about every facet of every app and every possible concept in privacy and security.
However, as a community, let’s keep an eye out for situations where people expect a simple download to fix their problems, and dispel that mentality as much as we can.