Ama Russell and Evamelo Oleita had never been to a protest before June. But as demonstrations against systemic racism and police brutality began to spread across the U.S. earlier this year, the two 17-year-olds from Michigan, both of whom are Black, were inspired to organize one of their own.
Seeking practical help, Oleita reached out to Michigan Liberation, a local civil rights group. The activist who replied told her to download the messaging app Signal. “They were saying that to be safe, they were using Signal now,” Oleita tells TIME. It turned out to be useful advice. “I think Signal became the most important tool for protesting for us,” she says.
Within a month, Oleita and Russell had arranged a nonviolent overnight occupation at a detention center on the outskirts of Detroit, in protest against a case in which a judge had put a 15-year-old Black schoolgirl in juvenile detention for failing to complete her schoolwork while on probation. The pair used Signal to discuss tactics, and to communicate with their teams marshaling protesters and liaising with the police.
“I don’t think anything we say is incriminating, but we definitely don’t trust the authorities,” says Russell. “We don’t want them to know where we are, so they can’t stop us at any point. On Signal, being able to communicate efficiently, and knowing that nothing is being tracked, definitely makes me feel very secure.”
Signal is an end-to-end encrypted messaging service, similar to WhatsApp or iMessage, but owned and operated by a non-profit foundation rather than a corporation, and with more wide-ranging security protections. One of the first things you see when you visit its website is a 2015 quote from the NSA whistleblower Edward Snowden: “I use Signal every day.” Now, it’s clear that increasing numbers of ordinary people are using it too.
“Any time there is some form of unrest or a contentious election, there seems to be an opportunity for us to build our audience,” says Brian Acton, the Signal Foundation’s co-founder and executive chairman, in an interview with TIME. “It’s a little bit bittersweet, because a lot of times our spikes come from bad events. It’s like, woohoo, we’re doing great — but the world’s on fire.”
Indeed, just as protests against systemic racism and police brutality intensified this year, downloads of Signal surged across the country. Downloads rose by 50% in the U.S. between March and August compared to the prior six months, according to data shared with TIME by the analytics firm App Annie, which tracks information from the Apple and Google app stores. In Hong Kong they rose by 1,000% over the same period, coinciding with Beijing’s imposition of a controversial national security law. (The Signal Foundation, the non-profit that runs the app, doesn’t share official download numbers for what it says are privacy reasons.)
“We’re seeing a lot more people attending their first actions or protests this year—and one of the first things I tell them to do is download Signal,” says Jacky Brooks, a Chicago-based activist who leads security and safety for Kairos, a group that trains people of color to use digital tools to organize for social change. “Signal and other end-to-end encryption technology have become vital tools in protecting organizers and activists.”
In June, Signal took its most explicitly activist stance yet, rolling out a new feature allowing users to blur people’s faces in photos of crowds. Days later, in a blog post titled “Encrypt your face,” the Signal Foundation announced it would begin distributing face masks to protesters, “to help support everyone self-organizing for change in the streets.” Asked if the chaos of 2020 has pushed Signal to become a more outwardly activist organization, Acton pauses. “I don’t know if I would say more,” he says. “I would say that right now it’s just congruent. It’s a continuation of our ongoing mission to protect privacy.”
What makes Signal different
Signal’s user base — somewhere in the tens of millions, according to app store data — is still a fraction of its main competitor WhatsApp’s, which has some 2 billion users and is owned by Facebook. But it is increasingly clear that among protesters, dissidents and investigative journalists, Signal is the new gold standard because of how little data it keeps about its users. At their core, both apps use cryptography to make sure that the messages, images and videos they carry can only be seen by the sender and the recipient — not governments, spies, or even the designers of the app itself. But on Signal, unlike on WhatsApp, your messages’ metadata are encrypted too, meaning that even authorities with a warrant cannot obtain your address book, see who you’re talking to and when, or read your messages.
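That distinction rests on end-to-end encryption: the two endpoints agree on a secret key that the server relaying their messages never learns. The sketch below illustrates the idea with a toy Diffie-Hellman key agreement and a keyed stream cipher, using only Python's standard library. It is not the Signal Protocol, and the parameters are deliberately far too small for real security; production systems use vetted primitives such as Curve25519 and AES.

```python
import hashlib
import hmac
import secrets

# Toy Diffie-Hellman parameters: a 64-bit prime modulus, far too small
# for real security, chosen only to keep the demonstration readable.
P = 2**64 - 59
G = 2

def keypair():
    """Generate a private exponent and the matching public value."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    # Both sides compute the same secret; anyone who only sees the
    # public values (e.g. the relay server) cannot derive it.
    s = pow(their_pub, my_priv, P)
    return hashlib.sha256(s.to_bytes(8, "big")).digest()

def xor_stream(key, data):
    # HMAC-based keystream cipher: XOR makes encryption and
    # decryption the same operation.
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hmac.new(key, i.to_bytes(8, "big"), hashlib.sha256).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

# Alice and Bob exchange only public values; an eavesdropper sees those
# plus the ciphertext, but cannot recover the key or the plaintext.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
key = shared_key(a_priv, b_pub)
assert key == shared_key(b_priv, a_pub)

ciphertext = xor_stream(key, b"march moved to 8pm")
plaintext = xor_stream(key, ciphertext)  # only key holders can do this
```

The crucial property for an app like Signal is that the key exists only on the two phones: the service in the middle forwards ciphertext it cannot read.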
“Historically, when an investigative journalist’s source is prosecuted in retaliation for something they have printed, prosecutors will go after metadata logs and call logs about who’s been calling whom,” says Harlo Holmes, the director of newsroom digital security at the Freedom of the Press Foundation.
WhatsApp states on its website that it does not store logs of who is messaging whom “in the ordinary course of providing our service.” Yet it does have the technical capacity to do so. In some cases, including when it believes doing so is necessary to keep users safe or to comply with legal process, the company states, “we may collect, use, preserve, and share user information,” including “information about how some users interact with others on our service.”
Signal, by contrast, cannot comply with law enforcement even if it wanted to. (It’s not clear that it does: in early June, Signal’s founder and CEO Moxie Marlinspike tweeted “ACAB” — All Cops Are Bastards — in response to allegations that police had stockpiled personal protective equipment amid the pandemic.) In 2016, a Virginia grand jury subpoenaed Signal for data about a user, but because it encrypts virtually all its metadata, the only information Signal was able to provide in response was the date and time the user downloaded the app, and when they had last used it. “Signal works very, very hard in order to protect their users by limiting the amount of metadata that is available in the event of a subpoena,” Holmes says.
The approach has not won Signal fans in the Justice Department, which is supporting a new bill that would require purveyors of encrypted software to insert “backdoors” to make it possible for authorities to access people’s messages. Opponents say the bill would undermine both democracy and the very principles that make the app so secure in the first place. Ironically, Signal is commonly used by senior Trump Administration officials and those in the intelligence services, who consider it one of the most secure options available, according to reporters in TIME’s Washington bureau.
Signal’s value system aligns neatly with the belief, popular in Silicon Valley’s early days, that encryption is the sole key to individual liberty in a world where authorities will use technology to further their inevitably authoritarian goals. Known as crypto-anarchism, this philosophy emerged in the late 1980s among libertarian computer scientists and influenced the thinking of many programmers, including Marlinspike. “Crypto-anarchists thought that the one thing you can rely on to guarantee freedom is basically physics, which in the mid 1990s finally allowed you to build systems that governments couldn’t monitor and couldn’t control,” says Jamie Bartlett, the author of The People vs Tech, referring to the mathematical rules that make good encryption so secure. “They were looking at the Internet that they loved but they could see where it was going. Governments would be using it to monitor people, businesses would be using it to collect data about people. And unless they made powerful encryption available to ordinary people, this would turn into a dystopian nightmare.”
As a young adult in the 1990s, Marlinspike — who declined to be interviewed for this story — spent his life on the fringes of society, teaching himself computer science, hacking into insecure servers, and illegally hitching rides on freight trains across the United States. A tall white man with dreadlocks, he always had a distrust for authority, but Snowden’s leaks appeared to crystallize his views. In a post published on his blog in June 2013, which is no longer accessible online, Marlinspike wrote about the danger these new surveillance capabilities posed when exercised by a state that you could not trust. “Police already abuse the immense power they have, but if everyone’s every action were being monitored … then punishment becomes purely selective,” he wrote. “Those in power will essentially have what they need to punish anyone they’d like, whenever they choose, as if there were no rules at all.” But, Marlinspike argued, this problem was not unsolvable. “It is possible to develop user-friendly technical solutions that would stymie this type of surveillance,” he wrote.
By the time he’d written that blog post, Marlinspike had already made an effort to build such a “user-friendly technical solution.” Called the TextSecure Protocol (later the Signal Protocol), it was a sort of recipe for strong end-to-end encryption that could ensure only the sender and recipient of a message were able to read its contents, and not authorities or bad actors wishing to pry. In 2010 Marlinspike launched two apps—one for text messaging and another for phone calls—based on the protocol. In 2014 he merged them, and Signal was born.
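Beyond keeping message contents between sender and recipient, the Signal Protocol is known for forward secrecy: keys are "ratcheted" forward with each message, so a device compromised later cannot expose past conversations. The sketch below is a simplified symmetric hash ratchet, not Signal's actual Double Ratchet, which also mixes in fresh Diffie-Hellman exchanges; it only illustrates the one-way chaining idea.

```python
import hashlib
import hmac

def kdf(chain_key: bytes, label: bytes) -> bytes:
    # HMAC as a one-way key-derivation step: newer keys cannot be
    # run backwards to recover older ones.
    return hmac.new(chain_key, label, hashlib.sha256).digest()

class HashRatchet:
    """Simplified symmetric ratchet: every message gets a fresh key,
    and the chain key is advanced (the old one is discarded)."""

    def __init__(self, shared_secret: bytes):
        self.chain_key = shared_secret

    def next_message_key(self) -> bytes:
        message_key = kdf(self.chain_key, b"message")
        self.chain_key = kdf(self.chain_key, b"chain")  # ratchet forward
        return message_key

# Sender and receiver start from the same agreed secret and stay in sync,
# deriving identical per-message keys without ever resending the secret.
alice = HashRatchet(b"secret-from-initial-key-agreement")
bob = HashRatchet(b"secret-from-initial-key-agreement")
k1a, k1b = alice.next_message_key(), bob.next_message_key()
k2a, k2b = alice.next_message_key(), bob.next_message_key()
assert k1a == k1b and k2a == k2b and k1a != k2a
```

Because each derivation is one-way, stealing today's chain key reveals nothing about the keys that protected yesterday's messages.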
The app was kept afloat thanks to nearly $3 million in funding from the Open Technology Fund, a Congress-funded nonprofit that finances projects aimed at countering censorship and surveillance. In keeping with security best practices, the Signal Protocol is open source, meaning that it’s publicly available for analysts around the world to audit and suggest improvements. (Signal’s other main competitor, Telegram, is not end-to-end encrypted by default, and security researchers have raised concerns about its encryption protocol, which unlike Signal’s is not open source.) But although by all accounts secure, Signal back in 2014 was hardly user-friendly. It had a relatively small user base, mostly made up of digital security geeks. It wasn’t the kind of influence Marlinspike wanted.
So Marlinspike sought out Acton, who had co-founded WhatsApp in 2009 along with Jan Koum. The pair had since grown it into the largest messaging app in the world, and in 2014 Facebook snapped it up for a record-setting $19 billion. Marlinspike’s views on privacy aligned with theirs (Koum had grown up under the ever-present surveillance of Soviet Ukraine) and in 2016, with Facebook’s blessing, they worked to integrate the Signal Protocol into WhatsApp, encrypting billions of conversations globally. It was a huge step toward Marlinspike’s dream of an Internet that rejected, rather than enabled, surveillance. “The big win is when a billion people are using WhatsApp and don’t even know it’s encrypted,” he told Wired magazine in 2016. “I think we’ve already won the future.”
But Acton, who was by now a billionaire thanks to the buyout, would soon get into an acrimonious dispute with Facebook’s executives. When he and Koum agreed to the sale in 2014, Acton scrawled a note to Koum stipulating the ways WhatsApp would remain separate from its new parent company: “No ads! No games! No gimmicks!” Even so, while Acton was still at the company in 2016, WhatsApp introduced new terms of service that forced users, if they wanted to keep using the app, to agree that their WhatsApp data could be accessed by Facebook. It was Facebook’s first step toward monetizing the app, which at the time was barely profitable.
Acton was growing alarmed at what he saw as Facebook’s plans to add advertisements and track even more user data. In Sept. 2017, he walked away from the company, leaving behind $850 million in Facebook stock that would have vested in the coming months had he stayed. (As of September 2020, Facebook still hasn’t inserted ads into the app.) “I’m at peace with that,” Acton says of his decision to leave. “I’m happier doing what I’m doing in this environment, and with the people that I’m working with,” he says.
Building a Foundation
Soon after quitting, Acton teamed up with Marlinspike once again. Each of them knew that while encrypting all messages sent via WhatsApp had been a great achievement, it wasn’t the end. They wanted to create an app that encrypted everything. So Acton poured $50 million of his Facebook fortune into setting up the Signal Foundation, a non-profit that could support the development of Signal as a direct rival to WhatsApp.
Acton’s millions allowed Signal to more than triple its staff, many of whom now focus on making the app more user-friendly. They recently added the ability to react to messages with emojis, for example, just in time to entice a new generation of protesters like Oleita and Russell. And unlike others who had approached Signal offering funding, Acton’s money came with no requirements to monetize the app by adding trackers that might compromise user privacy. “Signal the app is like the purest form of what Moxie and his team envisioned for the Signal Protocol,” Holmes says. “WhatsApp is the example of how that protocol can be placed into other like environments where the developers around that client have other goals in mind.”
Although it was meant to be an alternative business model to the one normally followed in Silicon Valley, Signal’s approach bears a striking similarity to the unprofitable startups that rely on billions of venture capital dollars to build themselves up into a position where they’re able to bring in revenue. “It hasn’t been forefront in our minds to focus on donations right now, primarily because we have a lot of money in the bank,” Acton says. “And secondarily, because we’ve also gotten additional large-ish donations from external donors. So that’s given us a pretty long runway where we can just focus on growth, and our ambition is to get a much larger population before doing more to solicit and engender donations.” (Signal declined to share any information about the identities of its major donors, other than Acton, with TIME.)
Still, one important difference is that this business model doesn’t rely on what the author Shoshana Zuboff calls Surveillance Capitalism: the blueprint by which tech companies offer free services in return for swaths of your personal data, which allow those companies to target personalized ads at you, lucratively. In 2018, as the Cambridge Analytica scandal was revealing new information about Facebook’s questionable history of sharing user data, Acton tweeted: “It is time. #deletefacebook.” He says he still doesn’t have a Facebook or Instagram account, mainly because of the way they target ads. “To me, the more standard monetization strategies of tracking users and tracking user activity, and targeting ads, that all generally feels like an exploitation of the user,” Acton says. “Marketing is a form of mind control. You’re affecting people’s decision-making capabilities and you’re affecting their choices. And that can have negative consequences.”
An even more sinister side effect of Surveillance Capitalism is the data trail it leaves behind, and the ways authorities can utilize it for their own type of surveillance. Marlinspike wrote in 2013 that instead of tapping into phone conversations, changes in the nature of the Internet meant that “[now,] the government more often just goes to the places where information has been accumulating on its own, such as email providers, search engines, social networks.”
It was a surveillance technique Marlinspike and Acton knew WhatsApp was still vulnerable to because of its unencrypted metadata, and one they both wanted to disrupt. It’s impossible to know how much user data WhatsApp alone provides to authorities, because Facebook only makes such data available for all its services combined — bundling WhatsApp together with Instagram and the Facebook platform itself. (WhatsApp’s director of communications, Carl Woog, declined to provide TIME with data relating to how often WhatsApp alone provides user data to authorities.) Still, those aggregate data show that in the second half of 2019, Facebook received more than 51,000 requests from U.S. authorities for data concerning more than 82,000 users, and produced “some data” in response to 88% of those requests. By contrast, Signal tells TIME it has received no requests from law enforcement for user data since the one from the Virginia grand jury in 2016. “I think most governments and lawyers know that we really don’t know anything,” a Signal spokesperson tells TIME. “So why bother?”
Another reason, of course, is that Signal has far, far fewer users than WhatsApp. But Acton also puts it down to Signal’s broader application of encryption. “They can do that type of stuff on WhatsApp because they have access to the sender, the receiver, the timestamp, you know of these messages,” Acton says. “We don’t have access to that on Signal. We don’t want to know who you are, what you’re doing on our system. And so we either don’t collect the information, don’t store the information, or if we have to, we encrypt it. And when we encrypt it, we encrypt it in a way that we’re unable to reverse it.”
Despite those inbuilt protections, Signal has still come under criticism from security researchers for what some have called a privacy flaw: the fact that when you download Signal for the first time, your contacts who also have the app installed get a notification. It’s an example of one tradeoff between growth and privacy where — despite its privacy-focused image — Signal has come down on the side of growth. After all, you’re more likely to use the app, and keep using it, if you know which of your friends are on there too. But the approach has been questioned by domestic violence support groups, who say it presents a possible privacy violation. “Tools such as Signal can be incredibly helpful when used strategically, but when the design creates an immediate sharing of information without the informed consent of the user, that can raise potentially harmful risks,” says Erica Olsen of the National Network to End Domestic Violence. “Survivors may be in a position where they are looking for a secure communication tool, but don’t want to share that fact with other people in their lives.” Signal says that it’s possible to block users to solve problems like this, but that it’s also working on a more long-term fix: making it possible for people to use the app without providing their phone numbers at all.
The encryption dilemma
Since the 1990s, encryption has faced threats from government agencies seeking to maintain (or strengthen) their surveillance powers in the face of increasingly secure code. But though it appeared these so-called “crypto wars” were won when strong encryption became widely accessible, Signal is now under threat from a new salvo in that battle. The Justice Department wants to amend Section 230 of the Communications Decency Act, which currently allows tech companies to avoid legal liability for the things users say on their platforms. The proposed change is in part a retaliation by President Trump against what he sees as social media platforms unfairly censoring conservatives, but it could threaten encrypted services too. The amendment would mean companies would have to “earn” Section 230’s protections by following a set of best practices that Signal says are “extraordinarily unlikely to allow end-to-end encryption.”
Even if that amendment doesn’t pass, the Justice Department is supporting a different bill that would force outfits like Signal to build “backdoors” into their software, to allow authorities with a warrant their own special key to decrypt suspects’ messages. “While strong encryption provides enormous benefits to society and is undoubtedly necessary for the security and privacy of Americans, end-to-end encryption technology is being abused by child predators, terrorists, drug traffickers, and even hackers to perpetrate their crimes and avoid detection,” said Attorney General William Barr on June 23. “Warrant-proof encryption allows these criminals to operate with impunity. This is dangerous and unacceptable.”
There’s no denying that encrypted apps are used for evil as well as good, says Jeff Wilbur, the senior director for online trust at the Internet Society, a nonprofit that campaigns for an open Internet. But, he says, the quirk of mathematics that guarantees security for end-to-end encryption’s everyday users—including vulnerable groups like marginalized minorities, protesters and victims of domestic abuse—is only so powerful because it works the same for all users. “The concept of only seeing one suspected criminal’s data, with a warrant, sounds great,” Wilbur says. “But the technical mechanism you’d have to build into the service to see one person’s data can potentially let you see any person’s data. It’s like having a master key. And what if a criminal or a nation state got a hold of that same master key? That’s the danger.”
Even in a world with perfect corporations and unimpeachable law enforcement, it would be a difficult tradeoff between privacy and the rule of law. Add distrust of authorities and Surveillance Capitalism into the mix, and you arrive at an even trickier calculation about where to draw the line. “The problem is, ordinary people rely on rules and laws to protect them,” says Bartlett, the author of The People vs Tech. “The amount of times people get convicted on the basis of the government being able to legally acquire communications that prove guilt — it’s absolutely crucial.”
But at the same time, governments have regularly proved themselves willing and able to abuse those powers. “I do blame the government for bringing it on themselves,” Bartlett says. “The revelations about what governments have been doing have obviously helped stimulate a new generation of encrypted messaging systems that people, rightly, would want. And it ends up causing the government a massive headache. And it’s their fault because they shouldn’t have been doing what they were doing.”
Still, despite the existential risk that a law undermining encryption would pose for Signal, Acton says he sees the possibility as just a “low medium” threat. “I’d be really surprised if the American public were to pass a law like this that stood the test of time,” he says. If that were to happen, he adds, Signal would try to find ways around the law — possibly including leaving the U.S. “We would continue to seek to own and operate our service. That might mean having to reincorporate somewhere.”
In the meantime, Signal is more focused on attracting new users. In August, the nonprofit rolled out a test version of its desktop app that would allow encrypted video calling — an attempt to move into the lucrative space opened up by the rise in home working due to the pandemic. I try to use it to conduct my interview with Acton, but the call fails to connect. When I get through on Google Hangouts instead, I see him scribbling notes at his desk. “Just this interaction alone gave me a couple ideas for improvements,” he says excitedly.
The episode reveals something about how Acton sees Signal’s priorities. “Our responsibility is first to maintain the highest level of privacy, and then the highest quality product experience,” he says. “Our attempt to connect on Signal desktop was — to me, that’s a fail. So it’s like, okay, we’ll go figure it out.”