Transcript: The Path Forward: Technology and Security with Meredith Whittaker

MS. ZAKRZEWSKI: Hello, and welcome to Washington Post Live. I'm Cat Zakrzewski, a tech policy reporter here at The Post, and I'm joined today by Meredith Whittaker, the president of the encrypted messaging app Signal and a noted critic of big tech.

Hey, Meredith. Welcome to Washington Post Live.

MS. WHITTAKER: So great to be with you here, Cat. Thanks.

MS. ZAKRZEWSKI: Well, thanks so much for joining us, and we have lots to discuss today. So I want to dive right in to an op-ed that ran in The New York Times in December that got a lot of attention online. Essentially, this op-ed argued that encrypted messaging apps are promoting, quote, "a pretty extreme conception of privacy." The op-ed argued that is dangerous because this extreme version of privacy could be exploited by criminals. What's your response to that op-ed?

MS. WHITTAKER: Look, this isn't a new argument. It's actually an argument that has been debunked in a number of ways over the past 20 years, as a sort of hotly contested debate around whether the human right to privacy should extend to digital communications has played out, and this has been called the "crypto wars." This has had other names, but there are actors on one side who believe that this human right should extend to any mode of communication or, you know, creation or what have you, digital or analog, and then there are powerful actors who would love to have access to the data that would be otherwise shielded if we did extend the right to privacy. So this isn't a new debate.

I think what is–what's interesting about the op-ed is more the op-ed metadata than the contents. The contents was, you know, honestly sort of a mess, and I would point you to folks like Jennifer Granick or Matt Green or Eva Galperin who, you know, offered really compelling takedowns on Twitter and elsewhere. There are, you know, folks who've been studying and engaging with these issues for many, many years who could have easily dissected the weakness of those arguments.

But I think what is unique is that the op-ed is coming at a time when attacks on privacy are accelerating. We see dangerous regulation in the UK and the EU that is set to potentially go into effect this year, and we see increasing calls for things like age verification and other anti-privacy measures.

We saw a bill passed in California recently that would require certain sites to verify the age of people who are visiting those websites, right, creating a pretty comprehensive surveillance apparatus that would be necessitated to enable that kind of validation.

So this op-ed is being placed at a time when there is a renewed attack on the human right of privacy, and it is–you know, it's notable that this op-ed, you know, in my view–and I did a thread breaking this down on Twitter for people who are interested in more–but I think this op-ed is weak arguments that are packaged in the imprimatur of authority–The New York Times always helps with that–that will almost certainly be used as a sort of hollow citation by those who, in the next year, are likely set to unveil anti-privacy regulation and political platforms. And I think we really need to watch out for that. What instrumented use will op-eds like this have–particularly because it was written by somebody who I'd never heard of. This isn't someone who is known in this space. The author owned a fireworks company before turning to kind of ethics consulting for tech. So there's not–you know, this isn't somebody who seems to have a lot of groundwork in the privacy debates or a deep understanding of the sort of nuances and particularities of this issue. Nevertheless, an op-ed that claims privacy is bad can be easily leveraged by opponents of privacy, so…

MS. ZAKRZEWSKI: And I wanted to go back to a point that you made at the beginning of your answer, that this is a debate that we've been having since the early 1990s around what the role of encryption is, and in your view, I mean, is there any way to preserve the benefits that we get from these privacy-enabling services while allowing government or law enforcement access to encrypted communications?

MS. WHITTAKER: I think there is a sort of–you know, again, this is magical thinking that you see kind of cropping up like mushrooms, because the desire remains, even when the facts are kind of inconvenient, right, that law enforcement, governments, large institutions, corporations would love access to this data, right? But you can't break privacy for some people and leave it intact for others. It's a sort of all-or-nothing proposition, and this is–you know, these are the facts at the sort of technical level. You can't have a back door in encryption that only the good guys–as, you know, sophomoric as the framing of good guys and bad guys is in a nuanced and adult world–you can't have encryption that has a back door for only the so-called "good guys" that isn't then exploited in various ways by the bad guys. That's then a vulnerability that's open for exploit by everyone else, and it ultimately breaks the privacy guarantees that all of those institutions also depend on, right?

Most people in government use Signal. Most people in law enforcement, you know–well, I don't–I don't actually know about law enforcement, but I was in government for a minute, and that is very clear, right? People understand the value of privacy when it applies to themselves. It's this magical thinking that somehow leads people to hope, to wish, to kind of falsely believe that there's a way to break privacy but only so that I can get to those other people's information. Somehow my information will remain protected.

MS. ZAKRZEWSKI: And I wanted to ask you too. You mentioned the global push for regulation, and we have seen a real willingness, especially in Europe and in California, to address online harms and really severe issues we're talking about here, like child exploitation online. What are some of the ways you think regulators could try to address some of those issues that we're seeing, particularly when it comes to online harm on social media, that wouldn't damage people's right to privacy?

MS. WHITTAKER: Yeah. I mean, there are–there are hosts of, you know, methods for addressing the real-world material harm to people, to children, to others, right? And this is–you know, I'm–you know, I think there are many methods. You saw–you know, you see law enforcement prevail in many cases that don't require kind of back doors, right? And I–you know, I sort of want to leave that–leave that speculation to the experts who are working on those issues in particular.

What I can say is that we're seeing child endangerment–and we all agree children should be protected, right? That's not a–that isn't a controversial–you know, that isn't a controversial stance. However, we're seeing the issue, the very emotionally charged issue of child endangerment, being used now, as it has been in the past, as a pretext for arguing that we need to implement, you know, fundamentally unworkable, you know, mass surveillance capabilities.

So you have the–you know, there's a bill in the UK called the UK Online Safety Bill, and people like the Open Rights Group–I would point people to the Open Rights Group and their work if you want to get a better understanding of this bill and its dangers. But this is, you know, kind of moving through the UK and includes requirements for private messaging to intercept and scan all content for kind of, you know, bad content, right? That content includes extremely ill-defined terms like "grooming," right? What is grooming?

I think we can kind of marinate on that question and look at a lot of the anti-LGBT campaigns and a lot of the, you know, kind of–you know, I think dangerous and exclusionary rhetoric that we're seeing in the U.S. and elsewhere and understand the–you know, this isn't even a slippery slope. This is sort of a, you know, cavernous drop that happens once we begin to implement this for a seemingly good cause, which is protecting children, and suddenly, these definitions expand and expand and expand and are kind of instrumented so you have mass surveillance of all communications in a world where people increasingly do–you know, they require these messaging services to go about their daily life, to participate in commerce, to participate in their workplaces.

And, you know, then there is just a–you know, there is a mass surveillance device in front, you know, between you and the person you want to message that is being controlled by an entity who can define and redefine the terms of what is acceptable and unacceptable content, right? That is a really dangerous precedent, and I think history shows how friendly such policies are to authoritarianism and oppression.

MS. ZAKRZEWSKI: And, on that point, I wanted to ask you a little bit about some of the obstacles that Signal has faced around the world. Recently, you led a push to make sure that Signal could still be available to protestors in Iran, despite government efforts to block the app. Can you tell us a little bit about how effective that push was? Do you have any data on how many people were able to access the service from Iran?

MS. WHITTAKER: Yeah. I mean, I think before I answer that question directly, I just want to highlight, like, that's a really good sort of, you know, almost a crude example that I think proves the point of how threatening to oppressors and authoritarians the ability of the public to communicate privately is, right, that there was a–you know, when social unrest and popular rebellion fomented and began manifesting on the streets, one of the first actions of the government was to shut down communications, because that was seen as endangering their authority and their modes of social control.

So, yeah, they blocked Signal. They blocked some other apps, and we put out a call to our community–which is, you know–love you all–a really robust and wonderful community out there–to run proxy servers, which basically would allow people in Iran to bypass that block and get to Signal through another means.

And we don't–you know, those servers weren't run by Signal, right? Those servers were run by community volunteers, but we did hear from folks that the traffic stats on those servers were, in many cases, fairly high. We heard reports from people in Iran that they were able to use Signal. We still had issues with registration blocking and other things, but, you know, we did see that that contributed to the ability of people in Iran to communicate privately and coordinate with one another at a time when it was particularly critical.

MS. ZAKRZEWSKI: And you mentioned that Signal community, and I think one of the things that's really extraordinary about Signal is how different it is from a lot of the for-profit corporations that make other messaging services that people are accustomed to. And I wanted to ask you, since you don't have advertising as part of your business model, will there ever become a point where people might have to pay for Signal?

MS. WHITTAKER: No. Signal will always be free to use. So, you know, one of the reasons for that is we don't want to implement a privacy tax, just per our mission and ethically. We don't think people who can afford it should be the only folks to avail themselves of privacy.

But, beyond that, there's a practical reason, right? If I'm the only one in my friend group who pays, then Signal doesn't work for me, because I can't talk to anyone, right? So we really do want to leave it open and kind of allow that network effect of encryption to take hold.

Now, that doesn't mean we may not have kind of paid storage or other kinds of add-ons in the future. We're exploring the possibility of some of those potential revenue streams, but nothing is on the roadmap right now. But Signal as a service will always be free to use, and that's pretty core to our mission.

MS. ZAKRZEWSKI: And I did want to ask you also about some of the recent product changes to Signal. There's been a lot of attention on the fact that Signal has added these ephemeral Stories that are somewhat similar to what we'd see on Instagram or Snapchat, and that was a bit of a controversial decision. I saw some pushback from some heavy Signal users, and so how do you think that rollout is going? And any stats on how many people are using it?

MS. WHITTAKER: Yeah. So I don't have stats to share, but I can say, like, the limited information we do collect–and emphasis on limited, because we don't collect the sort of user analytics that a surveillance app company would normally collect–but, you know, we do see Stories used. And we have heard a lot–a lot of good reviews from people who are just enjoying them and feeling sort of a newfound refreshing feeling of, like, sharing intimate contact–content and not–you know, and feeling like you, you know, trust the platform, that it's actually ephemeral, that this isn't, you know, smoke and mirrors that's going to lead to some crappy targeted ads or what have you. So we're seeing also positive–you know, positive feedback.

But I think what's interesting about that is that a lot of the pushback we were getting seems–it's, you know, we're U.S.-based–U.S. and Canada, but, you know, that's the time zone we work in as a remote team. And a lot of the folks who work for Signal are U.S.-based, and a lot of the people we hear from are folks who are speaking English. You know, they're on Reddit. They're on Twitter. A lot of them are very technically conversant. I would say some percentage of them are probably in the Bay Area. It's a very specific demographic, and the choice to add Stories, to prioritize that, was not made to, you know, prioritize a sort of Western, U.S.-based, you know, sort of tech-centric demographic. We were really thinking about populations of people in the majority world, particularly in South Asia and South America, where Stories have evolved as, like, just a daily sort of normative form of communication, much different–you know, much different than the use case in the U.S. or other places in the Western world. And we really didn't want to leave those folks behind, right? We'd heard repeatedly for a number of years, like, "I can't switch to Signal because Stories are the way that I communicate with my friends," and it's just a non-starter to not have these.

So I think what's interesting about that is the sort of–you know, the kind and quality of pushback that you get when you deprioritize a kind of, you know, sort of hegemonic, tech-centric, you know, always-has-available-bandwidth, you know, Western population and begin building for, you know, different populations in a heterogeneous global world.

MS. ZAKRZEWSKI: And it's interesting when you talk about that international focus. We've certainly seen that WhatsApp, the Meta-owned encrypted messaging app, is quite popular outside of the U.S., and I wanted to ask you, when it comes to privacy protections, how does Signal compare to WhatsApp? What are the biggest differences that consumers should be aware of?

MS. WHITTAKER: Yeah. That's a–that is a really good question. First, WhatsApp does use the Signal protocol, which is the state of the art, to encrypt the contents of WhatsApp messages, at least for kind of consumer WhatsApp. WhatsApp for Business doesn't do that, but that's a different use case. So that's great, and I want to commend them for a visionary choice there, because, you know, at the time they implemented that, you know, that wasn't the norm.

However, there are some major differences that really, really do matter and lead me to be secure in saying that WhatsApp cannot be considered truly private, and those differences are mostly in the sort of structural aspects of the business model and the attention to metadata.

Now, Signal goes out of its way not to collect metadata, and metadata, for those of you who don't spend your days mired in technical language, is just kind of, like, you know, the data about who's talking to whom. It's the sort of, you know, meta information about who, what, where, why, when that sits alongside the contents of the messaging or the other substance.

Anyway, so WhatsApp collects metadata. Signal collects almost no metadata. We have no information about you, and you can look at–you know, you can look at Signal.org/BigBrother. You can look at the vanishingly small amount of data we're able to provide when we are forced to comply with a subpoena.

In contrast, WhatsApp collects information about who's in a group. It collects information about who's talking to whom. It collects profile information. It collects photos, and it collects other kinds of really key information that is incredibly revealing and often more powerful than the sort of contents of the chat.

I think, moving from the metadata distinction, which is pretty big–and we need to focus on that–we also need to recognize that, look, WhatsApp is owned by Meta, which is a–you know, Facebook, right? And their business model is the surveillance business model, right? They're one of the–you know, the big players in the surveillance business model. They have huge amounts of extraordinarily intimate information that they collect and create via Facebook profiles, Instagram, et cetera. On top of that, they can purchase additional information from data brokers–and do buy this information–to create kind of staggeringly precise dossiers about us and our communities and who we talk to and what we might be interested in buying and et cetera, et cetera. So the metadata that WhatsApp has may be–you could argue it's limited in one term or another, although it's already fairly powerful. But it's owned by Meta, right? And that can then be joined with Facebook data and other data that's kind of at the heart of the business model of that company.

Now, am I saying that they do this routinely? No. I have no idea. Right? Like, this is a proprietary company. That information isn't made available. I'm saying that that's the engine of their business model. They have it available, and I would not trust them to kind of keep that promise if kind of some dreary earnings reports and sad, you know, revenue growth numbers kind of prompted the board to reexamine that, right?

And, you know, the difference there is, you know, Signal is a nonprofit. We don't have a board that is going to be pushing us to maximize growth and revenue, and we're governed by completely different incentives.

MS. ZAKRZEWSKI: And so what does that mean in the context of a government order or a subpoena? What information would WhatsApp be able to hand over to law enforcement?

MS. WHITTAKER: You know, I'd actually have to check this. I don't believe they publish that information with the transparency that Signal does, right? We–again, Signal.org/BigBrother gives you everything we're able to hand over, but WhatsApp–you know, WhatsApp collects my profile information, so my name, any other information I put in my bio. It collects photos. So my photo–you could match that to a Clearview database, Clearview being a giant facial recognition company. It collects information about who's talking to whom, who's in a group. So you can begin to map my network. If you have my name, you have someone I'm talking to, you begin to map who I'm talking to, when I'm talking to them. So that is already a pretty–like, you know, that's a powerful constellation of data points that could then be obviously handed over to law enforcement. It could also be used to kind of, you know, feed into a machine learning classifier to make predictions and determinations. And it could be mapped to Facebook data–once you have my name and profile, run that through, you know, DeepFace or something, find all my images on Facebook. You know, I'm sorry. I could extrapolate for a while around kind of the sort of dark potentials here, but I think the emphasis is on, like, this is–this is powerful data, and when joined with other data or other capabilities, it can do a lot. And, you know, it definitely breaks privacy.

MS. ZAKRZEWSKI: And how does that compare if Signal were to receive a government order? What information would you turn over?

MS. WHITTAKER: Almost nothing, and that's by design. Again, we spend a lot of time in rooms together talking about how to limit the collection of data, even though that collection is so often the norm. So we can provide a phone number. That is the phone number some account signed up with, not necessarily yours. It could be a Google Voice phone number. So we don't really know that. We don't have that information. We can provide information about when someone signed up. We can provide information about when they last accessed Signal.

But, you know, again, I want to really backtrack. When a phone number signed up, we have no information about, you know, who's whom. So we don't have profile information. We don't have information about who's in a group. We don't have information about who's talking to whom, and we can't match one to the other. And, again, you can find how little information we have been able to give if you go to Signal.org/BigBrother.

MS. ZAKRZEWSKI: Got it. And I wanted to ask you too, though, about Signal's own dependence on some of these large tech corporations. Do you view yourself as dependent on corporations like Apple, Amazon, and Google, and what privacy or security risks does that present?

MS. WHITTAKER: Well, I mean, I think "dependent" is a tough word here, right? Like, let's, like, zoom out and look at the political economy of the tech industry right now. The surveillance business model itself trends toward consolidation. So, you know, you have–you know, when you have the data, you have the infrastructure, you have the market reach, those things are self-reinforcing, and that's one of the reasons we're now–you know, we went from these, like, you know, sort of primary-colored startups in the early 2000s, you know, the Googles and, you know, et cetera, to an ecosystem that's dominated by a handful of kind of surveillance corporations. And those are the companies that have the infrastructure and the reach, right?

So every startup, every kind of, you know, broadly distributed app that is not owned by these corporations is, like, floating on their infrastructure at some level, right? It's licensing, you know, AWS servers or Azure or Google Cloud. They have to meet the standards of always-available, you know, immediate sort of performance that's now just what people expect tech to work like, that defines whether tech works or not. You have to have kind of, you know, global reach, and you have to have failover capability, and you have to have the kinds of infrastructure that would cost us hundreds of millions of dollars a year if we were going to try to bootstrap our own data centers and our own site reliability engineers and our own kind of failover capability and maintain and care for those infrastructures indefinitely.

So, yeah, there is a huge issue in tech with the concentration of power, and the concentration of power in the hands of the companies that own the hardware, the infrastructure, the data, and the access, right? And then, you know, we can talk about App Stores. We can talk about the Play Store. We can talk about all of these dependencies that everyone who operates in tech has to work around. So, you know, we could just go into a little cave and create kind of an ideologically pure proof of concept, right, that's, like, fully distributed and federated and et cetera, et cetera. But, like, no one is going to use that, right? So we'd feel really good and righteous about ourselves, and, like, four cryptographers use it to talk to each other, and then, you know, one of them has a kid and is like, "I can't–I don't have time," right? But, if we want to actually provide a service that allows human beings who are not, first and foremost, driven by an ideological commitment to actually have privacy, to actually communicate safely and privately and intimately, then we need to, like, work with the–you know, work with the landscape we're given.

And so we do the–you know, I would say you asked a question like, what are the privacy and security considerations of kind of, you know, licensing those infrastructures? We do a lot of work to make sure that, you know, the encryption means that neither Signal nor the owners of those infrastructures can see anything, but nonetheless, we do have to work with the–work with the world we're in. And we–you know, our decision is to prioritize the norms and expectations of the humans who are using us, not kind of, you know, ideological North Stars.

MS. ZAKRZEWSKI: And you mentioned the consolidation in the tech industry, and you worked at the Federal Trade Commission under Lina Khan. There's been a lot of excitement that Lina Khan would usher in a new era of tech accountability in that role. How do you think that's been going?

MS. WHITTAKER: Well, you know, it's an uphill battle. More power to her and all the good staffers and lawyers who are working with her. But it's not–you know, I would say the best thing that people who care about privacy, who care about the distribution of the power that's currently concentrated in the hands of a handful of large tech corporations, can do is hold their feet to the fire. Give them as much leverage as you can to be able to push things through an agency that itself is often recalcitrant and kind of constrained and contains folks who may not agree or might want to get a job as general counsel somewhere else afterward and don't want their name on some strong laws.

I would also–you know, I would also look to the lobbying dollars that the tech industry is spending for another indication of the sort of uphill battle.

But I think, you know, again, it's going to take–you know, it's going to take folks outside. It's going to take folks inside. It's going to take a concerted push to get meaningful regulation that cognizes the sort of reality of the tech industry over the finish line.

MS. ZAKRZEWSKI: And we have just about a minute left, and so I wanted to ask, what can people expect from Signal in 2023?

MS. WHITTAKER: Well, we're going to keep doing what we're doing. We're–you know, I have talked before about how we're working on usernames. We don't have a launch date, but we expect those this year, hopefully the first half of this year. But, again, they will be done when they're done, because we're going to do them right, and we're going to do everything we need to make sure that they're robust and ready to launch.

And, yeah, I don't have another, you know, part of the roadmap to preview here, but we do love hearing from folks who use Signal. There's a Reddit forum. There are community forums. There's Twitter. You can join the ranks of the helpful reply guys there, and we will–you know, we're going to continue with a sole focus on privacy, on building the best, most usable app we can that also protects privacy in a robust way. And we hope that we can lead by example, and some of the other companies will start coming along with us, not only encrypting message contents but encrypting metadata. And we do have specifications for how we do that, which are also available openly, and we're always happy to share if there are people who are interested in kind of meeting us at the high bar.

MS. ZAKRZEWSKI: Well, unfortunately, that is all the time that we have left for today. Thank you so much, Meredith, for joining us for such an informative and important discussion about the future of privacy.

MS. WHITTAKER: Wonderful. Thank you so much, Cat. Have a great day, and thank you all.

MS. ZAKRZEWSKI: And thank you all for joining us at home. If you want to sign up for future Washington Post Live discussions, you can find that information on our website at WashingtonPostLive.com.

I’m Cat Zakrzewski. Thanks.
