Welcome to Remarkable People. We’re on a mission to make you remarkable. Helping me in this episode is Cindy Cohn.

Cindy Cohn is no ordinary legal mind. As executive director of the Electronic Frontier Foundation, she has spent decades fighting the expansion of the surveillance state, and her courtroom victories have helped preserve the internet's foundational freedoms, most notably establishing that computer code is speech protected by the First Amendment. Under her leadership, EFF has grown into a 125-person organization funded by member donations rather than corporate money, which keeps it independent of the tech giants it often challenges. EFF's work spans far beyond traditional legal practice, combining impact litigation, technology development (from privacy tools to training activists worldwide), and grassroots activism to defend user rights against both government overreach and corporate exploitation.

In this conversation, we explore how supposed "child safety" legislation can become censorship machinery, and why EFF's approach to digital rights requires both protecting individuals and reforming systems. Cindy dismantles the naive "nothing to hide" mentality by showing how quickly shifting political winds can make previously innocent communications dangerous. She explains the UK's encryption demands on Apple, why even privacy experts like herself must constantly adapt their security practices, and how tech companies drifted from their founding ideals toward surveillance capitalism, which is exactly why EFF refuses Big Tech funding. She also offers practical advice, from Signal settings to border-crossing precautions, drawing on EFF's Surveillance Self-Defense resources.


Please enjoy this remarkable episode, Who Defends Your Digital Rights? Meet EFF’s Cindy Cohn.

If you enjoyed this episode of the Remarkable People podcast, please leave a rating, write a review, and subscribe. Thank you!


Transcript of Guy Kawasaki's Remarkable People podcast episode: Who Defends Your Digital Rights? Meet EFF's Cindy Cohn.

Guy Kawasaki:
Good morning everyone, it's Guy Kawasaki, this is the Remarkable People Podcast, and we're on this mission to make you remarkable, so we go all over the world looking for remarkable people, and we found one really close to us in San Francisco, California. Her name is Cindy Cohn, and she's the executive director of the Electronic Frontier Foundation. Wow. And in my humble opinion, that's probably the leading defender of civil liberties in the digital world.
She led this great case, Bernstein v. Department of Justice, which established that software code is protected speech under the First Amendment. And the National Law Journal named her one of the one hundred most influential lawyers in America.
And I love this quote about Cindy, man, I hope somebody says something like this about me someday, the quote is, "If Big Brother is watching, he better watch out for Cindy Cohn." Oh my God, I got to go back in your history, I noticed something doing research about you. So, you got your law degree in 1989 or 1990, right?

Cindy Cohn:
Yes.

Guy Kawasaki:
And then, in a mere four years, you were lead counsel for Bernstein v. Department of Justice. How did you get to that position in a mere four years?

Cindy Cohn:
Well, I kind of fell into it. To be honest, you got to remember, in 1989, 1990 to 1994, there was no World Wide Web; technology was being done mainly by people with high technical skills out of universities and research institutions, and I happened to meet some of them. And one of them was EFF's founder, John Gilmore.
And I literally met him at a party in Haight-Ashbury and we became friends, and for a while I dated one of his friends, and he was putting EFF together, the Electronic Frontier Foundation. A couple years later, he called me up and he said, "Do you know how to do a lawsuit?" And I, just a couple years out of law school, really not the right person for this, said, "Sure, I know how to do a lawsuit."
He said, "Good, because we've got this guy and he wrote a computer program and he wants to publish it on the internet, and he's been told that if he does, he could go to jail as an arms dealer." And I said, "What does it do? Does it blow things up?" And he said, "No, it keeps things secret."
And I said, "Well, that sounds like a problem and a First Amendment problem at that," and he said, "I think so too, will you take the case?" And I said, yes. I had never been online, I didn't really know very much about what these guys were doing, they were my friends, but I wasn't even, as you'd pointed out, I'm kind of a baby lawyer. I'd never done a constitutional case of that magnitude, but I got lucky.
And between John and some of the other early internet people, and then some very kind cryptographers and computer science professors, Hal Abelson at MIT, and a bunch of others, they actually taught me enough about how the internet works, how coding works, and how cryptography works that we were able to mount this challenge and do so successfully.
But to me, I just was in the right place at the right time, and had the good fortune to think my friends would think I was cool if I did this lawsuit. And then, the patience and support of a lot of people to be able to sit at the, I always call it driving the big truck, to be able to sit in the driver's seat of this big truck that we drove through the government's cryptography regulations.

Guy Kawasaki:
So, Cindy, are you basically saying to me that in one of the most pivotal cases in intellectual property, you were faking it until you made it?

Cindy Cohn:
Totally. Absolutely. Now, I had the good sense and the luck to have a lot of people around me, and to be able to pull in people. By the time we got deep enough in the case, we had a guy named Bob Corn-Revere join our case, he'd already argued a First Amendment case in the Supreme Court, and he saw what we were doing and came in, and was like, let me help ground you in this kind of long history of the Constitution.

Guy Kawasaki:
Oh wow.

Cindy Cohn:
It's more like we were this rolling tumbleweed that started with just me, but as we went along, we picked up people with expertise and support, so that by the time I was standing in front of the Ninth Circuit Court of Appeals, arguing this case, I was standing on the shoulders of lots and lots of giants who had thrown in to help us. But yeah, the very start of it was quite literally, I thought my friend John would think I was cool if I said yes.

Guy Kawasaki:
Wow. Listen, when I saw that four years after your law degree, you were leading this case, I did a search on ChatGPT and I asked for examples of lawyers who had huge cases early in their careers, and it came up with this list of Neal Katyal, Gloria Allred, Sherrilyn Ifill, Preet Bharara.

Cindy Cohn:
Yeah. Preet. Yeah, very famous lawyer, very good lawyer.

Guy Kawasaki:
Yeah. And I looked at that, listen, you were faster than all of them. So, I said, oh my God, Cindy is the Caitlin Clark of civil liberties. My God.

Cindy Cohn:
Oh, as an Iowan, you could give me no higher compliment than comparing me to Caitlin Clark.

Guy Kawasaki:
There you go.

Cindy Cohn:
Yeah, she's from my home state.

Guy Kawasaki:
And Cindy, I will confess to you that I checked to see if you were related to Roy Cohn, to see if there was any nepotism involved, but there is not.

Cindy Cohn:
No, no, no, I don't want to be related to him. But yeah, my family doesn't come from that, neither of my parents went to college, and I was lucky enough to get to go.

Guy Kawasaki:
I would not brag about being related to Roy Cohn.

Cindy Cohn:
No, I do not want to be related to him. But I did not come from a highly educated, lawyered-up family.

Guy Kawasaki:
While we're on the subject of Bernstein v. Department of Justice, if the Department of Justice had prevailed, what would be different today?

Cindy Cohn:
I think that what would be different today is that we'd have even less security online than we do now. Now, we still have a lot to do, I don't claim that this fight is over; it's ongoing, and the attacks on encryption keep coming. The UK is horrible right now, Australia passed a bad law. But we have Signal, we have HTTPS, right? The ability to go from your browser to a website without that information being in the clear, so you can access information without it being immediately tracked.
We have basic security: when you turn off your phone, it's encrypted, so if you lose your phone, whoever finds it doesn't get access to all your data. That's because Apple put encryption into the actual device so that your data is encrypted when you turn the device off. The same thing's true for most computers and phones now.
Encryption is really baked into so many things we do, and I think it would be not baked into nearly as many, I think the government would've ultimately had to let us have some security online, but it wouldn't have been nearly as pervasive, and we wouldn't be able to continue to deploy it without government approval.
And parts of governments really understand the need for strong security, I would say NIST and some of the other parts of government, but when you get over on the law enforcement side, they're really, really hostile to it.
And we've had an upper hand in that fight since the 1990s because it got taken off the munitions list, and it wasn't regulated, so we could go ahead and innovate first and not have to go to the government on bended knee and beg for permission. And I think that's benefited all of us in terms of even having the security we have now online.
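
A note for readers on what "baked in" encryption means in practice: device encryption stretches your passcode into a cryptographic key, and that key protects the data at rest. Here is a minimal sketch of the idea in Python; it assumes the third-party cryptography package, and the parameters and names are illustrative, not what Apple or Android actually use (real devices rely on hardware key stores and different ciphers):

```python
import base64
import hashlib
import os

from cryptography.fernet import Fernet  # pip install cryptography

# Illustrative only: the shape is passcode -> derived key -> ciphertext.
passcode = b"correct horse battery staple"
salt = os.urandom(16)  # stored on the device; it does not need to be secret

# Stretch the passcode into a 32-byte key so that guessing is slow.
key = hashlib.pbkdf2_hmac("sha256", passcode, salt, 600_000)
fernet = Fernet(base64.urlsafe_b64encode(key))

ciphertext = fernet.encrypt(b"photos, messages, contacts...")

# Without the passcode there is no key, and the ciphertext is just noise.
print(fernet.decrypt(ciphertext))  # b'photos, messages, contacts...'
```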

Guy Kawasaki:
Now, when you referred to the United Kingdom just now, is that the request they're making of Apple so that there's a backdoor to the encrypted iCloud files?

Cindy Cohn:
Yes. It's hard because the UK has even more secrecy around these things than we have in the US. So, we don't know exactly what's going on. What we know is that Apple offered something called ADP, Advanced Data Protection. Basically, you could turn on a switch and have advanced protection, and that would mean that your iCloud backups were encrypted.
Now, we think that should be the default and not a switch you have to turn on, but that's okay, at least they offered it. And that's really important for human rights defenders, for journalists, for people who find themselves targeted for espionage, including by governments. And we know that the UK government has demanded that anybody who provides you with a service or a device have access to the plain text of everything you do on your device.
And this is the way they talk about, they don't say we're going to ban encryption, they always say, oh, we love encryption, we're not banning it, but we're just going to make sure that the people who provide you with services and devices always have access to the plain text. And there's no way to read that as anything other than denying you encryption, real encryption.
And so, we know that the UK government served something on Apple; we think it demanded that Apple provide access to the plain text, and that Apple re-engineer the device so that it could always have access to the plain text. And we know that what Apple did in response is say, we're just not going to let anybody turn on this extra protection in the UK, because I think they didn't want to have to downgrade it, and it's pretty hard to downgrade it just for UK people.
And of course, if they downgrade it for people in the UK, that's anybody who's talking to anybody in the UK. So, that affects all of us, or most of us. So, we think that's what's going on, we haven't seen the actual documents yet, but I think it's a safe bet. That's what's going on with Apple, and I really appreciate Apple because they've been pretty public about it, as public as they can be.
We don't know what kind of orders have been issued to the other companies who have been very quiet, but I think it's highly unlikely that the UK government just picked Apple. They're not even the biggest operating system, Android is bigger when you go global. So, I suspect that something similar could be going on to Android and other devices that is not as visible to us. Again, I don't know, it's all secret, but I think in time we're going to figure it out. And it's problematic.
People should be rising up. We need strong security and privacy, and it's not just because of law enforcement access. If the police showed up at your front door and said, Guy, we're really worried, there's a lot of break-ins in the neighborhood, so what we want you to do is to leave your back door open so that if the burglars break in, we can catch them easily, because we don't want to have to break down your front door in order to do it.
You'd look at them and tell them they were crazy. But when that's happening in the context of digital devices where it's a little more abstract and they use language to obscure what they're doing, this gets put forward as if it's, oh, law enforcement absolutely needs this, it's crazy, and it's bad for all of us.

Guy Kawasaki:
Wow. So, if Apple were to agree with the UK, then logically the FBI in America would say, you did it for the UK, you should do it for us, and then none of us have encryption anymore.

Cindy Cohn:
That's correct. And even if you think, well, the UK is a Western democracy, they have rule of law, and they have due process, it's not just the US that's going to be following; it's all the countries of the world.
You're going to have Nigeria, which has a tremendous problem with corruption in its government and with attacks on political opponents. All of the countries in the world are going to say, you did it for the UK, you should do it for us. And while I think it's terrible in the United States, it gets even scarier as you go around the world.

Guy Kawasaki:
Okay. So, can we back up for a second?

Cindy Cohn:
I'm sorry, I scared the pants off you, didn't I?

Guy Kawasaki:
No, it is a time to be scared. So, could we just back up a little bit and could you explain for us what the EFF actually does?

Cindy Cohn:
Yep. The Electronic Frontier Foundation is the oldest and the biggest online digital rights organization, we're based in San Francisco, we're now 125 people strong. And essentially, we work to make sure that when you go online, your rights go with you.
So, we are a civil liberties organization, we're focused on law and rights as they relate to digital technologies, and we really center ourselves on the users of technology. So, making sure that the users of technologies have a voice and are protected as we're moving forward. Now, as for our tools: we are a lot of lawyers, I think that's still my biggest team.
We do impact litigation, so we take cases like my Bernstein case, to try to set the law, especially constitutional law, because that's the province of the courts and the right place to protect users. But we also have an activism team, and we have a tech team, we call them the pit crew, the public interest technology team. And we build some technology.
So, we build a plugin for Firefox and Chrome called Privacy Badger that blocks tracking cookies, those cookies that follow you all around the web, and it blocks other kinds of tracking. We build a tool called Certbot, which automates getting certificates from a certificate authority, so that when you go to visit a website, your traffic to that website can't be read in transit because it's encrypted.
So, we build technologies, we bring lawsuits, we do activism, all of these kind of towards this goal of trying to make sure that your rights are respected when you're using technologies.
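
A note for readers: the certificates Certbot helps websites obtain are what your browser verifies during the HTTPS handshake. This short Python sketch, using only the standard library, peeks at the certificate a server presents; the hostname is just an example:

```python
import socket
import ssl

# Open a TLS connection and inspect the certificate the server presents.
# Certificates like this are what tools such as Certbot obtain and renew.
hostname = "www.eff.org"  # example host
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        print("Issuer: ", dict(pair[0] for pair in cert["issuer"]))
        print("Expires:", cert["notAfter"])
        print("Cipher: ", tls.cipher())
```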

Guy Kawasaki:
And how do you pay for all of this?

Cindy Cohn:
We are member supported; we get no government money. We get a little bit of foundation money, we get a little bit of corporate money, but not from the big guys, that often comes with too many strings about our advocacy. Companies that are a little smaller, and that might be running a VPN service, or a privacy service, we get some support from them. But mainly it's individuals.
Over half of our money comes from individuals, and a huge chunk of that comes from people who are ordinary members, who give us sixty dollars or one hundred dollars or a thousand dollars and get t-shirts and hats and stuff.
I think EFF has been a marker for people who love tech but also love rights, who don't want tech used to crush people, and we have a pretty good membership inside the big tech companies, the Facebooks, and the Googles, and Alphabets, and other companies.
Even inside the government, we have a lot of members, because I think it's a way for people to show that they're trying to be in this for the right reasons, and that they really want to build and support tools that support people rather than oppress people. We're member supported and always have been.

Guy Kawasaki:
So, member supported is like NPR, where you're a member of KQED or something like that?

Cindy Cohn:
Correct.

Guy Kawasaki:
Okay.

Cindy Cohn:
Only we have much cooler t-shirts, hats, and stickers. But yeah, it's a lot like that. I kind of joke sometimes that at EFF we work for tips, right? We're going to go out there and do what we're going to do, and if people think that what we're doing is important, and that it's important to have a foothold and a voice out there to counterbalance the governmental voices, or the corporate voices that might be enshittifying your tools or not really on your side, we are one of the people who show up and try to fight for it.
And there's a lot of digital rights groups now, and I really love that; when I got into this, there was just us, and now there is a whole constellation of people doing really good work. I think what makes us different is that we do have this tech team, we are really grounded in how the tech actually works.
We don't fly off and pretend, or tell the scary stories about how tech's going to eat your children or any of those things; we really try to stay very grounded in how things actually work. And we've developed a reputation in the courts, and in Congress, and in various administrations going all the way back to the 1990s, as the people who show up and will tell you the truth about how technology works and how it doesn't.

Guy Kawasaki:
So, this is a dumb question, and I know the answer already, but I got to ask it just to make sure. So, theoretically, if Elon or Mark calls you up and says, we want to give you a ten million dollar donation, the answer is?

Cindy Cohn:
It's probably no. And this has actually happened. Some of those people, not ones you've named, but some of those companies have offered us a lot of money. And historically in the past, there was a time when we were more aligned with them, especially the early days with Google. We were pretty aligned with them because they were trying to free things up, especially in some of the copyright fights that we've done, and IP fights, where they were really trying to give users access to information and stuff.
Those days are gone. There is different leadership, and they're much bigger and they have a different viewpoint. Right now, if one of those companies showed up and said, let us shower you with money, I would take the call, but if there were any strings attached, or anything that made it look like there were, the answer is no. And honestly, for some of them, I think at this point I would probably just say no, because there's no way it wouldn't be perceived that way.
I really want our support to come from the people we're standing up for, and I'm not in this to stand up for Jeff Bezos, I'm not in this to stand up for Mark Zuckerberg, I'm in this to stand up for all the people who in some ways feel like they're hostages to these people.
And you can't really do both. You can't stand up both for the people who are locking you in with a surveillance business model that tracks everything you do and ranks you, and for the people who are being tracked; you kind of have to be on one side or the other, at this point in time.
I'm sad about that. Those companies used to side with their users a lot more, and one of the sad things that I've seen in the thirty-five years that the organization has been in existence is the sliding away from those kind of tech and user roots towards a more adversarial position towards their users.

Guy Kawasaki:
I would use a stronger verb than sliding away, but yeah, we agree.

Cindy Cohn:
I'm trying to be a little kind, but yeah, no. And I think it's problematic, right? Because I worry. It used to be that people came to Silicon Valley because they had a cool idea that they really wanted to make happen. I know he was a difficult guy, but even Steve Jobs, he was a problem solver.
He was trying to solve interesting problems that would help people. And again, I didn't know the man, and I don't claim to know everything now, but it just seems like it's: how do we exploit people's data to make as much money as possible? And that's a very different framing than what I lived through in the 1990s and the 2000s.

Guy Kawasaki:
From the outside looking in, because I'm not inside the tech bro community, I think that all they care about is long-term capital gains and crypto.

Cindy Cohn:
Yeah. And it's about money for them and power for them, and it's not really about giving us anything better anymore, it's more about exploiting us so that they can maintain their positions. And it's so disheartening, right? Because again, I was in the Silicon Valley in the 1990s and the 2000s, and I know there was another vision, I know that there was another thing that a lot of people were doing.
And the good news is that there are people doing that now; we are seeing it with decentralization. Signal exists, it's strong, it's powerful, it's so powerful that people in the federal government use it when they shouldn't.
The rest of the internet is still there, it's just been completely overshadowed and underfunded because of the rise of these tech giants and their surveillance capitalism business model. But if you peel it back, you can still find people with those ideals and those visions.
And if you look at Mastodon in the decentralization space, or Wikipedia, which is still here, it still exists, it's under threat right now, but those places still exist; it's just that all the air gets sucked out of the room by the tech giants.
And some of what we try to do is to point out that the internet isn't Facebook, there's a whole set of other things that aren't in the tech giants, and if we turn our attention towards them, there are people there who could use a little support, and coding, and lifting up to build a better version of our world.

Guy Kawasaki:
Of all these things that are going on right now, what scares you the most?

Cindy Cohn:
I think it's hard to be alive in America right now and not be worried about authoritarianism. I think that's the scariest thing. The scariest thing is we are seeing the takeover of both our business side and our civil liberties side by an idea that one guy gets to make all the rules for all of us, and that there's no questioning that, this kind of king-like mentality.
I think unless we fix that, we can't even get at most of the other problems. And we're seeing it in rule by executive order. Executive orders have always existed, but they weren't the law of the land, and they shouldn't be, right? We are supposed to have checks and balances and due process.
And for me as a civil liberties lawyer, these are our tools; we need tools to go into court, or to have a Congress that's actually willing to pass a law that protects us, as opposed to just doing the bidding of one guy. I think until we get past this rule-by-kings mentality, it's hard to deal with any of the other problems. And to me, the scariest thing going on right now is watching these institutions that we need in order to protect us not step up to the moment and do it.

Guy Kawasaki:
You mentioned Signal several times now, so obviously you must use Signal, but I have some really tactical questions to ask you about Signal from someone who is in the middle of this.

Cindy Cohn:
Sure.

Guy Kawasaki:
Okay?

Cindy Cohn:
Yeah.

Guy Kawasaki:
So, first of all, what time period is your default disappearing messages set to?

Cindy Cohn:
It depends on the conversation. I try to set it for a week, but if it's something where we're planning something over a longer time, I will sometimes keep it longer than that. But I have occasionally used Signal to develop an expert witness in a case or something like that, and then I keep messages longer, because I want to be able to go back and check, and my memory isn't so great. Different things have different needs.

Guy Kawasaki:
So, what happens if a Department of Justice lawyer says to you that you have Signal set to automatic disappearing messages and that's spoliation, you are destroying evidence?

Cindy Cohn:
It depends on the situation. If something isn't privileged and is evidence in a case, then you have to turn it off, just like anything else. If you've got auto deletion of your email or anything else, the law requires that if something is at issue in a case, then you can't get rid of it. But I don't think you should live your life as if you're always under a litigation hold, because I think that can end up being its own problem.
So, certainly if something is looking like it's going to be evidence in a case that's actually pending or threatened, then yes, you should put a litigation hold in place and you should not get rid of it. But I think that it's still better in the rest of your life, which shouldn't be all your life, to only keep things for as long as you need them and get rid of them.
And this is our advice to companies too, right? People shouldn't be just gathering up data and keeping it in case it might be helpful someday, that way lies a lot of problems. I used to joke at EFF that we had become an anti-logging society, not in terms of trees, but in terms of your logs, that you should really think hard about what you're logging and why, because it can end up being a vector.
And as people who've been through litigation know, it's really, really expensive if you've kept everything all the time, and you have to turn it over in litigation, whether it's even remotely useful or not. Because sorting through what might be relevant to a litigation hold and what isn't is its own huge burden. So, you may not be saving yourself money or hassle or time in the long run by defaulting to keeping everything.
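
A note for readers: the "keep things only as long as you need them" advice translates directly into a retention policy. Here is a minimal sketch of one in Python; the directory, file pattern, and thirty-day window are hypothetical choices for illustration:

```python
import time
from pathlib import Path

RETENTION_DAYS = 30  # hypothetical window; pick one deliberately
LOG_DIR = Path("/var/log/myapp")  # hypothetical log directory

def purge_old_logs(log_dir: Path, retention_days: int) -> int:
    """Delete log files older than the retention window; return count removed.

    As discussed above: once litigation is pending or reasonably anticipated,
    a job like this must be suspended for relevant material (litigation hold).
    """
    cutoff = time.time() - retention_days * 86_400  # seconds per day
    removed = 0
    for path in log_dir.glob("*.log"):
        if path.stat().st_mtime < cutoff:
            path.unlink()
            removed += 1
    return removed

if __name__ == "__main__":
    print(f"Purged {purge_old_logs(LOG_DIR, RETENTION_DAYS)} old log files")
```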

Guy Kawasaki:
I don't want you to think I'm obsessed with the topic of spoliation, but I have one more spoliation question.

Cindy Cohn:
Did something happen to you, Guy? What happened?

Guy Kawasaki:
If you set your default for every new chat to disappear after a week, can you not make the case that, in the course of routine use of Signal, I make everything disappear, I didn't do it to destroy evidence in anticipation of litigation?

Cindy Cohn:
So, this is not legal advice, I am not your lawyer, but yes. Once you have a clear indication that litigation is coming, whether that's because you've gotten a demand letter, or you're in negotiations with someone, or somebody showed up, or you reasonably know it's coming, and that can be a little vague at times, the courts will generally look at it quite specifically.
If you're going back and forth saying, we know we're going to get sued for this, but I think we can defend it, that's the time you ought to turn your little light on.
And certainly, once you get a demand letter, then a good lawyer will send out what's called a litigation hold letter to you, your entire organization, and say anything that's about this dispute, we need to stop getting rid of it, and we need to start keeping it.
So, yes, putting in an automatic process that gets rid of communications and stuff that you don't need is useful, and the fact that it's your automatic practice can help protect you, but you can't then pretend you don't know litigation is coming. Once you know litigation is coming, you need to change course for the stuff that's related to that.

Guy Kawasaki:
Okay. Two more tactical questions.

Cindy Cohn:
Okay.

Guy Kawasaki:
Because this is a rare opportunity to speak to an expert like this. I assume that you probably don't use biometric authentication for your phone, not your fingerprint or your face, right?

Cindy Cohn:
No, I do.

Guy Kawasaki:
You do?

Cindy Cohn:
Yeah.

Guy Kawasaki:
Okay. So, explain that to me, because it seems to me that, not that I am a lawyer, but it seems to me that under the Fifth Amendment, they cannot compel you to give them your passcode, but they can compel your fingerprint or face. Isn't it better to use a passcode instead of your face or fingerprint?

Cindy Cohn:
I think if you're at risk of being arrested, then that's important. If you're going through a border, if you're maybe going to a protest, if you're engaged in something where you think law enforcement is likely to stop you, then you're right, you should turn off the biometrics. The Fifth Amendment, for what I think are some pretty dumb reasons actually, distinguishes between putting in a password and showing your face.
And honestly, I think that whole case law is pretty stupid, right? I think that the Constitution should reflect how people live, and not have this "did it require the contents of your mind or not" analysis. But whatever, that's where we are with the Fifth Amendment right now. So, yes, if you think something's coming, then that's a really good idea, but for the rest of your life, people can't follow ridiculous instructions.
I want technology that makes my life better, that makes it easier, and so does everybody else. So, what security people call this is threat modeling, right? You need to figure out who you are, what you're doing in the world, and what your threat model is, and base your security on that. EFF has a resource called Surveillance Self-Defense, at ssd.eff.org.
So, look for Surveillance Self-Defense; we have playlists based on who you are and what you're thinking about. If you're a journalist, or a human rights defender, or you're attending a protest, or you're helping people who might be seeking abortions in America today and have to worry about that, then you might have a different set of things you do to protect yourself than people who aren't at risk.
And so, I think everybody has to do their own analysis. For me, most of the time, walking around, I'm pretty unlikely to be stopped by the cops in the street and have my phone seized. If you're really worried, you can do that, and there are times and places where I make sure my phone is off, or that I've turned those biometrics off.
There are other times and places in my life where I just want to be able to open it up and look at Maps, and make sure I'm not lost, and I really don't want to have to fumble with putting in a password. So, everybody has to make those decisions for themselves, and we have tools to help people make them intelligently.

Guy Kawasaki:
But Cindy, okay, so what I find almost incredible is that you are the executive director of EFF, and you're saying that you feel pretty comfortable walking around with your face or fingerprint opening up your phone. As the executive director of the EFF you're saying that, I am astounded.

Cindy Cohn:
I think that everybody has to make these decisions for themselves. I love technology, right? Look, if I was the most paranoid person, I wouldn't be carrying around a smartphone. If you're going to take this to the end of what makes you absolutely the safest in every situation, I don't know why you would carry around a beacon that's tracking you all the time in the first place, but we all have to make these trade-offs.
And I would not say that my trade-offs are the ones that other people should make. Right now we're suing DOGE; EFF is suing DOGE under the Privacy Act over its access to Office of Personnel Management records. Now, in some ways that might make me worried that at some point the Trump administration decides lawyers aren't off limits for purposes of targeting.
On the other hand, there's a federal judge who knows that I'm counsel in the case, and it's not a good look for the government to be attacking, harassing, and tossing in jail the people who are suing over the Privacy Act.
And I have always felt, and this is just my threat model, that being high profile and being somebody who's laboring in the courts to try to bring justice makes me probably not the first person they're going to go after, if they go after anyone. Now, things are changing fast in this country, and that might not be as right a threat model today as it was ten years ago, or even twenty, when we were doing the Bernstein case.
And believe me, the NSA and the national security people were not very psyched about us attacking the cryptography regulations. I did not for a minute think that they were going to come after me personally; that was a different time, and it was off limits, and I think that it would've completely backfired on them in the courts.
I still think it would backfire on them in the courts if they did this kind of direct attack. Now, other people should make their own evaluations, and again, I wouldn't say that this is my position everywhere all the time, but it is my position when I'm walking out my front door and going to the grocery store, or all the other things that I do.
The other piece of this, and I think it's really important because you're asking me personal questions about my own decision-making, about my own security, and I think that's useful for people, but we have to fix these systems. This isn't a set of individual decisions that anyone should have to make, we need to have a comprehensive privacy law.
We need to have strong encryption built into our tools so that we don't have to mess with settings or turn things off in order to have strong encryption. We need to have laws that protect our ability to have security and privacy and make it something that the government just can't do to do these kinds of things.
So, I think on the one hand, individual choices are really important, and on the other hand, sometimes in privacy specifically, people get caught up in their individual decisions as if it's their responsibility to make sure that they're as protected as possible.
And I think that makes no more sense than if you buy a car, you expect it to have brakes, and that those brakes work, and nobody expects you to go out and search for, find, and install your own brakes. I think basic security and privacy is like brakes on a car, and all of our devices and tools and laws need to have them baked in to protect us rather than the responsibility being foisted on us to find all these tools, pick the right ones, and use them in the right way.
That's broken, and a lot of what we do at EFF is try to give you individual advice about how to do what you're doing, but the vast majority of what we do is to try to set the laws and the policies and pressure the companies to make this not your responsibility anymore.

Guy Kawasaki:
Cindy, knock me over with a feather. If you want to use the brake analogy, yes, a Porsche may brake from sixty to zero in 125 feet, and a Ford F-150 may take 250 feet; you need to know that not all brakes are created equal. And you still put on a seat belt, right?

Cindy Cohn:
Yeah, absolutely. All of those things are important. You don't have zero responsibility; we have a regulatory system that says brakes must be within these normal tolerances, right? Same thing. We need a comprehensive privacy law. It's not going to be a one-size-fits-all thing, and it shouldn't be, that would hurt innovation. But it should set the boundaries.
You can't put something out on the marketplace that spies so dramatically on your customers that they can't possibly turn it off, they can't possibly control it, they have no agency about that. And I think of it, again, like the way a good regulation will set the tolerances of what can go out there. So, yeah, you might have much better brakes on a car that has a much bigger engine, but there is an outer boundary, right? You can't have no brakes on a car.
And regulation does some of that. Consumers do some of that, with Consumer Reports or other things telling people, watch out, this car doesn't have very good brakes. You've got to have a mix of markets and smart regulation.
I'm not a big fan of regulation, I think it can be very bad, and it can help prop up oligarchies and monopolies. But smart regulation, my classic example of this is when the phone companies were saying you could only plug their phones into the wall, and the FCC said, no, you have to let people plug modems into the wall, and that's how we got the home internet revolution.
That's smart regulation; that's regulation that is not only creating the outer tolerances of what we can accept, but also making sure that there are competitive and other options for people within that space.

Guy Kawasaki:
Are you trying to convince your friends and family to use Signal instead of WhatsApp, or you think it's irrelevant for most people?

Cindy Cohn:
I think WhatsApp uses the same security, the same encryption under the hood, as Signal. So, I don't think WhatsApp is a bad choice in terms of end-to-end encryption. What I don't like about WhatsApp is that, because it's a Facebook property, they know who you're talking to, even if they don't know what you're saying. And on that measure, Signal is better, because Signal is designed not to know who you're talking to in the way that WhatsApp does, and WhatsApp is trying to monetize that.
But as a matter of encryption, WhatsApp is not a bad choice. Facebook Messenger, for instance, is not end-to-end encrypted, I think they're fixing that, but it was not end-to-end encrypted. And let me tell you the consequences of that. There's a woman and her daughter in Nebraska who are both in jail right now, and they're in jail because they used Facebook Messenger to talk to each other about the daughter needing an abortion, and that's illegal in Nebraska.
And as a result of Facebook having the plain text of that communication, because it was not end-to-end encrypted, Facebook got a warrant that required it to turn over the copy of the communications it had; it's a centralized system, so Facebook has a copy of all those communications. Both the mother and the daughter went to jail.
If that same communication had happened over Signal, or probably even over WhatsApp, the mother and daughter wouldn't be in jail right now, because the plain text of that conversation wouldn't have been available to law enforcement. Many more people are having to pay attention to that fact, which might not seem to matter at all when you're just using these technologies, just using whatever's easiest for you.
But now we have a world in which some communications are illegal at a level that I think was not true before, say, the Dobbs decision, when all of these states started passing things, and there's a whole new community of people who need to understand the differences in the security of their communication methods more than they did before.
Now, this was always true for people who were human rights defenders, people who were working with immigrants, people around the world who come from marginalized backgrounds have known this for a while, and now there's a whole new community of people who are starting to wake up to these differences. So, yeah, it's important that people move to end-to-end encrypted services, and it's important to more people now than ever before.
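
A note for readers on what end-to-end encryption changes: messages are encrypted with keys that only the two endpoints hold, so a server in the middle, and anyone with a warrant for that server, sees only ciphertext. This toy Python sketch uses the PyNaCl library; it is a teaching simplification, not the actual Signal protocol, which adds forward secrecy and much more:

```python
from nacl.public import Box, PrivateKey  # pip install pynacl

# Each endpoint generates its own keypair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Encrypt on the sender's device, addressed to the recipient's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"call me when you can")

# This is all a relay server (or a warrant served on it) can ever recover:
print(ciphertext.hex()[:48], "...")

# Only the matching private key on the other end can open it.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'call me when you can'
```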

Guy Kawasaki:
Cindy, I would make the case that WhatsApp is end-to-end encrypted, but it doesn't encrypt the metadata.

Cindy Cohn:
Correct.

Guy Kawasaki:
And there's a lot you can figure out from metadata, the mother and the daughter communicated at this point, they contacted this abortion service, and all that.

Cindy Cohn:
For sure.

Guy Kawasaki:
You don't know what they said, but it's little markers on the trail, right?

Cindy Cohn:
Yeah. No, you're right, and EFF fought the NSA over metadata. One of the things that we learned is that the Patriot Act had a section in it called 215 that let the government demand everybody's telephone records from the telephone companies.
And one of the things that we learned in 2006, but then everybody learned in 2013 from Mr. Snowden, is that this was actually happening, that the phone companies were handing over the metadata of our phone records. And you're exactly right that you can glean a lot from those.
The reason I'm a little soft on WhatsApp, though I think it's a perfectly reasonable choice not to use it, is that people around the world really do use it at scale, and I'd rather not shame them over the differences between the two, but really encourage them to come away from the things that are entirely unencrypted, or that are fake encrypted. Telegram, as we've learned, while it sells itself as being encrypted, really isn't at the level that gives people protection.
In the world of secure messaging, I agree with you that Signal is more secure and a better option; I just want people to pick something that's a little more secure, even if they don't go all the way to the most secure. And on that scale, especially again around the world, Signal is still so small compared to the reach of something like WhatsApp.
I don't want to shame people who are using the one even as we encourage them to come to something a little more secure. So, that's more strategy than it is hardcore security advice, but while it's certainly better to use Signal, it's better to use WhatsApp and be stuck with the metadata than to use something that's completely in the clear.
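
A note for readers on what metadata looks like: even when the body of a message is end-to-end encrypted, a provider can still see and log who talked to whom and when. Here is a small sketch of the kind of record a server could retain; the field names and numbers are invented for illustration:

```python
import json
import time

# Even with the body encrypted end to end, a relay can observe this envelope.
# The patterns alone (who, when, how often) tell a story.
envelope = {
    "sender": "+1-402-555-0101",     # hypothetical phone numbers
    "recipient": "+1-402-555-0199",
    "timestamp": time.time(),
    "body": "<ciphertext, 1.2 KB>",  # opaque to the server
}
print(json.dumps(envelope, indent=2))
```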

Guy Kawasaki:
You brought up the Nebraska case, and I am familiar with the Nebraska case, and it opens up a whole other can of worms that I never figured, which is that the narrative seems to be that if it wasn't for Facebook and their Messenger, they wouldn't be in jail.
On the other hand, the facts show that she did have an abortion after the period permitted in Nebraska, and they did try to burn the fetus and all that. So, in a sense, they did do what they were charged with, so it's not like they were falsely imprisoned. Or did I get this wrong?

Cindy Cohn:
It depends on your view of the law. I think that this is a law in Nebraska that most people think is tremendously unfair and wrong, and disconnected from the reality of people in America, and women in America. I think that in a world in which every law is perfect and wonderful and should be celebrated and supported, you might be able to take the position that they broke the law, so therefore they got what they deserved, and how they got found is irrelevant.
I don't think we live in that world, and I think that when the law is unjust, making sure that people can still live their lives and have protection and have security is tremendously important. And we live in a world with a lot of laws that are not just right now, and a lot of things like executive orders and other sorts of things that are just ignoring the law, they're just snatching people off the street and sending them to El Salvador.
This is one of the reasons that we need privacy and security is because not all governments are just, and not all laws are just. The other reasons we might need it is just basic human dignity and having the space to be able to live your life without being tracked all the time.
But I would maintain that there are a lot of people in America who are very uncomfortable and unhappy with some of these laws, which were not passed in ways that I think people feel very good about. And I think that giving people the ability to have the level of privacy and security they need to live their lives, rather than making a world of perfect enforcement of every single law, is how the law has generally worked; enforcement is there to stop things that shock the conscience.
And I think in this particular instance, this was a mother and daughter who I believe were having a conversation inside their own home.
Traditionally, the Fourth Amendment would say that what happens inside your home is completely not available to law enforcement, right? That's why they need a probable cause warrant to come into your house. But because technology meant that this third-party company had the plain text of the communication, suddenly what happens inside the home between a mother and daughter is available to law enforcement.
So, you have to look at how technology is changing everything. And this is a situation in which the founders of America would never have thought that the government would be able to prosecute you, even if you were violating the law, based on a mother-daughter conversation inside the home.
And because of the way technology has developed, that actually was able to occur. You have to balance all of these things, it's not just one thing. Technology has changed the way that we communicate in ways that the Constitution needs to catch up with.

Guy Kawasaki:
It's obviously 2025, and now we have someone in charge of Homeland Security who cannot even define the writ of habeas corpus. So, I'm asking you: if somebody says to you, or your family or friends say to you, I'm not worried, I have nothing to hide. Is the nothing-to-hide statement true these days? Or does everybody have something to hide at this point?

Cindy Cohn:
I haven't done a demographic survey, but I would suggest that most people, even if they don't have something to hide, talk to somebody who does. Do you have somebody in your life whose papers have expired, or who's overstayed their visa?
Do you have someone in your life who's a person of color, who is trans, who's LGBTQ of any kind, not just trans? Do you have somebody in your life who thinks that diversity and equity are important values, and has said something about that?
The line over who has something to hide is really changing, and I would argue that by the time you go through all the lists, just of the things we know, there aren't very many people who wouldn't be impacted by this. And again, this is why security and privacy are so important. I also think they're important regardless of whether you individually need them.
I think that one of the problems that we have in privacy is people think about it in individual personal terms, and so they can come to the I have nothing to hide kind of position. But privacy isn't just important for each of us, it's important for all of us. And that's an important distinction.
Most people don't want to stand on a street corner and shout out what they think ought to happen in this country, but I think all of us understand that the First Amendment protects us all, even if we don't want to speak. The Fourth Amendment and privacy do work the same way, giving everybody the shelter of privacy means that even if you don't personally need it, somebody who you love, somebody who you know, or somebody who's going to help change the world for the better does.
And I'm going to give you an example in my lifetime. Being gay in this country was very, very dangerous, saying that gay people ought to have the right to marry, they ought to have the equal right to love who they want to love, that could get you killed. And in fact, it's still pretty dangerous, right?
We're moving backwards. But there was a time in which those conversations had to happen in private, this idea that maybe loving who you want to love as opposed to the traditional heterosexual thing isn't such a bad thing, maybe we should normalize that and make that okay, that was a very dangerous conversation.
Those conversations had to happen in private and in secret. And in my lifetime, those have gone from conversations that had to happen in private and secret to something where we've really changed the law, we've changed a lot of people's minds about it, we've changed attitudes, and that public part of the conversation couldn't have happened unless there was a private part of the conversation.
And I think the same is true if you look at most social movements. If you look at the anti-slavery movement in the United States way back, if you look at some of the anti-immigration sentiments in this country, where we had the Chinese Exclusion Act and other kinds of things, and we shifted into a world where we thought differently about differences in America, the public part of those conversations couldn't have happened without the private part.
So, it may not be you, it may not even be the people you love, but it may be the people who are going to help us make change for the better. And I think we need to stand up for the rights of all of us because this is what a human right is, this is what a human value is.
It's not something that's just dependent on you and your everyday life, although I think most of us have an increasing need in our everyday life for privacy and security, but these are values that we should stand up for, even if it's not right now visible individually to us that we need them.
Because this is how society self-governs, this is how we make changes, this is how we have the space to decide that we don't like the guy who's the president right now, and we want to vote for someone else. Increasingly in this country, those conversations can be pretty dangerous for people to start.
We're not all the way to the repression of other systems, where they put the opposition candidate and anybody who is friends with them in jail. But you can see that on our horizon right now; we need to stand up for privacy and security, even if we don't need it right now, because we may need it pretty soon.

Guy Kawasaki:
One of the big activities of the EFF right now involves the Take It Down Act, and I would like it if you would please explain: what is that act supposed to do?

Cindy Cohn:
Yeah, we've spent a lot of time on this. This is a particular kind of problem, where there's a harm online that people agree is a harm. In the instance of the Take It Down Act, it's non-consensual sexual images: your ex posts your sex tape online, or other kinds of situations in which sexual imagery of people is posted online without their consent. It's a real problem. So, people will take a real problem and then they'll propose a legal solution that is not good.
So, the Take It Down Act says that if somebody tells you, as a platform or a host of a site, that you have to take something down, you have to take it down immediately or you're liable. And it is not limited to non-consensual sexual imagery, even if we could agree on what the definition of that is, and that can get a little fuzzy; it just means that platforms have to take things down if they get a complaint.
And the worry is that those complaints get weaponized. President Trump, in the big speech he gave in January or early February, said he can't wait to use this law, that we should pass it and he can't wait to use it. I don't think that what President Trump is worried about is non-consensual sexual imagery; I don't think that's what he meant. But there's a classic example of how a law that is passed for one narrow purpose can be used to create a censorship regime for far broader speech than just that.
This is why we really opposed this law. I don't think it's going to help with non-consensual sexual imagery; the problem for most of that imagery isn't that the platforms don't take it down, they take it down pretty fast all the time, it's that they can't keep up because there's so much of it. So, it's not even responsive to the problem, because I don't think the problem is that platforms don't care about this.
Some might, and that's important, but we didn't need a federal law for that piece of the problem. Instead, it opens things up so that there could be a censorship machine for anybody with power, to take down anything they don't like, or at least a wide range of what they don't like. And again, when you've got the President of the United States saying he can't wait to use this power, it ought to be a pretty good sign that maybe this law does something different than what the people who proposed it intended.
To the point where, at the very end, some of the people who originally proposed the law flipped and said, this is a bad idea, this isn't the right thing. We had been opposed to it all along because we worried that it could be misused, but by the end, some of the very people who proposed it, a couple of law professors who were big fans of it, issued blog posts saying, don't support this, this is not what we meant, and this is a bad thing.
Nonetheless, it passed and got signed into law. It's about a year or two before it really gets implemented, so we won't see the effects right away, but we're going to start to see weaponized takedowns at a level that I think we haven't seen before, because this law facilitates it, and it creates the incentives for the companies to take things down if they get a complaint.
And again, I don't think that those complaints are going to be about non-consensual sexual imagery, they're going to be about people saying things they don't like.

Guy Kawasaki:
Cindy, I don't know if you realize this, but I think you just said one of the funniest things I've heard in five years of podcasting.

Cindy Cohn:
What's that?

Guy Kawasaki:
Which is, "I don't think Trump is concerned about non-consensual images." I would say that if that was said at the White House Correspondents' Dinner, it would be, in the words of Barack Obama, "a mic drop moment." But anyway, I'm still recovering from that.
And when you see something like that in a bill, and the possible perversions of how the bill is used, is that something that Mike Johnson snuck in with that intent, or is this an unintended consequence? Am I being paranoid that they're putting shit like that in purposely, or is it unintended?

Cindy Cohn:
I think that it's a mix. I think for some of the people it might be unintended, this is a bill that was sponsored by Amy Klobuchar of Minnesota, and I suspect that she's a very smart person. And it's not like people haven't tried to tell her. I don't want to give her too much credit, but I think that people come in with a pretty honest intent to try to address the harms, they're just more interested in the harms than they are in the actual impact of how things are going to work in the real world once they get passed.
So, some people are dishonest, I don't think that Mr. Trump is honest in his support of this, that he really wants to make a stand about non-consensual sexual imagery. Other people are cynical, and some people are well-meaning. There's another law coming along that I want to flag, that has a similar problem; it's a law called KOSA, the Kids Online Safety Act.
And again, this is trying to get at an online harm, which is kids having access to information that could be dangerous for them. But what it's going to do is create a requirement that you provide credentials to get access to most information online. It's going to require you to show your ID at some level in order to get access to things online.
This is going to age gate everything online. It's going to make it harder for people who don't have credentials, and that's a lot of people in this country, to actually get access to the internet in any meaningful way. It's not going to stop kids from getting access to stuff that they shouldn't have, but it is going to age gate everything on the internet, and it's going to require a lot of things.
And then it's going to create these huge companies that have everybody's identity information, and that are going to be sitting ducks for data breaches. It's going to be the mother lode for people who want to do spying, identity theft, or other sorts of things, because of the identity databases it's going to create.
And so I think KOSA is another one where there is a real harm of kids having access to stuff that they shouldn't have online, and that the solution that is being proposed in this law is not going to solve the problem and is going to cause a whole other set of problems.
We know the things that actually work to keep kids from harm online, but they're a lot more expensive and require a lot more thought than simply requiring companies to put in an age gating thing, either on your device or otherwise.
And this is a space where we live a lot, especially on the legislative side: good intention, bad idea. And it's hard, because I think a lot of lawmakers really want to respond to this problem, and they just don't pay as much attention to whether the thing that they're championing is actually going to solve the problem, and what the collateral impacts are.

Guy Kawasaki:
So, you mean to say that the speaker of the House and his son cannot maintain control of each other and take care of this problem, and we need other ways to do this?

Cindy Cohn:
I think there are other ways to do it. I was a kid, you were a kid. Was the idea that you had to show an ID actually a thing that kept you out of anything that you really wanted access to?

Guy Kawasaki:
No.

Cindy Cohn:
No. So, why do we think that's going to work online, where it's even harder, right? It's not like fake credentials were just made up yesterday, right? I just don't think that's really going to be the way to do it. Again, if it caused no collateral problems at all, then okay, whatever, let's give it a try, but it's going to cause a lot of collateral problems, and those problems are going to fall on the people who otherwise don't have resources.

Guy Kawasaki:
Isn't Australia already doing this? Is it causing problems there?

Cindy Cohn:
Yeah, Australia's doing a version of it, and they're also doing a version of blocking encryption. I haven't seen the research yet, but I would be shocked if it was actually having a significant impact.
We know that it's a mix for kids online, we know that there's a certain percentage of kids who have a hard time online and react badly, there's another percentage of kids, and I want to be clear about this, this is LGBTQ kids, it's kids from marginalized backgrounds, kids who don't fit in where they're growing up, for whom having access to information on the internet is literally a lifeline.
EFF did a survey, and it's convenience data, just the people who chose to fill out our survey, but we asked kids to tell us what their experience online is and how it has helped them. You should read these.
They're on the website. And we had so many heartwarming and terrible testimonials from kids who said, if it weren't for my online community, I would've killed myself by now. Because nobody in my house or in my community understands what it's like to be LGBTQ, gender queer, and it's the online world that saved my life.
And there are a lot of those kids. People who are thinking only about one kind of harm, the kind that definitely impacts a segment of kids online, especially a certain segment of young girls, and legislating based only on that, without seeing the other people they're going to harm, which includes a lot of gender queer kids and other kids who don't fit in, whether religiously or otherwise, in the place where they grow up, that's just bad legislation. We have to save all the kids, not just some of them.

Guy Kawasaki:
But maybe they want to harm those kids.

Cindy Cohn:
Well, this is one of the things that a segment of the Republicans have been pretty clear about, Marsha Blackburn's been very clear about it. When that segment of the conservative side talks about online harms for kids, they mean kids shouldn't have access to certain information. In their view, the harm is kids getting access to information that isn't the very narrow Christian-infused version of things, getting access to DEI information or other kinds of information like that.
And so, when we talk about online harms, if we don't specify which harms we're talking about, we're talking about people who really just want to censor what other people's children can see.
And I think KOSA is very vulnerable to that. Again, Marsha Blackburn and the Heritage Foundation have both said that's what they want to pass KOSA to do. And for the Democrats and the other people who are really focused on this one area of online harms, harms that I think we could all agree are not great, to empower those people as well, it's wrong and it's scary.

Guy Kawasaki:
I have to ask you tactical questions, because who better to ask tactical questions than you? This is a very tactical thread we're going to go into now, which is, let's say that you are a US citizen, born and bred, you have no criminal record, you return to the United States from overseas, and border patrol asks for your phone.
Do you give it? Is it your regular phone? Would you take another phone overseas because you knew this might happen? Is it locked? Do you unlock it for them, or do you hand it to them and you say, have at it boys, try to decrypt this phone? What's your attitude at the border?

Cindy Cohn:
Sadly, our border is largely a constitutional rights-free zone, EFF did a case a few years ago where we tried to get the Fourth Amendment to apply to the border, and we were not successful. We're not done, we're going to keep trying, but you're pointing out something really true, which is, you have many fewer rights to protect your phone at the border than you do otherwise. You still have to do some threat modeling and figure out your situation.
If you're an American citizen and you're coming back into the country, they can detain you for a while, but they can't kick you out, you have a right to come back. But they can make you sit in detention for four or five hours, while they try to open your phone if you don't open it for them. And you have to decide for yourself, is that something I want to do? That can be very uncomfortable.
Other people are like, sure, that's fine, but I think, do you have another plane to catch? Are you going to miss your connection? What is your life like? Are you trying to make it to your daughter's wedding? Even as an American citizen, you still have to think about your threat model as you're coming into the country and that should inform what you decide to do.
I do recommend that if you've got stuff on your phone, or accessible through your phone, that you really do need to keep private, think about taking a second phone. Think about getting a burner phone that you use for that, or a device like an empty Chromebook. When you get overseas, you can log back in to your cloud-based services; you don't need all that information on the computer you carry.
And same for coming back into the country, wipe the stuff off of it, and then just sign back on again once you get back safely home. And the cloud computing revolution has made that a lot more accessible to a lot more people than it used to be. The other thing I recommend is if you are going to carry your own device through the border, turn it off, turn it off.
Because when you turn a device back on again, it turns off the biometrics and requires that you put in a password to open it up, and everything is encrypted at that point. They can break into most phones, but it takes a lot more effort and a lot more money.
And so, you put them in a position where they have to decide how much work they want to put into getting into your phone. I sometimes say, make them fish with a line and a pole, don't let them drift net fish through everything.
And I think for a lot of people, unless you're really the target, that will mean that it's not worth it to them. They'll troll through what's easy, but they're not going to deploy the expensive tools they have to buy in order to actually collect information from a phone.
Again, it depends on what kind of target you think you are and how important it is, but I always maintain you should make it a little harder on them, make them go through every step. Even if at the end they might be able to get access, make them go through every step, because a lot of people just wash out of the process along the way, and I think it's important to put them through it.
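To make the "turn it off" advice concrete: modern phones derive the key that unlocks storage encryption from your passcode, so after a reboot, before the first unlock, that key simply isn't in memory, and an examiner holding the hardware is left guessing passcodes. Here is a minimal sketch of that idea in Python, assuming a generic passcode-based key derivation; the names and parameters are illustrative, not any phone vendor's actual scheme:

```python
# A minimal sketch, not any vendor's real design: why a powered-off,
# passcode-locked device is a harder forensic target. The storage key is
# derived from the passcode, so after a reboot it isn't sitting in memory.
import hashlib
import os

SALT = os.urandom(16)    # stored on the device; not secret by itself
ITERATIONS = 600_000     # deliberately slow, so each passcode guess is costly

def derive_storage_key(passcode: str) -> bytes:
    """Derive the storage-encryption key from the user's passcode."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), SALT, ITERATIONS)

# While the phone is on and has been unlocked once, this key lives in memory.
key = derive_storage_key("correct horse battery staple")

# After a power-off the key is gone. Someone holding the hardware has only
# the salt and the ciphertext, and must pay the full derivation cost for
# every passcode they try.
assert derive_storage_key("correct horse battery staple") == key
assert derive_storage_key("123456") != key
```

Real phones typically go further, entangling the passcode with a hardware-bound key so that guessing can't even be offloaded to faster machines, which is part of why the powered-off state is such an expensive target.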

Guy Kawasaki:
But what about the logic, that if you refuse to unlock your phone, that's an admission that there's something you're hiding?

Cindy Cohn:
I don't think it is an admission. At the end of the day, they've got to convince a jury or a judge, and I think as long as enough of us do it, and it's not just the guilty, that logic falls apart. We need to combat it. Privacy is a human right, it's your right. It's your right not to have law enforcement go rifling through your stuff unless they've demonstrated that you've done something wrong. Lawyers call it a probable cause finding in front of a judge, right?
That's why we have the Fourth Amendment the way we have it, which is they have to go to a judge, they have to say there's probable cause that you've violated the law, and then the judge has to agree with them, that's what a warrant is. If they haven't done all those steps, then it's your right to say, no, I'm not going to voluntarily let you do this.
That's why I have a doormat that one of my interns gave me a long time ago that says, "Come back with a warrant." And EFF actually has those as stickers for your phone. That due process protection is important, and if you just decide that you don't want that protection anymore, of course that's your right, but it doesn't mean that you're doing something wrong if you avail yourself of the protection of the law.
And I think we all need to stand up for that, this idea of not letting police just blow past all the protections that people fought and died for us to have. Standing up for our rights as citizens, to me, that's a patriotic thing to do. The reason we fought a war against a king to have our own country was so that we could set our own rules and have a government that abided by them. Holding the government to the rules is, to me, the more patriotic thing to do, not less.

Guy Kawasaki:
A few seconds ago you used the phrase, "stuff that you really do need to keep private," but what is the definition of that? I would think that on almost anybody's phone, you could find a place where you said, these tariffs are stupid, they're going to ruin our economy. Are we at a point where, oh my God, what if the border patrol saw me say that on social media because they opened up my phone, am I going to be deported or something? Where are we on that?

Cindy Cohn:
I think it's getting more and more scary. The Trump administration is trying to require people who want to come to the United States and get visas to open up their social media, to turn everything that used to be private to public. It's horrible, and we need to fight this proposal as best we can. I don't think it's constitutional.
But yeah, one of the really scary things about the time we're living in is that the needle is moving so fast and so unpredictably. So when you ask me, what if I have nothing to hide? I don't think anybody can feel safe right now in their presumption of what that means for them personally, much less for all the people they talk to. Remember, what's on your phone isn't just what you say, it's what other people say to you.
Even if you might not implicate yourself, you might implicate your friends, who got pissed off and wrote a text about being angry about something that now law enforcement is looking at to try to decide whether they get to stay in the country or whether they get detained. We often say privacy is a team sport. The other thing people have to remember is it's not just about them; they have information about all the people they communicate with, who they love, who they follow.
And so, I do think it's a time when that "nothing to hide" story ought to be going away pretty fast. Everybody has reason to want to avail themselves of their constitutional rights to privacy and due process, even if you can't think of what you have that might be at risk. Things are changing so fast that I don't think anybody can give you an accurate, up-to-date risk profile for yourself, and you ought to take that into consideration.

Guy Kawasaki:
Okay. What does it mean if you are threatened, if you or Wikipedia or NPR is threatened with the loss of not-for-profit status? What would it mean to you if you lost that?

Cindy Cohn:
Oh, it would be terrible. Again, EFF gets support from individuals, and many of those individuals get a tax deduction for supporting us. Now, lots of people don't, and I think there's a community of support that ought not be dependent on the tax deduction, and we ought to think hard about that. But we built up this system for civil society, for nonprofits like NPR and others, that is really based on the idea that there is a tax-protected status for our donations.
If that goes away, people are going to have to get funding in a way that isn't tax-protected. And that's okay for some individual donations; again, there are wealthy individuals who itemize for whom this is an important thing, and that's a big source of funding, but there are a lot of poor people who support charities even if they don't get a tax deduction, so we need to think about that.
But when it comes to foundations and other kinds of money, many of those foundations can only give to organizations that have C3 status. So if the MacArthur Foundation or the Ford Foundation, or even the foundations on the right, want to give money to an organization that isn't tax-exempt, they can't.
They would have to change their whole charters and ways of being in order to do that. It's a huge drain of money and support from this entire range of organizations: everything from soup kitchens, to religious organizations, which are at their core C3s, to people like me who do civil liberties and civil society protections, to people like NPR who provide us information. It's a huge blow and a huge risk to this entire sector.
Again, because we built up a system that is all interlocking and is all based on the idea that the IRS C3 protection means that something is on the nonprofit side.

Guy Kawasaki:
So, how do you think this all plays out? There are some possibilities. Like, we all wake up, there's a midterm slaughter, we all go, phew, we ducked that bullet; that's one possibility. Another possibility is that Margaret Atwood, come to find out, wasn't a novelist, she was a historian, and she got it all right.

Cindy Cohn:
So scary.

Guy Kawasaki:
Another possibility is we have this performative democracy with a constitution and separation of powers, and balance of power, but none of that is really true. What's your prediction for what's going to happen?

Cindy Cohn:
I'm so bad at predicting, I'm really not good at it. We're going to work really hard to make sure that we end up in a position where we are still a self-governing constitutionally protected society, we're going to pull all the levers we can. I would say, look, I was a person who told the founders of Wired Magazine that the country didn't need another tech magazine, I am so bad at predicting the future. But I can say that we won't get to a better future unless people lean in and try.
We're not going to be able to just sit back and have this magically fix itself. We didn't get into this problem overnight, and I don't think we're going to get out of it overnight. And we need people to vote, to lean in, to support the organizations that are working on this. Of course, EFF is one of them, but we're not the only one; whatever speaks to your heart.
We need people, we don't have an armchair democracy anymore, we need people to show up to make their voices heard, because without that, we will definitely lose. Sometimes people ask me, are we just going to lose no matter what? And I'm like, we could lose today or we could fight and lose in the future. And those are the only two choices. If you just sit back, it's not going to get better magically.
So, I think people need to lean in, they need to engage, they need to find what speaks to their heart, and really show up for it. I hope for some people that's EFF, because I think we do show up and we have done it. We know how to push right now.
But if that's not the thing, find the thing that works for you, because we are not going to get out of this by just sitting back and magically thinking things are going to get better. Tech's not going to solve this all on its own, tech needs people who are willing to step in and make sure that our tech and our world support us rather than suppress us.

Guy Kawasaki:
Wow, Cindy. So, listen, I want to thank you, I want to thank you in two senses. The first sense is of course for the simple act of coming on my podcast, because it's been a very remarkable podcast. But even bigger, I want to thank you and the EFF for the work that you're doing to preserve democracy. The work you're doing is so important. And as soon as I hang up, I'm going to send you money.

Cindy Cohn:
Thank you so much. Oh Guy, that's wonderful.

Guy Kawasaki:
Thank you very much, Cindy. And just let me thank the Remarkable People team, that would of course be Madisun Nuismer, who is our producer, Jeff Sieh is our co-producer, and we have a sound design engineer named Shannon Hernandez, and a researcher named Tessa Nuismer. So, Cindy, that's all the people on the Remarkable People Team, and we're trying to make this a remarkable, long-lasting democracy.