French 75 with Jennifer Granick
Jennifer Granick · August 2, 2023 · 58:19
Good evening, Adam. How are you doing today?
I'm sorry, was that a question?
Yes, that was a question. That's how we start things off. A bit of banter.
I'm doing incredible.
Excellent. I'm glad to hear it. I want to hear everyone's doing good. So we have a really exciting show today. You know, I don't know if you heard, I'm sure you did, that we recently hit the 10th anniversary of the Snowden revelations about, you know, government surveillance and spying and the internet and all. A very big deal then, still a very big issue today. But it ends up that, you know, for those of us who have been around for a while, that was not the start of this issue. There's actually been a lot of work going on in the area of, you know, internet surveillance and civil liberties and civil rights for quite a long time. Really going back to, you know, the earliest days of the commercial internet in the mid to late 90s or so. We've talked about that before, and today we're going to talk about it a little more. And we are very lucky to have a guest on today who has, you know, been in it really, I think, from the beginning and is definitely an A player in this field. Definitely one of the most knowledgeable and impactful people in that whole area, I think. So today we've got Jennifer Granick.
Thank you for having me. Happy to be here. Yeah, you're making me feel old.
Well, you know, I was trying to do that without making you feel old, because I think we're actually the same age, too. So I'm a bit selfish in that myself.
You're not the same age. Joe, you're like 63. She's like 25.
Oh, yeah. Thanks, Adam.
Let's just say we're all experienced. That's right.
We're experienced. I don't know about you, but Joe and I qualify for AARP already to get the early bird special.
Oh my God. I think when they send you the membership card, right? When you turn 50. Yeah. I burned it.
Yeah, well, you know, I had felt old years before. The first time I really felt old was when I realized I was friends with my dentist. And I'm like, oh God, it's all downhill from here.
It's when the doctor comes in as younger than you are.
Oh yeah, that's right. Oh, I remember that story about your dentist. He's the one that filled your teeth with the wooden fillings.
Yeah, like George Washington.
Yeah, thanks. You're right. So yes, we've got a lot to talk about. But first, since this is the cocktail hour, we are drinking today the French 75 as chosen by Jen, because today is Bastille Day. Indeed.
I thought a French-themed cocktail would go well with Bastille Day. So here we are.
Cheers. Here we are. All right. Cheers.
It's delicious.
That is good. I really am liking that. I never had this one before. So great. Thank you. Good choice. And, uh, all right. So, you know, yes, you've obviously been in the world of, uh, surveillance, and we've had the Snowden things and all sorts of stuff going on. I'm talking about mass surveillance and civil liberties on the internet and, uh, you know, there's been a long history of it and stuff, but what's happening lately, basically? I mean, it's hard to... Because I know it's an evolving field.
It is, and it's hard to believe, you know, that the Snowden revelations were 10 years ago. It seems much more recent than that because, you know, for those of us who were surveillance lawyers before that, the Snowden revelations were really, they were a revelation. It was, you know, sort of a whole new category of information about what our government and other governments were up to. And I think it really unleashed this, you know, sort of energy on the part of the civil liberties community to try to rein in these government excesses. And we're still fighting that battle, you know, 10 years later, both with things that Edward Snowden revealed in terms of, you know, surveillance of our communications when we talk to foreigners, but also with new technology that has, you know, been adopted by law enforcement for greater use. You know, we're seeing real increases in use of some of the novel tracking technologies. And there I'm talking about things like geofence warrants or reverse keyword searches, those sorts of things in particular. But, you know, I would say one of the big differences between when I started this career and now is the difference between kind of surveillance of individuals, like a kind of comprehensive surveillance of individuals, and what's happening now, which is technologies that allow the government to surveil people en masse, as a giant group, and then, you know, sort of either select suspects out of the group or, you know, keep tabs on protests and social movements and, you know, that kind of thing. So, you know, I guess what I think is that we had all hoped that with the Snowden revelations, we could make things a little less scary. But maybe, as is just going to be true in my field, they actually are getting incrementally more scary.
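The geofence warrants Jennifer mentions invert the usual investigative order: instead of naming a suspect and asking for their records, the government names a place and a time window and asks a provider for every device seen there. A minimal sketch of that kind of query, with toy data and hypothetical field names just to make the shape of the request concrete:

```python
from dataclasses import dataclass

@dataclass
class LocationRecord:
    """One location observation a provider might hold (illustrative fields)."""
    device_id: str
    lat: float
    lon: float
    ts: int  # Unix timestamp of the observation

def geofence_query(records, lat_min, lat_max, lon_min, lon_max, t_start, t_end):
    """Return IDs of every device observed inside the bounding box during
    the time window. Note the output: a list of everyone present, not
    evidence about a named suspect -- that's what makes it a mass search."""
    return sorted({
        r.device_id for r in records
        if lat_min <= r.lat <= lat_max
        and lon_min <= r.lon <= lon_max
        and t_start <= r.ts <= t_end
    })
```

The point of the sketch is in the return value: the query starts from a place and ends with a list of people, most of whom have nothing to do with the crime being investigated.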
Yeah, you know, that's kind of what worries me. I mean, and this has been on my mind for a while, I've worked in the field for a long time, at least on the technology side. You know, I read your book about surveillance, which, you know, it's funny, I want to say I liked it a lot, but the truth is, whenever I read books in that genre, I feel this emotional mix of anger and depression at what's going on. You're welcome. And that was, you know. Well, that's art, I suppose.
Yeah, that should be a blurb on the back cover.
That's right. But also thinking that, you know, things, well, the thing is, like, every time I read one, I read something in this area. It's like, it ends up that, yeah, I learned something more, that means it was even more pervasive than I thought, that the controls that are supposedly in place are even less effective than believed. That was what, six years or so ago? And I don't think things have gotten better in that time.
You know, I mean, I would say there are some signs of hope, right? Because the way I look at it is like, you know, I'm a lawyer, but technology is a huge determining factor in terms of what our liberties and what our privacy look like. And for, you know, decades, technology has made our private information less and less private. And the internet is just a huge machine that collects data about us and stores it. And then we have all kinds of other technology. But there have been changes in technology that are making us safer today than we were before. And I'm talking about end-to-end encryption. So when WhatsApp becomes widely adopted and end-to-end encryption is turned on by default, suddenly hundreds of millions of people, and I probably got the numbers wrong, But I'm in the ballpark, like hundreds of millions of people, their conversations and communications are more safe than they were before. And when Meta ends up end-to-end encrypting Instagram and Facebook Messenger communications, again, it's going to be this sudden thing where people's conversations, people are able to have private conversations again. iMessage, you know, again. So we have things, you know, people are developing ways to do various analysis and metrics on anonymized data. So, you know, I think we're, you know, there's a pendulum and I think with some kinds of technologies we've got now, there's some pushback and there is reason to be hopeful. So we can continue to wake up every morning and you know, go to the office and continue with the job because, you know, the law can only do so much. Technology itself is really important in that, you know, trying to strike that sweet spot between liberty and the government.
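The property Jennifer is describing, that turning on end-to-end encryption by default suddenly makes millions of conversations private, comes down to where the key lives: only the endpoints hold it, so a server relaying the message sees ciphertext and nothing else. The simplest cipher that shows this is a one-time pad; real messengers use the Signal protocol, not this toy, but the "server learns nothing" point is the same:

```python
import secrets

def e2e_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time-pad encrypt: the key is random, as long as the message,
    and must never be reused. Returns (key, ciphertext). The key stays
    on the endpoints; only the ciphertext ever touches the server."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def e2e_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """XOR with the same key recovers the plaintext on the other endpoint."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))
```

A relay that only ever handles the ciphertext has nothing useful to hand over, which is exactly why, as the conversation turns to next, the pressure moves to the endpoints themselves.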
So there was a certain person, I'm not saying which country, and he worked for a nation state, you know, a friendly one, and I was having a conversation with him on WhatsApp. And I was joking about something. And immediately he said, whoever's listening to this, that was only a joke. And this was on WhatsApp, encrypted end to end. And he was still concerned. And I said to that person, do you really think they're listening to you? He goes, during wartime, they were listening to me. I think during peacetime they're listening to me too, or something like that. So his feeling was, even with encryption, they will listen to him.
Well, I don't really, I mean, if you're in a particular field, then I think maybe extra caution is warranted. Um, you know, with technology of all sorts, there's always like a cat and mouse game between security and then vulnerability and then security and vulnerability again. But as far as I know, I think that the way that a government has to collect end-to-end encrypted messages is from the endpoint device. So they have to break into your phone and, you know, get the messages there. And that is possible. You know, there are global spyware companies that sell spyware that can implant itself on your phone and collect the messages there. But people can really test WhatsApp, it's widely deployed, and try to determine if there are vulnerabilities or not, and the company fixes them. So I think endpoint attacks are still clearly a way to get around end-to-end encryption. But I have a lot of faith in Signal and WhatsApp and iMessage-to-iMessage, that the engineers there are watching and keeping an eye on it, making sure it works.
So, well, I don't think Joe is happy with WhatsApp.
Well, no. Well, here's the thing. You know, some of this is technology and some of this is trust. And you might say that I'm a bit cynical, too. I do use Signal. A lot of people in security use Signal. We know that they appear to be very strong technically and very strong in terms of their commitment and the people there. And when I say very strong technically, from the technology side, that's really saying something. There are a lot of products out there, whether it's messaging or file transfer or, you know, whatever, password managers, that say they have end-to-end encryption, and either they really don't, or it's not effective and they don't have really good ways of doing it. But a lot of this gets to trust, too, the trust of the company. I mean, let's be honest, I don't use WhatsApp. WhatsApp is owned by Facebook, not a company with the best track record in terms of privacy, and also not the best one in terms of how it works with the government either, which we're especially seeing lately, as we record this in July of 2023.
I mean, people can test.
I'm still a little paranoid.
Signal is open source. You can look at the code. People can test WhatsApp and see what it is. There's really no percentage for Facebook to backdoor WhatsApp. They're using WhatsApp's security to bring people to Instagram and to Facebook for messaging. They don't advertise against it. Skepticism is warranted, and I think that's healthy. But what's not healthy is when skepticism drives people away from using the most secure tools out there because they're like, it doesn't matter, I can't believe in anybody. I don't think you want to go that far. You should be using WhatsApp or Signal.
Well, what we do in security is, you know, we always say, look, there's a risk here, that nothing is perfect. And we're shown time and time again that some things that you say, well, they're improbable, or, you know, that's a little too tinfoil hat, and they end up happening.
Oh, yeah.
So, you know, I say,
That is what we learned from the Snowden revelations.
Use the best but still understand the limitations.
I mean, that is what we learned from the Snowden revelations. I think there were a lot of expectations on people's part about what the government was doing, what it could do. And what we learned is that they were just blowing through all those stop signs and flying over the speed bumps and collecting all kinds of stuff. So it's true.
In the six years that I've known Joe, maybe seven now, Joe has never checked my hash for my Signal, nor has he asked me why I changed my phone. So when he gets the Signal alerts, he doesn't do anything.
What's that about?
Adam. Adam, believe me, nobody could impersonate you. As soon as I see the line of text, I know it's you. It's so easy.
So I messaged, my phone died and I have my other phone. I have two phones. And I messaged Eric. Eric's one of our colleagues and friends. And I messaged him. He goes, how do I know it's you? I go, ask me a question. And he did. And I answered. He goes, no one else would have ever known that. I believe it's you.
That's right. Ask me something only I would know. That's like the best.
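The Signal alerts Adam is teasing Joe about exist precisely so you don't have to fall back on "ask me something only I would know": both phones hash the two parties' public identity keys into a safety number, and if the numbers you each read out match, nobody has swapped a key in the middle. A rough sketch of the idea (this is not Signal's actual derivation, just the fingerprint-comparison concept):

```python
import hashlib
import hmac

def safety_number(my_identity_key: bytes, their_identity_key: bytes) -> str:
    """Derive a short, human-comparable fingerprint from both parties'
    public identity keys. Sorting the keys makes the result symmetric,
    so both phones display the same number."""
    material = b"".join(sorted([my_identity_key, their_identity_key]))
    digest = hashlib.sha256(material).hexdigest()
    # Present the first 30 hex chars as six readable groups.
    return " ".join(digest[i:i + 5] for i in range(0, 30, 5))

def fingerprints_match(a: str, b: str) -> bool:
    """Compare the two displayed numbers (constant-time out of habit)."""
    return hmac.compare_digest(a, b)
```

When a contact reinstalls the app or changes phones, their identity key changes, the derived number changes, and that is the alert Joe keeps ignoring.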
Yeah, that's right. But, you know, it's interesting that we talk about the endpoints, too, because I think you're right. The technology for end-to-end encryption in things like Signal, and really what WhatsApp is doing, too, is far better than what we had a few years ago. It's something I've actually believed in for a very long time, and they're doing great work. But I shouldn't say the bad guys. I would say that the people who want to get around that, whether bad or whatever, they're clever.
Yeah.
The most obvious thing for them to do is to go after the endpoint. And, you know, it's like, you know, yes, iOS is tough. Yes, it's not so easy to backdoor at all. But I've probably gotten more respect for them than in the past. They can certainly do it on a one-off basis.
I mean, that's what the Pegasus software that NSO Group puts out does: it hacks into iPhone endpoints and is able to do surveillance there, and not just pick up the encrypted messages, but also collect information from the phone, you know, turn on cameras and microphones, stuff like that. And Citizen Lab, which is at the University of Toronto, studies this kind of spyware. We're seeing governments use this, you know, not against terrorists, not even against, you know, foreign diplomats. We're seeing governments use this against, like, activists who think there should be a tax on sugary soda because it's bad for children's health. And we saw this, I guess it was Saudi Arabia, spying on the phone of the journalist Khashoggi's wife, and then he was murdered. So, you know, I mean, this is serious shit. It's journalists, it's citizen activists. And in the spyware industry, we're not seeing the kind of checks that you would need to see in order to make sure that these really inherently dangerous tools are not misused. But then the question is, what checks are there? So that's a difficult question.
I was gonna say that there's the matter of checks. And there's also the matter of something that, you know, from speaking to you, Jen, has gotten into my head quite a bit. It's like, they were starting with, okay, diplomats, real terrorists, the super, super bad guys they're going after. And now, you know, you hear them going after journalists, or maybe the next tier, and then maybe activists who are, you know, not dangerous to anyone, or parents, but at least annoying. And it makes you wonder, are they getting better at it? And is this technology getting closer to, you know, much more widespread adoption?
Didn't they go after parents at PTA meetings?
I don't know if they got into their phones and their messaging.
I think they did, kind of, generally. Keep in mind, right? And Jennifer, you can correct me if I'm wrong. Let's do a fact check here, right? It doesn't have to be spyware. It could be a zero-day where you send a zip file to an iPhone, it tries to render it, and it ends up doing something. There are so many different zero-days out there that we really don't know about. You don't necessarily have to download a payload, put the payload on the phone. You just exploit it: oh, wow, this is a nice little zero-day.
I mean, you don't even need zero-days, right? You can have known exploits, but if people are clicking on malicious links or opening malicious attachments and you are on a device that will load that code, then you can be owned that way. It's not even necessary that it be a particularly novel attack. But, you know, it depends a lot on how secure the software you're running is. And that's why they have very high-end specialized tools to attack the iPhone, because the Apple environment is locked down in a way that Android isn't, just for example. I have an Android phone.
Yeah, you're right. I mean, most of what we spend our time battling is really good old-fashioned phishing, the classics that keep getting better. And to tell you the truth, doing it on a phone, I mean, I personally am a big advocate of the iPhone and iOS. In the Android versus iPhone thing, I'm an iOS guy.
I'm a Google person. I am all Android, all Google. They own me.
Well, either one. Not that it's impenetrable, but I don't know. I guess it's a level of comfort or something. But then the paranoia also creeps in, in terms of there's no ultimate assurance that everything's working the way it's supposed to, and that the controls are there the way we think they are. I guess I'm a little gun-shy. I'm worried about the next time I'm going to have my heart broken again.
You know, email in particular, all your email, things that are stored on Facebook, Twitter, Twitter DMs, anything like that: all that information is available to governments. It's just a question of what legal process they need in order to make the company comply with turning it over. And, you know, other countries, even so-called friendlies, have rules about getting access to this information. They treat their citizens differently than they treat us. We treat our citizens differently than we treat them. And then we trade information back and forth. And it's not necessarily that hard to get the right kind of legal process. You know, the search warrant is kind of the gold standard in U.S. surveillance law. But one thing we see is that it's just super easy to get warrants. Not all courts are really scrutinizing them. We've seen, in some cases, that the police can get a warrant to search a phone in basically every case, because they're arguing, well, we know people use phones for everything, so they must have used phones for this crime. So we're going to look at the phone, and then they can look at whatever is there. Same thing with Facebook accounts or email accounts or that sort of thing. So one of the things we're working on at the ACLU, the American Civil Liberties Union, where I work, is trying to make it so that the law draws warrants more narrowly, so that every criminal investigation is not like a fishing expedition through all of our information that's stored on servers, you know, with accounts that we've been using for the past, I mean, in my case, you know, 85 years, but for the past, you know, whatever it is.
85 years, wow. Now I feel bad. You look really good for 85 years. But I'm going to tell you something funny, Jennifer. I was working for somebody, and it was not Joe. And what I did purposely was, when I was communicating with a third-party vendor working for our company, I said, I'm going to send you the password via Signal. And I sent the password via Signal. And my boss came back to me and said, why would you do something like that? Just send it in an email. And he was the manager of an organization in cybersecurity. He goes, what does it matter? We're not the NSA. And I just kept my mouth shut.
I mean, it's not just the NSA, right? You're afraid of law enforcement, which could be the FBI or local. And then you've got the people who run your email server and all the nodes in between that are, you know, able to look at something when it's not end-to-end encrypted. So.
All right, look, I get it. You know, servers are encrypted. Most email servers will send email encrypted server to server, but you don't always know where that email is going. You're not doing a trace from server to server to server. And our cell phone companies actually allow those emails to pass through their servers purposely. You can't do it without going through them. So somewhere or another, that text is going to be there. And plus, you can't control what somebody does on their endpoint. What happens if they lose their machine, their Outlook is open, their browser's open, somebody looks over their shoulder? I mean, I understand, even if it was encrypted, that's okay. But if somebody grabs your machine and it's open, you have no control over it. When I use regular email, I would send a link or something like that instead of the content, so the person has to go to the server and pull it down over 443. Hopefully they don't have a man-in-the-middle attack where that certificate's being intercepted and re-signed by somebody else. But people have to use best practices. And I have another question for you. You ever look into those other third-party phones, the secure phones that don't use Google, that don't use iPhone, that have their own kind of sort of OS? You know, I looked at those a couple of times and I was like, I don't think so.
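The man-in-the-middle worry Adam raises, a certificate being swapped for one signed by somebody else, is what certificate pinning guards against: the client records the SHA-256 fingerprint of the certificate it expects and refuses anything different, even if the substitute would otherwise chain to a trusted CA. A sketch of the check (the certificate bytes in the test are placeholders, not a real DER encoding):

```python
import hashlib
import hmac

def cert_fingerprint(der_bytes: bytes) -> str:
    """SHA-256 fingerprint of a certificate's DER encoding -- the value
    you'd record ("pin") after a first trusted connection."""
    return hashlib.sha256(der_bytes).hexdigest()

def pin_ok(presented_der: bytes, pinned_fingerprint: str) -> bool:
    """Accept the connection only if the presented certificate matches
    the pin; a re-signed substitute produces a different fingerprint."""
    return hmac.compare_digest(cert_fingerprint(presented_der),
                               pinned_fingerprint)
```

In a real client you would pull `presented_der` from the TLS handshake (Python's `ssl` module exposes it via `getpeercert(binary_form=True)`); the comparison logic is the part sketched here.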
I mean, I think it's very hard to, like, home-grow your own security, roll your own high-end crypto. Usually that doesn't really work. But one of the things we saw, there's this one case where, and I may be botching the facts a little bit, but I think it's really interesting just to show how invasive the government can be. There was a small company, a set of guys who were creating an allegedly secure phone to sell to alleged criminals, drug dealers and others. And they caught one of the guys, one of the engineers, for something. And they were like, okay, you're going to work for us now, or you're going to go to prison for a long, long time. So he went to these other people who were the customers of this phone-for-criminals kind of company and just kept selling the phone, even though it was backdoored in such a way that the FBI could get access to the information. And then at some point, the international authorities cracked down on all of the rings of people who were using these phones for various, allegedly nefarious purposes. And it was really interesting, because they flipped the engineer to become a state collaborator. They contrived the way that the data was transmitted in some way, which is so complicated. It's got to be to avoid some country's laws about what you can and can't do. And then they just basically were sitting on the wire, finding out everything that these people who were using the phone for illegal activity were doing. We don't know how many people were using the phone for legitimate lawful activity, and what governments did with that information, either. On the one hand, as somebody who likes hackers, I'm like, that's a good hack, FBI. Like, very impressive.
And on the other hand, as somebody who's, like, a civil libertarian, it's like, something's not respecting the privacy expectations here. And, you know, I mean, if your MO is, we flip engineers in companies, then, back to what you were saying, Joe, that is pretty scary. I don't think in a big company like Meta, or in Signal, where the source is open source or whatever, that it's the same thing. But yeah, I mean, that's a crafty mode of breaking into stuff.
Yeah, but here's the thing, the other side of that, the way I see it is, yeah, it's terrible, to get this criminal they had to flip someone to get them to backdoor stuff. But so much of the stuff that's out there is already backdoored, that they are already getting into, that they do have access to.
But some are backdoored publicly, like a certain proxy. In certain countries, you have to use that proxy through the government. You know what I'm talking about.
Well, the funny thing is, I've encountered that before with countries where they say, you know, you have to use our internet provider, you have to use our proxy server or content filtering. Yeah, yeah, exactly. And the content filtering is funny, because in these countries, very often their definitions of, you know, obscenity and what's inappropriate are very different from ours. Oh, absolutely. Like a regular bikini? Right. But in any case, you know, we see that and we say... I don't know. Again, maybe I'm a little too uptight about these things, but we say, you know, that's so annoying, that's terrible, I feel so bad for these people having to live with it. But, you know, in a sense, what they have is very overt, whereas where we have it, we really have largely the same thing, where all of our activity is tracked and collected. It's just not quite as obvious.
Well, you know, I mean, that's why we fight for specific laws. Obviously transparency is a big deal, but we fight for specific laws that will make sure that this type of surveillance, you know, only takes place where there's some demonstrated need and legitimacy, where you have to go to a judge as a law enforcement officer and explain why you want to have access to this information. One of the things the Snowden revelations showed us is that they had, you know, 800 ways to circumvent those legal protections. And so we keep trying to put more and more legal protections in place. It's not clear to me that for some things, even with legal protections in place, the government's capable of or interested in complying. But we have to use a combination of the tools that we have in order to protect people's privacy. And those tools are technology. Those tools are, you know, the law also. So we do our best. I think sometimes we just see that the law is not up to date enough to really protect us against the kind of surveillance that we're seeing.
How about this? The gentleman in California that did, I believe, the mass shooting, and his phone was encrypted, and they went to Apple and said, unencrypt it, help us, we need to, you know, protect the citizens. What are your thoughts on that? Are there times when they need to have access?
I mean, I think that's a perfect example of overreach. So, you know, the Apple versus FBI case: there was a shooting in San Bernardino, and one of the shooters had multiple phones, and one of the phones the government could not unlock. And so they brought it to Apple. And Apple was like, we can't do it with this version of the OS. We're not capable of helping. And so the government chose to very, very publicly sue Apple. And I think that was a contrived effort. They thought that because it was terrorism-related, they were going to get a lot of support from the public. And I think they were very surprised that people did not support them and did not agree with the government's argument that Apple should basically have to build a whole new set of software that could be used to subvert security on any iPhone, you know, because of this particular case. Ultimately, they were able to unlock the phone without forcing Apple to create a backdoor that could have been used by any government around the globe. And guess what? They found nothing there, which is not that much of a surprise, because they had the other phones as well. They had to back down. And people in other countries that form a huge part of Apple's user base were like, I don't want the FBI to be able to get into my phone, you know, just by sending an email to Apple, or by loading new code that Apple had to create. And I think there's a real opposition among the public to rules that would force companies to create backdoors, either, you know, sort of spur of the moment, or with design mandates. And there are countries, the UK and Australia in particular, that are looking at design mandates to try to prevent vendors from creating secure software, whether it's end-to-end encrypted or with lockout mechanisms or that sort of thing.
But I don't think that holds up in the scheme of things, because when you have a vulnerable communications network, then you have hackers, stalkers, identity thieves. And I don't think that's legitimate.
I agree. Well, we can tell you, coming from security, something that we actually talk about all the time, and we talk about on the show constantly, is that the hackers, you know, the real bad guys, they will go after every weakness and they will find it. Every single one. They will find it. They will figure out how to exploit it. And if you put things in there deliberately, you're really making it easy. Yes, absolutely.
There's no door that only the good guys can walk through. And I think one of the things is, people say, oh, hackers, whatever. But what about other governments? There are other governments out there. They don't necessarily have the same, you know, protections for protesters or journalists or whatever that we do. Once you have it, the other governments are going to be like, you did it for this government, you've got to do it for us too. And then the other thing is our government. Our government is not reliable. They have not demonstrated the level of respect for the rule of law that they need to in order for us to really feel comfortable that they'll never abuse their access, or won't abuse it to a significant degree. We talk about good guys and bad guys, but if you look at the history of US surveillance, and I don't mean long-ago history, with spying on Dr. Martin Luther King and that sort of thing, I mean current history, with reports from the federal courts about what the NSA does and, you know, their failure to comply with the rules and regulations. I'm saying our government, now, is not trustworthy in terms of following those rules. So, in that case, we need to protect ourselves to a large extent. And believe me, there's more than enough other information about us that's out there for the taking. So I'm not super concerned.
Well, that's really interesting, and that's something I actually wanted to ask you about when I had read the book. This had been on my mind, just all the things that the government is doing, and they say we have the authority to do this, we have the authority to do that. And as I run down the list of all the things they've done and all the justifications and everything, I say, you know, sometimes, I'm not a lawyer, I'm a technology person. I'm a bit of a simpleton when it comes to these things sometimes. And I read something like the Fourth Amendment, and it's very simple to me and very clear. It's pretty straightforward. But then I hear some of the arguments that get around it, and I understand, with lawyers there's nuance, there are things like that. But when you hear about some of the really tortured ones, it does kind of trouble me to say, you know, these are people who supposedly took oaths that they were going to follow this. And they're really blatantly breaking it.
I think the Fourth Amendment is classy.
Very concerning.
Yeah, I mean, the Fourth Amendment is very classy. It has this nice ring to it. It's very straightforward. But in terms of the way that it's actually interpreted in the law, it's pretty much a fucking disaster. And as somebody who, like, that's my favorite amendment, my doormat says "Get a warrant," you know. I've got to get one of those. That's cool. Most of my work is about the Fourth Amendment, but there's all kinds of ways in which the government gets around it. And I'm just going to give you an example. So if something is protected by the Fourth Amendment, then in order to access it, law enforcement needs to go get a warrant. And the magic thing about a warrant is that you have to go to a judge, an independent entity from a different branch of the government, and show probable cause that there's a crime going on and that you're going to find evidence. Probable cause is not a big thing. It's not more likely than not. It's just that there's a likelihood or a probability. It's not that high a standard. It seems like that should be a minimum to get access to people's most private information. But to this day, the federal government argues that it does not necessarily need a warrant to get access to your emails or to information that you've posted on Facebook and your communications on Facebook. They have all kinds of excuses, number one of them being that the terms of service for these products say that the company can look at your content under certain circumstances. And the government says, well, if it's not secret from everybody, then it's not private. And if it's not private, then the Fourth Amendment doesn't apply. So even to this day, when they're talking about end-to-end encryption and the San Bernardino phone, and they're saying, well, this is warrant-proof, this is warrant-proof data, it's like, well, guess what?
You're not even promising that you're always going to get a warrant. You've got all kinds of ways that you've got around it. And if you don't get a warrant or your warrant is insufficient, then you have all kinds of arguments why you still get to use the data. So I don't have a lot of, you know, I don't have a lot of sympathy for that. It seems like in this day and age, it should be pretty clear that your social media information and your email are protected by the warrant requirement. And that's what the government has to do if it wants to read my stuff.
So let me ask you this. I saw a video the other day where a trooper pulled somebody over and said, license and registration. The guy goes, I don't have to provide that to you. I don't have to provide my name. He goes, yes, you do. And he goes, no, the Fourth Amendment says I don't have to give you my name. What are your thoughts about that?
Okay, this is a hard question for me, because I actually don't know the answer to this, and I should, because I'm a Fourth Amendment expert and it's a real-world thing. Oh, I stumped you. Yeah. I think it depends on why they got pulled over. But my understanding is that you don't need to provide ID. Even if you get pulled over for a reason, I think you don't need to provide ID. But if you don't provide ID, then they are allowed to detain you until they can somehow independently establish your identity. And that sucks. So most people show the ID. But I think that's the rule: you're right, you're not obligated to disclose it. But as soon as we're done with this podcast, I'm going to look it up. And if I'm wrong, I'm going to be embarrassed.
That's good. This was not a setup. I bet. No, I'm sure.
I'm sure Adam has all kinds of other questions for me that I maybe don't know the answer to.
I've got one for you. Hopefully this is easy, because think about what you were saying before. It's like, okay, we're in the modern age, and I know the Fourth Amendment is written in 18th-century language or whatever, but I think it easily translates anyway. Okay, so especially hearing about that: our social media, our email, our phone calls, if they're over the phone, all these things are not protected by the Fourth Amendment?
They are protected.
They are protected. But the government's not recognizing it, is that it?
They argue about it, but they are protected.
Okay, but I mean... Sorry, I just got excited.
I was like, let's be clear. No, that's okay. You're wrong.
All right, all right. No, I agree with that. But I mean, okay, so let's say hypothetically we accept some of these government arguments about how all these things are not protected. You know, then what is? If I scrawl out my shopping list with a big feather on parchment and, you know, lock it in my basement, what's left if you take away all this stuff these days?
The quill version of your shopping list.
Yeah, quill, you know, a little ink quill and all that.
This goes back to also like, I turn around and I throw some papers in the garbage. And then the garbage goes on the street. Is my garbage protected? Is it not protected? It's on the street. Can law enforcement look through my garbage? Do they have to wait for it to go in the truck? Do they have to wait for it to go to the dump? These are the weird things that I think about.
Well, I mean, the garbage rule in the federal courts is that when you throw it out, you've abandoned it, which is ridiculous. And a number of state courts are like, that's ridiculous, people have the reasonable expectation that others are not going to just dig through their garbage. So this abandonment doctrine is really highly questionable. But, you know, it's an example of the government going for it. And Joe, to your point about the technology, what you're really asking is, when technology changes, what does it mean for our protections? And I think there's two things. One thing is that what the internet and technology have done is gather more of our information than ever before and centralize it into a nice, delicious honeypot for people who want to surveil us. And it used to be more expensive to collect that information about us. Today you can go to the phone company or to Google and track where I've been 24 hours a day. Not just now; you can go back into the past and find me too, which was impossible in the olden days. Even just to follow me, you used to have to pay a police officer to follow me around, and that was a natural disincentive. Now it's totally cheap. You just send some kind of legal process, which used to not even be a warrant, and you can follow me around. So it's cheap and easy, so why not do it? So when technology has lowered the barriers, the law has to step in and do more. The law has to raise the barriers so that we have at least the same amount of privacy now that we had when the Fourth Amendment was crafted, so that we don't have to use a quill and an inkwell in order to do stuff that can then be private.
So I wrote a paper for my master's about, you know, telemetry and MyFitnessPal. And putting government aside: this guy turns around and, allegedly, cheats on his girlfriend or his wife, sleeps with a woman. They happen to be Fitbit friends, not Facebook friends. She sees that his heart rate is 140 at 3:30 in the morning. She then has her attorney, I think, subpoena the rest of it, the MyFitnessPal records and everything. And my point is that so much stuff is shared, you can deduce what's going on, and you put yourself in a precarious position. You know, another person was playing Pokémon Go, and he was friends in the game with his ex-girlfriend. He happened to be at this address, and she knew where he was, went to the house, and then tried to kill him. These are the things. People, you share so much information.
Well, you know, it's like that story about the Strava app, where they were able to... Yeah, oh, the military.
Yeah, yeah, absolutely. In the Middle East, running around the street.
Yeah, like where a bunch of people were running around. But yeah, I mean, I think your point is a good one. I'm a constitutional lawyer, so I worry about government access to information. But you're absolutely right. We're sharing more information with people than ever before, too. And it has something to do with the relationships of, you know, spouses, I'll say, and parents and their children and friends and that sort of thing. It's a different kind of society when we know a lot more about each other. And personally, I think we have to be more kind with each other. Like, when you can go back on Twitter and find some really embarrassing, cringy thing that somebody said four years ago, I think we need to evolve beyond that kind of gotcha culture, because we know so much about people now. We just need to realize that people are complicated; we don't always do the right thing. But yeah, I mean, it's an issue for stalkers, for employers, for all kinds of civil uses of this data as well. Redlining, you know. I mean, another thing is setting bail. They now have these risk analysis tools where they put in data about you, and they're like, oh, we think this person won't come back to court, based on, you know, just a bunch of data that's out there. And people who don't generate that data get a different result, even if they're basically the same kind of person. So there's all kinds of bias in the system based on what data is out there and what can be collected.
So if I'm trying to get Ryan Reynolds email address, is that considered stalking?
I think that no, because you keep emailing the wrong email address. So I think that's okay.
Oh, so it's not like attempted murder if you fail, it's not a crime when it comes to stalking?
Well, I'm sure once you get a hold of him, he's going to want to hang out. So I don't think that's stalking. I think that's going to be fine.
I mean, listen, if he does do the podcast... So just so you know, and I'm digressing, I sent him an email. I'm like, dear Ryan Reynolds, I know you've become somewhat successful running, you know, an okay business. I would like to get some ideas about how you're running cybersecurity for your businesses. We would love to have you, you know, during your busy day. You seem to be so successful. So we're looking forward. Oh yeah.
I'm sure he jumped at that chance.
Oh yeah. Yeah. Yeah. Adam, Adam, you know, out of the 500 he gets every week, you know, he's definitely going to go, go right to yours. I'm sure that's going to work out.
Uh, well I'll get Ryan.
When you do, I'll be very impressed. Actually, I'll be frightened to find out how you did it.
Ryan, if you're listening to this podcast, give Adam a call, please.
And wear your Deadpool outfit when you do the podcast.
Exactly.
Okay. Well, when we start getting into Deadpool, I think we're headed for the last call then at this point, you know.
I mean, I have one last thing I want to say, because I think that, you know, you can talk about technology and you can talk about constitutional law, but ultimately our government is made up of people. And I think that people need to understand the power of the government and how much information about us is out there and how it's used: how it's used to spy on protesters, to look for information about journalists, to kind of control, you know, inputs about who can get a loan and how much, and all sorts of things like that. It's important. This data really belongs to us, and it's important to control the way it's used. We can't protect ourselves on our own, so we're going to have to act collectively, and that means looking to the law. So I think people should educate themselves. You've got to call your representatives when there are bills that seek to tear down our privacy and get rid of encryption. They need to hear from us. And when there are safety measures that are implemented, we need to make choices in the market and otherwise to say, these are the products and services that we want. So I don't feel hopeless. I probably should, but if I did, I wouldn't be able to get up and go to work every day. I have what I like to call irrational optimism, but I think it is rational. So we as the people, we the people, we can do something about this. We just need to allocate a little bit of attention and make sure that the people who are representing us in government know that we're watching, and that we care, and that they'd better do the right thing by us.
What are your three recommendations to the audience that's listening on how to protect themselves, whether it's from the government or whatever? What are the top three things anybody should do to protect their privacy?
Yeah, I mean, I think that's a great question. It kind of depends on what your threat model is. You know this from being in security. It's like, who am I worried about? Am I a 15-year-old who's worried about my mom? Am I worried about my boss? Am I worried about an abusive spouse? These are all different things. But I would say, number one, a password manager. It's so simple, but people should not be reusing passwords, and they should not be sending them over email. A password manager is an extremely important tool. Number two, I would say, is to use end-to-end encryption whenever possible for your communications. There's no reason why all of your conversations into eternity should be memorialized for somebody to just pick through later on. And then number three, I would say, security updates. When vulnerabilities are found in software, you should update, because those updates are improving your security, as opposed to leaving yourself open to some known vulnerability. These things sound simple. They're not super romantic or anything, but they're basic things that everybody should do.
I was going to say, you're absolutely right. We've talked about these things on the show, and we've also talked about how a lot of doing security is not very whiz-bang. A lot of it is kind of dull and kind of grinding, and just remembering to do it all the time and not make a little exception. But yes, those things can make a big difference. And I'll add something else, since we're talking about technology.
Oh, fair enough.
Sure. And maybe the political side, too. For all my talk of doom and gloom earlier, the truth is I think there is a lot more awareness than there used to be. People are getting more involved, and I think we are seeing more traction there, as you said, with the Apple case, people standing up, and I think even some of the things we're seeing lately. And you're right, that's the true path to change. This is not something that technology is going to solve. We're technology people, we want to have the strongest tools, we're defenders and all, but ultimately we have to make sure that all of us are keeping watch on the government, that we the people are making sure that they're doing what they should be doing and not doing what they shouldn't be doing.
And so, Jennifer, I have more questions. I'm kidding. One more question. So we all talk about putting that little cover over your laptop camera, because no one wants to see the video, you don't want to get that sextortion thing. But people forget that your phone has an open camera all the time. There are no real camera covers for your phone. And the audio can always be listened to. Not only can it be done covertly, it's also done, I guess, overtly, because you have your Google, you have your Alexa and everything else. How can you best protect yourself from your phone listening? You say, oh, I need salami, and then two minutes later an ad pops up for salami. How do you protect yourself? How do you stop this stuff?
I mean salami is really good so I think about that a lot as well.
I could have a lot of meetings. You don't want the government hearing that.
I used to have a little sticker over the camera on my phone, but you're right, if you have those kinds of services turned on, it is a vulnerability. For me, I love technology. That's how I got into this field. I do all kinds of things that are not privacy-friendly. I use Google Maps. I did 23andMe. I like what I can get from giving away my privacy. And I think that's one of the reasons why we need to pressure companies to protect the data: by not collecting it, not storing it, and if they do store it, deleting it, data retention limits. If the government comes asking, make sure you require a warrant. Encrypt it in transit, encrypt it on the server, all of that stuff. So it takes a village to let us enjoy the benefits of technology without everything just falling to pieces.
So, you know, I mean, why doesn't GDPR come to the US?
I think there's a lot to that. It's different politics, I think, to a large extent. I think that we have a more free-market kind of culture, and I think our Congress is a little bit ineffective; it's hard for them to get things done. And I think we have some free speech aspects about what you can do with people's data that kind of conflict, our notion of free speech versus theirs. So I think there's a couple of reasons why that's true. In Europe, they have, I think, more regulation of what companies can do than we do as a general rule here, just because of a pro-business streak that I think is a little stronger here in the United States than in Europe.
I think it does reflect the culture and our feelings on it. Unfortunately, I would like it if we were more privacy-conscious here, like they are over there. We are, we are.
And I complain about our law when it comes to government surveillance, but we have a lot of things here that are better than what various European countries have. At least, if a warrant is required, we have review by a neutral and detached magistrate from the judicial branch. Not all European countries even have that; in some, it's the prosecutorial authority that decides when surveillance is appropriate and when it's not. So we have some benefits here that they do not. We have some things, they have some things we don't, but you know.
So, Joe, I'm going to save the rest of my questions for part two when we do it.
Okay, that sounds good.
Thank you so much. I really appreciate it.
All right. Yes. Jennifer, thank you so much for joining. And thank you, really. This is something we care about, but we're not experts, so thank you for devoting your career to this. It's important to everyone. I'm going to need it. You're going to need it.
She only has three more years left of life.
She's too old.
I'm like 85. You know, Adam, I was going to wish her another 85 years of success and everything, but you cut it down to three. Thanks, Jennifer. Adam, this has been fun as always. Everyone, remember, yes, please listen to the podcast. Now that we're on YouTube, we have to say: like, subscribe, follow the podcast, send us comments. We're available for all your security needs. If you want to talk with us, please reach out to us. Give us feedback. You might even get Adam to do a bar mitzvah. You never know.
And I also make an excellent French 75 cocktail.
Yes, now we know.
That's right. I think I need another one. This went down easy. I can do your parties now. Okay. This was incredible.
Awesome.
Okay. Thanks a lot everyone. Thank you. We'll see you.
