Across the world, millions of people have gathered to protest police brutality and systemic racism after an officer in Minneapolis killed George Floyd, an unarmed black man. Amid the outpouring of grief and support, tech companies like Google, Amazon, and Reddit have issued statements backing protesters and the Black Lives Matter movement. But these same companies also provide platforms and services that prop up communities of hate and help law enforcement disproportionately track and convict people of color.
This week on Gadget Lab, a conversation with WIRED senior writers Sidney Fussell and Lily Hay Newman about hypocrisy in tech, police surveillance, and how to safely exercise your right to protest.
Read Sidney’s story about tech companies’ relationships with law enforcement here. Read Lily and Andy Greenberg’s tips for how to protect yourself from surveillance while protesting here. Read Lauren Goode and Louryn Strampe’s story about what to bring and what to avoid at a demonstration here. Follow all of WIRED’s protest coverage here.
Sidney recommends the documentary LA 92, about the aftermath of the Rodney King beating and verdict. Lily recommends Mission Darkness Faraday bags from MOS Equipment. Lauren recommends this Google doc of anti-racism resources. Mike recommends donating to Campaign Zero and Grassroots Law Project.
Sidney Fussell can be found on Twitter @sidneyfussell. Lily Hay Newman is @lilyhnewman. Lauren Goode is @LaurenGoode. Michael Calore is @snackfight. Ring the main hotline at @GadgetLab. The show is produced by Boone Ashworth (@booneashworth). Our executive producer is Alex Kapelman (@alexkapelman). Our theme music is by Solar Keys.
If you have feedback about the show, or just want to enter to win a $50 gift card, take our brief listener survey here.
You can always listen to this week's podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here's how:
If you're on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts and search for Gadget Lab. If you use Android, you can find us in the Google Play Music app just by tapping here. We’re on Spotify too. And in case you really need it, here's the RSS feed.
[Intro theme music]
Michael Calore: Hi, everyone. Welcome to Gadget Lab, I'm Michael Calore, a senior editor at WIRED, and I am joined remotely by my cohost, WIRED senior writer Lauren Goode.
Lauren Goode: Hey Mike, I'm here at home as I have been for the past several weeks taping this podcast, but this is the week that lots of people left their homes and went out into the streets, and we're going to talk about that on this week's podcast.
MC: That's right. We are also joined this week by WIRED senior writer Sidney Fussell. Hi Sidney.
Sidney Fussell: Hey guys, thanks for having me on.
MC: Of course, thanks for coming back. As Lauren mentioned, it has been a very momentous and emotional week across the country and around the world. Millions of people have gathered to protest police brutality after a viral video showed an officer in Minneapolis killing George Floyd, an unarmed black man. The sheer scale of the demonstrations and the increasingly violent police response have dominated the national conversation. Police departments have also been scrutinized for their use of enhanced surveillance technology, which is often provided by tech companies like Amazon and Google.
While these companies make statements condemning systemic racism and violence, they've also provided platforms and tools that worsen inequality. On the second half of the show, WIRED senior writer Lily Hay Newman will be joining us to talk about how protesters can protect themselves from these digital surveillance methods. But first, let's get into some of the methods themselves. Sidney, you wrote a story for WIRED this week about tech's ties to law enforcement. Tell us more.
SF: Yeah. I was definitely one of those people who was shocked and stunned and horrified by what I was seeing, and at first I had that initial very good rush of, "Oh, it's so good to see all these companies speaking out for their employees, for the people who use their products, for the people who are affected." There was also at the same time a very big backlash where people were saying, "Well, it's great that companies like Amazon or Google are stepping up and using their platforms to speak out in support of the movement for black lives." But at the same time, there has been a lot of criticism about the relationship between big tech, Silicon Valley and these platforms, and the police.
One of the things I tried to talk about in the piece I wrote was how the very companies that are now tweeting out "black lives matter" have had years of controversy and years of pushback from civil rights advocates saying that they're furnishing tools to police that make things harder for on-the-ground protesters, harder for people of color. One of the best examples is Salesforce. Salesforce and GitHub both tweeted out in support of Black Lives Matter—and they both have contracts with Customs and Border Protection. GitHub very controversially had a contract with ICE last year.
And so you end up with a situation where it's, "Oh, thanks so much for the support, but you're furnishing tech to police." Similarly, Amazon has a product called Rekognition, which is spelled with a k—we don't know why. Rekognition is a facial-recognition product that's been sold to law enforcement, and there's been a lot of debate about whether it actually works or is simply inaccurate.
A lot of research has shown that Rekognition performs less accurately on darker-skinned faces, which leads to a whole other discussion about racial profiling: whether someone arrested and charged with a crime because of a facial-recognition match actually is the right person, and whether the use of Amazon Rekognition could lead to further stigmatization and further overpolicing of people of color if police departments were to adopt it.
And again, this has been going on for years; I remember covering this in 2017. And Jeff Bezos and Andy Jassy, these higher-up Amazon executives, spoke in favor of Rekognition. They said it would make people safer, and they defended it. It's really unsettling to now see them tweeting in favor of Black Lives Matter, in favor of the protesters, when they have in the past defended the very tools which have been criticized for potentially increasing the inequality and increasing some of the issues or frustrations that people are protesting against right now.
A big part of this has been the relationship between big tech and police, and the other part of this, which is where we get into talking more about Facebook and Reddit, is this issue of free speech versus policing white supremacy. Reddit CEO Steve Huffman was tweeting in support of Black Lives Matter when the former CEO, Ellen K. Pao, said, "You don't get to say Black Lives Matter when Reddit nurtures and monetizes white supremacy and hate all day long." It was the biggest, most shocking call-out I'd seen as I was writing the article.
And one of the things Ellen Pao says is that Reddit did not do enough to stop white supremacy. Reddit allowed people on places like r/The_Donald to come together and say these racist, problematic things, and so now are you saying that you support Black Lives Matter, when before you weren't doing enough to stop some of the racist speech? All of that brings us to what is happening right now with Mark Zuckerberg. Zuckerberg has said that while he vehemently disagrees with President Trump's "when the looting starts, the shooting starts" comment, he declined to remove that message when it was cross-posted from Twitter to Facebook.
That exact same message was unacceptable on Twitter, but it is acceptable on Facebook? Zuckerberg has pushed back against the backlash he's receiving. He's saying, "Yes, I can support Black Lives Matter and, yes, I can say that, while this is objectionable, I'm going to keep it on this site." And that's caused a lot of pushback within Facebook; a lot of employees staged what they called a virtual walkout. Right now a lot of Facebook employees are remote, but they still took the time to log off and protest.
Mark Zuckerberg and Sheryl Sandberg met with a lot of different civil rights organizations who specifically voiced their concerns about Zuckerberg's decision to leave that message up—that there's a clear connection between the violence we're seeing right now and this call to arms to stop looters using gun violence. And they basically said that if you can't see the connection between these two things, you're absolutely not in support of black lives. Facebook has offered, I believe, $10 million to different racial justice organizations, but at the same time, Zuckerberg is defending his decision not to remove that message.
LG: It sounds like what you're saying, Sidney, is there's hypocrisy at multiple levels. Tech companies are putting out statements of solidarity while they're either deploying tools that are used by law enforcement, or they're just allowing divisive or outright racist content to live on their platforms. I'm wondering if you could tell us a little bit more, if we know at this point, what kind of tech is currently being deployed on the ground during demonstrations and protests to potentially track protesters? What do we know about that?
SF: One specific technology that I'm especially interested in right now, and hopefully for a future story, is called Project Greenlight. This is a system of cameras in Detroit, Michigan. And what's so fascinating about Project Greenlight is that you have these CCTV cameras that were furnished by the city, but businesses can also register their own cameras to the same database. So if there's some type of crime or some issue, police officers can very easily see, "OK, here are the cameras we have, either ours or ones registered by business owners or homeowners, and we can see exactly where the cameras are and where they're pointing." And so they have all these different eyes. It's a public-private partnership that combines all these different real-time CCTV cameras.
And what's so interesting about that is how the use has changed so much just over the past year. This was introduced as a crime deterrent. This was supposed to stop things like drive-by shootings, burglaries, things like that. Then it got used for social-distancing measures. And so there was a real issue with people going out and violating the quarantine, people going out past curfew. There were issues with people doing large gatherings—you could upload footage or flag it and say, "Hey, we have this footage of a barbecue," or something like that. And now it's being used to monitor protesters and stop looting.
One of the things that people who study surveillance really talk about is that once you introduce surveillance that you think is just on the edge, like, "Oh, it's only for violent crimes," it morphs, it changes, it insists upon itself. It becomes something that you learn to rely on. At first, it was for very, very violent crimes and wouldn't touch most of society, but then the quarantines happened and now it's used for that. And then the protests are happening and now it's used for that.
With Project Greenlight, which may or may not include drones (we know that Detroit has looked into drone contracts, but we can't confirm it either way), I think you have the perfect example of why even a little bit of surveillance can be so dangerous, and why we do need to really, really push these tech companies. They may introduce some surveillance technology for one specific purpose, but it will mutate and it will insist upon itself as being important and long-lasting no matter what the situation is.
MC: One place where that's particularly striking is geolocation on smartphones. It's a feature that was sold to us as a way to add convenience, so you can see relevant information as you're walking around or searching for things. When your phone knows where you are, that sense of place can deliver information that could be helpful to you. But as we've seen, and as you talk about in your story, there's a way that the location information being broadcast by your phone is being used by law enforcement, with the help of tech companies, particularly Google. It's called a geofence warrant. What can you tell us about it?
SF: Right. The geofence warrant was something that, I believe in late 2017 or early 2018, a few people started looking into. It's a little bit complex, but basically, a crime occurs within a specific area; let's say there's a robbery or a shooting or something like that. With a geofence warrant, police will go to Google and say, "We would like the data on the devices that are within this specific area." And normally it's around 100 meters, 200 meters around the crime.
Basically, what Google does is offer not-fully-anonymized, random identifiers for the devices within this specific area, and the police are the ones who have to do the detective work of saying, "OK, who was in this specific area at this specific time? Was there any shady movement?" They narrow it down to just a few. And then from there they'll go back to Google and say, "OK, we have these four or five devices that were in this specific area at this specific time that move in patterns that look shady to us. Can you give us information on these four or five?"
The defense from police departments is, "Well, although everyone in this area gets pinged, we only learn who very few of these people are." It's mostly anonymized, and most people don't even know this could happen. You really have no way of knowing if it has happened to you, because unless the police contact you, you wouldn't know that you were in that initial set of randomized, anonymized numbers.
There's a lot of concern, though, about this idea that you get swept up just by being in the vicinity of the crime, and "vicinity" is doing a lot of work right now, because researchers have found that the scope of these warrants is often much bigger than the scene of the crime. You may know the house where the crime happened, so why do you need the devices for the entire block or the entire neighborhood? There's also been a lot of concern about why people who live in high-crime areas should be subjected to these dragnet searches, and what other types of data could potentially be shown to police.
There was a problem in North Carolina where police had sought five different geofence warrants and two of them were in the same public housing complex. Anyone who knows anything about public housing knows those complexes are very dense, so you sweep up a lot of people on each block. You end up with a lot of people being routinely put through this search for the sake of whatever crime it is. And I think that really speaks, first and foremost, to a lack of technological literacy among judges. Again, you do need a warrant for this, you do have to go through the court system, but I don't think a lot of judges are aware of how this works and how many people get caught up in it. Google does release transparency reports in which it talks about, "Hey, this is how much data we give to police."
And I recommend everyone look at these transparency reports, because the volume has doubled in the past two years. In 2017, Google received around 10,000 requests from police for user data; in 2019, there were 20,000. This reliance on Google user data in criminal investigations is increasing; it's becoming normalized. And so, going back to what we were saying about Project Greenlight, it may seem like an edge case they only do occasionally for a few people, but if it follows the rules of surveillance it could potentially be normalized and become the type of thing that gets used for lots of different purposes.
It's also worth noting that Google released a lot of information about social distancing, broken down to the county level, showing how much people were traveling before and after some of these quarantines started. Again, data that was created for the purpose of Maps and Uber became useful for tracking social distancing and is now useful for showing whether or not you were around the scene of a crime. The malleability of this data can't be overstated; we have to be very, very cautious about how it's being used.
LG: And Sidney, very quickly, it's also worth pointing out that some of the tools being used are flawed, and they're flawed because of the technology that underpins them, which we collectively refer to as AI. AI now applies to so many different things in the world that we cover, and we certainly get a lot of pitches where things claim to be using AI. But if the data sets that are informing those technologies are biased to begin with, that inherently results in fallible technology. Talk about that quickly.
SF: Absolutely. I mean, I think the best example that I've seen as it relates to the problem of bias in AI relates to this idea of tracking or preventing crime. It's like, "Oh, at what rate do crimes occur in this area? Can we predict from there when crimes will occur?" And it really overlooks what the definition of crime is, what types of crimes get reported, and which communities have the interactions with police that lead to crimes being "reported." The same crimes can be happening in different neighborhoods, but you would not see the data reflect that; you would see high crime in areas with high policing and low crime in areas with low policing, because the criminal reports, those statistics, are generated in areas where there are police officers to record that data.
When you look at what a police officer does, you have to remember that there's a person who's going through and collecting this data and sorting it and everything else. And I will just say that there's a lot to be said about the types of crimes that we're going to use this data and these resources to predict and prevent. And I think we should really talk about why we're trying so hard to prevent certain crimes and not others, and which ones can be reflected in the data and which can't be.
MC: All right. Well, right now we're going to take a break and then when we come back during the second half of the show, we're going to talk about some practical tips on how to protest safely.
[Break]
MC: Welcome back. Protesting is of course a constitutional right for all Americans. But in light of increased police surveillance and the use of force that we've all seen on TV, on Twitter and with our very own eyes, if you want to go out and demonstrate you should plan to do it safely. To help us talk through that we are now joined by WIRED senior writer, Lily Hay Newman. Hi Lily.
Lily Hay Newman: Hi, good to be back with you.
MC: Thanks for coming back on the show. Lily, you and our WIRED colleague, Andy Greenberg put together a guide about how to protest safely in this age of digital surveillance. Why don't you just give us some of the ways people can protect themselves out there?
LN: Yeah. We were specifically looking at how you can protect your privacy and your data and your digital security while you're out protesting. And I think there are two things to consider here, because there are also a lot of other safety considerations when you're protesting: physical safety, gear you might want to bring with you, staying hydrated, all these things, especially protesting in a pandemic. But there are also things to consider in terms of your privacy, and all of that starts with your smartphone.
You want to both be thinking about the wireless emanations from your phone and the wireless communication that's happening between your smartphone and cell towers or wireless access points, things like that. And then you also want to think about the data that is locally stored on the device or accounts that you're logged into on the device through apps or the mobile browser, things like that. Because if your device is confiscated by police, if police detain you and ask you to unlock your device or demand that you unlock your device, things like that, they can suddenly gain access to all that data on your phone.
The first thing we think about is just: do you need to bring a phone at all? For most people in most cases the answer is "Yes," realistically, in today's world. But if you're going to a protest nearby where you live, or you're driving there with a group and you already have your people with you, it could be a situation where you actually could leave your phone at home. And that's kind of the best way, if it's possible, to just negate all of these concerns; that's the way you can know for sure that no one's tracking your phone and no one's going to see the data on your phone, because the phone isn't there at the protest.
LG: Lily, it sounds like you really think people should try to leave their phones at home. But let's say that you've weighed your options: you want to be able to photograph things or capture video, or you just feel you need your phone on you for other safety reasons, and you've decided to bring it with you. What are your options then?
LN: Some ideal scenarios would be something like bringing a burner phone, a cheap prepaid device that you might just pick up at a corner store or a drug store or something like that. It has as little registered to you as possible, and it's just a throwaway type of thing; it's not your normal number. All of those things would really help reduce the usefulness of the data that a surveillance dragnet would collect about that phone.
Another option, for people who have a second phone, maybe a work phone or a second device you have for various reasons: if it has less data on it, if you use it less often, if you don't have a lot logged into it and it's convenient to keep it more empty, that's another good option to bring with you. If you're at the point where you're thinking, "I just need to bring my primary device, the only device I have; I need it to coordinate or in case I get in a bad situation," here are some things to consider with that.
Sidney was talking about geolocation as a factor in this. We're also thinking about devices known as Stingrays, and about mobile access points controlled by law enforcement that put out WiFi. These are fake cell towers or fake hotspots that provide your phone with some connectivity, but really what they're doing is intercepting data. They sort of trick your phone into connecting because they're a strong signal close by, but they're not a legitimate cell tower or a legitimate WiFi hotspot. Those are some of the types of things that you're concerned about.
One thing you can do is just keep your phone off as much as possible and only turn it on if you need to make that emergency call or if you need to check where someone is. Another option which is recommended by a lot of activists is similar to keeping your phone at home and not bringing it at all in some sense, is to use what's called a Faraday bag. It's an enclosure where radio signals can't penetrate. All the antennas and various sensors in your phone, they're still in your phone just like normal but they're in this enclosure, in this case a pouch or a bag and nothing can talk to them basically.
You can leave your phone on, everything can be normal, but when it's in the bag you're good, and when you want to use it you take it out, use it briefly, and then put it back in the bag. It's an easy way to make sure you can't slip up and turn it on by mistake when you didn't mean to; it's just physically in the bag. And then the other thing to think about, going back to the data on the phone, the crucial thing is just locking your device and, on Android phones, making sure you have full-disk encryption turned on. That's in the security settings, and it's automatic on iOS if you add a passcode.
LG: Lily, one question I have is if you are compelled by authorities to show them your phone or unlock your phone, what are your rights?
LN: Your rights are that you shouldn't be forced to unlock a device for a search in the middle of a protest in the middle of a street without arrest, without going to a precinct, without a search warrant, things like that. But in practice, the concern that we're thinking about is just the heat of the moment and your realistic feelings about your safety in that moment or what you feel comfortable with.
MC: There's seemingly a difference in terms of the different ways that you can set your phone to unlock either with biometrics, with a face print, with a thumb print, with a passcode. What is the difference and which one would you recommend for people that are going to protest?
LN: I think the easiest answer is that a PIN or a passcode is always the recommendation, preferably six digits. That is sort of the baseline recommendation for going to a protest. Some operating systems offer an emergency feature that switches to requiring the passcode. It's like, if you use the thumbprint or face unlock because it's convenient in your daily life, but suddenly you're in a situation where you're thinking, "Well, I don't want someone to grab my wrist and just put my finger on the phone," you can press the home button and the side button or something like that to initiate this feature, and it'll ask for your passcode. If it's too much effort in general, I think it's still worth setting up for when you're going to the protest, and then you can turn it off later and switch back to biometrics or whatever you prefer.
MC: I would also add that I know there's a feature on Android phones where you can leave the phone unlocked if you're carrying it on your person. Since the accelerometer in the phone knows which way gravity is, it knows when you're carrying it in your pocket or you're walking around with it. It also knows when you're close to your home and it will stay unlocked when you're close to your home. These are all things that you have to opt into to turn on, and if you've turned those on you should definitely turn all of those off, basically anything that makes it easier for your phone to automatically unlock.
LN: Yeah. I know it sounds like a lot of different eventualities and a lot of different things to consider, but I think the most important concept is just having it in your mind to like, "Oh, there's a difference between bringing my phone with me and leaving it at home." Or, "There's a difference between taking some precautions to keep it off or keep it in a special bag versus just using it totally normally." And I think if you just have that in the back of your mind you'll naturally make some small modifications as you're able to protect yourself a little better.
MC: All right. Well, I highly suggest that everybody who's listening to this goes out and reads the piece that Lily and Andy wrote about protecting your privacy during protests, and also the guide that our own Lauren Goode and Louryn Strampe on the Gear team wrote with general practical tips for protesting out in the streets, for exercising your First Amendment rights in a way that protects your safety, your privacy, and of course your sanity. Let's take a quick break, and when we come back, we will go through recommendations from everybody on the show.
[Break]
MC: OK, Lily, let's get started with you. What is your recommendation for our listeners?
LN: Since I suggested that people use a Faraday bag to hold their smartphone if they go to protest, I have a Faraday bag recommendation; just trying to do a service here. Mission Darkness Faraday bags are made by MOS Equipment, and they're exactly the type of thing you need; it's just a pouch. They even have other formats, like duffle bags where the whole bag is a Faraday bag. Pricing for the pouches is about $25 to $100, and more for the bigger bags. But the reason I wanted to give a Faraday bag recommendation is that they're not all legit. If you just google it and find something random, it may not actually block everything you want to block. This was actually a recommendation from Harlo Holmes, who's director of newsroom security at the Freedom of the Press Foundation, and yeah, Mission Darkness Faraday bags are a good option.
MC: Great. Sidney, what's your recommendation?
SF: My recommendation, for people who like me are very overwhelmed with social media and still want to learn a lot about riots and protests and some of the things happening right now, is a wonderful documentary on Netflix called LA 92. It's about LA in 1992 and the chaos surrounding what happened to Rodney King. It's super relevant; there's no narration whatsoever, it's entirely archival footage. And what I think is so incredible about it is that it really shows how long we've struggled with the idea of the viral video.
I think that's something that we're seeing right now, social media is flooded with tons of videos from tons of different viewpoints. But with what happened in 1992 and of course the infamous Rodney King video, really from the very beginning activists and people on the ground were having a discussion about what do people think when they see violence in videos? And I think that now that we're being completely flooded with different videos of horrific violence, I really would like people to watch this documentary and really ruminate on what it means to watch this stuff online and to share it and whether it's serving the purpose that you think it is.
MC: I can second that. I was in high school in Southern California during the Rodney King incident, and I found the documentary to be very powerful. In a way, it's almost as powerful as living through it the first time. OK, Lauren, what is your recommendation?
LG: My recommendation is a Google Doc that is being shared widely on the internet right now. I first saw it shared by Brittany Packnett Cunningham, who's a cofounder of Campaign Zero and a cohost of Pod Save the People, but the document was actually compiled by Sarah Sophie Flicker and Alyssa Klein. It is a list of anti-racism resources aimed particularly at white people and parents, to try to deepen the work we can do to be anti-racist: ways we can start at home, ways we can do this on social media and in our workplaces. There's a list of books, podcasts, and articles. Some of the articles are going to take a lot of work, but that's the point; they're really worth reading. There are videos to watch and a really comprehensive list of books to read. We're going to link to this document in the podcast notes, and I hope that you all take a look.
MC: Thanks for that, Lauren. That's very valuable. For my recommendation, I'm going to share a little bit of information about myself. I am a white man and like many other white people I am wondering what I can do to help. And what I have heard from my friends, white, black, and brown, is that the best thing you can do is open your wallet. There are a lot of people asking for money right now, there are a lot of places you can donate to right now. And if you're unsure of where to go I'm going to give you two places that you can donate that have been vetted and they're great organizations doing really great work towards police reform and criminal justice reform.
One is the organization that Lauren just mentioned called Campaign Zero, which is working towards reforming police activities and the way that particularly black people and communities of color are policed in this country. And the other is the Grassroots Law Project, which is working specifically towards criminal justice reform. That's my recommendation, open your damn wallet, give money to these organizations that are doing good in the world right now at this moment.
All right, that is our show for this week. Lily, thanks again for joining us.
LN: Stay safe everyone.
MC: And Sidney, thanks for coming on the show again.
SF: Thank you, thank you for having me.
MC: And thank you all for listening. If you have any feedback you can find all of us on Twitter, just check the show notes. The show is produced by Boone Ashworth and our Executive Producer is Alex Kapelman. Goodbye, and we'll be back next week.
[Outro theme music]