Category Archives: Technology

Pushing Back On Flock Cameras with Kate Bertash

Download This Episode

Kate Bertash of the Digital Defense Fund joins us to talk about Flock cameras, automatic license plate readers, the ubiquity of AI-driven surveillance, the databasing and storage of real-time information on people’s and vehicles’ movements in public, and the privacy fears this raises. Kate also speaks about organizing with her village-mates to counter or limit these systems, and about artful approaches to resistance with her Adversarial Fashion project.

  • Transcript
  • PDF (Unimposed) – pending
  • Zine (Imposed PDF) – pending

Good Links:

Articles on the subject:

. … . ..

Featured Tracks:

  • Somebody’s Watching Me (Instrumental) by Rockwell

. … . ..

Transcription

Kate Bertash: My name is Kate Bertash, pronouns are she and her. I am the Executive Director of the Digital Defense Fund and creator of another project called Adversarial Fashion, which I’m sure we’ll get into in a little bit.

TFSR: Cool, that’s exciting. Can you talk a bit about the Digital Defense Fund or the DDF, how it started, and what it works on?

Kate Bertash: Yes. Digital Defense Fund was actually started the first time Donald Trump was elected. Back in 2016, there was a huge demand for resources in digital security and privacy mostly for the abortion access movement. There had been a bunch of really high-profile attacks, website takedowns, breaches, and a lot of digital attacks targeted at folks who usually help people get their abortions or fundraise for them. And so this pool of resources was set aside.

I had been running events that were volunteer events to connect technologists with abortion access projects in the field, helping organizations turn a spreadsheet into a real database or fix a broken website. In some cases, folks make things like open-source platforms that they use for case management for people who are working to get their abortion via an abortion fund. So there was a lot of work to do to make sure that all the folks who were working out in the field had security and privacy resources, like trainings and evaluations. We also provide funding to help people get the improvements that they need, like software that has good safety standards and all the privacy features that we love.

And since then, it’s grown quite a bit, over the last eight years. We have a small team, about five people full-time. Recently, in the last two years, since the decision that overturned Roe v. Wade, which is the Dobbs decision, we actually now also work with organizations and movements outside of abortion access too. I think that was a really important moment where we realized that all the work that we had put in to try and secure these spaces was going to be really important for the wider pantheon of bodily autonomy, abolition, groups that work on democracy defense, all kinds of different service provision.

I know that I’m really grateful actually to talk to you today about one of the huge overlapping areas, which is that work that we try and do on helping folks to understand how surveillance impacts all of these different movements, not just, of course, abortion or any of the other ones we’re going to talk about today.

TFSR: Yeah, that’s amazing. And I’m so happy that you were able to focus on this very important part of our lives, around people’s reproductive health, or people’s bodily and health autonomy, more succinctly, and then be able to expand that out to commonalities and recognize the common ideologies that are focusing from the outside on limiting people’s access to health procedures.

Kate Bertash: I was gonna say it’s really interesting, because I know that we’re gonna chat a little bit today about automated license plate readers, and it’s a kind of funny story: when I first got this job, back in the day, some of the first folks that were really very supportive of what we were trying to do with Digital Defense Fund were our colleagues over at the Electronic Frontier Foundation. One of them was Dave Maass, whose title, I believe, has been updated since then, but he was one of the senior investigative folks over there who was working on automated license plate reader data and how it’s used and misused.

We were having a phone call one day where he was trying to advocate for a particular bill that they introduced in California that was going to basically reconcile the fact that you can cover your whole car with a weather cover when lawfully parked. So why not then just be able to cover your plate when lawfully parked, too? That might be a really great privacy measure for people who don’t want their stuff ingested into these databases, which we’ll discuss a little more later, including how that data is collected and why.

We had discussed how license plate surveillance is actually ubiquitous outside of abortion clinics, unfortunately. Instead of using automated processes, there are often people who disagree with abortion who stand outside a clinic, and they will have a notebook and pen and cameras and record people’s plates going in and out.

Getting to share in that context that license plate surveillance, even when it’s not automated, has been a way in which people are surveilled and oppressed, and that it impacts people’s bodily autonomy and freedoms, was a really great piece of context to be able to add to the conversation, beyond some of the other things that I know we’re going to talk about, like impacts on people who are disproportionately policed or targeted because of their immigration status. Abortion rights, and I think health access generally, play a huge, huge role in the impacts of the expansion of these surveillance systems.

TFSR: Yeah, or sexuality, if people are going to a pride event, or are parked outside a gay bar or a queer bookstore or whatever. Or someone attending a demonstration of some sort of political perspective that may not be supported. Yeah, yeah. It really is pretty scary.

So if someone is recording a license plate number and tag information, what can they do with that? Assuming that law enforcement and state agencies have access to these databases connected to the DMV, what kind of information can the random citizen, community member, or the person doing the recording who wants to do research actually find from that, short of joining the police force?

Kate Bertash: Some of what’s really troubling, especially about the rise of these systems, is that they sort of run into this problem that we encounter in the example I gave, with an abortion clinic, but also just in the world in general, which is that we sort of have no right to privacy in public space. Your privacy rights are determined by who owns the ground you’re standing on, which is a very odd thing that I think is a sort of artifact of how America views property rights as the basis of all privacy rights.

But I think one of the things that becomes really troubling about that is that we tend to gauge whether or not something is right or wrong in that regard based on how difficult it is to pull off. So you have people, and it sounds quite tedious, standing outside of a clinic, taking down stuff with pen and paper. And so you’re like, “Oh, that’s probably not too bad,” whatever they’re going to do with that information.

Whatever private database these people who disagree with abortion are keeping, they might have a spreadsheet, and they might use it then to try and find other places where that plate is seen. In many cases, before the Dobbs decision, there were often requirements that people go back to a clinic multiple times in a row to be able to fulfill all of the legal requirements, like having a sonogram on one day, and then there’s a waiting period. So they would try to catch people breaking these rules.

Some of where this becomes especially troubling is that now we have these automated systems. You have the ways in which an automated license plate reader might be available. These are devices, they’re cameras, they’re always on, and they essentially record all the plates that go by them. They are sometimes found on street lamps. They are found on police cars. And there’s just all these different ways that we find places to put them around.

Sometimes they’re owned by the police force. Sometimes they’re also owned by private entities, so parking lots and structures, landlords, businesses. Even now, I think, HOAs are starting to buy and install these devices. So you have then this database that’s accessible by whoever bothered to purchase the system. And whereas we have some measure of accountability—I would say not very much—but certainly more when it is a government entity or agency that is collecting and holding information, you don’t have any control over what a private entity is going to potentially do with that.

TFSR: Yeah, that’s pretty crazy to think that it’s not just going into that person’s notebook, but it’s getting uploaded with maybe location data or time, all this metadata, and then put into a database that’s shared among anyone who just happens to subscribe to the same software.

Kate Bertash: Yeah, and I wanted to pick up on the piece that you discussed. Who owns it and what they can do with it matters a whole lot, just because I think that’s actually some of where these newer systems that are available to monitor folks’ license plates get problematic. In the past, it would normally have been a company or a police department or someone else purchasing its own system, setting it up, and managing its own data. And so there was a company that came along that decided to make this a whole lot easier by saying, “Actually, you don’t have to set up your own servers, your own accounts, your own systems. We’re going to do this as a service for you.”

Flock was one of the companies that popped up to basically say, “We’re going to give you this baked-together system. It’s going to come with the cameras, and they’re going to have all the software already on them. It’s going to connect to a database and to a large nationwide network that we manage.” So they’re going to make it super easy for you to have access to a portal where you can search and look through all of this interface.

And whereas it’s all held on their servers, you are a subscriber. So they basically claim in their user agreement that you own all of the data that your license plate systems collect, and that they “do not sell access to it.” I’ve actually really had a huge problem with the way that they frame this, because they certainly do not resell that data, but they sell access to that data. I forget what that show is, the one where it’s not “the assistant manager,” it’s “assistant to the manager.” I’m like, “Okay, whatever.”

But that’s the trouble, that on Flock in particular, and I know to some degree on other types of systems like Vigilant and other vendors, anybody who is a subscriber of Flock nationwide can have access to any other customer’s data and search it. So obviously that gets super problematic, because many times the data that’s being licensed is not even only data collected by police forces. In some cases, they are also reselling access to data that is collected by private landowners or people who have bought these systems privately to install as part of their HOA or something else.

And certainly, as the number of people who have access to that information grows, the less and less control people who are recorded on those systems have over who has access to their data, where it goes, and what they’re doing with it.

TFSR: Yeah, that’s creepy. I remember after the Patriot Act was passed, there was a part of it where… I’m going to be totally vague, and if this sparks a note for you and you remember it, then that’s great, and work off of that. I don’t know if it was a part of Total Information Awareness or another program, but I recall there being a concern about the installation of cameras by government entities, by the FBI, I think. And the FBI would also fund businesses purchasing cameras as long as they had access to the contents of what they had recorded. I don’t know how far that actually went, but it sounds kind of like a baby version of what’s being created here, except not market-driven specifically.

Kate Bertash: Yeah. And I think what gets really problematic here, too, is that because this is information that is collected “in public,” you do not need a warrant to be able to access this information. They claim that it’s equivalent to having a cop standing on a street corner and the things that they observe, rather than something for which you would need to be able to show that it is related to an investigation or a crime.

I think one of the things that’s really awful about this in particular is that it reflects this inversion of the way we’ve come to view the accountability for expansion of surveillance services, which is that this is a marketplace. They are trying to make money, and regardless of whether or not anyone needs to know this information, they found that people are willing to purchase the service based on the idea that sometime in the future it might be useful for an investigation.

I know part of the justification under things like the Patriot Act is that there’s this extreme level of visibility and powers, but we might need them: what if one day there is some horrible terroristic threat that justifies the need to keep this always-on, very deep level of access waiting somewhere on a back burner? That is the excuse, certainly, that has been used to expand the marketplace for these services, including even to jurisdictions like the one I live in, where it might not even be a great bang for your buck on how many times you’re actually going to dig into that system and use it to solve a case.

TFSR: Yes, since you brought up Flock Safety, which has really hit the headlines over the last few years, could you talk about the company, how it markets the service and this equipment, and what the pricing for it is? I don’t want this to turn into an advertisement for Flock, obviously.

Kate Bertash: Yeah, no, absolutely. Let me actually pull that up, just because I do have somewhere in here the pricing that they’ve at least given to my community.

I first became aware of Flock very early in their history. It was many years ago. It was a small company that I think billed itself more as building these in-neighborhood surveillance systems. They were trying to make themselves appear like they were a security camera plus. They started out in Georgia and have now quickly become a company with a multi-billion dollar valuation that sells more than just these pre-baked cameras as a service. Their platform includes the Flock cameras. They are not traffic cameras; they don’t measure speed or issue tickets for any traffic infractions, but they are continuously ingesting information, and they come attached to the Flock operating system, which is a portal where users can look up individual plates and the last 30 days of driving locations, no warrant required. They also have a huge fleet of other devices that they’ve started to work on, some of which are going to include drones and other types of observation data.

One thing that has really grown a lot about the way that they structure their product in the last couple of years is the use of something called “a searchable vehicle fingerprint.” I like that you brought up whether somebody might be involved in different causes or attending different types of actions or protests, because the company basically says it not only gathers the license plate information, but also vehicle make, type, color, state of the plate, whether the plate is covered, and if it’s missing. There are unique features that are recorded, like roof racks or bumper stickers. Some people put bumper stickers on that indicate their political leanings or something about their identity, and then, basically, you could probably use their system to search for any of those features, including particular bumper stickers. That would certainly be a problem, and I know it’s one that the ACLU has taken some umbrage with.
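To make that “vehicle fingerprint” idea concrete, here is a minimal, purely hypothetical sketch of what such a record might contain, based only on the features described above. The field names and the search example are illustrative assumptions, not Flock’s actual data model or interface.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

# Hypothetical illustration only: these fields are assumptions drawn from the
# features described above (plate, state, covered/missing plate, make, type,
# color, roof rack, bumper stickers), not Flock's real schema.
@dataclass
class VehicleFingerprint:
    plate: Optional[str]            # None if the plate is missing or unreadable
    plate_state: Optional[str]
    plate_covered: bool
    make: str
    body_type: str
    color: str
    roof_rack: bool
    bumper_stickers: List[str] = field(default_factory=list)
    seen_at: datetime = field(default_factory=datetime.now)
    camera_id: str = "unknown"

# A query like "gray pickup with a roof rack and a particular bumper sticker"
# then reduces to a simple filter over stored sightings.
def matches(v: VehicleFingerprint, color: str, sticker: str) -> bool:
    return (
        v.color == color
        and v.roof_rack
        and any(sticker.lower() in s.lower() for s in v.bumper_stickers)
    )
```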

Some of what I have the biggest issue with, especially, is that they are a company that goes into communities. They actually have people who are representatives that will go and try to get your police department to see that this might be a great product for them to add and to have your community pay for. One of the things that supports that is grant programs. I live in Washington State. We have something called the Washington Auto Theft Prevention Authority, and it, along with whatever the state sheriffs’ union is, promotes grants that will sometimes cover a portion of the first year. I know they’ve been pushing those really heavily on sheriffs’ departments and on other local police departments.

The quote that I know that our community got when we were… I was kind of surprised by it. I’m looking at the actual invoice here: $6,000 per camera. And then you have sales tax, and then there’s a setup and implementation fee, which is a couple of thousand bucks. So, for example, for our community to have started with six cameras, which is quite few—we are a very small town—was still a total of about $43,000. So a grant covered some portion of that. And I know that there are communities where they’re installing hundreds of these.
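As a rough back-of-the-envelope check on those figures, here is the arithmetic sketched out. Only the $6,000-per-camera price, the six cameras, the roughly $43,000 total, and the six reported thefts mentioned later come from the interview; the sales-tax rate and the exact setup fee are assumptions for illustration.

```python
# Back-of-the-envelope check of the invoice figures quoted above.
# Assumptions: ~8.5% sales tax and a ~$3,000 setup/implementation fee
# ("a couple of thousand bucks"); only the per-camera price, camera count,
# and ~$43,000 total are from the interview.
cameras = 6
price_per_camera = 6_000
hardware = cameras * price_per_camera          # $36,000
sales_tax = hardware * 0.085                   # ~$3,060 at the assumed rate
setup_fee = 3_000
total = hardware + sales_tax + setup_fee
print(f"Estimated first-year total: ${total:,.0f}")   # ~$42,000, in the ballpark of the ~$43,000 quoted

# Later in the interview: six stolen vehicles reported in the last recorded year.
stolen_vehicles = 6
print(f"Cost per reported theft: ${total / stolen_vehicles:,.0f}")  # ~$7,000 per theft
```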

The prices I know are not fixed. They go up or down based on the type of camera and ability to pay. But it certainly is something that our community then would be on the hook for after the grant runs out. The idea is that, like with many types of technology products that are in your phone, it’s sort of a freemium model pricing. You know, first one’s free, and then you’re kind of locked into these longer contracts where you’re just accumulating annual fees. So it doesn’t matter whether or not you find the system successful, whether or not your community finds it to be good compared to other things they could be spending their tax money on. Certainly, they are looking to get out into as many communities as possible so that they can expand their base of continuously renewing revenue.

TFSR: Was your community specifically experiencing a spate of car thefts? Or is this maybe more an example of: federal funding or state funding is available for this. If it doesn’t get spent, then it’s not going to be in the budget for next year?

Kate Bertash: I’m glad we’re getting into this too, because I had been aware of this company for a couple years, and then, of course, I had been very active on automated license plate reader systems. Back in the day when Dave and I were talking about how these systems work, I created a line of clothing called Adversarial Fashion so that you could inject junk into automated license plate reader systems. These fun shirts and clothes that were covered in license plates and other patterns that would emulate them.

A lot of the trouble with these systems is that they have what’s called very low specificity, which means that they’ll read almost anything. They’re obviously meant to work at high speeds, so they sometimes read stuff like picket fences and billboards and other things that they really shouldn’t. It’s kind of fun to prove that the accuracy of these systems is quite low. You can imagine that if the accuracy of these systems is kind of iffy, you would perhaps really want there to be a good reason why you would purchase them and to find them actually useful in a small community.

I actually live in a very small town. I had lived in larger cities for many years, but these last couple of years being a rural resident have been very enlightening. So you have to imagine my surprise when I am sitting in my house, and my husband says to me that there’s a Facebook post from the Sheriff announcing proudly that they have successfully installed six Flock cameras in my community.

TFSR: Yay…

Kate Bertash: I was very, very bummed. I’m putting it very lightly. I think I kind of hit the ceiling. I was very angry, because I also am very involved locally. I attend and listen in on all of our city council meetings and our community hearings. In the neighboring county, which is called Klickitat County, the company had tried to push these types of cameras, and together the county commissioners, city council, and the public had decided it wasn’t a good fit. So you can imagine my surprise when these cameras, with no discussion in my county, were suddenly already installed.

Thankfully, because I have this kind of job, and because I know amazing people who have taught me how to do this, I have been pulling all the public records to try and get the background. One of them was the grant application, which showed that car thefts, which were used as the justification for buying the system, have actually been going down in the last several years in my county. So I want you to just imagine: for $43,000 a year, how many car thefts would you want to have to justify the cost of a system? Especially in a small, small place with very little tax revenue. And for the last year recorded, we had six stolen vehicles reported. So not a great deal, I would say. Six divided by 43,000. It didn’t seem to matter whether or not we actually needed these kinds of items. It was more about selling to our Sheriff’s Office that these would be a good addition, especially if they were “very understaffed or a very small team.”

That kicked off a lot of controversy. When people live in a rural community, they are often out here because they are expecting some measure of privacy. People move out to get away from the hustle and bustle of things and to feel like they have the freedom to live without people always looking at them and everything they’re doing. And so there’s this idea, especially in a small place where we don’t have a ton of roads in or out of our county, that you could have these choke points where very few cameras could track basically 100% of our local community’s movements: to and from work, to their kids’ schools, to their doctor, to their church. That did not go over very well, and I’m sure we’ll get a little bit more into what happened next. But I think it was quite a shock to see that even in our small county, where I didn’t think we would be that much of a market for a company like this one, they don’t seem to discriminate. Everybody’s tax money is as green as anyone else’s.

TFSR: Did Flock learn from some of the problems that they had experienced in the neighboring county? Was that one of the reasons that you didn’t hear anything about it?

Kate Bertash: Yeah, I would guess so. I think one of the big disappointments was that it was a public discussion elsewhere, and it was not a public discussion in our community. And so I actually had organized then two information sessions for my neighbors. I’m very lucky that this is something that I know something about. So I decided to put together a session in each of the two larger towns in our county so that neighbors could come in and learn more about the system.

I understand that everybody also has a different idea of whether or not they think that the risks and trade-offs are worth it, whether this is a good fit. But I think my big goal was to say, “Regardless of whether or not you think this technology is effective or that it does what it should, we each deserve the opportunity to talk about it as a community. It’s a pretty big decision to make without any kind of discussion or consent.”

I love to encourage people that if you feel strongly about this kind of thing, you can go rent a room at the library, you can print up fliers, you can paste them around town. You can talk to all your neighbors. We don’t have a news channel or a newspaper. Facebook is our critical place where we get everybody together. We managed to pack the rooms. And I was very proud that everybody came in to listen and then especially talk. Our Sheriff and the Undersheriff came to both sessions, and we were able to then have a very respectful and productive discussion, which was a very pleasant surprise. Of course, Flock did not send any of their folks who were locally hanging around trying to sell, but it was really good for us to clear the air and say that there are a few problems that we all have with these systems.

I might think that my small-town Sheriff is great, whatever. This could be somebody you voted for, somebody you’ve known growing up. This is where it can get really tough to try and interrogate those relationships and to understand the role that they play in why we do or don’t reject surveillance technology in our community. What is your relationship to who’s going to hold that information? It’s also worth pointing out that, unfortunately, because this plate data is very valuable, it’s at really high risk of being hacked.

I know a couple of different articles have come out where Flock has repeatedly refused to subject its cameras to independent security testing. There was a particular researcher (I will have to find this person’s name) who had done some tests on some of the Motorola LPRs (license plate readers) and found that they were misconfigured and basically just streaming live data to anybody who wanted to intercept it. Unfortunately, we don’t know if there are any similar issues with Flock Safety systems, because they won’t subject them to that kind of testing. So it’s not only the people who are supposed to have “authorized access” that you have to worry about, but potentially unauthorized access as well.

Because Flock Safety is rolling out across thousands of communities, they don’t really pay very close attention to all of the different rules about what you can install in different types of right of way, or on public land, or getting the right permits. And our community indeed, was also one where they ran roughshod over the permitting process, which led to some [cameras] being uninstalled. But that’s a story for another day.

TFSR: Yeah, that’s a great start. This is a spicy zine called “Birds of a Feather, Destroy Flock Together” that I was handed, and near the middle of it, it gives examples of how to engage. One of the things that they mentioned, which your story made me think of, was… Here’s what they say: “Cops getting Flocks installed near Portland, Oregon on state highways and freeways without permission resulted in the state telling Flock that it needs to remove the cameras and any related equipment because the company does not currently have a permit to install or operate cameras within State Highway and freeway rights of way.” So it seems like a similar instance. And I guess it’s kind of the surveillance version of Bird scooters or whatever those are called.

Kate Bertash: Yeah, exactly, you just kind of go for it until somebody tells you “no.” Again, a lot of my perspective on this is philosophical. I obviously very much disagree with the expansion of the surveillance state. But then it also becomes very practical and personal.

Our community has a really small tax base. Less than 2% of Skamania County is taxable land. And so every dollar does count. When you waste public resources by running over this process, and then WSDOT (the Washington State Department of Transportation) has to get involved, and now we have people’s time taken up getting these things uninstalled and the permits reapplied for, that translates to actual dollars and cents and time that we often don’t even have for very critical things in our community. Real-time emergency services are an absolutely urgent need, and every dollar that’s going towards $50,000 worth of some excessive product we don’t use could be going instead towards search and rescue equipment and things like that.

One of the things that was really a bummer was to see that this is sort of their MO. It costs them, as a multi-billion dollar company, relatively nothing to just forge ahead and shove themselves into every community, and then leave us, as the local folks, to clean up their mess. And that was definitely very frustrating because you have basically people who have also their own concerns about how the information is used day to day.

Somebody had made a really great point. One of the things that was really cool about these meetings was that I was really surprised who showed up. I know my community pretty well. I don’t know how you gauge the mental model of what a rural community is. But we run the full political spectrum. There are a lot of folks here who vote on either side of the aisle. I think also people tend to focus a little bit less in local elections on whether or not somebody is a Republican or a Democrat, but the room is fully represented. I was very pleasantly surprised then to hear that the entire community was on the same page, where nobody likes these cameras and wanted them out.

You had folks who, I know, vote straight-ticket red and wear a MAGA hat to the meeting, saying, “Well, I might trust you as the Sheriff. But what’s going to happen when this information and access to the system is then given to your successor? Who else gets this job in the future? Suddenly, this is just something that I’m automatically expected to allow the Sheriff’s Department to have.” I think it was really very interesting to hear the commonalities and pushback, and how it was about trust as much as it was about anything else.

This company coming in was basically taking the trust that we already had with each other and trying to replace it with something that you could buy or sell. Taking it away from the systems we already use to talk to each other about whether somebody found a car that somebody stole. You know, you have posts all over our Facebook group where people will be like, “Hey, I think my car was stolen.” And then a neighbor will find it for you parked on the hill within 12 minutes. And instead, that gets replaced with something that you have to pay for. That stuck out to me as something that didn’t occur to me many years ago when I was starting to learn and think about these systems: they interfere with your relationships with each other, as neighbors and as a community.

TFSR: Yeah, that’s a really interesting way of thinking about that. You’d mentioned concerns about, like, this Sheriff was elected, and who knows who’s going to be filling those shoes afterward. With the data that Flock, for instance, as one company that offers this kind of product, collects, is there a shelf life that they promise? And how accurate can that shelf life be?

Kate Bertash: The default setting for Flock Safety’s particular user agreement is a 30-day trailing map of everywhere you’ve been all day. And I think, frankly, that seems very long to me. 30 days is certainly enough time for you to get to know somebody’s work and driving habits. And I think, generally speaking, we also have to just take their word for it. There’s no real way to prove that they don’t keep it longer than 30 days until somebody actually sues them, and they have to prove via the discovery process that that is or isn’t the case.

It’s the same thing when we are looking for a product like a VPN, and they say that they don’t collect logs. It’s really tough for us sometimes to know until that company is sued and has to prove that they do or don’t collect certain information. So we actually just have to take Flock’s word for it, as well as the word of Vigilant or any of these other vendors.

One of the things that the ACLU put out that was really very helpful was a set of parameters, a set of updates that you could ask your community to make to the contract with Flock, to try and shorten this time and also limit who might have the ability to access and search your community’s data. That’s something that we’re still working on locally, trying to get our Office of the Sheriff and our Flock contract updated to make sure that we can shorten that. I know that there are communities that have actually shortened it to as little as a few minutes.

If I’m going to take the most generous view of why you might want this system and have access to it, they often use examples like Amber Alerts as a legitimate use of the system. You have somebody who reports that a child has been kidnapped. We’ll get those notifications on our phone that buzz really loud that tell you what the car is and where it was seen headed, any kind of other critical details. You might argue that a search could then go out to any Flock system and see if in the last few minutes, nationwide or within a particular geography, that plate has been seen.

I would say that we had many discussions, even as a community, over what seems like a fair amount of time. I would prefer that they be removed entirely, but I know that it was easier for me to hear from my community and then see that we could probably figure out some kind of agreement, even with the Office of the Sheriff. Is two days enough time, if it was involved in a search and rescue operation? Is it a few hours? At what point would that risk of having the data be abused or misused be reduced enough, if we’ve had to find a middle ground, that something could be found in a certain number of hours? But again, we would ask Flock to update this and then just have to take their word for it that they’ve updated our system this way. So kind of troubling either way.

TFSR: And maybe I’m misunderstanding too, because of the way that you’ve been talking about this. Is the Flock data actually shared into a database that other Flock users have access to?

Kate Bertash: Yes, by default. Yeah.

TFSR: So, even if they put a limitation on it, like this only gets recorded for five minutes, if you’ve got a scraper running and a very big hard drive and you’re in that network, you could just constantly be scraping this data, saving it on your own device, and it may not be available in that central location, but why wouldn’t someone be able to just upload that to some sort of torrent site or whatever, right?

Kate Bertash: Yes. I am glad you brought this up, because there actually is one entity that has total access via a special agreement with Flock Safety, and that is the FBI. I think a lot of people are not aware that Flock Safety has made a deal where all of the data that is collected nationwide at any given time through any one of their systems is automatically forwarded to the FBI’s system to search for plates that have been tagged in their system as hot plates. We actually don’t know what the data agreement, the use or retention agreement with the FBI, looks like. That, obviously, is not available to us as the general public. So we don’t actually know what the FBI does when it takes those queries that come in saying, “Is this plate in your system?”, or what it then does with that information.

I know that in communities like mine, there are various levels of trust with different government entities or not. The FBI is not a popular one. I think that came as a very unfortunate surprise to a lot of the folks in our very small town, that there are agencies that could basically have this kind of full unfettered access without our notification and without any restriction.

TFSR: Wow, thanks. I was not aware of that. I was just asking a leading question about people creeping on the network, and not even the FBI specifically.

Kate Bertash: Yeah, also creepers. Who knows what any individual user is doing with this stuff? They have a lot of very different types of government or non-government agencies that would access this.

So here I have… I pulled it up: “Flock runs all plates against state police watch lists, and the FBI’s primary criminal database, the National Crime Information Center. When a camera scores a hit against one of those databases, law enforcement receives an immediate notification, as Flock CEO Garrett Langley explained in 2020: ‘We have a partnership through the FBI that we monitor all of the cameras for about a quarter million vehicles that are known wanted.’” So there you go.
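For readers picturing how that kind of hot-list check works in practice, here is a minimal conceptual sketch of the pattern described in that quote: every plate read is normalized and compared against a watch list, and a hit triggers an immediate notification. It is purely illustrative and assumes nothing about Flock’s or NCIC’s actual software, data formats, or interfaces.

```python
# Conceptual sketch of the "hot list" pattern described above, purely illustrative.
# None of this reflects Flock's or the FBI's actual code, formats, or APIs.

def normalize(plate: str) -> str:
    """Strip spaces/dashes and uppercase so 'abc-1234' and 'ABC 1234' compare equal."""
    return "".join(ch for ch in plate.upper() if ch.isalnum())

# Hypothetical watch list of "known wanted" plates, stored as a set for O(1) lookups.
hot_list = {normalize(p) for p in ["ABC1234", "XYZ 987"]}

def notify_law_enforcement(plate: str, camera_id: str) -> None:
    # Stand-in for whatever alerting mechanism a real system would use.
    print(f"ALERT: watch-listed plate {plate} seen by camera {camera_id}")

def on_plate_read(plate: str, camera_id: str) -> None:
    # Called for every plate a camera reads; a match triggers an immediate alert.
    if normalize(plate) in hot_list:
        notify_law_enforcement(plate, camera_id)

on_plate_read("abc-1234", "cam-07")  # -> ALERT: watch-listed plate abc-1234 seen by camera cam-07
```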

TFSR: There’s currently a case winding through the Virginia state court system brought in response to the 172 cameras that Flock has installed on public streets to monitor cars, license plates, passing faces, and other biometrics, and use AI to map out people’s daily routes. Civil liberties groups are challenging this as a Fourth Amendment issue. Can you talk a bit about the fears of this sort of wide-scale observation and recording, its impact on daily lives or activities that those with access might want to chill, or how it might affect already over-policed communities in particular?

Kate Bertash: One of the things that’s very important, especially when we’re looking at this kind of question, is what this case attempts to answer. I’m not an attorney, but thankfully, I have many lovely folks in our lives who, via my DDF activities and otherwise, have tried to impress upon me what it is that these devices collect and why they become very relevant in a court case. My understanding is that there is someone who is suing through the Virginia state court system to basically say that being continuously monitored and checked against these hot lists for plates over time constitutes what is called a Carpenter violation.

In the landmark case, Carpenter v. United States, the Supreme Court ruled that… There was a particular person whose cell phone records, their location data was being collected for about 127 days, I believe, and basically they were using it to try and implicate this person in a crime.

Your cell phone’s location data is a very interesting sort of parallel to draw here because as we walk around, our cell phones are always looking for the service that we’ve signed up for. Let’s say I have Verizon or AT&T or something. I’m walking around and my phone is constantly looking for cell towers, and it’s knocking on each cell tower, saying, “Hi, are you my service? And if so, I’m going to transfer some data and receive some text messages and stuff.” And so your phone is kind of just broadcasting your location. That data, those pings, can be collected, and each tower keeps a record of how many times it’s been pinged and by who. This is again, kind of functionally public. Your phone is basically always broadcasting to the air its need for service.

The idea was that this information, when collected without a warrant, somehow constituted a form of surveillance over time. What this case, in my understanding, is trying to basically say is: this other sort of data I am broadcasting out all the time to the world, which is the visibility of my license plate and other information about my car, does collecting that constitute unlawful surveillance? And you can see why it’s a little troubling, just because we know functionally that somebody can, at scale, use that information as surveillance.

But again, we run into this very difficult problem of what it means for something to be public, and what it means for something to be observed. We need license plates, or we’ve decided that we need license plates on our cars, because they are a critical safety object. They help us to do traffic enforcement, collect tolls, and ensure that cars are registered to their owners. You do not technically need to have a cell phone legally, but you do need to have a license plate legally. I’m looking forward to seeing what the outcome here is, just because it opens up a lot of other questions, like you said, not just about plates, but then also about things like your face.

We can’t help but broadcast our face and how it looks to the world all the time without a great deal of effort to cover it up or otherwise make sure that it’s never visible to anybody who could take a picture of it or misuse it. Regardless of how this case turns out, I am excited to see it drive forward the conversation on how it is that we’ve decided that when something is functionally public, it means that it can’t be abused or misused by corporations or the government, which is very clearly not the case. Yeah, we’ll see. I’m waiting with bated breath to see how that pans out.

TFSR: I guess, particularly in the last few years, with the rounds of laws that were passed in many states around the country concerning facial coverings. North Carolina has had laws on the books since the 1870s about not covering one’s face in public, built around slowing down or stopping the Ku Klux Klan. I’ve seen them applied in protest situations where there’s no clear intention of attempting to menace someone or threatening violence against someone present at a demonstration, but people would choose to hide their identity because they don’t want to end up on the front page of some right-wing rag or whatever. But even following the beginning of the COVID pandemic in 2020, states have passed similar laws, with the justification basically being that if you’re wearing a mask, you could get away with robbing a convenience store or whatever. So, as you say, it’s difficult to obscure your face.

Kate Bertash: Exactly. And I think we’ve decided to forge ahead. I think the kind of “Why now?” and why this is coming to the fore is that these days cloud computing has gotten so cheap, AI-driven systems get cheaper all the time. Certainly, they do suck up a lot of energy and compute now, but the amount is starting to become much more trivial. It is less and less of a lift for a company to offer as a service the ability to ingest billions of data points and then sift through them.

We decided to make choices about what should or shouldn’t be allowed based on how much effort it is, or whether it would be really hard or seem an inordinate burden to, for example, track your face or write down your license plate number, or follow you all over a county, or whatever it is. But now that these things are a lot more trivial, we haven’t had the conversation about what we actually should be able to know about each other. Rather, does it matter if I see it if I’m attaching it or not to this wider dragnet database that has a lot more information that could be contextualized as something that gives somebody too much access to my personal information? And that’s because, again, our privacy rights don’t derive from the individual.

I think, in a dream world, we would all walk around with our ability to respire data into the world and have a certain level of protection. I like that you brought up a mask. It’s a great way to think about it. You know, the ability to manage or contain or have some kind of autonomy right as I walk through the world, over the kind of data I exchange back and forth with it. But unfortunately, we’ve decided that who owns the ground that you’re standing on is then who gets to decide what happens with your data and how it’s used against you or not.

In my opinion, I don’t think, personally, we’re ever really going to get to a satisfactory place until we reconcile what happened there and ideally change it. Because I think that there is kind of no future for the answer to all of these different types of surveillance and privacy violations until we reconcile the fact that your right to privacy cannot come from property rights. It’s just not going to work.

TFSR: Yeah, and it’s not just now. It’s your right to privacy from 10 years ago forward.

Kate Bertash: Scary stuff!

TFSR: Because you were talking about the ability of the individual to obscure their appearance, I would be happy if you would talk about Adversarial Fashion.

Kate Bertash: Yeah. Yeah, that was exciting. It was a really fun project, just because it came from this conversation that Dave and I had about how these systems can be fooled. I wanted to prove that point, which is critical to the discussion around whether or not it’s appropriate for these products to be in communities. We would imagine that, philosophically, we of course care about whether they’re right or wrong based on whether or not they do or don’t work.

I think it’s wrong to surveil people continuously, even if a system works flawlessly. But the case here is that it doesn’t. These systems are built for volume. They are not quite built for accuracy. As a result, showing that they could be fooled by a t-shirt was a great way for me to show that these automated license plate reader systems, which we’ve decided are so safety-sensitive and have so many applications, like tolls and enforcement, unfortunately still allow for a lot of mistakes and require a greater level of both scrutiny and stewardship.

But the most fun part about working on the Adversarial Fashion project, I got to present it at DEF CON, and it was really amazing to see the reception. People would ask me the question, “Oh, well, what’s the point of making something like this? Are we supposed to fight the system by all just putting on a cool t-shirt and putting junk into these systems?” And to that I say, do I think that if everybody did it, it could make a difference? I don’t know, maybe.

But I think more or less, one of the things I really love about anti-surveillance art projects is that these questions that you and I are discussing, they’re very big, they’re unwieldy, often. If I’m an average person trying to just get through my day, it can be really hard to decide how I feel about my license plate being tracked everywhere. It might feel inevitable that I just have no control and no power. And what does it matter if I’m not doing anything wrong and all that stuff. Or you might be confused about whether or not you would want your HOA to have that same power.

But creating an art project like a T-shirt that can fool a system I’m supposed to depend on suddenly takes this thing that’s very big and uncomfortable and it crystallizes it into a real example that I can form a strong emotional reaction to. We can agree, potentially, that if a system can be fooled by a t-shirt, it probably shouldn’t be the sole thing implicating me at the scene of a crime. So you might want a little bit more than that to be involved in convicting me.

I think that was the best part of this project, to see the enthusiasm. Obviously, Adversarial Fashion can include many other things. I know, since then, I’ve done a couple of different projects. I sort of had a hunch that not all face masks are equal in how they help conceal you from today’s modern facial recognition systems. And it was really fun to get to run some small experiments there and show that color matters, or the shape and the coverage and things like that. You also get to teach people about these systems: it’s really easy, when they’re bought and sold by a large company, to believe that something is going on under the hood that you must just not be smart enough to know.

These closed systems seem like they work like magic, but I think one of my favorite things about these projects is to show my process and show that I’m using systems like OpenALPR, which are actually the exact same thing under the hood as many of these commercially sold systems. You can download it, and you can learn how to spin it up on your own computer. You can play with it and test it. There’s nobody you have to ask for permission. And then once you start to play with it, through your own observations and your own experiments, you get a sense of how it works and how it’s fooled.
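As a starting point, here is a minimal sketch of spinning up the open-source OpenALPR engine and testing it on your own images, in the spirit of what is described above. It assumes the OpenALPR library and its Python bindings are installed; the config and runtime-data paths are typical Linux defaults and may differ on your system, and “plate.jpg” is a placeholder for whatever photo you want to test, including a shirt printed with plate-like patterns.

```python
# Minimal sketch: run the open-source OpenALPR engine on a local image.
# Assumes openalpr and its Python bindings are installed; the paths below are
# typical Linux defaults and may differ on your install.
from openalpr import Alpr

alpr = Alpr("us", "/etc/openalpr/openalpr.conf", "/usr/share/openalpr/runtime_data")
if not alpr.is_loaded():
    raise RuntimeError("Error loading OpenALPR")

alpr.set_top_n(5)  # return the top 5 candidate readings per detected plate

# "plate.jpg" is a placeholder: try car photos, billboards, fences, or
# plate-patterned clothing to see what the engine will and won't read.
results = alpr.recognize_file("plate.jpg")
for i, plate in enumerate(results["results"]):
    print(f"Detection {i}:")
    for candidate in plate["candidates"]:
        print(f"  {candidate['plate']}  confidence={candidate['confidence']:.1f}")

alpr.unload()
```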

And of course, that gives a better basis for discussing why something that I can mess with myself is then repackaged by a multi-billion dollar company as the end-all be-all that’s gonna save my community from every piece of investigation it’s ever gonna have to do. Obviously, that’s not the case. So we get to also push back on these marketplaces and systems that claim that these are the best products ever, and then sell them to entities that don’t really have the time to test them or evaluate their claims. Instead, we all just get charged for basically handing over our data for free to a company that’s going to sell access to it for their own gain.

TFSR: Yeah. And then, as you said, there’s the imperfection of these facial recognition technologies, for instance, and the biases of the data sets that they’re working with. There were reports years ago about how anti-Black a lot of the [systems are]. The inability of facial recognition systems to distinguish between different folks with a lot of melanin in their skin, because all the models that they were testing against were lighter-skinned folks, has led to false charges against people, false identifications.

Kate Bertash: Yes, it can be really alarming to see also how there are these knock-on effects when we ask and answer the wrong question about why that works or not. I mean, it’s many years ago, so that’s probably been updated by this point. But I think one of the very troubling outcomes is when folks were saying, “We have all these different facial recognition systems, they’re bought and sold by different companies. These models don’t work very well on Black faces or people with darker skin, generally, and you gotta fix these if you’re gonna use them.” And so there were some companies that then just went to buy prison data, so they had all of these different mug shots of mostly people who are Black and incarcerated. And so you then created this other extremely exploitive, terrible marketplace, to “solve some problem” with how this data set works and how it’s trained. Without an understanding or an ethos of why it’s wrong, you basically just expand the marketplace for exploiting people and their data.

TFSR: Yeah, a thing about the Adversarial Fashion project and this sort of vein of… I’ve seen a few things come through. There are some that I don’t recall the name of, but they were clothing projects that were making a point of focusing on the US drone warfare program and developing head coverings of various sorts, whether it be hoodies or others, that would shield [from] infrared reading. And it wasn’t so much about the practical application of this, that they weren’t going to be selling it to a bunch of people in Yemen to save them from Obama drones. But it’s still, like you said, the purpose of this [is to show] there are weaknesses to this technology.

Also, I think it’s playful because it does invite people to be like, “Oh, what else can I find there?” There was this glasses company out of Chicago for a while, called Reflectacles, that would produce these different anti-facial recognition technology sunglasses or other sorts of glasses, or they would reflect back infrared lights that would be used for nighttime cameras. This sort of stuff, I think, “Well, that, or we can go around with black metal or Juggalo makeup,” which I think is awesome as a way to screw with facial recognition technologies.

But that sort of playfulness of taking space in that way and opening up conversations with people about like, “Well, that’s weird. Why are you doing this? Oh, I didn’t know that camera could do that too. I didn’t know that in the passing police car, underneath it, or in the lights on it, there’s a camera that is constantly scanning for license plates or whatever.”

Kate Bertash: Yeah, and I think this actually digs into this area of work that I’m deeply, deeply interested in. I will probably be spending many years of my life, really, always asking more questions and trying to write and think about it more. Which is: what does it mean for a system to look at you and decide what part of your data or your image in the world is going to be a substitute for your identity, for your presence in space, and what is it going to translate you into?

Looking at how computers see us actually raises a lot of questions that we haven’t answered about how we look at each other. One of the big ones, especially, is that human facial recognition is not excellent. We actually have a really long-standing problem with the fact that memory is really weird. When people are taken in and asked to identify who committed a crime against them in a lineup, there are so many different, very well-studied issues with how poorly we recognize faces.

There have been incredible experiments done where they show people pictures. I believe this was a Danish study where they showed people photos. It was 40 different photos, all of some guys, and they asked people how many different people there were in these pictures. And the mean score was five-plus or something. But if you were from that country, you would know that there were only two people, because they were both celebrities. And so, seeing them at different angles, in different lighting, making different faces, they were just much more intelligible to you. That is a problem we experience as people, and yet now we expect machines to solve it perfectly, something that we don’t even do ourselves.

And so I think I’m very, very much interested in these questions. I think there are all these kinds of areas where it’s to a business’s benefit to try and claim that all of these problems are solved. There’s an entire area of policing product called Video Forensics that is, in my opinion, nearly bunk. It is actually really difficult to go back to a CCTV camera surveillance system and tell precisely what happened. And you see then that there’s this entire industry through Axon and body cameras that aims to show you, as a jury, potentially, not exactly what happened, but the product is meant to “show what the police officer saw.”

We have this cultural idea that cameras are supposed to tell us the truth or the reality of something that happened, when, in reality, that’s not how these companies are building these systems. They are building them to tell a particular story that is beneficial to the person who purchased the system. And I’m both very interested and very terrified of where that’s all going to go in the next couple of years.

TFSR: This is totally anecdotal, because I can’t remember what podcast I was listening to, but I remember hearing last week this anecdote that law enforcement officers using body cameras are less and less speaking to the individuals they’re interacting with. They’re speaking and repeating context and subtext to the camera, because they know it’s going to be watched later if something happens, or if it has to be brought up in court, it’s gonna be shown to the audience. And so it’s another way of inserting their narrative into it, as opposed to actually de-escalating a situation.

Kate Bertash: Yeah, they just automatically say, “Stop resisting,” even if you’re not. And the way that they say things can also be a flag in some court cases, to indicate that a certain thing happened. You’re basically no longer speaking to the person who you are interacting with in a police investigation or in a police process; you are talking to the future jury that you want to acquit you ahead of time for any misconduct.

TFSR: That is so dark. [laughs]

Kate Bertash: It is very dark. And I think some of these systems as well are very open about this. One of my favorite documentaries to come out in the last couple of years that talks about this a lot is called All Light, Everywhere, a highly recommended watch. Part of it goes into the history of the camera and its role in shaping our mental model of what cameras are for and the truth that they tell.

And then they actually go to Axon, the company, and interview all the people who work there and see the product firsthand and the factories it’s assembled in. They absolutely feel they have nothing to hide. They know how their product works, and they know who their customer is. So they’re very proud to tell this documentarian and their crew all about how well this is going to show… Because with many of these cameras, they could actually, if they really wanted to, put in other types of sensors. You could have infrared in them, or any other type of recording equipment. But they don’t.

They actually put a very purposeful style of camera in that is a little bit like a fish eye sort of lens that’s supposed to show this sort of wider view. It’s “supposed to emulate what the officer saw.” But unfortunately, as we know with these kinds of lenses, at their periphery they tend to exaggerate movement because it stretches and distorts it a bit. So somebody who’s “coming at you from the side” could appear to this lens as if they are making a more rushed movement.

TFSR: Wow, that’s crazy. I’m gonna check that out for sure.

According to a 404 Media article that I will link in the show notes on this subject, there are at least 5,000 communities around the US deploying Flock. And that’s just Flock, that one company, one of at least a few that are offering networked, AI-enabled surveillance systems.

You talked about the pushback after the contract was actually signed in your own town, and the continued pushback that y’all are doing to whittle that back. Do you have any other stories of pushback or resistance that you’ve seen or developed, and could you talk a bit about your traveling education on this subject too? That would be dope.

Kate Bertash: Oh, absolutely. I think honestly, the biggest thing that I wanted to really get through, especially in this conversation, is that it’s really easy to feel like you can just get steamrolled by these systems. In our community, the town I live in is actually unincorporated, which means that we are sort of ruled by the county. That’s our roll-up government. But even then, the county commissioners have very little power over Sheriffs. The Office of the Sheriff is an independently elected position, and in many communities like ours, they operate quite independently. Really, the only person who can do much to stop the Sheriff from doing anything is the Governor of the state. The county commissioners have the power of the purse, but as we can see, even with our national context, that’s not always quite enough to stop folks from doing what they want. We actually really don’t have much we can do to force the Office of the Sheriff to give up these systems.

So you have to start getting creative and thinking about what is actually more motivating. I know that you and I come from these very activist backgrounds, where we all suffer a little bit from the overstatement of the idea that the law is where you can get your relief. You could certainly try and sue somebody or force them with a ballot initiative or whatever to do something. But often that's very limited, and it's also expensive and not often super practical. It also doesn't always reflect a vision of the world we want to live in, where people accept or reject something like mass surveillance only because there's some big court case that we're waiting on to go through a Federal Circuit Court.

Realistically, I would say the biggest thing that has motivated our community is getting together and just seeing how many people agree, hearing the stories, and hearing the concerns. I think one of the most pleasant surprises that I encountered was that a lot of the people in the room who were the most vocal about their concerns for the system and its lack of appropriateness for our community were former law enforcement. They were people who had either themselves worked for the Sheriff's Department or been police officers, and they did not like the level of access this conveyed and understood how difficult it is to ensure that it's used responsibly.

Just having that space where people felt like they weren't by themselves, where they could speak out and hear their neighbors, mattered. I was also very pleasantly surprised that it turned into a space, unlike a lot of those horrible videos you get of what's going on at school board meetings, where there was nobody screaming conspiracy theories at each other, no accusations rooted in the sort of government mythologies that run through some of the more sensational media.

I think, generally speaking, it's about bothering to get in the habit of actually using the space that's available to you. Anybody can go and rent a space at their library. Anybody can go and put up some fliers and talk to your neighbors. You can ask people to come with you. Almost nobody comes to these public meetings. And so things happen like this: this decision got pushed through for us on something called "the consent agenda," which is just this big, huge packet of invoices that's hundreds of items long. Whatever way feels right for your community, just know that being in the room where it happens matters quite a bit. It can be really tough to do that by yourself, but you don't need to be a member of an official organization that feels like it's their job to cop watch to know that it's your right as a citizen, and as somebody who is just living in that community, to speak your mind and to ask your neighbors to help you out.

One of the other projects that I'm really appreciating lately has been the DeFlock project. I think that came up as a sort of map that people were making together. I know we'll probably link that in the show notes. But then you can go out and do something called "wardriving," if you've never heard of it: people collect the public information that's available about the Wi-Fi networks connected to these cameras, and then they upload it to WiGLE, a database of Wi-Fi networks as they are seen around the country. Those can then be pulled into this DeFlock map, so you can actually see where all of these cameras are. Having that level of transparency means knowing that you have the right to know where they are in your community, and the right to tell other people where they are.
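For folks who want to tinker with that workflow, here is a minimal sketch in Python of the general idea: read a CSV export of Wi-Fi networks you logged while wardriving and flag access points whose names look like camera hardware. The column names ("ssid", "lat", "lon"), the file name, and the "flock" keyword are all illustrative assumptions, not the format of any particular app or of WiGLE's exports, so check your own data before relying on it.

```python
import csv
import re

# Hypothetical SSID keyword; the hardware in your area may broadcast different names.
SUSPECT_SSID = re.compile(r"flock", re.IGNORECASE)

def find_camera_like_networks(csv_path):
    """Yield (ssid, lat, lon) for logged networks whose name matches the pattern."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            ssid = row.get("ssid", "")
            if SUSPECT_SSID.search(ssid):
                # Assumed column names; adjust to match your wardriving export.
                yield ssid, row.get("lat", ""), row.get("lon", "")

if __name__ == "__main__":
    # Hypothetical file name for a CSV export from a wardriving app.
    for ssid, lat, lon in find_camera_like_networks("wardrive_export.csv"):
        print(f"{ssid}\t{lat},{lon}")
```

Coordinates recovered this way are the kind of location data the conversation describes being contributed to community maps like DeFlock.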

One kind of funny bit of advocacy that I thought was actually quite effective: one of these cameras was close to a cannabis store in my community and could basically track everybody going in and out of the parking lot. So I took some fliers, went into the weed store, talked to the staff there and to the people who use the store, and said, "You probably didn't know that this is what this thing out here does. And I bet you wouldn't like that very much." And so people brought it up at the local meetings, and it's one of the cameras that ended up being removed.

These things that don’t feel like they matter a lot, that one-to-one personal discussion, actually do matter quite a bit. I think we forget sometimes that our power doesn’t have to come from ballot initiatives, lawsuits, passing new laws, or anything like that. It often comes from our desire to live together and for people to win their elections again for Sheriff or just to understand that we all actually have to see each other every day.

I hope that if this conversation inspires folks to do anything, it's to understand your personal influence in your community, and especially to know that people want to hear from someone they live near who knows about this stuff, and they feel grateful for it. I was very surprised at the amount of really positive feedback from across the political spectrum. People came up after these meetings, shook my hand, and said, "Thank you, young lady, for bothering to put this together." It was a very heartwarming moment, and I feel now like we're going to be able to carry the conversation forward in a way that's actually productive with the Office of the Sheriff, and with the company, moving forward. So wish us luck.

TFSR: Yeah, no, that's really awesome. And I totally agree. I think it makes sense to pay attention to the laws that are getting enforced and who's enforcing them and all that. And it can be a lever that, if you avoid using it, could very well be to your detriment. But our strength is in our community and our relationships, for sure.

Kate Bertash: Yeah, and I think it's so easy to get discouraged too, when I was thinking about the next steps of what we would do here and whether there are experiments to try. Something that's kind of cool about a community like my county, which has about 10,000 people in it, is that it's fairly easy, in ways that it isn't in larger cities, to actually get a pretty significant amount of involvement. You can pretty much talk to almost everybody about it. Getting enough of a quorum of people to take a specific action is a lot more doable than you think.

But I remember getting discouraged because I was thinking, "Well, is it a ballot initiative? Do we have to put something on our next county-level election?" And then finding out that that was going to be really expensive, and that the company Flock might put their tremendous resources towards battling us on it. And then just understanding that I have to remember that I am not limited to this marketplace's imagination. The way they operate is by throwing money at the problem. But as we have figured out from how a lot of politics are going these days, there is a ceiling on how much you can purchase people's trust and liking.

So I’m excited to just keep people understanding that if you believe that you live here and you love privacy and that that’s the thing that brings us together, then there’s no amount of money that you should have to pay to be able to buy back your right to that kind of peace of mind from the place that supposedly wants your votes and wants you to live there.

I'm excited to see, especially, how this changes the conversation in our next Office of the Sheriff election. But also, just the fact that it's come up multiple times in community meetings since means that you change what people think is possible by telling them that this is a topic you are allowed to talk about and to have an objection to. You can really change a lot of people's minds about what they're allowed to expect from the world, which is, I think, the most fun part.

TFSR: Okay, it's been a real pleasure talking to you. Could you tell people where they can find your work? There's the DDF that you mentioned. Are there any other projects that you want to shout out?

Kate Bertash: So, Digital Defense Fund. We're at digitaldefensefund.org. All of our learning materials and resources are available online for folks to access for free. Obviously, you can also reach out there for support around digital security for your various activist needs. But otherwise, I am also on Bluesky and Twitter at @KateRoseBee.

I honestly just would like to encourage everybody to look up, even on DeFlock.me, what the situation is in your community. Are these cameras near me? You can look on your own county commissioners' website or in the notes of your city council meetings and try to see if there have been any discussions around this. The closer an eye you keep on that, the better. Turn on a Google alert for it in your town or something. You can both get ahead of it and also understand the role that a lot of different types of surveillance technology are starting to have in your community, and whether you agree with the place they're taking up in the world.

TFSR: Yeah, and I guess be proactive too. If those alerts don’t come up, and if it’s not already present in your community, this is a great time to start talking about it. Maybe it’s another company that’s offering a similar service that’s either already there or considering moving into the community. Getting those arguments together, and talking to your neighbors about the concerns is a great way to start.

Kate Bertash: Yeah, The Atlas of Surveillance is another great one that we should definitely link to. That's a space where some folks from EFF, along with journalism students, have actually done the work of pulling public information requests to see what surveillance technologies are being used in your area. So check out The Atlas of Surveillance and learn more. Or you can submit something if you know it's being used but don't see it there. But thanks so much for the time. This is great.

TFSR: Yeah, my pleasure. If someone is interested in figuring out how to talk to their community about this, do you have on your website, for instance, a starter pack for talking about this? Or does one of those sites have any good key talking points to start with, that sort of thing?

Kate Bertash: I know that soon there will be more templated directions and examples, including the deck that I use. You can always message me on Twitter. I will happily share with you the deck that I use for my community. But I think one of my favorite places to start was an ACLU-written article that’s entitled “How to Pump the Brakes on Your Police Department’s Use of Flock’s Mass Surveillance License Plate Readers,” and they had a lot of great templated language and some actions you can take to start. I know also that EFF is working on ensuring that communities have more access to things like this in the future. So you can keep an eye out for that. But until then, feel free to always message me on Twitter or on Bluesky, and I will happily share my templated deck with you and the other tips.

TFSR: Awesome. It’s been a pleasure, and thanks again for the chat.

Kate Bertash: Oh, thank you so much. This was wonderful.

Corvallis Bookfair, Tyumen Case, and Counter-Surveillance

"TFSR 3-10-24 | Heart of the Valley Anti-Capitalist Bookfair, Updates on Tyumen Case, and Counter-Surveillance" featuring: a photo of posters of the Tyumen prisoners strung between trees in a forest; a logo of flames licking a double-helix of DNA; a print for the bookfair with people reading under a tree
Download This Episode

This week, we’re featuring four segments.

First up, you’ll hear a chat with organizers of the 2024 Heart of the Valley Anticapitalist Bookfair which ran its first iteration in Corvallis, Oregon from January 19-21st.  A zine of their experiences will appear on that blog soon. [ -> 00:24:18 ]

Then, you’ll hear a brief segment updating listeners on the conspiracy case against six anarchists and antifascists in Russia known as the Tyumen case (for where it initiated). The six anarchists, some of whom barely knew each other, were tortured into confessions of conspiracy to further anarchist ideology and damage the Russian war machine. [ 00:24:34 – 00:32:53 ]

Following this, we spoke with Aster, a European anarchist involved in the counter-surveillance and anti-repression project known as the No Trace Project which works to share information about known methods and cases of state surveillance. The project does this in order to improve and expand our collective knowledge, tools and abilities at evading state crackdowns as we organize and act. This interview was conducted via encrypted messages and Aster’s portion is being read by an unrelated volunteer. [ 00:35:47 – 01:05:18 ]

If you plan to visit their site, we suggest at least running a VPN (riseup.net has a free one) and using an anonymized browser. One method is to download the Tor Browser (find your device/operating system at ssd.eff.org for some tips) and visit the No Trace Project Tor address. Their website can also be found at https://NoTrace.How

Finally, you’ll hear Sean Swain’s reading of names of people killed by cops in the USA during October of 2023. [ 01:09:50 ]

Tyumen Links

. … . ..

Featured Tracks:

Continue reading Corvallis Bookfair, Tyumen Case, and Counter-Surveillance

Open Source, DIY Medicine with Four Thieves Vinegar


black and white logo of "Four Thieves Vinegar Collective", a hexagram with points up & down, a side view of a character in a plague mask and brimmed hat, and a black bar just inside every other wall of the hexagon
Download This Episode

This week on the show, we're sharing an interview with Mixael Laufer of the 4 Thieves Vinegar Collective about the group, building scientific competency, biohacking, authority, intellectual property… oh boy, there's a lot there. Mixael also speaks about some of the projects that 4 Thieves has on offer, including a do-it-yourself AED setup for defibrillation, misoprostol-soaked business cards for self-inducing abortions, instructions for laboratory tools, finding other applications for existing drugs, Long COVID and more.

We’ll be stating this a few times during this episode, but Mixael Laufer is not licensed to offer medical advice and his opinions are his own. Also, be aware (if you want to be) that laws in different jurisdictions may differ. For instance, pressing your own pills has recently been criminalized in WA state in the so-called USA.

We hope you enjoy this interview and you can check out the project at FourThievesVinegar.org, where you can find a growing collection of introductory videos about their work starting Monday, March 13th, 2023 around noon.

Four Thieves Vinegar socials: Twitter, Facebook, Youtube & Instagram

Mixael on socials: Twitter & Mastodon

A few projects mentioned include:

Continue reading Open Source, DIY Medicine with Four Thieves Vinegar

Liaizon Wakest on Autonomous Social Media and the Fediverse


proposed fediverse logo of a multicolor pentagon/pentagram with logos for 6 different fediverse projects around it
Download This Episode

This week, we spoke with Liaizon Wakest. Liaizon grew up in an anarchist commune in rural America. They can be found climbing into dumpsters from Mexico to Kazakhstan looking for trash to make art with. In recent years they have been focused on research into ethical technology and infrastructural anarchism. For the hour, we speak about the interoperable, open source ensemble of federated online publishing servers and platforms known as the Fediverse and its most popular component, Mastodon. This conversation takes place in the context of the media hullabaloo about Elon Musk seeking to purchase Twitter, the paradigm in which a rich egomaniac can own the addictive social media platforms on which so much social and political life plays out, and what positives we can draw from alternatives like Mastodon and the Fediverse.

You can find Liaizon’s account on Mastodon (an analog of twitter) at @liaizon@social.wake.st or on Pixelfed (an analog of Instagram) at @wakest@pixelfed.social. And you can follow us on Mastodon by finding @TheFinalStrawRadio@Chaos.Social or by visiting https://chaos.social/@TheFinalStrawRadio in a web browser.

Another interesting anarchist media project engaging the Fediverse is Kolektiva, which has a PeerTube instance at https://Kolektiva.Media (analog of youtube) and Mastodon at https://Kolektiva.Social where they’re welcoming new users. Kolektiva includes participation from projects like Sub.Media and AntiMidia

You can find a real good interview by our comrades at From Embers about Mastodon which I mention in the interview from February 3rd, 2022 entitled Social Networks, Online Life and The Fediverse: https://fromembers.libsyn.com/social-networks-online-life-and-the-fediverse

Continue reading Liaizon Wakest on Autonomous Social Media and the Fediverse

Cora Borradaile on Phone Extraction, Cloning and Keyword Warrants


image of a cop holding a cellphone
Download This Episode

This week on The Final Straw, you'll hear me speaking with Cora Borradaile, who sits on the advisory board of the Civil Liberties Defense Center, works around issues of tech security in movements, and is an associate professor at OSU.

We discuss the use of phone cloning by US Marshals and other law enforcement while engaging protestors in Portland, OR. We talk about Upturn's recent report concerning widespread use of cellphone extraction tools to copy and search the contents of cell phones captured during interactions with cops. Finally, we talk about keyword searches, where (often without warrants) Google hands over information from people's Google searches to law enforcement.

Sean Swain Update

We'll also be presenting a segment by Sean's fiancée, Lauren, about his current silencing and the injustice of his case. More on president-in-exile Sean Swain can be found at Swain2020.Org and SeanSwain.Org.

. … . ..

Transcription of the conversation with Cora Borradaile

BOG: So I'm speaking with Cora Borradaile, who is on the advisory board of the Civil Liberties Defense Center, or CLDC. We spoke before about a range of issues in May of this year, before the George Floyd uprising and the resulting ACAB Spring. During the uprising, researchers, journalists, and activists saw new, or new-to-us, surveillance methods being applied by security forces against the populace in the so-called US, so I was hoping to pick Cora's brain a bit about this, especially since the upcoming months in the US might also get a little spicy with the election and all. Thank you, Cora, very much for taking the time to have this conversation.

Cora Borradaile: Yeah, it’s great to talk to you.

BOG: So just to list off a few things: this summer we saw the use of military drones surveilling and sharing information with law enforcement in Minneapolis, and lots of militarized gear brought out in the streets across the US for the late May protests against police violence. Collusion with para-state white supremacists has been ongoing in Portland, Oregon. In July we saw federal officers from the Department of Homeland Security (DHS), including Customs and Border Protection (CBP), and the Department of Justice (DOJ) sent out to fight against protesters in the streets of Portland and attack and kidnap people, including journalists, from that and other cities. It also came out this summer that the US Marshals had their own aerial surveillance to track crowds in Portland. On the tech side of things, the public got wind, apparently from leaks within the Department of Homeland Security, that DHS had been cloning activists' cell phones. Could you talk a little bit about this and what cell phone cloning is?

CB: Yeah, from that report there were very few details, so a lot of it is guesswork as to what could possibly be going on. I could imagine two different ways in which they're cloning cellphones, one of which is scarier than the other. The more likely, I think, is the less scary version, which is if they manage to physically get your cell phone, like if you're arrested and your cell phone is confiscated from you. Even if it's confiscated temporarily, they're copying everything from your cell phone and possibly making a new cell phone that behaves just like yours. So it would allow them to intercept calls and possibly receive messages that were intended for you.

The scarier but probably less likely version is the ability for them to do the same thing without needing to confiscate your phone. That feels unlikely to me. If it was some sort of remote cloning, I would gather that they were just cloning the network ID of your phone, and not the contents of your phone. This would still allow them to do things like intercept calls and intercept data, but in both scenarios I think the messages you're receiving through end-to-end encrypted (E2E) apps like Signal or Keybase or Wire would still be safe, and the cloned device shouldn't have the keys needed to decrypt messages that were intended for you. And even traffic that is encrypted – if you are visiting a website on your phone that you're accessing via HTTPS, where the S stands for Secure – I think even then they wouldn't be able to see the contents of that web page, because there is a key exchange that happens between you and the web server that they would have to play man-in-the-middle on, which is more complicated to do in a way that you wouldn't be able to tell something was going wrong.
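To make that point a bit more concrete, here is a minimal sketch in Python of why cloning a phone's network identity is not enough to read end-to-end encrypted messages: decryption depends on a private key that never leaves the original device. It uses a generic X25519 key agreement from the third-party `cryptography` package as an illustration of the principle, not a description of Signal's actual protocol.

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_key(private_key, peer_public_key):
    """Derive a shared symmetric key from an X25519 key agreement."""
    shared_secret = private_key.exchange(peer_public_key)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"demo").derive(shared_secret)

# Alice and Bob each hold a private key that stays on their own devices.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Alice encrypts a message to Bob under their shared key.
key = derive_key(alice_priv, bob_priv.public_key())
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"meet at the library", None)

# Bob can decrypt, because he holds bob_priv on his device.
bob_key = derive_key(bob_priv, alice_priv.public_key())
print(ChaCha20Poly1305(bob_key).decrypt(nonce, ciphertext, None))

# Someone who only "clones" Bob's phone number sees the ciphertext and the
# public keys; without bob_priv they cannot derive the key, and any guessed
# key makes ChaCha20Poly1305(...).decrypt(...) raise InvalidTag.
```

The same general idea sits behind the HTTPS key exchange mentioned above: a passive copy of the traffic, without an active man-in-the-middle during that exchange, is just ciphertext.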

All that's to say, it's still scary, and I think if you have poor encryption practices, like keeping your phone in an unlocked state, they have access to all of your encryption keys for things like Signal and Keybase and whatever other secure messaging apps you might be using. You should do whatever you need to do to alert everyone you're in contact with to delete your contact from their messages, groups, and so on. And if you have a phone that is confiscated – and certainly in an unlocked state – I would not trust that phone again. If your phone is confiscated but it was locked at the time, and presumably you have a good password so they can't easily unlock it, I would still maybe do a factory reset of your phone and start fresh by installing everything over again.

BOG: So, I'm not sure what the basis of this is, but in conversations I was having with friends, we were talking about the latter of the two instances you described: the hypothetical that the cloning of the network ID or SIM connection could be done remotely. It would be similar to getting a new phone but keeping the same number, and if Signal was installed on that device and connected to the same phone number, via a man-in-the-middle attack with a cloned SIM, the interception could still be happening, but everyone would see a notation that there had been a change in safety number. Is that maybe what would happen?

CB: Yes. That is perfectly said. Right, so for them to be able to both clone your phone and intercept messages without those “safety number has changed” messages happening would be very, very difficult. So yeah, certainly if there are reports of anybody who’s had a confiscated phone and then all of a sudden all of their contacts are noticing that their safety number has changed with them, that would be super interesting to find out. —

BOG: –Or they stopped getting messages.

CB: Also horrifying.

*Laughter

BOG: Yeah. You know, they noticed that they stopped getting messages, everyone notices that the safety number changed, then that means that hypothetically the cloned phone or whatever would now be in those chats.

CB: So I don’t know if it’s as simple as that, because when you add a new device on Signal all the other devices get a notification of that.

BOG: Oh I see. So if I had a desktop and at least one cell phone that was getting messages… Yeah, but if that device was no longer getting new messages because the traffic was being routed to a different device, you wouldn’t like –

CB: Right so, your contacts should at the very least get the notification saying that the safety number has changed. If it’s a remote clone I think the only way in which the cloned phone would be able to read the messages in preexisting groups, for example, would be if the device was physically confiscated and copied. Because there are encryption keys that are used to start those conversations which are needed.

BOG: Do you mean the messages that were in loops before?

CB: No, to continue to receive messages from conversations that had already been going on. If someone started a new conversation after the cloning then the other people in the conversation might not be able to notice, but if you were continuing a conversation that had started before the cloning I don’t think you would be able to get that information without having physical access to the device and being able to copy over the encryption keys that were used to start those conversations.

BOG: Because they’re being stored on the phone and not on the server.

CB: That’s right, yeah. So for example, when you add a new Signal device part of what happens is copying over the encryption keys needed to continue conversations. And there’s a QR code that, say if you have Signal on your phone and you start using Signal on your desktop, you link those 2 devices so that both devices are able to receive and decrypt messages that go to you as an identifier.

BOG: If you know that someone in your group or one of your friends has changed their number, what's a good verification?

CB: Don't message them on Signal and ask them, right? Because who knows who's answering. Try to find a different form of communication, even if it's via a friend or a regular phone call, but ideally via email or some other channel that is unrelated to your phone, to ask them, 'Hey, I noticed your Signal safety number's changed, what went on?' Most of the time, or every time this has happened to me, the answer's been 'Oh, I had to reinstall the operating system on my phone' or 'I dropped my phone in a pool and had to get a new phone.' That's usually the reason for a safety number changing, but definitely what you want to do is find a different way to ask, other than using Signal and ideally other than using the phone, especially if we are worried about cloned phones. Because if you just send a normal SMS text message to your friend and your friend's phone has been cloned, then it could be the cops responding, saying 'Oh, yeah, I had to get a new phone.'

BOG: I've seen some people do a thing where they ask someone in a group, when their safety number changed, 'Hey, could you leave a voice memo with your name and the current time you're recording it and send it into the loop?' That way everyone hears this person's voice at the moment they were specifically asked to record the memo, so outside of an Enemy-of-the-State-level NSA operation, it's probably not somebody compiling an automated voice message in that person's voice.

CB: Yeah, that's a pretty good method for doing that. As you point out, synthesizing people's voices can be done, but take into account what your threat level is: are you someone they're going to be throwing everything at, able to synthesize your voice in a very short time? For the protest movements we've seen, probably not. However, if you are the leader of a protest group, hmm… If you are someone they're really going to go after because they think that going after this one person will completely destroy the movement – which I don't think is the kind of movement moment we're in right now, which is good, to avoid having those specific people who could really destroy a movement – that's a pretty good method.

BOG: If you could speak to that prior scenario, is that actually copying the contents of a phone? I think that was the subject of the recent article by Upturn called Mass Extraction —

CB: That’s right.

BOG: If you could talk a little bit about what the findings were there. I was kind of surprised, yet kind of not surprised, to see that local law enforcement here in Asheville spent at least $49,000, according to their studies, on cell phone extraction tools. But what are mobile device forensic tools, what do you know about them, how widespread are they, and what kind of stuff do they do?

CB: So these things have existed for a long time. We've been talking about them at CLDC for a long time, but this Upturn report is really wonderful for showing, just as you say, how widespread they are. Small police departments have them, medium police departments spend hundreds of thousands of dollars on access to this over the course of five years, and some of the capabilities were actually, I suppose, not really surprising. But reading them all in one place and knowing how low-cost access to that technology is was sobering.

So these cell phone extraction devices come in different forms, but the kind most commonly seen is a small stand-alone device that you plug a cell phone into, and that device either tries to break into the phone if it's locked or otherwise just copies all of its contents for later analysis. Some of the things that were surprising to me were how much was available even when the phone was locked and encrypted. There's a lot of data that exists in an unencrypted form on your phone.

For example, say your phone is locked and you receive a phone call: the name of your contact still shows up, right? It's not a name your contact is sending you, and it's not metadata associated with that contact. If your mother is calling you, it probably shows up as "Mom" in your phone, and the reason it says that is because your address book has an entry with that phone number and the name "Mom" attached to it. So your address book entries exist in an unencrypted state, for example.

Some of the other things that were sort of surprising that were pointed out, that exist in this unencrypted state even though your phone was in a locked condition, were Telegram files and Discord files, and files associated with Google mail. I think a lot of this stuff could just be from bad decisions that the app developer made. Like Telegram is not necessarily focused on security, and so for convenience or speed they may just not be hiding that information behind the device encryption.

There was definitely some reporting in that Upturn report about being able to brute-force guess passwords, and there are some things you can do to protect yourself from that, like having a long enough password. Or, if you have an Apple device, you can enable your phone to self-wipe after 10 incorrect guesses, for example. Which, if you have a small child at home, maybe you don't want to do, because I almost guarantee you will end up with a wiped phone by the end of the week.
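As a rough illustration of why passcode length matters against brute-force guessing, the arithmetic below compares the worst-case time to exhaust different passcode types. The guesses-per-second figure is a made-up assumption for the sake of the example, not a measurement of any extraction tool, and real attacks usually try common passwords first rather than searching exhaustively.

```python
# Back-of-the-envelope comparison of brute-force search spaces.
GUESSES_PER_SECOND = 100  # hypothetical rate, for illustration only

def worst_case_years(alphabet_size: int, length: int) -> float:
    """Worst-case time to try every possible passcode, in years."""
    combinations = alphabet_size ** length
    seconds = combinations / GUESSES_PER_SECOND
    return seconds / (60 * 60 * 24 * 365)

for label, alphabet, length in [
    ("4-digit PIN", 10, 4),
    ("6-digit PIN", 10, 6),
    ("10-char lowercase passphrase", 26, 10),
    ("12-char mixed-case + digits", 62, 12),
]:
    print(f"{label:30s} ~{worst_case_years(alphabet, length):.2e} years worst case")
```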

BOG: With encrypted files, if there are messages or what-have-you that are saved in an encrypted section on the phone would that just get copied and saved, and tested against decryption later? Is that the idea?

CB: I think what’s happening in most cases is they’re taking a copy of encrypted information, possibly in the hopes that they could decrypt it later or in the hopes that they would be able to get the unlock password from you by other means, like a court order for example. You know, they did point to instances where they were still able to bypass security features like encryption because of security flaws, which is very common. If your phone is badly out of date and you haven’t been keeping up with installing security updates, always install your security updates. That’s a common thing in computer security, that there are flaws that can be taken advantage of that can allow bad actors to break through otherwise strong encryption. But I think if you’re keeping an up-to-date phone, I think that’s the best that any of us can do.

BOG: Another point that was interesting in the article, and I'm glad that they pointed it out, was the sorts of instances where this is being applied to people. You hear about Apple being pressed to give up encrypted information or provide a back door when there's a mass shooting, or a sort of incident that may involve multiple conspirators and the loss of life – something very serious. But in the Upturn article they talk about how their research and records requests showed that a lot of law enforcement agencies, even local ones, are attempting either to pressure people whose devices they get a hold of, or to apply for warrants to copy the contents of people's phones, over minor things that they're being accused of.

Like if it’s something like shoplifting or graffiti or public intoxication, petty drug charges, sex work, these are a few of the examples that they give. Considering the way that policing works in the United States, and this shouldn’t surprise anyone in the listening audience, police tend to focus their attention on poor and racialized parts of the population. So if law enforcement gets people’s data, whether by asking for it and pressuring people into it or by using devices, and then saves it for a later investigation and there’s no sort of oversight of this, it seems very likely that the sorts of data that they’re collecting could be used to build future cases or for building profiles on people for things they haven’t actually been accused of so far.

CB: Yup. Fishing for data. Maybe they're just trying to justify the purchase of this stuff. In Oregon they spent half a million dollars on cellphone extraction technologies; Portland alone spent a quarter of a million over a period of 4-5 years. That's a lot of money to justify, right? If you're only using it three times a year for homicide cases, then maybe you can't justify actually spending that money, and you would just farm it out, whenever you do need it for something like that, either to a fusion center or a pay-per-service from one of these companies. So it might just be that they're partially covering their asses and saying 'Oh yeah, we use it 10 times a week.'

But we've also seen examples of law enforcement agencies that just collect so much data, almost for the purpose of just having data. The LAPD famously uses Palantir, which is a horrible company, to do all sorts of data analytics for their region, collecting data on pizza purchases and parking passes and all sorts of things that don't seem relevant at all to law enforcement. It's almost a compulsion to just collect the data and see what they can do with it.

BOG: Another thing that I had seen was that Google was recently in the news when court documents were unsealed in Detroit relating to witness intimidation and arson by an associate of R. Kelly, and this was in regard to keyword warrants. Are you familiar with this case, and could you talk a little about keyword warrants and what they are?

CB: Yeah, so keyword warrants. I hadn't heard about them before this news story came out earlier this month, but it's not surprising. I certainly was familiar with just how many requests for data Google gets and responds to, affecting hundreds of thousands of user accounts every year in the US. So it wouldn't surprise me if Google, instead of just getting requests saying 'Hey, I'd like to have all of the emails associated with the email address thefinalstraw@gmail.com', which seems to be the more straightforward type of request related to a specific account that might be included in a law enforcement issue… probably not, though. To expand that to 'Hey, I want to know all of the information you have about people who searched for "The Final Straw".' So that's the keyword warrant, or keyword search request, that happened in this case. We've also seen examples of geofencing warrants for Google Maps, asking for anybody who has searched for an address within a given region, which there were a few stories about over the last year. So yeah, of course, the data is there, so why not ask for it? Google is not going to say no, why would they?

BOG: Basically, again by collecting information based on its availability then attempting to apply it. So in this case with the arson, they asked for people who had searched for the address of the house where a car got set on fire within a certain period of time and then cross-referenced that to a Geofence of what phones were in the area within a period of time, and were able to pinpoint and place charges. And not all of the information came out from that, some of the court records are still sealed. It’s kind of a frightening application of technology and as you say, a very happy-to-oblige industry.

CB: Yeah. I think there's real potential for false arrests and harassment of people. Say you happen to find someone in that area who you don't like for one reason or another; you can arrest them and hold them for a while even if you have no evidence. Harassment arrests are used all the time by law enforcement and have been for decades, centuries probably.

BOG: So I guess… use DuckDuckGo if you’re going to be committing an – – – – ?

*Laughter

CB: I would avoid Google; I definitely use DuckDuckGo. I prefer DuckDuckGo for selfish reasons: I find the personalized search aspect of Google to be somewhat infuriating. When I search for something, I don't want to find what Google thinks I want to find, I want to find the documents related to my search. It's hard to avoid these tools, but I think DuckDuckGo, anything but Gmail for email please, and there are alternatives to Google Docs as well. Cryptpad seems to be getting better; every month there are improvements. It offers collaborative online editing of documents, all E2E encrypted.

BOG: I am going to presume with this question that you are not a lawyer, am I correct in that?

CB: I am not a lawyer, no.

BOG: It seems things like intercepting phone calls or people's text messages, or getting deep into their cellphones and all of the information collected in them for arguably unrelated topics, might overstep into the realm of FISA (the Foreign Intelligence Surveillance Act), or into the realm of one of those amendments that protects our rights against unreasonable search and seizure. That just doesn't seem to be the case? Or in these instances is it that these methods haven't been brought before courts to be challenged?

CB: Everything I know about the law I learned from CLDC, and from Law & Order in a previous lifetime. What I do know about these, from reading various news articles and conversations with CLDC, is that, as pointed out by Upturn, a lot of the extraction of data from cellphones was based on consent and not a warrant. It was about a 50/50 split, depending on jurisdiction. So this was probably a case of intimidation by a cop of a person with a cellphone: 'Oh, well, let us check your cellphone.' I'm not sure if they give full disclosure of what they mean by 'let me check your cellphone', right? (laughs) 'Let me copy everything there is on your cellphone off your cellphone, if you're not guilty of this minor misdemeanor.' You know, they're just asking permission.

That's one of the things CLDC shoves down the throats of everyone at their trainings: don't consent to searches. Just don't do it! Even if they're going to go ahead and do the search anyway, even if you're not consenting to it, say over and over again, 'I do not consent to this search.' Have a sticker on your phone that says 'I do not consent to this search.' Because then, at least, it can't be used in a court of law. The other thing that we've seen over the years is parallel construction. I don't know if I've seen a well-researched example of this, but certainly people have hinted that it's a common practice, where they'll find out something via methods that wouldn't be admissible in a court of law and then figure out a way to reconstruct what they know using admissible methods.

BOG: Oh like in The Wire.

CB: Yeah, exactly. So that’s something that might be why they’re getting information that they can’t necessarily use. The other part is just general intelligence work. It’s not necessarily going to be used to arrest anyone, it’s not necessarily going to be used in a court of law, but they just want to know what’s going on, and so are going to collect as much data as they can. Unless you find out about it and unless you prove harm in a court of law, then how are you going to stop it from happening? Which is why this report about the Google keyword searches and Google Geofencing searches is so important. If we can find out about that and we can get a case brought forth and have it deemed unconstitutional to do this kind of search then that would stop those kinds of requests from happening. Then you could put pressure on a company – even a company like Google – you could put public pressure on them to say ‘Don’t respond to these requests, they’ve been deemed illegal’.

BOG: There are a couple of other, I guess not insights but points in that Upturn article that I thought were useful. Like if someone deletes information on their phone, are they actually deleting information off of their phone, and are there appropriate or useful, good tools for actually wiping data off of phones or does it just kind of sit there?

BOG: –MAGNETS–

CB: I don’t know of a good tool. I think that if you do a factory reset of your phone that’s most likely to help make that data inaccessible. Even then, is it actually getting completely deleted? It might not be. You have memory on your computer or on your cellphone, and when you delete something it just kind of takes the index away… I’m trying to use an analogy that people would remember. Do people remember libraries and card catalogs? (laughs) All of my analogies are too old.

BOG: I think it’s fair, go ahead.

CB: You think people will remember?

BOG: I think so, or they’ve heard the analogy enough they’ll recognize what a card catalog is.

CB: They’ve seen a movie with an old-timey library and card catalogs?

BOG: Ghostbusters

CB: So, you have a big library with books on all the shelves, and the way you know where to find a book is to go to the card catalog. You look up the book that you want, you find its listed location on the shelf, and then you go to the shelf and you find the book. Well, when you delete a file from a computer, really all you're deleting is the card from the card catalog. So when it comes time to put a new photo in the memory of your computer or cellphone, you go to the shelf and find, 'Oh, there's supposed to be space here, because according to the card catalog there's nothing stored here, so this old data must be something I don't need anymore, and now I'm going to delete that old stuff.' Right, 'I'm going to remove that book from the shelf, whose existence was already erased from the card catalog, throw it away, and put my new one in.' So it's not until you use the memory again that the old information actually gets deleted.

BOG: At least on computers there’s – for instance I had to reinstall my operating system recently. And when I installed it I went to encrypt the home folder and the file system and it asked ‘Do you want to overwrite everything else on the hard drive?’ Is that what you’re talking about?

CB: Yeah, so that would be the equivalent of actually going to all the shelves of the old library and removing all of the old books. So that’s pretty common when you’re setting up on a computer but I’ve never seen that option on a phone. I’m wondering, does a factory reset actually delete all of that information? I haven’t noticed that myself.
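To put the card catalog analogy into code, here is a minimal Python sketch of the difference between a normal delete, which only removes the index entry, and overwriting a file's bytes before deleting it. The function is illustrative only; as a caveat, on flash storage and SSDs with wear-leveling an overwrite is not guaranteed to hit the same physical cells, which is part of why device encryption plus a factory reset is the more realistic option on phones.

```python
import os

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's bytes in place, flush to disk, then unlink it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # replace contents with random bytes
            f.flush()
            os.fsync(f.fileno())        # ask the OS to push the write to disk
    os.remove(path)                      # now remove the "card catalog" entry

# Example usage (hypothetical file name):
# overwrite_and_delete("old_contacts_export.csv")
```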

BOG: Microwaves. I mean I saw –

*Laughter

BOG: Yeah, I got nothing.

CB: Drop your phone in the pool, start over.

BOG: They invented this thing called rice though, where if you put your phone into a bag of rice it extracts the water… and the data…

*Laughter

BOG: Well are there any other things you’d like to share with the audience concerning digital tech or any insights?

CB: I did want to share one thing. You asked about them getting this data and whether it's illegal search and seizure. There are still strange laws that date back to the 80s; for example, email can be accessed by law enforcement from somewhere like Google with just a subpoena and not necessarily a warrant. For a law enforcement agency to get information that would otherwise be deemed illegal search and seizure, they need to get a warrant from a judge by showing probable cause for getting that data or that physical item. But if it's email on a server held at Google, then they don't need to show probable cause; they just need a subpoena, which is essentially just a 'Please can I have this information.' I think that's where these keyword searches are coming in. I'm not sure that they actually need a warrant for those. So that's maybe one extra detail on that front.

BOG: In those instances it's in one centralized place, although if you're doing a keyword search… Yeah, I don't know. I guess I don't know how Google works on the inside and whether it's just constantly categorizing what people are typing into its different services for later use and then providing that in easily digestible pills to law enforcement. If you're sending email and it's unencrypted, it's probably getting hoovered up somewhere and is fully readable anyway.

CB: It depends on who your adversary is. I don't think the Portland police department has access to a big hoover of data on a global scale, but they certainly can ask Google for all of the emails of the activists whose email addresses they've extracted from the phones they confiscated during protests.

BOG: Cora, thank you so much. Cora is an associate professor of Computer Science at Oregon State University with a focus on the security state and the adoption of more secure apps, and is also on the board of the CLDC. Thanks again for having this chat.

CB: It was wonderful talking to you, as always.

 

Digital Security Tools for Organizing with the CLDC


 

Radio Possum by Beehive Collective
Download This Episode

We're happy to share the rest of our conversation with Michele Gretes, director of the Digital Security project at the Civil Liberties Defense Center, and Cora Borradaile, who is on the board of the CLDC. For this podcast special, you'll hear the two discuss different tools for more secure, encrypted communication that are available on various platforms to folks organizing. They publish guides on CLDC.org/Security. We discuss the end-to-end encrypted alternative to Slack (Keybase)**, PGP email encryption (particularly the Enigmail tool), Signal Messenger, problems with WhatsApp, Cryptpad, Jitsi, Wire, VPNs and The Onion Router, the Tor Browser, OnionShare, Zoom, Protonmail, and some of the challenges of running longstanding movement infrastructure such as the RiseUp collective does (plus their file sharing and pad services). Check our show notes for links to some of these projects.

** Keybase was just purchased by Zoom. See the CLDC article.

(image lifted from the amazing Beehive Collective)

. … . ..

featured tracks:

Bojkez – Snap Your Fingers – Instrumental EP vol. 1

Glutton For Insurrection – V!RU$ 5TR!K3

Tracking Technology and Food Distro in Pandemic


Tucson Food Share logo
Download This Episode

This week, we feature two conversations. Cora Borradaile and Michele Gretes, folks involved in the Digital Security Project of the Civil Liberties Defense Center, speak about contact tracing apps and surveillance. Then, Se speaks about Tucson Food Share’s grocery distribution program.

Contact Tracing Apps

First up, we hear Michele Gretes and Cora Borradaile. Michele is the Digital Security Coordinator of the Civil Liberties Defense Center and also does digital security for an environmental non-profit. Cora is a co-founder of the CLDC Digital Security Program and is an Associate Professor of Computer Science at Oregon State University with a focus on the security state and the adoption of more-secure apps. They talk about surveillance and the use of apps for tracing folks' contact with people infected with covid-19 to slow the pandemic's spread. This is a segment of a larger conversation we'll be releasing in the middle of this week as a podcast, in which Cora and Michele talk about and compare tools for online organizing that engage encryption and offer alternatives to Google and other "free" products that often surveil their users. We speak about Jitsi, Wire, Zoom, RiseUp, Signal, VPNs, The Onion Router, TAILS, Keybase, Riot.IM, PGP and other mentionables. More at CLDC.org/Security/

  • Apple & Google announced an approach toward contact tracing that we didn't really cover in detail or by name in this conversation. Here's an article from Wired about it.
  • The white paper from EU cryptographers that Cora references is here.
  • GDPR (General Data Protection Regulation) laws, European restrictions on the collection and long-term storage of data on private individuals, have been in place since 2016.
  • An article from Vox speaking about ICE using private phone data to seek out and arrest undocumented people in the US. Another talking about current tracking of our movements by phone companies.

Tucson Food Share

After that, we'll hear from Se of Tucson Food Share, based in Arizona. We talk about their project, how it scaled up from Tucson Food Not Bombs to deliver groceries and hand out burritos publicly, multi-lingual engagement, resisting burnout, and finding joy in feeding people. More at TucsonFoodShare.Org. You should get in touch if you're thinking of setting up a food distribution project and have any questions.

Announcements

New Station: KODX Seattle

We'd like to mention that we're now airing on Monday mornings at 2am on KODX in Seattle. You can check out that station's schedule at kodxseattle.org or hear them in northeastern Seattle at 96.9 on the FM dial.

Recent Release: Bomani Shakur and Lorenzo Kom’boa Ervin

Just a heads-up: if you're looking for more content for your ears, we released a small segment of Lorenzo Kom'boa Ervin talking about prisoner organizing in the 1970s and today. This was paired with a longer chat with Lucasville Uprising survivor and death row prisoner Bomani Shakur, aka Keith LaMar. For a little over an hour, Bomani talks about his youth, the uprising in 1993, his case, and being railroaded. He has an execution date set by the state of Ohio for November 16, 2023.

. … . ..

Naughty By Nature – Hip Hop Hooray (instrumental) – Hip Hop Hooray

Leslie Fish – Bella Ciao – Smoked Fish and Friends

Playlist

Doing For Selves: Open Source Supplies and Tenant Organizing


3d printed n95-quality face mask
Download This Episode

Welcome to a podcast special from The Final Straw. While William was busy producing an episode featuring voices of medical professionals and activists inside and outside of prison to talk about the impacts of covid-19 on incarcerated people for broadcast, I had a couple of conversations about work folks are doing on the outside that I'd like to share.

Sean Swain [00:08:06-00:15:12]

Hacking To Fight Covid-19

[00:15:12-00:33:01]

First, I spoke with Bill Slavin of Indie Lab, a space in Virginia that, since the epidemic became apparent, has been shifting its purpose from a broader scientific and educational maker space to the manufacturing and distribution of needed covid-19-related items such as testing kits, medical-grade oxygen, ventilators, and 3d-printed n95-quality masks for medical professionals, to fill public health needs. Bill talks generally about the ways that community and scientists can come together through mutual aid to deal with this crisis left by the inaction of the government on so many levels. They are also crowd-sourcing funds for scaling up their production and facilities, and there's a link in our show notes on that. The platform that Bill talks about in the chat is known as Just One Giant Lab, or JOGL. Consider this an invitation for makers to get involved.

Organizing With Your Neighbors For Homes and Dignity

[00:35:08-01:45:44]

Then, I talked to Julian of Tenants United of Hyde Park and Woodlawn in Chicago. With all of the talk about rent strikes in the face of such huge leaps in unemployment during the spread of covid-19 and the accompanying economic collapse, I thought it'd be helpful to have this chat to help spur on these conversations about how we seize power back into our hands while we're being strangled by quarantine, and hopefully afterwards. You can learn more about the group Julian works with at TenantsUnitedHPWL.Org. Philadelphia Tenants Union and Los Angeles Tenants Union were both mentioned and will be linked in the show notes, alongside a reminder that the national Autonomous Tenants Union Network (ATUN) is being organized and folks can reach out to Philly TU or LA TU via email to get onto their organizing zoom calls. Finally, if you're in the Chicago area and need a lawyer for housing, check out Lawyers Committee For Better Housing online at lcbh.org. Julian also mentioned squatting of state-owned homes in southern CA; here's a link to an article.

Announcements

WNC Mutual Aid Projects

Linked in our show notes is also a googledoc that Cindy Milstein and others are helping to keep updated, which lists many mutual aid projects that have sprung up all over in response to the ways capitalism has exacerbated the covid-19 crisis, as well as a similar page up from ItsGoingDown.Org

If you're in so-called Western NC and want to get involved, Asheville Survival Project has a presence on fedbook and is soliciting donations of food and sanitary goods for distribution to indigent, bipoc, elder, and immune-compromised folks in the community. We'll link some social media posts on the subject that list the donation sites around Asheville in the show notes, and you can venmo donations to @AVLsurvival.

If you care to contribute to efforts in Boone, NC, you can follow the instagram presence for @boonecommunityrelief or join the fedbook group by the same name, reach them via email at boonecommunityrelief@protonmail.com to find donation sites, and send venmo donations to @Bkeeves.

NC Prisons Covid-19 Phone Zap

Flyer about call-in to NC prisons
And check our show notes for an invitation to call the NC Department of Public Safety and Governor's offices to demand the release of NC prisoners susceptible to infection and possible death from coronavirus in the NC system due to improper care. Wherever you are listening, consider getting together with others and calling jails, prison agencies and the executive branches to demand similarly the release of AT THE VERY LEAST the aged, the infirm, folks in pre-trial detention, those with upcoming release dates, or those held because they can't pay bail.

North Carolina Corrections Department-Prison Division

(919) 838-4000

North Carolina Governor's Office

919-814-2000

https://governor.nc.gov/contact/contact-governor-cooper

sample script:

My name is ________, and I am a North Carolina resident deeply concerned about the safety of the state's incarcerated people during the COVID-19 pandemic. Incarcerated people have a unique vulnerability to disease due to their crowded, unsanitary living conditions and lack of access to adequate medical care. For humanitarian reasons as well as reasons of public health, we call for the immediate release of all people in the North Carolina prison system. We also urge that you stop the intake of new prisoners during the pandemic. The cost of failing to take these steps will be paid for in human lives, and we refuse to abandon our neighbors and loved ones to die in lockup.

CALL AS MANY TIMES AS YOU CAN

Stay tuned to the twitter accounts @NCResists and @EmptyCagesColl for updates.

10th Anniversary

Even while the world burns, our 10th anniversary still approaches, and we're still soliciting messages from you, our listenership. Not sure what to say? You likely have a LOT of time on your hands, so go back through our archives and dive in: our website holds hundreds of hours of interviews and music. If you want to drop us a line, check out the link in the show notes, leave a voicemail or signal voice memo at +18285710161, share an audio file with the google drive associated with the email thefinalstrawradio@riseup.net, or send a link to a cloud-stored audio file to that address. Tell us and listeners what you've appreciated and/or where you'd like us to go with this project.

Spreading TFS

If you appreciate the work that we do here at TFS, you can also help us out by making a donation if you have extra cash rustling around. The link on our site called Donate/Merch will show you tons of ways. If, like most of us, money is super tight at the moment, no prob, we struggle together. You can share our show with other folks to get these voices out there and more folks in the conversation. And if you REALLY like us and there's a community radio station nearby that you'd be excited to hear us on for free, get in touch with us and we'll help. The page on our site entitled Radio Broadcasting has lots of info for radio stations and on how to let them know you want us on the airwaves. Thanks!

. … . ..

Featured music:

  • From Monument To Masses – Sharpshooter – The Impossible Leap In One Hundred Simple Steps
  • Filastine – Quémalo Ya (instrumental) – Quémalo Ya
  • Etta James – I Don’t Stand A Ghost of a Chance (With You) – Mystery Lady: The Songs of Billie Holiday

Digital Security / Tenant Organizing / #MeToo and Updates from Hong Kong

This week, we feature three portions.

Lauren Regan of CLDC

poster by Ar To
Download This Episode

First up, we share a chat with Lauren Regan of the Civil Liberties Defense Center, or CLDC, about safer practices around technology for activists, as well as the "reverse search" warrant the NYPD used with Google to capture info on antifascists and the Proud Boy attackers last year. More at https://cldc.org. An article about tech security and phones that Bursts references is called "Never Turn Off The Phone." [starts 10m 08s]

Palm Beach Tenants Union

Following this, Withers (a new collective member at The Final Straw) shares a chat with Adam and Amy, two organizers with the Palm Beach Tenants Union out of Florida, about their work, the mutual aid disaster relief they've done around Hurricane Irma, and advocating for and organizing with renters in their communities for dignity in housing. More on the Union at https://pbctu.org and more on how you can get involved in mutual aid at https://mutualaiddisasterrelief.org. There are a number of donation sites around the region preparing for this hurricane season, as well as distributing support to the Bahamas, which you can find by searching social media for DRASL (Dorian Response Autonomous Supply Line), as mentioned on itsgoingdown.org. [starts at 54m 06s]

#MeToo and Updates from Hong Kong

Finally, you'll hear a conversation with Enid and Rebecca, two feminist activists in Hong Kong, about the current state of protests there. Content warning: that segment deals in part with organizing around sexualized assault by police and by protestors. To hear our prior interview with Ahkok on protests in HK, check our website, and see the great articles up at crimethinc. The guests also talk about the term 自由閪, or "Freedom Cunt," a re-appropriation of a misogynist insult used by police during the protests. [starts at 1hr 15m 51s]

*Correction to the HK conversation: The full name of the IPCC mentioned in regards to the establishment of an independent police inquiry is the Independent Police Complaints Council. Hong Kong Chief Executive Carrie Lam appointed two new committee members to the already existing committee, not independent investigators. However, the IPCC has hired five foreign investigators to participate in examinations, though it must be clarified that the role of the IPCC is observational rather than investigative. The IPCC has no jurisdiction to either call witnesses or collect evidence for the independent inquiry called for by citizens.

If you're listening to the radio version, as usual we suggest that you check out the podcast version for longer versions of all three chats in this episode, as well as Sean Swain's audio this week. You can hear that at thefinalstrawradio.noblogs.org or via the various streaming platforms we publish to, such as youtube, soundcloud, stitcher, pandora and so on.

. … . ..


There Is No Liberation Until The Borders Are Gone: Bruno from CIMA and Members of IAF Speak

Download Episode Here

This week we are super pleased to share an interview that William did a few weeks ago with two members of the Indigenous Anarchist Federation, Bombshell and insurgent e! We got to talk about a lot of topics in this episode, which was recorded around the one-year anniversary of the formation of the Indigenous Anarchist Federation. Bombshell and insurgent e talked about their histories as anarchist people, about the formation of this Federation, what true decolonization of anarchism could look like, and about the upcoming Indigenous Anarchist Convergence happening August 16th-18th in Kinlani, Navajo land, occupied Flagstaff AZ, plus many other topics!

William really appreciated getting to connect with Bombshell and e, hearing their words on the topics at hand, and also really appreciated their patience as he stumbled thru his sentences with them.

To learn more about them, you can follow them on Twitter at @IAF_FAI, where they post active updates, news, and analysis, or go to their website iaf-fai.org, where they post more in-depth articles about Indigenous struggle all around the world.

If you do follow them on Twitter, just note that there is an active fake account attempting to badmouth and discredit the work of the IAF; it uses the handle @fai-mujer, and its interventions have confused followers of the IAF in the past. To see a full account of this situation, plus of course many more topics that are about the work rather than internet trolls, you can visit them at iaf-fai.org! To learn more about the Convergence, to register, and for tips on outsider participation, you can visit taalahooghan.org.

If listening to this leaves you curious about whose land you were born on or live on, a fantastic resource is native-land.ca, which provides a worldwide map, insofar as that's possible, of Indigenous lands and the names of their peoples, spanning thousands of miles.

For more great interviews with members of IAF, including words from Bad Salish Girl and Green City:

Rev Left Radio

Coffee With Comrades

A list of recommendations from B and e:

-Do some digging and research to find the many recent authors who have done the work to center Indigeneity and decolonization,

-Read the complete works of Cutcha Risling Baldy on decolonized and Indigenous feminism,

-Talk to and listen to Indigenous people, and do the necessary research so you don't ask folks to perform unnecessary emotional labor.

Books:

Open Veins of Latin America by Eduardo Galeano (in Spanish: Las Venas Abiertas de América Latina)

An Indigenous Peoples' History of the United States by Roxanne Dunbar-Ortiz

1491: New Revelations of the Americas Before Columbus by Charles C. Mann

Our History Is The Future by Nick Estes

500 Years of Indigenous Resistance by Gord Hill

Some good podcasts from Indigenous folks, recommended by William of TFS, which, while not politically anarchist-identified, are good to listen to!

All My Relations by Matika Wilbur and Adrienne Keene

While Indigenous by the NDN Collective

Stay tuned next week for an interview with Kanahus Manuel, a Secwepemc woman fighting a pipeline thru her lands in so-called BC!

CIMA Speaks about ICE Raids

But first up, Bursts spoke with Bruno Hinojosa Ruiz of the local immigrant advocacy group CIMA about the threatened raids by ICE and CBP, ways for folks to get plugged in wherever they are with defending their communities, helping those most targeted, and strengthening our bonds. More about CIMA can be found online by searching C I M A W N C on facebook or at their site cimawnc.org. After the conversation, Bursts learned that there's a wiki page compiling ICE offices and companies profiting from immigration police and Border Patrol. That wiki can be found and added to at https://trackingice.com/wiki/Main_Page

Rest In Power, Willem

In related news to the ramping up of ICE repression of people around the so-called US, protests, sit-ins and sabotage of profiteers have been on the rise. Much of this can be tracked by visiting https://itsgoingdown.org/closethecamps/. Of note, someone in Asheville anonymously claimed responsibility on IGD for damaging an ATM owned by PNC. Also, on Saturday, July 13th, a 69-year-old northwest anarchist named Willem Van Spronsen was gunned down by authorities outside of the Northwest Detention Center in Tacoma, WA while attempting to destroy buses used by GEO Group to transport detainees to and from the center. Van Spronsen was allegedly armed with a rifle and attempting to set fire to the buses when pigs opened fire and ended his life. There's a statement by La Resistencia, a local group focused on shutting down the facility, up on fedbook and linked in our show notes. We're sorry to lose you, comrade, and mourn your loss, but are inspired by your motivation.

. … . ..

Music for this episode:

Affinity by Shining Soul off of We Got This

Look of Pain by Soul Position