Incorruptible Mass
Data Privacy
Please donate to the show!
This week, Incorruptible Mass is taking a look at the state of your data privacy. We'll talk with Kade Crockford, the director of the Technology for Liberty Program at the ACLU of Massachusetts, about what corporations are doing with your data, how the issue has been supercharged by ICE, and what data privacy rights you really still have left.
You’re listening to Incorruptible Mass. Our goal is to help people transform state politics: we investigate why it’s so broken, imagine what we could have here in MA if we fixed it, and report on how you can get involved.
To stay informed:
- Subscribe to our YouTube channel at https://www.youtube.com/@theincorruptibles6939
- Subscribe to the podcast at https://incorruptible-mass.buzzsprout.com/
- Sign up to get updates at http://ww12.incorruptiblemass.org/podcast?usid=18&utid=30927978072
- Donate to the show at https://secure.actblue.com/donate/impodcast
ANNA
Hello and welcome to Incorruptible Mass. Our mission here is to help us all transform state politics, because we know that we could have a state and a state legislature that truly represents the needs of the 7 million people who live here, and that's what we all want. So today we have a very exciting guest. We are going to be talking about data privacy. We have some bills that are in the Senate and the House of Massachusetts. We're going to talk about the status of those bills. We will talk a lot about enforcement and how much enforcement matters, even if those bills pass. We will also talk about the history of how those bills sort of came out of thinking at the ACLU. I almost said UCLA, but ACLU, of Massachusetts. And then at the end, we're really going to talk about strategy, because I think this is an example of an amazing strategy for getting bills through our notoriously don't-pass-anything, do-nothing State House and State Legislature. So before we do, I am going to have my illustrious co-hosts introduce themselves, and I will start today with Jordan.
JORDAN
My name is Jordan Berg Powers, I use he/him, and I am a proud member of the ACLU and have been for, oh God, I'm going to date myself, many years, over 30 years. I just really appreciate the work and the support and the thinking about all these rights that help everyone, right? Like it's really important to remember that this work is about everyone. And so just really thankful for this conversation today.
ANNA
And Jonathan?
JONATHAN
I was just going to joke, Jordan, that you're going to be like, you've been a member for just a few years, as you are 25. But Jonathan Cohn, he/his. I'm currently walking in Newton on my way back to Boston, but joining from the South End in Boston typically. And I've been active in progressive political campaigns for over a decade now. Also always love to support the ACLU. Ah, handing over the banana.
ANNA
But I'll say that the sky, the clouds behind you look like a painting, and the trees behind you look like the Blair Witch Project. Fantastic. I am Anna Callahan, she/her, coming at you from Medford, where I'm a city councilor. I've really done a lot of work at the local level, but also across the country, training groups to elect true representatives of the people, those kinds of things. And I also am super excited about the ACLU. I gave a giant donation to the ACLU after I was arrested and thrown in jail for three days, and then we won a lawsuit against the LA police, with some help from the ACLU. So love the ACLU, fantastic work. And now, the moment you have all been waiting for: we are super happy to introduce Kade Crockford. If you can talk a little bit about yourself and the work that the ACLU does before we dive into data privacy, that'd be great. Thanks.
KADE
Sure. My name is Kade. As you said, I work at the ACLU of Massachusetts doing non-litigation policy reform, so mostly state and local lobbying and campaigns on issues related to technology and the criminal legal system. Historically that's meant a lot of privacy work. You may know us from our facial recognition campaign. We launched a campaign in 2019 to try to bring some democratic controls over government use of facial recognition technologies, and we had a lot of success. There are currently nine or ten bans on government use of facial recognition at the local level across Massachusetts because of that campaign. And we also passed one of the country's first state laws imposing some regulations on police use of facial recognition. The ACLU does a tremendous amount of other work, though. You've probably seen us working on LGBTQ rights, abortion rights, immigrants' rights. We work on free speech, obviously, and freedom of expression. Currently, a fun case that we're engaged in in Quincy: my colleagues are suing the city of Quincy because the mayor wanted to put up some very religious statues outside the new police and EMS station there, and a number of Quincy residents didn't like their tax dollars being spent on something overtly religious like that. So anyway, we do a lot of different work on a lot of different issues, criminal law reform, police reform; we really run the gamut in terms of defending people's basic civil rights and civil liberties here in Massachusetts and across the country.
ANNA
Amazing. Such great work. So before we get into the bills themselves and the status of the bills, I would love it if you, and feel free to chime in, my pals here, can give us a little bit of an understanding of why data privacy matters, and what, currently, without any new legislation, companies and other organizations can do with our private data.
KADE
So we have a real crisis in this country, which is, well, a lot of crises, obviously, but one of them--
JONATHAN
Only one? One mega crisis.
KADE
That's right. One of them is that technology has advanced considerably over the past 20 years, including consumer-facing technologies like smartphones, which didn't exist 20 years ago, and the use of internet technologies to facilitate almost every aspect of our lives, which wasn't the case 20 years ago either. In this 20-year period, there's been this explosive growth in digital technologies, which almost exclusively involve some level of surveillance: corporate surveillance, companies collecting detailed information about us. The companies that are the most profitable in Silicon Valley, Facebook and Google, have made almost all of their money from surveillance. They provide us with services and products, yes. People like Instagram and Reels and things of that nature. People enjoy Google Maps and Google Reviews. Maybe not so much Google itself anymore, because it really sucks shit, actually. But many of these products are useful to us, no question. What's happening under the surface, though, is that for these companies, we are essentially the product. We are providing Mark Zuckerberg and the folks at Google with huge quantities of information about what we do, where we go, who we associate with. And all of this information is packaged by these companies and resold to other companies in the form of advertisements, basically. Facebook and Meta, Google, and all of those companies basically make all of their money from advertising, and almost all of that advertising is based off of surveillance of consumers like us. So that's a huge problem. It creates a lot of really unfortunate incentives in the technology industry, and it also leads to very, very serious problems for ordinary people. I can get into some of those too, if you're interested.
JORDAN
Yeah, I just want to put a little pin on it: if you imagine a world where somebody knew everywhere you went and probably had access to some of the purchases you made and some of the things you did, that's really valuable. And that's the current landscape we're in, without any sort of regulation, opting in, opting out, any of the protections you would expect. You would really not want to share that with every single person and every single entity. And it's especially dangerous in a world where, you know, it would be sort of innocuous but annoying if it was just corporations trying to sell us things. But the other world this intersects with is a world where ICE is using that data to target people, where people are getting targeted for going to a clinic and getting gender-affirming care or going to get other services. So, you know, as somebody who lives around the corner from a Planned Parenthood, I imagine a world where it looks like I'm always at Planned Parenthood, right? And then I become a target for some reason. And currently someone could actually just easily find that. They could easily buy that, and they could easily track where I individually am going and what I'm doing. And that's true, surprisingly, both for regular people and also for elected officials and important people. It's just wild that you could track a Secret Service person, you could track a state rep. There just needs to be some common-sense oversight of this stuff.
KADE
Yeah, that's exactly right. And, you know, I got distracted by my own spiel, but the first part of the spiel was: technology has advanced rapidly. The second part is: the law has not. So unfortunately, we do not have a consumer privacy law for the 21st century in Massachusetts or at the federal level. Massachusetts residents are totally unprotected. The only thing that protects us right now, in terms of how tech companies collect our data and manipulate it and so on, is our consumer protection statute, Chapter 93A, which essentially just says companies can't outright lie to us about what they're doing with our information. That's it. As long as they have a 45,000-page, you know, two-point-font policy somewhere on their website that says, here are the things that we may or may not feel like doing with your information, and they don't explicitly lie, then it's sort of anything goes. And that's obviously an unsustainable situation for a democratic society. Yep.
ANNA
So I think, oh, Jonathan.
JONATHAN
Oh, I just wanted, as somebody traveling during this conversation, to illustrate that. Nobody needs to know where I started during this podcast or where I end up at the end of it, other than me and those of you who are talking to me right now.
ANNA
And everyone listening to this podcast. So I'm going to go ahead and turn us to the actual bills. And if you don't mind, just let us know what those are in brief and what the status is in each of the two houses.
KADE
Sure. So you're probably familiar with the notice-and-choice model of privacy law. That really comes first from Europe, the General Data Protection Regulation, which was passed nearly 10 years ago at this point. Basically, that says that, for the most part, companies need to give users a choice, right? About whether you'll accept tracking cookies and all these other things. But as anybody who's ever used the internet knows, that shit is extremely annoying. People just click, click, click to try to get past the pop-up screen so that they can access the content they want to see. And the latest evolution of privacy law thinking says, okay, maybe that's not the right approach for the core privacy framework. Instead, why don't we think about it the way we think about, and let's pretend we're having this conversation three years ago, food safety, right? When you go to the supermarket, you don't have to read the fine print on a bottle of milk to make sure that it's not going to kill you and your family when you drink it, right? You simply go to the grocery store, you buy the milk, you take it home, and then you eat it. And that is because we have standards in the law that regulate food safety, and they just apply to all companies. If you sell food, you can't kill people, right? You have to make sure that it's safe. That's in the law. Okay. So, crazy idea. Yeah, right. So again, let's pretend it's pre-RFK Jr., so we can treat that as just a joke.
JONATHAN
If the milk is edible as opposed to drinkable, it might be a problem.
KADE
Fair enough. So you drink the milk. In any case, we think the same sort of system ought to apply for data privacy, which is to say: you don't have to be a lawyer, you don't have to be a privacy lawyer, you don't have to be a nerd who spends all of their free time reading every single policy just to decide which services you'll use and which you won't, and you don't have to click through all those things and understand your privacy choices. There are simply standards in the law that limit how companies can collect and process our data. That framework is called data minimization, and it is basically the core of these data privacy bills. So the Senate in September passed legislation that we were very happy with, that achieved, I would say, two out of three of our coalition's primary goals. The first was strong data minimization language. Check, that's in there. The second was a ban on the sale of especially sensitive data. That is precise geolocation information from our cell phones. That is biometric data, information about our physical characteristics that's unique to us. It's information about our health. We go online and we search, what are the symptoms of gonorrhea? We'd probably prefer if data brokers were not selling that information to the highest bidder, right? So the first category is data minimization. The second is a ban on the sale of sensitive data. And then the third, which the Senate did not hit. So we got those two pieces in the Senate bill. Really, really strong, really good. The third is critical. That's enforcement, right? How are we going to ensure that these companies that are engaged in the buying and selling and manipulation of our personal information are not violating the law? Well, historically, when the legislature has wanted companies to follow a law that they pass, they create what's called a private right of action. And that means if you as a consumer see your rights violated by a big tech company like Google or Microsoft or something like that, you can get a lawyer and personally sue them. You can join up with other consumers in a class action and sue that company and say: you violated the law, I want some money, right? And the reason that that's so important is because it's a real deterrent. It's a really effective enforcement mechanism. And unfortunately, we did not get that in the Senate bill. In the Senate bill, they said the only entity that can enforce the law on behalf of Massachusetts consumers is the Attorney General's office. And we can get into in a second more about why that's a problem. But anyway, the Senate bill accomplished two out of three of our coalition's core requests: number one, strong data minimization language; number two, a ban on the sale of sensitive data. We did not get the private right of action. Now we turn to the House. The action's now in the House. The committee of jurisdiction put out a strong bill. It is now at House Ways and Means. That bill, again, has very good data minimization language, and came close on the ban on the sale of sensitive data. They took a slightly different approach than the Senate did. They entirely banned the sale of precise geolocation data, but stopped short of banning the sale of other sensitive data. Instead, they said you have to opt in as a consumer to the sale of your personal sensitive data. And then they sort of split the difference on the private right of action.
This could get very weedy very fast, but suffice it to say, they tied a person's ability to sue under the law to, one, the type of company that committed the violation. So in essence, they're saying the biggest actors, Google, Facebook, companies like that, you can directly sue. And then, number two, they tied that enforcement to existing consumer protection law, 93A. There's a private right of action under 93A, and so the House bill says you can sue under 93A, under that private right of action. We have some issues with that, which we can get into. But anyway, that's where things are right now. The Senate's passed a bill. In the House, the bill is currently in Ways and Means. It's very much actively being looked at. We suspect that something is going to happen, that something is going to pass. There's been a lot of work that folks have put into this in the building, and so we don't think this is going to be one that they leave on the cutting room floor. But of course, you never know. So we're actively organizing, not just to ensure that the bill gets those things right, but also that they actually do it, because there are obviously competing demands.
JORDAN
And I think it's important for folks to understand this isn't a problem that's off in the future. This is a problem now. As I said earlier, people are getting tracked now. ICE is actively doing this. You can download an app, if you're an ICE agent, that tracks people right now. There is tracking software. You can Google it and purchase tracking data at this very moment. It would take you about the length of time it takes to listen to this podcast to get set up and start finding people and their location data. It's, unfortunately, a very low barrier to entry. So this is happening; this is real. And to some extent, we know that some people who have targeted elected officials have used this data as part of how they did it. So this isn't some far-off, take-your-time, we-can-wait-a-couple-more-sessions-to-get-it-right problem. This is it. We're in the thick of it. And so there's a real urgency that I think is helping move this forward. I share your hope that the legislature will pick this up because of that. And also we, as people listening and people who are advocating, need to understand that this is not a far-off problem. I will say, when I first came across a lot of these issues, it was like, oh, some people are using it, but now this is real, right now. So, yeah, I just think it's important to note that.
KADE
Yeah, that's exactly right. There's recent reporting from 404 Media, which I encourage everybody to subscribe to; it's a great independent media outlet that covers tech issues. They recently posted a story about how ICE has this new tool, built off of commercial location data, that allows them to basically draw a box on a map of any community in the United States, immediately identify all the cell phones currently within that zone, and then click on any of them and follow those people as they move around a community. So this is an urgent issue right now in the United States.
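A quick illustration for the technically curious: the kind of tool Kade describes takes very little engineering once the location data is for sale. Below is a minimal, hypothetical sketch; it is not ICE's or any vendor's actual code, and the data schema, field names, and coordinates are all invented. It only shows the two steps described above, given a feed of commercially sold location pings: draw a box on a map and list every phone that pinged inside it, then follow any one of those phones over time.

```python
from dataclasses import dataclass

@dataclass
class Ping:
    """One commercially sold location record (hypothetical schema)."""
    device_id: str   # pseudonymous advertising ID tied to a single phone
    lat: float
    lon: float
    timestamp: str   # ISO 8601

def devices_in_box(pings, south, west, north, east):
    """'Draw a box': return every device that pinged inside the rectangle."""
    return {
        p.device_id
        for p in pings
        if south <= p.lat <= north and west <= p.lon <= east
    }

def track_device(pings, device_id):
    """'Click on a phone': return that device's pings in time order."""
    return sorted(
        (p for p in pings if p.device_id == device_id),
        key=lambda p: p.timestamp,
    )

# Invented example data: a box drawn roughly around downtown Boston.
pings = [
    Ping("ad-id-123", 42.355, -71.060, "2025-01-01T09:00:00"),
    Ping("ad-id-123", 42.340, -71.085, "2025-01-01T12:30:00"),
    Ping("ad-id-456", 42.650, -70.900, "2025-01-01T10:00:00"),
]
for device in devices_in_box(pings, 42.33, -71.10, 42.37, -71.03):
    print(device, [(p.lat, p.lon, p.timestamp) for p in track_device(pings, device)])
```

The point of the sketch is only that the draw-a-box-then-follow-a-phone workflow is trivial to build; the barrier was never technical, which is why the bills discussed here target the sale of the data itself.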
JONATHAN
Kade, one thing I'd be curious to hear about is: how does what the Massachusetts legislature is working on compare or contrast to what has happened in other states, either things other states have already done or are hoping to do?
KADE
Yeah, great question. We're hoping that we land in a very similar place to what Maryland has instituted. Maryland has probably the strongest consumer privacy law in the country. It accomplishes most of the goals that we're setting out to accomplish: it has strong data minimization language and bans the sale of sensitive data. Unfortunately, they did not get a private right of action. So we're hoping to one-up Maryland in that sense by imposing very similar protections but including stronger enforcement language. There are many other approaches. One of the things the tech industry has realized over the past five or so years is that they can basically rush into a state, lobby the hell out of lawmakers who maybe don't really understand a lot of these issues or hadn't heard from folks on the other side as much, and kind of rush through a really shitty law that they call a privacy law that basically gives nobody any new rights. They've done this in a number of states, including Connecticut, unfortunately. And so when the industry's lobbying in Massachusetts, they'll say, oh, well, we should do what Connecticut has done. Well, what Connecticut did was entirely at the behest of the tech industry, and even officials there have now realized, whoops, we made a mistake, right? This is not actually protective of our residents. So there are a lot of different approaches. California's law is pretty good. The Maryland law is really good. But there are a number of states, like Virginia and Connecticut, that have passed really, really bad privacy laws.
JONATHAN
I think one thing I just want to underscore is how important it is to have advocates like Kade who specialize specifically in technology policy, because there's such a tendency, I feel, on any governmental body to treat industry folks as having the best expertise on an area: because they work at Facebook or Google, they understand tech policy, so we should listen to them about data privacy. That's like asking the foxes how best to guard the henhouse. Hence the need for strong, technology-informed civil liberties advocacy work. So shout out to all of that.
KADE
Absolutely.
ANNA
Before we go on, a reminder: we are going to talk about enforcement and how important it is. We're going to talk about the history. But we're really going to have an amazing conversation about the strategy that they used and how to get things passed through the State House; I'm looking forward to that. Before we do that, I'm going to mention that we have a little link below. We love it when people donate to the show. It doesn't pay any of us, but it does pay our young people who help us with the video editing and the audio editing, who help us get the word out on social media, who help us with our graphics. With all of that, you really help ensure that this kind of information, which you do not hear anywhere else, can get to as many people as possible. So we appreciate our donors so much. Thank you. And now I would love for us to talk about enforcement. You touched on it a little, but I just want to have a bit of a deeper discussion about what the possibilities are. What do you think will happen if they don't pass strong enforcement? What will it look like if that doesn't happen?
KADE
Yeah. So the industry's number one goal is to make sure that there is no private right of action in this legislation. They're very clear about that. They are gunning for the private right of action, and that's for a very simple reason: if there's no private right of action, the biggest tech companies in the world can ignore this law. They do not have to follow the law. They will make a dollars-and-cents business calculation that it is more profitable for them and for their shareholders to continue to ignore the law and do whatever they want with our personal information than it is to face, maybe once every 10 or 20 years, a settlement from the Attorney General's office for, I don't know, even 500 million dollars or even a billion dollars, right? That's a very simple business calculation for them. On the other hand, if there's a private right of action, that's when you see potential for real pain at these companies if they don't follow the law. And that's why they don't want it. They don't want to have to deal with this and to comply with it. They're willing to take the essentially slap-on-the-wrist speeding ticket that the AG suing them every 10 or 20 years would amount to. Whereas with privacy laws that have teeth, for example the Illinois Biometric Information Privacy Act, a statute Illinois passed over 10 years ago, there's a private right of action that says: if Facebook collects someone's biometric data without their consent, that person can sue. And as a matter of fact, it has been very effective. People have sued companies like Facebook, and that's why in Illinois you have significant biometric privacy rights that you really don't have in parts of the country that are not covered by laws with robust enforcement. So, yeah, look at it this way. We don't have a murder statute on the books that says only Andrea Campbell can put somebody in prison for murder, right? That would be ridiculous.
JORDAN
Yeah.
KADE
We don't, and the reason for that is because we would like to deter people from murdering each other, you know? There are many entities, police all over the state that can arrest you for murdering someone and DAs all over the state that can put you in jail for that. And so the industry's argument on the PRA is, oh, that's just a gift to the plaintiffs' bar; you're just trying to give away something to all these lawyers who want to get rich. Well, I'm sorry. Do we talk about any other type of enforcement of the law that way? Do we say, oh, we only have criminal laws because we want DAs to have power? No, we have criminal laws because we don't want people to murder each other, right? So the enforcement piece is a really critical issue, and we would encourage folks who care about this to reach out to their lawmakers, especially on the House side right now, and stress that you want to see a strong privacy bill enforced by a strong private right of action, so that your rights are actually meaningful.
JORDAN
I just think it's a-- go ahead.
JONATHAN
Oh, it's so wild for them to be like, it's just these attorneys wanting to make money, when these companies are the ones making money. They're the ones profiting by buying and selling the data.
JORDAN
I'll also say: they're basically admitting that they're criminals. It's sort of like saying, you know, nobody could sue if they just followed the law. But they're saying, we intend to never follow this law, and if you do this private right of action, someone else will get rich off of it, right? That's the admission. It's sort of funny, because they're admitting: we plan to never follow this law.
KADE
Exactly. And look, they talk about frivolous litigation; there are already mechanisms by which courts can punish people for filing frivolous lawsuits. That is just not a legitimate argument at all. Yeah.
JORDAN
And to be clear, we actually have strong protections in Massachusetts against frivolous lawsuits. The person bringing one would have to pay. It's not like Meta would have to search hard for a lawyer who could get somebody punished for a frivolous suit; that would be easy for them to do, if in fact that's what happens, right? So it's a real admission that their plan is to ignore the law unless there's some sort of enforcement mechanism. And I think the other piece that's frustrating about the way the legislature thinks about this, and frankly the way the media talks about these issues, is that they treat the people who are advocating for protections for all of us the same as the criminals, right? As if they're equal and you have to balance them out. So the media will quote the criminal, and it'll quote the people who are trying to protect everyone. To stretch the analogy, it's sort of like asking the bank robber, well, we had to liberate the money from the bank, and putting that opposite the people who are trying to put in protections so people don't get shot in a bank robbery, right? Or don't get their bank robbed. It's a really troubling way the media both-sideses this, and the way the legislature both-sideses this. These are people who are currently making money taking our data and doing whatever they want with it. And if you think that's wrong, asking them how to fix it is really dumb.
KADE
I agree. And, you know, Jordan, the analogy that I always think of is: right now we're having the arguments that people in the 1970s were having about industrial pollution, right?
JONATHAN
Yeah.
KADE
It's like in the 1970s, you know, God bless Richard Nixon. He created the EPA. That was a different Republican party, right?
JORDAN
Yeah.
KADE
But, you know, the creation of the EPA was a real water--
ANNA
You should God bless Ralph Nader, please.
KADE
Sure. I was joking, but yeah, that was a really different moment in this country. But the dynamic is very similar. You have an industry that profits off of harm, right? They are causing harm, and they are making a tremendous amount of money doing it. And they're essentially saying to lawmakers: we want to continue to profit by hurting people, and don't you dare interrupt our profit line. Like, line must go up, and line goes up when we can dump industrial waste into the Charles River. Line goes up when we can sell the personal geolocation data of every cell phone user. And how dare you interrupt our moneymaking scheme here. It's a very similar playbook, a very similar dynamic. And we're hopeful that Massachusetts lawmakers will see it similarly and see that the public interest needs to be paramount here, that it needs to come first, and not personal profit for these companies.
ANNA
Great. All right. Now, because I know we only have a few minutes left, I want to make sure that we get to this, because it's my favorite part of the whole conversation. Please tell us the history of where this data privacy work came from, in terms of the way the ACLU thought about it and what your strategy was.
KADE
Yeah. So after the Dobbs decision came out, actually prior to Dobbs, because we all knew it was coming, we were doing a lot of thinking inside the ACLU about--
ANNA
And I know our listeners are super, super political, but, like, the Dobbs decision? To remind us?
JONATHAN
Yes.
KADE
Yeah. Dobbs overturned Roe v. Wade, overturning the court precedent from the 1970s that said that states can't ban abortion, basically. That opened the floodgates to all of these laws in places like Texas and many other states banning abortion. And we thought, okay, in Massachusetts, we have spent the past two legislative sessions, with our coalition partners including Progressive Mass, shoring up abortion rights, ensuring that providers are protected here, and doing something new, which was providing state funding for abortion funds, which we had never done before. So a lot of really important work went into the prior two legislative sessions shoring up abortion protections here. Great. Now, what are we going to do to protect providers' digital privacy, and the digital privacy of people who are coming to Massachusetts to benefit from those really progressive abortion laws? We thought: in crisis, there's opportunity, right? We may not have been able, prior to Dobbs, to convince lawmakers in Massachusetts to ban the sale of cell phone location data. But after the Dobbs decision, we were aware that all it would take for the attorney general in Texas to figure out not just one person, but every person who left the state of Texas and came to a Planned Parenthood clinic in Massachusetts, is to go on a data broker's website and buy their cell phone location data, thereby downloading a list of every single person who left Texas to come here to seek protected healthcare. So we realized that was actually an opportunity, a huge political opportunity. The other thing that we thought about, crucially, is that when I talked about data minimization earlier, it's possible that some of you stopped listening, and that's because it's extremely fucking boring. Excuse my speech. It's really important, but it's also extremely boring, and it's really difficult to get people emotionally invested in something like data minimization, right? However, it is not difficult to get people emotionally connected to the idea that Elon Musk and every other billionaire douchebag should not be able to buy data showing everywhere you go, everywhere you've ever been, everywhere your children go, everywhere they've ever been, and sell that information on the open market to whoever wants to buy it: anti-abortion zealots, people who intend to do political violence against political leaders, bounty hunters, folks who are chasing down immigrants, like ICE. That's a really-- sorry?
JONATHAN
Oh, I was just going to say: like domestic abusers.
KADE
Domestic abusers, exactly. So it's not a difficult connection for people to make. People really get that instinctively. So we thought, okay, instead of trying to fight for an omnibus, comprehensive digital privacy bill that has so many moving parts and is so difficult to message, frankly, why don't we try to build political power and a coalition around this core demand, around what I view as the most sensitive type of data: this geolocation data that shows, again, not one thing about us but everything about us, and not just about one person but about every person. So that's what we did. We built this campaign focused on the Location Shield Act, legislation that would ban the sale of precise geolocation data for phones that are physically present in Massachusetts. And it's been really successful, so successful, actually, that as an advocate I've experienced something for the first time, which is lawmakers saying to us this session: okay, we see that you guys want to ban the sale of cell phone location data, and that's fine, we're going to do that. But we also think we should deal with this whole other range of digital privacy issues, so we're going to do a lot more than what you asked us to do. Which, frankly, was completely shocking to us as advocates and a really welcome surprise.
ANNA
Fantastic. That is so awesome.
KADE
But yeah, I think what it shows is that, you know, there's this famous quote from a 19th-century architect whose name I always forget, but he was involved in the construction of the Chicago World's Fair, and in conversations with his colleagues he said, small plans don't move men's hearts, right? And that's sort of how we thought about our facial recognition campaign too. It's why we were gunning to ban facial recognition at the outset of that campaign. Because when you talk about sort of nerdy, in-the-weeds regulations, it's really easy to lose people. When you have a clear demand that emotionally resonates with people, it's much easier to build a powerful coalition and the political power that you need to accomplish something on Beacon Hill.
ANNA
Amazing. And I am going to say, I know you have to leave, and I'm sorry that I didn't quite time this as perfectly as I was hoping, but I would love to just open it up to the peanut gallery, as usual. You're welcome to stay as long as you are able. But I really want to talk about this strategy, because to me it's a fascinating strategy to just look at what is happening right now and think about how we can use this to dream big, number one, which is what we are always talking about, right? I completely agree with you that when we take something small... and people are always, you know, there will be this group of people that's talking to the other side, and they let it get whittled down, whatever topic it is, right? It'll get whittled down, it'll get whittled down, it'll get whittled down. And then the thing that they put up is like, why would anyone even care? Because it's so small at this point that you're not gonna have the public behind.
KADE
Behind you.
ANNA
So the idea of looking at the landscape right now: what do people care about right now, what are they concerned about? And after Dobbs, it was like, oh, holy crap. These reproductive rights that we have had for decades, that we assumed would be there forever, are suddenly gone. And now we're offering something amazing as a state to people outside of our state, which is that they can come here and get the medical care they need. But those people are now in danger because of their data privacy. So that was such a great insight: to see that, to understand that as something deeply important to people, and to really use that to get our do-nothing legislature to actually do something, finally. And I would love to talk about that as a general strategy, as well as policies we could maybe use that kind of thinking on. I would love to hear your thoughts.
JORDAN
Yeah.
KADE
I mean, basically, you have to meet people where they are, right? It's a core organizing principle, and that is true of ordinary people that you're trying to bring into your coalition to build political power. It's also true of legislators: they are responsive to what they're hearing from folks in their communities. And when they are hearing, we're so concerned about Texas banning abortion, but we feel powerless because we live in Massachusetts, well, you'd better figure out a way to offer them something. So that's how we thought about it: how can we give lawmakers something they can be proud of when they go home to their constituents and talk about their accomplishments, and accomplish something really meaningful? Because it's not enough to just say, oh, we're very concerned about what's happening in other parts of the country. Let's do something about it, right? And when you can figure out a policy solution to a really thorny problem that meets people where they are and answers the bigger questions people are raising about the political moment we're in, I think that's really the sweet spot.
JORDAN
Yeah, I'll just agree really quickly and say that's the organizer's dream: to take an opportunity that the moment is giving us. And the important thing is to have really thoughtful, big things ready. That's why it's so important for us to dream big and to have these ready, right? It's not that you were just like, oh, what are we going to do, and you're trying to wing it. This is something that the ACLU, that you, have been working on for some time, putting real thought into. It's a thoughtful piece of legislation; it wasn't something that came together overnight. And so it's the marriage of that really good research, that really good thoughtfulness, that dreaming big, with the moment. And I think it's important to understand both that you need both and that you need to seize that moment. And I'll just end with: let's not assume it will get done because things are so bad. That's a mistake that advocates in Massachusetts have made more times than I can count. This only gets done if the people listening take the two minutes to send a text or make a phone call. I always tell people, if you don't want to talk to your state legislator, call them after five and leave a message. It's really effective. You don't have to talk to them, you can just say what you want, you're not gonna get interrupted, and it just takes two minutes. Like: I really think this is important legislation based on what's happening in the world. I think we need data privacy. I think it needs real enforcement. Please pass it. I'm done, right? So it will only get done if you take what you're hearing and you do that extra step. We have an opportunity because of the great thinking and the great seizing of this moment that happens with organizing, but it will take us. It'll take the people listening to make that happen.
KADE
Yeah, that's exactly right, Jordan. And I'll just say also, I've been shocked this session by how many House members have told me personally, we have been getting more calls about data privacy than any other issue. That is genuinely remarkable, because we're living in a time when everything costs more, transportation is still a major issue for families, healthcare is a major issue for families. There are kitchen-table, bread-and-butter issues that are very seriously affecting people in our Commonwealth, and yet the number one thing they're hearing about on Beacon Hill is data privacy. That is a testament to, honestly, Jonathan Cohn and the organizers at Progressive Mass, the huge coalition that we've built, and, more than anything, the number of ordinary people across the state who have picked up the phone, like you said, Jordan, and called their reps to say, I want you to do this. And the fight's not over. We are at a critical moment right now, where questions like, will there be meaningful enforcement that will actually deliver me these rights, really hang in the balance. So if you care about this, if you want to see ICE unable to access location data about Massachusetts people, call your rep and say: not only do I want strong privacy legislation, I want a private right of action. I want the ability to sue if these companies violate my rights.
JONATHAN
I just wanted to underscore the importance of that, noting that the money and the big tech lobbyists are definitely contacting state legislators all the time, so it matters that they hear from you. To slightly adapt a line that I use, based on commentary I got the other day: if one person calls a legislative office, maybe that's a personal hang-up; two people call, it's a couple of people; three people call, and they realize it's actually a community issue. So there's real value in calling, getting friends to call, and following up if you don't get an answer.
KADE
Wonderful.
ANNA
Thank you so, so much. The work that you're doing is incredible; we really appreciate it. Is there any way specifically that you can tell our listeners to get involved, aside from, you know, the general advice?
KADE
Yeah, I mean, you should get involved with Progressive Mass if you're not already. Go to Jonathan's phone banks. And if you don't have time for that, just make a single phone call to your state rep and say: I want strong data privacy, and I want strong enforcement with a private right of action. Thank you guys so much.
ANNA
Thank you so much. Wonderful to see you. Thanks for being on. Thank you, as always, to all our listeners.