Strong Feelings

Design for Safety with Eva PenzeyMoog

Episode Summary

We’ve all heard about unethical tech products that track and surveil users. But there’s another kind of harm happening in tech: abusers co-opting apps and other digital products to control and hurt their victims. Eva PenzeyMoog explains this growing problem—and shows us how to fight back. Content warning: This episode features discussions and specific anecdotes of tech-enabled abuse and interpersonal harm, including domestic violence.

Episode Notes

We’ve all heard about unethical tech products that track and surveil users. But there’s another kind of harm happening in tech: abusers co-opting apps and other digital products to control and hurt their victims. Eva PenzeyMoog explains this growing problem—and shows us how to fight back. 

Content warning: This episode features discussions and specific anecdotes of tech-enabled abuse and interpersonal harm, including domestic violence. 

Eva PenzeyMoog is the founder of The Inclusive Safety Project and author of the new book Design for Safety. Through her work as a tech safety consultant and designer, Eva helps people in tech design products with the safety of our most vulnerable populations in mind.

In terms of trying to talk about this stuff at work, or just with other people who work in tech, it was honestly kind of awkward because this isn't a topic that people like to think about. I talk a lot about domestic violence, though there are other ways that this happens: there are issues of child abuse, and elder abuse, and things like unethical surveillance of employees and workers. But domestic violence is the one that I focus on. And just bringing that up, kind of out of nowhere, during a brainstorming meeting, it's kind of weird. And now, you know, my team is very used to it. And they're all really on board and are actually helping with some of this work, which is great, but at first, I think people just aren't used to saying, "Hmm, what about someone going through domestic violence?" It's kind of like, "Wait, that's dark. Do we really need to talk about that?" And yeah, actually, we really do. 

—Eva PenzeyMoog, author of Design for Safety

Plus: in this week’s You’ve Got This, Sara discusses how feelings of powerlessness can lead us to look for things we can control. This can often manifest in some toxic workplace behaviors: micromanaging, inability to delegate, obsessing over data. If you, like so many of us, feel these behaviors creeping in, look for places where you can assert control over things that you can actually take ownership of: set a regular hour for a walk each day, institute “no Zoom Thursdays,” schedule a shutoff time during weekdays. And if you find something that works great for you, send us a message. We’d love to hear about it! For all this and more, head on over to https://www.activevoicehq.com/podcast.

Episode Transcription

Eva PenzeyMoog 0:00 I think people just aren't used to saying, like, "Hmm, what about someone going through domestic violence?" It's kind of like, "Wait, what? That's dark. Do we really need to talk about that?" And it's like, "Yeah, actually, we really do." 

Sara Wachter-Boettcher 0:22 Hello, and welcome to the brand-new fall season of Strong Feelings—the podcast all about the messy world of being a human at work. I’m your host, Sara Wachter-Boettcher, and I’m so happy to be back recording today. I just got home from vacation, where I went to the beach every day for a week, and got to spend a ton of time catching up with one of my oldest friends. Shout out to Andrea. It was a dream. If you haven’t had time off lately, and you possibly can take some, please do it. This rollercoaster of a summer has been rough—I’m looking at you, Delta, and Texas, and Ida, and, well, the list is really long. But the thing is, I know a lot of people feel guilty taking breaks or having fun when there’s so much to be outraged about and to organize against. And I want you to trust me: you need it, you deserve it, no matter what. Your body and your soul are going to thank you. In fact, taking rest, and the importance of doing that even when you’re working on something really crucial, that comes up in today’s episode, featuring none other than Eva PenzeyMoog, author of the brand-new book, Design for Safety. I’ll warn you up front: Eva’s work focuses on tech-enabled abuse and interpersonal harm, so we talk a lot about things like domestic violence, including some specific anecdotes. The stories aren’t graphic, but please take care of yourself. 

1:36 So, back to Eva. I first met her in, I think, 2019, when we were at a conference called UX Copenhagen. You know, when we used to go to in-person events? Yeah. So I was facilitating a workshop on design and inclusion, and at some point, Eva told me about the work she’d been doing on tech-enabled domestic violence. I was rapt. I was like, yes, I need to hear more about this. Well, my luck: after the conference, it turned out that we actually both had reservations for the same fancy restaurant at the same time. We were both seated at their community table, and that meant we got to spend a ton more time talking about her work, while eating a thousand tiny beautiful courses of food, in Copenhagen. Can you imagine? So awesome. So, fast-forward to today, and Eva has written a whole book about this work. It’s about how technology gets misused and exploited by abusers and others trying to cause harm, and what we—people who design and build digital products—can do about it. I just finished it, and it’s so eye-opening, with a lot of specific examples. Y’all should read it. But first, you should listen to Eva. Let’s get to this interview.

Interview with Eva PenzeyMoog

SWB 2:43 Eva PenzeyMoog is a designer, a speaker, and the founder of The Inclusive Safety Project. Her new book is called Design for Safety, and it just came out last month from A Book Apart. Eva, you know, I've been a big fan of your work for a while, so I am hyped to share it with more listeners. Welcome to Strong Feelings.

EP 2:59 Thanks, Sara. I'm so excited to be here. I love this podcast, and it's really exciting to be on it.

SWB 3:04 Oh my gosh. Well, thank you for that. I am excited to have you, and I'm so excited about your book. Congratulations. 

EP 3:11 Thank you. 

SWB 3:11 It's a big milestone. 

EP 3:12 Yeah.

SWB 3:13 Design for Safety. It's really good. I have read it. And if you work in design or tech in any way, I definitely recommend you read it, especially if you happen to care about humans, which if you're listening to this podcast, I hope you do. So I think first up, can you tell us a little bit about the book and what it covers?

EP 3:28 Yeah, absolutely. So the book covers different issues of safety, specifically interpersonal safety, that arise in tech. Specifically, the ways that people are weaponizing our tech products for harm against each other. So, you know, people are finding ways to turn our very legitimate products, products that have very legitimate uses, against each other for really serious harms: stalking, tormenting, surveillance, harassment, gaslighting.

SWB 3:56 And when you start talking about, you know, "tech products being turned against people," I think it would be really helpful to start with an example of something that people might not realize is a problem that people are using technology to hurt one another with. Like, what's a common example that comes up?

EP 4:09 Yeah, one very common one is people misusing financial products, so things like shared bank accounts, to take control of their partner's money, which obviously puts people in a very vulnerable situation. If you're in an abusive relationship, it's much harder to leave if you don't have access to your money. Another big one that is becoming really common is abuse via Internet of Things devices. So things like turning off the heat in the middle of a really cold night in the house of your estranged partner, for example, or even things that are less dangerous, but still very abusive, such as changing the temperature, and then later saying, like, "Of course, I didn't do that. You just don't know how this works because you can't figure out tech products because you're not smart. And actually, my experience is the only thing that you can trust. And you just don't know what's going on," which is gaslighting, which is a form of abuse. So lots of different things going on. And there's definitely a spectrum in terms of the types of abuse that are enabled through tech.

SWB 5:06 Now one of the things that I noticed when I started reading the book was just the, like, wide array of ways that abusers, or other people who are intent on doing harm, end up misusing tech products, and I'm wondering, how did you first start looking at this? Like, how did you start going, "Oh, gosh, I think that we need to be paying attention to this issue"?

EP 5:24 Yeah, so before I worked in tech, I worked in nonprofit, but I also did volunteer work as a domestic violence educator, and I was a rape crisis counselor. And between those two things, I learned a lot about these different issues, especially with gendered violence. And then when I came into tech, I started seeing ways that the technology is the abuse, or is enabling a form of abuse. One of my very first projects at the software consultancy where I work, 8th Light, was about, essentially, an app to help people who live in high-rise apartment buildings manage all the different things that go into that. And part of that was a guest list that, like, the front door person would, you know, see, like, you have your approved guests. So yes, you can go up. 

6:07 And I was thinking about a story that I remembered from the rape crisis counseling training, about someone disguising themselves as a food delivery person as a way to sort of sneak into the building and get actually right up into the apartment of his ex who had left. And that's obviously a very dangerous situation. So I was thinking, like, "Okay, so how could I design something to prevent that?" And I sort of came up with this idea of an anti-guest list, to let the front door person know that this person can't be allowed in, and actually, if they try to come here, you should let me know and possibly alert the police. So that was the sort of first one where I was like, it feels like there's a gap here, and there needs to be something else happening. And that kind of set it all off.

SWB 6:46 Yeah. And I think one of the things that is really striking is, like, once you start going down this path of "How can tech be used against people? How can it be weaponized?" it's like, "Oh, wow, that way, and that way, and that way," and there's all these things that come up that have maybe not been talked about that much. I'm wondering, as you started asking more questions, or pushing for these things in your design process, like, what was that like for you to try to make more space for these conversations? 

EP 7:10 So in terms of the actual trying to talk about this stuff at work, or just with other people who work in tech, it was honestly kind of awkward because this isn't a topic that people like to think about. I talk a lot about domestic violence, though there are other ways that this happens: there's issues of child abuse, and elder abuse, and things like unethical surveillance of employees and workers. But domestic violence is the one that I focus on. And just bringing that up, kind of out of nowhere, during, like, a brainstorming meeting, it's kind of weird. And now, you know, my team is very used to it. And they're all really on board and are actually helping with some of this work, which is great, but at first, I think people just aren't used to saying, "Hmm, what about, like, someone going through domestic violence?" It's kind of like, "Wait, that's dark. Do we really need to talk about that?" And yeah, actually, we really do. 

SWB 7:56 Well, yeah, I mean, so much of tech, I think, has really bought into that idea of, "We're designing for the happy path. And we're gonna think about all the positive outcomes." And so it can be hard to bring up the negatives or to bring up, like, "Wait a second, have we thought about people who are bad actors?" When you started having those conversations, what gave you the courage to say, like, "Wait, no, I need to bring this up, even though it's gonna be awkward, even though it might stop this conversation"?

EP 8:41 You know, it's a very unique experience, I suppose, that kind of allowed me to, like, bring all of these things together. It sort of felt like, if I don't talk about this, who is going to talk about it? Not in a judgy way or a bad way, but, like, I have this experience, and I'm able to make these connections. So I kind of feel like I have to because I can. And "if you can, and you should, then you have to" is kind of how I think about it. So that sort of mindset, I guess, is how I pushed through those early, awkward conversations.

SWB 8:49 Yeah. Did I ever tell you, you know, I worked in a rape crisis center too? 

EP 8:52 No. That's great. 

SWB 8:54 I worked in a rape crisis center for, I think, three years. 

EP 8:56 Wow. 

SWB 8:56 Yeah, did a lot of crisis line, and I did a lot of education programs for kids, actually, so I did work with a lot of child sexual abuse.

EP 9:02 Was that part of how you ended up doing your work with inclusive, compassionate design? And did that experience lend itself to that work?

SWB 9:11 It's an interesting question. I don't think that I had, like, explicitly drawn the connection at first. But the more that I thought about themes that showed up over and over again in my work, the more that I knew that that had absolutely, completely shaped me. And one of the things we did a lot of talking about, as you might guess, is consent. Consent is a topic that I think is not talked about a lot in technology, like, what are people consenting to with their digital products? A lot of consent gets glossed over, and then there's consent in this context, around how other people might access your information when it comes to, you know, an abuser who has access to things. And so I think that was definitely something that, for me, really, yeah, really shaped me. And so I'm curious for you, as you think about topics like consent, because that comes up a lot in the book, how do you see consent playing a role in this work?

EP 9:56 Yeah, consent is such a big one. I'm actually working on another piece of writing, maybe for my newsletter or something, that's all about consent, because I want to explore it more. But it comes up in so many different ways. Definitely consent in terms of, like, "What happens to my data? How is it going to be used? Do I really understand what's happening?" And then, "Do I have control over it if I leave this service?" Or, you know, even data, like, at your workplace, do you have control over that? Can you change it later? Is there some sort of record? I think a lot about authentic consent. You know, like, having a checkbox checked by default to opt in to a promotional email, and then saying, "Well, if they don't want it, they'll uncheck it." That's not authentic consent. That's not real, because people are busy, people are stressed, people typically don't read everything closely. That's super normal and expected, and yet we're going to expect that they're going to read this thing? Like, come on, people, that's not real. And in terms of safety specifically, I think a lot about it in terms of location data, because there's so much that is sort of set to public by default when it comes to location and other similar things. And then you have to kind of go into the settings and change it. And that also doesn't feel like authentic consent. Like, you haven't actually said, "Yes, I'm okay with whoever, everyone or my followers or whoever it is, seeing this part of my data." If you have to go through this flow and then turn it off, like, that's not actually consent. 

SWB 11:20 Yeah. I mean, I think about this example from, gosh, you know, a long time ago now, maybe five, six, seven, eight years ago. A friend of mine had realized that, back in the time when people were using things like Foursquare a lot and, you know, checking into places, which obviously, if you check in to a location, you are sharing your location with some amount of people, that's kind of the point of those services, there were these, like, other apps being built to pull that data and to do things like make, I think it was called, "Girls Around Me." 

EP 11:49 Oh.

SWB 11:50 Yeah. Which is very creepy sounding because it's a very creepy product that was basically designed to say, like, "Let's pull some Foursquare data, and then only show, you know, women in a certain age range in a certain geographical area around you." And it's like, "Oh, she consented to share her data on Foursquare." But does that mean she consented to have her data pulled into this other context that is pretty creepy? 

EP 12:12 Yeah. 

SWB 12:13 And, like, where does that consent fall apart? And I think that that's such an important question that we aren't asking enough and aren't having conversations about.

EP 12:20 Yeah, that's a horrifying example of what we're talking about, and really gross. And I feel like maybe you've had similar thoughts with your experience working at a rape crisis center: it's like all these problems in tech are just reproducing problems that exist out in the world, in our culture. And, like, a lot of people don't have a good understanding or good practices around consent when it comes to sexual activity. Why would we expect those people to then be thinking about consent in a legitimate way when they're building tech products? Like, it's all just reproducing these external issues.

SWB 12:53 Yeah, I think that brings up an interesting question, because I know one of the pieces of pushback that I've heard is that these are societal problems, and tech can't be responsible for fixing those things. And I mean, I think that there is some truth there: like, sure, technologists cannot fix the fact that somebody wants to be abusive. That's outside the scope of what the technology can fix. But I've also seen it sort of be used to avoid responsibility. And I'm curious, from your perspective, what do you think our responsibility is as people who are designing or building the software products people use?

EP 13:27 Yeah, that's a really good question. And it is something that I have seen a lot, even from people who, for the most part, kind of get it. They'll say things like, "Well, you know, it's up to a survivor to sort of understand the tech and to know how it's going to be used and to regain control." And I mean, I have a lot of thoughts about this. But the first one is that that assumes just such a high level of tech literacy; it's really easy for those of us who work in tech to forget that most people don't have that. A lot of people are using products they don't fully understand. I mean, I'm using products where I don't fully understand all of the features; I'm using them for one thing. So that's just a huge assumption that people have that capacity or ability or desire to understand these things. We kind of know that's not true. My second thought is, you know, you're a human being before you're a capitalist, and you should care about people. Obviously, that's hard to instill; it's the whole, like, "I don't know how to tell you that you should care about people" thing. 

SWB 14:20 Yeah.

EP 14:21 But then the third thing that I think about is, and I've talked about this a lot before, so sorry if any listeners have heard me say this already, but I'm, like, really obsessed with the history of the seatbelt, and sort of different paradigm shifts in general around safety issues. And, like, the seatbelt is something that, you know, used to cost extra in cars in the 1950s. Like 2% of people paid that money to get a seatbelt, and probably no one was actually wearing them. And now around 91% of Americans use the seatbelt. They're standard in cars. Car designers don't have an option not to include them. And that was all really hard-fought; we had to really fight for that change, because the engineers and designers of cars were using this exact same argument: like, "This isn't our responsibility. It's actually the user's job." They actually had marketing campaigns to put the onus on the users of the cars, the drivers, and say, like, "It's actually just an issue of educating people on how to drive more safely. We don't have to consider crash science." And the tire pressure, you know, they were calibrating that for comfort over safety. The dashboards were, like, bright chrome, and it reflected sunlight into people's eyes. They had all of these things putting responsibility on users over us designing and building the cars to be safer. And it took, like, 30 years of activists and everyday people sort of rebelling against that for the government to create the agency that oversees all of this stuff and make laws to actually hold the car companies accountable. And it's, like, the exact same thing that's happening in tech right now, where people are saying, "Not our responsibility, not our problem, the users just need to, like, learn what to do and be safe." And you know, they're allowed to self-regulate, which, we know how that's going. 

SWB 16:00 Yep. 

EP 16:00 So it's really bad, but I am hopeful, like, looking at the sort of history of other paradigm shifts. There's a law that got introduced recently around biased algorithms that's really promising. We're starting to get, like, the notice of politicians, because among people who work in tech, as well as everyday people, you know, there's been a shift against the big tech companies, which has been really great to see. So I think that eventually, the government is going to step in and say, "You don't think it's your job to consider user safety? We're gonna tell you it is, and we're going to have actual regulations in place to force your hand, because that's what it takes to make this change happen." 

SWB 16:34 Well, so in the meantime, there isn't a lot of regulation, and there isn't necessarily a clear answer, and I know that you've spent a lot of time figuring out, you know, what are some of the ways that designers, that technologists, can think about these things in their work? And going back to the thing about consent, how do we start thinking about consent? And I'm thinking about, in the book, you have this example of this totally mundane lack-of-consent issue with Cecile, I think is the name, and the grocery store? 

EP 16:59 Mm hmm. 

SWB 16:59 Yeah. Can you talk about that a little bit and talk about, like, how we might think about consent in our process?

EP 17:04 Yeah. So Cecile is an acquaintance of mine. She's Norwegian, she works in tech in Norway, and she had a conference talk about this thing that happened to her. She left her office, you know, in the before times, pre-pandemic, when she was working at an office, and bought some tea, like, to keep in her office. And then when she got home that night, her partner was like, "Hey, where's that fancy tea that you bought?" He had actually seen on his sort of grocery store app that she had purchased tea, and he was able to see all of her purchases. And he's not an abusive person; he hadn't set this up to monitor her. His app just started doing this without either of them doing anything to set it up. 

17:42 And they eventually found out that it was because they have a shared bank account. And it was sort of an issue of assumed consent. The designers of the grocery store app were like, "Well, you know, for people who have a shared bank account, we'll just show the transactions from both of those debit cards," was essentially what was going on. In this case, it was okay; it wasn't an abusive context. But in an abusive context, that so clearly becomes very dangerous, when you can see what people are purchasing, all of the sensitive things that you're able to buy at lots of grocery stores. And it also included a timestamp, as well as the location. And location data is always really, really dangerous to give out, especially when people have no idea that it's happening, and they can't, you know, sort of plan around the fact that their abuser might see this. So very dangerous implications. 

18:31 And I think to sort of answer your question about, like, what we can do in the meantime, thinking about these issues and thinking about consent: obviously, I do think it's worth it for individuals and teams to take these issues on, or I wouldn't have written this book and be doing the work that I'm doing. I think we do have an enormous amount of power. And I think it's really important for people who care about this stuff to talk about these things with their team and sort of build allies on their team around this work, with the goal of influencing the company leaders, because it's really the company leaders who are, you know, setting the tone for everyone and who are responsible. And I've actually been trying to be more intentional about not saying Facebook or Twitter, like these companies, as if they're people, because they're not. There are a couple of leaders at these companies who are actually responsible for all the harm that's happening. So I think Facebook is a lost cause. But at most tech companies, people do care. They just need to be informed. They need some tools, and then some of them need some pressure from the people around them to start saying, "Yes, take the time to think about this stuff." So starting with just building allies on your team is my advice in terms of where to start.

SWB 19:38 Yeah. And I think, you know, this goes back to something we talked about before: people are uncomfortable with dark conversation. But I can just imagine how, in a lot of contexts, that feature of that grocery app is really convenient. You know, I'm thinking about, like, in my own relationship, I'm like, "Oh, yeah, if it pooled all that stuff together, that would be fine, and we'd have a sense of, like, our household shopping, because we share a bank account." And so I think if you're focused on that happy path, and you're focused on brainstorming, like, "Oh, wouldn't it be cool? Wouldn't that be helpful? Wouldn't that be nice?" And if nobody's going, "Wait, hold on, what are the ways that that could go wrong? Who could that hurt? How could that be used against someone?" Right? Like, if those questions are not acceptable to ask and not worth consideration, then you miss all of the potential negatives of a feature choice and all you have is the positives, right? 

EP 20:27 Right. 

SWB 20:27 So what I'm hearing here is having these conversations and fostering that on a team is a really critical first step.

EP 20:33 Yeah, it definitely is. And it's also part of the reason that I feel really strongly about the process that I created to get at this stuff, which actually includes something that you created, which is stress testing. And I feel like, instead of relying on, you know, someone to raise their hand and bring up this issue and go through all the awkwardness and weirdness of it, having actual activities in place that you've established: "We're gonna have time to talk about this. It's going to happen on this day; we're going to spend the whole afternoon researching, and brainstorming, and doing different things." Having a process really takes the pressure off of one individual who's thinking through things outside of the happy path. So that's another thing that I sort of like about having a more formal process or a more formal strategy: it just takes that pressure off.

SWB 21:17 Yeah, I love that, that it takes the pressure off the individual to be doing something outside of the ordinary, and it turns it into, "No, this is the ordinary," and so- 

EP 21:25 Right.

SWB 21:25 What is that process? Can you say more about the process you've designed?

EP 21:28 Yeah, sure. So it's sort of a series of activities that can kind of just be overlaid onto, like, a general design process. So in the research phase, you set aside time to do research into existing issues that have been documented with similar products or features to what you're working on. So just, like, literally looking for news articles, looking at Google Scholar for any scholarly work, and then using that information to make archetypes, which is basically just a very scaled-down persona that defines the abuser and the survivor and sort of what the issue is. This person, you know, she broke up with her boyfriend, and now he's showing stalking tendencies, so her goal is to not be stalked with this product we're building. And then, you know, the abuser's goal is to find out where his ex-girlfriend lives using this product. 

22:15 And then going from there, you sort of use the research that you've done, as well as the archetypes, to brainstorm novel abuse cases. So things that you have not uncovered already in your research but are totally possible. And this is a generative brainstorm. I usually start with a Black Mirror brainstorm, because those are actually really fun, where you just try to think of the worst-case, most ridiculous, unlikely amount of harm that could happen, and then kind of rein it in into more realistic things. And then you just go into a process of solutioning and testing out that solution, similar to usability testing, where you test the prototype and then iterate based on the feedback. So testing it out: can our abuser archetype see any location data about a profile that has everything set to private? Like, are they still able to glean anything? And that does happen a surprising amount. I talked about Strava in the book and how they were sort of enabling some location data to sort of seep through, even if someone had set everything to private.

SWB 23:13 And Strava is an exercise app for those who don't use it. 

EP 23:15 Thank you. Yeah, they were responsive. They got some heat on Twitter, and then they fixed the problem, which was really awesome. My whole thing is like, "Wouldn't it be great if you had just never had to go through that because you had identified this in the first place?" But yeah, that's sort of the end of the process: to go through that testing, and then if you find issues, if you find that actually the abuser is able to achieve their goal or the survivor is not able to achieve their goal, then you go back and iterate.

SWB 23:38 Yeah, I think something that comes up in the book here is you talk about how it's not "if" somebody's going to try to use your tech product to harm people but more of a "when." The thing that came across really clearly is that abusers will work really hard to find ways to get what they want. 

EP 23:52 Yeah. 

SWB 23:52 And I think that that really plays in here, it's like, just assume that this is going to happen. The Black Mirror exercise sounds interesting because even though you've mentioned it as being worst case and extreme cases, and maybe those are unrealistic, at the same time, what I'm also hearing is, like, if it's possible to be done, people will do it.

EP 24:08 Yeah, absolutely. That was something that I thought about a lot and then sort of landed on, like, I don't want to talk about potential issues, or what might happen. I want to be very clear that this will happen. Because, you know, with something like Strava, and this has happened a lot, some very smart person identifies the issue, puts it on Twitter or brings attention to it in some way, and then the company fixes the issue, which is great. It's great when companies do that. But just because that's the way it shook out doesn't mean that no one was harmed by this; we just might not know about it. Survivors aren't exactly shouting from the rooftops about what they've been through. Obviously, it's a very personal and very bad experience to go through, and most people stay very private about it. And it also doesn't mean that it wouldn't have happened if they had not fixed the issue. 

24:53 So yeah, I'm very intentional about that. And it's something that people who aren't abusive themselves, or have never been in an abusive relationship, have no idea about: how creative and ingenious, and I'm not saying that in a positive way, abusers can be with their abuse. They are crafty, they are always looking for new ways to abuse things, and you have to just be aware of that reality. They are looking, they are analyzing, they are thinking about it constantly. That's their goal: to enact power and control over their victim, and they're very creative about it.

SWB 25:28 That's a hard reality to swallow. And I know, I hear from people that that can be a scary thing to really reckon with, to be like, "Oh, this is very serious," right, with something like stalking, which can be minimized very easily because it makes people uncomfortable. And I'm curious, how do you get people to take it as seriously as you know it to be from your own experience doing this work?

EP 25:49 Yeah, I guess I sort of lean into the awkwardness at this point, or just try to talk about those really worst-case scenarios upfront. Because yeah, with stalking, there's a range, there's a spectrum. It can be creepy and make someone uncomfortable, or it can be, like, literally a life-and-death situation to keep your location private from someone. You know, with stalking, I'll talk about, like, "Well, you know, we know that after a survivor leaves an abuser is the time that they're most likely to get murdered." And because of that, it is very important that we make sure that we're not leaking out location data, because in that situation, they really need to keep their location private. Because, you know, three women get murdered every day in America by current or former intimate partners. And that does have to do with people being stalked and their location being found, so we need to make sure we're not contributing to that. Those are really intense statistics, but I think it's important to bring up those realities. 

SWB 26:43 Well, speaking of statistics, something that you bring up in the book is that a huge percentage of domestic violence workers, like people who work at domestic violence centers and shelters and stuff, talk about tech showing up in their clients' situations. So I'm curious, yeah, how frequently does that happen? And what did they say?

EP 27:00 Yeah, it's incredibly frequent. Like, everyone who works in the domestic violence space knows that this is a huge issue, it's getting worse, it's gotten a lot worse during the pandemic. I've heard from different people in the space that, you know, we've all had time, and some people have gotten into like sourdough baking, or these different hobbies. Abusers have had more time to think about how they are going to abuse. That is their hobby, that's what they're spending their extra time doing. So there's actually been a really big increase during the pandemic, and there's been a documented increase in the tech facilitated stuff, because now you finally have time to get around to, like, installing spyware on your partner's laptop, or whatever it is. There's definitely been a lot more calls around that. And it's also been a lot harder because there's, you know, just a lot less privacy if your abuser is always home with you, and you used to have an escape during the workday or whatever, and that's not a thing. So we do know from people who work in this space that the tech side is bad, and it's just getting worse every day, pretty much.

SWB 27:59 If people were to go after this interview and go read your book, right, obviously, step one: get the book, read the book. What is one thing that you would really hope they start to incorporate into their process, like the first thing that you'd want them to be thinking about differently?

EP 28:12 I think I'd want them to start doing the research and understanding how this is gonna factor into the specific products that they're working on at their company or with their current clients. I tried to give, like you mentioned earlier, a really broad range in the book to help people sort of start to build a mental model of how this works. So I would hope that people would be immediately motivated to be like, "Okay, I'm going to do some research in, you know, chatbots," or whatever their thing is that they're working on, and see, like, what's out there and how it's been misused, and then start thinking about, like, new, novel ways that it could be abused for their own thing.

SWB 28:20 Yeah, that's uncomfortable work. That's hard work. How do you kind of keep energy for it and process the pain and trauma that comes along with engaging with all these stories of violence and harm? 

EP 29:00 Yeah, I really appreciate you asking about that because most people don't. Yeah, it is very intense work, so I, you know, have a therapist, and I don't think I would be able to do this work without her. Definitely recommend therapy for people who have access. And then the other thing is just taking breaks. Like, I know that sounds really basic, but it is kind of hard; especially when you're doing really important work, it can feel like you aren't allowed to take breaks. And that's definitely how I felt for a long time. And then I read a particular story, and I'm not going to repeat it because I honestly don't want it in other people's brains, but it was really intense, and I had to just shut my laptop and literally not do any more work on it for, like, two weeks. This was a few years ago when I was writing a chapter for an academic textbook on the topic. And that was kind of when I realized, "Okay, it's okay to take breaks. I want to be doing this work my whole life; this is my thing that I'm going to be doing, and I definitely need to have the sort of marathon, not a sprint, mentality." So, literally, I'll go, like, weeks at a time without doing any new research or reading articles. I ignore all the Google Alerts for different terms I have set up. And, like, that's okay; those things will be there later. I'll be able to find this article if I need it, even if I don't read it now. So I definitely have, like, an ebb-and-flow sort of approach.

SWB 30:18 Sounds like something everybody needs. It's like, I hear you, it's so important, right? And it's like, "Oh, gosh, this is so important. I can't take a break from it." And yet here you are: still human.

EP 30:26 Yeah. And I think looking into the history of paradigm shifts has been really helpful with that, to be like, oh, with seatbelts: it was a solid, like, 32 years before there was the first big thing that happened. That's a long time, and people need to sustain themselves. And I think that's the reality for tech: it's years. I think some of it might be decades, although, you know, I think we're able to move a lot more quickly than back in the 60s when activists were working on car safety issues. So I don't think it's gonna be quite that long. But it is a years-long process. And I also think a lot about the fact that the leaders of Facebook, they want us to feel this way. They want us to feel exhausted and overwhelmed, like we can't make a difference, and maybe I'm not actually cut out for this. Like, they're counting on that, which also kind of motivates me to be like, "Nope, Mark Zuckerberg, I'm actually going to take a break, and take a nap, and then come back to this because I'm coming for you."

SWB 31:20 I love that. Okay, we're gonna pause on that note because fuck yeah. Everybody should definitely pick up Design for Safety from A Book Apart. You can go to https://abookapart.com/products/design-for-safety to get the book. Eva, if people want to get in touch with you to tell you you're awesome, or to book a workshop with you, or anything like that, where can they reach out?

EP 31:37 They can email me. My email is eva@8thlight.com. I'm also on Twitter, under @epenzeymoog. My DMs are open. 

SWB 31:51 Awesome. Eva, thank you so much for telling us about the book. I hope everyone picks it up.

SWB 32:04 Okay, so Eva, we've talked a lot about what's in the book, but also I know writing a book is really hard. It's a lot of work. It's something that I know a lot of our listeners are curious about, maybe interested in, and maybe a little scared of. I think it is kind of a scary process. And so before we end our time with you, I want to dig in a little bit more into that. Particularly because this book came out during the pandemic, and you worked on it during the pandemic. You're one of those mythical people who wrote a book during the pandemic. And so I'd love to hear a little bit more about that process. When did you get started? And how did that go for you?

EP 32:36 Yeah, so I have sort of two thoughts about it. The first is that writing during the pandemic in some ways was easier because there was no FOMO. Nothing was happening. When I wrote that academic textbook chapter that I mentioned earlier, it took, like, an entire winter, and I was working full time still. And all my free time was going to this project. I didn't see my friends. I didn't go to family events. I missed out on all these concerts and different things; I was really resentful, and it really sucked. So, you know, there was nothing to miss during the pandemic. So in some ways, it was actually easier. But then the other thing is, I started infertility treatment in March of 2020, which is right when the pandemic started. So these two things have been intertwined for me the entire time. And during the time that I wrote the book is when my treatment really ramped up. So I did three IUIs, which is sort of like a precursor to IVF. It's a much less intense process, but it's still, you know, pretty intense. Those didn't work. I did four rounds of IVF. I got three embryos from those four rounds. Two of the rounds were total fails, like zero embryos, which is a surprisingly bad result for someone who, you know, my partner and I both, neither of us had, like, any known issues. And we're finally, hopefully, starting to get to the bottom of that. 

33:57 So anyway, it was really, really hard. And I think between infertility and the pandemic, these were two things that people are very powerless around. I'm vaccinated; you know, Chicago has a mask mandate indoors again. I'm wearing my mask, I'm doing the things, but, like, ultimately, I don't have power to fix it. And it's the same with infertility: you ultimately just have no power over what happens. It might never work, and you can't do anything about that except kind of stay in treatment. So having these two experiences so closely intertwined that have taken away so much power and agency, and there's been so much loss, being able to write this book during that time has been a blessing. Like, there's no other word for it that I can think of that fits quite as much, because it reminded me that I do have power. I do have agency. I can effect change in this way when it comes to making tech safer. And that has honestly been part of my survival during this whole time. Like, that's how I survived these two really awful experiences. So that's kind of what I want to get into when people are like, "Whoa, you wrote a book during the pandemic." That's kind of everything that's going on in my mind. Yeah, I did, but it's not like it was something that was, you know, contributing to grind culture. Like, it wasn't a hustle, gonna-work-really-hard-and-make-some-extra-cash type of thing, which you don't, from writing books. People should know that. And it was, like, hard at times. Obviously, it was a ton of work, but it certainly wasn't some big, really tough thing that I had to power through. It was actually the way that I've been surviving this whole time.

SWB 35:25 Yeah, I really hear that, and this is something I think about a lot: when we feel powerless, we tend to go looking for things to control. And it's like, how do you find something healthy to go control? Like, "Okay, I can have some control here; I can assert some of my power over this project, and that's a reasonable thing," versus trying to control people, which is sometimes how that powerlessness plays out. Well, when you think about your process for writing the book and for building the curriculum that you've built, something that I'm wondering is, as you first started building this and sort of as it's evolved, it seems to me that it's become, like, this whole new path for you, one that has completely shaped the way you want to take your career. And I'm curious, when did you realize that this was something that you wanted to be kind of all in on, that this was your thing? 

EP 36:12 Yeah. So I had my conference talk that I did before this; it was called "Designing Against Domestic Violence," and that's definitely what led to the book. And 8th Light has always been, like, really supportive of this work. You know, they're a software consultancy, they don't specialize in this, although now we're starting to because I'm starting to sort of train other people. But when they were like, "Yeah, great, go. You don't have to take PTO; go speak at conferences. That's great," I was like, "Oh, this could be my job?" You know, it was a few hours a month, every other month, but that was kind of what hooked me. And I was like, "I just want to be able to talk about this stuff all day."

SWB 36:47 Well, and now that you get to spend more and more of your time talking about it, I'm curious, where do you find that you still get pushback about sort of the value of this work? How do you keep that from stopping you?

EP 37:00 The main form of pushback I get is still what we were talking about earlier, with people sort of arguing against the entire premise of the book and my work, which is that, you know, we as technologists should be thinking about this and designing against it in the first place, as opposed to putting that responsibility on our users. That's still an argument that I see from people. I honestly feel like it comes from a misunderstanding. Someone actually brought up car safety; in their mind it was, like, a counterpoint. It was like, "Well, you know, there's all these things to keep us safe. And we still have problems." And I was like, "But yeah, exactly. Don't you know that there's a whole history there? The auto industry in the 50s and 60s is exactly like the tech industry is now." And people legitimately don't know. And I think it's a matter of education. And once in a while, you know, I'll get crappy dudes on Twitter saying, "You don't know what she did before he hit her," or whatever. And sometimes you just have to be like, they can't be won over, and I'm not going to waste my time. But I think most people can be won over, and it's just a matter of engaging with them. 

SWB 38:05 Oh, yeah. So that's some advice I want everyone to hear. The people who are going to go to that point and say, like, "Well, you don't know what she did before he hit her," those are probably not your audience. But there's a huge number of people who agree with you in theory or in principle but are lacking information or education, or they're scared of speaking up. It's like, go talk to those people, and if we get all of those people on board, maybe then we can go talk to the people who are at the fringe. 

EP 38:30 Yeah.

SWB 38:14 But we don't have to necessarily argue with those people all day, like, that is exhausting. 

EP 38:34 Yeah. And is that the best use of your time when there are all these people out there who are going to be more open to what you're saying? And I think about this with politics, with, you know, anti-vaxxers: some people, you're going to be able to influence, and some people, you're not, no matter what you do, but you're going to make yourself miserable trying. So maybe, since there are plenty of people that you can convince, that's probably where you should focus your energy. 

SWB 38:57 Oh, I love that so much. And so I'm thinking particularly about listeners out there who care about the kinds of issues we talked about here, who want to do more to advocate for justice, or inclusion, or compassion, or equity, all of these things, right, but for whom it can feel scary or risky to speak up and to kind of become known as "that person," you know, that, like, squeaky-wheel person? 

EP 39:20  Right. Yeah.

SWB 39:21 Yeah. So for people who are in that place where they're like, "I want to do this, but I feel kind of scared to use my voice in this way," what would you say to somebody who wants to kind of find some courage to speak up for whatever their issue is that they think needs more attention?

EP 39:35 I would say, so, first of all, like, finding your people is really powerful. And it might be that you find allies at your workplace, but it might have to be outside of your work. You might find a group of like-minded people, or even, honestly, other people on Twitter, things like that, but finding a community of like-minded people who you can go to to, like, share in your frustrations and get solidarity from. And then hopefully also, you know, starting to build even just one ally at work who can back you up when you bring this up. One of my go-to tactics is to have a plant in the room: someone that you've already talked to who's going to immediately be like, "Actually, yeah, we do need to talk about that. Thank you for bringing that up. Let's dig in," or whatever, to just sort of, like, immediately skate through the awkwardness of everyone just staring at you and not knowing what to say when you bring something up. But definitely finding at least one other person, or some type of online community, or both, to kind of go through this with.

SWB 40:32 I love that advice. It's so easy to feel alone when you feel like you're facing something big and difficult, like abusers misusing technology. It's like, "Oh my gosh, me against this issue," and it's not; there are other people who want to speak up about it as well. So I love this idea of going and finding them. Okay. So people need to go find their people, kind of practice starting to speak up, and then if somebody does want to do something real big and bold, like give a talk or write a book, what helped you tackle some of those bigger and more visible projects? What would you tell someone who's sort of wanting to go down that path?

EP 41:07 Yeah, I would say to find someone who has done one of those things and figure out what the actual nuts and bolts are, because in both of those cases, I had someone who kind of spelled it out for me. With a talk and a book, you write your sort of proposal, your outline, your abstract, and then you get it accepted, and then you make the actual thing. For me, learning that was such a huge brainwave. I was like, "Oh, I don't have to write an entire book before I even have a publisher." And my sister was actually able to talk me through how it works. She actually went to abookapart.com and just found the proposal on the website, because she knew that that's a thing that most places have, because she had already written a book about spices and knew how it works. So find someone who can spell out the process. I think we fear the unknown a lot, and talking to someone who can be like, "Oh no, it's actually really simple. You do X, Y, and Z. And then you make your talk, and then you do this," and just explain it all is really useful. 

SWB 42:04 That is such good advice. Absolutely, we do fear the unknown, and just knowing the actual steps can help you understand that each of these is achievable.

EP 42:12 Yeah.

SWB 42:13 Eva, I love that so much. Thank you so much for being on today. I am so delighted to share you, your work, your book with more people. 

EP 42:20 Thank you so much for having me, Sara. This was such a fun conversation. 

SWB 42:23 All right. Again, everyone go pick up Design for Safety from abookapart.com. And get in touch with Eva, because there is so much that you will learn from this book and from her work, both around the specifics of domestic violence and beyond; it's going to change the way you think about your work across the board.

You’ve Got This

SWB 42:43 Okay. Before we wrap, it is time for our closing segment: You've Got This. That's where we look a little bit deeper at something our guest said and see how we might apply it to our lives. And for that, I want to talk about something that came up in this interview around feeling powerless. Because, like Eva said, it is so easy to feel powerless right now: to feel powerless about the ongoing pandemic, or to feel powerless about climate change, or, in her case, to feel powerless about whether fertility treatments would be successful. So when humans feel powerless, they often look for something they can control. That is natural, it is normal, and it can be a really good thing. I was reading some research recently that was covered in a BBC story back in the winter that I'll link to in the show notes. And one of the really interesting things about powerlessness is that it's so subjective. When we perceive that we have control, our wellbeing goes up. But if we feel like we don't have control, if our perception is that we don't have control, we feel more stressed, we feel more unhappy, and we have a hard time coping with difficult things in our lives. And so we all really need to find ways to feel like we have some power. We all need ways to feel like we have some sense of control in our lives, especially right now, but also all the time.

43:56 So, finding something you can control can be really good, but where this tends to go wrong is when people try to control things that they are not actually able to control, things that are outside of their control. For example, I have seen people get so stressed about COVID, for good reasons, right, that they start obsessing over tracking COVID data. And they start obsessing over looking at the numbers and the charts and the graphs, and looking at all the different sources and trying to make new meaning out of it, as if they are, like, the director of the CDC. And the thing is: it is good, and right, and useful to know what's happening in your community and to keep up with public health information. I'm not suggesting that people turn off that data. But what I have seen is that people get really, really obsessed with the numbers, as if keeping up with every scrap of data is going to change the situation, as if that's going to give them some control, some secret access to knowledge. And it doesn't. So it ends up being just a stress response. Like, "Maybe I'll feel more control if I get more of this data, and more of this data, and more of this data," but it doesn't really work, right? It just results in wanting more and more of it and then still feeling powerless at the end.

45:04 Another big example of this behavior that I've seen in the workplace is micromanaging. Hell, I mean, this is something I have been guilty of. I am not proud of it. I have worked on it. Because we can micromanage when we feel like we're powerless in other areas, as a way to kind of, like, get that sense of control back. That can look like refusing to delegate. It can look like trying to hold tight on every single project, like, "I'm going to figure out all the answers in my head before I share anything with anyone else." It can be trying to control how other people work. It can be constantly checking in. Oh, we've seen a lot of that, right? Like bosses who are like, "How are you doing? I just want to know, how are you doing? How are you doing?" But it's checking in to a point where the other person feels totally smothered. It could be expecting your team to DM you on Slack before they take a walk in the afternoon. The problem here is that instead of trying to control something that's actually yours, micromanaging is trying to control someone else. And that takes away their autonomy. It takes away their feelings of power, right? And they need to feel in control in some ways too. So it actually makes the situation worse for them. And that's so not fair.

46:08 And it doesn't even work, because trying to control things that are out of your control, like people, is really unsatisfying. You will fail at it over and over again, because no matter how controlling you are, people are going to reject those attempts or resent them. They will evade you, they'll quit, all of these things, right? Because people are people, and they're always going to be doing stuff that you don't control. And so you keep seeking out the sense of control from a source that can't really give it to you. That's exhausting, and it ultimately just leaves you even hungrier for control. So that is a terrible cycle. So, if you are struggling with feelings of powerlessness right now, here is my big tip: find something that truly is in your control and go focus on that, like Eva did with her book. One big place that you can find some feelings of control is actually by setting boundaries, especially work boundaries, if you, like a lot of other people, are feeling like your work and your home life have blended together too much. So that could be things like setting a shutoff time and sticking to it, or blocking your calendar every week for a big chunk of hours to do deep work, or setting up "no meeting days" every Tuesday or Thursday. Whatever it is, use it to remind yourself that there are some things that are within your control, even when there's a lot outside of your control. And again, this is looking for places where you can assert some control, things that are actually yours, that you can have ownership over. So give that a try, and remember: you've got this. Get more on this topic of powerlessness and finding control at activevoicehq.com/podcast.

47:47 That's it for this week's episode of Strong Feelings. I'm your host, Sara Wachter-Boettcher, and Strong Feelings is a production of Active Voice. Check us out at activevoicehq.com, and get all the past episodes, show notes, and full transcripts of every episode at strongfeelings.co. This episode was recorded in South Philadelphia and produced by Emily Duncan. Our theme music is "Deprogrammed" by Philly's own Blowdryer. You can check them out at https://blowdryer.bandcamp.com. Massive thanks to the delightful Eva PenzeyMoog for being our guest today, and thank you so much for listening. If you liked our show today, don't forget to subscribe and rate us wherever you listen to your favorite podcasts. See you next time.