The Framework Project

Conversations with people thinking about tech and its impact on society.

Leigh Honeywell

Leigh is the founder and CEO of Tall Poppy, where she helps companies protect their employees from online harassment. She's worked in security at companies like Slack, Heroku, and Microsoft, and she was most recently a Technology Fellow at the ACLU.

When I met her at the Tall Poppy office, she was remarkably earnest and down-to-earth, and we talked at length about the rise in tech worker activism, the problem of online harassment, and the current state of diversity and inclusion in the tech industry.


JACKIE

So it might be a throwback, and I'm sure you get asked about it all the time, but you were the lead organizer for the Never Again pledge. And I'm just curious—what pushed you to start that?

LEIGH

In the days following the election—it all feels like a blur, thinking back to those days. I remember people literally sobbing in the cafeteria at the office where I worked at the time. As a tech worker, I felt a really strong urge to do something to show that everyone wasn't just going to fall in line. I'm Canadian—it's not like I could vote at the time—and I had worked on Hillary's campaign as a volunteer.

JACKIE

Yeah, I think I get what you mean in terms of the perpetual state of helpless rage.

LEIGH

Yes. And, having worked in diversity and inclusion in tech for many years, I would often get people coming to me and saying, "I'm angry about the way this particular conference harassment incident went, or whatever shenanigans have been occurring. What can I do about it?" And I would go back to people and say, "There are these organizations. The best thing that you can do as a tech worker is to donate to the people who are experts in fixing this problem. You should donate to this group. You should donate to that group. If you care about this bad thing that's happened, vote with your dollars."

The Never Again pledge came as a natural extension of that, which is that, as tech workers, our organizing power is in our ability to just get a new job. And so coming into that pledge was about building that sense of worker power and worker solidarity, to say, "We're not going to build the database of Muslims. We're not going to build this technology that enables mass deportations. And if we're asked to do that, we are going to quit." We wanted to make that clear, that we weren't going to build this technology without a fight.

I think we were quite rightly criticized as much of the post-Trump "resistance" has been, from the perspective of, "Where were you before? Why wasn't there a Black Lives Matter pledge, with technologists refusing to build police surveillance?" And there's a lot of stuff in the Valley that is—the systems upon which many of us built our livelihoods are premised upon surveillance, and that surveillance gets used by governments unevenly.

As tech workers, our organizing power is in our ability to just get a new job. And so coming into that pledge was about building that sense of worker power and worker solidarity, to say, “We’re not going to build the database of Muslims. We’re not going to build this technology that enables mass deportations. And if we’re asked to do that, we are going to quit.”

So when I think back to how we came to that pledge, it was this specific moment that woke up a lot of people, but I think there's a long intellectual tradition of activism from which we drew and in some cases gave credit and probably could have given more credit.

JACKIE

Yeah, it is interesting. Obviously, for a lot of people, the election changed everything, but at the same time there was also this wave of, you know, Obama was complicit in a lot of bad things.

LEIGH

Like the deportation machine.

JACKIE

Yeah, exactly. It's interesting to see which parts of that are, you know, really valid and which parts can be more counterproductive. Yes, people have been late to a lot of things, but then again it's better to have them here now.

LEIGH

I think the other thing that I'm really happy people mostly refrained from doing in the time after the election—that was a focus within the Tech Solidarity movement, as much as you could call that movement because it wasn't that formal—was avoiding the sort of Bay Area techno-utopian solutionism of, "Oh, this bad thing is happening. Let's build an app to fix it."

JACKIE

Which was a whole—there was a wave of that!

LEIGH

There was a wave of organizations getting started, and I kept having conversations with people, asking, "Do you need to start a new organization? Can you support existing organizations?" And I think you have to find that balance, right?

When there are existing organizations that are highly competent at the work that they’re doing, for the most part, in almost all cases, the best dollars-time-value exchange of Silicon Valley engineer effort into political change is supporting those existing organizations.

On the one hand, let's throw a bunch of spaghetti against the wall and see what sticks. But on the other hand, when there are existing organizations that are highly competent at the work that they're doing, for the most part, in almost all cases, the best dollars-time-value exchange of Silicon Valley engineer effort into political change is supporting those existing organizations. And I think we were able to pretty effectively redirect that energy to those organizations.

JACKIE

I had a question about specifically this idea of tech solutionism and how it can be damaging for tech workers to try to fix every problem with a technical solution. Do you feel like that's gotten better? Do you feel like people have gotten more aware of that? And what do you see as the most impactful way that tech workers can actually help other people?

LEIGH

I think the most impactful way today remains dollars, funding effective organizations doing the on-the-ground social change work. Also, electoral work. I think the 2018 election is the most important election in a long time. It's such a critical juncture for the preservation of the basics of democracy in this country. So getting involved there, whether it's dollars to campaigns or boots-on-the-ground volunteering, the technological infrastructure of electoral politics in this country is—it exists already. There's no single app that's going to fix it.

JACKIE

Yeah.

LEIGH

So I think it's the combination of just donating to candidates in winnable areas and boots-on-the-ground volunteering. You know, take advantage of that unlimited time off. Knock on doors in swing states and swing districts and make a difference that way. It's such a critical race. It's going to affect judges. It's going to affect the census. All of this stuff. I don't think it's alarmist to say that the future of American democracy hinges on this race.

The "let's build an app to fix XYZ new political problem that has arisen" was a result of the election. Those efforts did occur, and many of them petered out as I expected they would. But I think that sense of worker organizing and worker solidarity is a thing that has changed in the Valley over the past two years, right? We saw recently that Salesforce employees—I think 650 of them—signed a letter to Benioff asking Salesforce to cancel its contract with ICE. I don't think that would have happened two years ago. I don't think that would have happened under a different president. And that's really important.

JACKIE

I did want to ask about that specifically. How do you feel about how everything has gone since the Never Again pledge? For instance, in the context of all the current tech worker protests around Google, Microsoft, Facebook, Amazon, Salesforce, etc. These tech worker activism efforts—do you feel like they've been effective, and how do you see the pledge manifesting more practically?

LEIGH

Yeah. I think this kind of engineer and tech worker activism is a pretty new thing. I think that we, in doing the pledge, were an early instance of it. It's important to acknowledge that our pledge existed within this continuity of worker solidarity, and we owe a lot of credit to the way that Coworker.org has pioneered that work, of using the protections that are afforded to us with the National Labor Relations Act to do things other than just straight-up organizing a union.

I would love for Googlers to unionize—that would kick ass—but I think it's super important for people working at tech companies to know that that's not the only way to get those protections. As soon as you and a single colleague go to management about working conditions, you are protected under the National Labor Relations Act. And that is a thing that even I didn't know—I mean, I was raised by a pack of lawyers. My mom was a labor lawyer in the eighties, but in Canada, so maybe that's why. Before the pledge, before meeting the folks from Coworker.org, I didn't know that protected concerted activity was a thing, and that's such a powerful tool.

It's an incredibly powerful tool in the hands of a set of people who, if they do choose to quit, can go get a job tomorrow. Because tech companies know that their employee base has that power. They know that they need to act strongly to retain their employees.

I think one of the examples that we've seen of that was the CTO of ICE, who was supposed to speak at GitHub's conference. And that shit got shut down quickly. I think that's a perfect example of what we're saying, and it's so important to say, "Hey, no, it's not okay. If you are working at ICE, you are on the side of baby jails. That's not cool. You don't get to come and hang out at our cool tech conference. We're not okay with baby jails."

JACKIE

Yeah. Well, one thing I've been discussing with people recently is the contrast—well, it's unclear yet, I guess, but—the contrast between Project Maven at Google, which got shut down by employee activism, and Microsoft, for instance, which relies more heavily on government contracts and where it might not be possible for employees to change that. So, what's next for the employees? Is the next thing to do to quit? In short, when do they decide it's time to stop?

LEIGH

I think every individual has their own personal Waterloo of what that line is, and it's tricky when what you're building is pretty much just generic plumbing. And then it's like, is it generic plumbing that someone's installing on their own premises? Is it generic plumbing that's infrastructure? If you look at Amazon and GovCloud, Amazon has been hosting the CIA's data centers for many years. I know people who would not work at Amazon because they don't want to be building infrastructure for the CIA. Right?

I think Salesforce is a good example, right now, where employees do have to make that choice. Do I feel okay continuing to build infrastructure that ICE is using? Where is that specific line—is it if they're literally using the software to manage the babies they're incarcerating, or are they doing some other thing with it? Is that okay? Maybe that's going to be okay for some people.

There's a reason I'm starting my own thing now, and it's that some of those decisions—I mean, another reason is that I got my green card, and I now have that option that I didn't have before—but yeah, those ethical lines. I think it's important for each and every individual engineer to have the self-examination of, "What is a bridge too far for me? What will make me quit?"

I think it’s important for each and every individual engineer to have the self-examination of, “What is a bridge too far for me? What will make me quit?”

I think that was the conversation that we started in a lot of ways because, with the pledge, we were saying, "Hey, our work has an impact on the world. Our work enables things to change in the world for better or for worse, and it's not just by default for better. We can slow down the gears if necessary." People often say, "Oh, well, if so-and-so didn't build this infrastructure for ICE, somebody else would do it." But just because somebody else would do it doesn't mean you need to be the one who does. At least make their lives a little bit more difficult.

JACKIE

And, you know, there are not that many tech companies, large tech companies, in Silicon Valley that can handle certain kinds of problems. It doesn't take that many people to shut it down.

LEIGH

I think about this one tweet often—I'll send it to you—but it's along the lines of, "The last line of defense is a bunch of SREs with sledgehammers." I think about it so often.

JACKIE

That makes sense to me. But I guess one other thought is that I have personally mixed feelings about some of these questions because—well, I think it makes sense to quit, but quitting has to come with a threat that more people will quit, right, or something else will happen. Because if you just quit and you're no longer there, no longer causing them problems internally, then in some circumstances, that might be viewed as a positive for the company, right?

LEIGH

Yeah, there are cases where the management is relieved. I think about this goofy economic framework from the sixties by an economist named Albert Hirschman. It's Exit, Voice, and Loyalty, and the idea is that if you are dissatisfied with an organization or a government or a company—any sort of collection of humans—you have three options; he later added a fourth, which is neglect. You can exit—you can leave. You can use voice—you can speak up, you can try to change things. Or you can go to loyalty—you accept your displeasure at the particular circumstances, but you decide to suck it up, essentially. Then the fourth one he later added is neglect, which is that you stay, but you check out. Most engineers have had a job in that fourth one.

In particular, I think it's important to recognize that not everyone can use exit—if your immigration status is dependent on your work, for instance. I've certainly been in the position of, like, "I can't quit this job because I would literally have to leave the country." And a lot of people, whether you're at the stage of the green card process where you're stuck with an employer for a particular amount of time or you have economic constraints—a mortgage, a family—a lot of people have these reasons that may slow down their process of exit or prevent it for a certain amount of time.

But I think it's important to have gone through that thought process. Do I need to start thinking about leaving? Do I need to start putting the wheels in motion? How do I communicate that those wheels are in motion to my management? Just recognizing that the cost—it is a non-trivial cost to replace even just a single engineer at a tech company, both in terms of their actual productive output and the institutional knowledge that they represent. That cost represents worker power. So making your continued presence at an organization contingent on that organization making ethical decisions is honestly one of the most powerful things that tech employees can do.

Making your continued presence at an organization contingent on that organization making ethical decisions is honestly one of the most powerful things that tech employees can do.

JACKIE

So you do have an engineering background, right?

LEIGH

I do, yes.

JACKIE

Me, too. I've been thinking about how most of these discussions center on what feels like, in a way, engineering privilege—in terms of job options and value to the company and irreplaceability. How do you think—this question could go one of two ways—how do you think people who aren't engineers can use the same kind of leverage? Obviously, they can't literally refuse to build something, but maybe it manifests in a different way, or maybe engineers just need to pull extra weight.

LEIGH

Yeah. I think that's true. This privilege that engineers have by being so sought-after is one that isn't evenly distributed among tech employees. But I think it's still important that non-engineer employees think about these ethical implications. I think that's where, for instance, protected concerted activity becomes even more important. If you don't have the protection of technical privilege, you still have the protection of the National Labor Relations Act—using voice where possible and exit when necessary there, too. It's obviously a harder path, but I think it's still worthwhile. And I think it is important that engineers pull their weight, because engineers have it way easy in so many ways.

JACKIE

By the way, could you explain more about protected concerted activity? How does that work?

LEIGH

Yeah. So it's within the National Labor Relations Act. There's this idea of protected concerted activity, which is that if you can convince a single other coworker to advocate for changes in working conditions with you, then your action is protected. So if you are able to convince one other person that—to use a really mundane example—the toilet paper in the bathroom is scratchy, if we work together, and I'm able to convince you that we should go to management and say, "Can we upgrade the toilet paper? It's not cool," they can't fire you because of that.

JACKIE

Oh, I see, wow.

LEIGH

Yeah. Yeah, it is. Protected—you can't be fired. Concerted—you've done it with someone else. Activity—specifically, advocating around labor conditions.

JACKIE

That's really good to know.

LEIGH

Coworker.org has a really good FAQ and set of resources.

JACKIE

Yeah, I saw they hosted an event recently. They're based in New York, right?

LEIGH

Yeah.

JACKIE

I'd love to learn more about what they're doing because it seems super interesting.

Okay, so right now you're working on a company called Tall Poppy to improve employee safety in the face of online abuse, and it's something that I've had on my mind a lot lately. I'm relatively new to my job, and more and more people from my new company have been following me on Twitter. I'm just increasingly conscious of what I'm doing there now. On top of that, there was recently that incident with Jessica Price getting fired from ArenaNet. I don't know what you're able to discuss right now, but I'm just curious about some of the ways that you're thinking about mitigating those kinds of harms.

LEIGH

Yep. I've been thinking a lot about the situation with Jessica Price, which I think was just awful. And, in many ways, her employer contributed to her continued harassment, which I thought was really unnecessary and unhelpful.

So what we're building is, as you said, employer-focused. We've been thinking about, like, what would an employee benefit that helped protect people from harassment—what would that look like? What we've come up with so far is a combination of security awareness training—educational tools that help people take proactive steps to lock down their online presence—and incident response when bad stuff does happen.

Based on my experience—I've been doing this work for about ten years, working with people who've been targeted by trolls, stalkers, and other kinds of creeps to help them recover, prevent future harassment, work with law enforcement where appropriate and desired by the target, and, yeah, just work on mitigating what people feel is a nebulous and undefined problem. But it really has only a couple of different patterns to it.

JACKIE

Yeah, that makes sense. Do you feel like, in the context of—well, first, what are you thinking of when you say that? I just immediately assumed Twitter, but I don't know if that's what you meant.

LEIGH

Twitter's certainly a source of a disproportionate amount of harm, unfortunately. So the couple of different use cases that I've been considering a lot are online mobbing and a lot of stuff related to intimate partner violence. Stalking is sometimes somebody you don't know, but much more often it's someone you do know who, at some point, had access to some part of your life, and you no longer want them to have access to that part of your life, but they won't let go. Things like revenge porn. So taking a really broad view of safety that goes beyond what people think of as online harassment. It's not just Twitter trolls. It's like, what is the safety aspect of your digital life and the ways that that extends into your offline life as well? Because, you know, we're both millennials—there's no IRL anymore. It's just life.

There’s no IRL anymore. It’s just life.

JACKIE

Yeah, that's a really good point.

LEIGH

There hasn't been a difference for most of our lives!

JACKIE

Yeah, I do feel like people really try to distinguish between the two. For instance, doxxing, obviously, is really bad. But I think there are also a lot of things that just happen online that are also really bad, and they may not be treated that way because they're "just words."

LEIGH

"Why don't you just log off? Why don't you just delete your account?" Sometimes your account is your job. And even when it's not, why should people have to narrow themselves down like that? There's such a profound chilling effect on people in general, but women, people of color, LGBTQ folks, people with disabilities—anyone who's underrepresented or marginalized in a particular space—face such a disproportionate brunt of these attacks, of this targeting. And I'm not saying anything new there, but I think the reason that this problem is so important to me is that we're missing out on all these important voices. We're asking women, we're asking people of color, to circumscribe their lives instead of saying, "Let's help people feel safer so that they can speak out."

We’re missing out on all these important voices. We’re asking women, we’re asking people of color, to circumscribe their lives instead of saying, “Let’s help people feel safer so that they can speak out.”

JACKIE

How do you think about—I have a guess, but—the way companies lean on this idea of free speech and having an open platform where people can discuss whatever they want as long as it's not explicitly, whatever, hate speech?

LEIGH

I think that "whatever" is exactly the interesting part because companies and platforms already have so many regulations around what people can say. You can't post child porn to Twitter, nor should you be able to. You can't post copyrighted materials, and they have extensive infrastructure to decide what is and what isn't okay, including in some countries where they hide Nazi accounts. Right? So this infrastructure does exist. There have just been decisions made that say, "We value the contributions of these, say, Nazis, more than the contributions of the people whose voices they're silencing."

JACKIE

What you mentioned before made me think of that, because it's similar to how, when you allow unfettered free speech, in reality you're not allowing it at all. Giving some people the power to say whatever they want prevents other people from speaking.

LEIGH

My friend—I picked up my phone to find the tweet—my friend Martin Shelton: "The thing conference organizers need to know about failing to kick out toxic people is that they're still choosing who's going to leave." I think that applies to online spaces as well, so much. And the platforms do have a legitimately difficult set of decisions—many, many, many decisions—to make as to where those lines are, and it requires really careful application of human judgment. It isn't a problem that machine learning can solve, except in a few cases, particularly because any sort of algorithms and machine learning that you set up to combat this problem will immediately be exploited and weaponized against marginalized people. So it's a legitimately tricky problem. But so far the incentives for tech companies have been growth, growth, growth, engagement, engagement, engagement. As we've seen with the bad take economy—I think it's really important to remember to be thoughtful when sharing bad takes.

JACKIE

How much are you amplifying?

LEIGH

How much are you amplifying? And it's different from "don't feed the trolls." So often, targets get told, "Don't feed the trolls," and there's no one correct strategy. There's no one response that's going to make everything okay. And so, as people deal with different kinds of harassment or just bad experiences online, it's important for everyone to figure out what works for them. Maybe tweeting screenshots of the shit that you're getting in your mentions makes you feel better and makes you feel less alone. Then that's okay. You should do that. If that makes you feel less alone and, thus, you can handle the onslaught more, that is a perfectly legit coping strategy. If just ignoring, muting, or blocking is what works for you, is what makes it feel survivable, you should absolutely do that. If proactively blocking people using Block Together or "blockchain" tools works for you, that is a great strategy as well for the people who want to use it.

So often, targets get told, “Don’t feed the trolls,” and there’s no one correct strategy. There’s no one response that’s going to make everything okay. And so, as people deal with different kinds of harassment or just bad experiences online, it’s important for everyone to figure out what works for them.

JACKIE

What, if anything, do you feel Facebook, Twitter, etc., have been doing well lately—any changes they've made that have been good? And what do you see as the most impactful changes they could make?

LEIGH

It honestly cracks me up that Twitter hasn't solved the crypto spam problem. Like, I never tweet about cryptocurrency. I sometimes tweet about computer security, and I still get cryptocurrency spam. I do report it every time as spam, and it just astonishes me that it ever even crosses the threshold for me.

JACKIE

I feel like sometimes people say that Twitter is certainly capable of stopping Nazis, for instance, because they can fight spam.

LEIGH

Because they do it in some countries.

JACKIE

Yeah, that's true. But, specifically for the crypto spam, it's like—that's not good for their bottom line, I assume. It has to be bad for verified users to be getting impersonated by crypto spammers.

LEIGH

Yes, incentives are—I mean, the incentives are opaque. The reality is, I'm not going to stop using Twitter just because I get a couple crypto spam tweets a week. I'm just going to be mad at them.

JACKIE

Are they impersonating you, or are they—

LEIGH

No, I haven't been targeted by any of the impersonation. I think they have gotten better at shutting down that behavior, although I'd have to talk to the folks who've been hit with it to be sure. But just signal-boosting—"check this account out!" Blah, blah, blah. The type of spam that you'd think would be pretty easy to catch. None of it looks new. A point that I think Sarah Jeong was the first to raise, in her book The Internet of Garbage, is that many types of online harassment are really similar in nature to spam, and spam is a pretty solved problem today—except on Twitter, as we were discussing.

JACKIE

Similar in nature as in?

LEIGH

You have false positives, you have false negatives. It's text-based. I think that's particularly true for the high-volume mobbing style of online harassment. I think that's super, super true, and building platforms in ways that make it easy to direct a mob at someone is one of the problems that we've seen—like WhatsApp struggling in India with the recent lynchings. Figuring out what parts of the system you can tweak to reduce that can obviously be tricky, because virality is the growth engine of so many of these tools, but it can also be the engine of harm.

JACKIE

Yeah. WhatsApp is a good example of a design decision that inadvertently had a lot of impact. They chose to make this feature where you can forward a message without making it look like a forward or whatever, and it turns out to be a pretty bad vector for misinformation, abuse, and so on. Or like Twitter. Do you have any thoughts on how companies can try to anticipate or get ahead of these kinds of issues?

LEIGH

It makes me think back to the Google Buzz story a bajillion years ago. I'm going to hopelessly botch it, so you'll want to fact-check it.

JACKIE

Yeah, I don't think I really know what it is—I've heard of it!

LEIGH

So what happened was, when you opted into Buzz, it would auto-friend you with everyone in your address book—including people whose emails only reached you by forwarding. The example—again, I am probably botching it—say you had been corresponding with someone using an email address that forwarded to your Gmail, and you were replying from that other account, so they didn't know your real email address. Right? So say your shitty ex is emailing this other address, which forwards to your Gmail, and you reply with the "from" field set to that other email, so they never see your real Gmail account. Well, the shitty ex still gets added to your Google Buzz feed, thereby revealing your actual account. And that really happened to a blogger I followed at the time. I guess that would have been 2011, maybe?

That's a whole long story to say: if you'd had anyone in that room who had ever dealt with a stalker or an abusive ex, they would have been like, "Whoa, Nelly. Can we not disclose this whole friend list? Can we not auto-add anyone and everyone in your address book?" The point I'm making is, if you have people in the room who aren't like you, and you're thinking through these ethical and design decisions, you're going to make better decisions, and you're going to make decisions that prioritize safety. Again, I feel like it isn't a new thing to say, but I think it's so important with safety-related design choices that you have people with a variety of life experiences. Not just a bunch of hoodie-wearing male Stanford dropouts and/or grads.

If you have people in the room who aren’t like you, and you’re thinking through these ethical and design decisions, you’re going to make better decisions, and you’re going to make decisions that prioritize safety.

JACKIE

Do you feel like D&I efforts have been getting better? Or do you feel like they've stalled?

LEIGH

I think we're in an interesting moment where the research is really well-established on how necessary diversity is, but we don't yet know about that many effective interventions. We know about some, and a lot of companies are still deploying the known effective interventions. I think there's still a lot of room for trying new ideas. We haven't moved the needle that much, is what I'm saying. We've moved the needle some, and we have transparency around the extent of the problem, and both of these steps represented a lot of progress. But we still have a ton of work to do, and we still have some major parts of the problem to address.

The part of the problem that I've been really focused on fixing for much of the time that I've done D&I work is less the entrance of the pipeline and more about keeping people in the industry. Selfishly, I want tech to be an industry where I want to continue working. I love building tools. I love messing around with computers. I love using them as tools to make cool things happen in the world. One of the reasons I went to the ACLU was to figure out whether I wanted to go to law school or not. And I love working with lawyers, but it turns out I don't really want to be one, so now I'm like, "Well, there's that plan!" So now I'm back to wondering, "How do I build a tech company where I want to work?"

JACKIE

Yeah. It's sad. I'm fairly young. I've been working for two-and-a-half years, not a long time, and it's wild, the number of people I know who are already super burnt out. It doesn't take long. And so I see all these people who've been here for, I don't know, a decade or longer, and I'm just like, "How do we get there?" I was tweeting with someone recently about how we can't wait until we're all senior engineers and can contribute to building teams and hiring good people and all of that. But we just have to stay for that to happen.

LEIGH

Oh, my heart.

JACKIE

There was an Erica Baker tweet a while back about how we keep having the same Diversity 101 discussion. How do we get to—

LEIGH

Graduate-level? Yeah, so one of the hats that I wear is that I have taught a workshop called the Ally Skills Workshop a bunch of times. So Valerie Aurora created the workshop. We worked together. I was a founding advisor to the Ada Initiative, which was her and Mary Gardiner's nonprofit. And teaching the workshop—I think the workshop is a really effective intervention, and I wish every tech company had it—I think what's great about it is that it does focus on that retention piece. It focuses on how to build an actual work environment that is friendlier and more welcoming to people who are underrepresented. When bias does happen, how do you address it in the moment? How do you tackle those microaggressions before they become macroaggressions?

That work has felt really powerful and effective in the time that I've done it. It's also work that has a ripple effect. I've been teaching the workshop during, but not at, those security summer camp conferences in Las Vegas every year. This year is the first year in four that I'm missing teaching it, and I'm bummed about that, but there's this whole starting-a-company thing that's taking a lot of my time. Anyway, one of the interesting parts about having taught it to close to a hundred people, just in that little bubble of the security community—I've taught it elsewhere in the security community, too—is that I've seen the effects of the workshop ripple out, because it gives people who have shared values around diversity and inclusion a set of tools to be effective advocates. I feel like it's made a big difference.

JACKIE

Yeah, that's really great. That's super cool. Okay, one more question, then. We've talked about a lot of not-so-cheery subjects. What's exciting you most right now? It could be technology-related. It could be industry-related.

LEIGH

Not to toot my own horn, but I'm really excited about getting to work full-time on helping people stay safe. In many ways, I've had this day job of standard corporate computer security work, and then I've had my advocacy hat of diversity and inclusion from my work with the Ada Initiative and other diversity efforts and teaching the workshop. But really, the third big piece of my heart for the past ten years has been doing one-on-one security work with individuals who are facing harassment, and getting the chance to scale up that work, to help more people than I can reach as an individual—that is what I'm really the most stoked to do. Like, I really don't want to be a bad example of working ridiculous hours, but I'm just so motivated.

JACKIE

It's fine if you choose to do it! It's just bad if your boss makes you or you make other people do it!

LEIGH

I think that even when you are the boss, yeah, no one is sustainable at eighty hours a week. Yeah. No one. It's not possible. And there's a very real example that you're setting for other people. Obviously, I don't have any employees yet, but I think it is so important to be setting an example in terms of how you act as a leader and not pulling heroics. Not building a hero culture is really important.

JACKIE

Yeah. Yeah, that's awesome. How has starting a company been?

LEIGH

It's been really exciting. It's been really fun.

JACKIE

How long have you been technically working on it?

LEIGH

So my last day at the ACLU was May 11, and my first day working on Tall Poppy was May 12. That was a Saturday. You know, I'd been working evenings and weekends on the preparations for getting it incorporated and all that paperwork jazz, but I really got everything rolling after I left the ACLU. It's been full tilt since then. It's been a blast.

JACKIE

That's the dream!

LEIGH

Yeah, I feel really incredibly lucky to have the chance to do it.

JACKIE

It's exciting that someone's trying this approach with a company. It feels like it's just this problem that exists, and you think about it, and you're like, "Well, that's awful."

LEIGH

I think the key part for me was thinking through where, in the system of entities and organizations that might take action on online harassment, there is any leverage. There are a lot of incredible nonprofits out there doing direct-service work. Many of them have struggled to find a consistent funding source, and, if nothing else, I want to take some of the weight off of them by being a service provider to people whose companies have said, "We care about this problem." I've already been in conversations about partnering with and supporting the organizations that are doing this really important work with individuals, but I want to just take a chunk of this burden and handle it.

JACKIE

It's timely, too. I think people are becoming increasingly aware of this problem.

LEIGH

Yeah. I think it's really been top-of-mind for HR and general counsel and security teams. Situations like #MeToo affect people who have jobs and positions within organizations, whether or not that organization is related to the #MeToo situation they're speaking out about—often, the #MeToo cases that we're seeing are events that happened a while ago, and the target has moved on to another organization but still wants to be able to speak out about it. And I think, as we figure out how to be better about addressing abuse within organizations, one of the important pieces of the puzzle is: how can organizations support their employees and their colleagues when they are doing the very important work of speaking out or becoming whistleblowers?

Bianca St. Louis