The Framework Project

Conversations with people thinking about tech and its impact on society.

Bianca St. Louis

Bianca is an independent consultant at The C.E. Suite, and she's worked at companies like Pinterest, Andreessen Horowitz, and Code2040.

She invited me to a coffee shop in Oakland, where we discussed everything from selective accountability in the tech industry to the social implications of misinformation to the importance of humanistic perspectives in building technologies.


JACKIE

So you were recently talking about the news about Theranos because Elizabeth Holmes was charged with fraud, and it's just been an escalating scandal in the past year or two. And you raised a lot of points about accountability and oversight and how that gets applied selectively. I was curious where you think existing systems are failing us and how you think the cycle could change.

BIANCA

Okay. Let's see, where do I think these existing systems are failing us? I think, by default, there are a lot of groups in society and communities that people have been taught to give the benefit of the doubt, and then there are groups that people are taught to be hyper-vigilant around.

So, case in point, you think about, when you walk on the street, you're never taught—not taught consciously—"Oh, hold your bag close to you when you see a black person coming." Right? It's not a learned behavior that you have for white people. If I told you, "Oh, have you ever pulled your bag closer when a white guy is walking next to you?" That would be—right?—like, "Huh?" You know?

I think that's where the system's failing us—that we only show certain groups as being irresponsible, "violent," when, in actuality, everyone is capable of mistakes. Everyone is capable of missteps. So I think that is a space that's failing us, and we just need to humanize everybody. Remind people that white women make mistakes, white women can be in the wrong, you know, black folks make mistakes, black people can be in the wrong, and we can be in the right, too. Thinking about who gets the benefit of the doubt. Reminding people that everyone is capable of evil, everyone is capable of harming others, and that there shouldn't just be a spotlight on these particular groups.

I think, in the bigger picture, we just need to remind folks that—I think a lot of times narratives have placed a lot of white communities as saviors, right? Saviors as in, "Here is the pinnacle of moral guidance." So think about it when you see organizations like Save the Children. You never really see black folks going to impoverished white countries, you know, ever. Ever. So then you have people who are constantly seeing that in the media thinking, "If I am of this race, I'm inherently perfect and great and not flawed."

Then we have to reconcile that with a situation like Theranos. You're in constant denial because those are not the narratives that you see, and I think, as humans, it's hard for us to grasp. Like, wait, hold on, you're operating outside of the pattern and the narrative that I've established. So then you're going to keep on letting things pass. Even when you think something could be off. You keep on giving the benefit of the doubt. Hopefully that answers your question?

JACKIE

Yeah, for sure. It's really hard because so much of it is just, you know, literally how people's brains work to process information. Another thing that I thought was interesting about Theranos in particular was how media narratives build on each other. Like, there's one glowing piece, and then everyone after that is trying to work off this narrative that's ultimately pretty arbitrary. Is there anything you've been thinking about in terms of that? Ways to disrupt what I see as media cycles that can unnecessarily amplify certain aspects of the story?

BIANCA

I think it goes back to—okay, all these terms. I think it's a money thing, right? Like, that is the beast that we've created, right, that you can have a really great quality story, but that's not getting you clicks. That's not getting you views. And in this day and age it's just, like, "Hey, if I need to get a trending piece to get me into the conversation, so be it." Because especially in this just-in-time, everything needs to be now, now, now, really distracted generation—there's so much information out there.

So I think we've created this beast. These [media outlets] aren't super well-funded. So they can't operate—well, they can operate from a place of power, but you have to have lots of capital to not need to be in this cycle. And people want to be shareable. In this social world, people want people to look at their content, and if that means, wait, hold on, someone else had the foundation [of the story], but, wait, I found one little line more that shows that we know more. It's just, like, "No, look at me. Look at me!" The first person introduces the story, then everyone else is like, "Look at me, look at me, no, come here! I have a little more."

You're just playing these games versus saying, "No, these stories are substantial stories that I think are important." But I also think it goes to this obsession with the numbers game that I think society and culture has played, in that numbers—we talk about how much we want quality, but we don't act like it. You know what I mean? As much as we say about quality, quality content.

I think about how people design systems for popularity. I think about Instagram. Instagram won't give certain people access to swipe up to learn more. So, basically, certain things are reserved for influencers, but then it's just like, okay. Well, I might be a person with less than a thousand followers, but I still have quality content that might deem that—it's just that we design systems in which we incentivize popularity, influence, virality, versus quality. So it's also a design play, too, and not like, "How do we redesign business operations? How do we thoughtfully think about numbers and that they aren't just the one metric around quality and success?"

JACKIE

Oh, yeah, I think it's super relevant. Did you see Ev Williams published—

BIANCA

It's on my docket! It's on my to-do list.

JACKIE

Oh, well, I think the TL;DR is just that Medium tried an ad-supported model. Now they're doing subscription. Unclear how well it's going, but his whole post was focused on the idea that it isn't sustainable for all media to be ad-supported, and I think all of this drama with Facebook and everything builds on that. I think a lot of that just lies in the popularity, virality, whatever, aspect of it. Do you see subscription media as a realistic option, you know, something that can be commonplace?

BIANCA

So it's a nuanced answer because I think, on one end, I keep on going back to it—we've created this beast. While [subscription media] could be [commonplace], people don't even know what quality is anymore. You know what I mean? Nowadays, people's relationships to information have drastically changed. Like, back in the day, you would go to a few news sites, and you could get all your information. Now everyone thinks they're a journalist. We've lost the art. We've lost the art of having people literally trained to tell you stories.

So to the average person, from a business standpoint, why am I going to pay you, a company, to give me "quality" journalism when I think I can get quality journalism from my friend over here who was at the scene of an event and live-tweeted it? I think there is an opportunity for subscriptions to be a thing, but we have to remind people what quality journalism is, in a way that's not so elitist.

I think the way that people act and think about it from a business intelligence perspective—all the quality media outlets were always so expensive. We've designed a really elitist—

JACKIE

Yeah, people always talk about The Information, for instance.

BIANCA

Yeah, you've designed a really elitist kind of system. So, in my opinion, we have to reframe people's relationships to information—high-quality information—that it can be accessible, not cost, like, hundreds of dollars, because who can afford that?

JACKIE

That's a great point. There's always an intellectualism associated with it, which I don't think has to be the case.

BIANCA

So then some person is like, "Well, why do I need to pay this subscription if, you know, I don't feel like I'm aligned with that?" I hate to say it, but information needs a rebranding. I think that's reminding people that it's worth paying for—the same way you pay for insurance, the same way you pay for a lot of things that don't seem to be super important but are really beneficial.

We can change people's relationship to information and just remind them. As we get older—when you were younger, you had your textbooks and whatever. But [media] is our diet these days. Literally every day, day in and day out, we're processing information. It's going to have positive or negative effects on your anxiety, on your health, on your mental health. The content that you're consuming is now the main form of how we engage.

I guess what I'm trying to say is, subscriptions do have an opportunity to have a renaissance, but we really need to have a rehashing of how we position information, how we position quality content, how we can position journalists. This idea that everyone's on Twitter and everyone's a journalist. Yes, it augments it, but we shouldn't just rely on that as a source of full nutrition and full insight. These folks aren't trained necessarily to—

JACKIE

Yeah, on Twitter, there's such a thin line, where you'll have a random person demanding from an expert, "Why should I believe what you're saying?"

BIANCA

We don't teach people this due diligence, either. Even going back to how our systems are designed—verified checks are given to pretty much everyone, right? So I think maybe going back to certain badges—this person is knowledgeable, an expert. Twitter has that information when they look at your analytics. With whom do you resonate the most? Even having key terms—this person talks a lot about x, y, or z, to say that they might have some insight and understanding—and you shouldn't assume they don't know what they're talking about.

JACKIE

Yeah, it's weird to have everyone appear to be on equal footing, even if they aren't when it comes to the information they have, but, you know, you don't see that on Twitter, right?

BIANCA

And then—here's the trust and safety in me—we've also messed up that because then it's like, I don't want all my information out there. Nowadays, people are not trying to hear you. So then even if I were like, "Yeah, here are all the places, all the credentials that I have," someone would be like, "Oh, what do you mean? I don't care about that." Then, well, I don't want to have all that information on there because you never know how people might—call your boss or manager, you know. It's the balance around that.

JACKIE

Yeah, for sure. Okay, so on an unrelated note, then—I think you referenced it, but I'm not sure if you have a more general take on it—about misinformation and how to deal with content at scale.

BIANCA

Yes, I think I did. It's crazy, right? My little solution would be treating it like counterfeit bills. At every office, with counterfeit bills, you literally have trainings on it. You learn to spot it. People build technologies to check for counterfeit bills. People even thought about, like, "How do we not even use bills?" Right? We don't have to deal with that.

So I think people think this problem is just a Facebook issue, but no, no, no, people's lives are at stake. Working in trust and safety roles, I would see certain [fabricated] photos that people made, and I'd be like, "Are they allowed?" The thing about it is, there's a lot of virality. Information spreads quickly nowadays. I call it the weapon. The weapon of misinformation. I feel like that's the next book title or something.

It's more and more lethal these days. I think about the death by a thousand cuts—it seems like it's minute and tiny, but at scale it's really scary. It has real-life implications. You're playing with people's emotions here. You're playing to people's emotions in ways that can evoke rage, fear, distrust. With misinformation, particularly with images—

JACKIE

Images and video, too, have historically been mediums that people have been able to trust. Fakes were, you know, easy to tell, and it's becoming less and less easy.

BIANCA

Voice is what really scares me. Voice is what really scares me in that way. Which goes to my frustration with technology sometimes. Hey, cool new tech! We're not going to think about implications. Throw it out there into the world. I'm just like, "Does anyone do an audit?"

JACKIE

I interviewed—I don't want to cast them to the wind or whatever—but I talked to this company when I was looking for a new job. Andreessen Horowitz has this talent network and introduced me to one of their portfolio companies. I didn't know what they did until I was talking to them, and it was AI-based voice imitation technology. It was so interesting. I was like, "Oh, what kind of ethical measures have you taken to make sure that people don't have their voices falsified without their knowledge?" A few questions along those lines. And it's really unfortunate. It was clearly an afterthought. There's just so much more energy in the industry as a whole going towards making these technologies as opposed to building tools that also counteract them.

BIANCA

I love innovation. I'm excited about innovation, right? But at the same time, I think we need to step back and understand the implications of, for instance, someone who doesn't like me getting the ability to doctor my voice. My voice information is easily accessible, you know. It's gonna be more easily accessible with the rise in podcasts, the rise in videos. Someone can pull that information and frame me for something.

And then I think a lot about systems. You're going to have tech that can be exploited in systems that aren't just. So then what if I get framed for something? I get thrown into a system that isn't really optimized for me. That's not going to listen to me.

JACKIE

Yeah. I think our legal system is just not prepared at all. Today, in a court of law, a lot of evidence—photos, videos, audio, whatever—holds up, and I don't know if they have good alternatives, really.

BIANCA

So I think about the best possible scenario and the worst. If someone goes out and kills someone because of a picture that they saw that showed someone getting attacked, who's at fault? If they felt like they were in danger—that is a term that's commonly thrown out. If someone sees a doctored picture of someone harming a family member of theirs, and then they go out there and commit an act based on that, are they in the wrong? That is what they saw. There are implications around that.

I feel like we need a way to see that this image was altered. Having a seal or a watermark or a site that shows a picture's history. The history of the sound. What was the source?

With companies, it's weird because I don't want one company having all our data, but I think companies need to think about how they work together to fight this problem. I hate to get serious, but when people are dealing with child pornography, there's this concept called hashing. Like, yeah, there's this central space that can pick up this content if it's on another site to mitigate people having to look at that kind of content.

JACKIE

To relieve the load from the content moderators.

BIANCA

As well as just having a space where this content has been vetted as a piece of content that is particularly harmful. Thinking along the lines of a similar kind of mindset or view. What is a hash-like technology that can map out and say the history of this piece of content, the history of this voice data? I think we've really got to get serious about it because I see people's lives at stake in this issue. Right now we're all pretty on edge. People are emotional, and people aren't really processing the things that they see. We're reactionary.

JACKIE

Yeah, right.

BIANCA

So that was a long-winded answer.

JACKIE

It's just so relevant, and it's something that I feel like a lot of people have really been worried about personally. Not just in the context of their own lives, but really internalizing current events and how this tech complicates it so much. I was listening to—have you heard of Reply All, the podcast?

BIANCA

I feel like I've heard of it.

JACKIE

I listened to it for the first time yesterday, and there was an episode about the antifa supersoldiers meme, where a bunch of alt-right people misunderstood a troll and thought that a civil war was going to start and went out in the streets on the day the civil war was supposed to start, and all these liberals were like, "LOL, yeah, I'm coming to join you at the civil war in my mechsuit or whatever." But people believed that. It was totally real for a lot of people that, like, this civil war was a possibility, and they stopped it by going out there, which is wild.

BIANCA

But then it goes to—what people don't understand is—voice, that's literally the OG means of communication, right? That's how we communicate our warnings or opportunities of attack. This tech supports the view that I have that folks who are creating it don't understand people—they don't understand culture. You're dealing with people's safety. Safety will push you to do some crazy things. Biologically, you have a fight-or-flight instinct, in that our body hates to be scared. So you have these alert systems.

I think it's going to be really impactful for us to be serious because people don't play around with safety. People make rash decisions when they feel like their safety is threatened, when they feel like their credibility is threatened, when they feel like—anything that's related to self. Our bodies naturally will get into a panic mode. And I think we have to remind ourselves of that. Those responses are the feelings that misinformation evokes.

For me, I think a better survey Facebook could have done is, instead of just asking if misinformation is bad, starting to map out how people felt when they saw a piece of misinformation. What did it evoke for you? If it is a piece of misinformation, someone's doing research on it. Right? Then starting to map out the bigger implications. If all these things are making people feel angry, sad, frustrated, ready to attack—that's not safe. Those feelings are not good constant states.

JACKIE

It's such a hard balance to have. I don't necessarily want Facebook and Twitter actively controlling people's emotions, but also, at some point, they're amplifying certain ones more than others in the existing state. So I don't know.

BIANCA

I don't think I would want Facebook to do it. But I do think some researcher needs to step back and do the mapping of those implications around misinformation because people don't want to believe it's a problem until something bad happens. Even when something bad happens, people still don't want to believe things are problems, but that's neither here nor there.

JACKIE

Okay, so another question I had was around venture capital and its faults. I think there are a lot of things wrong with the current model of venture capital, but, in general, do you see that model changing and becoming less focused on big firms and more focused on smaller individual investors? Do you think that's a good solution to that?

BIANCA

I do think that venture needs to step up with the times and understand the power and impact that it has. I think venture can be such a force for good, but it also can be such a force for inequality. Certain companies would not have seen the light of day if they didn't have venture dollars, in the sense that their ability to scale and grow has been extremely expedited. So when, on top of that, you're placing a lot of bias and a lot of false positives, that can lead to a really unequal playing field.

I think VC needs to step back and understand the power that it has and the message that it sends, too, in that it creates this codependency. Like, if you don't have venture dollars—think about all the articles that we see. So-and-so just raised something, so-and-so just got to profitability. So-and-so just had a revenue rate—

JACKIE

Yeah, it's like an imaginary series of trophies.

BIANCA

I think venture needs to just have a wake-up call in its role in impacting innovation versus influencing inequity. It just wasn't ready for the impact that technology would have, in my opinion, so now they're like, "Okay, so we have all this power, we're able to influence all these things. What does a healthy model look like?"

I do appreciate the different funds popping up, but what I do think—I hate to say it—these folks have years and years of relationships. These older firms have seen more, right? I do wonder, then, when we're creating these smaller funds... are we setting those folks up for success or failure?

JACKIE

Yeah. Well, I mean, firms like Sequoia funded, like, Google.

BIANCA

They're working with a different kind of metric. Because they've been in the game longer, their rates of success look different. I worry with smaller funds that you're still going to be risk-averse because you want to have good points on the board. The very thing we're trying to avoid—VC as particularly focused and biased—it's like, "Are you able to take those risks?" Because you're also trying to prove yourself. Because you still have to get returns. You're still playing by the big boys' game, even if you're not.

With VC, for me, I would love for people to see the role that they're playing in true innovation and then the role that they're playing in inequity. Are we truly setting up a system that enables more people to win, more people to create, opportunities of wealth creation for their communities? VC is having this awakening. Like, yes, you guys are smart, yes, there's thoughtfulness and methodologies. But there also is a lot of luck.

JACKIE

Randomness.

BIANCA

And randomness. And network impact. Yeah, of course you're going to be able to invest in this company because you're friends and, you know, went to school together or this person trusts you to share the idea in the early stages. I think VC is just going through a humbling phase.

JACKIE

That was super interesting. Yeah, I think about that in particular—the question of who's set up for success. I think it's also easy to typecast VC firms with diverse partners as, "Oh, they're just the diversity firms," even when some of them are explicitly not trying to do that. Let's see. For the next question, around time management and what's considered valuable time spent in the tech industry—there's this strong cultural emphasis on intellectualism and production, I think, as you put it, which feels really accurate to me. Like, everyone needs to be reading the latest Ray Dalio book. There's a lot of posturing around it. How do you think about that personally? What does that say industry-wide?

BIANCA

You know, I am a woman of color, a black woman—surprise, surprise. But I feel like I can be better at conversations and technology because of my experiences. Because I understand that technology isn't seen as the Holy Grail to a lot of people. There is a lot of resistance. This obsession with production and reading—it creates this complex that those people producing that content know everything.

Yes, there's learning, there's smartness. But, at the end of the day, you're building consumer technologies. You need to understand people. Your products aren't falling into your hands. They're not falling into the hands of highly educated folks all the time. Or people who are highly connected, highly knowledgeable about who's who in technology.

So with that in mind, you need to start immersing yourself—you just need to be a student of culture. You need to understand, what is people's relationship with technology? What is their relationship to the world around them? Because, aside from the capitalistic part of building businesses, at the end of the day, tech was meant to solve problems. And if that was "the mission," how are you going to solve problems if you don't even know what problems are out there, if you don't know how different people experience those problems?

This obsession with reading or always, you know, producing, versus taking time to understand your bias and stepping out of being so intellectual all the time so that it humbles you. Tech could use a lot of humility. When you're not super humble, you're going to be less susceptible to listening to the other person's point of view.

JACKIE

Yeah, there's this ivory tower mentality around pure reason and rationality and a lack of empathy and compassion.

BIANCA

And I feel like that's horrible when building systems because you will not let in anything that doesn't fit or filter through that rationality, that black-and-white lens. The world is not fucking black and white—pardon my French—it is not black and white. We as humans are not black and white. Just like how you can have a really good but also really shitty day, right in one sitting. That's nuanced. We're nuanced as individuals. Just because you're a good person doesn't mean you're incapable of harming others, right?

I think it's important to be reminded that rationality isn't going to always be the solution and the answer. And again, you're dealing with complex individuals such as people. For me, learning myself, learning my bias, helps me be a better technologist. Sitting in different communities, understanding their relationships with technology. Understanding their perspectives on how they feel empowered or disabled by, you know, tech or access to tech.

Lately, more and more, I've been spending my time just seeing, how am I built? You know, how am I building? What are the inherent biases that I bring when I build things, when I create things? The assumptions that I make?

It was so funny—I was talking to a family member, and I was telling them that I wanted to do Uber, and they're like, "No, you're going to get robbed," and it's like, "Oh!" That was the mindset—if you have strangers in your car, you're going to get robbed. If I'm not even thinking about a person having that view—if I'm assuming that my solution is the best solution—"why don't you want it?" I'm not going to probe to make it more human-centered. I'm not going to think about the human side of it and what the reluctance might be.

We need to create spaces in which our pride isn't just constantly fueled, but our humility as well, our understanding of other cultures, our understanding of ourselves. I hate to sound so zen, but yeah, you should know yourself. You should know how you bring your biases and your assumptions into the technology that you build, into the ways that you communicate, into the ways that you are open to feedback, because I think, in having this view about output, you're going to start to disrespect people—not disrespect, but create a block with—people who might not fit that pattern, essentially. Then you miss out on feedback from peers or colleagues who may not have the same view.

You respect people who operate like you do, right? You're going to be more likely to do that, so then if I'm the person who's out there, you know, fascinated by culture, and my output looks different, because I'm inherently "different," you, as a creator and producer and engineer, might not take my words as holding weight. So I think that's another kind of sad byproduct of this attitude. It doesn't lend itself to communication because that person's going to think they're right, and the only people to whom they will be willing to listen are the very people who live like them and operate like them.

For me, personally, it's huge to just not idolize tech as the end-all, be-all, and I have manifested that in my life. Yes, I'm a passionate advocate for tech and really love tech, but at the same time, tech isn't everything. When you operate from the understanding that this very industry is everything, and it's indestructible, you're going to build like that, too, from a place of pride, so I think it's important to have different components of my life reflect the view that there's more to life than the tech that we build and the tech conversations that we have and the savior complex that we have around tech becoming a solution for everything.

JACKIE

Okay, so one last question, then, just to wrap on a positive-ish note. What is something that excites you a lot about technology, either in the industry or in society as a whole?

BIANCA

It's so vain. So there's two—I feel like black Twitter just brings me joy. In such a bleak and crazy world sometimes, humor and joy, this shared culture between human beings who relate to you and share content with you even when you're not in that particular geography, but you have this shared communication.

JACKIE

Yeah, meme culture is really good for so much. It's a binding cultural force.

BIANCA

It's so simple! Like wow, let me get in on the conversation. You're not looking at, like, "Who is this person who shared this meme? What is their PhD in design?" And yada, yada. You're just like, "That's funny. I like it. You were clever." Faves, bookmarks, you know.

I'm excited about connection as a force for good rather than for harm. I think it's a good time. I think it's a good time to be a genuinely good human. People are tired of the systems that be, and there's room for fresh perspectives, fresh voices, new narratives. So I think it's just a good time to be creating.

I don't know, I feel super privileged! I know we take these spaces for granted, but I think about, "Wow, this tech-enabled tool"—I mean, don't get me wrong, there's hustle, there's whatever, but—"Wow, this tool enabled me to create a life I'd never imagined." So how many more lives can it create? How many more visions can it expand? I think that's really important and exciting. And some folks are listening, powerful folks are listening, to people saying, "Things have got to change. Things have got to be altered."

In terms of tech that excites me, I think about more companies around financial tech. Helping empower and enable people with your technology, like Tiffani [Ashley Bell] with the Detroit Water Project—it's simple, but how do we solve needs for everyone? Basic human rights. I think about the gig economy and how it means that people have options. I guess the gig economy has its flaws, but that's so exciting to me, that someone can have another line of income in such a crazy, volatile economy. I'm excited about how that evolves and how we enable more people to build different forms of capital.

There's just a lot of new tech, new conversations. I don't know how I feel about crypto, but I feel like there is a huge opportunity with anything that's trying to disrupt the old guard. It lends itself to thinking, "If we can try and disrupt this area, what others can we try and disrupt?" It gets the glimmer in people's eyes—maybe there is some energy and time we should put into it.

JACKIE

Yeah.

BIANCA

And then, honestly, the people are what make me most excited about this industry. The good people. Not even trying to, you know, hype you up, but like you. You're like, "Let's have these different conversations, use our access, use our privilege, introduce new conversations." The people restore my hope because I—and a lot of other folks who hustle—we have to make tech that not only enables us, but that enables other people, too. The people who are now feeling empowered to build tech and the solutions—they care. So I'm just excited that the keys are getting into more hands. Yeah, it's good. I hope people stay optimistic. It's not all bad.
