{ "text" : "
\n Kim Crayton: I'll tell me if you remember no telling if you remember Well forget never forget. Welcome to the hashtag holiday season podcast. The show focused on the strategic disruption of the status quo and technical organizations, communities and events. Hello everyone and welcome to today's episode of hashtag called to sing podcast. My guest today and a lot of you have been recommending her book to me. I had it on my shelf, I saw it somewhere and I don't even know where I saw it but I immediately went to amazon and bought it like months ago and I'll put things on my shelf. So when the time comes for me to read them, they're there And this is an author that has been on my shelf for a while and welcome to the show. The author of technically wrong Sarah please introduce yourself.
Sara Wachter-Boettcher: Hey everyone. My name is Sara Wachter-Boettcher, and yes, I am the author of a book called Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech, which probably says a lot about the things I care about right there in the subtitle. I am a consultant working mostly in strategy, user experience, and product strategy. I've been doing that for a lot of years, and over the years I started noticing all of these ways that I felt like the tech industry just wasn't doing enough to understand the consequences of its work, that designers were not necessarily thinking about the people they were including or excluding in their products, and that we all needed to pay a lot more attention to the biases and assumptions that we have. That led me down this path where I went from mostly working with clients on more general UX questions to really digging into tech culture and how that culture has created this place where we're at right now.
Kim: OK, so before we talk about this place that we're at right now, because we're gonna go there, my first questions are always: why is it important to cause a scene? And how are you causing a scene?
Sara: Well, you know, why is it important to cause a scene? I mean, look around. If you think things are going great in our country or in this world right now, then maybe you don't think it's important to cause a scene, but then you're probably not listening to this podcast. It's important to cause a scene because we are dealing with all of the history of systemic inequality and oppression in this country, and it's kind of culminating in what I would call a crisis moment. And in technology specifically, we are not absolved of any of that. We are making the tools that people are using, you know, everywhere from within an organization like ICE to a company like Facebook that was doing all kinds of unethical things with data that led to a lot of misinformation over the last election. Everywhere we turn, technology is so deeply embedded with these major problems and major ethical concerns, and if we're not talking about it, it's definitely never going to get better. What am I doing to cause a scene? You know, some days I feel like I'm not doing nearly enough. Some days I'm tired from all the things that I am doing. So what I really focus on is obviously writing and speaking about these issues. So, you know, writing a book like Technically Wrong, or the book I wrote before that called Design for Real Life, which was getting at sort of a piece of this issue, and then going and talking to people who work in the field and saying: look, these are things you might not have been trained to think about. These are things that you may not perceive as being your job. But the fact that we have not done this, this sort of abdication of responsibility, has created some real harms for people. We need to get honest about that, and then we need to figure out how we're going to do better.
I will say, a couple of years ago when I started giving talks with that kind of message, probably around 2015, I had conferences that were, like, scared to have me, or they were like, can't you just talk about, like, UX stuff? And I was like, this is UX stuff. This is actually really important. What's happened over the past couple of years is that suddenly there's so much more mainstream interest in talking about this stuff, and at least starting to dig into it, that I'm getting a lot more people who are like, can you come talk to our audience about this? And it's really, really encouraging in some ways to know that so many people who are working in the tech industry want to do a better job and want to learn about this topic. At the same time, it's such a large topic that it feels like something we're going to be working on forever.
Kim: Yeah. Yeah. OK, so I'm just getting into chapter two, and you hit on so many things that I speak about, and I want to bring these things up as some guiding thoughts. When we talk about the challenges, or the assumptions, or not even assumptions, the ego of technologists, one of the issues is how we define the word "technical" and attach it to certain jobs, and how those that the industry considers nontechnical are seen as less than. So one of the things I do is I don't do "non-technical" talks, because we're using the term technical incorrectly. If we look at the dictionary, when you have a certain skill, that skill is technical, regardless of what it is. And when we're talking about people working on software or hardware, they are working with technology. So if you're talking about the skills that you and I have, when we're talking about how we talk about bias and inclusion and all these other things, those are different sets of technical skills. When I have that conversation, it helps people shift their minds, because I've often seen it: you go to a conference and it's, oh, I don't want to see that non-technical talk, that soft skills talk. And I'm like, if you are not aware, we no longer live in an industrial economy where we're making widgets. In the information economy, it's people who have those quote-unquote soft skills who are driving innovation and differentiation. So, you speak about this: you talk about the pipeline, you talk about another thing I talk about, why white women are making gains in tech when everybody else isn't, and then you talk about culture fit, and all of those things.
As you mentioned, we're just at the beginning of this. These assumptions and biases and ego-driven things have been entrenched so long that we're going to have so many long-term effects from them. It is daunting for many of us who are working on these things, but we have to start somewhere.
Sara: Yeah. And you know, I think one of the things that you really touched on there, which I think is sort of the first thing people need to get their minds around, is not conflating working on tech products, making technology for people to use, with engineering. Engineering is a piece of that. But one of the things that's been such a frustration to me is the way that engineering is treated as sort of the end-all, be-all of tech, like everybody who matters in tech is doing engineering. And the reality is that engineering is great. My husband happens to be a software engineer; many of my favorite people are software engineers. But software engineering by itself is very unlikely to create good tech products, and it's really unlikely to solve the challenges we're facing now. One of the things that's been so undervalued in that model, like you mentioned soft skills, and obviously even calling them "soft skills" is oftentimes used to denigrate them, but also things like actually having expertise in the subject matter that we are dabbling with. We're so focused on things like "we're going to disrupt this industry and that industry" without ever thinking, like, huh, should we bring anybody in who...
Kim: Yes, yes, yes.
Sara: You know...
Kim: You wouldn't see that in any other field. I mean, in medicine, you just wouldn't start doing stuff without going to experts and saying, hey, how do we do this? And this happens so much. When I'm talking about inclusion and diversity, I don't talk about it from a moral or political standpoint. To me it's about money. It's about building a business, because you cannot create products and services for a global market from one or two perspectives. It just doesn't happen. So when you look up and see that everybody at the table looks exactly like you, and you start throwing out these things you consider initiatives that are supposed to attract people, and then they backfire, there's a reason for that: you don't know what you're doing. There are experts for a reason. It doesn't mean that what you are an expert in is invalidated. You're just not an expert in this.
Sara: Exactly. So for example, have you ever talked with Safiya Noble? She is a professor, an academic, and she wrote a book called Algorithms of Oppression.
Kim: No, and I haven't heard of her myself, either. Yeah.
Sara: So I bet you'd be interested in that, and I bet that a lot of your listeners would be, too. I talked with her when I was working on the book, and one of the things that we talked a lot about was that, yes, absolutely, we need more diverse teams working on these products, but we also need to be thinking about the diversity of the backgrounds they come from and what they're trained in. For example, say you're going to build technology for financial services that's going to use information like where people live, like a zip code, and that's included in a profile that is then used to target people, or to exclude them from whether or not they can get certain types of financial services. If you are trying to build a product that includes those kinds of features, and nobody on the team is an expert in, let's say, the history of how financial services have included and excluded audiences in this country, if you have nobody on the team who really understands what redlining was and how it happened, for example, what the hell are you doing? You don't have the expertise to do this. And so what you end up seeing is all these people who are basically playing with fire and have no idea that they're playing with fire. You've got that "move fast, break things" mentality, and even if that is no longer an official slogan at Facebook, it is still so deeply embedded. And you're not thinking, oh, what you might be breaking is really important, right? Like, you could be recreating and reinforcing all of these historical systemic biases.
Kim: Exactly. And do you ever think about that? Yeah. So when you look at Cambridge Analytica, and the researcher who initially gathered that data, we talk about the Facebook thing, he was like, well, I'm one of a thousand people who were doing it, and we just didn't think anybody would care about their data, or we just assumed that they would know. And I'm thinking...
Sara: What?
Kim: In what space does some grandmother, who's in some city, whose family has moved across the country, and the only way she gets to see her grandkids is through Facebook, in what space is she having this conversation about her data being given out?
Sara: Well, and not only that, but in that particular example, they were also scraping data from the friends of the people using the app. I mean, those people never consented to any of it.

Kim: Exactly.

Sara: I think one of the things, and something I've been watching pretty carefully and been pretty frustrated by, is that over and over again you see a company like Facebook make these statements where, with every scandal, they could basically issue the same statement, because it's the same thing every time: oh, we didn't anticipate that happening. Whoops. Over and over again. And it's like, yeah, OK, but you could have anticipated it, and you chose not to. You chose not to invest resources in anticipating it.
Kim: And then at what point are you held accountable for not anticipating? Because ignorance is no excuse, right? OK, so July 1st in Georgia, there is a hands-free law now. You cannot have your phone in your hand if you're driving, period. Stop sign, street, it does not matter. You cannot have that phone in your hand. If a cop pulls me over, they do not want to hear "I did not anticipate that." There's something I should have done on my part, because they're not gonna care that I didn't know about that law. So at some point it becomes, and this pops into my head, about risk management: who on these teams are experts at risk management? Is no one thinking about any of the cons of the things that we're doing?
Sara: I mean, honestly, I know that that's part of the problem, at least historically that has been part of the problem. When we talk about innovation, or whatever buzzwords we're using in this industry, we tend to focus very much on positive outcomes. We are biased toward positive outcomes, because we tend to talk about things like "how are we going to change the world?" So we tend to focus on what the desired future is going to be. We also tend to focus on things like delight. In the design industry, that has been a word that has made me want to puke for several years. So when you start focusing on how are we going to delight users, how are we going to engage users, how are we going to optimize this, right? All of those are focused toward this sense of a positive outcome for the business or for people. And the more that you focus on one thing, the harder it is to see anything else. So if you're not explicitly sitting down, as part of your practice, every single time you do something in tech, and thinking: What's the worst that could happen? Who could this hurt? Who does this leave out? What are the potential unintended consequences? What happens if I'm wrong about the assumptions I've made going into this? If you don't have that built into your process, then you will make these mistakes, and you will make them over and over again.
Kim: It just popped up in my head, because that sounds so much like privilege. If you had one person from a marginalized community on any of these teams, they could say: hey, they tried this and it blew up in their face; there's another company that did this; the government experimented on people. There are bad things that could happen. It's like certain people live in this world where everything is just rosy, and they've never had any challenges or been challenged, so they don't even think that anything they do could be negative. They don't even think about the risk assessments. And I'm going to tell you, as a Black person in the South, we have to do personal risk assessments every single day. It's just a part of how we have to manage our lives, particularly in the United States. There are other places, but I can only speak to the United States. When you have these companies run mainly by white men, Asian men, and white women who've never had to have those kinds of conversations internally with their families, with their communities, it is a huge blind spot.
Sara: Absolutely. And the one thing I would add to that is that sometimes you hear from folx that they kind of understand that problem you've just described, but they'll look at the solution as, like, going back to the pipeline, right? So it's like, oh well, we need more people in the pipeline so we can have more people in these companies. And we can talk about the problems with the whole pipeline metaphor, but how that manifests oftentimes is also really unfair, I think, to your most marginalized staff. Because what you end up with is, let's say, a junior Black woman, the only one on the team, and then everybody sort of relies on her to be able to anticipate everything. Or they expect that just magically having her there is going to solve problems, when they don't actually listen to her, trust her, respect her. So it's not enough to just be able to say "we've brought some diversity onto this team." It's also a matter of saying: well, do those people feel safe? Do they have power to speak? Are you actually bringing them in because you want that perspective, or are you trying to basically make a band-aid solution?
Kim: Yeah. Or is it the next marketing or PR buzzword? And that's why I was never impressed. I mean, last year, 2017, was the year of the C-level diversity person. And I was like, if these people don't have the autonomy, a budget to make decisions without having to go to somebody else, all these other things, they're just figureheads, and it means absolutely nothing. Because one of the things, and you talk about culture fit in the book, it is no longer acceptable to expect individuals to assimilate to your culture when you bring anybody in. Your culture should be shifting to include them.
Sara: Yeah, absolutely. And you know, I had this conversation actually just recently on my podcast, which is called No, You Go. I was talking with Nicole Sanchez, who runs Vaya Consulting, and one of the things she said is, you know, don't tell me what your priorities are. Show me your budget, and I'll tell you what your priorities are.

Kim: Exactly.

Sara: If you're not willing to put money to it, then you're telling me you don't care about it.

Kim: Yes. No, no, no, that is right. That's why I talk about this from a business perspective, because when you talk about this from morals or whatever, they will talk around it all day long. But when I talk about it from "show me the data": I can show you data that says your return on investment is improved when you have diverse perspectives in all areas of your company. And it's not just about race and gender. It's about bringing in people with disabilities; it's about people with all different kinds of experiences. And if you can't say "we've put this much in dollars on that line item," then this is not important to you.
Sara: Yeah. And I think one of the other things that I talked about with Nicole, which I thought was so valuable, was her perspective on what the surface-level diversity and inclusion work looks like versus what the real work looks like. So for example, she mentioned that she was hired by a company to provide consulting and advising on where bias might have crept into their hiring process. So this is a company that, on its face, clearly values diversity and inclusion work. They're willing to pay her to do it. They're talking about it. They want to make their process less biased. But when she went and looked at the process, the first thing that happened was somebody said, oh well, this guy went to Harvard and worked at Facebook, so obviously he's a yes. And she's like, well, wait a second. If that's your shortcut, if that's all you want to hire, why are we doing any of this? And it really speaks to the way that people haven't quite understood that that's a culture-fit thing, right? It's like, oh, went to Harvard, worked at Facebook. As opposed to saying: what does this person actually bring to us that's different from what we have, that adds to what we have? And why do we assume that somebody with that background is obviously better than somebody else? I think that really gets at that deeper piece, which is much harder than just setting some hiring targets and doing some good PR around it.
Kim: The thing is, inclusion is not about equality. First of all, it is not about equality. There is no way in hell the average Black female is ever going to catch up to the average white male. So it's not about equality, and it's not about quotas. Because when it's about quotas, then you go out and get a lesbian Hispanic woman with a disability, and then you check all the boxes. No, it's not about the quota thing, because we've seen that happen. To me, it goes back to: we need to stop hiring as if people are making widgets. You are able to compete in the information age by turning information into knowledge that you can use for your competitive advantage, and I'm sorry, the average engineer does not have that skill set. So it becomes, OK, this engineer is able to build this thing; what teams can we put together so that all of these different skills are equal, so they meet at the same place? The engineer is supported by this, and the engineer then supports this other thing, and it changes the entire culture, where the engineer is not the highest person on the totem pole. They are part of a team, and the whole team moves things forward together. And that was where I wanted to go when we're talking about sexist apps and biased algorithms: we're not going to get rid of that entirely, but your chances of creating something harmful unintentionally are lessened when you do that.
Sara: Yeah. And I talk to a lot of folx who talk about things like needing to have more ethics built into computer science curricula, right? And I agree with that. I think that's great. I think engineers should absolutely have more understanding of ethics and the consequences of what they're building, and should be more in tune with that. But I also think it is unrealistic to expect all of that to come back to them. That's a big piece of it: because we have put engineering on a bit of a throne in our industry, we are also expecting that those people are going to know all of the things, and the reality is that they don't. So we really do need a diverse set of people working on this stuff, with different types of skill sets, because we're not making widgets; we're not just building technology. We are fundamentally changing the way that people live their lives and do their jobs, and we are affecting everything from their emotional and personal well-being, to their safety, to their financial future, to their jobs. I mean, we are messing with some stuff that is crucial to the functioning of our society. And so to reduce all of that to an engineering problem is just an immediate source of harm.
Kim: Yeah. And you just spoke to it. I'm trying to make sure I articulate what I'm trying to say: this is the first time technology has touched everything in such a way that everything is affected. Technology is now this web that touches everything. In the past, your medical was separate, your transportation was separate, everything was in these silos. So if something happened on one end, it did not affect the whole system. Now technology's ability to affect the entire system is something we've never had to deal with. We now need to be asking questions we've never had to answer. And we cannot do that unless, first of all, we're willing to answer those questions. We have to be willing to get uncomfortable and say, you know what, we don't know what the hell we're doing, but that doesn't absolve us from now trying to figure out the best way to do this.
Sara: And oftentimes asking if we should be doing it at all. That's one of the questions that doesn't come up nearly enough.
Kim: I'm so happy you brought that up. So, Elon Musk. OK. I'm really, really coming to a place, and this again is about how this society, this system, has put privilege as the pinnacle of what everybody should strive for. These individuals are really, really admired as business people. I was like, oh, I'd really like to get that. But then I recognized I'd never get that, because I was never included in the game that they're playing; they never considered me. So that's not even the game that I'm playing. But when I look at Elon Musk, I see the innovation and the creativity he has. And then I look on the other side: he's a product of apartheid, which is very much like slavery, and it only just recently ended. So he had a lot of benefits that many people, even white people, didn't have. And then I look at, when you ask "why create this?", the flamethrower that he just did, and the advertising I've seen of it, these white people who are, you know, playing around with this flamethrower. Because I'm a Black person, I immediately see that if a young Black person had that in their hand, it would be a totally different outcome for them if anyone was injured or if any police officials were to interact with that at all.
Sara: Yeah, absolutely.
Kim: And it becomes those things that we think are innocent because of our perspective, but viewed from the perspectives of others, they're not so much so. And these are the questions that, when people talk about, oh...
Sara: This is another thing...
Kim: We've taught, because of our arrogance as technologists, the wider community that we never make mistakes and that machines are perfect. And that's one of the reasons I think a lot of people don't question, and they're so surprised when something like Cambridge Analytica or ICE or whatever happens. We've told them that everything is binary, that's right or wrong, and that we as technologists know what we're doing, and that especially when we encode it into a machine, that machine is perfect. People don't understand how algorithms are created and how programs are written, and all the biases that are in that. I think that's one of the main reasons the larger public is not as alarmed as they should be. Because when I talk to people who I would consider still see technology as a toy rather than a tool, when I bring these things up to them, they have these blank stares. They're like, I didn't even think of that.
Sara: And I think it's hard to expect individuals to all kind of understand this and really grasp it, because it feels very big. It can be very overwhelming. And I have a lot of empathy for people who are like, "I don't really understand," and they just want to trust the software, because we've trained them to trust it.

Kim: That's exactly it.

Sara: I really think that getting the public more aware and more engaged with these issues is important. It's important because when people are educated about it, that matters, and when people are able to critique and push back and say no, that matters. However, I also really think that we need to be dealing with this at a systemic level. This is not a personal problem; this is a systemic problem, so we want to deal with it at a systemic level. That means we need to be thinking about: what are the kinds of regulations we should have? What should we be doing to ensure that companies can't do this? For example, we talk about Facebook, we talk about Cambridge Analytica. A lot of the problems at Facebook stem from the fact that they have essentially optimized for maximum advertising revenue, and they've achieved it. They've made a shit-ton of money on that.
Kim: Right.
Sara: I think it was $40 billion, and $27 billion the year before that. I mean, it's a shit-ton of money, and it is growing, and that is in fact what they are supposed to do according to their shareholders, right? They are responsible for maximizing value for shareholders by any means they can, effectively. And so in that system, despite maybe feeling real bad about things, they're not actually incentivized to act differently, because nobody's making them, and they have this huge pressure to maximize those returns.
Kim: Everyone in the #CauseAScene community shares the same common beliefs, based on a set of four specific guiding principles. One: tech is not neutral, nor is it apolitical. Two: intention without strategy is chaos. Three: lack of inclusion is a risk and increasingly a crisis management issue. And lastly, but most importantly, four: we must prioritize the most vulnerable. To find out more about the guiding principles and adding them to your Twitter profile banner, please visit #CauseAScene.

Sara: So what are you gonna do about that? Well, the only thing you can really do, the only way you can really make them change, is they have to feel like they have to change. Either people are going to be so fed up that they're leaving Facebook, so that they can't actually make the revenue by doing the unethical thing, and maybe that will happen, but so far that has not really been realistic; people rely on Facebook so much. So you have to start looking at the other side: how do we regulate a company like Facebook, and what kinds of regulations would protect people, and probably limit how much Facebook can make off of people's data?
Kim: I'm glad you're bringing up all these issues, because that's another thing I talk about, and this is where I had some challenges with the last election. People don't understand how public businesses are run: they are beholden to shareholder value only. That's it. Their job is to increase shareholder value, and when they do things that are not in line with that, or, God forbid, in direct opposition to it, they can be sued as officers and executives by the shareholders of those companies. So people need to understand that that is the nature of public businesses. And then it becomes a question of, as you said, how do you regulate it? But then when you look at the Senate hearings, the politicians have no idea what this stuff is either, so they're not even asking questions that would probe deeper. That thing, to me, was the easiest thing Mark Zuckerberg could have sat through, because those were just basic questions. They didn't ask anything about the real nature of the fact that you're selling people's data without their permission, and how did all this fake news happen? And I love this whole thing about fake news, as if this is the first time this has ever happened. We've always had fake news. We've always had propaganda. The difference, though, is the scale that technology allows it to happen at.
\n Sara: And I think also the amount of trust we've put into algorithms to make decisions for us, and the assumption that they're going to get it right. That was one of the things that was really interesting to me when I was doing research for the book. I don't know if you remember this story, but in the spring of 2016 an article came out that basically accused Facebook's trending news team of having a liberal bias, and there was all of this uproar about it. There were conservatives in Congress demanding there be an investigation, and shortly after that, Facebook ended up firing all of the people who did their trending news. These had been contractors, not Facebook staff — contractors literally working in the basement of the New York office. Most of them were people in their twenties who had studied journalism or some kind of media thing, trying to get their foot in the door in a media job. They were doing these kind of editorial functions of making sure what showed up in trending came from a good source, right? So if there were multiple stories about a certain topic, they would curate which source would be used, and they would make sure that only real stuff was making it in there. And then they fired them and replaced them with the algorithm only — basically, just the algorithm would decide what showed up in trending. And it was literally within days that the trending news feed started boosting all of these fake
\n Kim: Articles
\n Sara: Immediately. And that was a direct correlation: the stuff that's in trending ends up being shared the most, it ends up being popular in people's feeds, people click on it — and it all happened like that. The assumption was that the machines were good enough to just take over, even though they had done very little testing of them. The people who had been working on that news team were not involved with any of those product decisions; they were explicitly not part of it at all. They were not in the room, they were not discussing any of this with the engineers, and when they saw the algorithm starting to be more involved in their work, they were not able to give any feedback. And so this goes back to what you were saying about having diverse people in the room, people with different perspectives. There's this assumption that the engineering team could just kind of handle it and it would be fine, and the reality is not only was it not fine, not only did it do a demonstrably bad job, but it did it at a scale, and at a moment, in which it was incredibly dangerous.
\n Kim: Yes. What was that, the Microsoft thing that turned racist overnight? Yeah.
\n Sara: You know, that one really goes back to what we were talking about before, focusing so often on the positive outcome. Microsoft had this bot named Tay — this was, I think, 2015. Microsoft created this bot on Twitter, and Tay was meant to socialize with teens on Twitter, to try to learn more about how teens speak. Within 24 hours, trolls had trained Tay to say a lot of racist and sexist stuff, and not only that, but they had trained the bot to then go out and attack other people. For example, Zoe Quinn, who was at the center of Gamer
\n Kim: Gate.
\n Sara: And the Microsoft researchers actually wrote about this after the fact. That's another thing I was researching a while back: I like to look at what the people who worked on the stuff that went horribly wrong had to say afterwards.
\n Kim: Yeah,
\n Sara: And as I read about what they said had gone wrong, one of the things I noticed was that they talked about how they had specifically been focusing on making Tay a positive experience — that was almost verbatim the quote that they had. And again, I think we come back to: OK, when you focus on making interacting with Tay a positive experience, how many ways that it could go wrong, how many ways that it could be abused, are you ignoring? They had underinvested, and I think this is very common — they had underinvested in looking at the potential ways it could go wrong, looking at the harm that it could cause and preventing that from happening, because they had overinvested in looking at the positive
\n Kim: Side. And it's not like we don't have evidence of this — we just spoke about two of them and how quickly it went from 0 to 100. And that's the thing, those are the conversations we need to be having, because there are again these assumptions that there is no bias in these things, that they can't be trained badly — or forget being trained, that they're not even created with bias. It's there in their creation stage. And there is no such thing as unbiased when humans are involved.
\n Sara: Yeah, absolutely. And particularly, you know, we talk about things like algorithms or AI as if they're so hard to understand — because it does get complicated, but also because I think technologists make it seem hard to understand. The reality is, what a lot of these systems are doing is crunching through a bunch of historical data and then using that to make sense of what they think will happen in the future. So, for example, image recognition, right? We're going to look at a ton of images that already exist, and from looking at all those images, we can learn about the world and we can learn what the things in the images are. We look at thousands of pictures of cats, and now I understand what cats look like and I can identify a cat in a future picture, right? That kind of algorithmic decision making can tag a photo of a cat as a cat, or a photo of a bicycle as a bicycle. Well, all it takes is to train the system without an accurate amount of diversity in the images that you're using. If all you show the system are pictures of things from America, then it's going to have a really hard time identifying stuff from other cultures. If all you show the system are pictures of white people, it's going to have a hard time identifying pictures of people of color, right? I mean, it's not hard to see the limitations of a system like that. It all is about what you put into it, because it's not going to be able to do a good job with the stuff that it doesn't have any experience with. So there's bias in terms of the selection, the curation, of what we're telling these systems to learn from. And then there's, of course, all kinds of bias in terms of what kinds of outcomes we're seeking, what we think of as being a good
\n Kim: Outcome. And that reminds me of Watson. When they trained Watson to do Jeopardy, that's all that was: a whole bunch of looking at how people had played before, and that's what they fed Watson. It wasn't like Watson got up and decided he was going to do it himself — it was programmers, engineers, feeding it information that it learned from and, at the end, was able to use to beat the champions on Jeopardy.
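[Editor's note] Sara's point about training-data curation can be sketched with a toy classifier. This is purely a hypothetical illustration — the feature values, labels, and one-nearest-neighbor model are invented for the sketch and have nothing to do with Facebook's or Microsoft's actual systems — but it shows the mechanism she describes: a model trained on a narrow sample can only echo that sample.

```python
# Toy one-nearest-neighbor "classifier". The single numeric feature stands
# in for whatever an image-recognition system would extract from a picture.

def nearest_label(training, x):
    """Return the label of the training example whose feature is closest to x."""
    return min(training, key=lambda pair: abs(pair[0] - x))[1]

# A skewed training set: features 0-2 labelled "cat", 5-7 labelled "dog".
# Nothing outside that range was ever sampled.
skewed_training = [(0, "cat"), (1, "cat"), (2, "cat"),
                   (5, "dog"), (6, "dog"), (7, "dog")]

# Inputs that resemble the training data are handled fine...
print(nearest_label(skewed_training, 1.2))   # -> "cat"

# ...but anything outside the sampled range is forced into the nearest
# known category, however poor the fit. The system has no experience of it,
# so it cannot do a good job with it -- exactly the limitation Sara names.
print(nearest_label(skewed_training, 40))    # -> "dog"
```

The bias here lives entirely in `skewed_training`, not in the matching logic: swap in a more representative sample and the same code behaves very differently.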
\n Sara: Yeah, I think it's interesting, because that kind of stuff is what makes people think that machine intelligence is somehow going to overtake human intelligence or whatever. People get a lot of, I think, mistaken ideas about what a system like that can do. What a system like that can do is be really good at Jeopardy. But if you take that same person it was playing against in Jeopardy — the one it ultimately beat at Jeopardy — and you compare Watson and that person against a whole bunch of other types of metrics, Watson can't do it, right? Because it can only do the one thing, and it's only doing it based off of all this historical knowledge. If you suddenly say, OK, well then let's ask Watson to do a different kind of trivia game, it's not going to perform, right? It only learned from this one narrow historical thing.
\n Kim: Yeah. And it's interesting that when we do that, we don't think about the bias that is in the historical data that we're feeding it. So when you're looking at crime statistics — crime statistics should just be based on actual crimes, but what we have are policing statistics, which make assumptions based on what people say. And then when you program that in, we go back to things like redlining, and now you're talking about financial services, and now you're talking about real estate and all these other things, and in
\n Sara: There. There's
\n Kim: So much there that is just discriminatory to a whole bunch of other people. But these computers are not equipped to discern — that's not what they're able to do. That's a good word: they don't have the ability to discern.
\n Sara: No, they don't. I mean, and that's the thing, like
\n Kim: But we attribute that to them. The average person attributes that cognitive ability to a computer. They don't understand that what goes in is what comes out.
\n Sara: And I think it's not just to say, oh, the algorithm is biased. What's really more helpful, I think, is to say that the data that we used to feed the algorithm was biased. We are biased.
\n Sara: And so the results are biased. The algorithm itself is just performing a series of steps.
\n Kim: Exactly. It's doing what the algorithm
\n Sara: Does. The steps that it's performing may be good or bad, but that's kind of a different issue than talking about the bias of the data. You know, Cathy O'Neil talks about this a lot, about how we look at proxy data so often, right? With most algorithms, the fact is you don't know exactly how crime is going to function, and you're trying to make a predictive model for how crime might function in a city. So in order to make any kind of predictive model, you have to take unknowns and then try to make yourself certain enough that you think something's going to happen, right? So you use proxy data. You say, well, I don't know when crime is committed, but I know when people are arrested, so we can look at arrest data. I know where cops are being called, so we can look at 911 phone calls. So you take the data points that you have. You have to be able to sit down and say, OK, what is the data that I want to have? What do I wish I knew? Well, I wish I knew when crimes were actually being committed. OK, I don't know that. Here are the potential things I do know. Then you have to sit down and say, OK, what are the problems with that? What are the assumptions in that? Right? And so, like we just saw recently — I don't know if you saw this, just the other day BuzzFeed did this report on a neighborhood, I believe it was in Harlem, an area that has been pretty rapidly gentrifying. They talked about how there was this massive spike in people calling the cops in this neighborhood, and it turns out it was directly correlated to changing demographics: now white people live in this neighborhood that has historically, I believe, been a very, very mixed neighborhood. And suddenly you've got all these white people calling the cops on people doing things like sitting on their stoop,
\n Kim: Playing
\n Sara: Dominoes. And if you just looked at the data, you would think, oh my gosh, this is suddenly a high-crime area. But the reality is the crime in the area hasn't changed at all. What has changed is the demographics: who thinks they can call the police.
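[Editor's note] The proxy-data problem Sara describes, following Cathy O'Neil, can be sketched in a few lines. The neighborhood names, crime counts, and call rates below are invented for illustration — they are not figures from the BuzzFeed report — but the structure is the one under discussion: the model never sees the quantity we actually care about (crimes committed), only a proxy (police calls) that also encodes who is inclined to call the police.

```python
# A "predictive" model can only rank neighborhoods by the proxy it is fed.
def predicted_risk(police_calls):
    # In a real system this would be some fitted model; for the sketch,
    # risk simply tracks the proxy signal, which is the point.
    return police_calls

# name: (actual_crimes, calls_per_crime) -- hypothetical numbers.
neighborhoods = {
    "longtime residents": (10, 1.0),  # calls roughly track actual crime
    "gentrifying block":  (10, 4.0),  # same crime, far more calls made
}

for name, (crimes, calls_per_crime) in neighborhoods.items():
    proxy = crimes * calls_per_crime      # all the model ever observes
    print(name, predicted_risk(proxy))
# Both neighborhoods have identical actual crime, but the proxy-fed model
# scores the gentrifying block four times "riskier": the bias in the
# input data becomes the output, with the algorithm itself just
# performing its steps faithfully.
```

Nothing in `predicted_risk` is unfair in isolation; the distortion enters entirely through the substitution of calls for crimes, which is why auditing the choice of proxy matters more than auditing the arithmetic.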
\n Kim: And the reasons for calling the police have changed. Exactly. Because there is a different perspective. In our communities, somebody sitting on their porch? OK, we wave and keep going, man.
\n Sara: Speaking of which, as a white woman, can I speak to all the white
\n Kim: People out there? Please, please.
\n Sara: Stop fucking calling the cops. Yes. Yes.
\n Kim: I mean like, so I am so over white women right now. And
\n Sara: Yeah,
\n Kim: So over white women right now. I mean, to a point where my message used to be specifically for underrepresented and marginalized people; I just dropped "underrepresented" because that's usually just white women. Because white women are causing harm to marginalized individuals, and they're causing harm at rates that are just unprecedented at this point. And it might not even be that it's unprecedented — it's just the fact that now we have cameras to catch it. And it is really blowing my mind. White women's tears, all of that, I can't even fathom. And I've been having these conversations with people of color, and I'm going to say this: people of color can now consider white women a bigger threat than white women — I mean, white men.
\n Sara: That's interesting. And it doesn't surprise me, in a certain way. I guess, you know, being a white woman and trying to be a better white woman, one of the things that I have really been able to recognize is how many messages you get — how you learn as a white woman that, effectively, systems are in place to protect you. And that can be really screwed up and paternalistic and frustrating, right? Because a lot of the ways the systems are set up to protect you are ways in which white women are ultimately oppressed, right? That protection
\n Kim: Limits — it limits your abilities.
\n Sara: But because the system is set up to protect white women and to keep white women comfortable, we so easily benefit from it, and it's so easy for a white woman to use that against other groups. And we do, right? I mean, we are very much trained to believe that we deserve to be taken care of by systems, and so if we're uncomfortable in a setting, we might call the police. I personally — I would say a few years ago, I don't think I thought about this nearly enough. I wasn't a big police caller ever, but I was probably more likely to call the police than I am right now. And at this point, I think I need to think a lot about my responsibility. If I was going to call the police, what is the responsibility I have to make sure that I have a good enough reason? That I'm not calling the police as a sort of first resort to protect my own comfort, but that there are real safety issues at hand? And also that I'm thinking about who might be harmed by me making that phone call? I have to think about that; that is my job. And I guess the biggest thing I'm thinking is just that so much of this is that white people need to get a lot more comfortable having unpleasant conversations that they're not good at having. We've spent a long time not talking about any of this — we've not really had to — and we're going to be bad at it. I am not going to do a good job unless I practice, right? I can't do a good job having conversations with people of color if I don't invest in learning how to do that well, invest in building those skills. And that is on us.
\n Kim: Yeah. And it's not even as extreme as calling the police — and this is the thing, white women are like, oh no, it's not us. It's every time you're in a situation, particularly with a woman of color, and you don't get your way, it becomes tears: she's being aggressive, she's intimidating. And that has been historically affecting us economically, because what you do is you run to HR, and that goes into our HR records, and how am I ever going to get promoted if that is what has been reported against me? These are the things white women need to understand, and it's about taking responsibility for your own emotional maturity. If all I'm doing is disagreeing with you, then there is nothing that you should be crying about or screaming about — because you know what? I'm feeling intimidated by you as well. But because, as you said, the system is set up to protect white women, it is your word against ours. And then I look at, when you talk about feminism, why aren't Black people involved? Because feminist movements, women's movements, have been white women's movements and never included us anyway, so we're not going to waste our time on those causes. When you talk about suffrage, we were intentionally left out of the women's right to vote. We were intentionally left out of things in the 60s with feminism and the women's rights movement. And what has come of that, though, is that, yes, white women are making gains in tech and in these spaces, but your circumstances aren't changing. You're not improving. They're bringing you in, but you're still being harassed, you're still being treated like shit, because we didn't all get there together. Because if some of us had gotten in there with you, we as a group could say, no, we're not taking this, this is not working. And also we have a different perspective on it, so we can have a different conversation.
\n Sara: Yeah, I mean, it's not enough to say, well, OK, we're going to include this group that used to be excluded, as long as they're willing to act just like all of the dudes that are already there, right? So white women can show up and be successful if they're willing to basically reenact the same shit that has already been there — and somehow that's supposed to be a win? And it's like, who exactly is that a win
\n Kim: For
\n Sara: And I absolutely agree it's not enough. I completely understand the sentiment that I have heard from a few different places: that when you say "feminist," or when you say, let's say, a women in tech event, what people of color often hear is that that is for white women. I hear that and I believe it, 100%. I wish I felt clearer on exactly how to fix it. I think that takes work on so many different fronts, but a huge piece of it is putting your energy and your money — and I mean white people's energy and money and time — behind efforts that are led by people of color, and then figuring out all of these kind of difficult grey areas. Like, I have some platform, I have some power, and I want to use that to amplify issues that I think people like me are not paying enough attention to. At the same time, I also need to not be speaking for people of color, and sometimes that's kind of a messy area, and guess what — I've got to figure that out. And if I fuck it up, my job is to say, wow, I really screwed that up. How am I going to do better next time?
\n Kim: Because even me, when I speak, it's not on behalf of — I say I speak with them, I'm not speaking for them, because I can only speak from my perspective. So even I am conscious, as a Black woman, that I don't speak for all Black women, and I would never want to. But I want to go back to when you were talking about Black women not feeling a part of that. It's not just Black women. I hear that a lot from trans women — that they really need the conversation, or the messages, to be really clear, because they don't feel that they're included when people say "women in tech" events. So there are several groups that just aren't thought about, aren't considered, in these different things. And that goes back, again, to when we talk about sexism. For a lot of Black women that I talk to, that's a secondary issue to racism. Sexism is a big issue for white women; racism is a bigger issue for women of color. And that's what's killing us, that's what's causing us so much stress and so much anxiety. Until we can have these conversations — where white women who are in these movements understand that your priority is not a priority for us — and until you can understand and get on board with what's a priority for us, there's never going to be a middle ground.
\n Sara: And of course, there are all of these issues that matter tremendously to Black women that are explicitly at the intersection of sexism and racism, and those issues have to be centered in any feminist movement, right? That's the key. I mean, if it's just about how do we get more white women making the same money as white men, that is a very hollow message, and also a very limited goal. But yeah, I definitely think this is something that makes people really uncomfortable to talk about. It makes white women really uncomfortable, because of all the stuff we talked about before — being really trained to be protected and to be perceived as good people. And when we start getting involved with feminist causes or whatever, it's like, oh no, no, no, but I'm one of the good ones. And it's like, no, you don't get that label. You don't get to just decide that, because you want to be a good person, you're not doing harm. That's not how that works.
\n Kim: Yeah. And again, I will — and many people will — I will go to bat, I will pull out every resource I have, I will have a conversation, I will do whatever it takes if there is a genuine interest. But what historically has happened, and I'm just going to tell my story, is that inevitably, when I believe that there is a white woman who gets it, she says something, and it's like, fuck, you just threw all of this away, you just opened your fucking mouth. It just happened to me Friday. I had an incident on Friday — last Friday — where they asked me, why do I not block people on Twitter? And I said, well, because I have a lot of white people who follow me, particularly white men, who are trying to figure this out, and if I block people, then that saves them from seeing the stuff that I have to put up with. They need to be exposed to this, because they need to see it. That's the main reason I don't block: because you need to see it. And also, if you're following me, you have a responsibility to protect me, because I'm out here in front. And I tell people, I'm not trying to gather followers or allies: if you cannot make yourself uncomfortable so that I can be comfortable, I have absolutely no use for you. Absolutely none. So her question was, why do I do that? And then she backed that up with, because I come from a political family, we've been taught, blah blah. I was like, first of all, you don't know my strategy, and how dare you try to school me on what I need to be doing. And then her next question was, so how do you see me? And I was like, right there — you just screwed it up. That relationship is just gone. And that really saddened me, and I'm still processing it, because it happens to us so often. 
It happens so often: you think you're ahead, and then someone, with a backhanded comment, just erases all that. And it's very unsettling.
\n Sara: Yeah, I mean, there's so much assumption in there that she knows what's good for you or what's going to work for you, and that's always problematic — but it's especially problematic when you're talking about somebody who doesn't have the lived experience you have. I think that if you want to have a strategy of blocking everybody who irritates you on social media, you should do that, but that's a super personal choice, and there are risks and ramifications to any of those decisions. As opposed to just wanting to have a conversation about it and understanding — well, what made you decide to take that strategy? What's good about it? What's bad about it? What risk does that leave you open to, et cetera? Instead of having a conversation about that, and assuming that you made these choices in a way you thought was going to be best for you, the assumption is, oh, well, I know what's best for
\n Kim: You
\n Sara: And I think that's so common in tech, and that's so common with white people, right? Assuming that you know what's right for others. And white women — I know this is something that is true about white women in general, and almost certainly something I've been guilty of: assuming that, because I am also a woman, I'm going to be able to speak to, or have any sort of valuable information for, somebody who's had a very different life than me.
\n Kim: Yep, this is a great conversation.
\n Sara: I mean there's, there's so much here.
\n Kim: So thank you, Sara, for joining the show. This has been a great conversation — so much to talk about. Any last words?
\n Sara: Gosh, thank you so much for having me. I really appreciate that you're doing this, and I am really excited to see what other Cause a Scene guests you're going to have on, because I think it's such an important topic.
\n Kim: Well, thank you. I'm excited to see who will come on as well. So thank you, and have a great day.
\n Sara: Thank you.
\n Kim: Bye bye. Thank you for listening to this week's episode of the Hashtag Cause a Scene podcast. I'd like to thank all our current sponsors of the podcast and the Hashtag Cause a Scene movement. Of course, we strongly encourage everyone to become an individual sponsor of the Hashtag Cause a Scene community — just visit the website at hashtagcauseascene.com to sign up today. On behalf of everyone here at Hashtag Cause a Scene, we'd like to thank you again for listening to today's show, and have a wonderful day.