Data & Society

Race After Technology

Episode Summary

Ruha Benjamin discusses the relationship between machine bias and systemic racism, analyzing specific cases of “discriminatory design” and offering tools for a socially conscious approach to tech development. In "Race After Technology: Abolitionist Tools for the New Jim Code," Ruha Benjamin cuts through tech-industry hype, from everyday apps to complex algorithms, to understand how emerging technologies can reinforce White supremacy and deepen social inequity. Presenting the concept of “the new Jim Code,” she shows how a range of discriminatory designs encode inequity by explicitly amplifying racial hierarchies; by ignoring but thereby replicating social divisions; or by aiming to fix racial bias but ultimately doing quite the opposite. This event is hosted by Data & Society’s Director of Research Sareeta Amrute.

Episode Transcription

Sareeta Amrute:
I'm so happy to welcome tonight's speaker back to Data & Society. Ruha Benjamin is associate professor of African American Studies at Princeton University, founder of the Just Data Lab, author of “Race After Technology: Abolitionist Tools for the New Jim Code,” and editor of “Captivating Technology: Reimagining Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life,” among many other publications.

Sareeta Amrute:
Ruha's work investigates the social dimensions of science, medicine, and technology with a focus on the relationship between innovation and equality, health and justice, knowledge and power. She is the recipient of numerous awards and fellowships, including from the American Council of Learned Societies, the National Science Foundation, and the Institute for Advanced Study, and, in 2017, she received the President's Award for Distinguished Teaching at Princeton. Please join me in welcoming Ruha.

Ruha Benjamin:
Thank you for that introduction. Good evening, everybody. Good to see you. Thank you so much to all the organizers, my wonderful colleagues, Sareeta Amrute, CJ Landow, and Rigoberto Lara, and all my friends here at Data & Society. About a year ago, I had the opportunity to circulate the draft among Data & Society and, in particular, a few people who aren't here now but were also part of that feedback session, Mutale Nkonde, Jessie Daniels, and Kadija Ferryman, among many others, so thank you all for having me back now that the book is complete. I'd also like to join in the land acknowledgment and think about this land, the traditional unceded territory of the Lenape.

Ruha Benjamin:
Let us acknowledge the intertwined legacies of settler colonialism and the transatlantic slave trade which contribute to the creation and continued wealth of this city, of this nation. We acknowledge the reparations owed to black and indigenous communities and nations and the impossibilities of return for generations past. Let us also acknowledge the ancestors in the room tonight, as we fight together for better futures. We are alive in an era of necessary resistance to preserve this planet and all the beautiful creation that we have no doubt is worthy of the struggle, Ashe. With that let me begin with three provocations. 

Ruha Benjamin:
First, racism is productive, not in the sense of being good, but in the literal capacity of racism to produce things of value to some even as it wreaks havoc on others. We're taught to think of racism as an aberration, a glitch, an accident, an isolated incident, a bad apple, in the backwoods and outdated, rather than innovative, systemic, diffused, an attached incident, the entire orchard, in the ivory tower, forward-looking, productive. In sociology, we like to say race is socially constructed, but we often fail to state the corollary that racism constructs. Secondly, I'd like us to think about the way that race and technology shape one another. More and more people are accustomed to thinking about the ethical and social impact of technology, but this is only half of the story. 

Ruha Benjamin:
Social norms, values, and structures all exist prior to any given tech development, so it's not simply the impact of technology but the social inputs that make some inventions appear inevitable and desirable, which leads to a third provocation: that imagination is a contested field of action, not an ephemeral afterthought that we have the luxury to dismiss or romanticize, but a resource, a battleground, an input and output of technology and social order. In fact, we should acknowledge that many people are forced to live inside someone else's imagination, and one of the things we have to come to grips with is how the nightmares that many people are forced to endure are the underside of elite fantasies about efficiency, profit, and social control.

Ruha Benjamin:
Racism, among other axes of domination, helps to produce this fragmented imagination: misery for some, monopolies for others. This means that for those of us who want to construct a different social reality, one grounded in justice and joy, we can't only critique the underside, but we also have to wrestle with the deep investments, the desire even, that many people have for social domination. That's the trailer. Let's start with a relatively new app called Citizen, which will send you real-time crime alerts based on a curated selection of 911 calls. It also offers a way for users to report, live stream, and comment on purported crimes via the app.

Ruha Benjamin:
It also shows you incidents as red dots on a map so you can avoid supposedly dangerous neighborhoods. Now, many of you are probably thinking, what could possibly go wrong in the age of Barbecue Beckys calling the police on black people cooking, walking, breathing out of place? It turns out that even a Stanford-educated environmental scientist, living in the Bay Area no less, is an ambassador of the carceral state, calling the police on a cookout at Lake Merritt. It's worth noting, too, that the app Citizen was originally called the less chill name Vigilante, and in its rebranding it also moved away from encouraging people to stop crime toward simply encouraging them to avoid it.

Ruha Benjamin:
What's most important to our discussion, I think, is that Citizen and other tech fixes for social problems are not simply about technology's impact, but also about how social norms, racial norms, and structure shape what tools are imagined necessary in the first place. How should we understand the duplicity of tech fixes, purported solutions that nevertheless reinforce and even deepen existing hierarchies? One formulation that's hard to miss is the idea of racist robots. A first wave of stories a few years ago seemed to be shocked at the prospect that, in Langdon Winner's terms, artifacts have politics. A second wave seemed less surprised: well, of course technology inherits its creators' biases.

Ruha Benjamin:
Now, I think we've entered a phase of attempts to override or address the default settings of racist robots, for better or worse, and one of the challenges we face is how to meaningfully differentiate technologies that are used to differentiate us. This combination of coded bias and imagined objectivity is what I term the “new Jim Code,” innovation that enables social containment while appearing fairer than discriminatory practices of a previous era. This riff off of Michelle Alexander's analysis in “The New Jim Crow” considers how the reproduction of racist forms of social control in successive institutional forms entails a crucial sociotechnical component that not only hides the nature of domination, but allows it to penetrate every facet of social life under the guise of progress.

Ruha Benjamin:
This formulation, as I highlight here, is directly related to a number of other cousin concepts by Brown, Broussard, Daniels, Eubanks, O'Neil, Noble, and others. A quick example, hot off the presses, illustrating the new Jim Code: “Racial bias in a medical algorithm favors white patients over sicker black patients,” as the headlines put it, reporting on a new study by Obermeyer and colleagues, in which the researchers were able to look inside the black box of algorithm design, which is typically not possible with proprietary systems. What's especially important to note is that the algorithm does not explicitly take note of race. That is to say, it is race neutral.

Ruha Benjamin:
By using cost to predict health care need, this digital triaging system unwittingly reproduces racial disparities, because on average, black people incur fewer costs for a variety of reasons, including systemic racism. In my review of their study (both of which you can download from the journal Science), I argue that indifference to social reality on the part of tech designers and adopters can be even more harmful than malicious intent. In the case of this widely used healthcare algorithm, affecting millions of people, more than double the number of black patients would have been enrolled in programs designed to help them stay out of the hospital if the predictions were actually based on need rather than cost.
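
To make that mechanism concrete, here is a minimal, hypothetical sketch of proxy-label bias, the pattern the study describes. It is not the actual proprietary algorithm; the group labels, the cost gap, and the 20% enrollment cutoff are all assumptions chosen for illustration:

```python
# Minimal sketch of proxy-label bias: a "race-neutral" triage rule built on
# cost rather than need under-enrolls a group that incurs lower costs for the
# same level of sickness. Illustrative only; not the actual algorithm.
import random

random.seed(0)

patients = []
for _ in range(10_000):
    group = random.choice(["A", "B"])          # note: group is never a feature
    need = random.gauss(50, 15)                # true health need, same in both groups
    access = 1.0 if group == "A" else 0.7      # group B incurs lower cost for the
    cost = need * access + random.gauss(0, 5)  # same need (access barriers, etc.)
    patients.append((group, need, cost))

# "Race-neutral" rule: enroll the top 20% of patients by cost.
cutoff = sorted(p[2] for p in patients)[int(0.8 * len(patients))]
for g in ("A", "B"):
    grp = [p for p in patients if p[0] == g]
    share = sum(p[2] >= cutoff for p in grp) / len(grp)
    mean_need = sum(p[1] for p in grp) / len(grp)
    print(f"group {g}: enrolled {share:.0%}, mean need {mean_need:.1f}")

# Both groups have the same mean need, but group B is enrolled far less often,
# because cost is a skewed proxy for need.
```

Even though the rule never sees the group label, conditioning on a proxy that is itself shaped by unequal access reproduces the disparity, which is the sense in which race neutrality is no safeguard.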

Ruha Benjamin:
Race neutrality, it turns out, is a deadly force. Connecting this with a number of books, articles, and works, the new Jim Code, as I see it, is situated in a hybrid literature that we can think of as race critical code studies. And again, this approach is not only concerned about the impacts of technology, but its production, and particularly, how race and racism enter the process. As we think about how anti-blackness gets encoded in and exercised through automated systems, I write about four conceptual offspring of the new Jim Code that follow along a kind of spectrum. The book is organized around these in terms of the chapters.

Ruha Benjamin:
At this point in the talk, I would normally dive into each of these with examples and analysis, but for the sake of time, I'm going to shift gears now to focus on what many people and organizations are already doing about the problem. Like abolitionist practices of a previous era, not all manner of getting free should be exposed. Recall how Frederick Douglass reprimanded those who revealed the routes that fugitives took to escape slavery, declaring that those supposed white allies turned the Underground Railroad into the Upper Ground Railroad. Likewise, some efforts of those resisting the new Jim Code necessitate strategic discretion, while others may be effectively tweeted around the world in an instant.

Ruha Benjamin:
Exhibit A: 30 minutes after proposing an idea for an app that converts your daily change into bail money to free black people, Compton-born black trans tech developer Dr. Kortney Ziegler added, “and it could be called Appolition,” a riff on abolition and a reference to a growing movement toward divesting resources from policing and prisons and reinvesting in education, employment, mental health, and the broader support system needed to cultivate safe and thriving communities. Calls for abolition are never simply about bringing harmful systems to an end, but also about envisioning new ones. After all, the etymology of the word includes root words for “to destroy” and “to grow.”

Ruha Benjamin:
To date, Appolition has raised more than $137,000, with that money directed to local organizations who've posted bail, freeing at least 40 people. When Ziegler and I sat on a panel together at the Allied Media Conference, he addressed audience questions about whether the app is diverting even more money to a bloated carceral system. But as Ziegler clarified, money is returned to the depositor after a case is complete, so donations are continuously recycled to help individuals, like an endowment. That said, the motivation behind ventures like Appolition can be mimicked by people who don't have an abolitionist commitment.

Ruha Benjamin:
Ziegler described a venture that Jay-Z is investing millions in called Promise. Although Jay-Z and others call it a decarceration startup because it addresses the problem of pre-trial detention, Promise is in the business of tracking individuals via the app and GPS monitoring, creating a powerful mechanism that makes it easier to lock people back up. Following criticism by the organization BYP100, we should understand that Promise exemplifies the new Jim Code. It's dangerous and insidious precisely because it's packaged as social betterment. The good news is that tech industry insiders themselves have increasingly been speaking out against the most egregious forms of corporate collusion with state-sanctioned racism and militarism.

Ruha Benjamin:
For example, thousands of Google employees condemned the company's collaboration on a Pentagon program that uses AI to make drone strikes more effective, and a growing number of Microsoft employees are opposed to the company's ICE contract, saying that, "As people who build the technologies that Microsoft profits from, we refuse to be complicit." This kind of informed refusal is certainly necessary as we build a movement to counter the new Jim Code, but we can't wait for workers' sympathies to sway the industry. And as an article published by Science for the People reminds us, contrary to popular narratives, organizing among technical workers has a vibrant history, including engineers and technicians in the '60s and '70s who fought professionalism, individualism, and reformism to contribute to radical labor organizing.

Ruha Benjamin:
The current tech workers' movement, which includes students across our many institutions, can draw from past organizers' experiences and learn to navigate the contradictions and complexities of organizing in tech today, which includes building solidarity across race and class. For example, when the predominantly East African Amazon workers in the company's Minnesota warehouses organized a strike on Prime Day to demand better work conditions, some engineers from Seattle came to support. In terms of civil society, initiatives like Data for Black Lives and the Detroit Community Technology Project offer an even more far-reaching approach.

Ruha Benjamin:
The former brings people working across a number of agencies and organizations together in a proactive approach to tech justice, especially at the policy level, and the latter develops and uses technology rooted in community needs, offering support to grassroots networks doing data justice research, including hosting what they call DiscoTechs, short for “discovering technology”: multimedia, mobile neighborhood workshop fairs that can be adapted in other locales. And I'll just quickly mention one of the concrete collaborations that's grown out of Data for Black Lives. A few years ago, several government agencies in St. Paul, Minnesota, including the police department and the public schools, formed a controversial joint powers agreement called “the innovation project,” giving these agencies broad discretion to collect and share data on young people, with the goal of developing predictive tools to identify at-risk youth in the city. There was immediate and broad-based backlash from the community, with the support of the Data for Black Lives network, and in 2017, a group of over 20 local organizations formed what they called the Stop the Cradle to Prison Algorithm Coalition. Eventually, the city of St. Paul dissolved the agreement in favor of what they call a more community-led approach, which was a huge victory for activists and community members who'd been fighting these policies for over a year.

Ruha Benjamin:
Another abolitionist approach to the new Jim Code that I'd like to mention is Our Data Bodies' “Digital Defense Playbook,” which you can download for free online (and make a donation to the organization if you're inclined). The playbook contains in-depth guidelines for facilitating workshops and group activities, plus tools, tip sheets, reflection pieces, and rich stories crafted from in-depth interviews in communities in Charlotte, Detroit, and Los Angeles that are dealing with pervasive and punitive data collection and data-driven systems, with the aim of developing “power, not paranoia,” as Our Data Bodies puts it.

Ruha Benjamin:
Although the Playbook presents some of the strategies people are using, in the spirit of Douglass's admonition about the Upper Ground Railroad, not everything that the team knows is exposed. Detroit-based digital justice activist Tawana Petty put it bluntly: "Let me be real, y’all are getting the Digital Defense Playbook but we didn't tell you all their strategies and we never will, because we want our communities to continue to survive and to thrive and so the stuff that's keeping them alive we're keeping to ourselves." Finally, when it comes to rethinking STEM education as a ground zero for re-imagining the relationship between technology and society, there are a number of initiatives underway.

Ruha Benjamin:
I'll just mention one very concrete resource that you can also download, the “Advancing Racial Literacy in Tech” handbook, developed by our very own wonderful colleagues here at Data & Society. The aims of this intervention are threefold: first, an intellectual understanding of how structural racism operates in algorithms, social media platforms, and technologies not yet developed; second, an emotional intelligence concerning how to resolve racially stressful situations within organizations; and third, a commitment to take action to reduce harms to communities of color. The fact is, data disenfranchisement and domination have always been met with resistance and appropriation, in which activists, scholars, and artists have sharpened abolitionist tools that employ data for liberation.

Ruha Benjamin:
From Du Bois's modernist data visualizations to Ida B. Wells-Barnett's expert deployment of statistics in “The Red Record,” there's a long tradition of employing and challenging data for black lives. In that spirit, the late legal and critical race scholar Derrick A. Bell encouraged a radical assessment of reality through creative methods and racial reversals. He said that “to see things as they really are, you must imagine them for what they might be.” One of my favorite examples of what we might call a Bellian racial reversal is a parody project that begins by subverting the anti-black logics embedded in high-tech approaches to crime prevention.

Ruha Benjamin:
Instead of using predictive policing techniques to forecast street crime, the “White Collar Early Warning System” flips the script by creating a heat map that flags city blocks where financial crimes are likely to occur. The system not only brings the hidden, but no less deadly, crimes of capitalism into view, but also includes an app that alerts users when they've entered high-risk areas, to encourage citizen policing and awareness. Taking it one step further, the development team is working on a facial recognition program to flag individuals who are likely perpetrators, and the training set used to design the algorithm includes the profile photos of 7,000 corporate executives downloaded from LinkedIn.

Ruha Benjamin:
Not surprisingly, the average face of a criminal is white and male. To be sure, creative exercises like this are only comical when we ignore that all of their features are drawn directly from actually existing practices and proposals in the real world, including the use of facial images to predict criminality. By deliberately and inventively upsetting the status quo in this manner, analysts can better understand and expose the many forms of discrimination embedded in and enabled by technology. If, as I suggested at the start, the carceral imagination captures and contains, then an abolitionist imagination opens up possibilities and pathways.
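
For readers wondering what an “average face” involves technically, here is a minimal, hypothetical sketch: a per-pixel mean over a stack of portraits. The directory name and image size are illustrative assumptions, and this is not the parody project's actual code:

```python
# Hypothetical sketch: compute an "average face" as the per-pixel mean of a
# folder of portrait photos. Real systems align facial landmarks first; this
# simplified version just resizes everything to a common shape.
from pathlib import Path

import numpy as np
from PIL import Image

def average_face(photo_dir: str, size=(128, 128)) -> Image.Image:
    stack = []
    for path in sorted(Path(photo_dir).glob("*.jpg")):
        img = Image.open(path).convert("RGB").resize(size)
        stack.append(np.asarray(img, dtype=np.float64))
    mean = np.mean(stack, axis=0)  # average each pixel across all photos
    return Image.fromarray(mean.astype(np.uint8))

# e.g., average_face("executive_profiles/").save("average_executive.png")
```

The point of the reversal survives the simplification: whoever dominates the training set dominates the “average,” so a set of corporate executive photos yields a white, male composite.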

Ruha Benjamin:
An abolitionist imagination creates new templates and builds on critical intellectual traditions that have continually developed insights and strategies grounded in justice. May we all find ways to build on this tradition. Thank you for your attention.

Sareeta Amrute:
Thank you. Thank you, Ruha. That was great. I'm just going to take the host prerogative and ask a few questions and then open it up to all of you. One thing I was hoping you would do for us, after reading the book, is to take us through how you bring together seemingly trivial design decisions and their broader implications. There are so many good examples, but one that really struck me was the soap dispenser and skin color detection.

Ruha Benjamin:
Yeah, sure. You all remember a few years ago when that viral video went around of two friends trying to use a soap dispenser and it wouldn't work on the darker skin. They were laughing at the time, and it circulated as a kind of humorous critique of this seemingly benign technology. I use that as an entry point to think beyond the superficial and beyond that one technology to the broader design implications, and how in much more consequential situations we might not laugh. So you think about Joy Buolamwini's fantastic research on facial recognition and the implications of that being adopted in policing, for example.

Ruha Benjamin:
Linking something that seems rather trivial but using it as a gateway to think about these bigger questions and issues is one sort of methodology that I employ in the text, in some sense, to disrupt the categories that we use in our heads to make sense of the world. I'm trained as a sociologist, and we have a knack for creating little tables, like two-by-two tables, to make sense of everything, fitting things into a grid and making distinctions between, let's say, what is superficial and what's consequential. One of my modes of moving through this material is to disrupt that and, rather than trying to make so many distinctions about types of technologies or varieties of ethics, to draw connections.

Ruha Benjamin:
To engage in a more integrative mode that's connecting dots rather than trying to, in some sense, create a new database for looking at technology. For some people, that's really frustrating; I've had readers say to me, it's just all over the place. They want the two-by-two table, and then others ... some appreciate the fact that I'm trying to draw together seemingly disparate technologies, doing very different things in the world, in some ways because my goal here is to develop a conceptual tool kit that can be applied in a variety of contexts, so trying to neatly divide up the technologies wasn't a big priority for me.

Sareeta Amrute:
Yeah and for me, another thing about the soap dispenser that you do so nicely is you show how consequential it is for the kinds of worlds that have been made possible and those that are foreclosed. Also, it makes me think about some of your comments throughout the book on the question of intent and intentionality. You've warned us that we need to move past the discussion about whether algorithms or their makers intend to be racist, but at the same time, you argue that the purpose and intention behind design really matters. I was wondering if you could walk us through how you think about those two aspects of intentionality. 

Ruha Benjamin:
Yeah, this is a good example of writing as thinking. The way that I started out writing about intentionality was very different from how it ended up, because first, I'll say, one of the questions that I get, and that I think people thinking about technology in this way often get, is, “do they intend to be racist?” Like we're looking for the racist boogeyman behind the screen and using that as a measure of how to then assess the impact of these technologies. In my effort initially to push back against that, I went to the extreme and was like, intentionality is not important. Stop thinking about intentionality. Stop asking about intentionality.

Ruha Benjamin:
I just went to the extreme of a post-intentional analysis of technology. And then, in fact, through my conversations with one of my colleagues, Tamara Nopper, who was pushing back as she was reading drafts, she was like, it's not that there's no intentionality. You can't design something without intention, right? That's not possible. You're thinking, you're intending it to do something, and so it may be that in not thinking about or not valuing the sociality or the political landscape, that in itself is a priority. And so it got me to think differently: the intention to be racist is not the only intention that we should be looking for.

Ruha Benjamin:
The intention, let's say, to maximize profit, right, is a powerful engine that certainly we know historically, in terms of racial capitalism, has powerfully discriminatory effects. I think in part, it's looking for all of the variety of desires, intentions, and assumptions that animate the design process, without needing it to be cloaked in a white hood, carrying a torch, as the sort of litmus test for whether we deem it racist or not.

Sareeta Amrute:
I just wanted to link that to something that came up in your talk about these three phases of the racist robots. You mentioned that we might be in a third phase now which is about overcoming what's been recognized as racism. How are you thinking about that? How do you analyze that move?

Ruha Benjamin:
Yeah. I think that there's a lot of work being done around fairness, transparency, accountability, FAT*, with respect to machine learning. The more that the problem is exposed, there's an effort, both within tech and in a number of collaborations, to try to fix the technology, make it less biased. With that healthcare algorithm, for example, the researchers were actually able to collaborate with the company to sort of rewrite it so that it doesn't have that racially disparate effect, right? That's one example where there's energy and motivation to get the tech right. Which I think is, on one level, perfectly fine, right?

Ruha Benjamin:
But in some ways, that continues to narrow the scope of our analysis and our movement building, to think that just by getting the technology right, we have fixed the larger inputs that make this so consequential. Thinking again of the healthcare algorithm: even if that particular algorithm gets it right, is that going to magically, like fairy dust, fix the larger healthcare system, whether it has to do with the insurance structure or the fact that there's a lot of racism both in medical curriculum and the training process, et cetera, et cetera? My challenge is to, yes, focus on the sort of technological changes that we have the power to make, but not to lose sight of the larger structures that continue to fuel the problem.

Ruha Benjamin:
It's a both/and suggestion rather than getting so self-satisfied with our tech fixes in this context. 

Sareeta Amrute:
That's extremely helpful. It's extremely helpful to think about getting the tech right as being important, but also a limiting framework for our action and our movement building, which brings me to the subtitle of the book. The subtitle is “Abolitionist Tools for the New Jim Code,” and you did a great job in the talk talking about abolitionism. I was wondering if you could maybe unpack the tools part a little bit for us. How do you link the work of abolitionism to creating a toolkit to do this wider movement-building work? And you gave us so many good examples of that.

Ruha Benjamin:
Yeah, thanks for that question. I think in part, that subtitle is an attempt to kind of re-appropriate the idea of what a tool is, what we think of as a tool in its most generic, expansive sense, so that hardware and software don't monopolize the space of tool-making. And so here I'm thinking about all of the ways ... I think of race as a technology or a tool that creates, produces certain things in the world as a harmful tool, but what would it mean to enact and cultivate life-affirming tools that support, empower, et cetera? In thinking about abolitionist tools, I'm really referring not to things necessarily that we have to create, but that already exist, in the ways that communities try to seed their own well-being and cultivate life-affirming practices.

Ruha Benjamin:
Some of that is thinking about social tools: how we relate to one another, how we engage, arts, community building. Again, for abolitionist tools in the talk, I used Appolition as a touchstone, something very concrete, just as a way in for us to think about, in that case, the designer of it. It wouldn't be an abolitionist tool if it were not connected to the social infrastructure of community organizations that were already engaged in this work, right? The tool is put to service in what communities are already doing and activists are already doing, rather than the tech fix sort of floating down and purporting to solve something for a community.

Ruha Benjamin:
It's really about zooming out again, thinking about the social infrastructure in which technologies operate, and recognizing that there are other tools already in place, already being honed, that in some ways are vital to social change, rather than just looking to the ones developed within industry.

Sareeta Amrute:
I'm going to ask one more question and then I'll open it up. I know people are anxiously waiting, but for those of you who haven't had a chance to read the book yet, it's just beautiful. It's luminous. One thing I kept wondering as I was reading was, kind of, are you a secret poet? Some of the terms Ruha deploys: the new Jim Code, the anti-black box, race critical code studies. I wanted to ask you, what do you think the power is in naming things, and how do you deploy that as itself a tool of your practice?

Ruha Benjamin:
Yeah, so first I'll answer that I am a closet poet. Well, I was in college, and then grad school sucked the life out of me. I've slowly been trying to regain my footing since. I think part of that is trying to enact something beyond the critique of technology, and when we think about what needs to change, where the levers of power are, for me, the process of problem formulation and naming the problem that technology is supposed to then solve is ground zero. Before you even begin collecting data, developing a technology, designing anything, you have to decide what is the problem that this technology is supposed to solve, and, as it stands, a very narrow slice of humanity is posing the problems, nay framing the questions, that our digital and material infrastructure is then designed to somehow intervene in.

Ruha Benjamin:
It is important, not just for me, to encourage a more democratic vision, a more creative vision of people who are playing in that problem space, who are experimenting, who are naming their own reality rather than having someone else name it and then try to fix it. In some ways, my play with language is trying to make the book enjoyable to read because it's more enjoyable to write. The genres of writing that I was trained in as an academic, again, I said, sucked the life out of me, and in some ways, when you circulate that, it's not enlivening or vivifying for the person on the other end if it wasn't enjoyable for you to write.

Ruha Benjamin:
In part, it's a mirroring of what I hope the work will do: I want to enjoy actually producing it so that you enjoy reading it, right? In part because I think of knowledge as not simply something that's cognitive, but also something that we produce in terms of our feelings and our relations, through art. It's trying to expand how we know the world, not simply through very rigid categories or very dry analysis, but also through stories, through thinking about the joy of how we play with language and words, and so that's where ... and some of it is going to be dry. There are parts of the book that are just ... you're going to be like, "Oh, she lied to me. This is really standard sociological analysis."

Ruha Benjamin:
It's in there, but I hope to counter ... to balance that out with a lot of references and engagement with popular culture and with other things that are not your traditional sociology. 

Sareeta Amrute:
Thank you. I'd like to open it up to the audience. There will be a mic that will come around, if you have a question. If you feel comfortable saying your name, your pronoun and where you're coming from tonight, that would be great too. You can raise your hand, if you have a question.

Carol:
Hi, Ruha and everyone. My name is Carol and I am also a sociologist, an interdisciplinary sociologist and media studies scholar and practitioner. Part of my research focuses on AI and all of this work that you're doing, Ruha. Also, I am in the process of developing a nonprofit called Harlem Media Institute where I live in Harlem, which addresses part of the issues around the meteoric rise of technology and how it's affecting my communities, and what you said about community engagement really resonated with me, because part of my focus is to look at how the community is being involved, getting engaged with this technology.

Carol:
My question to you is, when you talk about this technology and how it is affecting the lives particularly of black and brown and indigenous people, what is the response from your white audiences? Do they feel like they're being picked on? Do they feel like, well, why is it always us, that we're the bad guys? Because in this technology discussion, especially on Twitter, where I follow you and other scholars who are involved in AI technology, we are the people of color, we are the ones who are being ... what's the word I'm looking for, I guess, targeted, that's it. Yeah. What is some of the feedback that you get when you are having these discussions?

Ruha Benjamin:
Nice to hear your question, Carol, and I would say that one of the most surprising things from the time I started the project to now has been the general shift of consciousness and discourse in this country around technology. I think, in part, we can point to some big, sort of spectacular things that happened, whether Cambridge Analytica or others, where the general sentiment now seems to be much more on the skeptical, mistrustful end of the spectrum for everyone. Whereas perhaps a few years ago, we would see that expressed most by communities who've always been on the underside of progress, who have been used in the process of honing scientific and technological tools and often have not benefited in the end from that.

Ruha Benjamin:
In some ways, I've seen an expansion of that sentiment across the body politic, for obvious reasons. I thought initially, when I would be speaking about it, I was projecting into the future; I thought I would have to be much more on the defensive and get much more pushback than I actually have been so far, knock on couch. That has been surprising, and so, in some ways, I think the challenge now is to channel that sentiment: what do we do, moving, as Our Data Bodies says, from paranoia to power, right? It hasn't been, as you might suspect, a kind of backlash against the work; rather, I've heard people talk about the general techlash that seems to characterize the larger discourse around this.

Ruha Benjamin:
It hasn't manifested in the way that you might suspect and might have a few years ago, yeah. Thank you.

Sergey:
Hi. My name is Sergey. I use she/her pronouns. I don't have like a fully-formed question, but I'm really struck by the quote that you brought in around the Underground Railroad and the Upper Ground Railroad, and I'm thinking about that and the work that people put together in the playbook. I guess I'm just curious how you're thinking about ... I'm not asking you to say the things you don't want to say, but how you're thinking about that in creating this book. And then also, I feel like personally I continually come up against my own like shit around visibility and around sharing resources, and also that people are listening in, and there might be a lot of really good intentions in the room, but not everyone is on the same page, so I'm just curious.

Ruha Benjamin:
Yeah. No, I have some thoughts around that. I think about how that organization is enacting knowledge production as a counterpoint to the big data ethos that more data, more information, more, more, more is always better, right? Exposing, getting everything out there in the open. And there, there is a kind of discretion being enacted, where yes, we're going to make some things known, but we're also going to withhold knowledge, right, as part of the politic. I think about that in the context of what I've called, in other work, informed refusal, as a counterpoint to informed consent.

Ruha Benjamin:
I draw on the work of indigenous scholars like Audra Simpson and Kim TallBear, who describe informed refusal not simply in what indigenous communities do vis-à-vis scientists, but also in their own ethnographic methodology: they refuse to write up everything that their respondents talk about, because of the ways that can be misused and turned against them. This really runs against a kind of enlightenment ethos that just ... more knowledge is always better, or putting it all out there. I think it's saying something about trying to enact your critique in some ways, trying to internalize it and let it inform your practice so that you're learning as you're going.

Ruha Benjamin:
I would encourage those who are interested in this idea: you can find the article I refer to, “Informed Refusal,” on the research tab of my website, and it also cites the larger body of scholarship that I draw on to conceptualize it. I would fit this particular framing of the handbook within that literature.

Hugo:
Hey, thanks, Ruha. That was a beautiful talk. My name is Hugo. I use he/him pronouns. I work in data education and machine learning and AI education. You spoke wonderfully to ... I mean, the importance but also the limitations of, I suppose, labor movements and community movements in directing what industry is capable of and can do. Historically, we've seen industrial movements and then legislative movements, usually in that order, and I'm wondering where you see this going? I mean, for example, big companies getting on board and then governments, hopefully, legislating, and the power of that.

Ruha Benjamin:
Yeah. If I understand the question correctly, I see certainly policy and legal changes afoot that need to happen, that are happening, whether we think about the various moratoriums and bans on facial recognition as just a starting point, but also thinking proactively, not just banning, but asking what kind of governance structure we need to re-conceptualize the relationship between society and technology; that's one pillar or one pathway that we need lots of people working on. I also see the organizing pathway, whether it's tech industry organizing or community organizing, but to get that pushed, we need mass mobilization. We need political education. I think about litigation. We increasingly have lawyers thinking about what it means to litigate algorithms.

Ruha Benjamin:
We saw the case, for example, in Michigan, where then-Governor Snyder adopted an automated decision system that was meant to alert the state about cases of unemployment fraud; it flagged up to 40,000 people and was wrong in 93% of the cases. Meanwhile, people went bankrupt. They committed suicide. They got divorced, all because of this automated system flagging them as a fraudulent person and having ripple effects. There's a class action lawsuit now underway in Michigan around this, not the only one, in which lawyers are trying to think about what it means to redress when harms are done. That's an avenue of action and energy that we need.

Ruha Benjamin:
My own sort of main pathway is really thinking about education, both in the popular education sense and in rethinking what STEM education looks like. Yesterday, I had the chance to virtually have a conversation with students at a STEM high school in LA, and let me tell you something, they asked questions like, what is platform cooperativism? What is cognitive equity? Aren't we being hypocritical by using these various apps while we were X, Y, and Z? The questions they asked, of all ... almost all of the kinds of lectures and Skype visits I've been doing, were so pointed and so infused with social and historical literacy.

Ruha Benjamin:
I had to believe that it was already part of their STEM training, right? It couldn't have been just simply reading something I'd written recently, because it was so informed with a deeper sense of what's going on. For me, that is what really gets me excited: seeding a different way of thinking about not simply the ethics of technology but the politics and sociality of it. That's another pathway, and I'm sure there are more that I'm not even addressing, but in some ways I feel like this is all hands on deck. It's not a matter of we do this first and then this happens. It's about let's find our niche and get to work and build these networks.

Ruha Benjamin:
There are a lot of organizations that are trying to do this umbrella work of drawing together these different pathways, including Data for Black Lives. I think part of it is to just not leave it up to those who are the technical experts to tell us what the ethics are or what the next steps forward are. I think that is, if I can point to one wrong way to go, it would be leaving it up to those who've produced the problem to solve the problem.

Adam:
Hi. I'm Adam. They/them. I'm very grateful to be here. One kind of theme that you've been discussing, both in your presentation and in these questions, is what you described as thin description and how that is a digital defense mechanism. A tension that I identified while reading the book was how tech corporations engage in their own thin description, by demanding that others be kind of radically exposed while not doing it themselves. I'm wondering whether you could speak to that, and whether exposure of their practices is necessarily the most prudent way of going about that.

Ruha Benjamin:
Yeah. I wish you had read the draft a year ago; I would have unfolded that critique, that insight, into my own analysis, and I haven't thought a lot about that: the way that my advocacy of a kind of thin description mirrors in many ways how we're exposed, whether it has to do with the level of screens or skin or whatever that thinness entails. If you will allow me, I'll sit with that and think about it a while and hopefully get back to you. Thank you.