Data & Society

Abolish Big Data

Episode Summary

This talk by Yeshimabeit Milner, founder & executive director of Data for Black Lives, serves as a call to action to reject the concentration of Big Data in the hands of a few, to challenge the structures that allow data to be wielded as a weapon of immense political influence. To abolish Big Data would mean to put data in the hands of people who need it the most.

Episode Notes

Big Data is more than a collection of technologies or a revolution in measurement and prediction. It has become a philosophy, an ideological regime that determines how decisions are made and who makes them. It gives legitimacy to a new form of social and political control that takes the digital traces of our existence and then finds ways to use them to sort and manage populations. Big Data is part of a long and pervasive historical legacy of scientific oppression, aggressive public policy, and the most influential political and economic institution that has shaped and continues to shape this country's economy: chattel slavery. Algorithms and other data technologies are the engines that facilitate the ongoing evolution of chattel slavery into the Prison Industrial Complex, justify the militarization of schoolyards and borders alike, and continue the expansion of contemporary practices of peonage.

About the Speaker

Yeshimabeit Milner is the founder & executive director of Data for Black Lives. She has worked since she was 17 behind the scenes as a movement builder, technologist, and data scientist on a number of campaigns. She started Data for Black Lives because for too long she straddled the worlds of data and organizing, and she was determined to break down the silos and harness the power of data to make change in the lives of Black people. In two years, Data for Black Lives has raised over $2 million, hosted two sold-out conferences at the MIT Media Lab, and changed the conversation around big data and technology across the U.S. and globally.

As the founder of Data for Black Lives, her work has received much acclaim. Yeshimabeit is an Echoing Green Black Male Achievement Fellow, an Ashoka Fellow, and joins the founders of Black Lives Matter and Occupy Wall Street in the distinguished inaugural class of Roddenberry Foundation Fellows. Most recently, she was named one of Forbes 30 Under 30.

Episode Transcription

Janet Haven (00:00:08):

Hello, welcome to Data & Society. My name is Janet Haven. I am the Executive Director here at Data & Society, and it is my sincere pleasure to see all of you here tonight. I'd like to welcome you to our Databite tonight featuring Yeshimabeit Milner, the Founder and Executive Director of Data for Black Lives. Yeshi is an Echoing Green Black Male Achievement Fellow and Ashoka Fellow and joins the founders of Black Lives Matter and Occupy Wall Street in the distinguished inaugural class of Roddenberry Foundation Fellows. Most recently she was named one of Forbes 30 Under 30. Yeshi has a BA from Brown University and serves on the board of the historic Highlander Research and Education Center. I am extremely delighted to welcome her tonight to the stage. Thank you.

 

Yeshimabeit Milner (00:01:04):

Thanks Janet. And thank you everyone for being here and for braving an impending public health crisis. You know, this is my very first public Abolish Big Data talk in New York City, and there's no place I'd rather do it than right here. So, about Data for Black Lives. Data for Black Lives is a movement of over 4,000 scientists and activists committed to harnessing the power of data and technology to make concrete and measurable change in the lives of black people. And Data for Black Lives began with a vision, my vision, a bold, ambitious, but very possible vision of changing the role that data plays in society, in public life, and in the lives of historically oppressed people, and in particular the lives of black people. We launched Data for Black Lives with a conference at the MIT Media Lab in November of 2017. And nearly three years in, the vision has grown into a powerful force of scientists and activists equipped with the skills, the empathy, and the ability to create a new blueprint for the future.

 

Yeshimabeit Milner (00:02:07):

And we are continuing to grow. This year, we're responding to the urgency of the political moment, but also to the requests of people all over the U.S. and the globe to bring the Data for Black Lives mission where they live: the mission to make data a tool for social change instead of a weapon of political oppression. This talk is entitled Abolish Big Data. It's the name of my forthcoming book. It is also a call to action that recognizes that the bullets, police dogs, and fire hoses of the past have become the predictive policing algorithms, data-driven voter suppression, and facial recognition of the present. Algorithms and other big data technologies are the engines facilitating the evolution of chattel slavery into the prison industrial complex, and they have created the conditions for the crisis we face. Now, the prison abolition movement asks the question: how do we create solutions in our communities to social problems without recourse to prisons? In this talk I apply that lens to big data. How can we reimagine the structures and industries that concentrate big data into the hands of a few, and how can we abolish the structures that turn data into a powerful and deadly weapon?

 

Yeshimabeit Milner (00:03:28):

This is not a call to end the use of all data, quite the opposite. The call to abolish prisons is not a call to end accountability, but to abolish a punitive, violent system that is simply not working for society. The call to abolish big data is to dismantle the oppressive structures and the industries that surround its use. Big data is more than a collection of technologies or a vast amount of information of different types. It is more than a revolution in prediction and measurement. It has become a philosophy, an ideological regime about how decisions are made and who makes them. It has given legitimacy to a new form of social and political control that has taken the digital artifacts of our existence and found new ways to use them against us.

 

Yeshimabeit Milner (00:04:24):

Big data is not new. It's not as novel or as revolutionary as we often worship it to be. It is part of a long and pervasive historical legacy and a technological timeline of scientific oppression, aggressive public policy, and the most influential political and economic system that continues to shape our country and our economy, which is chattel slavery. I believe that today what we face, and what we must name, is a data industrial complex: where zip code is destiny, where debt becomes a ball and chain weaponized by complex and obscure financial systems, where data is a strategy to rob people of political power by manipulating elections, a system and culture where archaic and racist notions of risk persist. Narratives more powerful and impenetrable than any prison cell that's ever been built.

 

Yeshimabeit Milner (00:05:27):

Data for Black Lives was founded on the belief that we have an opportunity with data to abolish, re-imagine, and recreate new structures of knowledge production, new forms of decision making, and new ways of relating to each other. The possibilities for this are infinite.

 

Yeshimabeit Milner (00:05:50):

Because of the enormity of the threat, which is scary and unprecedented, the discourse has become very negative and fatalistic. But this does not reflect the agency of our communities and our movements. We don't want people to give up and get overwhelmed. We want to create alternatives. Once again, to abolish big data, in the simplest terms, means to reject the structures that concentrate the power of data into the hands of a few and instead put that power into the hands of the people who need it the most. The possibilities of that shift became very clear to me at a young age.

 

Yeshimabeit Milner (00:06:31):

I learned to collect, analyze, and use data as a high school student, because early on I realized that alone we could be ignored, but that there was power in numbers. When students at a nearby high school organized a peaceful protest after a school administrator put another student in a chokehold, it made national news, but not in the way that young people walking out over gun violence or other forms of civil disobedience would today. I'll never forget seeing footage of SWAT teams flooding into the school, or police shoving the small frames of students I grew up with against police cars, on CNN and on local news like shown here. The headlines read "Riot at Miami Edison Senior High School." I knew that unless we found new ways to be heard, to disrupt the narratives that facilitated this level of abuse, my future and the future of many other young people would be at stake.

 

Yeshimabeit Milner (00:07:28):

When we were turned away from public hearings at the school board and totally dismissed by the administration, we hit the ground running. We surveyed over 600 students about their experiences with suspensions and arrests in schools, and we turned the findings into a comic book. For many of the young people we surveyed, this was their first time ever being asked about their experiences. They thought that, you know, not having a school ID or getting suspended for having the wrong color t-shirt under their uniform meant that they were a bad kid. But they did not know that what we were facing was a statewide problem, a national problem, and that it was called the school-to-prison pipeline, and most importantly that there were young people in the city who were working to change that, to shift that, with something called restorative justice.

 

Yeshimabeit Milner (00:08:18):

Four years later I returned to Miami after college with a renewed sense of purpose and an arsenal of skills in data science, data collection, and research that was urgently needed. I went to Brown, and I see some of my Brown classmates here. We had an open curriculum, so you know, you could just do whatever you wanted, and I took full advantage of that. But I said, okay, how can I get as many skills as possible, and how can I bring them back to Miami? And I had an opportunity: I was asked to come back to lead a campaign at the same organization, Power U, but this one was on reproductive justice. Folks in the community saw that there was a crisis happening. Black babies were three times more likely to die before their first birthday than white babies.

 

Yeshimabeit Milner (00:09:08):

You know, and even though at that point, in 2012-2013 (it's shifted a little bit now), there was very little research on this phenomenon of black infant mortality, on the fact that infant mortality rates had decreased over the past 50 years but that disparity had stayed the same. Moms in the community knew, families knew, that in particular when you go into the hospital, people were experiencing things like the overuse of procedures like C-sections and the aggressive targeting and marketing of infant formula, in addition to a whole host of bedside practices that made giving birth a very violent and racist experience.

 

Yeshimabeit Milner (00:09:50):

You know, but without data, and in a political climate like Miami's, the community was totally ignored. So with a small team of moms, I surveyed 300 women and families about their experiences giving birth at Jackson Memorial Hospital, which is our public hospital and the largest public hospital in the country. And we weren't able to bring 300 moms into the boardroom with us once we got to meet with the hospital CEO and the staff, but they couldn't deny the data that we collected. A campaign that would have taken years to win, we accomplished within months, and today the lives of hundreds of thousands of mothers and babies have been impacted by the changes in the hospitals that we fought for.

 

Yeshimabeit Milner (00:10:35):

But in all of these experiences, whether it was growing up or what we were facing with this campaign, I realized that we weren't just fighting institutions whose policies and practices sought to undermine our lives. But behind those policies and practices, what justified and endorsed these civil and human rights abuses were narratives.

 

Yeshimabeit Milner (00:10:58):

Narratives, grounded in data, that must also be abolished while we do the work of dismantling the power structures that perpetuate them. Perhaps the most harmful narratives surround the notion of risk. The very first time I heard the term at-risk was actually in the fourth grade. Another student, coincidentally as we worked in our school's computer lab, told me that she was at-risk. Immediately, just hearing the word as a nine-year-old elicited this sense of fear in me. Like, at risk of what? Are you okay? What's going on? I asked her what she meant. She told me that she was enrolled in an afterschool program for at-risk students, for kids who were at risk of joining a gang, or having an unwanted pregnancy, or dying early.

 

Yeshimabeit Milner (00:11:45):

She liked the afterschool program. But it was as if the lack of funding at our under-resourced, predominantly black school offered limited options, not just for the extracurricular activities that were possible, but also for the kinds of lives we could lead and who we could become as people. And this was a self-fulfilling prophecy, right? You know, I survived the school-to-prison pipeline. But many of my peers did not, as they had to fight to overcome the material circumstances of their lives in addition to the weight of these narratives.

 

Yeshimabeit Milner (00:12:21):

And so my question is, you know, how does this word risk, which was first used in actuarial science and in business, become applied to a whole generation of young people, then when I was growing up, and now? You know, this is one of my favorite forms of machine learning, used to power autocorrect, word editing, and synonyms in most text-editing software. In it we see the connotations of risk, right? Danger, jeopardy, probability, gamble, hazard. But again, how do these words become used, and not only used but operationalized, through policy that affected young people like myself and my friend? And you know, there's a historical basis for this, right? With the end of the civil rights movement, with the war on drugs, or what I call the war on communities, and with the introduction of massive legislation to push forward the most violent civil and human rights assault of our lifetime, mass incarceration, also came a wave of research and data that targeted Black communities and communities of color and sought to warehouse these communities in prisons.

 

Yeshimabeit Milner (00:13:35):

And this, you know, is the superpredator myth. I think it got prime time during the last election, but unfortunately the conversation was isolated to particular candidates and wasn't really a broader conversation about how this myth is still persistent. It was introduced by a social scientist and Bush administration advisor, John DiIulio, in 1995. He created a whole theory around this idea of superpredators: a new generation of street criminals is upon us, and I quote, "the youngest, biggest, and baddest generation anyone has ever seen." "Based on all that we have witnessed, researched, and heard from people who are close to the action," he wrote with two co-authors, "here's what we believe: America is now home to thickening ranks of juvenile superpredators, radically impulsive, brutally remorseless youngsters, including ever more teenage boys who murder, assault, rape, rob, burglarize, deal deadly drugs, join gangs, and create serious communal disorders. At the core, the problem is that most inner-city children grow up surrounded by teenagers and adults who are themselves deviant, delinquent, or criminal." And the point of this was to spark panic, to fuel this fear and hatred into tough-on-crime policies, and it proved successful in its objective to criminalize generations of young people. But it took years for him to admit what we all know now: that there was never a real threat at all, and that his predictions were not only wrong, they were the exact opposite of what was happening across the nation. Years after this article, after this campaign of lies, and once he converted back to Catholicism, he actually retracted the statements that he made. But by then it was too late, right? The policies were already in motion, and they became automated in the American imagination.

 

Yeshimabeit Milner (00:15:31):

And John DiIulio was not alone. This is another famous myth, the crack baby myth, which was based on a study with only 23 participants, right? If there are researchers in here, you know that that's not enough of a sample size to actually pull any valid findings. But again, with a whole bunch of media support and a lot of political endorsement, this became such a persistent ideological regime, and it also fueled so much policy. Later on, The New York Times followed up with the original people who were part of this study, and one of the young ladies was the first in her family to graduate from college. And the findings were clear, right? The biggest risk factor for these children was not crack addiction, not even race. It was poverty.

 

Yeshimabeit Milner (00:16:27):

And there's also the welfare queen myth that has been used to totally privatize and dismantle our social systems. Meanwhile, we know that the real welfare queens are the corporations, who benefit more from government subsidies than any individual recipients combined. In the age of big data, unless we are aware of this history, we risk repeating it. And I think that's a much more appropriate use of the term risk. But before we can talk about algorithms and these specific big data systems, we have to tell the origin story of big data. What were the economic, imperialist, and colonial contexts that required the level of record keeping, accounting, and surveillance that has come to define the big data practices of today?

 

Yeshimabeit Milner (00:17:26):

Contrary to popular belief, slavery was not the antithesis of business innovation, and much of what we know about scientific management, management science, and finance does not come from the factory floor, the railroad, or the steam engine. The big data systems we're familiar with today, used to control, surveil, and enact violence to maintain power structures and ensure profit on a global scale, originated during slavery. In the 1600s and 1700s, the Dutch East and West India Companies were the largest commercial enterprises in the world, with thousands of ships and hundreds of offices and employees all across the Americas and Asia. The VOC and the WIC operations were mirrored all over the Atlantic and brought to the U.S. In proportional terms, they were wealthier and more powerful than Apple, Google, and Facebook combined. These companies pioneered colonialism, created the blueprint for globalization, and developed the new data practices required to maintain this massive operation. And although this history has been largely erased and ignored, these sophisticated big data practices predate the analytical tools that we use today.

 

Yeshimabeit Milner (00:18:45):

Big data practices were not engineered out of creativity or innovation, but in the deeply violent transatlantic slave trade, which trafficked 12.5 million enslaved African people to the Americas. In a great book that I think you all should read, Accounting for Slavery: Masters and Management, Dr. Caitlin Rosenthal writes that planters' control over enslaved people made it easier for them to fit enslaved African people into neat empirical rows and columns. The abstraction of the catastrophic loss of human life, and of the torture required to maintain plantations, served owners who were removed from the daily abuse in the literal rows and fields of the cotton, sugar, and tobacco plantations they owned. Data moved up and down hierarchies, akin to the ways in which CEOs and boards today are responsible for, but never held accountable for, the violence that they inflict.

 

Yeshimabeit Milner (00:19:46):

Big data was necessary to distance oneself from the gore capitalism and violence of slavery. So these are actually standardized reports that were used on a daily basis; they are really reflective of the record-keeping practices of the Dutch empire and can be seen in other countries. This one is actually from a plantation called Hope and Experiment in British Guiana, where my family is from and was enslaved. It really hits home because of how close it is to me, but also because of how eerily similar it looks to some of the charts and maps and documents that we use today. In it, there was one line for each day, with columns for the many different categories of enslaved women, men, and children, including: in the field, watchman, house servants, carpenters, invalids, runaways. And children's labor was actually very much essential to the operations of the plantation, their role justified by the brutal logic of capitalism.

 

Yeshimabeit Milner (00:20:52):

And I quote one slave owner: "The hand of a Negro child is best calculated to extract weed and grass." So this daily process of dehumanization was deeply numerical. And below these monthly abstracts, literally below them, were identical reports for livestock. The Negro account and the livestock account used the same methods of taking an inventory, calculating increase, decrease, purchase, sale, birth, death, and slaughter, with very little difference made for man, woman, child, oxen, goat, and cattle. But the most necessary form of accounting was in the wielding of information as a weapon to create fear and distrust, and to neutralize collective action amongst enslaved people: from removing verses of the Bible that rejected slavery, to using whipping, amputation, and torture to ensure that no one could read, write, or communicate at all. Keeping track of weapons, that was another important data collection practice.

 

Yeshimabeit Milner (00:21:49):

Just to make sure no one was taking something and starting a revolt. The information systems developed were created with the intention of eclipsing the networks that allowed enslaved people to assert their strength in numbers, to become educated and informed, to organize, and to fight back. All of these examples, and many more (trust me, wait for the book), indicate the ways in which big data was born out of bondage. To emphasize my point from earlier: big data is not as new or as innovative as we often see it to be, but is really part of this long and pervasive history. We often say that no algorithm is neutral, that algorithms are opinions embedded in code, and this history reveals the extent to which that is true. And for those who don't know, I'm sure most people here do, but by definition an algorithm, very briefly, is a set of step-by-step instructions to solve a problem.

 

Yeshimabeit Milner (00:22:46):

One example that I love to use is that a recipe is an algorithm. It's a list of instructions to make a dish, ingredients that make up the dish, and a result, based on what we define from the beginning of the recipe as success: whether we want to focus on making something that is healthy, or something that tastes good regardless of health benefits. These decisions are determined by a question: what are we optimizing? Computational algorithms are layered and complicated. Their ingredients are not just the raw data that is fed into them, and the result is not as simple as the outputs that come out of them: scores, ratios, GPS routes, and Netflix recommendations. As this chart demonstrates, history and values are what influence the inputs and outputs, and most importantly the very models that are trained and developed, the algorithms themselves. One example that I love to use, especially when I'm talking to different groups of people, regardless of what kind of work they do, is FICO scores, right?

 

Yeshimabeit Milner (00:23:49):

You know, contrary to popular belief, FICO is not a federal agency. I hope you all know it's the Fair, Isaac and Company, a for-profit entity started by William Fair and Earl Isaac, a mathematician and an engineer, who 25 years ago sought to disrupt the risk and lending industry through the use of machine learning. The inputs to the FICO algorithm, as we are told, are the amount of debt we have, the percentage of missed payments, et cetera. And this information, our data, is provided through the collusion of the data brokers Equifax, Experian, and TransUnion. That is fed into this FICO algorithm, and we get our score. And while we are told that certain behaviors and financial characteristics make up our credit scores, we're never really able to verify that, because FICO is a proprietary algorithm, a black box devoid of transparency, with the purpose of displacing accountability away from these for-profit companies that profit from our data at will and at our expense. And FICO scores reflect the ways in which algorithms hold tremendous power over our lives.

 

Yeshimabeit Milner (00:24:59):

Right at this very moment, I know students who will have to drop out of school because they can't qualify for subsidized student loans; people struggling with rising public transportation costs, here and all over the country, because they can't afford subprime car payments or find affordable transportation; and even hardworking families right here in this city who are facing homelessness because they can't qualify to rent an apartment, all because of a three-digit number. And these three-digit numbers will increasingly decide whether we get hired for a job, and even, someday, if we do not resist, whether or not we have the right to attain citizenship or will be deported. And while it is a violation of federal law to deny people housing, employment, and education based on race, you can't sue an algorithm. And private companies like FICO, for whom, I must also mention, it is in their best interest to make sure that some people have low scores and some don't, argue that their algorithms don't discriminate.

 

Yeshimabeit Milner (00:25:59):

They say that nowhere in their algorithm or in their inputs is race a variable. But we know, based on the history of this country and how our neighborhoods and municipalities are organized, that you don't need to use race as a variable. Redlining and segregation, one of the many legacies of slavery, have made zip codes proxies for race. Only several generations after slavery, over 6 million African Americans left the South for the industrial centers of the North, Northeast, Midwest, and West Coast of this country, in what was known as the Great Black Migration. As black people contributed tremendously to the growth of the manufacturing industry and to the culture and politics of metropolitan life, our federal government responded with policies that sought to institutionalize racist sentiments and permanently entrench black communities in a caste-like status. Policies that were foregrounded in the most essential part of economic mobility: home ownership.

 

Yeshimabeit Milner (00:26:59):

In 1933, as part of the New Deal, the Home Owners' Loan Corporation developed a grading system that deemed some areas desirable and others hazardous. It did not matter that federal law had ruled racial zoning unconstitutional at that point. The creation of security maps, and the redlining of black communities, encouraged the practices of real estate boards, neighborhood associations, and white mob violence that made it impossible for black people to own homes. According to the historian N.D.B. Connolly, neighborhood grading during the 1930s was hardly a science, but the program's scientific trappings helped turn popular racial knowledge into real-world consequences. And this is a map of Miami, where I'm from. According to the NCRC, 74% of the areas that were deemed hazardous, given a D grade in 1933, remain the low-income, under-resourced, and neglected communities of today, and overpoliced, I might add.

 

Yeshimabeit Milner (00:28:03):

So zip codes, created in the 20th century to organize the country for the postal service, have actually also become artifacts of this history, extending the shelf life of the racist public policies of the past. And I hope you all are familiar with this amazing study by ProPublica, which looked at how zip codes were essential to the insurance industry's predatory practices: someone living in a commercial area of downtown Chicago, where there was actually more crime, and driving a sports car, which carries more risk in actuarial terms, was paying hundreds of dollars less in car insurance than families in the working-class Black and Brown communities of Chicago's South Side, although they had less expensive cars and crime was less prevalent. And if you compare the HOLC maps with the purple areas where prices are higher and more concentrated, the comparison is actually startling.

 

Yeshimabeit Milner (00:29:05):

Another example. There are so many examples of how zip codes are proxies for race, but I like to talk about zip codes in particular because zip code is one thing we often use in our research, and I invite folks to think more critically about it, because it really shows not just present discrimination but how that discrimination has played out historically. But beyond zip codes, there are other examples of how big data perpetuates racism without racists. It's no surprise that the risk assessments courts are doling out are giving longer sentences to black youth with no prior convictions than to white career criminals, as another ProPublica study, this one on the COMPAS algorithm, demonstrates. Companies like the maker of COMPAS use questionnaires to determine risk. Some of the questions on the COMPAS questionnaire that people fill out, which is fed into the algorithm, are: "How many of your friends and acquaintances have ever been arrested?" "Were you ever suspended or expelled from school?" "How often have you moved in the last 12 months?" Or, "Indicate how much you agree with this statement: a hungry person has a right to steal." So, I mean, my questions are: Who's incarcerated in this country? Who's disproportionately suspended? Who overwhelmingly faces housing insecurity and homelessness? According to this algorithm, I should be in jail, sentenced and serving time, and not actually giving this talk to you all today. So for this reason and many others, I really assert the call to abolish big data: to reject and dismantle the structures that concentrate the power of data into the hands of a few, and to put that power into the hands of the people who need it the most: the refugee, the activist, the most vulnerable, and the least powerful against the enormity of what we are facing.

 

Yeshimabeit Milner (00:31:04):

And given what we know now, I believe that this call to abolish big data is not simply a political choice; it's an ethical obligation, and it's an obligation that's grounded in a critical vision. And this vision is what guides our work.

 

Yeshimabeit Milner (00:31:22):

To reclaim data as protest, data as accountability, and data as collective action. And here are some examples, some brief examples, of what we've been doing. I'm going to talk a little bit more with Janet [Haven] about some specific programs. But when a coalition of teachers, students, parents, researchers, and organizers in the Twin Cities learned that the mayor's office, the sheriff's office, the foster care agency, and the school district were quietly planning a joint-powers data-sharing agreement to turn private student data and youth data into risk ratios, we helped them mobilize, right? I was called and asked to go to St. Paul in February, because people said, you know, Yeshi, our folks don't know what big data is. We don't understand predictive algorithms. But we know that when they say risk ratios, already in the context of incredible income disparities, and in the place where Philando Castile was murdered by the police with impunity, this could only mean something bad for our communities, not good.

 

Yeshimabeit Milner (00:32:25):

And you know, by organizing, and holding summits, and doing strategy, and also thinking about alternative policy, about ways to, instead of doing joint data-sharing agreements, actually listen to the demands that folks had been pushing for way before all this came down, they were able to successfully get the mayor and all these agencies to dissolve this plan, and they are currently working with decision makers to, again, change the material conditions for folks in that community. And when the news broke that the data of 2.1 billion users had been used as a weapon of political warfare in the 2016 election, we led a bold effort to hold Facebook to a new standard. It was simply not enough for us to make sure that Facebook never did this again. And I wrote an open letter to Facebook on behalf of the Data for Black Lives movement that laid out three demands.

 

Yeshimabeit Milner (00:33:23):

The first: to develop a code of ethics that researchers and staff at Facebook must uphold, in the absence of important accountability protections such as an IRB, right? Anywhere you go, at any research institution, you have to follow IRB standards. And when you go out to Facebook's research site, it's really amazing how they have actual peer-review-type journal articles without the accountability, the human subject consent, and all that other stuff that ensures there isn't any ethical violation happening. And for us, in the context of Henrietta Lacks and the Tuskegee syphilis experiments, I mean, all the stuff around the personality quiz was just too familiar, and that's why that was one of our main demands. The second was to hire more Black data scientists and researchers, and to create an environment for those folks to thrive and not be pushed out.

 

Yeshimabeit Milner (00:34:12):

And most importantly, the one that we're focused on now, currently, is to commit de-identified data to a public data trust. A data trust, by definition, is a structure whereby data is placed under the control of a board of trustees who have a fiduciary responsibility to look after the interests of the beneficiaries: in this case, folks in the Data for Black Lives movement, but also the larger Black community, 44.6 million Black people in this country, right? And using one offers us all a much better chance to say how our data is collected, how it's accessed, and how it's used. And there are some really interesting experiments happening right now around data trusts. Folks in the UK are experimenting with them. People at a group called AlgorithmWatch, which is based in Europe, are taking an approach to data trusts that involves creating a data donation portal where people can donate their YouTube and Facebook data for the purposes of building infrastructure for something like a data trust. All to collectively amass our data and, again, to have more control, decision-making power, and autonomy over how it's being used. And you know, for us, when we hear that these genetic testing companies are saying, "We need more Black people's data, so we're going to go on a full-on campaign to get folks to sign up for these 23andMe tests," so they can sell it to places like GlaxoSmithKline and other pharmaceutical companies, while we also have groups of people doing research on breast cancer, sickle cell, and other really important health research areas that are underfunded and under-resourced, these are the kinds of use cases we see for a data trust.

 

Yeshimabeit Milner (00:36:02):

And of course, you know, when I made this demand for Facebook to hand over their data, our data, for public use, I knew that wouldn't happen, at least not immediately. Our data, the data they've amassed from 2.1 billion users, is their bread and butter, right? That's how they make money. That is how Facebook is a free platform. But when I made this demand, I wanted to push into the public imagination this idea of what is not only possible, but what we should be expecting of these big tech companies, and the fact that even though they have massive, extreme, immense amounts of capital, we also have power, and we have to exert it. And so in May 2020, this upcoming May, we're actually going to be convening people to really build out what our data trust will look like, thinking more deeply about data governance, and also creating a blueprint for how this data is going to be collected and how we're going to be using it amongst our network.

 

Yeshimabeit Milner (00:37:09):

And, you know, what's the vision for this blueprint? What's the whole point of this? I think the data trust is a very important idea, but to what ends, right? And I think for us, it's to advance the work of the 4,000 people in our network, right? To give them the tools, the data, the relationships, and the collective efficacy to make data a tool for social change right where they live. Another program that we launched this year is our hubs program. We had piloted this program last year in Washington, D.C. with a group of technologists, activists, and folks who had been meeting regularly, and we decided that now was the time, given our new staff, and hiring, and new infrastructure, that we could expand. And we got applications from all of these places and more. We're not starting with all of these places, but it just shows you how many people are really, really eager to create the kind of space that happens at our conference, but to bring it locally, where they live. And you know, these hubs will serve as launchpads for distributed organizing, advocacy initiatives, and most importantly, deep leadership development: the training of leaders with the technical skills, as well as the vision and empathy, to make real change right where they live. And finally, the Data for Black Lives Conference is the heartbeat of the movement that we're building. It's not only a laboratory for new ideas around the use of data for social change, but also a place for experimenting with new ways to build relationships, by dissolving silos and building points of connection to target and dismantle the proliferation of bias in research and technology.

 

Yeshimabeit Milner (00:38:48):

And this is happening on December 11, 2020, at the MIT Media Lab. Yes, we announced it last week, so registration is opening in August, September. So look out for that. But anyway, to conclude, I want to wrap this up. I leave you all with the question: What are we optimizing? A future where the injustices of the past are automated and reinforced? A past defined by slavery, dehumanization, greed, violence, and control? Or a future vested in justice, fairness, solidarity, a world where all people's needs are met? Professor Dylan Rodriguez defines abolition as a dream toward futurity, vested in insurgent, counter-civilizational histories, genealogies of collective genius that perform liberation under conditions of duress. Abolition is a process, not an end goal. It is a rejection of prisons as the answer to the world's most pressing social problems. And the process of abolition begins in our minds, in our organizations, our academic institutions. It is a new way of understanding the world. According to the Critical Resistance handbook, abolitionist steps are about gaining ground in a constant effort to radically transform society.

 

Yeshimabeit Milner (00:40:05):

They're about chipping away at oppressive institutions rather than helping them live longer. They're about pushing critical consciousness, gaining more resources, building larger coalitions, and developing more skills. And I believe that abolition is against certainty. Abolition is against permanence: the permanence of the prison cell, the guard tower, the weapon, and its factory. Abolition is about asserting life in a system that demands death, casualties, human bodies as its tribute. Abolition is for us right now, while simultaneously being for generations to come, people we may never meet or see, but whom we must will into existence, just as our ancestors willed us. And I am confident that a new world is possible, a world that we can begin building today, right here and right now. And I invite all of you to join this effort, and I really hope that you can all be a part of our movement and what we're building. Thank you so much.

 

Janet Haven (00:41:06):

That was a fantastic call to action. And I would like to go back to a point you made in the middle of your talk about history and values, and how important it is to understand data-driven technologies as informed by history and values: by when data is collected, where it's collected, who it's collected about, and who is left out. And I just wanted to give you a moment to talk a little bit more about that point, because I think it's so central and such a powerful thread through all of your work.

 

Yeshimabeit Milner (00:41:45):

Thank you. That's a great question. I think, you know, when I started Data for Black Lives, so much of my work was actually going back and reading history. I mean, we went to Amsterdam and went back into archives and learned more about the evolution of big data, from colonialism to what we see now. But I think, especially in the context of the United States of America, and in the example of the zip code, right? We really see how something that was created to organize the country for the postal service has become a way to codify really deep, really persistent, and often violent histories surrounding housing and exclusion, and is actually used in some places to even define who's what race, right?

 

Yeshimabeit Milner (00:42:36):

When I went to Miami, I did a similar talk in neighborhoods where there were foreign-born people, and according to what the history said, people who were Cuban were defined as either Black or white depending on what neighborhood they lived in, regardless of their complexion or their lineage or whatever, and that has persisted to this day. And I think there are a lot of current examples of how that also plays out. We see so many algorithms and automated decision-making processes that, you know, a lot of researchers and scientists would have believed were neutral, that they're devoid of history. But I always, always invite people to take a step back and to say: What's missing here? What other questions do I need to ask? And what is the underlying history that is being reinforced through these algorithms?

 

Yeshimabeit Milner (00:43:33):

And again, just because the technology is new doesn't mean that it's innovative, and it definitely doesn't mean that it's beneficial to society. And I think that's also the case with facial recognition. I mean, the examples go on and on and on. But that's why, when I do these presentations, I use that chart we made at one of our early retreats, where in addition to the inputs and outputs, there's this big history and this question, again, of what are we optimizing? Right? That's such an important part of the data science process, and I think it must be an important part of our everyday practices, whether we're activists, scientists, whoever.

 

Janet Haven (00:44:10):

Thank you. So I want to go back to the call for action that Data for Black Lives put out in 2018 following the Cambridge Analytica scandal, which you talked about a bit here, where you asked Facebook to do several things. And one of those things, as you talked about, was to commit to a public data trust. And I think that, as is often the case, Data for Black Lives is out in front: that conversation around the creation of data trusts has spread to a lot of different communities. And yet it's still a really difficult concept, I think, for people to get their heads around: the governance of it, how it would work in practice, exactly how people would use it. And so I was excited to hear that you're planning an event in May to talk this through in really concrete terms. So I wondered if you could preview that May event for us a little bit, and talk about what you think is needed to put that idea into practice. What actually needs to happen, in a practical sense? What are you looking for?

 

Yeshimabeit Milner (00:45:17):

Yeah, that's a great question. We've done so much internal study as a team to look at existing examples, knowing that the data trust that we're envisioning, the one that we need for this context, is going to be similar but different in a lot of ways. And there is that one piece around the technical aspect. Logistically, what does that actually look like? Where would the data live? What would the actual contracts, the trust law, look like? So much of data trust law right now is based on, like, 14th-century trust law, which, if you're interested in that, please go ahead and look it up. But, you know, obviously for us, we're conscious of history, of what we're building on, and of how we're thinking about our work moving forward. But yeah, I think in addition to that technical and logistical aspect, for us it's: what is the advocacy aspect, right?

 

Yeshimabeit Milner (00:46:06):

What part of the data trust needs to be about the political action, the advocacy, and the self-determination piece, right? You know, we imagine, let's say: what would it look like, for example, for us to replicate what folks in Germany did with what was called OpenSCHUFA? SCHUFA is the name of their kind of FICO credit score algorithm, and they literally asked people from across Germany to submit their credit reports for the purpose of amassing them into one big dataset and then pulling findings from it to better understand discrimination. Unfortunately, the dataset they got was quite homogeneous, and they weren't able to recruit enough people of color to really, totally understand it better. But for me, thinking as an organizer: What would it mean to go door to door and talk to folks in communities and say, hey, this is what we're working on. This is what a FICO score is. I'm sure you know about it, because you're impacted by it. Help us amass this big dataset so we can make better decisions, right? Or it could look like this other advocacy piece, which is making demands of companies like Facebook or Google or 23andMe, these big multinational corporations that have amassed large amounts of data and have been using it for not-so-positive purposes, all the way to the point of, you know, Facebook totally weaponizing that data to change the entire political landscape and create the one that we are currently sitting in. So, you know, there are different aspects, and that's why we're bringing together people who are really on the front lines of thinking about the legal and technical questions (how do we actually make something de-identified? Is it possible to anonymize data?), along with folks who are working in health and activism, who will really be the ones using this data.
So stay tuned: at our conference in December, we're going to have a whole panel talking about data governance and exploring what that means to us. And I mean, even just to take a step back, you know, we had a whole conversation about what governance means, right? And for me, this last year I got a chance to spend a lot of time going to Chiapas, Mexico, and learning from the Zapatista communities, which I learned about when I was growing up in high school, learning how to be an organizer. In 1994, a group of indigenous communities came together and protested NAFTA, the agreement that kind of pushed forth this whole era of neoliberalism.

 

Yeshimabeit Milner (00:48:38):

It was as if they knew what was coming. They could foresee what was coming, and how this worldwide agreement would totally change their lives and the lives of folks all over the world. And for them, the response was to create autonomous communities, which are still in existence today. You know, so maybe it's not possible in the U.S. context to do something exactly like that, but how do we think differently about governance? How do we think critically about self-determination, and what does that actually look like? I mean, is that possible? I think so.

 

Janet Haven (00:49:08):

I'm going to ask one more question, building off of what you're talking about: the idea of governance, and the different mechanisms for governance. So going back to the call for action in 2018, Data for Black Lives also called for, again out in front, a data code of ethics. And I think since 2018 we've seen that idea picking up and gaining momentum. We've seen normalization of it, but we've also seen pushback on the idea that something like a code of ethics can deliver just outcomes for communities. And so I'm curious where you are in thinking about codes of ethics as a powerful governance mechanism: what they can do, and what their limitations are.

 

Yeshimabeit Milner (00:49:53):

Yeah, I think that's a good question. You know, there have been lots of conversations about ethics, as you said, and even about who these efforts are being funded by, right? We have lots of companies who have been putting forward their ethics statements and their guidelines. And I think for us, the way we first practiced this idea of building a code of ethics was at our last conference, where we actually created an installation. We first asked things like: How do we actually build a code of ethics? Would it make sense to rent out a room and have people come together and create a whole, like, consensus process? But I said, okay, at our next conference let's have an installation where we have people write and reflect on what it means for them, what ethics means for them, but also what they commit to and what they demand.

 

Yeshimabeit Milner (00:50:45):

And so we built a cube, and maybe, for folks who came, you remember this. On the outside of the cube was the phrase "we demand," and on the inside was "we commit," and people wrote out what they demanded and what they committed to. And I think that was a really important practice for us, again, to build an understanding within ourselves of what we as a community are holding as demands and what we're committing to. I think for us, in terms of these larger conversations around codes of ethics: yeah, I would agree. You know, that's a cog in the wheel of a larger strategy of holding these big companies accountable. It has to be paired with movement building to make sure that it's actually being fulfilled, enforced, and materialized. Otherwise it's unfortunately just a piece of paper, or some values and principles that don't really matter.

 

Yeshimabeit Milner (00:51:43):

So we'd like to open it up to everyone to ask questions and we can bring you a mic. So if you have a question, please raise your hand and we will bring you the mic.

 

Audience member (00:51:56):

Hi. Great talk, I really enjoyed it. I had a question: you were talking a lot about data governance, and sort of who's in control of the data, and how the data is stored and manipulated and organized. And I'm curious if you also have something to say about when collecting certain forms or types of data can actually just be fundamentally harmful, or when data needs to be deleted. So I'm curious if you all have thoughts about that, especially as it connects to Data for Black Lives: how collecting certain types of data could just fundamentally be harmful, and how that conversation changes in that context.

 

Yeshimabeit Milner (00:52:35):

Yeah, I mean, there's some data that should not exist, right? I think, you know, when I say that for far too long data has been weaponized against Black communities, I also mean surveillance apparatuses, right? I had (should I talk about this?) a whole conversation with Manhattan's DA about this open data portal they were trying to make live. And I was like, well, you know, maybe you shouldn't make all of this data available, because private companies, for example, instead of just advocacy groups loosely defined, will go onto these open databases and find a way to create apps or do whatever and, you know, make money off of them, which will be double jeopardy for the people who have already faced incarceration or whatever based on this data.

 

Yeshimabeit Milner (00:53:23):

So, you know, I think there are a lot of different cases, right? And I think for us, obviously, when we're thinking about a data trust, the focus is making demands of companies who are already using data, and being sure about what kind of data we want to use. And how we make those decisions is by asking what kinds of things, what research initiatives and ideas and questions, people in our network are really concerned about. I feel like, obviously, this question around genetic data is a very important one, but you know, I'd rather have my data be used to address sickle cell anemia, Black infant mortality, and breast cancer, led by a group of Black women researchers who have been doing this work for a really long time, who are committed to the community, who have roots in the community, than by a pharmaceutical company like GlaxoSmithKline.

 

Yeshimabeit Milner (00:54:24):

Right? So I think these are the ethical questions that we're really facing. But I think, again, for us the focus is: how do we give people the tools, whether it's young people, like in my experience, to collect their own data, to be in charge of it, to know how to use it, to know how to protect it, for the purposes of their communities? There are so many different ways that data can be weaponized. And I think as we push forward in this governance conversation, that's a question we'll continue to grapple with.

 

Audience member (00:54:58):

How would you change the teaching of math and statistics in public schools in order to incorporate the message of Abolish Big Data?

 

Yeshimabeit Milner (00:55:10):

Really good question. You're going to have to wait for my book. I'm kidding. No, I mean, look, I became really good at data science by having to fight for my life, right? By having to collect data to uplift, you know, the extreme civil and human rights abuses that were happening in my school. That's how I became really good at this, right? Yes, I went to Brown and I learned more about it. I was in an IB program, and I was doing advanced calculus. That was great, but I didn't really understand it until I was using it for a purpose that was benefiting me, but most importantly, other people in my community. Right? We've had conversations with my co-founder, Lucas, who's getting his PhD in math, about how we should change calculus education.

 

Yeshimabeit Milner (00:55:53):

Right. I think, you know, we had a whole panel at our last conference about math education, and if you look at the data on math education in this country, and you also look at the history behind it, there's been a lot of, again, weaponization of curriculum, there's been a lot of exclusion, but also tons of opportunity to do really, really important social justice work with young people. But yeah, I have a lot of ideas around that. I would have to say, first and foremost: making sure that we're not suspending students for no reason, that we're creating schools with restorative practices and non-exclusionary disciplinary policies, where there are opportunities for everybody to learn, where we're funding schools, not shutting them down. And also finding curriculum that is beyond culturally responsive, that is empowering people with this knowledge, versus making folks feel like: this is math, it's hard, only a small cross-section of society is good at it.

 

Audience member (00:57:00):

My question is around what you talked about in terms of value. I think it was really interesting that you brought up the point around the historical context and value of data, how we're looking at that, and the impacts on the oppressed and marginalized. And my question is around equity, in the interim of this path that we have toward abolition: whether or not you think there's space for a shift of ownership when it comes to data, when it comes to the profit around data. And what do you believe is necessary for our communities to know, to be able to champion that type of equity and ownership?

 

Yeshimabeit Milner (00:57:40):

Yeah, I think that's a great question. You know, with the data trust idea, people have been saying that's a good ownership avenue, because some examples of data trusts involve people actually pooling their data and being able to profit, or maybe pooling their individual big datasets and being able to profit. I don't think that's the same as what you're asking in terms of ownership. I think, for us, ownership would look like being able to make sure that the data that is about us is actually being used by us, and also for the purposes of the issues, contexts, problems, and solutions that surround our lives. Right? And when I say us, I mean Black communities. People have been doing really interesting experiments around data cooperatives, and I think that is one avenue and one solution that could also work as well.

 

Yeshimabeit Milner (00:58:33):

Data trusts, data cooperatives. And I think there's this larger question around education, and how we get people from one place to another. I think about this a lot, because when we say "making data a tool for social change instead of a weapon of political oppression," that's a cultural change. That's a consciousness shift, right? And I've experienced it in so many spaces, especially, you know, with folks in Black communities, people who don't know what an algorithm is, who don't know what predictive policing or a predictive algorithm is, or what big data is. But they understand FICO, because it's had such an important influence in their lives, right? They understand COMPAS risk assessment scores, because inherently they've been impacted by them. I've had experiences going to Capitol Hill and explaining something like the FICO algorithm to senators, and they understand it, and, you know, I explain it the same way I would to my grandma.

 

Yeshimabeit Milner (00:59:28):

And my grandma gets it more, because she's so much closer and so much more directly impacted. And I think for us, we really are so focused on how we talk about these issues in a way that is obviously accessible, that's obviously inclusive, but most importantly empowering, and that is pushing people's imaginations, right? That's why, even though we're not totally sure what a data trust looks like, we're going to push these demands. Right? Because that is what takes people from "Oh, my data's being taken and used in an election" all the way to "Okay, wow, this is how we can actually flip the script and do something totally different and make this about empowerment, about autonomy and self-determination." I lost my train of thought about what else I was going to say, but yeah, you know, this is why we do this work, and this is why public education is a main principle at Data for Black Lives: it's really about using this datafication of our society to make bold demands, maybe not totally new demands, but bold demands for racial justice. Right? So many of the things we are asking for now are not new, but we have an opportunity now, with the exposure of these problems, to look at them in a new way.

Thank you, everybody.