Self-regulation in Sensor Society

Speaker: Natasha Schüll
Date recorded: Mar 31, 2016
In the story of self-tracking technology and its increasing automation, a certain ambivalence over the terms of contemporary selfhood comes to the fore.

Natasha Schüll – From the NSA scandal to Facebook’s controversial “mood experiment,” the past decade has seen heated debate over the ways that governments and corporations collect data on citizens and consumers, the ends to which they use it, and the threat this poses to civil liberties. Yet even as this discussion over surveillant monitoring unfolds, the public has embraced practices and products of self-tracking, applying sensor-laden patches, wristbands, and pendants to their own bodies.

Drawing on ethnographic fieldwork, this talk explores how mainstream self-tracking technologies – in their design, marketing, and use – increasingly part ways with the ethos of intensive self-attention found within the Quantified Self (QS) community, serving as digital compasses to guide consumers through the confounding, tempting, and sometimes toxic landscape of everyday choice making and lifestyle management (for instance, by regulating the micro-rhythms of their bites, steps, sips, and breaths). By offering them a way to fulfill the cultural demand for self-management while delegating the often tedious, sometimes existentially taxing labor involved in meeting that demand, such devices at once exemplify and short-circuit ideals of individual agency and responsibility.

In the story of self-tracking technology and its increasing automation, a certain ambivalence over the terms of contemporary selfhood comes to the fore. Are there any connections to be drawn between this ambivalence and broader debates over governmental and corporate surveillance, data privacy, and the possibility for resistance?

Living in a Culture of Algorithms

Speaker: danah boyd
Date recorded: Jun 27, 2016
danah boyd examines the problematic implications of using algorithms designed for one problem to address societal issues without accounting for unintended consequences.

danah boyd weaves together her work on youth, privacy, and data-driven technologies to examine the complicated social and cultural dynamics underpinning social media, the messiness of “big data,” and the problematic implications of using algorithms designed for one problem to address societal issues without accounting for unintended consequences.

The Future of Municipal Open Data, Smart Cities, and Civic Technology

Speaker: Noel Hidalgo
Date recorded: Jul 28, 2016
Lessons learned from working within the City’s civic technology community and designing 21st century systems.

Noel Hidalgo will journey through two fellowships — his Data & Society Fellowship and the construction of a new fellowship for 21st century civic hackers. The first half of the discussion will focus on detailed lessons learned from working within the City’s civic technology community, collaborating with CUNY’s Service Corps students, building a municipal open data curriculum, and developing partnerships with the Mayor’s Office, Manhattan Borough President, and various City agencies.

Security and Privacy in a Hyper-connected World

Speaker: Bruce Schneier
Date recorded: Dec 7, 2016
Securing systems where information technology permeates our economies, social interactions, and intimate selves.

Bruce Schneier describes how we have created a world where information technology permeates our economies, social interactions, and intimate selves. The combination of mobile, cloud computing, the Internet of Things, persistent computing, and autonomy is resulting in something altogether different — a world-sized web. This World-Sized Web promises great benefits, but it is also vulnerable to a host of new threats from users, criminals, corporations, and governments. These threats can now result in physical damage and even death.

In this talk, Schneier will look back at what we have learned from past attempts to secure these systems. He will also push us forward to consider seriously what technologies, laws, regulations, economic incentives, and social norms we will need to secure them in the future.

Weapons of Math Destruction

Speaker: Cathy O'Neil
Date recorded: Oct 26, 2016
As algorithms increasingly mediate education, employment, consumer credit, and the criminal justice system, how do we measure their impact on our society?

Tracing her experiences as a mathematician and data scientist working in academia, finance, and advertising, Cathy O’Neil will walk us through what she has learned about the pervasive, opaque, and unaccountable mathematical models that regulate our lives, micromanage our economy, and shape our behavior. Cathy will examine how statistical models often pose as neutral mathematical tools, lending a veneer of objectivity to decisions that can severely harm people at critical life moments.

Cathy will also share her concerns about how these models are trained, optimized, and operated at scale in ways she deems arbitrary and statistically unsound, and that can lead to pernicious feedback loops that reinforce and magnify inequality in our society rather than rooting it out. She will also suggest solutions and possibilities for building mathematical models that could lead to greater fairness and less harm and suffering.

Understanding Patterns of Mass Violence with Data and Statistics

Speaker: Patrick Ball
Date recorded: Mar 24, 2016
An exploration of how, contrary to the standard assumption, statistical patterns in raw data tend to be quite different from patterns in the world.

Patrick Ball discusses how data about mass violence can seem to offer insights into patterns: is violence getting better or worse over time? Is violence directed more against men or women? However, in human rights data collection, we (usually) don’t know what we don’t know — and worse, what we don’t know may be systematically different from what we do know.

This talk will explore the assumption that nearly every project using data must make: that the data are representative of reality in the world. We will explore how, contrary to the standard assumption, statistical patterns in raw data tend to be quite different from patterns in the world. Statistical patterns in data reflect how the data were collected rather than changes in the real-world phenomena the data purport to represent.

Using analysis of killings in Iraq, homicides committed by police in the US, killings in the conflict in Syria, and homicides in Colombia, we will contrast patterns in raw data with estimated total patterns of violence. The talk will show how biases in raw data can be corrected through estimation, and explain why it matters in these countries, and more generally.
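
To make the gap between documented cases and estimated totals concrete, below is a minimal sketch of two-list capture-recapture (Lincoln-Petersen) estimation, one simple building block of the multiple systems estimation approach used in this field. The figures, function name, and two-source setup are illustrative assumptions, not drawn from the talk, and real estimates rely on more lists and on modeling dependence between them.

```python
# Minimal sketch: two-list capture-recapture (Lincoln-Petersen) estimation.
# Illustrative only: the figures below are hypothetical, and real multiple
# systems estimation uses more than two lists and models list dependence.

def lincoln_petersen(n1: int, n2: int, overlap: int) -> float:
    """Estimate the total population size from two overlapping lists.

    n1      -- victims documented by source A
    n2      -- victims documented by source B
    overlap -- victims appearing on both lists
    Assumes the two lists are independent and the population is closed.
    """
    if overlap == 0:
        raise ValueError("No overlap between lists: estimator is undefined.")
    return n1 * n2 / overlap

# Hypothetical example: two organizations each document part of the same violence.
documented_a, documented_b, both = 400, 300, 60
estimated_total = lincoln_petersen(documented_a, documented_b, both)
print(f"Documented (union of lists): {documented_a + documented_b - both}")
print(f"Estimated total:             {estimated_total:.0f}")
```

In this hypothetical, the two sources together document 640 victims while the estimate suggests roughly 2,000, illustrating how patterns in raw data can understate, and differently distort, the underlying violence.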

On Digital Passageways and Borders

Speaker: Mark Latonero & Paula Kift
Date recorded: May 12, 2016
Mark Latonero and Paula Kift discuss how a new digital infrastructure 1) facilitates and constrains the flow of data and people, 2) conceals and constructs identity and status, and 3) affects refugees’ fundamental rights to privacy.

Mark Latonero and Paula Kift on digital passageways and borders in the movement of refugees. Numerous media reports have highlighted that refugees now increasingly rely on digital devices such as smartphones in order to traverse their perilous routes, contact lost family members, or find safe places before dark. But claims that “a smartphone” may be “the most important” tool for Syrian refugees miss the bigger picture. Phones, social media, mobile apps, online maps, instant messaging, translation websites, wire money transfers, cell phone charging stations, and Wi-Fi hotspots have all created a new digital infrastructure for global movement. This infrastructure is as critical to refugees today as roads or railways. But digital infrastructures for movement can just as easily be turned into infrastructures for control by governments, corporations, and even criminals. Indeed, governments are increasingly experimenting with similar digital technologies to reinforce their border controls—to collect, process, and instrumentalize data in order to interfere with the movement of “undesirable” migrants.

Mark and Paula will explore these tensions and discuss how this new digital infrastructure 1) facilitates and constrains the flow of data and people, 2) conceals and constructs identity and status, and 3) affects refugees’ fundamental rights to privacy, data protection, and asylum.

Ebola and the Law of Disaster Experimentation

Speaker: Sean McDonald
Date recorded: Jul 7, 2016
As an increasing number of industries digitize, the economy around data analysis – particularly predictive modeling – has exploded. The problem is, we don’t have any real way to understand, analyze, or predict the accuracy of these predictive models.

Sean McDonald on Ebola and the Law of Disaster Experimentation. As an increasing number of industries digitize, the economy around data analysis – particularly predictive modeling – has exploded. The problem is, we don’t have any real way to understand, analyze, or predict the accuracy of these predictive models. There is no context where this has higher potential – for good and harm – than humanitarian emergencies.

One of the first, and worst, examples of this was the 2014 Ebola epidemic in West Africa. In its response to the escalating crisis, the humanitarian community sought out significant amounts of sensitive mobile data, epidemiological data models, and digital engagement tools, without understanding the impact these would have on the response effort. Whether that’s considered humanitarian innovation or disaster experimentation, there’s little question that it raises significant legal, ethical, and practical questions.

This talk will focus on the intersection of the public interest, the law, and the digital approaches that are increasingly defining the way that we invest public resources and provide public services. We’ll talk about the Ebola case, the trends in public sector digitization, and what that means for the practical and legal protections of vulnerable groups.

Balancing Privacy Obligations and Research Aims in a Learning Health Care System

Speaker: Frank Pasquale
Date recorded: May 7, 2015
Health information technology can save lives, cut costs, and expand access to care. But its full promise will only be realized if policymakers broker a “grand bargain” between providers, patients, and administrative agencies.

Health information technology can save lives, cut costs, and expand access to care. But its full promise will only be realized if policymakers broker a “grand bargain” between providers, patients, and administrative agencies. In exchange for subsidizing systems designed to protect intellectual property and secure personally identifiable information, health regulators should have full access to key data those systems collect (once properly anonymized). Moreover, patients deserve to be able to channel certain information flows and gain some basic controls over the presentation, disclosure, and redisclosure of sensitive information. This podcast will describe and examine some legal and technical infrastructure designed to help realize these goals.

Genetic Coercion

Speaker: Ifeoma Ajunwa
Date recorded: Jun 11, 2015
Although we cannot disclaim the utility of genetic data, it is important to consider whether we are being socially and governmentally coerced to relinquish our genetic data.

Ifeoma Ajunwa on genetic coercion. Although we cannot disclaim the utility of genetic data, it is important to consider whether we are being socially and governmentally coerced to relinquish our genetic data. If so, what does this mean for privacy and discrimination? What are the obstacles and potential solutions to securing genetic data?
