On April 4, 2019, Data & Society Executive Director Janet Haven and Postdoctoral Scholar Andrew Selbst testified before the New York City Council’s Committee on Technology about the Open Algorithms Law (Local Law 49 of 2018). They called for oversight of the Automated Decision Systems Task Force to ensure access to “details of ADS systems in use by specific agencies” and a public engagement process. 

Others who testified include Task Force members Solon Barocas and Julia Stoyanovich, and BetaNYC Executive Director Noel Hidalgo. A video of the full hearing is also available.

Please find Janet Haven and Andrew Selbst’s written testimony below. 


Our names are Janet Haven and Andrew D. Selbst. We are the executive director and a postdoctoral scholar at the Data & Society Research Institute, an independent non-profit research center dedicated to studying the social and cultural impacts of data-driven and automated technologies. Over the past five years, Data & Society has focused on the social and legal impacts of automated decision-making and artificial intelligence, publishing research and advising policymakers and industry actors on issues such as algorithmic bias, explainability, transparency, and accountability more generally.

Government services and operations play a crucial role in the lives of New York City’s citizens. Transparency and accountability in a government’s use of automated decision-making systems matter. Across the country, automated decision-making systems based on nonpublic data sources and algorithmic models currently inform decision-making on policing, criminal justice, housing, child welfare, educational opportunities, and myriad other fundamental issues.

This Task Force was set up to begin the hard work of building transparent and accountable processes to ensure that the use of such systems in New York City is geared toward just outcomes, rather than merely the most efficient ones. The adoption of such systems requires a reevaluation of current approaches to due process and the adoption of appropriate safeguards. It may require entirely new approaches to accountability when the city uses automated systems, as many such systems, through their very design, can obscure or conceal policy or decision-making processes.

We at Data & Society lauded the decision to establish a Task Force focused on developing a better understanding of these issues. Indeed, we celebrated the city leadership’s prescience in being the first government in the nation to establish a much-needed evidence base regarding the inherent complexity accompanying ADS adoption across multiple departments. Yet we have seen little evidence that the Task Force is living up to its potential. New York has a tremendous opportunity to lead the country in defining these new public safeguards, but time is growing short to deliver on the promise of this body.

We want to make two main points in our testimony today.

First, for the Task Force to complete its mandate in any meaningful sense, it must have access to the details of ADS systems in use by specific agencies and the ability to work closely with representatives from across agencies using ADS. We urge that Task Force members be given immediate access to specific, agency-level automated decision-making systems currently in use, as well as to the leadership in those departments, and to others with insight into the design and use of these systems.

Social context is essential to defining fair and just outcomes.[1] The city is understood to be using ADS in such diverse contexts as housing, education, child services, and criminal justice. The very idea of a fair or just outcome is impossible to define or debate without reference to the social context. Understanding the different value tradeoffs in decisions about pretrial risk assessments tells you nothing whatsoever about school choice. What is fair, just, or accountable in public housing policy says nothing about what is fair, just, and accountable in child services. This ability to address technological systems within the social context where they are used is what makes the ADS Task Force so important, and potentially so powerful in defining real accountability measures.

The legislative mandate itself also demonstrates why the Task Force requires access to agency technologies. Under the enacting law, the purpose of the Task Force is to make recommendations particular to the City’s agencies.[2] Specifically, the Task Force must make recommendations for procedures by which explanations of the decisions can be requested, biases can be detected, harms from biases can be redressed, the public can assess the ADS, and the systems and data can be archived.[3] Each of these recommendations applies not to automated decision systems generally, but to “agency automated decision systems,” a term defined separately in the text of the law.[4] Importantly, the law also mandates that the Task Force make recommendations about “[c]riteria for identifying which agency automated decision systems” should be subject to these procedures.[5] Thus, the legislative mandate makes clear that for the Task Force to do its work, it will require access to the technologies that city agencies currently use or plan to use, as well as the people in charge of their operation. Lacking this level of detail on actual agency-level use of automated decision-making systems, the recommendations can only be generic. Such generic recommendations will be ineffective because they will not be informative enough for the city to act on.

If the city wanted to find generic guidelines or recommendations for ADSs, it could have looked to existing scholarship on these issues instead of forming a Task Force. Indeed, there is an entire interdisciplinary field of scholarship that has emerged in the last several years, dedicated to the issues of Fairness, Accountability and Transparency (FAT*) in automated systems.[6] This field has made significant strides in coming up with mathematical definitions for fairness that computers can parse, and creating myriad potential methods for bias reduction in automated systems.
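To make concrete what a machine-parseable fairness definition can look like, here is a minimal sketch of one widely studied criterion, demographic parity, which compares the rates at which a system issues favorable decisions across groups. The sketch is purely illustrative: the function, data, and group labels below are hypothetical and are not drawn from any city system or from the cited scholarship.

    # Minimal illustrative sketch (Python) of demographic parity, one of many
    # formal fairness definitions from the FAT* literature. All names and data
    # here are hypothetical.
    def demographic_parity_gap(decisions, groups):
        """Largest difference in favorable-decision rates between groups.

        decisions: list of 0/1 outcomes from an automated system
        groups: list of group labels, aligned with decisions
        """
        rates = []
        for g in set(groups):
            outcomes = [d for d, label in zip(decisions, groups) if label == g]
            rates.append(sum(outcomes) / len(outcomes))
        return max(rates) - min(rates)  # 0.0 means equal rates across groups

    # A system approving 75% of group A but only 25% of group B has a gap of 0.5.
    decisions = [1, 1, 1, 0, 1, 0, 0, 0]
    groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
    print(demographic_parity_gap(decisions, groups))  # prints 0.5

Choosing among such definitions, and deciding how large a gap is tolerable in a given setting, is precisely the kind of judgment that depends on social context.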

But the academic work has fundamental limitations. Much of the research is, by necessity or due to limited access, based on small hypothetical scenarios—toy problems—rather than real-world applications of machine learning technology.[7] This work is accomplished, as is characteristic of theoretical modeling, by stating assumptions about the world and datasets that are being used. In order to translate these solutions to the real world, researchers would have to know whether the datasets and other assumptions match the real-world scenarios.

Using information from city agencies, the Task Force has the ability to advance beyond the academic focus on toy problems devoid of social context and assess particular issues for systems used in practice. Without information about the systems in use, the Task Force’s recommendations will be limited to procedures at the greatest level of generality—things we already would guess, such as testing the system for bias or keeping it simple enough to be explainable. But with information about these systems, the Task Force can examine the particular challenges and tradeoffs at issue. With community input and guidance, it can assess the appropriateness of different definitions of bias in a given context, and debate trade-offs between accuracy and explainability in specific social environments. The recommendations of the Task Force will only be useful if they are concrete and actionable, and that can only be achieved if the Task Force is allowed to examine the way ADS operate in practice, with a view into both the technical and the social systems informing outcomes.

Second, we urge the Task Force to prioritize public engagement. Because social context is essential to defining fair and just outcomes, meaningful engagement with community stakeholders is fundamental to this process. Once the Task Force has access to detailed information about ADS systems in use, public listening sessions must be held to understand community experiences and concerns, with the goal of using that feedback to shape the Task Force’s process going forward. Iterating on and reviewing recommendations with community stakeholders as the work moves forward will be important to arriving at truly transparent, accountable, and just outcomes.

We are here today because we continue to believe the Task Force has great potential. We strongly believe that the Task Force’s work needs to be undertaken thoughtfully and contextually, centering on cooperation, transparency, and public engagement. The Task Force’s goal should be to offer actionable and concrete recommendations on the use of ADS in New York City government. We hope that the above testimony provides useful suggestions to move toward that goal.

Thank you.


[1] See generally Andrew D. Selbst et al., Fairness and Abstraction in Sociotechnical Systems, Proceedings of the 2019 ACM Conference on Fairness, Accountability, and Transparency (ACM FAT*), 59.

[2] See Local Law No. 49 of 2018, Council Int. No. 1696-A of 2017 [hereinafter Local Law 49] (repeatedly referring to “agency automated decision systems”).

[3] Id. §§ 3(b)–(f).

[4] Id. § 1(a).

[5] Id. § 3(a) (emphasis added).

[6] ACM Conference on Fairness, Accountability, and Transparency (ACM FAT*), https://fatconference.org/

[7] See generally Selbst et al., supra note 1, at 59.


testimony | 08.11.17

Data & Society, Fifteen Scholars File Amicus Brief in Pending SCOTUS Case

Marcia Hofmann, Kendra Albert, Andrew D. Selbst

On August 11, 2017, Data & Society and fifteen individual scholars—including danah boyd, Julia Ticona, and Amanda Lenhart—filed an amicus brief in a pending U.S. Supreme Court case, Carpenter v. United States. The parties were represented by Andrew Selbst of Data & Society, and Marcia Hofmann and Kendra Albert of Zeitgeist Law.

The case implicates the Fourth Amendment’s “third party doctrine,” which holds that people who “voluntarily convey” information to third parties have no reasonable expectation of privacy in it. As a result, when police obtain records from a third party, the Fourth Amendment is not currently implicated.

Timothy Carpenter was convicted of a string of armed robberies based in part on cell site location data that placed him in the vicinity of the crimes. The case concerns whether the warrantless search and seizure of Carpenter’s historical cellphone records, which revealed his location and movements over the course of 127 days, is lawful under the Fourth Amendment.

In the brief, we argue that the “third party doctrine” should not apply to cell site location information because cell phone ownership is not meaningfully voluntary in modern society. Cell site location information contains abundant information about people’s lives, and unfettered police access to it poses a threat to privacy rights.

Aided by scholarship and statistics from the Data & Society research team, we provide evidence that the 95% of Americans who own cell phones cannot reasonably be expected to give them up to avoid police searches. The research shows that cell phones are:

  1. Necessary to participate in the most basic aspects of social and family life;
  2. Essential public safety infrastructure and personal safety equipment;
  3. Both necessary for finding employment and an important part of workplace infrastructure;
  4. Widely used for commerce and banking;
  5. Key for civic participation;
  6. Key for enabling better health outcomes;
  7. Critical to vulnerable populations; and
  8. Recognized as a necessity by the U.S. government in the past.

The case is expected to be heard in the fall of 2017.


D&S lawyer-in-residence Rebecca Wexler testifies about government oversight of forensic science laboratories in the State of New York.

I submit these comments to the Assembly Standing Committee on Codes; the Assembly Standing Committee on Judiciary; and the Assembly Standing Committee on Oversight, Analysis and Investigation. Thank you for inviting my testimony on government oversight of forensic science laboratories in the State of New York. As the Lawyer-in-Residence at the Data & Society Research Institute, I focus on issues arising from data and technology in the criminal justice system. I want to draw your attention to trade secrets claims in forensic technologies that threaten criminal defendants’ rights to confront and cross-examine the evidence against them; to compulsory process to obtain evidence in their favor; and to due process.


D&S affiliate Ifeoma Ajunwa testified before the U.S. Equal Employment Opportunity Commission to discuss big data in the workplace.

Good afternoon, Chair Yang and members of the Commission. First, I would like to thank the Commission for inviting me to this meeting. My name is Ifeoma Ajunwa; I am a Fellow at the Berkman Klein Center at Harvard University and an Assistant Professor at the University of the District of Columbia School of Law. I have authored several papers regarding worker privacy, with an emphasis on health law and genetic discrimination, from which my testimony today is largely drawn.

Today, I will summarize a number of practices that employers have begun to deploy to collect information on employees, and my concerns that such information could ultimately be acquired and sold by data brokers or stored in databanks. There are few legal limitations on how this sensitive information could be used, sold, or otherwise disseminated. Absent careful safeguards, demographic information, sensitive health information, and genetic information are at risk of being incorporated into the Big Data analytics technologies that employers are beginning to use — technologies that challenge the spirit of antidiscrimination laws such as the Americans with Disabilities Act (the “ADA”) and the Genetic Information Non-Discrimination Act (“GINA”).


testimony | 10.15.15

Re: Rates for Inmate Calling Services

Leadership Conference on Civil and Human Rights et al, including Data & Society

The Federal Communications Commission (FCC) capped rates for long-distance calls in 2013 but did not address in-state call rates. Before the FCC’s vote on a proposal to cap prison calling rates and fees for in-state calls, 26 organizations, including Data & Society, signed onto a letter to FCC Chairman Thomas Wheeler urging him to ensure reasonable inmate calling rates.


Data & Society submitted comments to the National Telecommunications and Information Administration (NTIA) in response to its “Request for Comment on Stakeholder Engagement on Cybersecurity in the Digital Ecosystem.”

The digital ecosystem is quickly changing as more services move online and as the devices that make up the Internet of Things (IoT) proliferate. We recommended that NTIA’s multistakeholder effort address, among other things, cybersecurity in the Internet of Things, user notification and choice regarding data collection, and possible civil liberties dilemmas raised by big data and by monitoring across numerous devices and websites.

These concerns about the effects of the Internet of Things on cybersecurity and civil liberties need to be addressed while the ecosystem is young. Failure to consider these questions now could leave users vulnerable to a number of threats in the future. Unless devices and services are adequately secured, users will be vulnerable to breaches that could expose intimate information about their bodies and homes to people who were never given permission to access that data. Additionally, without giving users proper notification and obtaining actual consent, users will be unaware of the privacy risks involved in using these technologies and unable to protect the information they consider private. Finally, data collection by online services and by devices that monitor our bodies and environments could lead to abuses of users’ civil liberties.


testimony | 08.15.14

Re: “Big Data: A Tool for Inclusion or Exclusion?”

Seeta Peña Gangadharan, danah boyd, Solon Barocas

In this letter to the Federal Trade Commission (FTC), New America Foundation’s Open Technology Institute is joined by Data & Society and Solon Barocas, an independent researcher, in asking the FTC to address three issues: the ethical problems, legal constraints, and technical difficulties associated with building a body of evidence of big data harms; whether intentions should matter in evaluating such harms; and the unique context of vulnerable populations, with its implications for problem solving and for taking steps to protect them.

The letter was submitted in response to an FTC request for comments in advance of its workshop, “Big Data: A Tool for Inclusion or Exclusion?”

