On April 4, 2019, Data & Society Executive Director Janet Haven and Postdoctoral Scholar Andrew Selbst testified before the New York City Council’s Committee on Technology about the Open Algorithms Law (Local Law 49 of 2018). They called for oversight of the Automated Decision Systems Task Force to ensure access to “details of ADS systems in use by specific agencies” and a public engagement process. 

Others who testified include Task Force members Solon Barocas and Julia Stoyanovich and BetaNYC Executive Director Noel Hidalgo. A video of the full hearing is available here.

Please find Janet Haven and Andrew Selbst’s written testimony below. 


Our names are Janet Haven and Andrew D. Selbst. We are the executive director and a postdoctoral scholar at the Data & Society Research Institute, an independent non-profit research center dedicated to studying the social and cultural impacts of data-driven and automated technologies. Over the past five years, Data & Society has focused on the social and legal impacts of automated decision-making and artificial intelligence, publishing research and advising policymakers and industry actors on issues such as algorithmic bias, explainability, transparency, and accountability more generally.

Government services and operations play a crucial role in the lives of New York City’s citizens. Transparency and accountability in a government’s use of automated decision-making systems matter. Across the country, automated decision-making systems based on nonpublic data sources and algorithmic models currently inform decisions on policing, criminal justice, housing, child welfare, educational opportunities, and myriad other fundamental issues.

This Task Force was set up to begin the hard work of building transparent and accountable processes to ensure that the use of such systems in New York City is geared to just outcomes, rather than merely the most efficient ones. The adoption of such systems requires a reevaluation of current approaches to due process and the introduction of appropriate safeguards. It may require entirely new approaches to accountability when the city uses automated systems, as many such systems, through their very design, can obscure or conceal policy or decision-making processes.

We at Data & Society lauded the decision to establish a Task Force focused on developing a better understanding of these issues. Indeed, we celebrated the city leadership’s prescience in being the first government in the nation to establish a much-needed evidence base regarding the inherent complexity accompanying ADS adoption across multiple departments. Yet we have seen little evidence that the Task Force is living up to its potential. New York has a tremendous opportunity to lead the country in defining these new public safeguards, but time is growing short to deliver on the promise of this body.

We want to make two main points in our testimony today.

First, for the Task Force to complete its mandate in any meaningful sense, it must have access to the details of ADS systems in use by specific agencies and the ability to work closely with representatives from across agencies using ADS. We urge that Task Force members be given immediate access to specific, agency-level automated decision-making systems currently in use, as well as to the leadership in those departments and others with insight into the design and use of these systems.

Social context is essential to defining fair and just outcomes.[1] The city is understood to be using ADS in such diverse contexts as housing, education, child services, and criminal justice. The very idea of a fair or just outcome is impossible to define or debate without reference to the social context. Understanding the different value tradeoffs in decisions about pretrial risk assessments tells you nothing whatsoever about school choice. What is fair, just, or accountable in public housing policy says nothing about what is fair, just, and accountable in child services. This ability to address technological systems within the social context where they are used is what makes the ADS Task Force so important, and potentially so powerful in defining real accountability measures.

The legislative mandate itself also demonstrates why the Task Force requires access to agency technologies. Under the enacting law, the purpose of the Task Force is to make recommendations particular to the City’s agencies.[2] Specifically, the Task Force must make recommendations for procedures by which explanations of the decisions can be requested, biases can be detected, harms from biases can be redressed, the public can assess the ADS, and the systems and data can be archived.[3] Each of these recommendations applies not to automated decision systems generally, but to “agency automated decision systems,” a term defined separately in the text of the law.[4] Importantly, the law also mandates that the Task Force make recommendations about “[c]riteria for identifying which agency automated decision systems” should be subject to these procedures.[5] Thus, the legislative mandate makes clear that for the Task Force to do its work, it will require access to the technologies that city agencies currently use or plan to use, as well as the people in charge of their operation. Lacking this level of detail on actual agency-level use of automated decision-making systems, the recommendations can only be generic. Such generic recommendations will be ineffective because they will not be informative enough for the city to act on.

If the city wanted to find generic guidelines or recommendations for ADSs, it could have looked to existing scholarship on these issues instead of forming a Task Force. Indeed, there is an entire interdisciplinary field of scholarship that has emerged in the last several years, dedicated to the issues of Fairness, Accountability and Transparency (FAT*) in automated systems.[6] This field has made significant strides in coming up with mathematical definitions for fairness that computers can parse, and creating myriad potential methods for bias reduction in automated systems.
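To make concrete what such a mathematical definition looks like, here is a minimal, invented sketch of one of the best-known criteria, demographic parity, which asks whether favorable outcomes occur at similar rates across groups. The data and function names below are ours, for illustration only, and are not drawn from any city system:

```python
# Illustrative only: demographic parity is one of the formal fairness
# criteria studied in the FAT* literature. A decision rule satisfies it
# when favorable outcomes occur at (roughly) equal rates across groups.

def selection_rate(decisions):
    """Fraction of favorable (1) outcomes in a list of 0/1 decisions."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(decisions_by_group):
    """Largest difference in selection rates across groups."""
    rates = [selection_rate(d) for d in decisions_by_group.values()]
    return max(rates) - min(rates)

# Hypothetical ADS outputs (1 = favorable decision) for two groups.
outcomes = {
    "group_a": [1, 1, 0, 1, 0, 1, 1, 0],  # selection rate 0.625
    "group_b": [0, 1, 0, 0, 1, 0, 0, 0],  # selection rate 0.25
}

print(demographic_parity_gap(outcomes))  # 0.375 in this toy example
```

Even this trivial example shows the limits of generic guidance: nothing in the computation says whether demographic parity is the right criterion for a given system, what gap is tolerable, or how the groups should be defined. Those are questions of social context.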

But the academic work has fundamental limitations. Much of the research is, by necessity or due to limited access, based on small hypothetical scenarios—toy problems—rather than real-world applications of machine learning technology.[7] This work is accomplished, as is characteristic of theoretical modeling, by stating assumptions about the world and datasets that are being used. In order to translate these solutions to the real world, researchers would have to know whether the datasets and other assumptions match the real-world scenarios.

Using information from city agencies, the Task Force has the ability to advance beyond the academic focus on toy problems devoid of social context and assess particular issues for systems used in practice. Without information about the systems in use, the Task Force’s recommendations will be limited to procedures at the greatest level of generality—things we would already guess, such as testing the system for bias or keeping it simple enough to be explainable. But with information about these systems, the Task Force can examine the particular challenges and tradeoffs at issue. With community input and guidance, its members can assess the appropriateness of different definitions of bias in a given context, and debate trade-offs between accuracy and explainability in specific social environments. The recommendations of the Task Force will only be useful if they are concrete and actionable, and that can only be achieved if its members are allowed to examine the way ADS operate in practice, with a view into both the technical and the social systems informing outcomes.

Second, we urge the Task Force to prioritize public engagement. Because social context is essential to defining fair and just outcomes, meaningful engagement with community stakeholders is fundamental to this process. Once the Task Force has access to detailed information about ADS systems in use, public listening sessions must be held to understand community experiences and concerns, with the goal of using that feedback to shape the Task Force’s process going forward. Iterating on and reviewing recommendations with community stakeholders as the work moves forward will be important to arriving at truly transparent, accountable, and just outcomes.

We are here today because we continue to believe the Task Force has great potential. We strongly believe that the Task Force’s work needs to be undertaken thoughtfully and contextually, centering on cooperation, transparency, and public engagement.  The Task Force’s goal needs to be offering actionable and concrete recommendations on the use of ADS in New York City government. We hope that the above testimony provides useful suggestions to move toward that goal.

Thank you.


[1] See generally Andrew D. Selbst et al., Fairness and Abstraction in Sociotechnical Systems, Proceedings of the 2019 ACM Conference on Fairness, Accountability, and Transparency (ACM FAT*), 59.

[2] See Local Law No. 49 of 2018, Council Int. No. 1696-A of 2017 [hereinafter Local Law 49] (repeatedly referring to “agency automated decision systems”).

[3] Id. §§ 3(b)–(f).

[4] Id. § 1(a).

[5] Id. § 3(a) (emphasis added).

[6] ACM Conference on Fairness, Accountability, and Transparency (ACM FAT*), https://fatconference.org/

[7] See generally Selbst et al., supra note 1, at 59.


Data & Society 2015-2016 Fellow Wilneida Negron connects her past social work to her current work as a political scientist and technologist.

“We are at the cusp of a new wave of technological thinking, one defined by a new mantra that is the opposite of Zuckerberg’s: ‘Move carefully and purposely, and embrace complexity.’ As part of this wave, a new, inclusive, and intersectional generation of people are using technology for the public interest. This new wave will help us prepare for a future where technical expertise coexists with empathy, humility, and perseverance.”


D&S advisor Ethan Zuckerman provides a transcript of his recent speech about journalism and civics.

One final thing: we have this tendency in journalism right now to feel very sorry for ourselves. This is a field that we are all enormously proud to be part of. This is a field that is harder and harder to make a living in, and I see more and more news organizations essentially saying, “You’re going to miss us. We’re going away. I just want to warn you.”


D&S affiliate Anthony Townsend writes about city charter reform.

The civic tech gang for some reason — probably because it is their target and they are nibbling off what they think they can actually achieve in the short run — hasn’t really articulated the fault lines in its interactions with city governments. When I listen to those exchanges it seems a little too cozy, as if the civic tech players are just waiting to be brought into government to drive the change from within.


D&S affiliate Anthony Townsend writes about his research in data and city charters.

Now, there are a number of organizations that are working hard on pushing cities up this maturity hill. CityMart is figuring out how to help cities overhaul their innovation process from within. Bloomberg Philanthropies is driving hard to get city governments to focus on achieving measurable innovation. But it’s all too much within the existing framework of governance systems that are usually fundamentally dysfunctional, structurally incapable of delivering. Digital maturity seems to want to engage a larger conversation about the transformation of governance that is missing. No one seems to be willing to go out on a limb — with the exception of the radical political movements like Podemos and Syriza (but they haven’t engaged the smart city meme in any real way yet) — and call the whole incremental update campaign into question. (n.b. while the Pirate Party has engaged ‘smart’ in a legitimate way, they don’t represent a coherent political movement in my opinion).


D&S affiliate Anthony Townsend writes more on city charters and big data.

The point is… what we now think of as ‘hidebound obsolete bureaucracy’ was not so long ago the cutting-edge analytics and evidence-based administrative technology of its day. It’s outlived its usefulness for sure, but these zombie public organizations will shamble on for a long time without a better vision that can plot a transition path within the reform process that’s required by law.


Code is key to civic life, but we need to start looking under the hood and thinking about the externalities of our coding practices, especially as we’re building code as fast as possible with few checks and balances.

Points: “Be Careful What You Code For” is danah boyd’s talk from Personal Democracy Forum 2016 (June 9, 2016); her remarks have been modified for Points. danah exhorts us to mind the externalities of code and proposes audits as a way to reckon with the effects of code in high stakes areas like policing. Video is available here.


D&S Advisor Andrew McLaughlin reflects on Facebook’s approach to implementing their Free Basics program:

In opening a door to the Internet, Facebook doesn’t need to be a gatekeeper. The good news, though, is that Facebook could quite easily fix its two core flaws and move forward with a program that is effective, widely supported, and consistent with Internet ideals and good public policy.

Rather than mandating an application process, vetting supplicants, and maintaining and making happy a list of approved service providers, Facebook could simply enforce all of its service restrictions through code. Entirely consistent with principles of network neutrality, Facebook could provide a stripped-down browser that only renders, for example, mobile-optimized websites built in HTML, but not Javascript, iframes, video files, flash applets, images over a certain size, etc. Facebook can publish the technical specs for its low-bandwidth browser; ideally, those specs would map directly to existing open web standards and best practices for mobile web pages and other services. When the user wants to go to a site or service, the browser makes the request and the target server delivers its response — if the browser can render what the server sends, it does; if it can’t, it tells the user as much. As the operators of websites and online services notice a surge in users with these kinds of Free Basics browsers, they will work to ensure their mobile web offering renders the way they want it to.

In this gatekeeper-less model, neither the user nor the online service has to ask Facebook’s permission to connect with each other. And that’s what makes all the difference. Rather than referring to an approved set of ~300 companies, the word “Basics” in Free Basics would denote any site or service anywhere in the world that provides a standards-compliant, low-bandwidth, mobile-optimized version.
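A rough sketch of the code-enforced model McLaughlin describes might look like the following. The allowed content types and size cap here are invented stand-ins, not Facebook’s actual specification; the point is only that the server’s response itself, not an approval list, determines whether a page renders:

```python
# Hypothetical sketch: the browser fetches any URL the user requests and
# renders the response only if it fits a low-bandwidth profile. The
# profile values below are invented for illustration.
import urllib.request

ALLOWED_TYPES = {"text/html", "text/css"}  # no JavaScript, video, etc.
MAX_BYTES = 200_000                        # hypothetical size cap

def can_render(content_type, body):
    """True if the response fits the low-bandwidth profile."""
    base_type = content_type.split(";")[0].strip().lower()
    return base_type in ALLOWED_TYPES and len(body) <= MAX_BYTES

def fetch(url):
    with urllib.request.urlopen(url) as resp:
        body = resp.read(MAX_BYTES + 1)    # read just past the cap
        ctype = resp.headers.get("Content-Type", "")
    if can_render(ctype, body):
        return body.decode("utf-8", errors="replace")
    # No gatekeeper involved: any site that serves a compliant,
    # mobile-optimized page will render for these users.
    return "This page is not available in low-bandwidth mode."
```

Because every restriction lives in the client, a site operator needs no relationship with Facebook at all; publishing a compliant page is the whole application process.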


D&S Board Member Anil Dash contrasts two recent approaches to making internet connectivity more widely available. Comparing the efforts to build consensus behind Facebook’s Free Basics initiative to LinkNYC, the recently-launched program to bring free broadband wifi to New York City, Dash views each situation as a compelling example of who gets heard, and when, any time a big institution tries to create a technology infrastructure to serve millions of people.

There’s one key lesson we can take from these two attempts to connect millions of people to the Internet: it’s about building trust. Technology infrastructure can be good or bad, extractive or supportive, a lifeline or a raw deal. Objections to new infrastructure are often dismissed by the people pushing them, but people’s concerns are seldom simply about advertising or being skeptical of corporations. There are often very good reasons to look a gift horse in the mouth.

Whether we believe in the positive potential of getting connected simply boils down to whether we feel the people providing that infrastructure have truly listened to us. The good news is, we have clear examples of how to do exactly that.


D&S fellow Natasha Singer writes about new online apps designed to give college students additional options for reporting sexual assaults. Colleges and universities are embracing these apps as they continue to look for more concrete data about sexual violence on their campuses and seek to provide students with more ways to report an assault.

Students at participating colleges can use its site, called Callisto, to record details of an assault anonymously. The site saves and time-stamps those records. That allows students to decide later whether they want to formally file reports with their schools — identifying themselves by their school-issued email addresses — or download their information and take it directly to the police. The site also offers a matching system in which a user can elect to file a report with the school electronically only if someone else names the same assailant.

Callisto’s hypothesis is that some college students — who already socialize, study and shop online — will be more likely initially to document a sexual assault on a third-party site than to report it to school officials on the phone or in person.
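The matching feature Singer describes can be sketched in a few lines. This is a deliberately simplified illustration of the escrow logic only; the real Callisto system involves encryption and careful identity protection that this toy version omits, and all names below are ours:

```python
# Simplified sketch of a matching escrow: a report is held until a second
# user names the same assailant, and only then are the reports released.
from collections import defaultdict
from datetime import datetime, timezone

class MatchingEscrow:
    def __init__(self):
        self._held = defaultdict(list)  # assailant key -> held reports

    def submit(self, assailant_id, report):
        """File a report; returns released reports once two users match."""
        key = assailant_id.strip().lower()  # naive normalization
        self._held[key].append({
            "report": report,
            # saved and time-stamped, as the excerpt describes
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
        if len(self._held[key]) >= 2:  # someone else named the same person
            return self._held[key]     # release the matched reports
        return []                      # otherwise remain in escrow

escrow = MatchingEscrow()
escrow.submit("jdoe@school.edu", "record A")              # held; returns []
print(len(escrow.submit("JDoe@school.edu", "record B")))  # 2: both released
```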

