

Data & Human Rights Research Lead Mark Latonero investigates the impact of digitally networked technologies on the safe passage of refugees and migrants.

“…in making their way to safe spaces, refugees rely not only on a physical but increasingly also digital infrastructure of movement. Social media, mobile devices, and similar digitally networked technologies comprise this infrastructure of ‘digital passages’—sociotechnical spaces of flows in which refugees, smugglers, governments, and corporations interact with each other and with new technologies.”


The rollout of Electronic Visit Verification (EVV) for Medicaid recipients has serious privacy implications, argues Data & Society Researcher Jacob Metcalf.

“So why should we be worried about rules that require caregivers to provide an electronic verification of the labor provided to clients? Because without careful controls and ethical design thinking, surveillance of caregiver labor is also functionally surveillance of care recipients, especially when family members are employed as caregivers.”


Jacobin | 02.20.18

The New Taylorism

Richard Salame

Data & Society Operations Assistant Richard Salame applies Taylorism to Amazon’s new wristbands that track workers’ movements.

“Amazon’s peculiar culture notwithstanding, the wristbands in many ways don’t offer anything new, technologically or conceptually. What has changed is workers’ ability to challenge this kind of surveillance.”


On Wednesday, Nov. 29, the Supreme Court heard Carpenter v. United States, a Fourth Amendment case on government access to cell phone location records. Postdoctoral scholars Julia Ticona & Andrew Selbst urged the court to understand that cell phones aren’t voluntary in this day and age.

“The justices will surely understand that without any alternatives for accessing online services, vulnerable (and over-policed) populations will be unable to make meaningful choices to protect their privacy, amplifying the disadvantages they already face.”


“Privacy, Security, and Digital Inequality” by Mary Madden is the first in-depth analysis of the privacy and security experiences of low-socioeconomic-status populations in the United States.

Supported by the Digital Trust Foundation, the report finds that most of those living in U.S. households with annual incomes of less than $20,000 are acutely aware of a range of digital privacy harms, but many say it would be difficult to access the tools and strategies that could help them protect their personal information online. The report provides additional insights about mobile device use and demand for digital privacy and security training.

In light of the September 18th announcement by the U.S. Department of Homeland Security[1] of federal agencies’ intent to collect social media information and search history from a variety of immigrant groups, “Privacy, Security, and Digital Inequality” is especially relevant: the report finds that foreign-born Hispanic adults stand out both for their privacy sensitivities and for their desire to learn more about safeguarding their personal information.

“Privacy, Security, and Digital Inequality” includes detailed comparisons across different racial, ethnic, and nativity groups, finding that there are substantial gaps across these groups when looking at reliance on mobile connectivity.[2]

“This study highlights the disconnect between the one-size-fits-all conversations about privacy-related risk that happen in Washington and the concerns that are most salient to the communities who have long experienced a disproportionate level of surveillance and injustice in their daily lives,” said Madden, Researcher at Data & Society and lead author of the report. “When those who influence policy and technology design have a lower perception of privacy risk themselves, it contributes to a lack of investment in the kind of safeguards and protections that vulnerable communities both want and need.”

In light of new pressures surrounding immigration policy and status in the United States, the report is a highly relevant snapshot of the demand for privacy- and security-related training among some of the most vulnerable of these low-socioeconomic-status groups. The report also finds a disproportionate reliance on mobile devices, offering a potential starting point for those looking to provide educational resources.

“This report illustrates the many ways in which smartphones have become an indispensable source of internet access for those who may lack other technology resources in their homes and communities,” said Michele Gilman, Venable Professor of Law at the University of Baltimore and Director of the Saul Ewing Civil Advocacy Clinic. “Far from being a luxury, smartphones—with their many benefits and vulnerabilities—offer a critical source of connection to jobs, family, education and government services.”

Gilman, a poverty law expert, also served on the Research Advisory Board for the two-year research project and co-authored a related law review article with Madden titled “Privacy, Poverty and Big Data: A Matrix of Vulnerabilities for Poor Americans.”

“Privacy, Security, and Digital Inequality” is based on newly released data from a nationally representative telephone survey of 3,000 American adults. The survey, which included interviews in both English and Spanish, was made possible by a grant from the Digital Trust Foundation and fielded in November and December of 2015.



[1] Full text here.
[2] The analysis of racial and ethnic minority groups in this report is limited by the survey sample size, and does not include detailed comparisons of Asians, Native Americans, and other subgroups. For instance, in this survey, out of 3,000 respondents, just 3% identified as Asian or Asian American.


Additional Resources

For more information about groups working on these issues and in these spaces, we invite you to take a look at resources provided by the following organizations. We welcome additional suggestions:

Center for Media Justice – Resource Library
Equality Labs
Freedom of the Press Foundation (link goes to resources)
American Civil Liberties Union – Privacy and Technology, Free Speech
Berkman Klein Center
Color of Change
EPIC – Electronic Privacy Information Center
Future of Privacy Forum
Georgetown Center on Privacy & Technology (link goes to resources)
National Hispanic Media Coalition
Our Data Bodies (link goes to resources)
Pew Research Center
Public Knowledge
Rad.Cat (link goes to resources)
Southern Poverty Law Center


D&S Researcher Alex Rosenblat was interviewed about Uber for Klint Finley’s article in Wired.

Tuesday’s agreement may not be the end of Uber’s problems with the FTC either. Hartzog says a recent paper by University of Washington law professor Ryan Calo and multidisciplinary researcher Alex Rosenblat of the research institute Data & Society points to other potential privacy concerns, such as monitoring how much battery power remains on a user’s device, because users with little juice might be willing to pay more for a ride.

‘When a company can design an environment from scratch, track consumer behavior in that environment, and change the conditions throughout that environment based on what the firm observes, the possibilities to manipulate are legion,’ Calo and Rosenblat write. ‘Companies can reach consumers at their most vulnerable, nudge them into overconsumption, and charge each consumer the most he or she may be willing to pay.’


Washington Monthly | 06.13.17

Code of Silence

Rebecca Wexler

D&S lawyer-in-residence Rebecca Wexler unpacks how private companies hide flaws in software that the government uses to convict and exonerate people in the criminal justice system.

What’s alarming about protecting trade secrets in criminal cases is that it allows private companies to withhold information not from competitors, but from individual defendants like Glenn Rodríguez. Generally, a defendant who wants to see evidence in someone else’s possession has to show that it is likely to be relevant to his case. When the evidence is considered “privileged,” the bar rises: he often has to convince the judge that the evidence could be necessary to his case—something that’s hard to do when, by definition, it’s evidence the defense hasn’t yet seen.


D&S resident Rebecca Wexler describes the flaws of an increasingly automated criminal justice system

The root of the problem is that automated criminal justice technologies are largely privately owned and sold for profit. The developers tend to view their technologies as trade secrets. As a result, they often refuse to disclose details about how their tools work, even to criminal defendants and their attorneys, even under a protective order, even in the controlled context of a criminal proceeding or parole hearing.


D&S researcher Alex Rosenblat explains how and why Uber & Lyft drivers surveil their passengers during rides.

Passenger shaming is partly a consequence of the Uber/Lyft business model. Drivers can’t get reliable accountability from their employers or passengers, so they turn to tools like dash-cams. These are part of the externalized costs of the lean gig economy employment model.


Columbia Law Review | 03.07.17

The Taking Economy: Uber, Information, and Power

Ryan Calo, Alex Rosenblat

Ryan Calo and D&S researcher Alex Rosenblat analyze what they term the ‘taking economy’ of Uber.

Sharing economy firms such as Uber and Airbnb facilitate trusted transactions between strangers on digital platforms. This creates economic and other value and raises a set of concerns around racial bias, safety, and fairness to competitors and workers that legal scholarship has begun to address. Missing from the literature, however, is a fundamental critique of the sharing economy grounded in asymmetries of information and power. This Article, coauthored by a law professor and a technology ethnographer who studies the ride-hailing community, furnishes such a critique and indicates a path toward a meaningful response.

Commercial firms have long used what they know about consumers to shape their behavior and maximize profits. By virtue of sitting between consumers and providers of services, however, sharing economy firms have a unique capacity to monitor and nudge all participants—including people whose livelihood may depend on the platform. Much activity is hidden away from view, but preliminary evidence suggests that sharing economy firms may already be leveraging their access to information about users and their control over the user experience to mislead, coerce, or otherwise disadvantage sharing economy participants.

This Article argues that consumer protection law, with its longtime emphasis on asymmetries of information and power, is relatively well positioned to address this under-examined aspect of the sharing economy. But the regulatory response to date seems outdated and superficial. To be effective, legal interventions must (1) reflect a deeper understanding of the acts and practices of digital platforms and (2) interrupt the incentives of sharing economy firms to abuse their position.


Real Life Magazine | 01.26.17

Close Calls

Zara Rahman

D&S fellow Zara Rahman writes about how immigrant families use social media and digital technologies.

The consequence is that the home of our deeply personal information has gone from treasured letters stored in a box at our houses, to servers owned by corporate companies that we’ll never see. Those personal notes, the ways of showing our family that we’re happy and content in our new lives, despite what we’ve lost — they live online now. The more you share with that corporation, the stronger those family ties get. There is a third party in these relationships.


D&S affiliate Wilneida Negrón details ten tech issues that will shape the year ahead.

As we begin a new year and a new political administration takes office in the US, let’s take some time to consider some pressing issues that exist at the nexus of technology and social justice—and think about how we as social justice advocates can address them most effectively. Even amid so many unknowns, we can be certain that these issues are among those that will shape 2017 and the years and decades beyond it. And they will be central to the work of building a free, open, and transparent future.


D&S advisor Baratunde Thurston details his exploration of The Glass Room exhibit.

I want to see The Glass Room everywhere there is an Apple Store…And anyone founding or working for a tech company should have to prove they’ve gone through this space and understood its meaning.


D&S advisor Christina Xu writes about fake news and conspiracy theories in China.

Here in China, even well-educated and progressive friends have sincerely asked me about some pretty niche conspiracies. Did Hillary really assassinate someone? (No.) Didn’t Trump win 90% of the vote? (No.) Yesterday, someone even mentioned that they really liked a poem he wrote about his vision for America’s future. (What.)


points | 10.27.16

Shining a light on the darkness

Mark Van Hollebeke

D&S practitioner-in-residence Mark Van Hollebeke discusses Weapons of Math Destruction in this Points piece.

O’Neil’s analysis doesn’t just apply to mathematical models; it applies to societal models. Most of the WMDs that Cathy O’Neil describes are inextricably linked to unjust social structures.

We all, data scientists included, need to act with some humility and reflect on the nature of our social ills. As O’Neil writes, “Sometimes the job of a data scientist is to know when you don’t know enough” (216). Those familiar with Greek moral philosophy know that this type of Socratic wisdom can be very fruitful.

It’s not just the dark side of Big Data she shows us, but shady business practices and unjust social regimes. We will never disarm the WMDs without addressing the social injustice they mask and perpetuate. O’Neil deserves credit for shining a bright light on this fact.


D&S affiliate Ifeoma Ajunwa testified at the U.S. Equal Employment Opportunity Commission to discuss big data in the workplace.

Good afternoon, Chair Yang and members of the Commission. First, I would like to thank the Commission for inviting me to this meeting. My name is Ifeoma Ajunwa, I am a Fellow at the Berkman Klein Center at Harvard University and an Assistant Professor at the University of the District of Columbia School of Law. I have authored several papers regarding worker privacy, with an emphasis on health law and genetic discrimination, from which my testimony today is largely drawn.

Today, I will summarize a number of practices that employers have begun to deploy to collect information on employees, and my concerns that such information could ultimately be acquired and sold by data brokers or stored in databanks. There are few legal limitations on how this sensitive information could be used, sold, or otherwise disseminated. Absent careful safeguards, demographic information and sensitive health information and genetic information is at risk for being incorporated in the Big Data analytics technologies that employers are beginning to use — and which challenge the spirit of antidiscrimination laws such as the Americans with Disabilities Act (the “ADA”) and the Genetic Information Non-Discrimination Act (“GINA”).


D&S artist-in-residence Ingrid Burrington was interviewed for The Intercept about her book and took a walking tour with her interviewer, Cora Currier.

I asked Burrington what she hoped people would do with her guide. It is empowering to know what you’re looking at, but also overwhelming to consider the scale of the apparatus around you. Burrington described a public records battle she lost to get the locations of NYPD cameras; the city said the data could help criminals. In the process, Burrington realized that the data she was seeking wouldn’t account for unmarked cameras and privately owned cameras that could be turned over to police. To map the entire surveillance network of a city would require a huge effort and become quickly outdated.


D&S affiliate Solon Barocas and D&S fellow Karen Levy examine a concept they call refractive surveillance, in which surveillance of one group facilitates control over another.

Debates about consumer privacy have largely missed the fact that firms’ ability to develop a better understanding of consumers also impacts workers’ day-to-day experiences, their job security, and their financial well-being.

But our research suggests that data collection frequently also impacts people other than those being surveilled. We call this dynamic refractive surveillance. In other words, collecting information about one group can facilitate control over an entirely different group. In our ongoing study, we investigate this dynamic in the context of retail tracking, to understand how data collection about customers can impact how retail workers are managed.


Real Life Magazine | 08.29.16

Broken Windows, Broken Code

R. Joshua Scannell

D&S researcher Josh Scannell wrote an extensive analysis of predictive policing algorithms, showing that, while they were not built to be racist, they mirror a racist system.

Northpointe’s algorithms will always be racist, not because their engineers may be bad but because these systems accurately reflect the logic and mechanics of the carceral state — mechanics that have been digitized and sped up by the widespread implementation of systems like CompStat.


D&S researcher Alex Rosenblat wrote a blog post discussing how Uber can implement wage withholding policies without driver input or negotiation.

This setup essentially provides disincentives for drivers to retrieve the wages they’re owed. An analogy I think of, by comparison, is how cell phone companies can cram small fees into customer bills. Only some percentage of customers are actively tracking their bills, and some percentage of those are willing to spend an hour on the phone with a well-meaning but ineffective customer service agent to get back their small fee.


D&S fellow Karen Levy was interviewed for Philosophical Disquisitions about intimate surveillance.


D&S artist-in-residence Ingrid Burrington considers whether possible threats to internet infrastructure are in fact credible threats.

To take out some of the major chokepoints of the internet wouldn’t be impossible, and it would have pretty devastating consequences, but any kind of attack on such a space at scale would require a degree of coordination and resources more likely in the hands of a sovereign country than a terrorist cell. And in an age where every war is a propaganda war, taking out the internet at scale is an act of mutually assured destruction: ISIS needs a global internet to coordinate and recruit, and western governments need a global internet to surveil that coordination and recruitment. Most sovereigns are far more content to exercise limited control and outages within their own boundaries using far less labor-intensive methods than physical infrastructure attacks.


The FBI recently announced its plan to request that its massive biometrics database, called the Next Generation Identification (NGI) system, be exempted from basic requirements under the Privacy Act. These exemptions would prevent individuals from finding out if they are included within the database, whether their profile is being shared with other government entities, and whether their profile is accurate or contains false information. Forty-four organizations, including Data & Society, sent a letter to the Department of Justice asking for a 30-day extension to review the proposal.

In this Points original, Robyn Caplan highlights the First Amendment implications of the FBI’s request for exemptions from the Privacy Act for its Next Generation Identification system. Public comment on the FBI’s proposal is being accepted until July 6, 2016.


D&S fellow Mark Latonero considers recent attempts by policymakers, big tech companies, and advocates to address the deepening refugee and migrant crisis and, in particular, the educational needs of displaced children through technology and app development projects. He cautions developers and policymakers to consider the risks of failing to understand the unique challenges facing refugee children living without running water, let alone a good mobile network.

The reality is that no learning app or technology will improve education by itself. It’s also questionable whether mobile apps used with minimal adult supervision can improve a refugee child’s well-being. A roundtable at the Brookings Center for Universal Education noted that “children have needs that cannot be addressed where there is little or no human interaction. A teacher is more likely to note psychosocial needs and to support children’s recovery, or to refer children to other services when they are in greater contact with children.” Carleen Maitland, a technology and policy professor who led the Penn State team, found through her experience at Zaatari that in-person interactions with instructors and staff in the camp’s many community centers could provide far greater learning opportunities for young people than sitting alone with a mobile app.

In fact, unleashing ed tech vendors or Western technologists to solve development issues without the appropriate cultural awareness could do more harm than good. Children could come to depend on technologies that are abandoned by developers once the attention and funding have waned. Plus, the business models that sustain apps through advertising, or collecting and selling consumer data, are unethical where refugees are concerned. Ensuring data privacy and security for refugee children using apps should be a top priority for any software developer.

In cases where no in-person education is available, apps can still play a role, particularly for children who feel unsafe to travel outside their shelters or are immobile owing to injuries or disabilities. But if an app is to stand a chance of making a real difference, it needs to arise not out of a tech meet-up in New York City but on a field research trip to a refugee camp, where it will be easier to see how mobile phones are actually accessed and used. Researchers need to ask basic questions about the value of education for refugees: Is the goal to inspire learning on traditional subjects? Empower students with academic credentials or job skills? Assimilate refugees into their host country? Provide a protected space where children can be fed and feel safe? Or combat violent extremism at an early age?

To decide, researchers need to put the specific needs of refugee children first—whether economic, psychosocial, emotional, or physical—and work backward to see whether technology can help, if at all.


Researchers Alexandra Mateescu and Alex Rosenblat, with D&S Founder danah boyd, published a paper examining police-worn body cameras and their potential to provide avenues for police accountability and foster improved police-community relations. The authors raise concerns about the potential harmful consequences of constant surveillance, which has prompted civil rights groups to warn that body-worn cameras may violate privacy and exacerbate existing police practices that have historically victimized people of color and vulnerable populations. They consider whether one can demand greater accountability without increased surveillance and suggest that “the trajectory laid out by body-worn cameras towards greater surveillance is clear, if not fully realized, while the path towards accountability has not yet been adequately defined, let alone forged.”

The intimacy of body-worn cameras’ presence—which potentially enables the recording of even mundane interpersonal interactions with citizens—can be exploited with the application of technologies like facial recognition; this can exacerbate existing practices that have historically victimized people of color and vulnerable populations. Not only do such technologies increase surveillance, but they also conflate the act of surveilling citizens with the mechanisms by which police conduct is evaluated. Although police accountability is the goal, the camera’s view is pointed outward and away from its wearer, and audio recording captures any sounds within range. As a result, it becomes increasingly difficult to ask whether one can demand greater accountability without increased surveillance at the same time.

Crafting better policies on body-worn camera use has been one of the primary avenues for balancing the right of public access with the need to protect against this technology’s invasive aspects. However, no universal policies or norms have been established, even on simple issues such as whether officers should notify citizens that they are being recorded. What is known is that body-worn cameras present definite and identifiable risks to privacy. By contrast, visions of accountability have remained ill-defined, and the role to be played by body-worn cameras cannot be easily separated from the wider institutional and cultural shifts necessary for enacting lasting reforms in policing. Both the privacy risks and the potential for effecting accountability are contingent upon an ongoing process of negotiation, shaped by beliefs and assumptions rather than empirical evidence.


D&S Advisor Andrew McGlaughlin reflects on Facebook’s approach to implementing its Free Basics program:

In opening a door to the Internet, Facebook doesn’t need to be a gatekeeper. The good news, though, is that Facebook could quite easily fix its two core flaws and move forward with a program that is effective, widely supported, and consistent with Internet ideals and good public policy.

Rather than mandating an application process, vetting supplicants, and maintaining and making happy a list of approved service providers, Facebook could simply enforce all of its service restrictions through code. Entirely consistent with principles of network neutrality, Facebook could provide a stripped-down browser that only renders, for example, mobile-optimized websites built in HTML, but not Javascript, iframes, video files, flash applets, images over a certain size, etc. Facebook can publish the technical specs for its low-bandwidth browser; ideally, those specs would map directly to existing open web standards and best practices for mobile web pages and other services. When the user wants to go to a site or service, the browser makes the request and the target server delivers its response — if the browser can render what the server sends, it does; if it can’t, it tells the user as much. As the operators of websites and online services notice a surge in users with these kinds of Free Basics browsers, they will work to ensure their mobile web offering renders the way they want it to.

In this gatekeeper-less model, neither the user nor the online service has to ask Facebook’s permission to connect with each other. And that’s what makes all the difference. Rather than referring to an approved set of ~300 companies, the word “Basics” in Free Basics would denote any site or service anywhere in the world that provides a standards-compliant, low-bandwidth, mobile-optimized version.
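
A minimal sketch of what such code-enforced, gatekeeper-less filtering could look like in practice (the spec values, names, and checks below are illustrative assumptions for the sake of the example, not Facebook’s actual implementation):

```python
from dataclasses import dataclass

# Illustrative spec limits; a real deployment would publish these as open
# technical standards that any site operator could build against.
ALLOWED_CONTENT_TYPES = {"text/html", "text/plain", "text/css"}
MAX_RESPONSE_BYTES = 200_000  # hypothetical size cap for low-bandwidth pages


@dataclass
class Response:
    """A fetched HTTP response, reduced to what the check needs."""
    content_type: str
    body: bytes


def can_render(resp: Response) -> tuple[bool, str]:
    """Check a response against the published low-bandwidth spec.

    No approved-provider list is consulted: any site that meets the spec
    renders, and any site that doesn't is refused with a stated reason.
    """
    if resp.content_type not in ALLOWED_CONTENT_TYPES:
        return False, f"unsupported content type: {resp.content_type}"
    if len(resp.body) > MAX_RESPONSE_BYTES:
        return False, "response exceeds low-bandwidth size cap"
    if b"<script" in resp.body.lower():
        return False, "JavaScript is not supported by this browser"
    return True, ""


# Usage: the browser fetches, checks, and either renders the page or
# tells the user why it can't.
page = Response("text/html", b"<html><body>Hello, Free Basics.</body></html>")
ok, reason = can_render(page)
print("rendering page" if ok else f"cannot render: {reason}")
```

The point of the sketch is the design choice McGlaughlin describes: eligibility is decided by a published technical check that any site anywhere can satisfy, not by an application process or an approval list.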


D&S Advisor Susan Crawford argues that the President is on shaky legal ground in the FBI vs. Apple showdown:

The problem for the president is that when it comes to the specific battle going on right now between Apple and the FBI, the law is clear: twenty years ago, Congress passed a statute, the Communications Assistance for Law Enforcement Act (CALEA), that does not allow the government to tell manufacturers how to design or configure a phone or software used by that phone — including security software used by that phone.

CALEA was the subject of intense negotiation — a deal, in other words. The government won an extensive, specific list of wiretapping assistance requirements in connection with digital communications. But in exchange, in Section 1002 of that act, the Feds gave up authority to “require any specific design of equipment, facilities, services, features or system configurations” from any phone manufacturer. The government can’t require companies that build phones to come to it for clearance in advance of launching a new device. Nor can the authorities ask a manufacturer to design something new — like a back door — once that device is out.


paper | 03.10.16

Limitless Worker Surveillance

Ifeoma Ajunwa, Kate Crawford, Jason Schultz

D&S Affiliates Ifeoma Ajunwa, Kate Crawford, and Jason Schultz examine the effectiveness of the law as a check on worker surveillance, given recent technological innovations. This law review article focuses on popular trends in worker tracking – productivity apps and worker wellness programs – to argue that current legal constraints are insufficient and may leave American workers at the mercy of 24/7 employer monitoring. They also propose a new comprehensive framework for worker privacy protections that should withstand current and future trends.

Abstract:

From the Pinkerton private detectives of the 1850s, to the closed-circuit cameras and email monitoring of the 1990s, to contemporary apps that quantify the productivity of workers, American employers have increasingly sought to track the activities of their employees. Along with economic and technological limits, the law has always been presumed as a constraint on these surveillance activities. Recently, technological advancements in several fields – data analytics, communications capture, mobile device design, DNA testing, and biometrics – have dramatically expanded capacities for worker surveillance both on and off the job. At the same time, the cost of many forms of surveillance has dropped significantly, while new technologies make the surveillance of workers even more convenient and accessible. This leaves the law as the last meaningful avenue to delineate boundaries for worker surveillance.



D&S Advisor Joel Reidenberg considers the scope of the court order compelling Apple to provide “reasonable technical assistance” to help the government hack into the locked iPhone of one of the San Bernardino attackers.

In short, for government to legitimately circumvent device encryption through a court order, legal authorization to access the contents of the device (typically through a judicial warrant) is necessary. Then, if the equipment manufacturer has control over the encryption, the decryption should be performed by the manufacturer with the results provided to the government.

If, instead, the equipment manufacturer only has control over information necessary to decrypt the device, the information should be provided to the government under strict court seal and supervision for a one-time limited use.

If neither circumstance applies, then unless Congress says otherwise, the equipment manufacturer should not be compelled to assist.

The bottom line is that the government should have an ability to compel companies to unlock encrypted devices for access to evidence of crimes, but should not be able to force companies to build electronic skeleton keys, new access tools and security vulnerabilities.


D&S Advisor Ethan Zuckerman contemplates the sustainability of advertising on the internet in 2016 and why advertisers continue to go through all the trouble to track a user across their devices:

The simple truth is that web ads don’t work very well. People hate them – which is why we block them – and almost no one voluntarily clicks on them. So internet advertising companies need to promise advertisers that putting us under more surveillance will magically make web ads work. When ads follow you from your TV screen to your computer to your phone, you’ll surely click on them, right?


primer | 02.24.15

Police Body-Worn Cameras – Updated

Alexandra Mateescu, Alex Rosenblat, danah boyd (with support from Jenna Leventoff and David Robinson)

In the wake of the police shooting of Michael Brown in August 2014, as well as the subsequent protests in Ferguson, Missouri and around the country, there has been a call to mandate the use of body-worn cameras to promote accountability and transparency in police-civilian interactions. Both law enforcement and civil rights advocates are excited by the potential of body-worn cameras to improve community policing and safety, but there is no empirical research to conclusively suggest that these will reduce the deaths of black male civilians in encounters with police. There are some documented milder benefits evident from small pilot studies, such as more polite interactions between police and civilians when both parties are aware they are being recorded, and decreased fraudulent complaints made against officers. Many uncertainties about best practices of body-worn camera adoption and use remain, including when the cameras should record, what should be stored and retained, who should have access to the footage, and what policies should determine the release of footage to the public. As pilot and permanent body-worn camera programs are implemented, it is important to ask questions about how they can be best used to achieve their touted goals. How will the implementation of these programs be assessed for their efficacy in achieving accountability goals? What are the best policies to have in place to support those goals?

The primer on police body-worn cameras was written in February 2015. We provided an update on what has happened in the past year with regard to the use of body-worn cameras across the US (the update can be read here) for the 2015 Data & Civil Rights Conference, A New Era of Policing and Justice.


Excerpt: “As one of the speakers put it (we’re under Chatham House rules…), listening machines trigger all three aspects of the surveillance holy trinity: they’re pervasive, starting to appear in all aspects of our lives; they’re persistent, capable of keeping records of what we’ve said indefinitely, and they process the data they collect, seeking to understand what people are saying and acting on what they’re able to understand. To reduce the creepy nature of their surveillant behavior, listening systems are often embedded in devices designed to be charming, cute and delightful: toys, robots and smooth-voiced personal assistants.”


other | 05.19.15

The Minutes of Marshall Jones

Gideon Lichfield

D&S fellow Gideon Lichfield’s short story for the Police Technology and Civil Rights Roundtable:

“The year is 2019, and body cams have become standard for patrol officers in most police departments in the US. The cams and the management software for their footage are provided by a patchwork of vendors, and each department uses its own variant of them, with its own rules and procedures.”


The Atlantic | 05.15.15

It’s Not Too Late to Get Body Cameras Right

danah boyd, Alex Rosenblat

Excerpt: “Police-worn body cameras are coming. Support for them comes from stakeholders who often take opposing views. Law enforcement wants them, many politicians are pushing for them, and communities that already have a strong police presence in their neighborhoods are demanding that the police get cameras now. Civil-rights groups are advocating for them. The White House is funding them. The public is in favor of them. The collective — albeit, not universal — sentiment is that body cameras are a necessary and important solution to the rising concerns about fatal encounters between police and black men.

“As researchers who have spent the last few months analyzing what is known about body cams, we understand the reasons for this consensus, but we’re nervous that there will be unexpected and undesirable outcomes. On one hand, we’re worried that these expensive technologies will do little to curb systemic abuse. But what really scares us is the possibility that they may magnify injustice rather than help eradicate it. We support safeguards being put in place. But the cameras are not a proven technology, and we’re worried that too much is hinging on them being a silver bullet to a very serious problem. Our concerns stem from three major issues:

  1. Technology doesn’t produce accountability.
  2. Removing discretion often backfires.
  3. Surveillance carries significant, hidden economic and social costs.”

“This article examines the implications of electronic monitoring systems for organizational information flows and worker control, in the context of the U.S. trucking industry. Truckers, a spatially dispersed group of workers with a traditionally independent culture and a high degree of autonomy, are increasingly subjected to performance monitoring via fleet management systems that record and transmit fine-grained data about their location and behaviors. These systems redistribute operational information within firms by accruing real-time aggregated data in a remote company dispatcher. This redistribution results in a seemingly incongruous set of effects. First, abstracted and aggregated data streams allow dispatchers to quantitatively evaluate truckers’ job performance across new metrics, and to challenge truckers’ accounts of local and biophysical conditions. Second, even as these data are abstracted, information about truckers’ activities is simultaneously resocialized via its strategic deployment into truckers’ social relationships with their coworkers and families. These disparate dynamics operate together to facilitate firms’ control over truckers’ daily work practices in a manner that was not previously possible. The trucking case reveals multifaceted pathways to the entrenchment of organizational control via electronic monitoring.”


The Atlantic | 03.05.15

The Failed Attempt to Destroy GPS

Ingrid Burrington

“The accelerated age buries technological origin stories beneath endless piles of timestamped data. When people lose sight of these origin stories, they do a disservice to our technologies and to ourselves.” In this essay Data & Society fellow Ingrid Burrington works through the history of and resistance to GPS, and its connection to networked time, in order to argue that, “[i]n the rush of a persistent accelerated now, interruptions and challenges to life in real-time are sometimes necessary in order to ask what kind of future we’re building.”


Data & Society affiliate Kate Crawford comments on a court case in which a law firm is using data from a plaintiff’s Fitbit in support of her personal injury claim and explores the implications of elective self-tracking technologies for “truth” in legal proceedings.


primer | 10.08.14

Future of Labor: Workplace Surveillance

Alex Rosenblat, Tamara Kneese, danah boyd

Employers have long devised techniques and used new technologies to surveil employees in order to increase efficiency, decrease theft, and otherwise assert power and control over subordinates. New and cheaper networked technologies make surveillance easier to implement, but what are the ramifications of widespread workplace surveillance?

This document was produced as a part of the Future of Work Project at Data & Society Research Institute. This effort is supported by the Open Society Foundations’ U.S. Programs Future of Work inquiry, which is bringing together a cross-disciplinary and diverse group of thinkers to address some of the biggest questions about how work is transforming and what working will look like 20-30 years from now. The inquiry is exploring how the transformation of work, jobs and income will affect the most vulnerable communities, and what can be done to alter the course of events for the better.


In this op-ed, Data & Society fellow Seeta Peña Gangadharan argues that privacy-as-default features “help restore public trust in technology as a tool to improve our lives and collectively self-govern.” “When technologies come to market with security and privacy baked in,” she writes, “they help users navigate an increasingly opaque digital landscape.”


Social Politics | 08.26.14

The Trafficking-Technology Nexus

Jennifer Lynne Musto, danah boyd

Within some public policy and scholarly accounts, human trafficking is increasingly understood as a technological problem that invites collaborative anti-trafficking solutions. A growing cohort of state, non-governmental, and corporate actors in the United States have come together around the shared contention that technology functions as both a facilitator and disrupting force of trafficking, specifically sex trafficking. Despite increased attention to the trafficking-technology nexus, scant research to date has critically unpacked these shifts nor mapped how technology reconfigures anti-trafficking collaborations. In this article, we propose that widespread anxieties and overzealous optimism about technology’s role in facilitating and disrupting trafficking have simultaneously promoted a tri-part anti-trafficking response, one animated by a law and order agenda, operationalized through augmented internet, mobile, and networked surveillance, and maintained through the integration of technology experts and advocates into organized anti-trafficking efforts. We suggest that an examination of technology has purchase for students of gender, sexuality, and neoliberal governmentality in its creation of new methods of surveillance, exclusion, and expertise.


In this op-ed, Data & Society fellow Karen Levy discusses mandating electronic monitoring of truck drivers as a way to address unsafe practices in trucking. She argues that “electronic monitoring is an incomplete solution to a serious public safety problem. If we want safer highways and fewer accidents, we must also attend to the economic realities that drive truckers to push their limits.”


D&S fellow Karen Levy writes about the nation’s trucking system and the need for reform. Based on her three years of research on truckers’ compliance with federal regulations, she argues that reform must address the root economic causes underlying a range of unsafe practices, and that electronic monitoring is an incomplete solution to a serious public safety problem.

Truckers don’t work without sleep for dangerously long stretches (as many acknowledge having done) because it’s fun. They do it because they have to earn a living. The market demands a pace of work that many drivers say is impossible to meet if they’re “driving legal.”

If we want safer highways and fewer accidents, we must also attend to the economic realities that drive truckers to push their limits.


The New Inquiry | 05.30.14

The Anxieties of Big Data

Kate Crawford

In this essay Data & Society affiliate Kate Crawford asks, “What does the lived reality of Big Data feel like?” She offers “surveillant anxiety — the fear that all the data we are shedding every day is too revealing of our intimate selves but may also misrepresent us.” And she pairs the anxiety of the surveilled with the anxiety of the surveillers: “that no matter how much data they have, it is always incomplete, and the sheer volume can overwhelm the critical signals in a fog of possible correlations.”

