

report | 08.18.22

The Promises, Challenges, and Futures of Media Literacy

Monica Bulger and Patrick Davison

This report responds to the “fake news” problem by evaluating the successes and failures of recent media literacy efforts while pointing towards next steps for educators, legislators, technologists, and philanthropists.


report | 08.18.22

Dead Reckoning

Robyn Caplan, Lauren Hanson, and Joan Donovan

New Data & Society report clarifies current uses of “fake news” and analyzes four specific strategies for intervention.


video | 01.24.18

Are Internet Trolls Born or Made?

Amanda Lenhart, Kathryn Zickuhr

What are internet trolls? In this video, Above the Noise explains where internet trolls come from and encourages viewers to read Data & Society’s report “Online Harassment, Digital Abuse, and Cyberstalking in America.”


points | 01.17.18

Don’t Call AI “Magic”

Madeleine Clare Elish

Artificial intelligence is increasingly used across multiple sectors, and people often describe its function as “magic.” In this blog post, D&S researcher Madeleine Clare Elish points out that there is nothing magical about AI and reminds us that the human labor involved in making AI systems work is often rendered invisible.

“From one perspective, this makes sense: Working like magic implies impressive and seamless functionality and the means by which the effect was achieved is hidden from view or even irrelevant. Yet, from another perspective, implying something works like magic focuses attention on the end result, denying an accounting of the means by which that end result was reached.”


D&S founder and president danah boyd sings the praises of Virginia Eubanks’s new book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.

“This book should be mandatory for anyone who works in social services, government, or the technology sector because it forces you to really think about what algorithmic decision-making tools are doing to our public sector, and the costs that this has on the people that are supposedly being served. It’s also essential reading for taxpayers and voters who need to understand why technology is not the panacea that it’s often purported to be. Or rather, how capitalizing on the benefits of technology will require serious investment and a deep commitment to improving the quality of social services, rather than a tax cut.”


In the gig economy, management by algorithm means that employment relationships grow more remote and distributed across the network. Alex Rosenblat explains how workers navigate this by creating their own online forums.

“Online forums aren’t just helping drivers like Cole navigate the challenges of their work, and helping those of us who use and study these platforms grasp those challenges too. They show how as employment relationships grow more remote and distributed across the network, workers can adapt, using technology to forge their own workplace culture.”


At this year’s Coalition for Networked Information (CNI) Fall 2017 Membership Meeting, Bonnie Tijerina spoke about the implications of using data science in libraries.

“Data science approaches may shed light on otherwise hard-to-see problems in the library.”  — D&S researcher Bonnie Tijerina



The informational, economic, and political influence of the dominant tech platforms — Google, Facebook, and Amazon in particular — has become a central topic of debate. In this talk, K. Sabeel Rahman argues that these firms are best understood as the core infrastructure of our 21st century economy and public sphere. The infrastructural power of these firms raises a range of policy questions. What exactly about these firms (e.g., their accumulation of data, their gatekeeping functions, their control over vital public and economic functions like retail delivery or online speech) is “infrastructural?” How should these infrastructural functions be governed and regulated, in light of both their economic and political influence?

Professor Rahman sketches some tentative answers to these questions, drawing on the intellectual history of early 20th century “public utility regulation,” where reformers developed a compelling approach to diagnosing and remedying the problem of private power over the essential infrastructure of the industrial economy, from railroads to finance.

This history suggests some design principles and opens up some novel implications for addressing the problem of platform power in the digital economy. The talk explores more contemporary analogies and applications in the context of our current debates over informational platforms, big data, AI, and algorithms, in order to sketch out some principles for what a public utility-style regulatory approach to Internet platforms would look like.

K. Sabeel Rahman is a Visiting Professor of Law at Harvard Law School, an Assistant Professor of Law at Brooklyn Law School, and a Fellow at the Roosevelt Institute. Rahman earned his AB at Harvard College summa cum laude in Social Studies and returned to Harvard for his JD at Harvard Law School and his PhD in the Harvard Government Department. He also has degrees in Economics and Sociolegal Studies from Oxford, where he was a Rhodes Scholar.


Miranda Katz of WIRED interviews D&S founder and president danah boyd on the evolving public discourse around disinformation and how the tech industry can help rebuild American society.

“It’s actually really clear: How do you reknit society? Society is produced by the social connections that are knit together. The stronger those networks, the stronger the society. We have to make a concerted effort to create social ties, social relationships, social networks in the classic sense that allow for strategic bridges across the polis so that people can see themselves as one.”


On Wednesday, Nov. 29th, the Supreme Court heard Carpenter v. United States, a Fourth Amendment case on access to cell site location data. Postdoctoral scholars Julia Ticona and Andrew Selbst urged the court to recognize that owning a cell phone is not meaningfully voluntary in this day and age.

“The justices will surely understand that without any alternatives for accessing online services, vulnerable (and over-policed) populations will be unable to make meaningful choices to protect their privacy, amplifying the disadvantages they already face.”


The Daily 360 interviews D&S INFRA Lead Ingrid Burrington about how we can see the internet in everyday life.

“First place to look if you’re looking for the internet on a city street is down because a lot of it runs through fiberoptic cables that are buried under the street.”


Is Facebook a platform or a media company? NBC News THINK asks D&S researcher Robyn Caplan to comment on the recent tech hearings.

“Facebook thinks of itself as a neutral platform where everyone can come and share ideas…They’re basically saying that they’re the neutral public sphere. That they are the marketplace of ideas, instead of being the marketers of ideas.”


This is a transcript of Data & Society founder and president danah boyd’s recent lightning talk at The People’s Disruption: Platform Co-Ops for Global Challenges.

“But as many of you know, power corrupts. And the same geek masculinities that were once rejuvenating have spiraled out of control. Today, we’re watching as diversity becomes a wedge issue that can be used to radicalize disaffected young men in tech. The gendered nature of tech is getting ugly.”


D&S researcher Claire Fontaine explores issues of accessibility in NYC schools.

“So for the past six months, I’ve been asking local parents about the data they used to choose among the system’s 1700 or so schools…Beyond the usual considerations like test scores and art programs, they also consider the logistics of commuting from the Bronx to the East Village with two children in tow, whether the school can accommodate parents and children who are still learning English, and how much money the parent-teacher association raises to supplement the school’s budget.

But for some families, the choice process begins and ends with the question: Is the building fully accessible?”


D&S researcher Monica Bulger co-authored an article on how children are engaging with technology across different countries.

“Beyond revealing pressing and sizeable gaps in knowledge, this cross-national review also reveals the importance of understanding local values and practices regarding the use of technologies. This leads us to stress that future researchers must take into account local contexts and existing inequalities and must share best practices internationally so that children can navigate the balance between risks and opportunities.”



D&S founder and President danah boyd & affiliate Solon Barocas investigate the practice of ethics in data science.

“Critical commentary on data science has converged on a worrisome idea: that data scientists do not recognize their power and, thus, wield it carelessly. These criticisms channel legitimate concerns about data science into doubts about the ethical awareness of its practitioners. For these critics, carelessness and indifference explains much of the problem—to which only they can offer a solution.”


Columbia Law Review | 11.01.17

The Taking Economy: Uber, Information, And Power

Alex Rosenblat, Ryan Calo

D&S researcher Alex Rosenblat co-authored an article on power dynamics in the sharing economy.

“Sharing economy firms such as Uber and Airbnb facilitate trusted transactions between strangers on digital platforms. This creates economic and other value but raises concerns around racial bias, safety, and fairness to competitors and workers that legal scholarship has begun to address. Missing from the literature, however, is a fundamental critique of the sharing economy grounded in asymmetries of information and power.”


D&S INFRA Lead Ingrid Burrington investigates community network infrastructures in times of disaster.

“By design, resilient network infrastructure prioritizes interdependence and cooperation over self-sufficiency — without strong underlying social ties, there is no localized network infrastructure. The technical complexities of building a network are a lot easier to overcome than the political complexities of building community, political agency, and governance.”


D&S researcher Mary Madden shared her findings from her recent report “Privacy, Security and Digital Inequality.”

“Not only do Americans with lower levels of income and education have fewer technology resources, but they also express heightened sensitivities about a range of data privacy concerns.”


D&S researcher Alex Rosenblat explores when incentives in the gig economy become deceptive.

“While charging for work opportunities is reminiscent of multi-level marketing, like Mary Kay or Amway, this is different because Uber controls so much of the labor process, like dispatch, and competing promotional pay, in addition to setting the base rates at which drivers earn their income. In other words, drivers can use their labor as collateral on their down payment now in exchange for earning a premium on their labor later, but Uber ultimately controls whether or not the promotion is worthwhile.”


paper | 10.01.17

Calculated Risks

Ingrid Burrington

D&S INFRA Lead Ingrid Burrington reflects on the concept of the future in anticipation of the upcoming show Futureproof that she curated at Haverford College.

“Trying to build alternative futures is often a process of facing that haunting spectre: finding life or potential by invoking and living with the ghosts and weird spirits of a world that could have been. Often, the interface for visiting these particular ghosts isn’t the medium or Ouija board but the archive, which is partly why so many of the works in Futureproof take on an archivist, museological tone. The alternative archive is historical evidence of a shift in the timeline, its own kind of proof that another timeline is not only possible, but has already happened, is already happening and emergent before us.”


On September 27th, D&S Fellow Taeyoon Choi released the first two chapters of his online book “Poetic Computation: Reader,” which looks at code as a form of poetry as well as the ethics behind it. Because it is an online book, readers can customize the design elements of the text to their preferences as they read.

Choi is co-founder of The School for Poetic Computation, based in New York City, and the book is based on two of his lectures from its curriculum. The remaining chapters will be published later this year.



In late August, D&S Researcher Anne Washington talked real-world implications of AI with the Centre for Public Impact’s Joel Tito.

Who wants to talk about the end of the world? CPI’s Joel Tito and Data & Society’s Anne Washington certainly do – and it’s this discussion point which kicks off our latest podcast.

Washington, a digital government scholar whose work addresses emerging policy needs for data science, tells Joel what she is most afraid of when it comes to artificial intelligence (AI) and its application to government. She also explains that while AI can make processes more efficient and streamlined, it shouldn’t be used for “really complicated human decisions”. Find out why here, as well as whether she thinks we should seek inspiration from the Romans or ancient Greeks when it comes to AI and government…


book | 03.01.18

Trump and the Media

Edited by Pablo J. Boczkowski and Zizi Papacharissi

D&S Founder danah boyd and Researcher Robyn Caplan contributed to the book “Trump and the Media,” which examines the role the media played in the election of Donald Trump.

Other contributors include: Mike Ananny, Chris W. Anderson, Rodney Benson, Pablo J. Boczkowski, Michael X. Delli Carpini, Josh Cowls, Susan J. Douglas, Keith N. Hampton, Dave Karpf, Daniel Kreiss, Seth C. Lewis, Zoey Lichtenheld, Andrew L. Mendelson, Gina Neff, Zizi Papacharissi, Katy E. Pearce, Victor Pickard, Sue Robinson, Adrienne Russell, Ralph Schroeder, Michael Schudson, Julia Sonnevend, Keren Tenenboim-Weinblatt, Tina Tucker, Fred Turner, Nikki Usher, Karin Wahl-Jorgensen, Silvio Waisbord, Barbie Zelizer.


In this blog post, Zachary Gold dives into the implications of the Children’s Internet Protection Act (CIPA), enacted in 2000.

“Many parents certainly worry about their children getting access to inappropriate material online, and CIPA may have been a reasonable way to address that concern when it was passed. The devices we use, and the way we use the internet, have changed drastically since then. Updating CIPA, or replacing it to govern these new devices and connections being used by students could do more harm than good. Keeping pornography out of student’s schoolrooms is important, but filtering and monitoring student’s internet activity around town and at home blurs the role of school administrators.”


Today, Data & Society releases “Privacy, Security, and Digital Inequality” by Mary Madden, the first in-depth analysis of the privacy and security experiences of low-socioeconomic-status populations in the United States.

Supported by the Digital Trust Foundation, the report finds that most of those living in U.S. households with annual incomes of less than $20,000 per year are acutely aware of a range of digital privacy harms, but many say it would be difficult to access the tools and strategies that could help them protect their personal information online. The report provides additional insights about mobile device use and demand for digital privacy and security training.

In light of the September 18th announcement by the U.S. Department of Homeland Security[1] about federal agencies’ intent to collect social media information and search history from a variety of immigrant groups, “Privacy, Security, and Digital Inequality” is especially relevant: In particular, the report finds that foreign-born Hispanic adults stand out for both their privacy sensitivities, and for their desire to learn more about safeguarding their personal information.

“Privacy, Security, and Digital Inequality” includes detailed comparisons across different racial, ethnic, and nativity groups, finding that there are substantial gaps across these groups when looking at reliance on mobile connectivity.[2]

“This study highlights the disconnect between the one-size-fits-all conversations about privacy-related risk that happen in Washington and the concerns that are most salient to the communities who have long experienced a disproportionate level of surveillance and injustice in their daily lives,” said Madden, Researcher at Data & Society and lead author of the report. “When those who influence policy and technology design have a lower perception of privacy risk themselves, it contributes to a lack of investment in the kind of safeguards and protections that vulnerable communities both want and need.”

In light of new pressures surrounding immigration policy and status in the United States, the report is a highly relevant snapshot of the demand for privacy- and security-related training among some of the most vulnerable of these low-socioeconomic-status groups. The report also finds a disproportionate reliance on mobile devices, offering a potential starting point for those looking to provide educational resources.

“This report illustrates the many ways in which smartphones have become an indispensable source of internet access for those who may lack other technology resources in their homes and communities,” said Michele Gilman, Venable Professor of Law at the University of Baltimore and Director of the Saul Ewing Civil Advocacy Clinic. “Far from being a luxury, smartphones—with their many benefits and vulnerabilities—offer a critical source of connection to jobs, family, education and government services.”

Gilman, a poverty law expert, also served on the Research Advisory Board for the two-year research project, and co-authored a related law review article with Madden titled, “Privacy, Poverty and Big Data: A Matrix of Vulnerabilities for Poor Americans.”

“Privacy, Security, and Digital Inequality” is based on newly released data from a nationally representative telephone survey of 3,000 American adults. The survey, which included interviews in both English and Spanish, was made possible by a grant from the Digital Trust Foundation and fielded in November and December of 2015.



[1] Full text here.
[2] The analysis of racial and ethnic minority groups in this report is limited by the survey sample size, and does not include detailed comparisons of Asians, Native Americans, and other subgroups. For instance, in this survey, out of 3,000 respondents, just 3% identified as Asian or Asian American.


Additional Resources

For more information about groups working on these issues and in these spaces, we invite you to take a look at resources provided by the following organizations. We welcome additional suggestions:

Center for Media Justice – Resource Library
Equality Labs
Freedom of the Press Foundation (link goes to resources)
American Civil Liberties Union – Privacy and Technology, Free Speech
Berkman Klein Center
Color of Change
EPIC – Electronic Privacy Information Center
Future of Privacy Forum
Georgetown Center on Privacy & Technology (link goes to resources)
National Hispanic Media Coalition
Our Data Bodies (link goes to resources)
Pew Research Center
Public Knowledge
Rad.Cat (link goes to resources)
Southern Poverty Law Center


D&S Media Manipulation Research Lead Joan Donovan talks about the role of large tech companies in curbing extremist activity online.

Joan Donovan, a media manipulation research lead at the research institute Data & Society, said it’s well within these companies’ reach to implement changes that will curb white supremacist activity. And it’s something she said major platforms like Facebook and Twitter will have to confront as they acknowledge their role in magnifying hate speech and those who spout it.

‘Richard Spencer might have a megaphone and his own website to communicate his messages of hate,’ Donovan said in a phone interview Wednesday. ‘Now these platforms are realizing they are the megaphone. They are the conduit between him and larger audiences.’

Movements like the so-called ‘alt-right’ aren’t just built on charisma, Donovan added — they’re built on infrastructure. The internet and all of its possibilities has now become a major part of that infrastructure.


D&S Researcher Alex Rosenblat was interviewed about Uber for Klint Finley’s article in Wired.

Tuesday’s agreement may not be the end of Uber’s problems with the FTC either. Hartzog says a recent paper by University of Washington law professor Ryan Calo and multidisciplinary researcher Alex Rosenblat of the research institute Data & Society points to other potential privacy concerns, such as monitoring how much battery power remains on a user’s device, because users with little juice might be willing to pay more for a ride.

‘When a company can design an environment from scratch, track consumer behavior in that environment, and change the conditions throughout that environment based on what the firm observes, the possibilities to manipulate are legion,’ Calo and Rosenblat write. ‘Companies can reach consumers at their most vulnerable, nudge them into overconsumption, and charge each consumer the most he or she may be willing to pay.’


Quartz cites D&S Postdoctoral Scholar Caroline Jack in their guide to Lexicon of Lies:

Problematic information comes in various forms, each uniquely irksome. Yet people are quick to blast all inaccuracies as “fake news,” reinforcing the sense that facts are a thing of the past.

That’s dangerous and it needn’t be the case, according to the Lexicon of Lies, a recent report from the New York-based Data and Society research institute. “The words we choose to describe media manipulation can lead to assumptions about how information spreads, who spreads it, and who receives it,” writes Caroline Jack, a media historian and postdoctoral fellow at Data and Society. On a cultural level, “these assumptions can shape what kinds of interventions or solutions seem desirable, appropriate, or even possible,” she writes.


Teaching Tolerance | 08.17.14

What is the Alt-Right?

Becca Lewis

D&S Researcher Becca Lewis discusses the recruiting tactics of the alt-right in Teaching Tolerance.

‘Social media can be very powerful in shaping outlooks, but it doesn’t operate in a vacuum,’ explains Data & Society researcher Becca Lewis. ‘The shaping is coming from the other people using the platforms.’

The alt-right has a massive presence on social media and other channels where young people congregate. A Washington Post analysis identified 27,000 influential Twitter accounts associated with the alt-right, 13 percent of which are considered radical. Later, a George Washington University study found that white nationalist accounts in the United States have seen their follower counts grow by 600 percent since 2012.


testimony | 08.11.17

Data & Society, Fifteen Scholars File Amicus Brief in Pending SCOTUS Case

Marcia Hofmann, Kendra Albert, Andrew D. Selbst

On August 11, 2017, Data & Society and fifteen individual scholars—including danah boyd, Julia Ticona, and Amanda Lenhart—filed an amicus brief in a pending U.S. Supreme Court case, Carpenter v. United States. The parties were represented by Andrew Selbst of Data & Society, and Marcia Hofmann and Kendra Albert of Zeitgeist Law.

The case implicates the Fourth Amendment’s “third party doctrine,” which holds that people who “voluntarily convey” information to third parties do not have a reasonable expectation of privacy. As a result, when police obtain records from a third party, it does not currently implicate Fourth Amendment rights.

Timothy Carpenter was convicted of a string of armed robberies based on cell site location data that placed him in the vicinity of the robberies he was accused of committing. The case concerns the legality under the Fourth Amendment of the warrantless search and seizure of Carpenter’s historical cellphone records, which reveal his location and movements over the course of 127 days.

In the brief, we argue that the “third party doctrine” should not apply to cell site location information because cell phones are not meaningfully voluntary in modern society. Cell site location information contains abundant information about people’s lives, and unfettered police access to it poses a threat to privacy rights.

Aided by scholarship and statistics from the Data & Society research team, we provide evidence that the 95% of Americans who own cell phones cannot reasonably be expected to give them up to avoid police searches. The research shows that cell phones are:

  1. Necessary to participate in the most basic aspects of social and family life;
  2. Essential public safety infrastructure and personal safety equipment;
  3. Both necessary to find employment, and an important part of workplace infrastructure;
  4. Widely used for commerce and banking;
  5. Key for civic participation;
  6. Key for enabling better health outcomes;
  7. Critical to vulnerable populations; and
  8. Recognized as a necessity by the U.S. government in the past.

The case is expected to be heard in the fall of 2017.



points | 07.12.17

Who Cares in the Gig Economy?

Alexandra Mateescu

Data & Society Researcher Alexandra Mateescu maps out the inequalities and power dynamics within the gig economy.

“As on-demand companies like Handy and online marketplaces like Care.com enter the space of domestic work, a range of questions emerge: what are the risks and challenges of signing up for platform-based work as an immigrant? As a non-native English speaker? How are experiences of work different for individuals with strong professional identities as caregivers or housekeepers, versus more casual workers who may also be finding other kinds of work via Postmates or Uber?”


D&S Researcher Madeleine Clare Elish discusses the implications of biased AI in different contexts.

She said when AI is applied to areas like targeted marketing or customer service, this kind of bias is essentially an inconvenience. Models won’t deliver good results, but at the end of the day, no one gets hurt.

The second type of bias, though, can be more impactful to people. Elish talked about how AI is increasingly seeping into areas like insurance, credit scoring and criminal justice. Here, biases, whether they result from unrepresentative data samples or from unconscious partialities of developers, can have much more severe effects.


D&S founder danah boyd discusses machine learning algorithms and prejudice, digital white flight on social media, trust in the media, and more on The Ezra Klein Show.

“Technology is made by people in a society, and it has a tendency to mirror and magnify the issues that affect everyday life.”


D&S lawyer-in-residence Rebecca Wexler describes the intersection of automated technologies, trade secrets, and the criminal justice system.

For-profit companies dominate the criminal justice technologies industry and produce computer programs that are widespread throughout the justice system. These automated programs deploy cops, analyze forensic evidence, and assess the risk levels of inmates. But these technological advances may be making the system less fair, and without access to the source code, it’s impossible to hold computers to account.


Washington Monthly | 06.13.17

Code of Silence

Rebecca Wexler

D&S lawyer-in-residence Rebecca Wexler unpacks how private companies hide flaws in software that the government uses to convict and exonerate people in the criminal justice system.

What’s alarming about protecting trade secrets in criminal cases is that it allows private companies to withhold information not from competitors, but from individual defendants like Glenn Rodríguez. Generally, a defendant who wants to see evidence in someone else’s possession has to show that it is likely to be relevant to his case. When the evidence is considered “privileged,” the bar rises: he often has to convince the judge that the evidence could be necessary to his case—something that’s hard to do when, by definition, it’s evidence the defense hasn’t yet seen.


D&S lawyer-in-residence Rebecca Wexler describes the flaws of an increasingly automated criminal justice system.

The root of the problem is that automated criminal justice technologies are largely privately owned and sold for profit. The developers tend to view their technologies as trade secrets. As a result, they often refuse to disclose details about how their tools work, even to criminal defendants and their attorneys, even under a protective order, even in the controlled context of a criminal proceeding or parole hearing.


Ford Foundation blog | 05.30.17

Why You Should Care About Bots If You Care About Social Justice

Wilneida Negrón, Morgan Hargrave

D&S affiliate Wilneida Negrón details the role of bots and automation in activism today.

As everyone from advertisers to political adversaries jockey for attention, they are increasingly using automated technologies and processes to raise their own voices or drown out others. In fact, 62 percent of all Internet traffic is made up of programs acting on their own to analyze information, find vulnerabilities, or spread messages. Up to 48 million of Twitter’s 320 million users are bots, or applications that perform automated tasks. Some bots post beautiful art from museum collections, while some spread abuse and misinformation instead. Automation itself isn’t cutting edge, but the prevalence and sophistication of how automated tools interact with users is.


How do young people of low socio-economic status (SES) view online privacy? D&S fellow Alice Marwick, researcher Claire Fontaine, and president and founder danah boyd examine this question in their study.

“Framing online privacy violations as inevitable and widespread may not only help foster activist anger and strategic resistance but also avoid the victim-blaming narratives of some media literacy efforts. By examining the experiences of these young people, who are often left out of mainstream discussions about privacy, we hope to show how approaches to managing the interplay of on- and offline information flows are related to marginalized social and economic positions.”


D&S researcher Alex Rosenblat explains how and why Uber & Lyft drivers surveil their passengers during rides.

Passenger shaming is partly a consequence of the Uber/Lyft business model. Drivers can’t get reliable accountability from their employers or passengers, so they turn to tools like dash-cams. These are part of the externalized costs of the lean gig economy employment model.


D&S researcher Mark Latonero provides an overview of the role of large tech companies in refugee crises.

While the 40-page brief is filled with arguments in support of immigration, it hardly speaks about refugees, except to note that those seeking protection should be welcomed. Any multinational company with a diverse workforce would be concerned about limits to international hiring and employee travel. But tech companies should also be concerned about the refugee populations that depend on their digital services for safety and survival.


report | 05.15.17

Media Manipulation and Disinformation Online

Alice Marwick and Rebecca Lewis

New Report Reveals Why Media Was Vulnerable to Radicalized Groups Online


D&S affiliate Seeta Peña Gangadharan writes about defending digital rights of library patrons.

If this sounds complicated and scary, that’s because it is. But confronted with this matrix of vulnerabilities, the library—with its longstanding commitment to patron privacy—also offers an impressive plan of action.




D&S affiliate Mimi Onuoha states that discarded and sold hardware often has data still on it.

It’s not just individuals who are lax about removing data, companies around the world are at fault as well. In a 2007 study researchers in Canada obtained 60 secondhand drives that had previously belonged to health care facilities. They were able to recover personal information from 65% of the drives. The data included, in the words of the researchers, “very sensitive mental health information on a large number of people.”


D&S affiliate Desmond Patton breaks down how social media can lead to gun violence in this piece in The Trace.

Social media doesn’t allow for the opportunity to physically de-escalate an argument. Instead, it offers myriad ways to exacerbate a brewing conflict as opposing gangs or crews and friends and family take turns weighing in.

