
In this blog post for Ethical Resolve, researcher Jacob Metcalf discusses A/B testing and research ethics, arguing that:

…data scientists need to earn the social trust that is the foundation of ethical research in any field. Ultimately, the foundations of ethical research are about trusting social relationships, not our assumptions about how experiments are constituted. This is a critical moment for data-driven enterprises to get creative and thoughtful about building such trust.

Benjamen Walker makes some of the best radio around…. His finest work tends to come out in series of podcasts, exploring a complex issue through interviews and stories that unfold over two or more sequential weekly episodes.

The most recently concluded series is called “Instaserfs”, and it focuses on the “sharing economy”, aka the “1099 economy”, the “gig economy”, or, as Ben offers, the “demand economy” or the “exploitation economy”. Struck by the ability to outsource virtually any task, Benjamen hires San Francisco native Andrew Callaway to make three episodes of his podcast as an “Instapodder”. The working method? Andrew’s task is to take on as many sharing economy jobs as he can and to report back to Benjamen about the experience, including whether he can pay his San Francisco rent with the money he earns.

D&S fellow Mimi Onuoha contemplates our ability to access and analyze data about ourselves and whether or not we actually need (or want) to see all of it.

To be clear, I’m all for data analysis, empowerment, journalism and the things that you can do through all three. But surely we can acknowledge that not everything is suited to routine and saccharine representation through shapes, lines, and maps. Do you want to know how few of your friends will be alive for your 95th birthday? Do you want to know how many times you cried after your last breakup? And those are just the trivial examples!

Perhaps there are things in this world — messy, difficult things — whose very nature demands that we consider them apart from the sense of order, categorization, and understanding that data visualizations tread in. Maybe some things mean less, not more, once categorized and put into metrics.


In this guest blog post for the Responsible Data Forum, D&S fellow Mark Latonero provides an overview of the issues and tensions addressed by the Data, Human Rights & Human Security primer he coauthored with research analyst Zack Gold.

(The post was subsequently published on MasterCard Center for Inclusive Growth Insights.)

Excerpt: “As one of the speakers put it (we’re under Chatham House rules…), listening machines trigger all three aspects of the surveillance holy trinity: they’re pervasive, starting to appear in all aspects of our lives; they’re persistent, capable of keeping records of what we’ve said indefinitely; and they process the data they collect, seeking to understand what people are saying and acting on what they’re able to understand. To reduce the creepy nature of their surveillant behavior, listening systems are often embedded in devices designed to be charming, cute and delightful: toys, robots and smooth-voiced personal assistants.”

D&S founder danah boyd considers recent efforts at reforming laws around student privacy and what it would mean to actually consider the privacy rights of the most marginalized students.

The threats that poor youth face? That youth of color face? And the trade-offs they make in a hypersurveilled world? What would it take to get people to care about how we keep building out infrastructure and backdoors to track low-status youth in new ways? It saddens me that the conversation is constructed as being about student privacy, but it’s really about who has the right to monitor which youth. And, as always, we allow certain actors to continue asserting power over youth.

In this piece, D&S founder danah boyd considers the interwoven political and social goals of education and, in light of these goals, the different ways one can interpret the personalized learning agenda in education. She also asks us to consider who benefits, and who loses, from a technologically mediated world.

Just as recommendation systems result in differentiated experiences online, creating dynamics where one person’s view of the internet radically differs from another’s, so too will personalized learning platforms.

More than anything, what personalized learning brings to the table for me is the stark reality that our society must start grappling with the ways we are both interconnected and differentiated. We are individuals and we are part of networks.

In the realm of education, we cannot and should not separate these two. By recognizing our interconnected nature, we might begin to fulfill the promises that technology can offer our students.

D&S researcher Monica Bulger considers how existing child pornography laws address (or fail to address) the phenomenon of sexting, in which minors produce sexually explicit images of themselves to share with other minors:

Existing US law prohibits the production, possession, sale, and distribution of child pornography, defined by Section 2256 of Title 18, United States Code as “any visual depiction of sexually explicit conduct involving someone under 18 years of age” (2). Penalties under child pornography laws can include fines, imprisonment, and registry as a sex offender. What is unclear in practice, however, is how to distinguish between child sexual abuse images — in which the image is taken and distributed without consent or through violence or coercion, and which seem to be the intended target of existing law — and self-generated images taken by minors and willingly shared with other minors. Given the legislative grey area surrounding sexting, teens are potentially at risk of criminal charges for what has become a widespread practice.

The relevance and applicability of laws intended to protect minors from the production of child sexual abuse images need further investigation in light of new practices of image sharing among teens.

In this blog post, D&S advisor Ethan Zuckerman discusses some of the recent work being done by Helen Nissenbaum. In addition to being a leading thinker on the ethical issues faced in digital spaces, Nissenbaum lends her support to building software that counteracts these issues. Among these projects are Cryptagram, TrackMeNot, and now AdNauseam; all of these programs work in different ways to give users a sense of agency in their online dealings. To find out more about Nissenbaum’s work and AdNauseam, read Ethan’s blog.

Pointing to an essay on high-frequency trading by Donald MacKenzie, D&S fellow Martha Poon comments on the tension between technologies accommodating “a variety of social visions” and available choices being constrained by material infrastructures.
