TechTarget | 07.07.18
D&S researcher Madeleine Clare Elish discusses the implications of biased AI in different contexts.
She said when AI is applied to areas like targeted marketing or customer service, this kind of bias is essentially an inconvenience. Models won’t deliver good results, but at the end of the day, no one gets hurt.
The second type of bias, though, can be more impactful to people. Elish talked about how AI is increasingly seeping into areas like insurance, credit scoring and criminal justice. Here, biases, whether they result from unrepresentative data samples or from unconscious partialities of developers, can have much more severe effects.
D&S founder danah boyd discusses machine learning algorithms and prejudice, digital white flight on social media, trust in the media, and more on The Ezra Klein Show.
“Technology is made by people in a society, and it has a tendency to mirror and magnify the issues that affect everyday life.”
WNYC The Takeaway | 08.17.16
D&S lawyer-in-residence Rebecca Wexler describes the intersection of automated technologies, trade secrets, and the criminal justice system.
For-profit companies dominate the criminal justice technologies industry and produce computer programs that are widespread throughout the justice system. These automated programs deploy cops, analyze forensic evidence, and assess the risk levels of inmates. But these technological advances may be making the system less fair, and without access to the source code, it’s impossible to hold computers to account.
D&S lawyer-in-residence Rebecca Wexler unpacks how private companies hide flaws in software that the government uses to convict and exonerate people in the criminal justice system.
What’s alarming about protecting trade secrets in criminal cases is that it allows private companies to withhold information not from competitors, but from individual defendants like Glenn Rodríguez. Generally, a defendant who wants to see evidence in someone else’s possession has to show that it is likely to be relevant to his case. When the evidence is considered “privileged,” the bar rises: he often has to convince the judge that the evidence could be necessary to his case—something that’s hard to do when, by definition, it’s evidence the defense hasn’t yet seen.
D&S resident Rebecca Wexler describes the flaws of an increasingly automated criminal justice system.
The root of the problem is that automated criminal justice technologies are largely privately owned and sold for profit. The developers tend to view their technologies as trade secrets. As a result, they often refuse to disclose details about how their tools work, even to criminal defendants and their attorneys, even under a protective order, even in the controlled context of a criminal proceeding or parole hearing.
Ford Foundation blog | 05.30.17
D&S affiliate Wilneida Negrón details the role of bots and automation in activism today.
As everyone from advertisers to political adversaries jockey for attention, they are increasingly using automated technologies and processes to raise their own voices or drown out others. In fact, 62 percent of all Internet traffic is made up of programs acting on their own to analyze information, find vulnerabilities, or spread messages. Up to 48 million of Twitter’s 320 million users are bots, or applications that perform automated tasks. Some bots post beautiful art from museum collections, while others spread abuse and misinformation. Automation itself isn’t cutting edge, but the prevalence and sophistication of how automated tools interact with users is.
Sage Journals | 05.30.17
D&S researcher Alex Rosenblat explains how and why Uber & Lyft drivers surveil their passengers during rides.
Passenger shaming is partly a consequence of the Uber/Lyft business model. Drivers can’t get reliable accountability from their employers or passengers, so they turn to tools like dash-cams. These are part of the externalized costs of the lean gig economy employment model.
Protecting Patron Privacy, edited by Bobbi Newman and Data & Society Researcher Bonnie Tijerina, suggests strategies for data privacy in libraries.
Although privacy is one of the core tenets of librarianship, technological change has made it increasingly difficult for libraries to ensure the privacy of their patrons in the 21st-century library.
This authoritative LITA Guide offers readers guidance on a wide range of topics, including:
• Foundations of privacy in libraries
• Data collection, retention, use, and protection
• Laws and regulations
• Privacy instruction for patrons and staff
• Contracts with third parties
• Use of in-house and internet tools including social network sites, surveillance video, and RFID
Harvard Business Review | 05.16.17
D&S researcher Mark Latonero provides an overview of the role of large tech companies in refugee crises.
While the 40-page brief is filled with arguments in support of immigration, it hardly speaks about refugees, except to note that those seeking protection should be welcomed. Any multinational company with a diverse workforce would be concerned about limits to international hiring and employee travel. But tech companies should also be concerned about the refugee populations that depend on their digital services for safety and survival.
report | 05.15.17
New Report Reveals Why Media Was Vulnerable to Radicalized Groups Online
New America | 05.11.17
D&S affiliate Seeta Peña Gangadharan writes about defending digital rights of library patrons.
If this sounds complicated and scary, that’s because it is. But confronted with this matrix of vulnerabilities, the library—with its longstanding commitment to patron privacy—also offers an impressive plan of action.
Quartz | 05.10.17
D&S affiliate Mimi Onuoha states that discarded and sold hardware often has data still on it.
It’s not just individuals who are lax about removing data; companies around the world are at fault as well. In a 2007 study, researchers in Canada obtained 60 secondhand drives that had previously belonged to health care facilities. They were able to recover personal information from 65% of the drives. The data included, in the words of the researchers, “very sensitive mental health information on a large number of people.”
Jack Flynn and Data & Society Research Analyst Kinjal Dave examine the role of the Koch brothers at Villanova.
“In order to preserve academic freedom and integrity on campus, the university must introduce substantive measures improving transparency and visibility surrounding the influence of outside donors on campus.”
The Trace | 05.03.17
First Monday | 05.01.17
Philip Napoli and D&S researcher Robyn Caplan write for First Monday on why companies like Google and Facebook insist that they are merely tech companies with no media impact, and why they are wrong. Abstract is below:
A common position amongst social media platforms and online content aggregators is their resistance to being characterized as media companies. Rather, companies such as Google, Facebook, and Twitter have regularly insisted that they should be thought of purely as technology companies. This paper critiques the position that these platforms are technology companies rather than media companies, explores the underlying rationales, and considers the political, legal, and policy implications associated with accepting or rejecting this position. As this paper illustrates, this is no mere semantic distinction, given the history of the precise classification of communications technologies and services having profound ramifications for how these technologies and services are considered by policy-makers and the courts.
In this Points piece, D&S researcher Claire Fontaine argues that educational marketplaces disproportionately advantage those with surplus time, energy, social capital, and institutional knowledge.
“Schooling is not a service, and it is not a commodity. It is — or should be — the means through which a society as diverse as our own coheres and develops a functional social fabric.”
Harvard Business Review | 04.19.17
Jongbin Jung, Connor Concannon, D&S fellow Ravi Shroff, Sharad Goel, and Daniel G. Goldstein explore new methods for machine learning in criminal justice.
Simple rules certainly have their advantages, but one might reasonably wonder whether favoring simplicity means sacrificing performance. In many cases the answer, surprisingly, is no. We compared our simple rules to complex machine learning algorithms. In the case of judicial decisions, the risk chart above performed nearly identically to the best statistical risk assessment techniques. Replicating our analysis in 22 varied domains, we found that this phenomenon holds: Simple, transparent decision rules often perform on par with complex, opaque machine learning methods.
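The simple rules the authors describe are transparent risk charts: a handful of features, small integer weights, and a legible total. The sketch below is only illustrative of that style; the feature names, thresholds, and weights are hypothetical, not taken from the study.

```python
# Illustrative "simple rule" risk chart: a few features, small integer
# weights, and a total anyone can recompute by hand. The specific
# features and point values here are hypothetical, not from the study.
def risk_score(age: int, prior_failures_to_appear: int) -> int:
    score = 0
    if age < 25:
        score += 2  # younger defendants scored as higher risk
    elif age < 35:
        score += 1
    score += min(prior_failures_to_appear, 3)  # cap the count's influence
    return score  # ranges from 0 (lowest risk) to 5 (highest risk)
```

Because every point in such a chart is legible, a judge can see exactly why a defendant received a given score — a property opaque machine learning models cannot offer, which is part of the appeal when the predictive performance is comparable.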
American Press Institute | 04.12.17
D&S researcher Mary Madden was interviewed by the American Press Institute about Madden’s recent Knight Foundation-supported report, “How Youth Navigate the News Landscape.”
However, one of my favorite quotes was from a participant who described a future where news would be delivered by hologram: “I think like it’s going to be little holograms. You’re going to open this thing and a little guy’s going to come out and tell you about stuff.”
Given that some participants said they already found notifications annoying, I’m not sure how successful the little hologram guy would be, but it was clear that the participants fully expected that the news industry would continue to evolve and innovate in creative ways moving forward.
Centre for Public Impact | 04.12.17
D&S fellow Anne L. Washington discusses her previous research as a digital government scholar and her upcoming work examining US open data policy, funded through a five-year National Science Foundation Faculty Early Career Development (CAREER) grant.
“We use a secret language in academia sometimes,” [Washington] says, laughing. “‘Technology management’ is about how organisations leverage digital assets for strategic business goals. My doctorate is in management information systems. On the other side of that is ‘informatics’, which comes from the library science tradition. Over centuries, librarians have refined how to store and retrieve knowledge so people can find what they need and walk away smarter. Informatics takes this basic idea and scales it up for massive digital collections.”
ProPublica | 04.05.17
Rhizome | 04.05.17
D&S artist-in-residence Ingrid Burrington, with Josh Begley and Seth Freed Wessler, created an installation on view at the Ace Hotel.
Building on Wessler’s journalistic investigation into privately run immigrant-only federal prisons, Burrington and Begley present seventy-five individual lenticular prints of satellite imagery capturing these sites and government documents pertaining to them. Together the images, arranged into three distinct grids, explore and abstract “the terrain of U.S. immigration and carceral policy and the human stories usually conspicuously absent in the aerial perspective.”
paper | 04.02.17
Sharad Goel, Maya Perelman, D&S fellow Ravi Shroff, and David Alan Sklansky examine a method that could be used “to reduce the racially disparate impact of pedestrian searches and to increase their effectiveness.” Abstract is below:
The exponential growth of available information about routine police activities offers new opportunities to improve the fairness and effectiveness of police practices. We illustrate the point by showing how a particular kind of calculation made possible by modern, large-scale datasets — determining the likelihood that stopping and frisking a particular pedestrian will result in the discovery of contraband or other evidence of criminal activity — could be used to reduce the racially disparate impact of pedestrian searches and to increase their effectiveness. For tools of this kind to achieve their full potential in improving policing, though, the legal system will need to adapt. One important change would be to understand police tactics such as investigatory stops of pedestrians or motorists as programs, not as isolated occurrences. Beyond that, the judiciary will need to grow more comfortable with statistical proof of discriminatory policing, and the police will need to be more receptive to the assistance that algorithms can provide in reducing bias.
PLOS Computational Biology | 03.30.17
Matthew Zook, D&S affiliate Solon Barocas, D&S founder danah boyd, D&S affiliate Kate Crawford, Emily Keller, D&S affiliate Seeta Peña Gangadharan, Alyssa Goldman, Rachelle Hollander, Barbara A. Koenig, D&S researcher Jacob Metcalf, Arvind Narayanan, D&S advisor Alondra Nelson, and Frank Pasquale wrote a paper detailing ten rules for responsible big data research. Introduction is below:
The use of big data research methods has grown tremendously over the past five years in both academia and industry. As the size and complexity of available datasets has grown, so too have the ethical questions raised by big data research. These questions become increasingly urgent as data and research agendas move well beyond those typical of the computational and natural sciences, to more directly address sensitive aspects of human behavior, interaction, and health. The tools of big data research are increasingly woven into our daily lives, including mining digital medical records for scientific and economic insights, mapping relationships via social media, capturing individuals’ speech and action via sensors, tracking movement across space, shaping police and security policy via “predictive policing,” and much more.
The beneficial possibilities for big data in science and industry are tempered by new challenges facing researchers that often lie outside their training and comfort zone. Social scientists now grapple with data structures and cloud computing, while computer scientists must contend with human subject protocols and institutional review boards (IRBs). While the connection between individual datum and actual human beings can appear quite abstract, the scope, scale, and complexity of many forms of big data creates a rich ecosystem in which human participants and their communities are deeply embedded and susceptible to harm. This complexity challenges any normative set of rules and makes devising universal guidelines difficult.
Nevertheless, the need for direction in responsible big data research is evident, and this article provides a set of “ten simple rules” for addressing the complex ethical issues that will inevitably arise. Modeled on PLOS Computational Biology’s ongoing collection of rules, the recommendations we outline involve more nuance than the words “simple” and “rules” suggest. This nuance is inevitably tied to our paper’s starting premise: all big data research on social, medical, psychological, and economic phenomena engages with human subjects, and researchers have the ethical responsibility to minimize potential harm.
The variety in data sources, research topics, and methodological approaches in big data belies a one-size-fits-all checklist; as a result, these rules are less specific than some might hope. Rather, we exhort researchers to recognize the human participants and complex systems contained within their data and make grappling with ethical questions part of their standard workflow. Towards this end, we structure the first five rules around how to reduce the chance of harm resulting from big data research practices; the second five rules focus on ways researchers can contribute to building best practices that fit their disciplinary and methodological approaches. At the core of these rules, we challenge big data researchers who consider their data disentangled from the ability to harm to reexamine their assumptions. The examples in this paper show how often even seemingly innocuous and anonymized data have produced unanticipated ethical questions and detrimental impacts.
This paper is a result of a two-year National Science Foundation (NSF)-funded project that established the Council for Big Data, Ethics, and Society, a group of 20 scholars from a wide range of social, natural, and computational sciences (http://bdes.datasociety.net/). The Council was charged with providing guidance to the NSF on how to best encourage ethical practices in scientific and engineering research, utilizing big data research methods and infrastructures.
Backchannel | 03.27.17
D&S founder danah boyd discusses the problem with asking companies like Facebook and Google to ‘solve’ fake news, arguing that this solutionism misses the context of the complex social problems behind it.
Although a lot of the emphasis in the “fake news” discussion focuses on content that is widely spread and downright insane, much of the most insidious content out there isn’t in your face. It’s not spread widely, and certainly not by people who are forwarding it to object. It’s subtle content that is factually accurate, biased in presentation and framing, and encouraging folks to make dangerous conclusions that are not explicitly spelled out in the content itself.
Washington University Law Review | 03.11.17
D&S researcher Mary Madden, Michele Gilman, D&S affiliate Karen Levy, and D&S fellow Alice Marwick examine how poor Americans are impacted by privacy violations and discuss how to protect digital privacy for the vulnerable. Abstract is as follows:
This Article examines the matrix of vulnerabilities that low-income people face as a result of the collection and aggregation of big data and the application of predictive analytics. On the one hand, big data systems could reverse growing economic inequality by expanding access to opportunities for low-income people. On the other hand, big data could widen economic gaps by making it possible to prey on low-income people or to exclude them from opportunities due to biases that get entrenched in algorithmic decision-making tools. New kinds of “networked privacy” harms, in which users are simultaneously held liable for their own behavior and the actions of those in their networks, may have particularly negative impacts on the poor. This Article reports on original empirical findings from a large, nationally-representative telephone survey with an oversample of low-income American adults and highlights how these patterns make particular groups of low-status internet users uniquely vulnerable to various forms of surveillance and networked privacy-related problems. In particular, a greater reliance on mobile connectivity, combined with lower usage of privacy-enhancing strategies may contribute to various privacy and security-related harms. The article then discusses three scenarios in which big data – including data gathered from social media inputs – is being aggregated to make predictions about individual behavior: employment screening, access to higher education, and predictive policing. Analysis of the legal frameworks surrounding these case studies reveals a lack of legal protections to counter digital discrimination against low-income people. In light of these legal gaps, the Article assesses leading proposals for enhancing digital privacy through the lens of class vulnerability, including comprehensive consumer privacy legislation, digital literacy, notice and choice regimes, and due process approaches. 
As policymakers consider reforms, the Article urges greater attention to impacts on low-income persons and communities.
D&S advisor Nick Grossman discusses trust in the sharing economy.
So, fast forward to our refund situation: now I no longer feel like I have any moral high ground to demand a formal close out — in my mind, I was complicit in the shadiness when I was cool with fooling the apartment building. How is that any different than agreeing to sidestep the Airbnb platform rules?
D&S advisor Anil Dash discusses how interviews can exclude people from the tech industry.
When we mimic patterns from tech culture without knowing why we do them, we often take good ideas and turn them into terrible barriers.
Columbia Law Review | 03.07.17
Ryan Calo and D&S researcher Alex Rosenblat write this analysis of the newly termed ‘taking economy’ of Uber.
Sharing economy firms such as Uber and Airbnb facilitate trusted transactions between strangers on digital platforms. This creates economic and other value and raises a set of concerns around racial bias, safety, and fairness to competitors and workers that legal scholarship has begun to address. Missing from the literature, however, is a fundamental critique of the sharing economy grounded in asymmetries of information and power. This Article, coauthored by a law professor and a technology ethnographer who studies the ride-hailing community, furnishes such a critique and indicates a path toward a meaningful response.
Commercial firms have long used what they know about consumers to shape their behavior and maximize profits. By virtue of sitting between consumers and providers of services, however, sharing economy firms have a unique capacity to monitor and nudge all participants—including people whose livelihood may depend on the platform. Much activity is hidden away from view, but preliminary evidence suggests that sharing economy firms may already be leveraging their access to information about users and their control over the user experience to mislead, coerce, or otherwise disadvantage sharing economy participants.
This Article argues that consumer protection law, with its longtime emphasis on asymmetries of information and power, is relatively well positioned to address this under-examined aspect of the sharing economy. But the regulatory response to date seems outdated and superficial. To be effective, legal interventions must (1) reflect a deeper understanding of the acts and practices of digital platforms and (2) interrupt the incentives of sharing economy firms to abuse their position.
D&S advisor Nick Grossman discusses constructive approaches for tech companies to support important issues.
As the White House continues to issue executive orders on issues like immigration that hit tech companies directly, and as issues like transgender rights — that are outside the pocketbook interests but may intersect with a company or community’s values — come up, it feels as though companies are going to continue to be under pressure to take public stands.
D&S advisor Anil Dash discusses Fake Markets that are dominated by few tech companies.
Worse, we’ve lost the ability to discern that a short-term benefit for some users that’s subsidized by an unsustainable investment model will lead to terrible long-term consequences for society. We’re hooked on the temporary infusion of venture capital dollars into vulnerable markets that we know are about to be remade by technological transformation and automation. The only social force empowered to anticipate or prevent these disruptions are policymakers who are often too illiterate to understand how these technologies work, and who too desperately want the halo of appearing to be associated with “high tech”, the secular religion of America.
report | 03.01.17
Slate | 02.28.17
D&S lawyer-in-residence Rebecca Wexler analyzes the unreliability of video authentication in Slate.
When forensic scientists refuse to reveal details about how their experimental methods work, they erode trust in the ideal of scientific objectivity, and in the legitimacy of their results. There is already a dearth of trust surrounding forensic sciences. Just last fall, President Obama’s Council of Advisors on Science and Technology reported that even some long-practiced forensic disciplines, like bite-mark analysis and some methods for analyzing complex mixtures of DNA, are not foundationally valid.
paper | 02.21.17
D&S lawyer-in-residence Rebecca Wexler provides an analysis on trade secrecy in the criminal justice system. Abstract is below:
From policing to evidence to parole, data-driven algorithmic systems and other automated software programs are being adopted throughout the criminal justice system. The developers of these technologies often claim that the details about how the programs work are trade secrets and, as a result, cannot be disclosed in criminal cases. This Article turns to evidence law to examine the conflict between transparency and trade secrecy in the criminal justice system. It is the first comprehensive account of trade secret evidence in criminal cases. I argue that recognizing a trade secrets evidentiary privilege in criminal proceedings is harmful, ahistorical, and unnecessary. Withholding information from the accused because it is a trade secret mischaracterizes due process as a business competition.
D&S researcher Alex Rosenblat was interviewed by Radio NZ about Uber and the promises it makes its drivers, i.e. flexible hours and freedom.
D&S artist-in-residence Ingrid Burrington explores the importance of domain names at NamesCon, an annual conference for the domain-names industry.
In addition to being crucial to making the web work, domain names are also a highly political pocket of the web, particularly shaped by the legacy of colonialism. Most of the underlying protocols that make the internet work—including DNS—are encoded in ASCII, which translates bits into letterforms, numbers, and punctuation marks. But ASCII’s letterforms only represent the Latin alphabet, limiting expression in domain names to Western languages (while arguing that a character encoding is an instrument of imperialism sounds bold, so does assuming that “text” is synonymous only with “English”).
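The workaround the internet eventually adopted, internationalized domain names, underscores Burrington’s point: non-Latin labels cannot travel through DNS directly, so they are transliterated into an ASCII “xn--” Punycode form first. A quick sketch using Python’s built-in IDNA codec:

```python
# Non-ASCII domain labels must be re-encoded into an ASCII-compatible
# "xn--..." Punycode form before DNS can carry them; Latin-only labels
# pass through unchanged.
def to_dns_label(label: str) -> str:
    return label.encode("idna").decode("ascii")

print(to_dns_label("bücher"))  # xn--bcher-kva
print(to_dns_label("books"))   # books
```

Latin-script names travel as-is, while everything else must pass through an extra encoding layer — a concrete trace of the asymmetry described above.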
D&S lawyer-in-residence Rebecca Wexler testifies about government oversight of forensic science laboratories in the State of New York.
I submit these comments to the Assembly Standing Committee on Codes; the Assembly Standing Committee on Judiciary and the Assembly Standing Committee on Oversight, Analysis and Investigation. Thank you for inviting my testimony on government oversight of forensic science laboratories in the State of New York. As a Resident at The Data and Society Research Institute, my work focuses on issues arising from data and technology in the criminal justice system. I want to draw your attention to trade secrets claims in forensic technologies that threaten criminal defendants’ rights to confront and cross-examine the evidence against them; to compulsory process to obtain evidence in their favor; and to due process.
The Guardian | 02.02.17
D&S affiliate Keith Hiatt, Michael Kleinman, and D&S researcher Mark Latonero think critically about the use of technology as an all-encompassing solution in human rights spaces.
It’s important to acknowledge that, most of the time, the underlying problem human rights organisations are trying to solve isn’t technical. It’s often a bureaucratic, institutional, process or workflow problem, and technology won’t solve it (and might exacerbate it).
D&S researcher Monica Bulger, with Patrick McCormick and D&S research analyst Mikaela Pitcan, writes this working paper detailing the “Legacy of inBloom”.
Although inBloom closed in 2014, it ignited a public discussion of student data privacy that resulted in the introduction of over 400 pieces of state-level legislation. The fervor over inBloom showed that policies and procedures were not yet where they needed to be for schools to engage in data-informed instruction. Industry members responded with a student data privacy pledge that detailed responsible practice. A strengthened awareness of the need for transparent data practices among nearly all of the involved actors is one of inBloom’s most obvious legacies.
Instead of a large-scale, open source platform that was a multi-state collaboration, the trend in data-driven educational technologies since inBloom’s closure has been toward closed, proprietary systems, adopted piecemeal. To date, no large-scale educational technology initiative has succeeded in American K-12 schools. This study explores several factors that contributed to the demise of inBloom and a number of important questions: What were the values and plans that drove inBloom to be designed the way it was? What were the concerns and movements that caused inBloom to run into resistance? How has the entire inBloom development impacted the future of edtech and student data?
D&S advisor John Palfrey speaks to his students about the President’s executive orders on immigration.
There has been much talk of universities and schools committing to be “sanctuaries” for students. There is merit in this idea but there is also a lot of debate as to what it means, in a legal sense. I would simplify how I see it: I aspire for our school to be a home for our students–a home away from home to be sure–one where our youth from every quarter and from every religion know that they will have every protection we can manage, just as we would offer our own children at home.
D&S affiliate Mimi Onuoha details the process of completely deleting data.
This overwriting process is a bit like painting a wall: If you start with a white wall and paint it red, there’s no way to erase the red. If you want the red gone or the wall returned to how it was, you either destroy the wall or you paint it over, several times, so that it’s white again.
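The repainting Onuoha describes can be sketched in a few lines. This is only an illustration of the idea, not a secure-deletion tool: on SSDs and journaling filesystems, the operating system may quietly relocate blocks, so copies of the old data can survive the repainting.

```python
import os

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Paint over a file's bytes with random data before unlinking it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # each pass is a fresh coat of paint
            f.flush()
            os.fsync(f.fileno())       # push this pass out to the disk
    os.remove(path)
```

Simply deleting a file, by contrast, usually removes only the filesystem's pointer to the data — the "paint" underneath stays on the wall until something else happens to write over it.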
D&S founder danah boyd responds to remarks by the Trump administration stating that their opposition is the media.
And now many of the actors most set on undermining institutionalized information intermediaries are in the most powerful office in the land. They are waging war on the media and the media doesn’t know what to do other than to report on it.
Nick Grossman ponders the importance of and vision for the public data layer.
If we do this right, we can get smarter at policymaking, and design regulatory systems that have both greater effectiveness and lower costs of implementation and compliance.
D&S fellow Zara Rahman writes about how immigrant families use social media and digital technologies.
The consequence is that the home of our deeply personal information has gone from treasured letters stored in a box at our houses, to servers owned by corporate companies that we’ll never see. Those personal notes, the ways of showing our family that we’re happy and content in our new lives, despite what we’ve lost — they live online now. The more you share with that corporation, the stronger those family ties get. There is a third party in these relationships.
points | 01.25.17
D&S researcher Claire Fontaine’s debut on Points, “Doing Screen Time,” resourcefully unwinds the apparent contradiction between anxiety around screen time at home and support for screen time at school: “Each produces and enables the other.” Looking into this dynamic is an occasion for asking, collectively, how we want to live.
D&S advisor Ethan Zuckerman writes about fake news and the bigger problem behind fake news.
The truly disturbing truth is that fake news isn’t the cause of our contemporary political dysfunction. More troublingly, we live in a world where people disagree deeply and fundamentally about how to understand it, even when we share the same set of facts. Solving the problems of fake news makes that world slightly easier to navigate, but it doesn’t scratch the surface of the deeper problems of finding common ground with people with whom we disagree.
D&S affiliate Kate Crawford writes a letter to Silicon Valley on how to resist Trump.
You, the software engineers and leaders of technology companies, face an enormous responsibility. You know better than anyone how best to protect the millions who have entrusted you with their data, and your knowledge gives you real power as civic actors. If you want to transform the world for the better, here is your moment.
D&S affiliate Kate Crawford and Hito Steyerl converse about “NSA bros, dataveillance, apex predators, AI, and empathy machines.”
D&S artist-in-residence Ingrid Burrington discusses associating weaponry, like drones, with art.
The impulse to pair a technology associated with automated extralegal killing of American citizens alongside “culture and the arts” is weird, but not entirely surprising—the vantage point of drones affords a particular aesthetic in addition to plausible deniability. The aerial perspective has appealed to artists for as long as it has appealed to generals and kings. That distant, presumed-objective view from nowhere, whether achieved via hot air balloon or low-orbit satellite, suggests a totality, a kind of coherence in defiance of the often-incoherent groundtruth of everyday life. For generals, coherence offers the possibility of tactical advantage. For artists (or at least good artists), it’s something to interrogate and take apart.
D&S advisor Ethan Zuckerman provides a transcript on his recent speech about journalism and civics.
One final thing: we have this tendency in journalism right now to feel very sorry for ourselves. This is a field that we are all enormously proud to be part of. This is a field that is harder and harder to make a living in, and I see more and more news organizations essentially saying, “You’re going to miss us. We’re going away. I just want to warn you.”
report | 01.18.17
12% of U.S. internet users who have been in romantic relationships have experienced intimate partner digital abuse…
Digital tools are often an integral part of healthy romantic relationships. Romantic partners frequently use digital tools to connect with each other through text messages, photo-sharing, social media posts, and other online activities. These same digital tools can be used in unhealthy ways, facilitating negative behaviors such as monitoring, unwanted picture sharing, and abusive messages — both within the romantic relationship and after the relationship is over. Better understanding how often intimate partner digital abuse is happening, to whom, and in what ways is a critical piece of understanding the scope of the problem.
This report, part of a series of research reports on digital harassment and abuse, examines the prevalence and impact of intimate partner digital abuse. Findings are based upon the results of a nationally representative survey of 3,002 Americans 15 years of age and older conducted from May 17th through July 31st, 2016. Respondents were surveyed on either their landline or cell phone. Interviews were conducted in either English or Spanish. Findings in this report refer to the 2,810 respondents who have ever been in a romantic relationship.
D&S artist-in-residence Heather Dewey-Hagborg partnered with Chelsea Manning and Shoili Kanungo to create Suppressed Images, an illustrated series about Dewey-Hagborg and Manning’s collaboration.
blog post | 01.17.17
D&S advisor Susan Crawford writes about communication infrastructure in rural parts of America.
This year, I’ll be traveling the US talking to people in scrappy communities who are building fiber on their own. They’re fed up with waiting for enormous incumbent communications companies to decide it’s in their corporate interests to invest in 21st-century communications capacity for Americans. These communities have run the numbers and looked at their economic development needs — as well as the possibilities for advanced healthcare, world-class educations, effective governance, energy management, and public safety that publicly controlled wholesale “street grids” of fiber make real — and they’ve come to the conclusion that if they hang back, they’ll become irrelevant.
New Media & Society | 01.16.17
D&S researcher Monica Bulger, with Patrick Burton, Brian O’Neill, and Elisabeth Staksrud, writes “Where policy and practice collide: Comparing United States, South African and European Union approaches to protecting children online”.
That children have a right to protection when they go online is an internationally well-established principle, upheld in laws that seek to safeguard children from online abuse and exploitation. However, children’s own transgressive behaviour can test the boundaries of this protection regime, creating new dilemmas for lawmakers the world over. This article examines the policy response from both the Global North and South to young people’s online behaviour that may challenge adult conceptions of what is acceptable, within existing legal and policy frameworks. It asks whether the ‘childhood innocence’ implied in much protection discourse is a helpful basis for promoting children’s rights in the digital age. Based on a comparative analysis of the emerging policy trends in Europe, South Africa and the United States, the article assesses the implications for policymakers and child welfare specialists as they attempt to balance children’s online safety with support for their agency as digital citizens.
D&S advisor Anil Dash presents the newest episode of On Being.
Back in November, I got to sit down with the amazing Krista Tippett for a lengthy interview in front of an incredibly warm crowd in Easton, MD. Now, that interview has been edited down and is available as the latest episode of Krista’s hugely popular show, On Being.
Quartz | 01.15.17
D&S affiliate Gideon Lichfield writes a short story called ‘Democracy 3.0’.
The Guardian | 01.13.17
D&S founder danah boyd’s Points piece was re-published for The Guardian. boyd looks back at the unraveling of two historical institutions through which social, racial, and class-based diversification of social networks was achieved — the US military and higher education — and asks how trends towards content personalization on social media continue to fragment Americans along ideological lines.
Ford Foundation blog | 01.12.17
D&S advisor Claudia Perlich discusses modeling, transparency, and machine learning in a new episode of the Partially Derivative podcast.
“One pitfall I see is that it’s easy from a social science perspective to condemn all data science as evil…but that ultimately doesn’t help advance the situation.”
D&S fellow Zara Rahman details her thoughts on owning one’s success and hard work.
It’s unrealistic and unfair to ignore all that work – to myself, and others. Citing luck and serendipity gives the impression that people in positions of influence will somehow magically find out about you and your interests and reach out to you – they (probably) won’t. It implies that if you’re doing this right, opportunities to work on things you want to be working on will just pop up out of the blue.
D&S advisor Ethan Zuckerman reflects on today’s political atmosphere and FDR’s speech on the four freedoms.
This is a scary moment, a time where it looks like the progress we’ve made around the world might reverse, where we go from a world that’s gotten much bigger to one that shrinks. The good news is that we get to decide how big a world we want to live in. We get to decide how to speak, how to listen and how to stand together against fear.
D&S founder danah boyd begins to sketch out how hacking culture evolved from playful efforts to game the media ecosystem to more complex and politicized projects of social engineering, propaganda, and activism in “Hacking the attention economy”.
In “How do you deal with a problem like ‘fake news?’” D&S researcher Robyn Caplan weighs in on the potential — and pitfalls — of efforts to curb Facebook’s fake news problem.
In “Why America is Self-Segregating,” D&S founder danah boyd looks back at the unraveling of two historical institutions through which social, racial, and class-based diversification of social networks was achieved — the US military and higher education — and asks how trends towards content personalization on social media continue to fragment Americans along ideological lines.
In “Are There Limits to Online Free Speech?” D&S fellow Alice Marwick argues against simplistic binaries pitting free speech against censorship, looking at how the tech industry’s historic commitment to freedom of speech falls short in the face of organized harassment.
In “Did Media Literacy Backfire?” D&S founder danah boyd argues that the thorny problems of fake news and the spread of conspiracy theories have, in part, origins in efforts to educate people against misinformation. At the heart of the problem are deeper cultural divides that we must learn how to confront.
In “What’s Propaganda Got To Do With It?” Caroline Jack notes a resurgence in the popularity of “propaganda” as a judgment-laden label for a vast array of media ranging from fringe conspiracy theories to establishment news institutions. What work is this concept doing in efforts to conceptually navigate the contemporary media environment?
D&S advisor Baratunde Thurston discusses EFF’s work to encrypt the web and switch every site to https://.
D&S fellow Zara Rahman details the books she has read and enjoyed in 2016.
Dosavannah.com | 01.03.17
D&S artist-in-residence Heather Dewey-Hagborg’s work was profiled in this piece by Do Savannah.
In a related piece, artist Heather Dewey-Hagborg has created facial portraits based on the DNA profile of leftover genetic material found in discarded trash, like chewing gum and cigarette butts. The results are both fascinating and deeply creepy.
D&S advisor Charlton D. McIlwain writes an op-ed about the new white political movement.
We must see the new white narrative for what it really is, an attempt to refocus public attention and political capital away from people of color. Trump and many of those he’s chosen to lead his administration are the new white’s principal ambassadors. They take stock of the last few years as blacks fought against police brutality, Muslims battled religious persecution, and Hispanics defended themselves and their families from mass deportations. These representatives of the new white respond: Your concerns don’t matter as much as working-class folk (white people) for whom America’s promise was designed but has been denied.
Globalvoices.org | 12.19.16
D&S fellow Zara Rahman donates part of her Shuttleworth Flash Grant to the Human Rights Data Analysis Group and Global Voices.
So here’s the thing: I don’t think we need only innovative ideas or world-changing projects. We also need trust, communities, and skills. We need to strengthen and support existing infrastructure and communities. I worry that we’ve become far too fixated upon quickly implemented innovation and disruption, and that we’re taking a lot of important things for granted—things we rely upon that, unlike “innovative ideas”, take a lot of time and effort to build.
Sage Journals | 12.13.16
D&S affiliate Kate Crawford co-wrote, with Mike Ananny, this piece discussing transparency and algorithmic accountability.
Models for understanding and holding systems accountable have long rested upon ideals and logics of transparency. Being able to see a system is sometimes equated with being able to know how it works and govern it—a pattern that recurs in recent work about transparency and computational systems. But can “black boxes” ever be opened, and if so, would that ever be sufficient? In this article, we critically interrogate the ideal of transparency, trace some of its roots in scientific and sociotechnical epistemological cultures, and present 10 limitations to its application. We specifically focus on the inadequacy of transparency for understanding and governing algorithmic systems and sketch an alternative typology of algorithmic accountability grounded in constructive engagements with the limitations of transparency ideals.
Recode | 12.13.16
D&S advisor Anil Dash asserts that business leaders have to stand up against proposed abuses and violations from the Trump administration.
report | 12.13.16
Media coverage of revenge porn largely focuses on instances when celebrities have had private nude or explicit photos or videos made public without their consent, but this experience is not limited to the famous and newsworthy. Roughly 3% of all online Americans have had someone threaten to post nude or nearly nude photos or videos of them online to hurt or embarrass them, and 2% of online Americans have had someone actually post a photo of them online without their permission. Taken together, 4% of internet users—one in 25 online Americans—have either had sensitive images posted without their permission or had someone threaten to post photos of them.
This report, “Nonconsensual Image Sharing” (press release), complements an earlier report covering the prevalence of online harassment and abuse more broadly, as well as a subsequent report on intimate partner digital abuse.
D&S fellow Zara Rahman details the year’s ‘data-driven confusion’ and argues for a responsible data approach to both practice and comprehension.
We must take a responsible data approach to advocacy – address gaps in literacy proactively, be rigorous in our methods, and maintain credibility, especially on important issues. Nowadays, thanks to the speed and amplification of sources afforded to us via technology, analyses and “facts” will spread faster than before. Understanding the critical limitations of data and information is going to become ever more important in years to come.
D&S advisor Ethan Zuckerman discusses his long-time friendship with a Trump supporter.
D&S advisor Hilary Mason presents her process for discovering changes in markets.
points | 12.06.16
D&S advisor Baratunde Thurston details his exploration of The Glass Room exhibit.
I want to see The Glass Room everywhere there is an Apple Store…And anyone founding or working for a tech company should have to prove they’ve gone through this space and understood its meaning.
Quartz | 12.04.16
D&S affiliate Mimi Onuoha profiles the Asian American Performers Action Coalition (AAPAC) of Broadway and off-Broadway and their efforts “to track racial demographic data in the industry.”
paper | 11.30.16
D&S researchers Mark Latonero and Monica Bulger, with Bronwyn Wex, Emma Day, Kapil Aryal, Mariya Ali, and Keith Hiatt, completed a thorough study on online child sexual exploitation in South Asia.
This study identified an assumption that a technical fix must exist for problems identified as ‘online’. In the case of online child sexual exploitation, these assumptions are true, but limited. INTERPOL and the International Centre for Missing & Exploited Children (ICMEC) lead efforts to identify and take down CSAM images globally, a technological fix. Yet this study finds that, alongside the international response, there is also a need for a local response to attend to the victims and perpetrators. Local response to online child sexual exploitation relies on the strength of the existing child protection system, locating treatment of abuse incidents, regardless of where they occur, within an existing framework. It also recognizes that a single child may be the victim of multiple forms of abuse and may seek treatment from the same facilities.
D&S founder danah boyd’s prepared remarks for a public roundtable in the European Parliament on algorithmic accountability and transparency in the digital economy were adapted in this Points piece.
I believe that algorithmic transparency creates false hope. Not only is it technically untenable, but it obfuscates the real politics that are at stake.
D&S advisor Ethan Zuckerman wrote an op-ed detailing the issue of normalizing the abnormal.
My deep fear is that there’s no single set of Hallin’s spheres anymore. What’s consensus to a Trump supporter may be deviant to a Clinton supporter and vice versa. We now face an online media landscape so diverse and fragmented that each of us finds big enough spheres of legitimate controversy that we think we’re seeing a real debate at work.
Wired | 11.27.16
D&S affiliate Angèle Christin was quoted in this Wired piece discussing the pervasiveness of election news online.
In that process, such conversations start to invade “social areas that are usually sheltered from heated political discussions,” says Angèle Christin, professor of communication at Stanford. She says social media, including seemingly anodyne environments like a parenting forum, actually accentuate the problem because they blend the private and public.
Quartz | 11.24.16
D&S fellow Alice E. Marwick wrote this op-ed discussing how online harassment disproportionately impacts women and minorities.
In a divisive time for American society, it’s crucial that everyone is heard. Social media companies need to take a stand and ensure that destructive online behavior doesn’t turn people away from sharing their voices.