In Content or Context Moderation?, Robyn Caplan illustrates the organizational contexts of three content moderation strategies, drawing on interviews with 10 major digital platforms.
After the Cambridge Analytica scandal, can internet data be used ethically for research? Data & Society Postdoctoral Scholar Kadija Ferryman and Elaine O. Nsoesie, PhD from the Institute for Health Metrics and Evaluation recommend “proceeding with caution” when it comes to internet data and precision medicine.
“Despite the public attention and backlash stemming from the Cambridge Analytica scandal — which began with an academic inquiry and resulted in at least 87 million Facebook profiles being disclosed — researchers argue that Facebook and other social media data can be used to advance knowledge, as long as these data are accessed and used in a responsible way. We argue that data from internet-based applications can be a relevant resource for precision medicine studies, provided that these data are accessed and used with care and caution.”
For Data & Society Points, Visiting Scholar Anne Washington breaks down the numbers behind Facebook and Cambridge Analytica.
“How did the choices made by only 270,000 Facebook users affect millions of people? How is it possible that the estimate of those affected changed from 50 million to 87 million so quickly? As a professor of data policy, I am interested in how information flows within organizations. In the case of Facebook and Cambridge Analytica, I was curious why this number was so inexact.”
MIT Technology Review | 04.09.18
In the wake of Cambridge Analytica, Data & Society Researcher Jacob Metcalf argues that the real risk is the behavioral models that have been developed from Facebook users’ data.
“But focusing solely on the purloined data is a mistake. Much more important are the behavioral models Cambridge Analytica built from the data. Even though the company claims to have deleted the data sets in 2015 in response to Facebook’s demands, those models live on, and can still be used to target highly specific groups of voters with messages designed to leverage their psychological traits. Although the stolen data sets represent a massive collection of individual privacy harms, the models are a collective harm, and far more pernicious.”
Journal of Computer-Mediated Communication | 04.06.18
Big Data & Society | 02.14.18
How do algorithms & data-driven tech induce similarity across an industry? Data & Society Researcher Robyn Caplan and Founder & President danah boyd trace Facebook’s impact on news media organizations and journalists.
“This type of analysis sheds light on how organizational contexts are embedded into algorithms, which can then become embedded within other organizational and individual practices. By investigating technical practices as organizational and bureaucratic, discussions about accountability and decision-making can be reframed.”
Social Media + Society | 02.01.18
Data & Society Media Manipulation Lead Joan Donovan investigates the development of InterOccupy, a virtual organization operated by participants in the Occupy Movement.
“InterOccupy took infrastructure building as a political strategy to ensure the movement endured beyond the police raids on the encampments. I conclude that NSMs create virtual organizations when there are routine and insurmountable failures in the communication milieu, where the future of the movement is at stake. My research follows the Occupy Movement ethnographically to understand what happens after the keyword.”
Is Facebook a platform or a media company? NBC News THINK asks D&S researcher Robyn Caplan to comment on the recent tech hearings.
“Facebook thinks of itself as a neutral platform where everyone can come and share ideas…They’re basically saying that they’re the neutral public sphere. That they are the marketplace of ideas, instead of being the marketers of ideas.”
Mic | 08.16.17
D&S Media Manipulation Research Lead Joan Donovan talks about the role of large tech companies in curbing extremist activity online.
Joan Donovan, a media manipulation research lead at the research institute Data & Society, said it’s well within these companies’ reach to implement changes that will curb white supremacist activity. And it’s something she said major platforms like Facebook and Twitter will have to confront as they acknowledge their role in magnifying hate speech and those who spout it.
‘Richard Spencer might have a megaphone and his own website to communicate his messages of hate,’ Donovan said in a phone interview Wednesday. ‘Now these platforms are realizing they are the megaphone. They are the conduit between him and larger audiences.’
Movements like the so-called ‘alt-right’ aren’t just built on charisma, Donovan added — they’re built on infrastructure. The internet and all of its possibilities has now become a major part of that infrastructure.
Quartz | 08.14.17
Quartz cites D&S Postdoctoral Scholar Caroline Jack in its guide to the Lexicon of Lies report:
Problematic information comes in various forms, each uniquely irksome. Yet people are quick to blast all inaccuracies as “fake news,” reinforcing the sense that facts are a thing of the past.
That’s dangerous and it needn’t be the case, according to the Lexicon of Lies, a recent report from the New York-based Data and Society research institute. “The words we choose to describe media manipulation can lead to assumptions about how information spreads, who spreads it, and who receives it,” writes Caroline Jack, a media historian and postdoctoral fellow at Data and Society. On a cultural level, “these assumptions can shape what kinds of interventions or solutions seem desirable, appropriate, or even possible,” she writes.
D&S Researcher Becca Lewis discusses the recruiting methodologies of the Alt-Right in Teaching Tolerance
‘Social media can be very powerful in shaping outlooks, but it doesn’t operate in a vacuum,’ explains Data & Society researcher Becca Lewis. ‘The shaping is coming from the other people using the platforms.’
The alt-right has a massive presence on social media and other channels where young people congregate. A Washington Post analysis identified 27,000 influential Twitter accounts associated with the alt-right, 13 percent of which are considered radical. Later, a George Washington University study found that white nationalist accounts in the United States have seen their follower counts grow by 600 percent since 2012.
D&S founder danah boyd discusses machine learning algorithms and prejudice, digital white flight on social media, trust in the media, and more on The Ezra Klein Show.
“Technology is made by people in a society, and it has a tendency to mirror and magnify the issues that affect everyday life.”
D&S researcher Alex Rosenblat explains how and why Uber & Lyft drivers surveil their passengers during rides.
Passenger shaming is partly a consequence of the Uber/Lyft business model. Drivers can’t get reliable accountability from their employers or passengers, so they turn to tools like dash-cams. These are part of the externalized costs of the lean gig economy employment model.
The Trace | 05.03.17
First Monday | 05.01.17
Writing for First Monday, Philip Napoli and D&S researcher Robyn Caplan examine why companies like Google and Facebook insist that they are merely tech companies with no media impact, and why they are wrong. The abstract is below:
A common position amongst social media platforms and online content aggregators is their resistance to being characterized as media companies. Rather, companies such as Google, Facebook, and Twitter have regularly insisted that they should be thought of purely as technology companies. This paper critiques the position that these platforms are technology companies rather than media companies, explores the underlying rationales, and considers the political, legal, and policy implications associated with accepting or rejecting this position. As this paper illustrates, this is no mere semantic distinction, given the history of the precise classification of communications technologies and services having profound ramifications for how these technologies and services are considered by policy-makers and the courts.
American Press Institute | 04.12.17
D&S researcher Mary Madden was interviewed by the American Press Institute about her recent Knight Foundation-supported report, “How Youth Navigate the News Landscape.”
However, one of my favorite quotes was from a participant who described a future where news would be delivered by hologram: “I think like it’s going to be little holograms. You’re going to open this thing and a little guy’s going to come out and tell you about stuff.”
Given that some participants said they already found notifications annoying, I’m not sure how successful the little hologram guy would be, but it was clear that the participants fully expected that the news industry would continue to evolve and innovate in creative ways moving forward.
Backchannel | 03.27.17
D&S researcher danah boyd discusses the problem with asking companies like Facebook and Google to ‘solve’ fake news; boyd insists that the context of complex social problems is missing from this solutionism.
Although a lot of the emphasis in the “fake news” discussion focuses on content that is widely spread and downright insane, much of the most insidious content out there isn’t in your face. It’s not spread widely, and certainly not by people who are forwarding it to object. It’s subtle content that is factually accurate, biased in presentation and framing, and encouraging folks to make dangerous conclusions that are not explicitly spelled out in the content itself.
Washington University Law Review | 03.11.17
D&S researcher Mary Madden, Michele Gilman, D&S affiliate Karen Levy, and D&S fellow Alice Marwick examine how poor Americans are impacted by privacy violations and discuss how to protect digital privacy for the vulnerable. Abstract is as follows:
This Article examines the matrix of vulnerabilities that low-income people face as a result of the collection and aggregation of big data and the application of predictive analytics. On the one hand, big data systems could reverse growing economic inequality by expanding access to opportunities for low-income people. On the other hand, big data could widen economic gaps by making it possible to prey on low-income people or to exclude them from opportunities due to biases that get entrenched in algorithmic decision-making tools. New kinds of “networked privacy” harms, in which users are simultaneously held liable for their own behavior and the actions of those in their networks, may have particularly negative impacts on the poor. This Article reports on original empirical findings from a large, nationally-representative telephone survey with an oversample of low-income American adults and highlights how these patterns make particular groups of low-status internet users uniquely vulnerable to various forms of surveillance and networked privacy-related problems. In particular, a greater reliance on mobile connectivity, combined with lower usage of privacy-enhancing strategies may contribute to various privacy and security-related harms. The article then discusses three scenarios in which big data – including data gathered from social media inputs – is being aggregated to make predictions about individual behavior: employment screening, access to higher education, and predictive policing. Analysis of the legal frameworks surrounding these case studies reveals a lack of legal protections to counter digital discrimination against low-income people. In light of these legal gaps, the Article assesses leading proposals for enhancing digital privacy through the lens of class vulnerability, including comprehensive consumer privacy legislation, digital literacy, notice and choice regimes, and due process approaches. 
As policymakers consider reforms, the article urges greater attention to impacts on low-income persons and communities.
report | 03.01.17
D&S advisor Ethan Zuckerman writes about fake news and the bigger problem behind fake news.
The truly disturbing truth is that fake news isn’t the cause of our contemporary political dysfunction. More troublingly, we live in a world where people disagree deeply and fundamentally about how to understand it, even when we share the same set of facts. Solving the problems of fake news make that world slightly easier to navigate, but they don’t scratch the surface of the deeper problems of finding common ground with people with whom we disagree.
points | 12.06.16
D&S advisor Baratunde Thurston details his exploration of The Glass Room exhibit.
I want to see The Glass Room everywhere there is an Apple Store…And anyone founding or working for a tech company should have to prove they’ve gone through this space and understood its meaning.
Wired | 11.27.16
D&S affiliate Angèle Christin was quoted in this Wired piece discussing the pervasiveness of election news online.
In that process, such conversations start to invade “social areas that are usually sheltered from heated political discussions,” says Angèle Christin, professor of communication at Stanford. She says social media, including seemingly anodyne environments like a parenting forum, actually accentuate the problem because they blend the private and public.
Quartz | 11.24.16
D&S fellow Alice E. Marwick wrote this op-ed discussing how online harassment disproportionately impacts women and minorities.
In a divisive time for American society, it’s crucial that everyone is heard. Social media companies need to take a stand and ensure that destructive online behavior doesn’t turn people away from sharing their voices.
D&S founder danah boyd writes her response to the recent “Online Harassment, Digital Abuse, and Cyberstalking in America” report.
27% of all American internet users self-censor what they post online out of fear of online harassment. Young people are especially prone to self-censorship. This is deeply disturbing. We often worry about free speech online, but we don’t consider the social factors that prompt people to censor themselves or the ways in which this impacts some groups more than others.
D&S advisor Christina Xu writes about fake news and conspiracy theories in China.
Here in China, even well-educated and progressive friends have sincerely asked me about some pretty niche conspiracies. Did Hillary really assassinate someone? (No.) Didn’t Trump win 90% of the vote? (No.) Yesterday, someone even mentioned that they really liked a poem he wrote about his vision for America’s future. (What.)
The New York Times | 11.05.16
D&S founder danah boyd was recently interviewed for 52 Insights.
As she fires out her progressive opinions at rapid speed, what becomes immediately apparent is just how immensely passionate she is about her work. Her research revolves around the world of new technologies, social media and today’s youth and how they all fit together in our society. She is also a Principal Researcher at Microsoft and founder of the Data & Society Research Institute. As we discover, she is a fervent defender of young people and admonishing of her own generation’s eagerness to place blame on them. At the end of the day, her work is very much about the notion of equality and how we can create it with these many new tools we have.
We believe danah boyd has some very important things to say, and with over 100,000 followers on Twitter, her voice is already being heard.
video | 10.20.16
D&S researcher Amanda Lenhart participates in the Relationships and Privacy in a World of Tinder and Twitter panel at Kids Online.
paper | 10.11.16
D&S affiliate Desmond Patton, with Terra Blevins, Robert Kwiatkowski, Jamie C. Macbeth, Kathleen Mckeown, and Owen Rambow, wrote this paper exploring a body of texts from a female gang member and examining patterns of speech that indicate an aggression trigger.
Violence is a serious problem for cities like Chicago and has been exacerbated by the use of social media by gang-involved youths for taunting rival gangs. We present a corpus of tweets from a young and powerful female gang member and her communicators, which we have annotated with discourse intention, using a deep read to understand how and what triggered conversations to escalate into aggression. We use this corpus to develop a part-of-speech tagger and phrase table for the variant of English that is used, as well as a classifier for identifying tweets that express grieving and aggression.
ProPublica | 09.28.16
Julia Angwin, Terry Parris Jr., and D&S affiliate Surya Mattu explore what Facebook knows about its users.
We built a tool that works with the Chrome Web browser that lets you see what Facebook says it knows about you — you can rate the data for accuracy and you can send it to us, if you like. We will, of course, protect your privacy. We won’t collect any identifying details about you. And we won’t share your personal data with anyone.
D&S founder danah boyd critiques traditional media’s response to the September 17th bombings in NYC.
Traditional news media has a lot of say in what it publishes. This is one of the major things that distinguishes it from social media, which propagates the fears and anxieties of the public. And yet, time and time again, news media shows itself to be irresponsible, motivated more by the attention and money that it can obtain by stoking people’s fears than by a moral responsibility to help ground an anxious public.
paper | 03.18.16
D&S researcher Robyn Caplan co-wrote a paper analyzing how many large, well-known companies, such as BuzzFeed and Facebook, argue against being categorized as media companies. Caplan and co-author Philip M. Napoli assert that this argument misclassifies these companies, with profound policy implications.
A common position amongst online content providers/aggregators is their resistance to being characterized as media companies. Companies such as Google, Facebook, BuzzFeed, and Twitter have argued that it is inaccurate to think of them as media companies. Rather, they argue that they should be thought of as technology companies. The logic of this position, and its implications for communications policy, have yet to be thoroughly analyzed. However, such an analysis is increasingly necessary as the dynamics of news and information production, dissemination, and consumption continue to evolve. This paper will explore and critique the logic and motivations behind the position that these content providers/aggregators are technology companies rather than media companies, as well as the communications policy implications associated with accepting or rejecting this position.
The New York Times | 08.09.16
D&S fellow Natasha Singer co-wrote a piece discussing Facebook’s new partnership with Summit Public Schools to ‘introduce a free student-directed learning system’.
The Facebook-Summit partnership, by contrast, is more of a ground-up effort to create a national demand for student-driven learning in schools. Facebook announced its support for the system last September; the company declined to comment on how much it is spending on it. Early this month, Summit and Facebook opened the platform up to individual teachers who have not participated in Summit’s extensive on-site training program.
D&S Researcher Robyn Caplan considers whether Facebook is saving journalism or ruining it:
The question of whether Facebook is saving or ruining journalism is not relevant here because, like it or not, Facebook is a media company. That became more apparent recently as human editors became a visible part of Facebook’s news curation process. In truth, this team is only a tiny fraction of a network of actors whose decisions affect the inner workings of Facebook’s platform and the content we see.
primer | 05.13.16
In this background primer, D&S Research Analyst Laura Reed and D&S Founder danah boyd situate the current debate around the role of technology in the public sphere within a historical context. They identify and tease out some of the underlying values, biases, and assumptions present in the current debate surrounding the relationship between media and democracy, and connect them to existing scholarship within media history that is working to understand the organizational, institutional, social, political, and economic factors affecting the flow of news and information. They also identify a set of key questions to keep in mind as the conversation around technology and the public sphere evolves.
Algorithms play an increasingly significant role in shaping the digital news and information landscape, and there is growing concern about the potential negative impact that algorithms might have on public discourse. Examples of algorithmic biases and increasingly curated news feeds call into question the degree to which individuals have equal access to the means of producing, disseminating, and accessing information online. At the same time, these debates about the relationship between media, democracy, and publics are not new, and linking those debates to these emerging conversations about algorithms can help clarify the underlying assumptions and expectations. What do we want algorithms to do in an era of personalization? What does a successful algorithm look like? What form does an ideal public sphere take in the digital age? In asking these and other questions, we seek to highlight what’s at stake in the conversation about algorithms and publics moving forward.
primer | 05.13.16
D&S Research Analyst Laura Reed and D&S Researcher Robyn Caplan put together a set of case studies to complement the contemporary issues primer, Mediation, Automation, and Power, for the Algorithms and Publics project. These case studies explore situations in which algorithmic media is shaping the public sphere across a variety of dimensions, including the changing role of the journalism industry, the use of algorithms for censorship or international compliance, how algorithms are functioning within foreign policy aims, digital gerrymandering, the spread of misinformation, and more.
CultureDigitally.org | 05.09.16
D&S Advisor Tarleton Gillespie responds to Gizmodo’s recent piece alleging bias in Facebook’s Trending Topics list. He argues that information algorithms like the ones used to identify “trends” on Facebook do not and cannot work alone, “in so many ways that we must simply discard the fantasy that they do, or ever will.”
People are in the algorithm because how could they not be? People produce the Facebook activity being measured, people design the algorithms and set their evaluative criteria, people decide what counts as a trend, people name and summarize them, and people look to game the algorithm with their next posts.
Trending algorithms are undeniably becoming part of the cultural landscape, and revelations like Gizmodo’s are helpful steps in helping us shed the easy notions of what they are and how they work, notions the platforms have fostered. Social media platforms must come to fully realize that they are newsmakers and gatekeepers, whether they intend to be or not, whether they want to be or not. And while algorithms can chew on a lot of data, it is still a substantial, significant, and human process to turn that data into claims about importance that get fed back to millions of users. This is not a realization that they will ever reach on their own — which suggests to me that they need the two countervailing forces that journalism has: a structural commitment to the public, imposed if not inherent, and competition to force them to take such obligations seriously.
points | 02.26.16
The processes editors have used to filter information were never transparent, hence the enthusiasm of the early 2000s for unfiltered media. What may be new is the pervasiveness of the gatekeeping that algorithms make possible, the invisibility of that filtering and the difficulty of choosing which filters you want shaping your conversation.
Points/public spheres: “Ben Franklin, the Post Office and the Digital Public Sphere” was originally published at … My Heart’s in Accra. It’s the essay version of Ethan Zuckerman’s opening remarks at the Who Controls the Public Sphere in the Era of Algorithms? workshop, hosted by Data & Society as part of our developing Algorithms and Publics project. Video is available here.
D&S Board Member Anil Dash contrasts two recent approaches to making internet connectivity more widely available. Comparing the efforts to build consensus behind Facebook’s Free Basics initiative to LinkNYC, the recently-launched program to bring free broadband wifi to New York City, Dash views each situation as a compelling example of who gets heard, and when, any time a big institution tries to create a technology infrastructure to serve millions of people.
There’s one key lesson we can take from these two attempts to connect millions of people to the Internet: it’s about building trust. Technology infrastructure can be good or bad, extractive or supportive, a lifeline or a raw deal. Objections to new infrastructure are often dismissed by the people pushing them, but people’s concerns are seldom simply about advertising or being skeptical of corporations. There are often very good reasons to look a gift horse in the mouth.
Whether we believe in the positive potential of getting connected simply boils down to whether we feel the people providing that infrastructure have truly listened to us. The good news is, we have clear examples of how to do exactly that.
“Social media platforms don’t just guide, distort, and facilitate social activity, they also delete some of it. They don’t just link users together, they also suspend them. They don’t just circulate our images and posts, they also algorithmically promote some over others. Platforms pick and choose.”
Theory, Culture & Society | 05.04.15
magazine article | 04.13.15
After his Facebook account was hacked, D&S advisor Baratunde Thurston saw the other side of allowing social media giants to access other websites, apps, and services. After reading his takeaways from the experience, you might reconsider connecting your Twitter account the next time you find yourself wondering, as Thurston puts it: “Exactly why do you need to know my email address and have me upload a profile photo, random app I don’t really care about?”
New Media & Society | 07.21.14
other | 07.09.14
magazine article | 01.13.14
“As consumers we’ve been told that we’re in charge, so we enjoy the ritual, even though it’s exhausting. We even decide when we’ll opt out. But what happens when companies walk away from us first?”
In this article, D&S advisor Baratunde Thurston proposes a way for users to take more agency in their relationships with apps and the companies that created them.