
In Content or Context Moderation?, Robyn Caplan illustrates the organizational contexts behind three types of content moderation strategies, drawing on interviews with 10 major digital platforms.

Proceed With Caution

points | 04.25.18

Kadija Ferryman, Elaine O. Nsoesie

After the Cambridge Analytica scandal, can internet data be used ethically for research? Data & Society Postdoctoral Scholar Kadija Ferryman and Elaine O. Nsoesie of the Institute for Health Metrics and Evaluation recommend “proceeding with caution” when it comes to internet data and precision medicine.

“Despite the public attention and backlash stemming from the Cambridge Analytica scandal — which began with an academic inquiry and resulted in at least 87 million Facebook profiles being disclosed — researchers argue that Facebook and other social media data can be used to advance knowledge, as long as these data are accessed and used in a responsible way. We argue that data from internet-based applications can be a relevant resource for precision medicine studies, provided that these data are accessed and used with care and caution.”

For Data & Society Points, Visiting Scholar Anne Washington breaks down the numbers behind Facebook and Cambridge Analytica.

“How did the choices made by only 270,000 Facebook users affect millions of people? How is it possible that the estimate of those affected changed from 50 million to 87 million so quickly? As a professor of data policy, I am interested in how information flows within organizations. In the case of Facebook and Cambridge Analytica, I was curious why this number was so inexact.”

In the wake of Cambridge Analytica, Data & Society Researcher Jacob Metcalf argues that the real risk lies in the behavioral models that have been developed from Facebook users’ data.

“But focusing solely on the purloined data is a mistake. Much more important are the behavioral models Cambridge Analytica built from the data. Even though the company claims to have deleted the data sets in 2015 in response to Facebook’s demands, those models live on, and can still be used to target highly specific groups of voters with messages designed to leverage their psychological traits. Although the stolen data sets represent a massive collection of individual privacy harms, the models are a collective harm, and far more pernicious.”

In this study, Data & Society Founder and President danah boyd, Affiliate Alice Marwick, and Researcher Mikaela Pitcan ask: how do young people of low socio-economic status in NYC manage their impressions online using tactics of respectability politics?

“This paper analyzes how upwardly mobile young people of low socio-economic status in New York City manage impressions online by adhering to normative notions of respectability. Our participants described how they present themselves on social media by self-censoring, curating a neutral image, segmenting content by platform, and avoiding content and contacts coded as lower class. Peers who post sexual images, primarily women, were considered unrespectable and subject to sexual shaming. These strategies reinforce racist and sexist notions of appropriate behavior, simultaneously enabling and limiting participants’ ability to succeed. We extend the impression management literature to examine how digital media mediates the intersection of class, gender, and race.”

How do algorithms & data-driven tech induce similarity across an industry? Data & Society Researcher Robyn Caplan and Founder & President danah boyd trace Facebook’s impact on news media organizations and journalists.

“This type of analysis sheds light on how organizational contexts are embedded into algorithms, which can then become embedded within other organizational and individual practices. By investigating technical practices as organizational and bureaucratic, discussions about accountability and decision-making can be reframed.”

Data & Society Media Manipulation Lead Joan Donovan investigates the development of InterOccupy, a virtual organization operated by participants in the Occupy Movement.

“InterOccupy took infrastructure building as a political strategy to ensure the movement endured beyond the police raids on the encampments. I conclude that NSMs create virtual organizations when there are routine and insurmountable failures in the communication milieu, where the future of the movement is at stake. My research follows the Occupy Movement ethnographically to understand what happens after the keyword.”

Is Facebook a platform or a media company? NBC News THINK asks D&S researcher Robyn Caplan to comment on the recent tech hearings.

“Facebook thinks of itself as a neutral platform where everyone can come and share ideas…They’re basically saying that they’re the neutral public sphere. That they are the marketplace of ideas, instead of being the marketers of ideas.”

D&S Media Manipulation Research Lead Joan Donovan talks about the role of large tech companies in curbing extremist activity online.

Joan Donovan, a media manipulation research lead at the research institute Data & Society, said it’s well within these companies’ reach to implement changes that will curb white supremacist activity. And it’s something she said major platforms like Facebook and Twitter will have to confront as they acknowledge their role in magnifying hate speech and those who spout it.

‘Richard Spencer might have a megaphone and his own website to communicate his messages of hate,’ Donovan said in a phone interview Wednesday. ‘Now these platforms are realizing they are the megaphone. They are the conduit between him and larger audiences.’

Movements like the so-called ‘alt-right’ aren’t just built on charisma, Donovan added — they’re built on infrastructure. The internet and all of its possibilities has now become a major part of that infrastructure.

Quartz cites D&S Postdoctoral Scholar Caroline Jack in their guide to Lexicon of Lies:

Problematic information comes in various forms, each uniquely irksome. Yet people are quick to blast all inaccuracies as “fake news,” reinforcing the sense that facts are a thing of the past.

That’s dangerous and it needn’t be the case, according to the Lexicon of Lies, a recent report from the New York-based Data and Society research institute. “The words we choose to describe media manipulation can lead to assumptions about how information spreads, who spreads it, and who receives it,” writes Caroline Jack, a media historian and postdoctoral fellow at Data and Society. On a cultural level, “these assumptions can shape what kinds of interventions or solutions seem desirable, appropriate, or even possible,” she writes.

What is the Alt-Right?

Teaching Tolerance | 08.17.14

Becca Lewis

D&S Researcher Becca Lewis discusses the recruiting methodologies of the alt-right in Teaching Tolerance.

‘Social media can be very powerful in shaping outlooks, but it doesn’t operate in a vacuum,’ explains Data & Society researcher Becca Lewis. ‘The shaping is coming from the other people using the platforms.’

The alt-right has a massive presence on social media and other channels where young people congregate. A Washington Post analysis identified 27,000 influential Twitter accounts associated with the alt-right, 13 percent of which are considered radical. Later, a George Washington University study found that white nationalist accounts in the United States have seen their follower counts grow by 600 percent since 2012.

D&S founder danah boyd discusses machine learning algorithms and prejudice, digital white flight on social media, trust in the media, and more on The Ezra Klein Show.

“Technology is made by people in a society, and it has a tendency to mirror and magnify the issues that affect everyday life.”

D&S researcher Alex Rosenblat explains how and why Uber & Lyft drivers surveil their passengers during rides.

Passenger shaming is partly a consequence of the Uber/Lyft business model. Drivers can’t get reliable accountability from their employers or passengers, so they turn to tools like dash-cams. These are part of the externalized costs of the lean gig economy employment model.

D&S affiliate Desmond Patton breaks down how social media can lead to gun violence in this piece in The Trace.

Social media doesn’t allow for the opportunity to physically de-escalate an argument. Instead, it offers myriad ways to exacerbate a brewing conflict as opposing gangs or crews and friends and family take turns weighing in.

Philip Napoli and D&S researcher Robyn Caplan write on why companies like Google and Facebook insist that they are merely tech companies with no media impact, and why they are wrong for First Monday. Abstract is below:

A common position amongst social media platforms and online content aggregators is their resistance to being characterized as media companies. Rather, companies such as Google, Facebook, and Twitter have regularly insisted that they should be thought of purely as technology companies. This paper critiques the position that these platforms are technology companies rather than media companies, explores the underlying rationales, and considers the political, legal, and policy implications associated with accepting or rejecting this position. As this paper illustrates, this is no mere semantic distinction, given the history of the precise classification of communications technologies and services having profound ramifications for how these technologies and services are considered by policy-makers and the courts.

D&S researcher Mary Madden was interviewed by the American Press Institute about Madden’s recent Knight Foundation-supported report, “How Youth Navigate the News Landscape.”

However, one of my favorite quotes was from a participant who described a future where news would be delivered by hologram: “I think like it’s going to be little holograms. You’re going to open this thing and a little guy’s going to come out and tell you about stuff.”

Given that some participants said they already found notifications annoying, I’m not sure how successful the little hologram guy would be, but it was clear that the participants fully expected that the news industry would continue to evolve and innovate in creative ways moving forward.

Narcissism, Social Media and Power

Other | 04.03.17

Alice Marwick, Miranda Giacomin

D&S fellow Alice Marwick discusses narcissism and the attention economy in this interview with Kinfolk.

D&S founder danah boyd discusses the problem with asking companies like Facebook and Google to ‘solve’ fake news; boyd argues that this solutionism misses the complex social problems from which fake news emerges.

Although a lot of the emphasis in the “fake news” discussion focuses on content that is widely spread and downright insane, much of the most insidious content out there isn’t in your face. It’s not spread widely, and certainly not by people who are forwarding it to object. It’s subtle content that is factually accurate, biased in presentation and framing, and encouraging folks to make dangerous conclusions that are not explicitly spelled out in the content itself.

Privacy, Poverty and Big Data: A Matrix of Vulnerabilities for Poor Americans

Washington University Law Review | 03.11.17

Mary Madden, Michele E. Gilman, Karen Levy, Alice E. Marwick

D&S researcher Mary Madden, Michele Gilman, D&S affiliate Karen Levy, and D&S fellow Alice Marwick examine how poor Americans are impacted by privacy violations and discuss how to protect digital privacy for the vulnerable. Abstract is as follows:

This Article examines the matrix of vulnerabilities that low-income people face as a result of the collection and aggregation of big data and the application of predictive analytics. On the one hand, big data systems could reverse growing economic inequality by expanding access to opportunities for low-income people. On the other hand, big data could widen economic gaps by making it possible to prey on low-income people or to exclude them from opportunities due to biases that get entrenched in algorithmic decision-making tools. New kinds of “networked privacy” harms, in which users are simultaneously held liable for their own behavior and the actions of those in their networks, may have particularly negative impacts on the poor. This Article reports on original empirical findings from a large, nationally-representative telephone survey with an oversample of low-income American adults and highlights how these patterns make particular groups of low-status internet users uniquely vulnerable to various forms of surveillance and networked privacy-related problems. In particular, a greater reliance on mobile connectivity, combined with lower usage of privacy-enhancing strategies may contribute to various privacy and security-related harms. The article then discusses three scenarios in which big data – including data gathered from social media inputs – is being aggregated to make predictions about individual behavior: employment screening, access to higher education, and predictive policing. Analysis of the legal frameworks surrounding these case studies reveals a lack of legal protections to counter digital discrimination against low-income people. In light of these legal gaps, the Article assesses leading proposals for enhancing digital privacy through the lens of class vulnerability, including comprehensive consumer privacy legislation, digital literacy, notice and choice regimes, and due process approaches. 
As policymakers consider reforms, the article urges greater attention to impacts on low-income persons and communities.


How Youth Navigate the News Landscape

report | 03.01.17

Mary Madden, Amanda Lenhart, Claire Fontaine

D&S researchers Mary Madden, Amanda Lenhart, and Claire Fontaine explore youth news consumption behavior on mobile and social media. The report reveals how young people are adapting to a changing media environment to access news they trust. The executive summary is below:

In 2017, what it means to “know what’s going on in the world” has become a hotly contested issue. Years of change and innovations in the journalism industry have radically transformed the way Americans consume, share and even produce their own forms of news. At a deeper level, the public’s eroding trust in journalistic institutions and the rise of a highly politicized networked digital media environment have underscored the urgent need to understand how these disruptions might evolve in the future.

As is often the case with technological revolutions, young people are on the front lines of change. They are deeply immersed in social media and mobile technologies in their daily lives, and are tasked with navigating an increasingly malleable media environment. And as researchers seek to understand the shifting behaviors and attitudes of today’s young news consumers, it has become increasingly important to reexamine the shifting boundaries of what counts as “news.” If we want to understand the place that news holds in young people’s lives, it is imperative that we understand their language, their conceptual models, and their frames of reference. These are the kinds of insights that interpretive qualitative research has the potential to surface.

In June and July of 2016, Knight Foundation commissioned a series of focus groups with 52 teenagers and young adults from across the United States to learn more about how young people conceptualize and consume news in digital spaces—with a focus on understanding the growing influence of mobile devices, social media and messaging apps. The research team conducted six exploratory focus groups of about 90 minutes each in three cities in the United States: Philadelphia, Chicago and Charlotte, North Carolina. Participants were between the ages of 14 and 24 and included an even mix of young men and women.

Fake news is a red herring

Deutsche Welle | 01.25.17

Ethan Zuckerman

D&S advisor Ethan Zuckerman writes about fake news and the deeper problem behind it.

The truly disturbing truth is that fake news isn’t the cause of our contemporary political dysfunction. More troublingly, we live in a world where people disagree deeply and fundamentally about how to understand it, even when we share the same set of facts. Solving the problems of fake news make that world slightly easier to navigate, but they don’t scratch the surface of the deeper problems of finding common ground with people with whom we disagree.

D&S advisor Baratunde Thurston details his exploration of The Glass Room exhibit.

I want to see The Glass Room everywhere there is an Apple Store…And anyone founding or working for a tech company should have to prove they’ve gone through this space and understood its meaning.

D&S affiliate Angèle Christin was quoted in this Wired piece discussing the pervasiveness of election news online.

In that process, such conversations start to invade “social areas that are usually sheltered from heated political discussions,” says Angèle Christin, professor of communication at Stanford. She says social media, including seemingly anodyne environments like a parenting forum, actually accentuate the problem because they blend the private and public.

D&S fellow Alice E. Marwick wrote this op-ed discussing how online harassment disproportionately impacts women and minorities.

In a divisive time for American society, it’s crucial that everyone is heard. Social media companies need to take a stand and ensure that destructive online behavior doesn’t turn people away from sharing their voices.

D&S founder danah boyd writes her response to the recent “Online Harassment, Digital Abuse, and Cyberstalking in America” report.

27% of all American internet users self-censor what they post online out of fear of online harassment. Young people are especially prone to self-censorship. This is deeply disturbing. We often worry about free speech online, but we don’t consider the social factors that prompt people to censor themselves or the ways in which this impacts some groups more than others.

D&S advisor Christina Xu writes about fake news and conspiracy theories in China.

Here in China, even well-educated and progressive friends have sincerely asked me about some pretty niche conspiracies. Did Hillary really assassinate someone? (No.) Didn’t Trump win 90% of the vote? (No.) Yesterday, someone even mentioned that they really liked a poem he wrote about his vision for America’s future. (What.)

D&S researcher Tim Hwang participates in episode 11 of Big Thinkers.

D&S affiliate Natasha Singer profiles how high school students are encouraged to develop their social media personas for college admissions, particularly on LinkedIn.

Now some social media experts are advising high school seniors to go even further. They are coaching students to take control of their online personas — by creating elaborate profiles on LinkedIn, the professional network, and bringing them to the attention of college admissions officers.

D&S founder danah boyd was recently interviewed for 52 Insights.

As she fires out her progressive opinions at rapid speed, what becomes immediately apparent is just how immensely passionate she is about her work. Her research revolves around the world of new technologies, social media and today’s youth and how they all fit together in our society. She is also a Principal Researcher at Microsoft and founder of the Data & Society Research Institute. As we discover, she is a fervent defender of young people and admonishing of her own generation’s eagerness to place blame on them. At the end of the day, her work is very much about the notion of equality and how we can create it with these many new tools we have.

We believe danah boyd has some very important things to say, and with over 100,000 followers on Twitter, her voice is already being heard.

D&S researcher Amanda Lenhart participates in the Relationships and Privacy in a World of Tinder and Twitter panel at Kids Online.

Automatically Processing Tweets from Gang-Involved Youth: Towards Detecting Loss and Aggression

paper | 10.11.16

Terra Blevins, Robert Kwiatkowski, Jamie C. Macbeth, Kathleen Mckeown, Desmond Patton, Owen Rambow

D&S affiliate Desmond Patton, with Terra Blevins, Robert Kwiatkowski, Jamie C. Macbeth, Kathleen Mckeown, and Owen Rambow, wrote this paper exploring a body of texts from a female gang member and examining patterns of speech that indicate an aggression trigger.

Violence is a serious problem for cities like Chicago and has been exacerbated by the use of social media by gang-involved youths for taunting rival gangs. We present a corpus of tweets from a young and powerful female gang member and her communicators, which we have annotated with discourse intention, using a deep read to understand how and what triggered conversations to escalate into aggression. We use this corpus to develop a part-of-speech tagger and phrase table for the variant of English that is used, as well as a classifier for identifying tweets that express grieving and aggression.

Breaking the Black Box: What Facebook Knows About You

ProPublica | 09.28.16

Julia Angwin, Terry Parris Jr., Surya Mattu

Julia Angwin, Terry Parris Jr., and D&S affiliate Surya Mattu explore what Facebook knows about its users.

We built a tool that works with the Chrome Web browser that lets you see what Facebook says it knows about you — you can rate the data for accuracy and you can send it to us, if you like. We will, of course, protect your privacy. We won’t collect any identifying details about you. And we won’t share your personal data with anyone.

D&S founder danah boyd critiques traditional media’s response to the September 17th bombings in NYC.

Traditional news media has a lot of say in what it publishes. This is one of the major things that distinguishes it from social media, which propagates the fears and anxieties of the public. And yet, time and time again, news media shows itself to be irresponsible, motivated more by the attention and money that it can obtain by stoking people’s fears than by a moral responsibility to help ground an anxious public.

D&S researcher Robyn Caplan co-wrote a paper analyzing how many large, well-known companies, such as BuzzFeed and Facebook, argue against being categorized as media companies. Caplan and co-author Philip M. Napoli assert that this argument has led to the misclassification of these companies, a misclassification with profound policy implications.

A common position amongst online content providers/aggregators is their resistance to being characterized as media companies. Companies such as Google, Facebook, BuzzFeed, and Twitter have argued that it is inaccurate to think of them as media companies. Rather, they argue that they should be thought of as technology companies. The logic of this position, and its implications for communications policy, have yet to be thoroughly analyzed. However, such an analysis is increasingly necessary as the dynamics of news and information production, dissemination, and consumption continue to evolve. This paper will explore and critique the logic and motivations behind the position that these content providers/aggregators are technology companies rather than media companies, as well as the communications policy implications associated with accepting or rejecting this position.

Anil Dash wrote a piece describing the evolution of blogs. He compares past and current capabilities of features such as search, comments, and following. He concludes with:

Ultimately, though, I think most of these ideas were good ideas the first time around and will remain good ideas in whatever modern incarnation revives them for a new generation. I have no doubt there’s a billion-dollar company waiting to be founded based on revisiting one of the concepts outlined here.

D&S fellow Natasha Singer co-wrote a piece discussing Facebook’s new partnership with Summit Public Schools to ‘introduce a free student-directed learning system’.

The Facebook-Summit partnership, by contrast, is more of a ground-up effort to create a national demand for student-driven learning in schools. Facebook announced its support for the system last September; the company declined to comment on how much it is spending on it. Early this month, Summit and Facebook opened the platform up to individual teachers who have not participated in Summit’s extensive on-site training program.

D&S Researcher Robyn Caplan considers whether Facebook is saving journalism or ruining it:

The question of whether Facebook is saving or ruining journalism is not relevant here because, like it or not, Facebook is a media company. That became more apparent recently as human editors became a visible part of Facebook’s news curation process. In truth, this team is only a tiny fraction of a network of actors whose decisions affect the inner workings of Facebook’s platform and the content we see.

In this background primer, D&S Research Analyst Laura Reed and D&S Founder danah boyd situate the current debate around the role of technology in the public sphere within a historical context. They identify and tease out some of the underlying values, biases, and assumptions present in the current debate surrounding the relationship between media and democracy, and connect them to existing scholarship within media history that is working to understand the organizational, institutional, social, political, and economic factors affecting the flow of news and information. They also identify a set of key questions to keep in mind as the conversation around technology and the public sphere evolves.

Algorithms play an increasingly significant role in shaping the digital news and information landscape, and there is growing concern about the potential negative impact that algorithms might have on public discourse. Examples of algorithmic biases and increasingly curated news feeds call into question the degree to which individuals have equal access to the means of producing, disseminating, and accessing information online. At the same time, these debates about the relationship between media, democracy, and publics are not new, and linking those debates to these emerging conversations about algorithms can help clarify the underlying assumptions and expectations. What do we want algorithms to do in an era of personalization? What does a successful algorithm look like? What form does an ideal public sphere take in the digital age? In asking these and other questions, we seek to highlight what’s at stake in the conversation about algorithms and publics moving forward.

D&S Research Analyst Laura Reed and D&S Researcher Robyn Caplan put together a set of case studies to complement the contemporary issues primer, Mediation, Automation, and Power, for the Algorithms and Publics project. These case studies explore situations in which algorithmic media is shaping the public sphere across a variety of dimensions, including the changing role of the journalism industry, the use of algorithms for censorship or international compliance, how algorithms are functioning within foreign policy aims, digital gerrymandering, the spread of misinformation, and more.

The processes editors have used to filter information were never transparent, hence the enthusiasm of the early 2000s for unfiltered media. What may be new is the pervasiveness of the gatekeeping that algorithms make possible, the invisibility of that filtering and the difficulty of choosing which filters you want shaping your conversation.

Points/public spheres: “Ben Franklin, the Post Office and the Digital Public Sphere” was originally published at … My Heart’s in Accra. It’s the essay version of Ethan Zuckerman’s opening remarks at the Who Controls the Public Sphere in the Era of Algorithms? workshop, hosted by Data & Society as part of our developing Algorithms and Publics project. Video is available here.

D&S Board Member Anil Dash contrasts two recent approaches to making internet connectivity more widely available. Comparing the efforts to build consensus behind Facebook’s Free Basics initiative to LinkNYC, the recently-launched program to bring free broadband wifi to New York City, Dash views each situation as a compelling example of who gets heard, and when, any time a big institution tries to create a technology infrastructure to serve millions of people.

There’s one key lesson we can take from these two attempts to connect millions of people to the Internet: it’s about building trust. Technology infrastructure can be good or bad, extractive or supportive, a lifeline or a raw deal. Objections to new infrastructure are often dismissed by the people pushing them, but people’s concerns are seldom simply about advertising or being skeptical of corporations. There are often very good reasons to look a gift horse in the mouth.

Whether we believe in the positive potential of getting connected simply boils down to whether we feel the people providing that infrastructure have truly listened to us. The good news is, we have clear examples of how to do exactly that.

Platforms Intervene

Social Media + Society | 05.11.15

Tarleton Gillespie

“Social media platforms don’t just guide, distort, and facilitate social activity, they also delete some of it. They don’t just link users together, they also suspend them. They don’t just circulate our images and posts, they also algorithmically promote some over others. Platforms pick and choose.”

“The phenomenon of ‘social media’ has more to do with its cultural positioning than its technological affordances. Rooted in the broader “Web 2.0” landscape, social media helped engineers, entrepreneurs, and everyday people reimagine the role that technology could play in information dissemination, community development, and communication. While the technologies invoked by the phrase social media have a long history, what unfolded in the 2000s reconfigured socio-technical practices in significant ways. Reflecting on the brief history of social media, this essay argues for the need to better understand this phenomenon.”

Social Media, Financial Algorithms and the Hack Crash

Theory, Culture & Society | 05.04.15

Tero Karppi, Kate Crawford

‘@AP: Breaking: Two Explosions in the White House and Barack Obama is injured’. So read a tweet sent from a hacked Associated Press Twitter account @AP, which affected financial markets, wiping out $136.5 billion of the Standard & Poor’s 500 Index’s value. While the speed of the Associated Press hack crash event and the proprietary nature of the algorithms involved make it difficult to make causal claims about the relationship between social media and trading algorithms, we argue that it helps us to critically examine the volatile connections between social media, financial markets, and third parties offering human and algorithmic analysis. By analyzing the commentaries of this event, we highlight two particular currents: one formed by computational processes that mine and analyze Twitter data, and the other being financial algorithms that make automated trades and steer the stock market. We build on sociology of finance together with media theory and focus on the work of Christian Marazzi, Gabriel Tarde and Tony Sampson to analyze the relationship between social media and financial markets. We argue that Twitter and social media are becoming more powerful forces, not just because they connect people or generate new modes of participation, but because they are connecting human communicative spaces to automated computational spaces in ways that are affectively contagious and highly volatile.

After his Facebook account was hacked, D&S advisor Baratunde Thurston saw the other side of allowing social media giants to access other websites, apps, and services. After reading his takeaways from the experience, you might reconsider connecting your Twitter account the next time you find yourself wondering, as Thurston puts it: “Exactly why do you need to know my email address and have me upload a profile photo, random app I don’t really care about?”

Social Media Ethics

WNET Religion & Ethics Newsweekly | 01.09.15

danah boyd

“Some social media companies—including Facebook—have run experiments to learn what influences user behavior. Many of these experiments have troubled both social media users and privacy advocates, who worry that this research and use of personal information is unethical.[…]
“[D&S founder danah] boyd: ‘I’m more concerned about how you get engineers to be thinking about ethical decisions and what it means to be training engineers from the get-go to really think about ethics.'”


Abstract: While much attention is given to young people’s online privacy practices on sites like Facebook, current theories of privacy fail to account for the ways in which social media alter practices of information-sharing and visibility. Traditional models of privacy are individualistic, but the realities of privacy reflect the location of individuals in contexts and networks. The affordances of social technologies, which enable people to share information about others, further preclude individual control over privacy. Despite this, social media technologies primarily follow technical models of privacy that presume individual information control. We argue that the dynamics of sites like Facebook have forced teens to alter their conceptions of privacy to account for the networked nature of social media. Drawing on their practices and experiences, we offer a model of networked privacy to explain how privacy is achieved in networked publics.


Re: “Experimental evidence of massive-scale emotional contagion through social networks”

other | 07.09.14

danah boyd, Kate Crawford, Ed Felten, Tarleton Gillespie, Micah Sifry, Anthony Townsend, Janet Vertesi

Data & Society community members wrote individually about “Experimental evidence of massive-scale emotional contagion through social networks” (PNAS 2014 111 (24) 8788-8790) and the public controversy surrounding the study. Their comments are collected here:

danah boyd, What does the Facebook experiment teach us?

Kate Crawford, The Test We Can—and Should—Run on Facebook

Ed Felten, Facebook’s Emotional Manipulation Study: When Ethical Worlds Collide; Privacy Implications of Social Media Manipulation; and On the Ethics of A/B Testing

Tarleton Gillespie, Facebook’s algorithm — why our assumptions are wrong, and our concerns are right

Micah Sifry, Why Facebook’s ‘Voter Megaphone’ Is the Real Manipulation to Worry About

Anthony Townsend, The Ethics of Experimentation in the IoT

Janet Vertesi, The Real Reason You Should Be Worried About That Facebook Experiment

(Updated October 23, 2014.)

“As consumers we’ve been told that we’re in charge, so we enjoy the ritual, even though it’s exhausting. We even decide when we’ll opt out. But what happens when companies walk away from us first?”

In this article, D&S advisor Baratunde Thurston proposes a way for users to take more agency in their relationships with apps and the companies that created them.
