Building on the research for her book Uberland: How Algorithms Are Rewriting the Rules of Work, Data & Society Researcher Alex Rosenblat explains algorithmic management in the gig economy.
“Data and algorithms are presented as objective, neutral, even benevolent: Algorithms gave us super-convenient food delivery services and personalized movie recommendations. But Uber and other ride-hailing apps have taken the way Silicon Valley uses algorithms and applied it to work, and that’s not always a good thing.”
Slate | 08.13.18
Drawing on conclusions from the Data & Society report Beyond Disruption, Researcher Alexandra Mateescu discusses surveillance of domestic care workers online.
“Online marketplaces may not be the root cause of individual employers’ biases, but their design is not neutral. They are built with a particular archetype of what an “entrepreneurial” domestic worker looks like—one who feels at home in the world of apps, social media, and online self-branding—and ultimately replicates and can even exacerbate many of the divisions that came with our predigital workplaces. As platform companies gain growing power over the hiring processes of a whole industry, they will need to actively work against the embedded inequalities in the markets they now mediate.”
In an op-ed for The New York Times, Data & Society Researcher Alex Rosenblat shatters the narrative that Uber encapsulates the entire gig economy.
“But this industry has, until recently, operated largely informally, with jobs secured by word-of-mouth. That’s changing, as employers are increasingly turning to Uber-like services to find nannies, housecleaners and other care workers. These new gig economy companies, while making it easier for some people to find short-term work, have created hardships for others, and may leave many experienced care workers behind.”
In this reading list, Data & Society Researcher Alexandra Mateescu and Postdoctoral Scholar Julia Ticona provide a pathway for deeper investigations into themes such as gender inequality and algorithmic visibility in the gig economy.
“This list is meant for readers of Beyond Disruption who want to dig more deeply into some of the key areas explored in its pages. It isn’t meant to be exhaustive, but rather give readers a jumping off point for their own investigations.”
Drawn from the experiences of U.S. ridehail, care, and cleaning platform workers, “Beyond Disruption” demonstrates how technology reshapes the future of labor.
New Media & Society | 05.15.18
Data & Society Postdoctoral Scholar Julia Ticona and Research Analyst Alexandra Mateescu investigate the consequences of “visibility” in carework apps.
“Based on a discourse analysis of carework platforms and interviews with workers using them, we illustrate that these platforms seek to formalize employment relationships through technologies that increase visibility. We argue that carework platforms are “cultural entrepreneurs” that create and maintain cultural distinctions between populations of workers, and institutionalize those distinctions into platform features. Ultimately, the visibility created by platforms does not realize the formalization of employment relationships, but does serve the interests of platform companies and clients and exacerbate existing inequalities for workers.”
Fast Company | 03.29.18
Data & Society Postdoctoral Scholar Julia Ticona and Data & Society Research Analyst Alexandra Mateescu co-authored an op-ed for Fast Company about the safety of workers who rely on digital platforms to stay employed.
“For the past year, we’ve been interviewing nannies, babysitters, elder care workers, and housecleaners across the U.S. who use platforms like Handy, TaskRabbit, and the in-home care provider platform Care.com to do care and cleaning work, in an effort to better understand how platforms are shaping domestic work. Along the way, we have found that, in many cases, the aggregation of individual data leads not to more accountability and justice, but rather forces workers to make trade-offs between visibility and vulnerability.”
Slate | 03.02.18
Data & Society Researcher Alex Rosenblat unveils the impact of Uber’s new driving limit policy.
“These moves from Uber and Lyft seem to align with their gig-economy model of employment, which structures work as an individual pursuit and individual liability. But even this sell is misleading. While, for many drivers, the idea of being independent at work is very appealing, their ability to make entrepreneurial decisions is consistently constrained by the ride-hail apps’ nudges and other algorithmic management, rules, external costs, and wage cuts.”
Data & Society Operations Assistant Richard Salame applies the lens of Taylorism to Amazon’s new wristbands that track workers’ movements.
“Amazon’s peculiar culture notwithstanding, the wristbands in many ways don’t offer anything new, technologically or conceptually. What has changed is workers’ ability to challenge this kind of surveillance.”
In the gig economy, management by algorithms means employment relationships grow more remote and distributed across the network. Alex Rosenblat explains how workers navigate this by creating their own forums.
“Online forums aren’t just helping drivers like Cole navigate the challenges of their work, and helping those of us who use and study these platforms grasp those challenges too. They show how as employment relationships grow more remote and distributed across the network, workers can adapt, using technology to forge their own workplace culture.”
Columbia Law Review | 11.01.17
D&S researcher Alex Rosenblat co-authored an article on power dynamics in the sharing economy.
“Sharing economy firms such as Uber and Airbnb facilitate trusted transactions between strangers on digital platforms. This creates economic and other value but raises concerns around racial bias, safety, and fairness to competitors and workers that legal scholarship has begun to address. Missing from the literature, however, is a fundamental critique of the sharing economy grounded in asymmetries of information and power.”
D&S researcher Alex Rosenblat explores when incentives in the gig economy become deceptive.
“While charging for work opportunities is reminiscent of multi-level marketing, like Mary Kay or Amway, this is different because Uber controls so much of the labor process, like dispatch, and competing promotional pay, in addition to setting the base rates at which drivers earn their income. In other words, drivers can use their labor as collateral on their down payment now in exchange for earning a premium on their labor later, but Uber ultimately controls whether or not the promotion is worthwhile.”
Data & Society Researcher Alexandra Mateescu maps out the inequalities and power dynamics within the gig economy.
“As on-demand companies like Handy and online marketplaces like Care.com enter the space of domestic work, a range of questions emerge: what are the risks and challenges of signing up for platform-based work as an immigrant? As a non-native English speaker? How are experiences of work different for individuals with strong professional identities as caregivers or housekeepers, versus more casual workers who may also be finding other kinds of work via Postmates or Uber?”
D&S researcher Alex Rosenblat explains how and why Uber & Lyft drivers surveil their passengers during rides.
Passenger shaming is partly a consequence of the Uber/Lyft business model. Drivers can’t get reliable accountability from their employers or passengers, so they turn to tools like dash-cams. These are part of the externalized costs of the lean gig economy employment model.
Washington University Law Review | 03.11.17
D&S researcher Mary Madden, Michele Gilman, D&S affiliate Karen Levy, and D&S fellow Alice Marwick examine how poor Americans are impacted by privacy violations and discuss how to protect digital privacy for the vulnerable. Abstract is as follows:
This Article examines the matrix of vulnerabilities that low-income people face as a result of the collection and aggregation of big data and the application of predictive analytics. On the one hand, big data systems could reverse growing economic inequality by expanding access to opportunities for low-income people. On the other hand, big data could widen economic gaps by making it possible to prey on low-income people or to exclude them from opportunities due to biases that get entrenched in algorithmic decision-making tools. New kinds of “networked privacy” harms, in which users are simultaneously held liable for their own behavior and the actions of those in their networks, may have particularly negative impacts on the poor. This Article reports on original empirical findings from a large, nationally-representative telephone survey with an oversample of low-income American adults and highlights how these patterns make particular groups of low-status internet users uniquely vulnerable to various forms of surveillance and networked privacy-related problems. In particular, a greater reliance on mobile connectivity, combined with lower usage of privacy-enhancing strategies may contribute to various privacy and security-related harms. The article then discusses three scenarios in which big data – including data gathered from social media inputs – is being aggregated to make predictions about individual behavior: employment screening, access to higher education, and predictive policing. Analysis of the legal frameworks surrounding these case studies reveals a lack of legal protections to counter digital discrimination against low-income people. In light of these legal gaps, the Article assesses leading proposals for enhancing digital privacy through the lens of class vulnerability, including comprehensive consumer privacy legislation, digital literacy, notice and choice regimes, and due process approaches. 
As policymakers consider reforms, the article urges greater attention to impacts on low-income persons and communities.
Columbia Law Review | 03.07.17
Ryan Calo and D&S researcher Alex Rosenblat write this analysis of the newly termed ‘taking economy’ of Uber.
Sharing economy firms such as Uber and Airbnb facilitate trusted transactions between strangers on digital platforms. This creates economic and other value and raises a set of concerns around racial bias, safety, and fairness to competitors and workers that legal scholarship has begun to address. Missing from the literature, however, is a fundamental critique of the sharing economy grounded in asymmetries of information and power. This Article, coauthored by a law professor and a technology ethnographer who studies the ride-hailing community, furnishes such a critique and indicates a path toward a meaningful response.
Commercial firms have long used what they know about consumers to shape their behavior and maximize profits. By virtue of sitting between consumers and providers of services, however, sharing economy firms have a unique capacity to monitor and nudge all participants—including people whose livelihood may depend on the platform. Much activity is hidden away from view, but preliminary evidence suggests that sharing economy firms may already be leveraging their access to information about users and their control over the user experience to mislead, coerce, or otherwise disadvantage sharing economy participants.
This Article argues that consumer protection law, with its longtime emphasis on asymmetries of information and power, is relatively well positioned to address this under-examined aspect of the sharing economy. But the regulatory response to date seems outdated and superficial. To be effective, legal interventions must (1) reflect a deeper understanding of the acts and practices of digital platforms and (2) interrupt the incentives of sharing economy firms to abuse their position.
D&S advisor Anil Dash discusses Fake Markets that are dominated by few tech companies.
Worse, we’ve lost the ability to discern that a short-term benefit for some users that’s subsidized by an unsustainable investment model will lead to terrible long-term consequences for society. We’re hooked on the temporary infusion of venture capital dollars into vulnerable markets that we know are about to be remade by technological transformation and automation. The only social force empowered to anticipate or prevent these disruptions is policymakers, who are often too illiterate to understand how these technologies work, and who too desperately want the halo of appearing to be associated with “high tech”, the secular religion of America.
D&S researcher Alex Rosenblat was interviewed by Radio NZ about Uber and the promises it makes to its drivers, such as flexible hours and freedom.
D&S post-doctoral scholar Caroline Jack responds to “Gig Work, Online Selling and Home Sharing” from Pew Research Center.
The more that is known about the workers and the work of the on-demand economy, the stronger the call for platform builders to make systems for sustainable work: systems that acknowledge the lived conditions and external factors that affect workers.
D&S post-doctoral scholar Julia Ticona responds to “Gig Work, Online Selling and Home Sharing” from Pew Research Center.
Contingent work has always been prevalent in communities where workers have been historically excluded from secure jobs, from union membership, and even from wider public forms of social welfare through systemic forms of discrimination. For these workers, there was no “golden era” of plentiful stable work and a strong social safety net. Despite these long-standing trends, emerging forms of on-demand labor, and the data-driven technologies that workers interact with, can deepen the vulnerabilities of certain populations of workers.
D&S researcher Alex Rosenblat writes about the motivations of gig economy workers.
In sum, the effects of the gig economy on the workforce are mixed. These platforms seem to benefit people earning supplementary income or those lacking other job opportunities the most, while they impose the most risk on full-time earners. And Uber and Lyft are still facing legal challenges in the U.S. for classifying drivers as independent contractors, as opposed to employees who can receive benefits. (In the U.K., an employment tribunal recently ruled that two Uber drivers must receive employee benefits, like the national living wage. Uber plans to appeal that ruling.)
The New York Times | 10.24.16
D&S affiliate Natasha Singer and Michael D. Shear co-wrote this piece discussing President Obama’s legacy in technology and how he could continue to contribute in the future.
Knight Foundation blog | 10.19.16
paper | 10.19.16
D&S researchers Alex Rosenblat and Tim Hwang and D&S affiliates Solon Barocas and Karen Levy examine how bias may creep into evaluations of Uber drivers through consumer-sourced rating systems:
“Through the rating system, consumers can directly assert their preferences and their biases in ways that companies are prohibited from doing on their behalf. The fact that customers may be racist, for example, does not license a company to consciously or even implicitly consider race in its hiring decisions. The problem here is that Uber can cater to racists, for example, without ever having to consider race, and so never engage in behavior that amounts to disparate treatment. In effect, companies may be able to perpetuate bias without being liable for it.”
testimony | 10.13.16
D&S affiliate Ifeoma Ajunwa testified at the U.S. Equal Employment Opportunity Commission to discuss big data in the workplace.
Good afternoon, Chair Yang and members of the Commission. First, I would like to thank the Commission for inviting me to this meeting. My name is Ifeoma Ajunwa, I am a Fellow at the Berkman Klein Center at Harvard University and an Assistant Professor at the University of the District of Columbia School of Law. I have authored several papers regarding worker privacy, with an emphasis on health law and genetic discrimination, from which my testimony today is largely drawn.
Today, I will summarize a number of practices that employers have begun to deploy to collect information on employees, and my concerns that such information could ultimately be acquired and sold by data brokers or stored in databanks. There are few legal limitations on how this sensitive information could be used, sold, or otherwise disseminated. Absent careful safeguards, demographic information and sensitive health information and genetic information is at risk for being incorporated in the Big Data analytics technologies that employers are beginning to use — and which challenge the spirit of antidiscrimination laws such as the Americans with Disabilities Act (the “ADA”) and the Genetic Information Non-Discrimination Act (“GINA”).
paper | 10.13.16
D&S researchers Alex Rosenblat and Tim Hwang explore “the significant role of worker motivations and regional political environments on the social and economic outcomes of automation” in this new paper.
Preliminary observations of rideshare drivers and their changing working conditions reveal the significant role of worker motivations and regional political environments on the social and economic outcomes of automation. Technology’s capacity for social change is always combined with non-technological structures of power—legislation, economics, and cultural norms.
Harvard Business Review | 08.31.16
D&S affiliate Solon Barocas and D&S fellow Karen Levy examine a concept called refractive surveillance, in which surveillance of one group impacts another.
Debates about consumer privacy have largely missed the fact that firms’ ability to develop a better understanding of consumers also impacts workers’ day-to-day experiences, their job security, and their financial well-being.
But our research suggests that data collection frequently also impacts people other than those being surveilled. We call this dynamic refractive surveillance. In other words, collecting information about one group can facilitate control over an entirely different group. In our ongoing study, we investigate this dynamic in the context of retail tracking, to understand how data collection about customers can impact how retail workers are managed.
TheRideShareGuy.com | 08.10.16
D&S researcher Alex Rosenblat wrote a blog post discussing how Uber can implement wage withholding policies without driver input or negotiation.
This setup essentially creates disincentives for drivers to retrieve the wages they’re owed. An analogy I think of, by comparison, is how cell phone companies can cram small fees into customer bills. Only some percentage of customers are actively tracking their bills, and some percentage of those are willing to spend an hour on the phone with a well-meaning but ineffective customer service agent to get back their small fee.
International Journal of Communication | 07.31.16
D&S researcher Alex Rosenblat and Luke Stark published a case study of Uber drivers, highlighting the information and power asymmetries produced by the Uber application. The abstract is below.
Uber manages a large, disaggregated workforce through its ridehail platform, one that delivers a relatively standardized experience to passengers while simultaneously promoting its drivers as entrepreneurs whose work is characterized by freedom, flexibility, and independence. Through a nine-month empirical study of Uber driver experiences, we found that Uber does leverage significant indirect control over how drivers do their jobs. Our conclusions are twofold: First, the information and power asymmetries produced by the Uber application are fundamental to its ability to structure control over its workers; second, the rhetorical invocations of digital technology and algorithms are used to structure asymmetric corporate relationships to labor, which favor the former. Our study of the Uber driver experience points to the need for greater attention to the role of platform disintermediation in shaping power relations and communications between employers and workers.
D&S researcher Alex Rosenblat discusses safety and surveillance of Uber and Lyft drivers in Medium. From neighborhood discrimination to threats of violence, drivers describe safety issues while disclosing how they feel Uber and Lyft are surveilling them, such as through suspected camera spying.
When drivers discuss the dangers of their job, they usually reference a passenger who made them uncomfortable, or, more commonly, specific neighborhoods they avoid, such as by logging out when they’re nearby so they don’t get a ride request. Most drivers know it’s taboo to explicitly discriminate based on destination, and they generally express a willingness to accommodate passenger requests, but sometimes perceptions about dangerous neighborhoods become a factor in their risk assessment. (One of the big selling points for ridehail services is that they go where cabs refuse to venture, particularly to low-income, minority neighborhoods).
D&S researcher Alex Rosenblat writes more of her field notes.
There are lots of reasons drivers might opt to disguise or promote their work as ridehail drivers. To help passengers locate them on a busy street, trade dress can be helpful, but not all drivers want to be identified explicitly as Uber or Lyft drivers.
D&S researcher Alex Rosenblat wrote this piece narrating her many interviews with Uber drivers around the country. In this article, Rosenblat highlights many aspects of Uber drivers’ work and lives, including working in different regional contexts, anxieties around information privacy, and learning English on the job.
Just because software is universally deployable, though, doesn’t mean that work is experienced the same way everywhere, for everyone. The app works pretty much the same way in different places, and produces a workforce that behaves relatively homogeneously to give passengers a reliable experience — it’s easy to come away with the impression that the work experience is standardized, too.
Medium | 05.18.16
D&S Researcher Alex Rosenblat on the fallout of the Austin Transportation’s showdown with Uber and Lyft:
Uber allied with Lyft in Austin to lobby against an ordinance passed by the city council which requires ridehail drivers to undergo fingerprint-based background checks. The two companies spent $8.1 million combined to encourage (i.e. bombard with robo-texts) Austin voters to oppose the ordinance in a referendum vote called Proposition 1. If local cities take a stand against Uber or Lyft’s demand about background checks, and they prevail, that could produce a ripple effect in other cities that have regulatory demands. The local impact on Austin is a secondary concern to the global and national ambitions of imperial Uber and parochial Lyft. When they lost the vote on Prop. 1, they followed through on their threats to withdraw their services.
Medium | 04.29.16
D&S Researcher Alex Rosenblat examines and problematizes Uber’s stance against tipping and the resulting effects on Uber drivers.
Harvard Business Review | 04.06.16
D&S Researcher Alex Rosenblat examines how Uber’s app design and deployment redistributes management functions to semiautomated and algorithmic systems, as well as to consumer ratings systems, creating ambiguity around who is in charge and what is expected of workers. Alex also raises questions about Uber’s neutral branding as an intermediary between supply (drivers) and demand (passengers) and considers the employment structures and hierarchies that emerge through its software platform:
Most conversations about the future of work and automation focus on issues of worker displacement. We’re only starting to think about the labor implications in the design of platforms that automate management and coordination of workers. Tools like the rating system, performance targets and policies, algorithmic surge pricing, and insistent messaging and behavioral nudges are part of the “choice architecture” of Uber’s system: it can steer drivers to work at particular places and at particular times while maintaining that its system merely reflects demand to drivers. These automated and algorithmic management tools complicate claims that drivers are independent workers whose employment opportunities are made possible through a neutral, intermediary software platform.
In many ways, automation can obscure the role of management, but as our research illustrates, algorithmic management cannot be conflated with worker autonomy. Uber’s model clearly raises new challenges for companies that aim to produce scalable, standardized services for consumers through the automation of worker-employer relationships.
D&S Researcher Alex Rosenblat unpacks the implications of Uber’s power to unilaterally set and change the rates passengers pay, the rates that drivers are paid, and the commission Uber takes. She also asks whether the conditions of driving for Uber are necessarily a form of “collusion”:
How drivers earn money is directly impacted by the policies and behavioral expectations Uber devises for how they interact with the Uber platform, and with Uber passengers. Drivers have to meet Uber’s performance targets in their local markets, such as a 90% ride acceptance rate, a low cancellation rate, like 5%, and maintain a high average rating that hovers at a minimum of 4.6/5 stars, often by performing according to Uber’s “recommended” etiquette. If they fall below the local performance targets, they risk deactivation (an Uber word for “temporarily suspended” or “fired”).
Aside from these implicit controls over how drivers interact with the system, Uber has a policy of blind passenger acceptance through its automated dispatcher. The system is designed to encourage drivers to accept all rides by hiding the destination of the passenger, generating goodwill for the company and support from its passenger base. In effect, not only does Uber set the price — Uber also requires drivers to accept those fares when drivers might otherwise reject them for being unprofitable, such as short, minimum fare rides. Drivers also receive deactivation warnings for displaying a preference for surge fares over non-surge fares. As such, their eligibility to work on the Uber platform could plausibly be construed as contingent on the very conditions that would violate anti-trust laws: if they are not in compliance with Uber’s system for setting prices, they risk deactivation. Those restrictions on drivers’ independence really call into question their ability to act freely as entrepreneurs.
paper | 03.10.16
D&S Fellow Sorelle Friedler and D&S Affiliate Ifeoma Ajunwa argue in this essay that well-settled legal doctrines prohibiting discrimination against job applicants on the basis of sex or race dictate an examination of how algorithms are employed in the hiring process, with the specific goals of: 1) predicting whether such algorithmic decision-making could generate decisions having a disparate impact on protected classes; and 2) repairing input data in such a way as to prevent disparate impact from algorithmic decision-making.
Major advances in machine learning have encouraged corporations to rely on Big Data and algorithmic decision making with the presumption that such decisions are efficient and impartial. In this Essay, we show that protected information that is encoded in seemingly facially neutral data could be predicted with high accuracy by algorithms and employed in the decision-making process, thus resulting in a disparate impact on protected classes. We then demonstrate how it is possible to repair the data so that any algorithm trained on that data would make non-discriminatory decisions. Since this data modification is done before decisions are applied to any individuals, this process can be applied without requiring the reversal of decisions. We make the legal argument that such data modifications should be mandated as an anti-discriminatory measure. And akin to Professor Ayres’ and Professor Gerarda’s Fair Employment Mark, such data repair that is preventative of disparate impact would be certifiable by teams of lawyers working in tandem with software engineers and data scientists. Finally, we anticipate the business necessity defense that such data modifications could degrade the accuracy of algorithmic decision-making. While we find evidence for this trade-off, we also found that on one data set it was possible to modify the data so that despite previous decisions having had a disparate impact under the four-fifths standard, any subsequent decision-making algorithm was necessarily non-discriminatory while retaining essentially the same accuracy. Such an algorithmic “repair” could be used to refute a business necessity defense by showing that algorithms trained on modified data can still make decisions consistent with their previous outcomes.
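As a hypothetical illustration (the numbers and function names below are mine, not the essay’s), the four-fifths standard the authors invoke compares selection rates between groups and flags disparate impact when the protected group’s rate falls below 80% of the majority group’s rate:

```python
# Sketch of the four-fifths (80%) disparate-impact standard.
# All figures here are invented for illustration.

def selection_rate(selected, total):
    """Fraction of a group's applicants who received a positive decision."""
    return selected / total

def passes_four_fifths(protected_selected, protected_total,
                       majority_selected, majority_total):
    """True if the protected group's selection rate is at least
    four-fifths (80%) of the majority group's selection rate."""
    protected_rate = selection_rate(protected_selected, protected_total)
    majority_rate = selection_rate(majority_selected, majority_total)
    return protected_rate >= 0.8 * majority_rate

# 30 of 100 protected applicants hired vs. 50 of 100 majority applicants:
# 0.30 / 0.50 = 0.6, below the 0.8 threshold, so the rule is violated.
print(passes_four_fifths(30, 100, 50, 100))  # False
print(passes_four_fifths(45, 100, 50, 100))  # True (0.45 / 0.50 = 0.9)
```

The essay’s “repair” argument can then be read as modifying the training data so that any model fit to it makes decisions that satisfy this check while preserving accuracy.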
Feminist Media Studies | 02.25.16
In this commentary, D&S fellow Karen Levy considers the gendered dimensions of shifting cultures of work in response to the growing demands of the technologized/mediated workplace. She also explores the impact of new digital surveillance technologies on constructions of masculinity in the male-dominated US long-haul trucking industry.
New workplace technologies are often met with resistance from workers, particularly to the degree that they challenge traditional workplace norms and practices. These conflicts may be all the more acute when a work culture is deeply and historically gendered. In this Commentary, I draw from one such context—long-haul trucking—to consider the role a hypermasculine work culture plays in the reception of new digital monitoring technologies.
I base my analysis on ethnographic study of the United States long-haul trucking industry between 2011 and 2014. My research focused on the use of digital fleet management systems to achieve legal and organizational compliance. The research was multi-sited, taking me to eleven states in total, and to many sites of trucking-related work, including large and small firms, trucking conventions, regulatory meetings, inspection stations, and truck stops. Throughout the work, I spoke with and observed a wide variety of industry participants—truckers themselves, of course, but also fleet managers, technology vendors, trucking historians, insurance agents, lawyers, police officers, and many others.
D&S Researcher Madeleine Clare Elish considers the possibility of a full-on replacement of humans by robots. She argues that this scenario is nowhere near as close as we have been led to believe. Though algorithms can do an astounding range of things that were once viewed as exclusively human work, they don’t work all by themselves.
This is a crucial but often overlooked point in the debate around algorithms and the future of work: Most human jobs will not be replaced but rather reconfigured in the near future. We absolutely need to worry about the long-term implications for the demand for human labor and how this will affect the economy. But if we only focus on the question of whether and when humans will be replaced, we miss the impact algorithms are already having on work and the opportunities to make choices, as designers and consumers, about how algorithms can disrupt or enforce existing power dynamics in the future.
TheRideShareGuy.com | 11.25.15
“On today’s podcast, I get to interview Alex Rosenblat, a researcher from the Data & Society Research Institute. Now that name may seem familiar because in addition to spending the last 9 months studying how Uber drivers interact with the driver app, Alex has also published several very popular articles on things like Uber’s phantom cabs and a technical paper on the subject of driver control.”
Harry Campbell, Alex Rosenblat On How Much Control Uber Really Has Over Its Drivers, The Rideshare Guy Podcast, November 25, 2015
D&S fellow Karen Levy published an essay on measurement in Pacific Standard’s The Future of Work and Workers series:
As data analytics and monitoring technologies come to be used in more and more workplaces, we must be attuned to how they affect these most vulnerable workers. Counting some kinds of work to the exclusion of others can mean that the real burdens of work are less visible, and that the workers who bear them may be less fairly paid for all that they do.
“Uber’s access to real-time information about where passengers and drivers are has helped make it one of the most efficient and useful apps produced by Silicon Valley in recent years. But if you open the app assuming you’ll get the same insight, think again: drivers and passengers are only getting part of the picture.”
Using research conducted with Luke Stark, D&S researcher Alex Rosenblat discusses the differently mediated experiences of Uber drivers and passengers and their various strategies for gaming that mediation.
“Increasingly, what underlies the debate over the so-called sharing economy is a nascent, bigger battle about how society wants machines coordinating and governing human activity. These apps don’t match and route people by hand. Instead, software and underlying algorithms make these technologies work. Companies throughout the ‘sharing economy’ — like Postmates, Handy, and TaskRabbit—all depend on the use of machines to match, sort, and assign tasks effectively at massive scale.”
In “The Mirage of the Marketplace: The disingenuous ways Uber hides behind its algorithm,” Tim Hwang and Madeleine Clare Elish dissect Uber’s engineering of supply and demand in order to raise questions about the role and responsibilities of automation.
blog post | 07.20.15
Benjamen Walker makes some of the best radio around…. His finest work tends to come out in series of podcasts, exploring a complex issue through interviews and stories that unfold over two or more sequential weekly episodes.
The most recently concluded series is called “Instaserfs” and it focuses on the “sharing economy,” aka the “1099 economy,” the “gig economy,” or, as Ben offers, the “demand economy” or the “exploitation economy.” Struck by the ability to outsource virtually any task, Benjamen hires San Francisco native Andrew Callaway to make three episodes of his podcast as an “Instapodder.” The working method? Andrew’s task is to take on as many sharing economy jobs as he can and to report back to Benjamen about the experience, and whether he can pay his San Francisco rent with the money he earns.
The Information Society | 03.19.15
“This article examines the implications of electronic monitoring systems for organizational information flows and worker control, in the context of the U.S. trucking industry. Truckers, a spatially dispersed group of workers with a traditionally independent culture and a high degree of autonomy, are increasingly subjected to performance monitoring via fleet management systems that record and transmit fine-grained data about their location and behaviors. These systems redistribute operational information within firms by accruing real-time aggregated data in a remote company dispatcher. This redistribution results in a seemingly incongruous set of effects. First, abstracted and aggregated data streams allow dispatchers to quantitatively evaluate truckers’ job performance across new metrics, and to challenge truckers’ accounts of local and biophysical conditions. Second, even as these data are abstracted, information about truckers’ activities is simultaneously resocialized via its strategic deployment into truckers’ social relationships with their coworkers and families. These disparate dynamics operate together to facilitate firms’ control over truckers’ daily work practices in a manner that was not previously possible. The trucking case reveals multifaceted pathways to the entrenchment of organizational control via electronic monitoring.”