The New Inquiry | 02.09.18

Artificial Advancements

Taeyoon Choi

In this essay, D&S Fellow Taeyoon Choi interrogates technology designed for those with disabilities.

“Even with the most advanced technology, disability cannot—and sometimes should not—disappear from people. There are disabled people whose relationship with their own bodily functions and psychological capabilities cannot be considered in a linear movement from causation to result, where narratives of technology as cure override the real varieties in people’s needs and conditions and falsely construct binary states—one or the other, abled or disabled—shadowing everything between or outside of those options.”


On September 27th, D&S Fellow Taeyoon Choi released the first two chapters of his online book “Poetic Computation: Reader,” which looks at code as a form of poetry as well as the ethics behind it. Because the book is published online, readers can customize the design elements of the text to their preferences as they read.

Choi is co-founder of The School for Poetic Computation in New York City, and the book is based on two of his lectures from the school’s curriculum. The remaining chapters will be published later this year.



D&S fellow Mark Latonero considers recent attempts by policymakers, big tech companies, and advocates to address the deepening refugee and migrant crisis and, in particular, the educational needs of displaced children through technology and app development projects. He cautions developers and policymakers to consider the risks of failing to understand the unique challenges facing refugee children living without running water, let alone a good mobile network.

The reality is that no learning app or technology will improve education by itself. It’s also questionable whether mobile apps used with minimal adult supervision can improve a refugee child’s well-being. A roundtable at the Brookings Center for Universal Education noted that “children have needs that cannot be addressed where there is little or no human interaction. A teacher is more likely to note psychosocial needs and to support children’s recovery, or to refer children to other services when they are in greater contact with children.” Carleen Maitland, a technology and policy professor who led the Penn State team, found through her experience at Zaatari that in-person interactions with instructors and staff in the camp’s many community centers could provide far greater learning opportunities for young people than sitting alone with a mobile app.

In fact, unleashing ed tech vendors or Western technologists to solve development issues without the appropriate cultural awareness could do more harm than good. Children could come to depend on technologies that are abandoned by developers once the attention and funding have waned. Plus, the business models that sustain apps through advertising, or collecting and selling consumer data, are unethical where refugees are concerned. Ensuring data privacy and security for refugee children using apps should be a top priority for any software developer.

In cases where no in-person education is available, apps can still play a role, particularly for children who feel unsafe traveling outside their shelters or are immobile owing to injuries or disabilities. But if an app is to stand a chance of making a real difference, it needs to arise not out of a tech meet-up in New York City but out of field research at a refugee camp, where it is easier to see how mobile phones are actually accessed and used. Researchers need to ask basic questions about the value of education for refugees: Is the goal to inspire learning on traditional subjects? Empower students with academic credentials or job skills? Assimilate refugees into their host country? Provide a protected space where children can be fed and feel safe? Or combat violent extremism at an early age?

To decide, researchers need to put the specific needs of refugee children first—whether economic, psychosocial, emotional, or physical—and work backward to see whether technology can help, if at all.


D&S Fellow Natasha Singer looks into exploitative interactive website design techniques known as “dark patterns.”

Persuasive design is a longstanding practice, not just in marketing but in health care and philanthropy. Countries that nudge their citizens to become organ donors — by requiring them to opt out if they don’t want to donate their body parts — have a higher rate of participation than the United States, where people can choose to sign up for organ donation when they obtain driver’s licenses or ID cards.

But the same techniques that encourage citizens to do good may also be used to exploit consumers’ cognitive biases. User-experience designers and marketers are well aware that many people are so eager to start using a new service or complete a task, or are so loath to lose a perceived deal, that they will often click one “Next” button after another as if on autopilot — without necessarily understanding the terms they have agreed to along the way.

“That’s when things start to drift into manipulation,” said Katie Swindler, director of user experience at FCB Chicago, an ad agency. She and Mr. Brignull are part of an informal effort among industry experts trying to make a business case for increased transparency.


In this background primer, D&S Research Analyst Laura Reed and D&S Founder danah boyd situate the current debate about the role of technology in the public sphere within a historical context. They tease out some of the underlying values, biases, and assumptions in debates about the relationship between media and democracy, and connect them to existing scholarship in media history that seeks to understand the organizational, institutional, social, political, and economic factors affecting the flow of news and information. They also identify a set of key questions to keep in mind as the conversation around technology and the public sphere evolves.

Algorithms play an increasingly significant role in shaping the digital news and information landscape, and there is growing concern about the potential negative impact that algorithms might have on public discourse. Examples of algorithmic biases and increasingly curated news feeds call into question the degree to which individuals have equal access to the means of producing, disseminating, and accessing information online. At the same time, these debates about the relationship between media, democracy, and publics are not new, and linking those debates to these emerging conversations about algorithms can help clarify the underlying assumptions and expectations. What do we want algorithms to do in an era of personalization? What does a successful algorithm look like? What form does an ideal public sphere take in the digital age? In asking these and other questions, we seek to highlight what’s at stake in the conversation about algorithms and publics moving forward.


D&S Board Member Anil Dash contrasts two recent approaches to making internet connectivity more widely available. Comparing the effort to build consensus behind Facebook’s Free Basics initiative with LinkNYC, the recently launched program to bring free broadband wifi to New York City, Dash sees each as a compelling example of who gets heard, and when, whenever a big institution tries to create technology infrastructure to serve millions of people.

There’s one key lesson we can take from these two attempts to connect millions of people to the Internet: it’s about building trust. Technology infrastructure can be good or bad, extractive or supportive, a lifeline or a raw deal. Objections to new infrastructure are often dismissed by the people pushing them, but people’s concerns are seldom simply about advertising or being skeptical of corporations. There are often very good reasons to look a gift horse in the mouth.

Whether we believe in the positive potential of getting connected simply boils down to whether we feel the people providing that infrastructure have truly listened to us. The good news is, we have clear examples of how to do exactly that.


D&S Advisor Joichi Ito writes about his concerns for the sustainable development of blockchain technology, particularly as it relates to the community that has cultivated the talent and knowledge necessary to make it work:

Partially driven by the overinvestment in the space, and partially by the fact that Bitcoin is much more about money than the Internet ever was, it is experiencing a crisis that didn’t really have any parallels in the early days of the Internet. Nonetheless, the formation of the Internet offers some important lessons — most importantly, on the question of the talent and knowledge pool. In those early days, and at some layers maybe even still today, there were only a very small number of people who had the background, brain type and personality to understand some of the core elements that made the Internet work. I remember when there were only a handful of people in the world who really understood Border Gateway Protocol (BGP) and we had to hunt them down and share them with our “competitors” when we were setting up PSINet in Japan.

It’s very similar today with Bitcoin and the Blockchain. There are a small number of people who understand cryptography, systems, networks and code and are capable of understanding the Bitcoin software code. Most of them are working on Bitcoin, while some are working on Ethereum and other “related” systems and a few more are scattered around the world in other places. It’s a community including some who have been around since the 90s, before the Web, going to crazy conferences like the Financial Cryptography conference. Like any free and open-source software community on the Internet, it’s a bunch of people who know each other and mostly, though not always, respect each other, but which fundamentally holds a near monopoly on talent.

Unfortunately, the wild growth of Bitcoin and now “the Blockchain” has caught this community off guard from a governance perspective, leaving the core developers of Bitcoin unable to interface effectively with the commercial interests whose businesses depend on scaling the technology. When asked “can you scale this?” they said, “we’ll do the best we can.” That wasn’t good enough for many, especially those who don’t understand the architecture or the nature of what is going on inside of Bitcoin.


D&S artist in residence Ingrid Burrington shares impressions from a tour of Facebook’s massive Altoona data center, and wonders about the extent to which Facebook might be creating an infrastructure to rival the internet itself.

The entrance to the server room where all of this hardware lives is behind both an ID-card reader and a fingerprint scanner. The doors open dramatically, and they close dramatically. It is only one of several server rooms at Altoona, but just this one room also seems endless. It is exactly the glimmering-LED cyberpunk server-porn dreamscape that it is supposed to be.


“On today’s podcast, I get to interview Alex Rosenblat, a researcher from the Data & Society Research Institute. Now that name may seem familiar because in addition to spending the last 9 months studying how Uber drivers interact with the driver app, Alex has also published several very popular articles on things like Uber’s phantom cabs and a technical paper on the subject of driver control.”

Harry Campbell, Alex Rosenblat On How Much Control Uber Really Has Over Its Drivers, The Rideshare Guy Podcast, November 25, 2015


D&S fellow Natasha Singer writes about new online apps designed to give college students additional options for reporting sexual assaults. Colleges and universities are embracing these apps as they continue to look for more concrete data about sexual violence on their campuses and seek to provide students with more ways to report an assault.

Students at participating colleges can use its site, called Callisto, to record details of an assault anonymously. The site saves and time-stamps those records. That allows students to decide later whether they want to formally file reports with their schools — identifying themselves by their school-issued email addresses — or download their information and take it directly to the police. The site also offers a matching system in which a user can elect to file a report with the school electronically only if someone else names the same assailant.

Callisto’s hypothesis is that some college students — who already socialize, study and shop online — will be more likely initially to document a sexual assault on a third-party site than to report it to school officials on the phone or in person.
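The matching feature Callisto describes works like an escrow: a report stays sealed until a second report names the same assailant, and only then is anything forwarded. The sketch below is a hypothetical illustration of that matching logic only, not Callisto's actual implementation; the class, field names, and hashing step are assumptions, and a real system would encrypt records at rest and notify reporters before releasing anything.

```python
import hashlib
from collections import defaultdict
from datetime import datetime, timezone


class MatchingEscrow:
    """Toy escrowed-matching sketch: a sealed report is released only
    when a second report names the same assailant. Hypothetical; not
    Callisto's actual design."""

    def __init__(self):
        # Hashed assailant identifier -> list of sealed reports.
        self._sealed = defaultdict(list)

    @staticmethod
    def _key(assailant_id: str) -> str:
        # Hash a normalized identifier so it is never stored in the clear.
        return hashlib.sha256(assailant_id.strip().lower().encode()).hexdigest()

    def submit(self, reporter_email: str, assailant_id: str, details: str):
        """Record a time-stamped report; return the matched set, or None."""
        key = self._key(assailant_id)
        self._sealed[key].append({
            "reporter": reporter_email,
            "details": details,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
        # Match condition: two or more reports naming the same person.
        if len(self._sealed[key]) >= 2:
            return self._sealed[key]
        return None
```

In this sketch, submit() returns None while a report sits alone in escrow and returns the matched records once a second reporter names the same hashed identifier.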



Motherboard | 07.27.15

Uber’s Phantom Cabs

Alex Rosenblat

“Uber’s access to real-time information about where passengers and drivers are has helped make it one of the most efficient and useful apps produced by Silicon Valley in recent years. But if you open the app assuming you’ll get the same insight, think again: drivers and passengers are only getting part of the picture.”

Using research conducted with Luke Stark, D&S researcher Alex Rosenblat discusses the differently mediated experiences of Uber drivers and passengers and their various strategies for gaming that mediation.


“In a self-driving car, the control of the vehicle is shared between the driver and the car’s software. How the software behaves is in turn controlled — designed — by the software engineers. It’s no longer true to say that the driver is in full control… Nor does it feel right to say that the software designers are entirely in control.
“Yet as control becomes distributed across multiple actors, our social and legal conceptions of responsibility are still generally about an individual. If there’s a crash, we intuitively — and our laws, in practice — want someone to take the blame.
“The result of this ambiguity is that humans may emerge as ‘liability sponges’ or ‘moral crumple zones.'”

At Data & Society’s Intelligence and Autonomy forum in March 2015, “moral crumple zone” emerged as a useful shared term for the way the “human in the loop” is saddled with liability in the failure of an automated system.

In this essay in Quartz, Madeleine Clare Elish and Tim Hwang explore the problem named by “moral crumple zone,” with reference to cruise control, self-driving cars, and autopilot.


Civicist | 05.14.15

Bring on the Bots

Samuel Woolley, Tim Hwang

In this piece for Civic Hall’s Civicist, Samuel Woolley and D&S fellow Tim Hwang argue that “[t]he failure of the ‘good bot’ is a failure of design, not a failure of automation” and urge us not to dismiss the potential benefits of bots.


Medium | 04.10.15

Back Stage at the Machine Theater

Karen Levy, Tim Hwang

D&S fellows Karen Levy and Tim Hwang examine the ethics of design theater.

Excerpt: “A machine’s front stage performance gets enacted through design. Just as a human provides front stage cues through her appearance and behavior (for instance, by talking with a certain degree of formality, or wearing a uniform), design provides signals for how the people around a machine should understand and interact with it. Sometimes these cues are relatively forthright: press this button to start, plug me in here. But just as humans can provide social cues that mislead others about their ‘true’ nature, the design of a system or artifact can invoke deception: a machine, like a person, can lie, omit, or mislead.”


“As consumers we’ve been told that we’re in charge, so we enjoy the ritual, even though it’s exhausting. We even decide when we’ll opt out. But what happens when companies walk away from us first?”

In this article, D&S advisor Baratunde Thurston proposes a way for users to take more agency in their relationships with apps and the companies that created them.

