Truthiness in Digital Media:
a symposium that seeks to understand and address propaganda and misinformation in the new media ecosystem
March 6-7, 2012
Organized by the Berkman Center for Internet and Society and the MIT Center for Civic Media
As the networked media environment increasingly permeates private and public life, driven in part by the rapid and extensive travels of news, information, and commentary, our systems for identifying and responding to misinformation and propaganda are failing us, creating serious risks to everything from personal and financial health to fundamental democratic processes and governance. In an age when many would argue that news and information have become key ingredients of broad social progress, our supply is tainted. Concerns about misinformation and disinformation are nothing new; indeed, many tried-and-true techniques remain just as vibrant in this new communications and information era. But digital media presents new challenges to existing institutions, structures, and processes, jeopardizing digital media's potential contributions to the health of political, economic, and social systems.
While opinions differ over whether and how digital media ameliorates or exacerbates the spread and influence of misinformation, this multifaceted issue persists in the face of thoughtful, sustained, and creative responses, and it exhibits a great diversity of manifestations, roots, and harms. The motives for spreading misinformation are many: they may be partisan or commercial, may derive from or evoke moral and religious sensibilities, may offer political or social commentary, or may be merely whimsical. But what to do? Building upon recent convenings and a number of related projects, we are taking a critical step towards a deeper understanding of the problem, with a keen eye towards collectively identifying novel solutions and concrete actions to combat the deleterious impacts of misinformation in the near term and over time.
Tricks, Old and New
The rise of social media presents many exciting opportunities to nourish the networked public sphere through shared news and information and engaged discussion and debate. The constantly changing digital media tools we enjoy also present a stream of new affordances for creating and sharing misinformation, whether by adding new twists to age-old propaganda techniques or through native approaches that draw on the power and vulnerabilities of social media. The ability to misrepresent the provenance of an organization or of information, to simulate deep or widespread support, to obfuscate through information overload, and to conjure new voices from whole cloth presents substantial new challenges to media organizations trying to cover related issues, to people seeking to understand those issues, and to policymakers tasked with both regulating the issues and responding to resulting distortions. Given the terrifically competitive media environment and the accelerated pace, interactivity, and reach of media and political processes globally, questions about the mechanisms and flow of misinformation online are increasingly pressing, and are made more urgent still by the 2012 US election cycle, perhaps the first truly digital one.
Changing Roles and Players
While the media has always been subject or party to inaccuracies, omissions, biases, and distortions, the dynamics of today’s news environment have not only further limited media organizations’ ability to address these issues effectively, but also potentially increased the reach of the distortions themselves. Barriers include the expectation of an always-on newsroom despite staffs eviscerated by budget cuts, new news platforms and the associated decentralization and distribution of authority, and the evolving, uneasy relationship between new and old media. The responsibility of the media in separating fact from fiction continues to be contentious, and the roles of institutions and intermediaries, as part of both the problem and the solution, deserve more attention.
For their part, individuals have vastly more sources of information than ever before, as well as much greater responsibility for discerning its quality, an increasingly difficult task given the sheer volume of information, the pace at which it moves, and the declining relevance of signposts such as the medium, source, or history of the information.
A new set of media actors, including but not limited to social media platforms, search engines, and recommendation engines, has emerged as intermediaries, marking a shift in the responsibility for media accuracy and salience. The digital media guideposts and filters people use to navigate this world are important not only in shaping their own media sources, but also in affecting the sources of others with whom they interact through social media.
Of Smoldering Pants, Pinocchios, and Frames
Identifying and publicizing inaccuracies receives sustained attention from diverse institutional quarters, including academics, mainstream media, satirists, politicians, and non-profits, and new, innovative approaches continue to emerge. Yet there is a larger gray area of information and communication practices used to persuade and, often, to subtly distort public understanding of social, political, cultural, and economic issues. For many years, media researchers have documented the impact and influence of media framing on public perception: the manner in which a story is crafted and presented, including its tone, context, reference points, and choice of words, can point a shared set of facts in any number of directions. Framing can be a legitimate part of advocacy and political communication, as people seek to exercise their capacity for expression, frame issues, and gain support, but it may also foster fundamental misconceptions, promote hate speech, undermine credibility, exacerbate social divisions, and arguably subvert core democratic practices.
Rational Actors We Aren’t
At the core of the problem are the ways in which we seek out and process information. A growing body of empirical research suggests that our ability and willingness to sort through information to discern facts, lies, misrepresentations, and biases are deeply rooted in cultural values and belief systems, and that our perceptions of facts are colored by deep-seated social, cultural, and behavioral factors.
We may be less skilled at and less committed to rational decision-making based on new information than we are at using that information to rationalize our preexisting views. Our understanding of issues of vital social importance appears to be only loosely tethered to the underlying facts and evidence, making us susceptible to misinformation.
Designing for a Networked Environment
The menu of possible corrective actions matches these issues in volume and complexity; developing and employing strategies and tactics effectively, however, demands a better understanding of the most promising points and dynamics of intervention. Experience and research to date suggest an integrated and multifaceted approach, with likely elements of a solution set spanning learning, technology, process, and policy, while covering information sources, intermediaries, and recipients.
These interventions may focus both on empowering and supporting people to better evaluate the accuracy, value, and salience of media, and on strengthening the institutions and intermediaries that act as filters and watchdogs.
Learning, information, and new media literacies for individuals and organizations, especially as developed through constructivist approaches (e.g., learning by doing, where people are both consumers and producers), are a fundamental part of any solution. Developing better technologies and tools for filtering and evaluating digital information may help users, media, and media critics to evaluate the sources and veracity of online content.
These tools should be accompanied by supportive norms and strategies for assessing information quality. Expanding the capacity, reach, and impact of myth-busting, fact-checking, and transparency organizations, for instance, could provide a counterweight to the many well-resourced efforts at media distortion.
Strengthening and adopting new strategies in public interest media could also be a key part of any solution. Identifying and understanding the mechanisms by which memes propagate through new media would support initiatives that seek to uncover and counter intentional distortions, and there is likewise a role for research on the determinants of media consumption, the impact of media exposure on the formation of values and beliefs, and many other topics. Downstream evaluations of these interventions are also essential to understand how they may complement one another, how they might inform advocacy efforts, and how they can help to identify and promote policy and regulatory changes.
Onward!
This conference will focus on exploring the many facets of this complex issue, with an eye to crafting tools and strategies that ameliorate the negative impacts of deception, bias, and inaccuracy in the digital media ecosystem. We hope to leave better positioned to take advantage of the benefits promised by digital media, while appreciating the positive aspects of creative media-making and probing the blurred boundaries between nefarious and beneficial media-shaping practices.