Garbage In, Garbage Out

At one time or another, those of us who teach despair of the whole educational enterprise. We entertain the dark suspicion that some people simply can’t make use of information–that their ability to reason is faulty, that they are unable to consider and evaluate evidence to reach sound conclusions.

Happily, I’m wrong. At least, that’s the conclusion reached by researchers at Princeton. When a wrong choice is made, the researchers found, it might be the information rather than the brain’s decision-making process that is to blame.

The results of the study were reported in Science Daily. The experiment involved very simple types of information; nevertheless, if the conclusions are replicated, the importance of good education and accurate journalism only grows.

If human decision-making depends upon the quality of the information available, those of us in the information-providing business have an ethical obligation to provide information that is sound and verified. In public school classrooms, that means teaching science in science class, not religion. It means teaching American and constitutional history in much more depth. It means introducing students to the world beyond America’s borders–the world they will increasingly interact with, and about which they will need solid information.

As important as education is, the information we are fed daily is even more consequential. In a country that celebrates free expression, we can’t mandate truth in journalism–and even a cursory trip around the internet will demonstrate how much  unreliable and delusional “information” is out there. In the age of the internet, it’s increasingly difficult to separate fact from opinion and both from outright propaganda. When we relied upon daily newspapers and the evening news–the “legacy” media–we missed a lot, but those journalists generally followed an ethical code that required independent verification of information before it was reported. In today’s news environment, with the 24-hour “news hole,” speed often trumps accuracy even for the more responsible media–and there are more and more irresponsible media outlets competing for our attention.

We can’t make good decisions if we don’t have trustworthy information.

The Princeton study validates a couple of old sayings: “it ain’t what you don’t know that hurts you; it’s what you know that just ain’t so.”   And the even pithier, “garbage in, garbage out.”

What Do We Know and Who Can We Trust?

I thought I’d share a presentation I made at “Weekend U,” a series of workshops hosted by IUPUI Alumni, focused on our current media environment. It’s a bit long, so my apologies….

It is always tempting to assert that we live in times that are radically unlike past eras—that somehow, the challenges we face are not only fundamentally different than the problems that confronted our forebears, but worse; to worry that children growing up today are subject to more pernicious influences than children of prior generations. (In Stephanie Coontz’ felicitous phrase, there is a great deal of nostalgia for “the way we never were.”) I grew up in the 1950s, and can personally attest to the fact that all of the misty-eyed recollections of that time are revisionist nonsense. The widespread belief that 50s-era Americans all lived like the characters from shows like “Father Knows Best” or “Leave it to Beaver” is highly inaccurate, to put it mildly. If you don’t believe me, ask the African-Americans who were still relegated to separate restrooms and drinking fountains in much of the American South, or the women who couldn’t get equal pay for equal work or a credit rating separate from their husbands.

Nevertheless—even conceding our human tendency to overstate the effects of social change for good or ill—it is impossible to understand the crazy in contemporary life without recognizing the profound social changes that have been both generated by and reflected in our modern communication technologies, most prominently the Internet.

We live today in an incessant babble of information, some of which is credible and much of which is not. Some of that information comes to us through hundreds of cable and broadcast television stations, increasing numbers of which are devoted to “news,” broadly defined, twenty-four hours a day, seven days a week. In our cars, we tune in to news and commentary on AM or FM stations, or more recently to satellite broadcasts that have extended the reach of that broadcast medium. Then there’s the Web—streaming incessantly not just through our computers, but also through our smart phones and tablets. The one place we increasingly don’t get our information is from a daily newspaper, or other sources that might be called the “journalism of verification.”

That’s the reality of our lives today. We read news and commentary from all over the world on line, we shop for goods and services on line, we communicate with our friends and families through email and Facebook, and we consult web-based sources for everything from medical advice to housekeeping hints to comedy routines. When we don’t know something, we Google it.  The web is rapidly becoming a repository of all human knowledge—not to mention human rumors, hatreds, gossip, trivia and paranoid fantasies. Picking our way through this landscape requires new skills, new ways of accessing, sorting and evaluating the credibility and value of what we see and hear. It is not an exaggeration to say that the enhanced communications environment has changed the way we process information and our very perceptions of reality. And we’re just beginning the task of figuring out how to cope with this brave new world.

Let me share a very minor example of how this communications environment affects our perceptions of reality: toward the end of her life, my mother was in a nursing home. Given the limited mobility of most of the residents, the television was a central focus of their day, and it was on continually. Although she had never been a particularly fearful person, nor one who focused on crime, my mother became convinced that crime rates were soaring. They weren’t. In fact, there had been a substantial decline over the preceding few years. But when my mother was growing up, with the exception of particularly heinous incidents or crimes involving celebrities, the local media reported only local criminal behavior. Seventy years later, the television relayed daily reports of subway murders in London, train bombings in Spain, and other assorted misbehavior by people from all over the globe. To my mother and her elderly peers, it seemed that predators were suddenly everywhere.

And that was just television. The Internet has had an infinitely greater impact. I’m old, and probably use technology less than my grandchildren’s generation, but when I am driving to a location I’ve not previously visited, I get directions via the web (assuming I don’t have a GPS in my car or—more recently—in my cell phone). I can Instant Message or Skype my granddaughter in Wales, and for free—no long distance telephone charges incurred. I’m kept up-to-date on what my friends are doing via Facebook, and on political news via Twitter. Increasingly, I shop on line for books, office supplies, even clothing and home furnishings. I no longer visit the license branch and wait in line to renew my license plates; I go online and save the time and trouble. If I want to know how a congressman voted, the information is at my googling fingertips. I’m hardly alone. In an unbelievably short period of time, the Internet has not only made the world a smaller place, it has forever altered the rhythm of our daily lives.

Part of that alteration has meant that we are no longer passive consumers of information; the interactive nature of the web allows us to talk back, to post our opinions, to offer rebuttals. It brings us into contact with people of different countries, religions, cultures and backgrounds. It allows each of us, if we are so inclined, to become a publisher. When I was young, if you wanted to publish a newspaper, the costs of the printing press and distribution system were prohibitive, and most broadcast media was owned by the wealthy. Today, anyone with access to the Internet can hire a few reporters or “content providers” and create her own media outlet. One result is that the previously hierarchical nature of public knowledge is rapidly diminishing. The time-honored “gatekeeper” function of the press—when journalists decided what constituted news—will soon be a thing of the past, if it isn’t already.

The communication revolution isn’t limited to the delivery of news and information. Social networking sites have allowed like-minded people to connect with each other and form communities that span traditional geographical and political boundaries. (The growing global hegemony of the English language has further enabled cross-national communications.) One result is that it has become much harder to define just what a “community” is, and to recognize how those communities differ.

For example, on my most recent trip to Europe, I was struck by how homogenized citizens from western industrialized countries have become—how much we all look and dress alike. Thirty years ago, on our first trip to Europe, cultural differences expressed in clothing and mannerisms made it fairly easy to spot Americans. Over the intervening years, that has changed. Today, we dress alike, drive the same cars, watch the same television programs and listen to the same (mostly American) music. iPhones, iPods and iPads (and their various clones) are ubiquitous, as are Facebook and Google. Evidence of the globalization of culture—at least pop culture—is everywhere.

The participatory nature of the Internet has also encouraged—and enabled—a wide array of political and civic activism. How lasting that shift will be is still an open question, but the ways in which American political life has changed are unmistakable. Early in the development of the web, naysayers worried that the Internet would encourage people to become more solitary. They warned that people were being seduced by this new medium to withdraw from human and social interaction. In some cases, that may be true. For others, however, the Internet has been an “enabler,” facilitating a great wave of political and community organizing; it has become a mechanism for forming new kinds of communities, for finding like-minded people we didn’t previously know, even though they might have been living just down the street. “Meetings” on line have led to internet-facilitated “Meet Ups” and other face-to-face interactions in service of particular social and political goals.

A telling example of the profound change YouTube has wrought in the political landscape was the widely reported “macaca” moment of Senator George Allen during the 2006 campaign season. Allen, who was running for re-election to the Senate from Virginia, was considered a shoo-in and a strong contender for the 2008 Republican Presidential nomination. While delivering a speech to a small gathering in rural Virginia, he pointed out a volunteer from his opponent’s campaign who was videotaping his talk.

“This fellow here, over here with the yellow shirt, macaca, or whatever his name is. He’s with my opponent. He’s following us around everywhere…Let’s give a welcome to macaca, here. Welcome to America and the real world of Virginia.”

Depending on how it is spelled, the word macaca can mean either a monkey that inhabits the Eastern Hemisphere or a town in South Africa. In some European cultures, macaca is also considered a racial slur against African immigrants. The volunteer, who worked for Allen’s opponent, James Webb, promptly uploaded the videotape of Allen’s remarks to YouTube; a mere three days later, it had been downloaded and viewed 334,254 times. It was picked up and endlessly replayed on the evening news. Print media across the country reported on the controversy, and radio talk show hosts argued about the meaning of the word macaca, and whether Allen had intended a slur. Investigative reporters whose curiosity had been piqued by the controversy dug up evidence of prior racially charged incidents involving Allen. By November, Webb—initially dismissed as a long-shot candidate with little chance of defeating a popular sitting Senator—was the new Senator from Virginia, and “macaca moment” had entered the political lexicon as shorthand for a gaffe captured on video. Politics has never been the same since.

The 2008 Presidential campaign introduced the phenomenon of “viral videos.” Remember when will.i.am of the Black Eyed Peas created the music video “Yes We Can”? For a time, that video was everywhere—forwarded and re-forwarded until literally millions of Americans had seen it. Humorous and not-so-humorous videos promoting and panning the candidates have since become ubiquitous. Campaign rumors (and worse) are endlessly forwarded, circulated and recirculated. In 2008, John McCain, who admitted never using a computer, and who displayed discomfort with the new media environment, was caught off-guard by videos showing him delivering inconsistent statements. I argued at the time that failure to understand the impact of the internet is failure to understand the world we live in, and that was undoubtedly the point of a pro-Obama blogger’s characterization of McCain as “an analog candidate for a digital age.” By last November, the impact of YouTube was far greater. Mitt Romney was videotaped at a private fundraiser dismissing 47% of Americans as non-taxpaying moochers who would never vote for him. To say that the video damaged his campaign would be an understatement.

The 2008 campaign was probably the first where all political candidates embraced the new technologies and made extensive use of email to raise funds, organize volunteers, counter charges, announce endorsements and rally their respective bases, at a tiny fraction of the cost of direct mail. The impact of all this would be difficult to exaggerate—and older politicians are still struggling to navigate the new media landscape.

It isn’t only our political life that has been profoundly changed by new technologies. The ability to communicate cheaply and almost instantaneously with millions of people, the ability to link up like-minded people, and the ability to both spread and counter misinformation are all having a profound impact on our political, civic and personal relationships in ways we cannot yet fully anticipate or appreciate.

This information revolution is particularly pertinent to the issue of trust in our civic and governing institutions. At no time in human history have citizens been as aware of every failure of competence, every allegation of corruption or malfeasance—real or imagined. Politicians like to talk about “low-information” voters, but even the most detached American citizen cannot escape hearing about institutional failures on a daily basis, whether it is reports of high levels of lead in children’s toys (said to be due to government failure to monitor imports properly), the collapse of a bridge in Minnesota (said to be due to government failure to inspect and repair deteriorating infrastructure), or the devastation caused by a drone strike (authorized first by the Bush and now by the Obama administrations). Evidence of the intransigence and partisanship of Congress and the appalling behavior of too many of its members is communicated on a daily basis. It may be true that things weren’t much different in past eras, but it is certainly the case that information about public wrongdoing or incompetence is infinitely more widespread in today’s wired and connected world.

So, of course, is misinformation.

As wonderful as some of the new technologies can be, there’s a downside—and the growth of propaganda is part of that downside. In my opinion, the most damaging side effect of this paradigm shift in the way we access information is the perilous state of journalism at a time when we desperately need verified, factual, objective information about our world, our government and our environment. Don’t let the bravado mask reality: newspapers as we’ve known them are dying.  Since 1990, a quarter of all newspaper jobs have disappeared. Once robust publications have closed their doors. Others have drastically curtailed delivery. Last year, the textbook I used in my Media and Policy class was “Will the Last Reporter Please Turn Out the Lights.”

Like it or not, we are increasingly dependent upon the Internet for our news. And that presents us with both an opportunity and a problem.

Eli Pariser, former executive director of MoveOn.org, was an early believer in the power of the Internet to increase and improve democracy. But as he documents in an important book, “The Filter Bubble,” the technology that promises (and delivers) so much is moving us into what he calls a “mediated future”—a future in which each of us exists in a personalized universe of our own construction.

In an effort to give each of us what we want, sites like Google, Facebook, and Amazon are constantly refining their algorithms in order to deliver results that are “relevant” to each particular searcher, and they have more data about our individual likes and dislikes than most of us can imagine. As a result, two people googling “BP,” for example, will not necessarily get the same results, and certainly not in the same order. Someone whose search history suggests interest in investment information may get the company’s annual report, while someone with a history of environmental interests will get stories about the Gulf spill. Similarly, Facebook delivers the posts of friends and family that its algorithm suggests are most consistent with the member’s interests and beliefs, not everything those friends post.
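
To make that mechanism concrete, here is a minimal, purely illustrative sketch of how this kind of personalization can work. Everything in it is invented for the example (the candidate results, the topic labels, the interest weights), and real search and feed-ranking systems are vastly more sophisticated and entirely proprietary; the point is only to show how the identical query can yield different orderings for different users.

```python
# A toy illustration of personalized ranking, not any real search engine's algorithm.
# Two users issue the identical query ("BP"); the same candidate results are
# re-ordered according to each user's hypothetical interest profile.

CANDIDATES = [
    {"title": "BP annual report and investor relations", "topics": {"finance"}},
    {"title": "BP Gulf of Mexico spill: environmental impact", "topics": {"environment"}},
    {"title": "BP company history", "topics": {"general"}},
]

def personalized_rank(results, interest_weights):
    """Order results by how strongly their topics match a user's interest profile."""
    def score(result):
        return sum(interest_weights.get(topic, 0.0) for topic in result["topics"])
    return sorted(results, key=score, reverse=True)

# Invented profiles, standing in for what a platform might infer from past behavior.
investor = {"finance": 1.0, "environment": 0.1}
environmentalist = {"finance": 0.1, "environment": 1.0}

print([r["title"] for r in personalized_rank(CANDIDATES, investor)])
print([r["title"] for r in personalized_rank(CANDIDATES, environmentalist)])
```

Run on these toy profiles, the same search returns the annual report first for one user and the spill coverage first for the other, and neither user ever sees that another ordering exists.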

Pariser calls this the “filter bubble,” and points out that—unlike choosing to listen to Fox rather than PBS or MSNBC, for example—the resulting bias is invisible to us.

Little by little, search by search, individuals are constructing different–and often conflicting–realities. At the same time, traditional news sources aimed at a general audience—the newspapers and broadcasts that required reporters to fact-check assertions, label opinion and aim for objectivity—are losing market share. How many will survive and in what form is anyone’s guess.

We can live without newsprint, but we desperately need real journalism—where reporters monitor what governments and businesses do, where they fact-check and provide context and background. Instead, we have mountains of unsubstantiated opinion, PR and spin. Good citizens have to be able to separate fact from fantasy. They have to live in the world as it is, not in a bubble where they listen only to things that confirm what they already believe—and the Internet makes it so easy and tempting to construct that bubble.

A favorite catchphrase of traditional media is “news you can use,” and it has been invoked ever more frequently as newspaper readership has continued to decline–not just in Indianapolis, but nationally. The problem is that no one completes the sentence. Those who toss off the phrase don’t proceed to the important question: use how, and for what?

In my somewhat jaundiced opinion, the news citizens can use is information about our common institutions–including but not limited to government, and especially local government. Judging from what the newspapers are actually covering, however, they consider “news you can use” to be reviews of local restaurants, diet and home decorating tips, and sports. Not–as they used to say on Seinfeld–that there is anything wrong with that. At least, there wouldn’t be anything wrong with that if these stories were being served as “dessert” rather than the main course.

What we can use is a return to journalism’s time-honored watchdog role. But genuine watchdog coverage requires resources–enough reporters with enough time to investigate and monitor a wide variety of important government agencies and functions. Newspapers around the country that have survived have done so by engaging in wave after wave of layoffs. Those layoffs have left them with skeletal reporting operations, drastically compromising their capacity to provide genuine journalism.

Let me just conclude by describing what I mean by “genuine journalism.” Real journalism is accurate, objective, fact-checked reporting on what Alex Jones calls the “iron core” of fact-based accountability news. Such news isn’t “fair and balanced,” because often, balance is neither fair nor accurate.

A couple of years ago, National Public Radio—one of our most reliable purveyors of “real journalism”– adopted new ethics guidelines. The new code stresses the importance of accuracy over false balance; it appears–finally–to abandon the “he said, she said” approach (what I have elsewhere called “stenography masquerading as reporting”) that all too often distorts truth in favor of a phony “fairness.”

The policy reads: “At all times, we report for our readers and listeners, not our sources. So our primary consideration when presenting the news is that we are fair to the truth. If our sources try to mislead us or put a false spin on the information they give us, we tell our audience. If the balance of evidence in a matter of controversy weighs heavily on one side, we acknowledge it in our reports. We strive to give our audience confidence that all sides have been considered and represented fairly.”

One of my biggest gripes over the past several years has been the wholesale abandonment of precisely this tenet of good journalism. A good example has been environmental reporting–how many times have media sources reported on climate change, for example, by giving equal time and weight to the settled science and the deniers, without ever noting that the deniers constitute less than 1% of all climate scientists, and are generally regarded as a kooky fringe? That’s “balance,” but it certainly isn’t “fair to the truth.”

Several years ago, this sort of false equivalency was illustrated by one of my all-time favorite Daily Show skits. The “senior journalism reporter” was explaining the Swift Boat Veterans for Truth attacks on John Kerry to Jon Stewart. “The Swift Boat Veterans say such-and-such happened; the Kerry Campaign says it didn’t. Back to you, Jon!” When Stewart then asked “But aren’t you going to tell us who is telling the truth?”  the response was dead-on. “Absolutely not, Jon. This is journalism.”

Far too often, our remaining reporters pursue artificial balance at the expense of truth. If a Democratic campaign plays a dirty trick, reporters rush to remind their audience of a similar transgression by Republicans, and vice-versa. This search for equivalency may be well intentioned, but it misrepresents reality and misleads those of us who depend upon the media for accurate information. NPR’s recognition of this pernicious practice, and its new Code of Ethics, are a welcome sign that at least some journalists might be returning to Job One: telling us the unembellished truth.

At the end of the day, we need to recognize that the journalism of accountability and verification, the journalism that acts as a watchdog over our common institutions, is irreplaceable. My own favorite journalist, Jon Stewart, put it best in an interview with Terry Gross of NPR. Gross noted his constant criticisms of both politicians and the media, and asked Stewart who he felt was most culpable. Stewart said “Politicians are politicians. If you go to the zoo and monkeys are throwing feces, well—that’s what monkeys do. But you’d like to have the zoo-keeper there saying ‘Bad Monkey.’”

That pretty much sums it up.

Quote of the Day

From Dick Lugar’s first address after leaving elective office, an observation worth pondering:

Perhaps the most potent force driving partisanship is the rise of a massive industry that makes money off political discord. This industry encompasses cable news networks, talk radio shows, partisan think tanks, direct mail fundraisers, innumerable websites and blogs, social media and gadfly candidates and commentators. Many of these entities have a deep economic stake in perpetuating political conflict. They are successfully marketing and monetizing partisan outrage.

Revisiting…Everything

Random thoughts for a Sunday morning….

The Sunday morning interview shows are focused on the GOP’s “identity crisis.” The New York Times has an article by the Public Editor about a not-dissimilar debate occurring within journalism over the meaning and possibility of “objectivity.” An academic listserv I participate in has a recurring discussion about the advisability of holding a new Constitutional Convention, or at least seriously considering significant constitutional changes. Various religious denominations are grappling with challenges to settled theological positions, including their beliefs about the role of women, homosexuality and same-sex marriage. Educators are struggling to redefine both ends and means. Technology is changing everything from how we live to how we define friendship.

I could go on, but you get the picture. We live in an era when–as the poet put it–“the center cannot hold.”

The existential question, of course, is: what will emerge from all this confusion and change? Will we take this opportunity to think about the “big” questions–what kind of society do we want to inhabit? What would a more just system look like? Aristotle was among the first to suggest that an ideal society would facilitate human flourishing; what would such a society look like?

Unfortunately, there’s not much evidence that these “big” questions are being asked. Instead, we seem to be surrounded by quarrelsome adolescents, desperately trying to game the system and retain–or obtain–relative advantage.

I wonder what it would take to change the conversation.

The Trust Conundrum

I was recently asked to participate in a panel exploring current levels of trust and distrust in government. Among other things, we were asked to consider what citizens might do to mitigate the growing cynicism about politics, and whether we thought the current media environment was contributing to widespread distrust of government at all levels.

These are questions worth pondering.

I think a great deal of distrust in government is a result of the deficit in civic literacy that I have written about previously. When citizens don’t understand constitutional constraints on the public sector, when they are unfamiliar with the most basic historical and philosophic roots of our particular approach to self-government, they are unable to evaluate the lawfulness of government activity. One result is that government action that should be entirely predictable looks arbitrary, while corruptions of the process are seen as “business as usual.” Normal checks and balances are decried as unnecessary red tape, and egregious abuses of legislative mechanisms like the filibuster are seen not as a misuse of power, but as part of the ordinary, mysterious processes of the political system.

When citizens aren’t able to distinguish between use and misuse of the power of the state, it’s no wonder they believe all public policy is for sale.

The current chaos that is the media is even more consequential, because a healthy Fourth Estate is critical to democratic self-government.

Citizens can’t act on the basis of information they don’t have. The paradox of life in the age of the Internet is that there are more voices than ever before—theoretically, a good thing—but we’ve lost news that is collectively recognized as authoritative, which is proving to be a very bad thing. A babble of opinion, spin and outright fabrication has replaced what used to be called the “iron core”—reliable information that has been fact-checked and authenticated.

It is one thing to draw different conclusions from a reported set of facts; it is quite another to deny the existence of the facts themselves.

On the one hand, the Internet has empowered many more government watchdogs; on the other, it has facilitated the rise of innumerable conspiracy theorists, fringe groups, special interests and outright liars. The result is that someone who prefers to believe, say, that global climate change is a hoax or that President Obama is a secret Muslim born in Kenya can readily find sources that confirm those suspicions.

The days when everyone listened to—and trusted the veracity of—reporting by Walter Cronkite and his counterparts in the mainstream media are long gone. (Indeed, there is a persuasive argument to be made that there is no longer such a thing as “mainstream” media.) Daniel Patrick Moynihan famously said that we are all entitled to our own opinions, but not to our own facts.  Today, thanks to incredibly shrinking newsrooms and proliferating propagandists, people are choosing their own facts, and increasingly living in alternate realities that conform to their pre-existing beliefs and prejudices. When thoughtful Americans aren’t sure what news they can trust, and ideologically rigid Americans—left and right—are living in information bubbles of their own choosing, the lack of constructive dialogue and institutional trust shouldn’t surprise us.

In a world that is changing as rapidly and dramatically as ours, the importance of real journalism—not “infotainment,” not talking heads, not bloggers, not columnists, not “he-said, she-said” stenographers, but actual fact-checked, verified news in context—becomes immeasurably more important.

Without a shared reality, we can’t build trust. Without accurate civics education and an authoritative journalism of verification, we can’t share a reality.
