Balancing Act

Leave it to the British to accurately diagnose what is terribly wrong with the American media.

It’s the mindless elevation of “balance” over accuracy. Somewhere along the line, members of the American news media (I’m hesitant to call them journalists) decided that “he said, she said” was reporting. It isn’t. It’s stenography.

This emphasis on “balance” at the expense of accuracy and the old-style journalism of verification is abetted by the media’s genuine bias, which is neither conservative nor liberal  but rather a bias for conflict. If it bleeds, it leads.

So we get “balanced” coverage of things like climate change.  More than 99% of climate scientists agree that the earth is warming, but our intrepid media will find that one crank who insists otherwise, and give us a “balanced” story by quoting “both sides.” Left unreported is the fact that the science is overwhelmingly on one “side” and the “debate” is virtually non-existent.

Or we get political coverage that has been dubbed “false equivalence.” There’s a reason for that. Over the past couple of decades, the right wing has employed a brilliant strategy: labeling the media “liberal.” (Has a factual report cast you in an unfavorable light? Scream immediately about the liberal, “lame stream” media.)  In response, most traditional media outlets have been cowed into reporting a phony equivalence whenever possible, a “plague on both your houses” approach that often distorts the reality of a situation and even more often encourages lazy reporting. How much easier it is to quote a Republican and a Democrat and then go home–without ever bothering to tell the audience who is telling the truth.

No wonder so many people don’t trust the media. Very few are still trustworthy.


Just the Facts

As regular readers of this blog know, I tend to harp a lot on the inadequacies of the media and the importance of accurate and complete information. My (frequently unarticulated) assumption is that if people agree about the facts of a matter, they are more likely to agree upon what those facts mean. So facts matter. A lot.

Case in point: yesterday, I shared my frustration about Fox News and its incessant drumbeat about a ‘Benghazi scandal,’ the details of which the network neglects to specify. One of the commenters purported to fill in the blanks by asserting that the administration had refused to deploy troops that were within range and might have saved lives.

That would indeed be scandalous, if true. But as most other media outlets have reported, every military official in a position to know has emphatically denied the allegation. (Former Secretary of Defense Gates characterized the belief that the nearest troops could have gotten to Benghazi in time to defend the embassy as based upon “a cartoonish understanding” of military operations.) Unless every military expert from Gates on down is part of a conspiracy to protect the administration, the facts do not support the single concrete accusation being made.

I’ve been mulling over the role fact-finding plays in our political debates, because I’ve been reading a book that has been getting a lot of attention lately, Jonathan Haidt’s The Righteous Mind: Why Good People Are Divided by Politics and Religion. Haidt’s scholarship is focused upon moral psychology, and the book is an excellent and very accessible exploration of evolutionary morality and the operation of culture on innate human tendencies.

One of the innate tendencies Haidt identifies is a belief in proportionality; that is, a belief that reward should be based upon contribution. Most of us have an innate “fairness” monitor that tells us that the member of the tribe who works hard should be entitled to a greater share of communal goods produced than the slacker.

I think both conservatives and liberals agree with this moral premise. Their dispute is with application—that is, with the facts. For example, if you believe that people are poor because they are lazy and conniving—that is, slackers—you will resent their dependence on public assistance. If you discover that the great majority of poor people work 40 or more hours a week at jobs that simply do not pay enough to allow them to get by, and that those who are “gaming the system” are a very small percentage, you are less likely to feel that you’ve been taken advantage of and more likely to support policies aimed at making the working poor self-sufficient.

There are lots of other examples, but the basic point is: facts matter. Conservatives and liberals (terms that have lost much clarity in any event) share many more moral premises than the pundits and pontificators assume.

What we increasingly do not share is accurate and complete information–and a uniformly credible media.


Media Malpractice

Who can Americans trust to report news accurately? Yesterday, I blogged about a recent survey that showed increasing skepticism about Fox News. Barely a half-hour after I posted, my husband mentioned that he’d been listening to a newscast on the radio in which the reporter interviewed lawmakers who are calling for the use of military tribunals for the Boston bombing suspects. According to my husband, the newscaster then reported–as fact–that such tribunals have proved to be more effective than the regular criminal courts. “I didn’t know that,” he said.

He didn’t know it, because that superior effectiveness is not even remotely a “fact.”

The facts are these: after 9/11, the Bush administration initiated prosecutions of 828 people on terrorism charges in civilian courts. Last year, according to a report from the Center on Law and Security at NYU School of Law, trials were still pending against 235 of them. That leaves 593 resolved cases. Of that number, 523 were convicted, for a conviction rate of 88%.

In addition, the Bush administration pursued 20 cases in military tribunals. So far, there have been exactly three convictions. The highest-profile case involved Salim Hamdan, Osama bin Laden’s driver. Hamdan was convicted, but was sentenced by a military jury to a mere five and a half years–and the tribunal judge, a US Navy captain, gave him credit for time served, which was five years. So Hamdan served only six months after conviction.
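For readers who want to double-check the arithmetic behind those figures, here is a minimal sketch in Python. It simply recomputes the numbers cited above from the NYU report and the Hamdan sentence; it is an illustration, not an independent analysis.

# Recompute the figures cited above (NYU Center on Law and Security report).
civilian_prosecutions = 828              # post-9/11 terrorism prosecutions in civilian courts
still_pending = 235                      # trials still pending as of the report
resolved = civilian_prosecutions - still_pending   # 593 resolved cases
convictions = 523
print(f"Civilian conviction rate: {convictions / resolved:.0%}")   # ~88%

tribunal_cases = 20                      # cases pursued in military tribunals
tribunal_convictions = 3                 # convictions so far
print(f"Tribunal convictions: {tribunal_convictions} of {tribunal_cases}")

# Hamdan: sentenced to five and a half years, credited with five years already served.
months_after_conviction = (5.5 - 5.0) * 12
print(f"Hamdan served roughly {months_after_conviction:.0f} months after conviction")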

Furthermore, in Hamdan v. Rumsfeld–the case that arose from this prosecution–the Supreme Court held that the military tribunals as then constituted violated both the Geneva Conventions and the Uniform Code of Military Justice.

The propriety of using a military tribunal in any given case is, of course, open to debate. What is not debatable is their track record. It is perfectly legitimate to argue about the pros and cons of using such tribunals; I have my opinion, and others are entitled to theirs. But that debate needs to be grounded in fact, not propaganda.

If we cannot depend upon the media to provide accurate information and to separate opinion from fact– if we have lost what used to be called “the journalism of verification”– we are reduced to exchanging opinions anchored to nothing but our individual biases.

We live in a complicated world. We desperately need a competent and trustworthy media.


Garbage In, Garbage Out

At one time or another, those of us who teach despair of the whole educational enterprise. We entertain the dark suspicion that some people simply can’t make use of information–that their ability to reason is faulty, that they are unable to consider and evaluate evidence to reach sound conclusions.

Happily, I’m wrong. At least, that’s the conclusion reached by researchers at Princeton. They found that when a wrong choice is made, it may be the information rather than the brain’s decision-making process that is to blame.

The results of the study were reported in Science Daily, and the experiment involved very simple types of information; nevertheless, if the conclusions are replicated, they underscore the importance of good education and accurate journalism.

If human decision-making depends upon the quality of the information available, those of us in the information-providing business have an ethical obligation to provide information that is sound and verified. In public school classrooms, that means teaching science in science class, not religion. It means teaching American and constitutional history in much more depth. It means introducing students to the world beyond America’s borders–the world they will increasingly interact with, and about which they will need solid information.

As important as education is, the information we are fed daily is even more consequential. In a country that celebrates free expression, we can’t mandate truth in journalism–and even a cursory trip around the internet will demonstrate how much  unreliable and delusional “information” is out there. In the age of the internet, it’s increasingly difficult to separate fact from opinion and both from outright propaganda. When we relied upon daily newspapers and the evening news–the “legacy” media–we missed a lot, but those journalists generally followed an ethical code that required independent verification of information before it was reported. In today’s news environment, with the 24-hour “news hole,” speed often trumps accuracy even for the more responsible media–and there are more and more irresponsible media outlets competing for our attention.

We can’t make good decisions if we don’t have trustworthy information.

The Princeton study validates a couple of old sayings: “it ain’t what you don’t know that hurts you; it’s what you know that just ain’t so.”   And the even pithier, “garbage in, garbage out.”


What Do We Know and Who Can We Trust?

I thought I’d share a presentation I made at “Weekend U,” a series of workshops hosted by IUPUI Alumni, focused on our current media environment. It’s a bit long, so my apologies….

It is always tempting to assert that we live in times that are radically unlike past eras—that somehow, the challenges we face are not only fundamentally different than the problems that confronted our forebears, but worse; to worry that children growing up today are subject to more pernicious influences than children of prior generations. (In Stephanie Coontz’ felicitous phrase, there is a great deal of nostalgia for “the way we never were.”) I grew up in the 1950s, and can personally attest to the fact that all of the misty-eyed recollections of that time are revisionist nonsense. The widespread belief that 50s-era Americans all lived like the characters from shows like “Father Knows Best” or “Leave it to Beaver” is highly inaccurate, to put it mildly. If you don’t believe me, ask the African-Americans who were still relegated to separate restrooms and drinking fountains in much of the American South, or the women who couldn’t get equal pay for equal work or a credit rating separate from their husbands.

Nevertheless—even conceding our human tendency to overstate the effects of social change for good or ill—it is impossible to understand the crazy in contemporary life without recognizing the profound social changes that have been both generated by and reflected in our modern communication technologies, most prominently the Internet.

We live today in an incessant babble of information, some of which is credible and much of which is not. Some of that information comes to us through hundreds of cable and broadcast television stations, increasing numbers of which are devoted to “news,” broadly defined, twenty-four hours a day, seven days a week. In our cars, we tune in to news and commentary on AM or FM stations, or more recently to satellite broadcasts that have extended the reach of that broadcast medium. Then there’s the Web—streaming incessantly not just through our computers, but also through our smart phones and tablets. The one place we increasingly don’t get our information is from a daily newspaper, or other sources that might be called the “journalism of verification.”

That’s the reality of our lives today. We read news and commentary from all over the world on line, we shop for goods and services on line, we communicate with our friends and families through email and Facebook, and we consult web-based sources for everything from medical advice to housekeeping hints to comedy routines. When we don’t know something, we Google it.  The web is rapidly becoming a repository of all human knowledge—not to mention human rumors, hatreds, gossip, trivia and paranoid fantasies. Picking our way through this landscape requires new skills, new ways of accessing, sorting and evaluating the credibility and value of what we see and hear. It is not an exaggeration to say that the enhanced communications environment has changed the way we process information and our very perceptions of reality. And we’re just beginning the task of figuring out how to cope with this brave new world.

Let me share a very minor example of how this communications environment affects our perceptions of reality: toward the end of her life, my mother was in a nursing home. Given the limited mobility of most of the residents, the television was a central focus of their day, and it was on continually. Although she had never been a particularly fearful person, nor one who focused on crime, my mother became convinced that crime rates were soaring. They weren’t. In fact, there had been a substantial decline over the preceding few years. But when Mother was growing up, with the exception of particularly heinous incidents or crimes involving celebrities, the local media reported only local criminal behavior. Seventy years later, the television relayed daily reports of subway murders in London, train bombings in Spain, and other assorted misbehavior by people from all over the globe. To my mother and her elderly peers, it seemed that predators were suddenly everywhere.

And that was just television. The Internet has had an infinitely greater impact. I’m old, and probably use technology less than my grandchildren’s generation, but when I am driving to a location I’ve not previously visited, I get directions via the web (assuming I don’t have a GPS in my car or—more recently—in my cell phone). I can Instant Message or Skype my granddaughter in Wales, and for free—no long distance telephone charges incurred. I’m kept up-to-date on what my friends are doing via Facebook, and on political news via Twitter. Increasingly, I shop online for books, office supplies, even clothing and home furnishings. I no longer visit the license branch and wait in line to renew my license plates; I go online and save the time and trouble. If I want to know how a congressman voted, the information is at my googling fingertips. I’m hardly alone. In an unbelievably short period of time, the Internet has not only made the world a smaller place, it has forever altered the rhythm of our daily lives.

Part of that alteration has meant that we are no longer passive consumers of information; the interactive nature of the web allows us to talk back, to post our opinions, to offer rebuttals. It brings us into contact with people of different countries, religions, cultures and backgrounds. It allows each of us, if we are so inclined, to become a publisher. When I was young, if you wanted to publish a newspaper, the costs of the printing press and distribution system were prohibitive, and most broadcast media was owned by the wealthy. Today, anyone with access to the Internet can hire a few reporters or “content providers” and create her own media outlet. One result is that the previously hierarchical nature of public knowledge is rapidly diminishing. The time-honored “gatekeeper” function of the press—when journalists decided what constituted news—will soon be a thing of the past, if it isn’t already.

The communication revolution isn’t limited to the delivery of news and information. Social networking sites have allowed like-minded people to connect with each other and form communities that span traditional geographical and political boundaries. (The growing global hegemony of the English language has further enabled cross-national communications.) One result is that it has become much harder to define just what a “community” is, and to recognize how those communities differ.

For example, on my most recent trip to Europe, I was struck by how homogenized citizens from western industrialized countries have become—how much we all look and dress alike. Thirty years ago, on our first trip to Europe, cultural differences expressed in clothing and mannerisms made it fairly easy to spot Americans. Over the intervening years, that has changed. Today, we dress alike, drive the same cars, watch the same television programs and listen to the same (mostly American) music. iPhones, iPods and iPads (and their various clones) are ubiquitous, as are Facebook and Google. Evidence of the globalization of culture—at least pop culture—is everywhere.

The participatory nature of the Internet has also encouraged—and enabled—a wide array of political and civic activism. How lasting that shift will be is still an open question, but the ways in which American political life has changed are unmistakable. Early in the development of the web, naysayers worried that the Internet would encourage people to become more solitary. They warned that people were being seduced by this new medium to withdraw from human and social interaction. In some cases, that may be true. For others, however, the Internet has been an “enabler,” facilitating a great wave of political and community organizing; it has become a mechanism for forming new kinds of communities, for finding like-minded people we didn’t previously know, even though they might have been living just down the street. “Meetings” on line have led to internet-facilitated “Meet Ups” and other face-to-face interactions in service of particular social and political goals.

A telling example of the profound change YouTube has wrought in the political landscape was the widely reported “macaca” moment of Senator George Allen during the 2006 campaign season. Allen, who was running for re-election to the Senate from Virginia, was considered a shoo-in, and a strong contender for the 2008 Republican Presidential nomination. While delivering a speech to a small gathering in rural Virginia, he pointed out a volunteer from his opponent’s campaign who was videotaping his talk.

“This fellow here, over here with the yellow shirt, macaca, or whatever his name is. He’s with my opponent. He’s following us around everywhere…Let’s give a welcome to macaca, here. Welcome to America and the real world of Virginia.”

Depending on how it is spelled, the word macaca can mean either a monkey that inhabits the Eastern Hemisphere or a town in South Africa. In some European cultures, macaca is also considered a racial slur against African immigrants. The volunteer, who worked for Democratic challenger James Webb, promptly uploaded the videotape of Allen’s remarks to YouTube; a mere three days later, it had been downloaded and viewed 334,254 times. It was picked up and endlessly replayed on the evening news. Print media across the country reported on the controversy, and radio talk show hosts argued about the meaning of the word macaca, and whether Allen had intended a slur. Investigative reporters whose curiosity had been piqued by the controversy dug up evidence of prior racially charged incidents involving Allen. By November, Webb—initially dismissed as a long-shot candidate with little chance of defeating a popular sitting Senator—was the new Senator from Virginia, and “macaca moment” had entered the political lexicon as shorthand for a gaffe captured on video. Politics has never been the same since.

The 2008 Presidential campaign introduced the phenomenon of “viral videos.” Remember when will.i.am of the Black Eyed Peas created the music video “Yes We Can”? For a time, that video was everywhere—forwarded and re-forwarded until literally millions of Americans had seen it. Humorous and not-so-humorous videos promoting and panning the candidates have since become ubiquitous. Campaign rumors (and worse) are endlessly forwarded, circulated and recirculated. In 2008, John McCain, who admitted never using a computer, and who displayed discomfort with the new media environment, was caught off-guard by videos showing him delivering inconsistent statements. I argued at the time that failure to understand the impact of the internet is failure to understand the world we live in, and that was undoubtedly the point of a pro-Obama blogger’s characterization of McCain as “an analog candidate for a digital age.” By last November, the impact of YouTube was far greater. Mitt Romney was videotaped at a private fundraiser dismissing 47% of Americans as non-taxpaying moochers who would never vote for him. To say that the video damaged his campaign would be an understatement.

The 2008 campaign was probably the first where all political candidates embraced the new technologies and made extensive use of email to raise funds, organize volunteers, counter charges, announce endorsements and rally their respective bases, at a tiny fraction of the cost of direct mail. The impact of all this would be difficult to exaggerate—and older politicians are still having difficulty negotiating the new media landscape.

It isn’t only our political life that has been profoundly changed by new technologies. The ability to communicate cheaply and almost instantaneously with millions of people, the ability to link up like-minded people, and the ability to both spread and counter misinformation are all having a profound impact on our political, civic and personal relationships in ways we cannot yet fully anticipate or appreciate.

This information revolution is particularly pertinent to the issue of trust in our civic and governing institutions. At no time in human history have citizens been as aware of every failure of competence, every allegation of corruption or malfeasance—real or imagined. Politicians like to talk about “low-information” voters, but even the most detached American citizen cannot escape hearing about institutional failures on a daily basis, whether it is reports of high levels of lead in children’s toys (said to be due to government failure to monitor imports properly), the collapse of a bridge in Minnesota (said to be due to government failure to inspect and repair deteriorating infrastructure), or the devastation caused by a drone strike (authorized first by the Bush and now by the Obama administrations). Evidence of the intransigence and partisanship of Congress and the appalling behavior of too many of its members is communicated on a daily basis. It may be true that things weren’t much different in past eras, but it is certainly the case that information about public wrongdoing or incompetence is infinitely more widespread in today’s wired and connected world.

So, of course, is misinformation.

As wonderful as some of the new technologies can be, there’s a downside—and the growth of propaganda is part of that downside. In my opinion, the most damaging side effect of this paradigm shift in the way we access information is the perilous state of journalism at a time when we desperately need verified, factual, objective information about our world, our government and our environment. Don’t let the bravado mask reality: newspapers as we’ve known them are dying.  Since 1990, a quarter of all newspaper jobs have disappeared. Once robust publications have closed their doors. Others have drastically curtailed delivery. Last year, the textbook I used in my Media and Policy class was “Will the Last Reporter Please Turn Out the Lights.”

Like it or not, we are increasingly dependent upon the Internet for our news. And that presents us with both an opportunity and a problem.

Eli Pariser, the former executive director of MoveOn.org, was an early believer in the power of the Internet to increase and improve democracy. But as he documents in an important book, “The Filter Bubble,” the technology that promises (and delivers) so much is moving us into what he calls a “mediated future”—a future in which each of us exists in a personalized universe of our own construction.

In an effort to give each of us what we want, sites like Google, Facebook, and Amazon are constantly refining their algorithms in order to deliver results that are “relevant” to each particular searcher, and they have more data about our individual likes and dislikes than most of us can imagine. As a result, two people googling “BP,” for example, will not necessarily get the same results, and certainly not in the same order. Someone whose search history suggests interest in investment information may get the company’s annual report, while someone with a history of environmental interests will get stories about the Gulf spill. Similarly, Facebook delivers the posts of friends and family that its algorithm suggests are most consistent with the member’s interests and beliefs, not everything those friends post.
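To make the mechanism concrete, here is a toy Python sketch of history-based personalization. The articles, topics and scoring rule are invented for illustration; this is not Google’s or Facebook’s actual algorithm, only a minimal stand-in for the general idea.

# Toy illustration of history-based personalization (invented data and scoring;
# not any real search engine's algorithm).
ARTICLES = [
    {"title": "BP annual report and dividend outlook", "topics": {"investing", "energy"}},
    {"title": "BP and the Gulf oil spill: environmental fallout", "topics": {"environment", "energy"}},
]

def rank_for(user_history):
    """Order articles by how much they overlap with topics the user has clicked before."""
    scored = sorted(ARTICLES, key=lambda a: len(a["topics"] & user_history), reverse=True)
    return [a["title"] for a in scored]

investor = {"investing", "markets"}
environmentalist = {"environment", "wildlife"}

print(rank_for(investor))           # the annual report ranks first
print(rank_for(environmentalist))   # the Gulf spill story ranks first

Two people typing the same query see different orderings simply because the system knows something about them; that is the invisible bias at issue here.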

Pariser calls this the “filter bubble,” and points out that—unlike choosing to listen to Fox rather than PBS or MSNBC, for example—the resulting bias is invisible to us.

Little by little, search by search, individuals are constructing different–and often conflicting–realities. At the same time, traditional news sources aimed at a general audience—the newspapers and broadcasts that required reporters to fact-check assertions, label opinion and aim for objectivity—are losing market share. How many will survive and in what form is anyone’s guess.

We can live without newsprint, but we desperately need real journalism—where reporters monitor what governments and businesses do, where they fact-check and provide context and background. Instead, we have mountains of unsubstantiated opinion, PR and spin. Good citizens have to be able to separate fact from fantasy. They have to live in the world as it is, not in a bubble where they listen only to things that confirm what they already believe—and the Internet makes it so easy and tempting to construct that bubble.

A favorite catchphrase of traditional media is “news you can use.” The phrase has come up more and more often as newspaper readership has continued to decline–not just in Indianapolis, but nationally. The problem is that no one completes the sentence. Those who toss it off never get to the important question: use how, and for what?

In my somewhat jaundiced opinion, the news citizens can use is information about our common institutions–including but not limited to government, and especially local government. Judging from what the newspapers are actually covering, however, they consider “news you can use” to be reviews of local restaurants, diet and home decorating tips, and sports. Not–as they used to say on Seinfeld–that there is anything wrong with that. At least, there wouldn’t be anything wrong with that if these stories were being served as “dessert” rather than the main course.

What we can use is a return to journalism’s time-honored watchdog role. But genuine watchdog coverage requires resources–enough reporters with enough time to investigate and monitor a wide variety of important government agencies and functions. Newspapers around the country that have survived have done so by engaging in wave after wave of layoffs. Those layoffs have left them with skeletal reporting operations, drastically compromising their capacity to provide genuine journalism.

Let me just conclude by describing what I mean by “genuine journalism.” Real journalism is accurate, objective, fact-checked reporting on what Alex S. Jones calls the “iron core”: fact-based accountability news. Such news isn’t “fair and balanced,” because often, balance is neither fair nor accurate.

A couple of years ago, National Public Radio—one of our most reliable purveyors of “real journalism”– adopted new ethics guidelines. The new code stresses the importance of accuracy over false balance; it appears–finally–to abandon the “he said, she said” approach (what I have elsewhere called “stenography masquerading as reporting”) that all too often distorts truth in favor of a phony “fairness.”

The policy reads: “At all times, we report for our readers and listeners, not our sources. So our primary consideration when presenting the news is that we are fair to the truth. If our sources try to mislead us or put a false spin on the information they give us, we tell our audience. If the balance of evidence in a matter of controversy weighs heavily on one side, we acknowledge it in our reports. We strive to give our audience confidence that all sides have been considered and represented fairly.”

One of my biggest gripes over the past several years has been the wholesale abandonment of precisely this tenet of good journalism. A good example has been environmental reporting–how many times have media sources reported on climate change, for example, by giving equal time and weight to the settled science and the deniers, without ever noting that the deniers constitute less than 1% of all climate scientists, and are generally regarded as a kooky fringe? That’s “balance,” but it certainly isn’t “fair to the truth.”

Several years ago, this sort of false equivalency was illustrated by one of my all-time favorite Daily Show skits. The “senior journalism reporter” was explaining the Swift Boat Veterans for Truth attacks on John Kerry to Jon Stewart. “The Swift Boat Veterans say such-and-such happened; the Kerry Campaign says it didn’t. Back to you, Jon!” When Stewart then asked, “But aren’t you going to tell us who is telling the truth?” the response was dead-on: “Absolutely not, Jon. This is journalism.”

Far too often, our remaining reporters pursue artificial balance at the expense of truth. If a Democratic campaign plays a dirty trick, reporters rush to remind their audience of a similar transgression by Republicans, and vice-versa. This search for equivalency may be well intentioned, but it misrepresents reality and misleads those of us who depend upon the media for accurate information. NPR’s recognition of this pernicious practice, and its new Code of Ethics, are a welcome sign that at least some journalists might be returning to Job One: telling us the unembellished truth.

At the end of the day, we need to recognize that the journalism of accountability and verification, the journalism that acts as a watchdog over our common institutions, is irreplaceable. My own favorite journalist, Jon Stewart, put it best in an interview with Terry Gross of NPR. Gross noted his constant criticisms of both politicians and the media, and asked Stewart who he felt was most culpable. Stewart said “Politicians are politicians. If you go to the zoo and monkeys are throwing feces, well—that’s what monkeys do. But you’d like to have the zoo-keeper there saying ‘Bad Monkey.’”

That pretty much sums it up.
