What Do We Know and Who Can We Trust?

I thought I’d share a presentation I made at “Weekend U,” a series of workshops hosted by IUPUI Alumni, focused on our current media environment. It’s a bit long, so my apologies….

It is always tempting to assert that we live in times that are radically unlike past eras—that somehow, the challenges we face are not only fundamentally different from the problems that confronted our forebears, but worse; to worry that children growing up today are subject to more pernicious influences than children of prior generations. (In Stephanie Coontz’s felicitous phrase, there is a great deal of nostalgia for “the way we never were.”) I grew up in the 1950s, and can personally attest that the misty-eyed recollections of that time are revisionist nonsense. The widespread belief that 50s-era Americans all lived like the characters in “Father Knows Best” or “Leave it to Beaver” is highly inaccurate, to put it mildly. If you don’t believe me, ask the African-Americans who were still relegated to separate restrooms and drinking fountains in much of the American South, or the women who couldn’t get equal pay for equal work or a credit rating separate from their husbands.

Nevertheless—even conceding our human tendency to overstate the effects of social change for good or ill—it is impossible to understand the crazy in contemporary life without recognizing the profound social changes that have been both generated by and reflected in our modern communication technologies, most prominently the Internet.

We live today in an incessant babble of information, some of which is credible and much of which is not. Some of that information comes to us through hundreds of cable and broadcast television stations, increasing numbers of which are devoted to “news,” broadly defined, twenty-four hours a day, seven days a week. In our cars, we tune in to news and commentary on AM or FM stations, or more recently to satellite broadcasts that have extended the reach of that medium. Then there’s the Web—streaming incessantly not just through our computers, but also through our smart phones and tablets. The one place we increasingly don’t get our information is a daily newspaper, or the other sources that might be called the “journalism of verification.”

That’s the reality of our lives today. We read news and commentary from all over the world on line, we shop for goods and services on line, we communicate with our friends and families through email and Facebook, and we consult web-based sources for everything from medical advice to housekeeping hints to comedy routines. When we don’t know something, we Google it.  The web is rapidly becoming a repository of all human knowledge—not to mention human rumors, hatreds, gossip, trivia and paranoid fantasies. Picking our way through this landscape requires new skills, new ways of accessing, sorting and evaluating the credibility and value of what we see and hear. It is not an exaggeration to say that the enhanced communications environment has changed the way we process information and our very perceptions of reality. And we’re just beginning the task of figuring out how to cope with this brave new world.

Let me share a very minor example of how this communications environment affects our perceptions of reality: toward the end of her life, my mother was in a nursing home. Given the limited mobility of most of the residents, the television was a central focus of their day, and it was on continually. Although she had never been a particularly fearful person, nor one who focused on crime, my mother became convinced that crime rates were soaring. They weren’t. In fact, crime had declined substantially over the preceding few years. But when mother was growing up, with the exception of particularly heinous incidents or crimes involving celebrities, the local media reported only local criminal behavior. Seventy years later, the television relayed daily reports of subway murders in London, train bombings in Spain, and assorted misbehavior by people from all over the globe. To my mother and her elderly peers, it seemed that predators were suddenly everywhere.

And that was just television. The Internet has had an infinitely greater impact. I’m old, and probably use technology less than my grandchildren’s generation, but when I am driving to a location I’ve not previously visited, I get directions via the web (assuming I don’t have a GPS in my car or—more recently—in my cell phone). I can instant-message or Skype my granddaughter in Wales for free, with no long-distance telephone charges incurred. I’m kept up to date on what my friends are doing via Facebook, and on political news via Twitter. Increasingly, I shop online for books, office supplies, even clothing and home furnishings. I no longer visit the license branch and wait in line to renew my license plates; I go online and save the time and trouble. If I want to know how a congressman voted, the information is at my googling fingertips. I’m hardly alone. In an unbelievably short period of time, the Internet has not only made the world a smaller place, it has forever altered the rhythm of our daily lives.

Part of that alteration has meant that we are no longer passive consumers of information; the interactive nature of the web allows us to talk back, to post our opinions, to offer rebuttals. It brings us into contact with people of different countries, religions, cultures and backgrounds. It allows each of us, if we are so inclined, to become a publisher. When I was young, if you wanted to publish a newspaper, the costs of the printing press and distribution system were prohibitive, and most broadcast media were owned by the wealthy. Today, anyone with access to the Internet can hire a few reporters or “content providers” and create her own media outlet. One result is that the previously hierarchical nature of public knowledge is rapidly diminishing. The time-honored “gatekeeper” function of the press—whereby journalists decided what constituted news—will soon be a thing of the past, if it isn’t already.

The communication revolution isn’t limited to the delivery of news and information. Social networking sites have allowed like-minded people to connect with each other and form communities that span traditional geographical and political boundaries. (The growing global hegemony of the English language has further enabled cross-national communications.) One result is that it has become much harder to define just what a “community” is, and to recognize how those communities differ.

For example, on my most recent trip to Europe, I was struck by how homogenized citizens from western industrialized countries have become—how much we all look and dress alike. Thirty years ago, on our first trip to Europe, cultural differences expressed in clothing and mannerisms made it fairly easy to spot Americans. Over the intervening years, that has changed. Today, we dress alike, drive the same cars, watch the same television programs and listen to the same (mostly American) music. iPhones, iPods and iPads (and their various clones) are ubiquitous, as are Facebook and Google. Evidence of the globalization of culture—at least pop culture—is everywhere.

The participatory nature of the Internet has also encouraged—and enabled—a wide array of political and civic activism. How lasting that shift will be is still an open question, but the ways in which American political life has changed are unmistakable. Early in the development of the web, naysayers worried that the Internet would encourage people to become more solitary. They warned that people were being seduced by this new medium to withdraw from human and social interaction. In some cases, that may be true. For others, however, the Internet has been an “enabler,” facilitating a great wave of political and community organizing; it has become a mechanism for forming new kinds of communities, for finding like-minded people we didn’t previously know, even though they might have been living just down the street. “Meetings” on line have led to internet-facilitated “Meet Ups” and other face-to-face interactions in service of particular social and political goals.

A telling example of the profound change YouTube has wrought in the political landscape was the widely reported “macaca” moment of Senator George Allen during the 2006 campaign season. Allen, who was running for re-election to the Senate from Virginia, was considered a shoo-in and a strong contender for the 2008 Republican Presidential nomination. While delivering a speech to a small gathering in rural Virginia, he pointed out a volunteer from his opponent’s campaign who was videotaping his talk.

“This fellow here, over here with the yellow shirt, macaca, or whatever his name is. He’s with my opponent. He’s following us around everywhere…Let’s give a welcome to macaca, here. Welcome to America and the real world of Virginia.”

Depending on how it is spelled, the word macaca can mean either a monkey that inhabits the Eastern Hemisphere or a town in South Africa. In some European cultures, macaca is also considered a racial slur against African immigrants. The volunteer, who worked for Allen’s Democratic challenger, James Webb, promptly uploaded the videotape of Allen’s remarks to YouTube; a mere three days later, it had been viewed 334,254 times. It was picked up and endlessly replayed on the evening news. Print media across the country reported on the controversy, and radio talk show hosts argued about the meaning of the word macaca, and whether Allen had intended a slur. Investigative reporters whose curiosity had been piqued by the controversy dug up evidence of prior racially charged incidents involving Allen. By November, Webb—initially dismissed as a long-shot candidate with little chance of defeating a popular sitting Senator—was the new Senator from Virginia, and “macaca moment” had entered the political lexicon as shorthand for a gaffe captured on video. Politics has never been the same since.

The 2008 Presidential campaign introduced the phenomenon of “viral videos.” Remember when will.i.am of the Black Eyed Peas created the music video “Yes We Can”? For a time, that video was everywhere—forwarded and re-forwarded until literally millions of Americans had seen it. Humorous and not-so-humorous videos promoting and panning the candidates have since become ubiquitous. Campaign rumors (and worse) are endlessly forwarded, circulated and recirculated. In 2008, John McCain, who admitted never using a computer and who displayed discomfort with the new media environment, was caught off-guard by videos showing him delivering inconsistent statements. I argued at the time that failure to understand the impact of the internet is failure to understand the world we live in, and that was undoubtedly the point of a pro-Obama blogger’s characterization of McCain as “an analog candidate for a digital age.” By last November, the impact of YouTube was far greater. Mitt Romney was videotaped at a private fundraiser dismissing 47% of Americans as non-taxpaying moochers who would never vote for him. To say that the video damaged his campaign would be an understatement.

The 2008 campaign was probably the first in which all political candidates embraced the new technologies, making extensive use of email to raise funds, organize volunteers, counter charges, announce endorsements and rally their respective bases, at a tiny fraction of the cost of direct mail. The impact of all this would be hard to overstate—and older politicians are still struggling to navigate the new media landscape.

It isn’t only our political life that has been profoundly changed by new technologies. The ability to communicate cheaply and almost instantaneously with millions of people, the ability to link up like-minded people, and the ability to both spread and counter misinformation are all having a profound impact on our political, civic and personal relationships in ways we cannot yet fully anticipate or appreciate.

This information revolution is particularly pertinent to the issue of trust in our civic and governing institutions. At no time in human history have citizens been as aware of every failure of competence, every allegation of corruption or malfeasance—real or imagined. Politicians like to talk about “low-information” voters, but even the most detached American citizen cannot escape hearing about institutional failures on a daily basis, whether it is reports of high levels of lead in children’s toys (said to be due to government failure to monitor imports properly), the collapse of a bridge in Minnesota (said to be due to government failure to inspect and repair deteriorating infrastructure), or the devastation caused by a drone strike (authorized first by the Bush and now by the Obama administrations). Evidence of the intransigence and partisanship of Congress and the appalling behavior of too many of its members is communicated on a daily basis. It may be true that things weren’t much different in past eras, but it is certainly the case that information about public wrongdoing or incompetence is infinitely more widespread in today’s wired and connected world.

So, of course, is misinformation.

As wonderful as some of the new technologies can be, there’s a downside—and the growth of propaganda is part of that downside. In my opinion, the most damaging side effect of this paradigm shift in the way we access information is the perilous state of journalism at a time when we desperately need verified, factual, objective information about our world, our government and our environment. Don’t let the bravado mask reality: newspapers as we’ve known them are dying.  Since 1990, a quarter of all newspaper jobs have disappeared. Once robust publications have closed their doors. Others have drastically curtailed delivery. Last year, the textbook I used in my Media and Policy class was “Will the Last Reporter Please Turn Out the Lights.”

Like it or not, we are increasingly dependent upon the Internet for our news. And that presents us with both an opportunity and a problem.

Eli Pariser, the former executive director of MoveOn.org, was an early believer in the power of the Internet to expand and improve democracy. But as he documents in an important book, “The Filter Bubble,” the technology that promises (and delivers) so much is moving us into what he calls a “mediated future”—a future in which each of us exists in a personalized universe of our own construction.

In an effort to give each of us what we want, sites like Google, Facebook, and Amazon are constantly refining their algorithms in order to deliver results that are “relevant” to each particular searcher, and they have more data about our individual likes and dislikes than most of us can imagine. As a result, two people googling “BP,” for example, will not necessarily get the same results, and certainly not in the same order. Someone whose search history suggests interest in investment information may get the company’s annual report, while someone with a history of environmental interests will get stories about the Gulf spill. Similarly, Facebook delivers the posts of friends and family that its algorithm suggests are most consistent with the member’s interests and beliefs, not everything those friends post.
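The mechanics are easy to sketch. Here is a toy illustration in Python (not any real search engine’s code, and vastly simpler than the systems Pariser describes; the documents, topic labels and interest weights are all invented for this example) of how scoring the same results against two different user histories yields two different orderings:

```python
# Hypothetical results for the query "BP"; each carries invented topic weights.
results = [
    {"title": "BP Annual Report", "topics": {"finance": 0.9, "energy": 0.4}},
    {"title": "Gulf Oil Spill Coverage", "topics": {"environment": 0.9, "energy": 0.5}},
    {"title": "BP Shares Rally", "topics": {"finance": 0.8}},
]

def personalized_rank(docs, interest_profile):
    """Order results by how strongly their topics match a user's inferred interests."""
    def score(doc):
        return sum(interest_profile.get(topic, 0.0) * weight
                   for topic, weight in doc["topics"].items())
    return sorted(docs, key=score, reverse=True)

# Two users type the identical query, but their accumulated histories differ.
investor = {"finance": 1.0, "environment": 0.1}
environmentalist = {"environment": 1.0, "finance": 0.1}

print([d["title"] for d in personalized_rank(results, investor)])
# ['BP Annual Report', 'BP Shares Rally', 'Gulf Oil Spill Coverage']
print([d["title"] for d in personalized_rank(results, environmentalist)])
# ['Gulf Oil Spill Coverage', 'BP Annual Report', 'BP Shares Rally']
```

Note that neither user asked for a filtered view, and neither is told what was demoted or dropped.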

Pariser calls this the “filter bubble,” and points out that—unlike choosing to listen to Fox rather than PBS or MSNBC, for example—the resulting bias is invisible to us.

Little by little, search by search, individuals are constructing different–and often conflicting–realities. At the same time, traditional news sources aimed at a general audience—the newspapers and broadcasts that required reporters to fact-check assertions, label opinion and aim for objectivity—are losing market share. How many will survive and in what form is anyone’s guess.

We can live without newsprint, but we desperately need real journalism—where reporters monitor what governments and businesses do, where they fact-check and provide context and background. Instead, we have mountains of unsubstantiated opinion, PR and spin. Good citizens have to be able to separate fact from fantasy. They have to live in the world as it is, not in a bubble where they listen only to things that confirm what they already believe—and the Internet makes it so easy and tempting to construct that bubble.

A favorite catchphrase of traditional media is “news you can use,” invoked ever more frequently as newspaper readership has continued to decline–not just in Indianapolis, but nationally. The problem is that no one completes the sentence. Those who toss off the phrase never get to the important questions: use how, and for what?

In my somewhat jaundiced opinion, the news citizens can use is information about our common institutions–including but not limited to government, and especially local government. Judging from what the newspapers are actually covering, however, they consider “news you can use” to be reviews of local restaurants, diet and home decorating tips, and sports. Not–as they used to say on Seinfeld–that there is anything wrong with that. At least, there wouldn’t be anything wrong with that if these stories were being served as “dessert” rather than the main course.

What we can use is a return to journalism’s time-honored watchdog role. But genuine watchdog coverage requires resources–enough reporters with enough time to investigate and monitor a wide variety of important government agencies and functions. Newspapers around the country that have survived have done so by engaging in wave after wave of layoffs. Those layoffs have left them with skeletal reporting operations, drastically compromising their capacity to provide genuine journalism.

Let me just conclude by describing what I mean by “genuine journalism.” Real journalism is accurate, objective, fact-checked reporting on what Alex Jones calls the “iron core”: fact-based accountability news. Such news isn’t “fair and balanced,” because often, balance is neither fair nor accurate.

A couple of years ago, National Public Radio—one of our most reliable purveyors of “real journalism”– adopted new ethics guidelines. The new code stresses the importance of accuracy over false balance; it appears–finally–to abandon the “he said, she said” approach (what I have elsewhere called “stenography masquerading as reporting”) that all too often distorts truth in favor of a phony “fairness.”

The policy reads: “At all times, we report for our readers and listeners, not our sources. So our primary consideration when presenting the news is that we are fair to the truth. If our sources try to mislead us or put a false spin on the information they give us, we tell our audience. If the balance of evidence in a matter of controversy weighs heavily on one side, we acknowledge it in our reports. We strive to give our audience confidence that all sides have been considered and represented fairly.”

One of my biggest gripes over the past several years has been the wholesale abandonment of precisely this tenet of good journalism. A good example has been environmental reporting–how many times have media sources reported on climate change, for example, by giving equal time and weight to the settled science and the deniers, without ever noting that the deniers constitute less than 1% of all climate scientists, and are generally regarded as a kooky fringe? That’s “balance,” but it certainly isn’t “fair to the truth.”

Several years ago, this sort of false equivalency was illustrated by one of my all-time favorite Daily Show skits. The “senior journalism reporter” was explaining the Swift Boat Veterans for Truth attacks on John Kerry to Jon Stewart: “The Swift Boat Veterans say such-and-such happened; the Kerry Campaign says it didn’t. Back to you, Jon!” When Stewart then asked, “But aren’t you going to tell us who is telling the truth?” the response was dead-on: “Absolutely not, Jon. This is journalism.”

Far too often, our remaining reporters pursue artificial balance at the expense of truth. If a Democratic campaign plays a dirty trick, reporters rush to remind their audience of a similar transgression by Republicans, and vice-versa. This search for equivalency may be well intentioned, but it misrepresents reality and misleads those of us who depend upon the media for accurate information. NPR’s recognition of this pernicious practice, and its new Code of Ethics, are a welcome sign that at least some journalists might be returning to Job One: telling us the unembellished truth.

At the end of the day, we need to recognize that the journalism of accountability and verification, the journalism that acts as a watchdog over our common institutions, is irreplaceable. My own favorite journalist, Jon Stewart, put it best in an interview with Terry Gross of NPR. Gross noted his constant criticisms of both politicians and the media, and asked Stewart who he felt was most culpable. Stewart said “Politicians are politicians. If you go to the zoo and monkeys are throwing feces, well—that’s what monkeys do. But you’d like to have the zoo-keeper there saying ‘Bad Monkey.’”

That pretty much sums it up.


Quote of the Day

From Dick Lugar’s first address after leaving elective office, an observation worth pondering:

Perhaps the most potent force driving partisanship is the rise of a massive industry that makes money off political discord. This industry encompasses cable news networks, talk radio shows, partisan think tanks, direct mail fundraisers, innumerable websites and blogs, social media and gadfly candidates and commentators. Many of these entities have a deep economic stake in perpetuating political conflict. They are successfully marketing and monetizing partisan outrage.


Betraying the American Dream

When I was growing up, the accepted description of America was “land of opportunity.” It was commonly believed that the American Dream could be attained by anyone willing to work hard; social mobility was the name of the game.

Knowing that poverty isn’t necessarily permanent is hugely important in a capitalist system. Inequalities are inevitable, but they need not be paralyzing, and they need not engender the sorts of simmering resentments that lead to social unrest, so long as they are seen as temporary and (fairly or unfairly) as a reflection of the effort and entrepreneurship of the individual.

We are beginning to see what happens when it becomes apparent that Americans can no longer work themselves into the middle class. Thanks to short-sighted and mean-spirited public policies, such social mobility as previously characterized our economic system (it was probably never as attainable as national mythology had it) is largely a thing of the past.

In a column addressing the need for high quality early childhood education, Gail Collins put it bluntly: “We have no bigger crisis as a nation than the class barrier. We’re near the bottom of the industrialized world when it comes to upward mobility. A child born to poor parents has a pathetic chance of growing up to be anything but poor. This isn’t the way things were supposed to be in the United States. But here we are.”

In his recent book on inequality, Nobel-prize winning economist Joseph Stiglitz underlined the current lack of social mobility in America–and its unpleasant consequences.

We have a problem, and it isn’t temporary, isn’t a result of the recent economic downturn. Social scientists have documented the characteristics of stable democracies–the attitudes and institutions that keep societies from erupting, that strengthen the social fabric rather than tearing it. A perception that the government “plays fair” and a belief in opportunity for advancement–a belief that effort and diligence will be rewarded–are among them.

In his State of the Union speech, President Obama proposed two measures–universal access to preschool and raising the minimum wage–that would begin, however modestly, to address the problem. There is ample research connecting early childhood education to later economic well-being. There is equally persuasive research rebutting the proposition that a higher minimum wage means fewer jobs. (The latter proposition seems so logical, I used to believe it was self-evident; a copious amount of research, however, shows otherwise.)

The “usual suspects” met the President’s proposals with their usual screams of “socialism.” Those usual suspects, however, should rethink their support of the status quo. When poor people lose hope–when the belief in the possibility of bettering their condition disappears, and they face the fact that social mobility is rapidly becoming a myth and the American Dream is out of reach–they become people with nothing to lose. Eventually, they take to the streets and threaten the comfortable.

What’s that old line? Pigs get fed, but hogs get slaughtered.


This Has Gone Too Far….

The news that Senate Republicans plan to filibuster the President’s nomination of (Republican) Chuck Hagel for Defense Secretary ought to be the final straw.

Harry Reid clearly allowed himself to be punked, settling for a toothless agreement with Mitch McConnell rather than the genuine reform of the much-abused filibuster that he promised. And we are all paying a high price for his fecklessness.

I understand the legitimate use of that legislative weapon to prevent a majority from running roughshod over the minority. But in its current form, the filibuster is being used by a minority–by partisans whose positions were emphatically rejected by the electorate–to defeat virtually every effort undertaken by a popularly-elected majority. As a result, government has been brought to a standstill. Nothing can be done unless a super-majority vote can be rounded up–and finding sixty votes in a Senate occupied by too many small-minded, mean-spirited partisan hacks is no easy task.

At the very least, those who want to bring government to a halt should have to stand on the Senate floor and actually talk. It should not be enough for the minority members to raise their little pinkies and announce that they are “virtually” filibustering, so please go f#*#k yourself.

The intransigence of these GOP Senators has cost this country dearly during the recent economic meltdown. For every bad idea they’ve blocked, we’ve lost many more opportunities to improve the lives of middle-class Americans, to strengthen our crumbling infrastructure, to create jobs and take measures to protect the environment.

The federal legislative system is designed to work on the principle of majority rule. A majority of those who have been elected to represent the voters is supposed to determine what laws will be enacted. That doesn’t mean that Senators who oppose legislation cannot express that opposition forcefully in their floor speeches and their votes. It does mean that when the minority party consistently refuses to allow an up-or-down majority vote, that party isn’t just blocking particular measures: it is undermining American government–and it is becoming increasingly clear that destroying the capacity to govern is not an incidental or unintended consequence of these tactics; it is the real reason for them. It’s a feature, not a bug.

No nominee for Secretary of Defense has ever before been filibustered. Even John McCain–who has made his contempt for the man who defeated him quite plain–has argued against such an unprecedented move. If the Republicans want to vote against Hagel’s confirmation, fine. That is clearly their prerogative–although, as Dick Lugar has maintained, there should be a rebuttable presumption that the executive is entitled to his choice of those he wants populating his administration. Preventing an up-or-down vote simply because they can–motivated by spite, anti-government fervor and a level of partisanship that dwarfs anything previously seen–is beyond reprehensible. It is beyond irresponsible.

It’s despicable and profoundly un-American. And it needs to stop.


The Kids Are All Right

I routinely apologize to my graduate students for my generation, and the mess we’ve made of the world we’re leaving them. I tell them that it will be up to their generation to clean that mess up, and generally speaking, I find most of them up to the task. Unlike people who wring their hands and bemoan the state of “today’s youth”–a practice that began with Socrates’ Athens, if I’m not mistaken–I find the students who populate my classes to be, on balance, thoughtful, fair-minded, evidence-based and public-spirited. They give me hope that they really will improve our common institutions.

Of course, these are graduate students I’m talking about, and self-selected ones at that. So it was interesting to get an email from my sister, who created and runs the art program at Sycamore School here in Indianapolis, about one of her eighth graders.

In my eighth grade class, my students are to keep a notebook.  Each week, I hand out a quote or comment or question about art, and they must respond.  One week, the question was, “Is there any time when art, no matter how well done, should not be displayed?”

Today as I was grading the notebooks, I came across this answer, which I thought might interest you.  (I could show you notebooks that would blow your mind!)
“No, I think blasphemy and profanity are only ever taken down by less enlightened people.  Enlightenment comes from not having a perfect society.  By not allowing both the good and the bad of living, true intellect is unobtainable.”
John Stuart Mill would be proud of this kid. He has figured out what the nation’s founders knew, but what so many of our would-be contemporary censors still can’t seem to grasp: the proper response to bad speech is more and better speech, not suppression. Only when all ideas are available for examination can we ever hope to distinguish between truth and falsity.