Tag Archives: Tom Wheeler

Regulating Facebook et al

Over the past few years, as my concerns about the media environment we inhabit have grown, I have found Tom Wheeler’s columns and interviews invaluable. Wheeler–for those of you unfamiliar with him–chaired the Federal Communications Commission from 2013 to 2017, and is currently both a senior fellow at Harvard’s Kennedy School Shorenstein Center and a visiting fellow at the Brookings Institution.

He’s also a clear writer and thinker.

In a recent article for Time Magazine, Wheeler proposes the establishment of a new federal agency that would be empowered to regulate Internet giants like Facebook. He begins the article by noting Mark Zuckerberg’s apparent willingness to be regulated–a willingness expressed in advertisements and in testimony to Congress. As he notes, however,

A tried-and-true lobbying strategy is to loudly proclaim support for lofty principles while quietly working to hollow out the implementation of such principles. The key is to move beyond embracing generic concepts to deal with regulatory specifics. The headline on Politico’s report of the March 25 House of Representatives hearing, “D.C.’s Silicon Valley crackdown enters the haggling phase,” suggests that such an effort has begun. Being an optimist, I want to take Facebook at its word that it supports updated internet regulations. Being a pragmatist and former regulator, though, I believe we need to know exactly what such regulations would provide.

Wheeler proceeds to explain why he favors the creation of a separate agency that would be charged with regulating “big Tech.” As he notes, most proposals in Congress would give that job to the Federal Trade Commission (FTC). Wheeler has nothing negative to say about the FTC but points out that the agency is already “overburdened with an immense jurisdiction.” (Companies have even been known to seek transfer of their oversight to the agency, believing that such a transfer would allow their issues to get lost among the extensive and pressing other matters for which the agency is responsible.) Furthermore, oversight of digital platforms “should not be a bolt-on to an existing agency but requires full-time specialized focus.”

So how should a new agency approach its mission?

Digital companies complain (not without some merit) that current regulation with its rigid rules is incompatible with rapid technology developments. To build agile policies capable of evolving with technology, the new agency should take a page from the process used in developing the technology standards that created the digital revolution. In that effort, the companies came together to agree on exactly how things would work. This time, instead of technical standards, there would be behavioral standards.

The subject matter of these new standards should be identified by the agency, which would convene industry and public stakeholders to propose a code, much like electric codes and fire codes. Ultimately, the agency would approve or modify the code and enforce it. While there is no doubt that such a new approach is ambitious, the new challenges of the digital giants require new tools.

Wheeler proceeds to outline how the proposed agency would approach issues such as misinformation and privacy, and to describe how it might promote and protect competition in the digital marketplace.

It’s a truism among policy wonks that government’s efforts to engage with rapidly changing social realities lag the development of those realities. The Internet has changed dramatically from the first days of the World Wide Web; the social media sites that are so ubiquitous now didn’t exist before 1997, and blogs like the one you are reading first emerged in 1999–a blink of the eye in historical terms. In the next twenty years, there will undoubtedly be digital innovations we can’t yet imagine or foresee. A specialized agency to oversee our digital new world makes a lot of sense.

I’m usually leery of creating new agencies of government, given the fact that once they appear on the scene, they tend to outlive their usefulness. But Wheeler makes a persuasive case.

And the need for thoughtful, informed regulation gets more apparent every day.

Falsely Shouting “Fire” In The Digital Theater

Tom Wheeler is one of the savviest observers of the digital world.

Now at the Brookings Institution, Wheeler headed up the FCC during the Obama administration, and recently authored an essay titled “The Consequences of Social Media’s Giant Experiment.” That essay–like many of his other publications–considered the impact of legally-private enterprises that have had a huge public impact.

The “experiment” Wheeler considers is the shutdown of Trump’s disinformation megaphones: most consequential, of course, were the Facebook and Twitter bans of Donald Trump’s accounts, but it was also important that Parler–a site for rightwing radicalization and conspiracy theories–was effectively shut down for a time by Amazon’s decision to cease hosting it, and by the decisions of both Google and Apple to remove it from their app stores. (I note that, since Wheeler’s essay, Parler has found a new hosting service–and it is Russian-owned.)

These actions are better late than never. But the proverbial horse has left the barn. These editorial and business judgements do, however, demonstrate how companies have ample ability to act conscientiously to protect the responsible use of their platforms.

Wheeler addresses the conundrum created by Section 230 of the Communications Decency Act, the subsection of the law that insulates social media companies from responsibility for making the sorts of editorial judgements that publishers of traditional media make every day. As he says, these 26 words are the heart of the issue: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

As he points out,

If you are insulated from the consequences of your actions and make a great deal of money by exploiting that insulation, then what is the incentive to act responsibly?…

The social media companies have put us in the middle of a huge and explosive lab experiment where we see the toxic combination of digital technology, unmoderated content, lies and hate. We now have the answer to what happens when these features and large profits are blended together in a connected world. The result not only has been unproductive for civil discourse, it also represents a danger to democratic systems and effective problem-solving.

Wheeler repeats what most observers of our digital world have recognized: these platforms have the technological capacity to exercise the same sort of responsible moderation that we expect of traditional media. What they lack is the will–because more responsible moderating algorithms would eat into their currently large–okay, obscene–profits.

The companies’ business model is built around holding a user’s attention so that they may display more paying messages. Delivering what the user wants to see, the more outrageous the better, holds that attention and rings the cash register.

Wheeler points out that we have mischaracterized these platforms–they are not, as they insist, tech enterprises. They are media, and should be required to conform to the rules and expectations that govern media sources. He has other suggestions for tweaking the rules that govern these platforms, and they are worth consideration.

That said, the rise of these digital giants creates a bigger question and implicates what is essentially a philosophical dilemma.

The U.S. Constitution was intended to limit the exercise of power; it was crafted at a time in human history when governments held a clear monopoly on that power. That is arguably no longer the case–and it isn’t simply social media giants. Today, multiple social and economic institutions have the power to pose credible threats both to individual liberty and to social cohesion. How we navigate the minefield created by that reality–how we restrain the power of theoretically “private” enterprises– will determine the life prospects of our children and grandchildren.

At the very least, we need rules that will limit the ability of miscreants to falsely shout fire in our digital environments.


The Crux Of The Problem

Governing Magazine recently ran a report on the emergence of several politically-tied websites in Michigan. Designed to look like “real” news organizations, the sites–linked to a variety of partisan political groups–are expanding across the state in preparation for the 2020 election.

At about the same time, The Intellectualist reported on yet another study of Fox News; to the surprise of no one other than the network’s devoted audience (who will dismiss it as “fake news”), the study found that nearly 60% of statements made on Fox were either partially or entirely false–and that as a result, Fox News viewers are more likely to believe repeatedly debunked conspiracy theories.

I could add dozens of other examples of our current media environment–an environment characterized by the loss of what we once called “mass media” and its replacement by a digital universe of “news” sites spanning the spectrum from objective reporting to partisan spin to propaganda.

Regular readers of this blog–not to mention my students–are well aware of my near-obsession with the effects of this current media environment on governance. I’ve become increasingly convinced that America’s tribalism and dysfunction are directly linked to the fragmentation of our information landscape, but I have struggled to come up with a clear explanation of that link.

Tom Wheeler could explain it. And in an article for the Brookings Institution, he did.

Wheeler was the head of the FCC in the Obama Administration, and is a knowledgeable and thoughtful observer of today’s media environment. I really, really encourage you to click through and read the article in its entirety.

The most incisive observation Wheeler makes is that the American media has gone from broadcasting to targetcasting.

Since the time of the early advertising-supported newspapers, economic incentive has worked to bring people together around a common set of shared information. Maximizing ad revenue meant offending as few readers as possible by at least attempting a balanced presentation of the facts. The search for balance began to retreat with the arrival of cable television, but the economic model of maximizing revenue by maximizing reach still governed. The targeting capability of social media algorithms, however, has extinguished the traditional economic model. Now profit comes not through the broad delivery of common information, but the targeted delivery of selected information. The result is an attack on the model of shared information that is necessary for a democracy to function.

Radio and television are “broadcasting”: from a single source they deliver to the widest possible audience. Broadcasting changed the nature of communications from after-the-fact newspapers to the wide distribution of real time information. The image of a family huddling around the radio to hear one of FDR’s fireside chats comes to mind; a common set of inputs available to all upon which to base collective decisions.

Cable television is “narrowcasting.” Cable is like a video newsstand with many titles from which to choose. To differentiate themselves on this newsstand cable news channels developed “an attitude” espousing different political viewpoints. While narrowcasting was driven by conflict and disagreement, the revenue-maximizing goal was still the same as broadcasters’: reach the largest audience possible.

Social media is “targetcasting.” Software algorithms owned by the social media platforms watch how users behave online and use that data to categorize them into specific groups. They then sell advertisers the ability to reach those groups. Targetcasting companies make money the opposite way from broadcasters and narrowcasters. Instead of selling reach to a wide audience, they charge a premium to target a small but specifically defined group.

An even greater differentiator between traditional media and social media is how targetcasting is available only to a specific audience. Such secret targeting tears at the fabric of democracy. The Founding Fathers made E Pluribus Unum (out of many one) the national motto. They began the Constitution with the collective “We the people.” Such a coming together, the Founders realized, was essential for their experiment in democracy to function.

To become “We” requires a suspension of human nature’s tribal instincts in favor of a shared future. Such a belief is predicated in part on shared information.

I have taken the liberty of quoting Wheeler at length, because I think this description is at the very heart of what ails our politics. It is the crux of the problem. We really don’t occupy the same reality, because we don’t have a “common set of inputs available to all upon which to base collective decisions.”

As Wheeler writes,

Coming together in an environment of shared information—an information commons—is a key component of moving from tribes to the larger Unum. When the algorithms of social media follow the money, they discourage the search for Unum and undermine the communal “We.” By delivering different information to each tribe—in secret—the algorithms keep users online for as long as possible, maximizing ad sales. In doing so, they gnaw away at the heart of “We the people.”

And as always, we are left with the question: what can we do about this? How do we re-establish an information commons? Because if we don’t, the future looks very, very grim.