Tom Wheeler is one of the savviest observers of the digital world.
Now at the Brookings Institution, Wheeler headed the FCC during the Obama administration and recently authored an essay titled “The Consequences of Social Media’s Giant Experiment.” That essay, like many of his other publications, considers the power of legally private enterprises that have had a huge public impact.
The “experiment” Wheeler considers is the shutdown of Donald Trump’s disinformation megaphones: most consequential, of course, were the Facebook and Twitter bans of his accounts, but it also mattered that Parler, a site for right-wing radicalization and conspiracy theories, was effectively shut down for a time by Amazon’s decision to cease hosting it and by Google’s and Apple’s decisions to remove it from their app stores. (I note that, since Wheeler’s essay, Parler has found a new hosting service, and it is Russian-owned.)
These actions are better late than never, but the proverbial horse has left the barn. These editorial and business judgments do, however, demonstrate that the companies have ample ability to act conscientiously and to protect responsible use of their platforms.
Wheeler addresses the conundrum created by Section 230 of the Communications Decency Act, which insulates social media companies from responsibility for making the sorts of editorial judgments that publishers of traditional media make every day. As he says, these 26 words are the heart of the issue: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
As he points out,
If you are insulated from the consequences of your actions and make a great deal of money by exploiting that insulation, then what is the incentive to act responsibly?…
The social media companies have put us in the middle of a huge and explosive lab experiment where we see the toxic combination of digital technology, unmoderated content, lies and hate. We now have the answer to what happens when these features and large profits are blended together in a connected world. The result not only has been unproductive for civil discourse, it also represents a danger to democratic systems and effective problem-solving.
Wheeler repeats what most observers of our digital world have recognized: these platforms have the technological capacity to exercise the same sort of responsible moderation that we expect of traditional media. What they lack is the will, because more responsible moderation algorithms would eat into their currently large (okay, obscene) profits.
The companies’ business model is built around holding users’ attention so that they can display more paid messages. Delivering what the user wants to see, the more outrageous the better, holds that attention and rings the cash register.
Wheeler points out that we have mischaracterized these platforms: they are not, as they insist, tech enterprises. They are media, and they should be required to conform to the rules and expectations that govern media sources. He has other suggestions for tweaking the regulation of these platforms, and they are worth considering.
That said, the rise of these digital giants raises a bigger question and implicates what is essentially a philosophical dilemma.
The U.S. Constitution was intended to limit the exercise of power; it was crafted at a time in human history when governments held a clear monopoly on that power. That is arguably no longer the case, and not only because of social media giants. Today, multiple social and economic institutions have the power to pose credible threats both to individual liberty and to social cohesion. How we navigate the minefield created by that reality, how we restrain the power of theoretically “private” enterprises, will determine the life prospects of our children and grandchildren.
At the very least, we need rules that will limit the ability of miscreants to falsely shout fire in our digital environments.