Section 230

These are hard times for free speech advocates. The Internet–with its capacity for mass distribution of lies, misinformation, bigotry and incitement to violence–cries out for reform, but it is not apparent (certainly not to me) what sort of reforms might curb the dangers without also stifling free expression.

One approach is focused on a law that is older than Google: Section 230 of the Communications Decency Act. 

What is Section 230? Is it really broken? Can it be fixed without inadvertently doing more damage? 

The law is just 26 words that allow online platforms to make rules about what people can or can’t post without being held legally responsible for the content. (There are some exceptions, but not many.) As a recent newsletter on technology put it (sorry, for some reason the link doesn’t work):

If I accuse you of murder on Facebook, you might be able to sue me, but you can’t sue Facebook. If you buy a defective toy from a merchant on Amazon, you might be able to take the seller to court, but not Amazon. (There is some legal debate about this, but you get the gist.)

The law created the conditions for Facebook, Yelp and Airbnb to give people a voice without being sued out of existence. But now Republicans and Democrats are asking whether the law gives tech companies either too much power or too little responsibility for what happens under their watch.

Republicans mostly worry that Section 230 gives internet companies too much power to suppress online debate and discussion, while Democrats mostly worry that it lets those companies ignore or even enable dangerous incitements and/or illegal transactions. 

The fight over Section 230 is really a fight over the lack of control exercised by Internet giants like Facebook and Twitter. In far too many situations, the law allows people to lie online without consequence–let’s face it, that high school kid who is spreading lewd rumors about a girl who turned him down for a date isn’t likely to be sued, no matter how damaging, reprehensible and untrue his posts may be. The recent defamation suits brought by the voting machine manufacturers were salutary and satisfying, but most people harmed by the bigotry and disinformation online are not in a position to pursue such remedies.

The question being debated among techies and lawyers is whether Section 230 is too protective; whether it reduces incentives for platforms like Facebook and Twitter to adopt and enforce stronger measures that would more effectively curtail obviously harmful rhetoric and activities.

Several proposed “fixes” are currently being considered; the Times newsletter described them.

Fix-it Plan 1: Raise the bar. Some lawmakers want online companies to meet certain conditions before they get the legal protections of Section 230.

One example: A congressional proposal would require internet companies to report to law enforcement when they believe people might be plotting violent crimes or drug offenses. If the companies don’t do so, they might lose the legal protections of Section 230 and the floodgates could open to lawsuits.

Facebook this week backed a similar idea, which proposed that it and other big online companies would have to have systems in place for identifying and removing potentially illegal material.

Another proposed bill would require Facebook, Google and others to prove that they hadn’t exhibited political bias in removing a post. Some Republicans say that Section 230 requires websites to be politically neutral. That’s not true.

Fix-it Plan 2: Create more exceptions. One proposal would restrict internet companies from using Section 230 as a defense in legal cases involving activity like civil rights violations, harassment and wrongful death. Another proposes letting people sue internet companies if child sexual abuse imagery is spread on their sites.

Also in this category are legal questions about whether Section 230 applies to the involvement of an internet company’s own computer systems. When Facebook’s algorithms helped circulate propaganda from Hamas, as David detailed in an article, some legal experts and lawmakers said that Section 230 legal protections should not have applied and that the company should have been held complicit in terrorist acts.

Slate has an article describing all of the proposed changes to Section 230.

I don’t have a firm enough grasp of the issues involved–let alone the technology needed to accomplish some of the proposed changes–to have a favored “fix” to Section 230.

I do think that this debate foreshadows others that will arise in a world where massive international companies–online and not–in many cases wield more power than governments. Constraining these powerful entities will require new and very creative approaches.


Testing The Current Court

The worst “hangover” from four years of Trump is undoubtedly the composition of the country’s federal courts–including but not limited to the Supreme Court. Granted, Trump–who wouldn’t know a legal principle if he fell over one–wouldn’t have known how to stuff the courts with right-wing ideologues; Mitch McConnell is the villain. But Trump enabled him.

In a recent column for the New York Times, Linda Greenhouse explained the troubling implications–and predictive value– of an upcoming Supreme Court case.

The case that the Supreme Court heard this week about a California law granting union organizers access to private farms has been described as a labor case, which it marginally is. It has also been described as a case about property rights, which it definitely is. But what makes Cedar Point Nursery v. Hassid one of the most important cases of the current term is the question it presents for the newly configured court: whether, after years of disappointment, the political right may finally be able to take the Supreme Court for granted.

The case is being brought by the Pacific Legal Foundation, and as Greenhouse reports, the foundation is using Cedar Point–a company that grows strawberries–and another employer that packs and ships citrus fruit and grapes, as “stalking horses for its long-running project to elevate property rights.”

The case involves union access to agricultural workers. The California law being challenged was passed during Cesar Chavez’s drive to organize the state’s farmworkers. It limited union organizers’ access to workers in the field to periods before and after the working day, and to three hours a day on 120 days of the year.

Greenhouse focused on an illuminating–and to lawyers, startling–exchange between Pacific’s lawyer, Mr. Thompson, and Justice Kavanaugh. Kavanaugh referred to a 1956 case that balanced employers’ property rights against union organizing rights, and noted that–under that test–Pacific would “prevail”–it would win its case. The lawyer for Pacific “rejected out of hand” that potential path to victory.

Pacific isn’t interested in just winning its case. It wants to change the law.

The Pacific Legal Foundation doesn’t want a balancing test. It wants a categorical rule — referred to throughout the argument as a “per se rule” — that any entry by a union onto private land, if authorized by the state, is a “taking” of private property in violation of the Fifth Amendment’s Takings Clause (“nor shall private property be taken for public use, without just compensation”). Any entry at all.

“So let me ask you this,” Justice Amy Coney Barrett said to Mr. Thompson. “What if California had a regulation that permitted union organizers to go onto the property of your clients one hour a day, one day a year. Is that a taking subject to the per se rule?”

Yes, the lawyer replied.

Barrett clerked for the late Justice Scalia, who championed an expansion of the categories of government action that count as a “taking.” The Fifth Amendment requires government to compensate property owners for takings, and there has long been an effort to turn regulations–especially environmental regulations–into compensable takings subject to that Amendment.

If you have a wetland on your property and regulations impede your ability to develop it, for example, the government would have to “compensate” you.

Until a 1992 case, Lucas v. South Carolina Coastal Council, courts had defined takings as the physical occupation of private property, usually via eminent domain.

Government actions that didn’t “take” private property in the literal sense, but simply limited its use in certain ways, were regarded as “regulatory takings,” with the private and governmental interests being weighed against one another to determine whether compensation was required…

When a regulation “declares ‘off-limits’ all economically productive or beneficial uses of land,” Justice Scalia wrote for the court, “compensation must be paid to sustain it.”

Ever since, the Pacific Legal Foundation has argued for the adoption of what Scalia called a “categorical” taking.

That was the war that resumed at the Supreme Court this week, and that history explains why, from the Pacific Legal Foundation’s point of view, anything short of total victory is beside the point.

Greenhouse notes that whether the court buys Pacific’s theory will tell us a great deal about the success of McConnell’s effort to refashion the courts.


The Winners Write The History

I get Heather Cox Richardson’s daily letter. Richardson is a history professor, and one of the voices trying to correct the largely incomplete lessons we’ve been taught about how and why we find ourselves where we are.

A couple of days ago, her letter made me think of the old adage about history being written by the victors–something that is evidently as true of policy arguments as it is of warfare.

Richardson was reacting to the mass shootings in Boulder and Atlanta, and she proceeded to lay out a history of gun control in the United States, much of which I had not known.

The Second Amendment to the Constitution is one simple sentence: “A well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed.” There’s not a lot to go on about what the Framers meant, although in their day, to “bear arms” meant to be part of an organized militia.

As the Tennessee Supreme Court wrote in 1840, “A man in the pursuit of deer, elk, and buffaloes might carry his rifle every day for forty years, and yet it would never be said of him that he had borne arms; much less could it be said that a private citizen bears arms because he has a dirk or pistol concealed under his clothes, or a spear in a cane.”

So how did the “original intent” of the Amendment get twisted into a personal right to own weapons? Evidently, thanks to a similar twisting of the NRA’s own mission.

The NRA was established in the late 1800s “to improve the marksmanship skills of American citizens who might be called on to fight in another war, and in part to promote in America the British sport of elite shooting.”

By the 1920s, rifle shooting was a popular American sport. “Riflemen” competed in the Olympics, in colleges and in local, state and national tournaments organized by the NRA… In 1925, when the secretary of the NRA apparently took money from ammunition and arms manufacturers, the organization tossed him out and sued him.

Times have certainly changed.

The early NRA distinguished between law-abiding citizens who should have access to guns, and criminals and mentally ill people who should not. In 1931, it backed federal legislation to limit concealed weapons, prevent possession by criminals, the mentally ill and children, require all dealers to be licensed, and require background checks before delivery. It endorsed the 1934 National Firearms Act, and other gun control legislation.

But in the mid-1970s, a faction in the NRA forced the organization away from sports and toward opposing “gun control.” It formed a political action committee (PAC) in 1975, and two years later elected an organization president who abandoned sporting culture and focused instead on “gun rights.”

Richardson tells us that the NRA “embraced the politics of Movement Conservatism,” a movement opposing business regulations and social welfare programs. Movement Conservatives also embraced the myth of the heroic American cowboy, a White man standing up to the “socialism” of the federal government while dominating Black and Native American people.

In 1972, the Republican platform had called for gun control to restrict the sale of “cheap handguns,” but in 1975, as he geared up to challenge President Gerald R. Ford for the 1976 presidential nomination, Movement Conservative hero Ronald Reagan took a stand against gun control. In 1980, the Republican platform opposed the federal registration of firearms, and the NRA endorsed a presidential candidate—Reagan—for the first time.

After Reagan was shot, the NRA spent millions of dollars fighting the Brady Bill; after it passed, the organization financed lawsuits in nine states to strike it down.

Richardson also points out that until 1959, every single legal article on the Second Amendment concluded it wasn’t intended to guarantee individuals the right to own a gun. In the 1970s, legal scholars funded by the NRA began arguing that the Second Amendment did exactly that.

The organization got its money’s worth. In 2008, in District of Columbia v. Heller, the Supreme Court declared that the Second Amendment protects an individual’s right to keep and bear arms.

The unfettered right to own and carry weapons has come to symbolize the Republican Party’s ideology of individual liberty. Lawmakers and activists have been unable to overcome Republican insistence on gun rights, despite the rise in mass shootings since that new emphasis took hold. Even though 90% of Americans—including nearly 74% of NRA members—recently supported background checks, Republicans have killed such legislation by filibustering it.

The good news is that the NRA is currently imploding. Perhaps the loss of its ability to spend mountains of money will allow Congress to pass responsible gun control legislation–and if it’s no longer the policy “winner,” we may get a more accurate history.


Repeating My Mantra…

People who have read this blog for any length of time are familiar with some of my preoccupations–civic literacy and civics education, climate change, competent governance, and job creation. (Admittedly, I have a lot of “hot buttons”…)

I have been fairly consistent in my approach to most of these issues over the years, but I’ve changed my tune when it comes to growing the economy and creating jobs. I used to be persuaded by the argument that significant increases in the minimum wage would lead to job losses–it seemed logical that forcing a business to pay more to worker A would leave that business with fewer dollars with which to hire worker B. What I didn’t understand was the unspoken caveat: all things being equal. In the real world, it turns out that all things aren’t equal.

What the real world evidence shows is that paying workers a living wage–and thus providing them with a modicum of disposable income–is what creates jobs. As I now understand, demand is what creates jobs, not the beneficence of the factory owner. The guy who owns the widget factory isn’t going to hire more workers to make widgets if no one has the money to buy them.

A recent article in The Week emphasized the point:

For many years, rich oligarchs have posed as the engines of the economy — the entrepreneurs whose beneficence and wise decisions create economic prosperity. In a 2019 article for Fox News, Sally Pipes, president of the right-wing Pacific Research Institute, called for Americans to “celebrate America’s job creators” during Labor Day. “Let’s honor the people responsible for that grandeur — namely, the profit-seeking entrepreneurs and business people who make our economy hum,” she wrote.

This is bunk. The real engine of the economy is the dollars in the pocket of the humble average citizen.

The article goes further, however. Most economists now recognize that putting additional money in the hands of workers stimulates demand, but they tend to think of that demand in the context of a fixed economic capacity–as a mechanism for getting to full employment in existing factories and other enterprises.

In reality, as Skanda Amarnath and Alex Williams argue at Employ America, spending also affects overall capacity. A factory, for instance, is not some immortal thing — at a minimum, it must be continually maintained because of entropy and ordinary wear and tear on equipment. To remain competitive, it must be regularly upgraded with the latest production technologies. But businesses will logically invest in new capacity only if they see a market for the goods and services that capacity would produce. This is especially true with respect to high-tech manufacturing investment, which is very complex and expensive — taking over half a decade to pay off.

Amarnath and Williams argue that slack demand afflicted America’s economy well before the 2008 recession, and that it is only surging again now because of the huge boom in sales of computer products–a boom generated by two things: the pandemic surge in working from home, and government transfers to individuals, also due to the pandemic.

All of the available evidence confirms that giving poorer people more money generates economic growth. When you give rich people more money–through Republican policies like deregulation, union busting and especially the numerous, generous tax cuts so dear to GOP hearts–they disproportionately save it, rather than spending it and boosting the economy.

As the article says, cash in the pockets of the working poor isn’t just good in a humanitarian sense (giving people money they need to live). It’s good because spending those dollars is what will keep businesses humming, investment high, and the economy healthy.


A Female Perspective

I was asked to make a (Zoom) presentation to a group of O’Neill women students, focused on “women and politics.” This is what I said.
____________________________________

I think I have always been a “political” person, in the sense that the question that has always fascinated me is a question that most women wrestle with in one way or another: how should people live together? What sort of social and political arrangements are most likely to nourish our humanity and promote—in Aristotle’s term—human flourishing? If the old African proverb is right, if it “takes a village to raise a child,” what should that village look like, and how should its inhabitants behave? How do we build that kind of village? Politics is the process of turning our answers to those questions into policy—and since women’s answers have been shaped by our life experiences, it is important that women’s voices be part of the policy process.

You have asked me to share my experiences as a professional and political woman, so let me get the biography out of the way. I was born in 1941, and I am very much a product of the 1950s, way before any of you were born. It was a time when women went to college to find a husband, a time when we were expected to be decorative and submissive—or at the very least, quiet. (You can see why I had a problem.)

I grew up in Anderson, Indiana, where being Jewish was at best exotic and at worst, Satanic, and where I was usually the only Jew my classmates had ever encountered. Those experiences undoubtedly deepened my interest in social divisions and the effects of marginalization. They also kindled an ongoing fascination with the ways in which religions shape our worldviews.

I left Anderson for college when I was 16. I wanted to major in liberal arts, but my father insisted that I get a teaching degree, because if my eventual husband died, I would need something to fall back on. At the time, educated women were secretaries, teachers or nurses; I couldn’t type and the sight of blood made me queasy. That left teaching. Because I was so young, my parents sent me to Stephens College for Women, a two-year school that took very seriously its obligation to act in loco parentis. After Stephens, I briefly attended the University of North Carolina, where the most indelible lesson I learned was that when you pay Full Professors $3,000/year, you get what you pay for. (Even in the 1950s, $3,000 wasn’t much.) I transferred to IU Bloomington to finish my undergraduate degree, got married and divorced, and later did a semester at Butler, pursuing an MA in literature that I never finished.

I married a second time and took my first job (well, first if you don’t count the summer I worked for my father’s friend at his—no kidding—Cadillac-Rambler agency, where I was billed as Anderson’s first female used car salesman). I began my adult work life as a high school English teacher. When I became pregnant with my first child, however, I could no longer teach—even though I was married. In those days, once women teachers “showed,” we could no longer be in the classroom. The theory evidently was that the kids would know what we’d been up to…

I went to law school when I was 30 and had three small children (four if you count the husband I had at the time). There were very few women in law school then, and my most important epiphany revolved around the need for potty parity… the few women’s restrooms were for the secretarial staff and inconvenient for students. After graduating law school, I was the first female lawyer hired at what was then Baker and Daniels.

To give you a flavor of the time—serial interviews with prospective associates were conducted by several of the partners, and I was in conversation with two who were being very careful not to ask improper questions—this was barely ten years after creation of the EEOC. Since I had three children, I thought it reasonable to volunteer my childcare arrangements. One of the partners was so obviously relieved that I wasn’t acting like some sort of radical bra-burning feminist, he blurted out: “It isn’t that there’s anything wrong with being a woman. We hired a man with a glass eye once!”

I practiced corporate law for three years, until Bill Hudnut asked me to take charge of the City’s legal department. I was the first woman to serve as Corporation Counsel in Indianapolis–or, to the best of my knowledge, in any major metropolitan area. At the time, Indianapolis had two newspapers. The afternoon paper, the Indianapolis News, had a front-page “gossip” blurb, and I still recall its juicy little item after my appointment was announced: “What high-ranking city official appointed his most recent honey to a prominent position…” I guess it was inconceivable that I’d been appointed because I was a decent lawyer, or even because I represented a constituency Bill was reaching out to. Gotta sell papers…

I left City Hall to be the Republican candidate for Congress in 1980, running against Andy Jacobs, Jr., in what was then Indiana’s 11th Congressional district. That was back when Republicans were still rational, and political campaigns less toxic. I was pro-choice and pro-gay rights, and I won a Republican primary. The worst name I called Andy was Democrat. My youngest son later served as his Congressional page, and after Andy retired, he and I would occasionally have lunch. As I say, things were different then….

I also remarried during that campaign, and I’m happy to report that the third time was the charm—it’s been 41 years and counting.

After losing the election, I practiced law, started a real estate development company that went broke during the recession of the late 1980s, and served six years as the Executive Director of Indiana’s ACLU. I joined IUPUI’s faculty in 1998.

I’ve lived through the women’s movement, the Civil Rights movement, the 60s, the sexual revolution (I missed it by 6 years!), the gay rights movement, the decades of religious zealotry that a friend calls “America’s most recent Great Awakening,” and a dizzying explosion of new technologies. As George Burns once said, I’m so old I remember when the air was clean and sex was dirty.

I became politically active at nineteen, as a Republican. I was persuaded—and remain persuaded—by what has been called the “libertarian principle,” the belief that the best society is one in which individuals are free to set and pursue our own life goals, determine our own telos, so long as we don’t harm the person or property of a non-consenting other, and so long as we are willing to grant an equal right to others. Back then, with some notable exceptions, the GOP understood the importance of “so long as” in those last two caveats. Times, obviously, have changed. The political party to which I belonged no longer exists, except in name.

For those who begin with the libertarian principle as I just shared it, good faith political arguments tend to revolve around the nature and severity of the “harms” that government can legitimately prohibit or regulate, and the extent of government’s obligation to provide a physical and social infrastructure to be paid for through citizens’ “dues,” called taxes. Needless to say, we are not having those good faith arguments today—instead, we are in a culture war– what may well be an existential struggle between science and reason on the one hand, and a variety of fundamentalisms on the other.

Women do not do well in culture wars.

Of the nine books I’ve written, the two that taught me the most—the ones that required the “deepest dives” into our philosophy of government and suggested some answers to Aristotle’s question—were God and Country: America in Red and Blue and my small textbook Talking Politics? What You Need to Know Before You Open Your Mouth.

The research I did for God and Country provided me with a lens through which I’ve come to understand so much of our current political environment. Policymaking has become a power struggle between Puritans who believe government should make the rest of us live “godly” lives, based upon their particular version of what’s godly, and those of us who demand that government act on what John Rawls called “public reasons,” based upon logical persuasion and scientific and empirical understandings. Contemporary Puritans remain deeply antagonistic to the Enlightenment and to secular ways of knowing—especially science—and they utterly reject the notion that each of us gets to define our own morality. Scroll down a Facebook page, or read the comments section of an online newspaper, and you’ll come across posts from fundamentalists of various stripes who wrap themselves in victimhood whenever government fails to impose their preferred worldviews on everyone else. And as most women understand, those preferred worldviews almost always include a “biblically-mandated” submission of women.

Another example is the effort—in Indiana and elsewhere—to exempt so-called “bible-believing Christians” from compliance with otherwise applicable civil rights laws. In our system, religious citizens have absolute liberty to believe whatever they want—that’s the individual rights pole of the continuum. But religious or political beliefs, no matter how sincere, don’t entitle people to sacrifice newborns or bomb abortion clinics, and they don’t entitle them to engage in behavior that is contrary to America’s cultural and legal commitment to civic equality. That’s the public good end of the continuum. There’s no religious privilege to behave in ways that we collectively deem destructive to America’s social health.

Let me just share a final observation: Social justice is a term we don’t hear very often these days. Social justice is aspirational, and its elements are subject to debate, but at its heart, the concept is concerned with mutual obligation and the common good. In its broadest outlines, a just society is one that meets the basic human needs of its members, without regard to their identities, genders or social status—a society that doesn’t draw invidious distinctions between male and female, black and white, gay and straight, religious and atheist, Republican and Democrat, or any of the other categories into which we like to sort our fellow humans. It is a society that recognizes and respects the inherent dignity and value of each person.

We should want to make our society more just for many reasons, practical as well as moral: for one thing, a more equitable society is in the long-term best interests of even those people who don’t feel any obligation to feed hungry children or find jobs for ex-offenders or make health care accessible to poor people. That’s because in order to remain competitive in the global economy, America needs to make use of all its talent. Social systems that prevent people from contributing their talents cost all of us in lost opportunities and unrealized promise.

I’m painfully aware that cultural institutions, folkways and intellectual paradigms influence people far more than logic and reason, and I also know that culture is incredibly difficult to change. Systemic barriers and ingrained privilege don’t disappear without significant upheavals or outright revolutions.

Even more daunting, when I look at today’s politics, I’m reminded of a 1999 movie called “The Sixth Sense.” The young boy in that movie saw dead people. I see crazy people.

If I had to guess why so many of our fellow-citizens appear to have gone off the deep end—why they are trying to stockpile guns, roll back women’s rights, put gays back in the closet, stigmatize African-Americans and stereotype Muslims—I think the answer is fear. Change is creating a very different world from the one most of us grew up in, and the pace of that change continues to accelerate. As a result, we have a lot of bewildered and disoriented people who find themselves in an increasingly ambiguous world; they are frantic for bright lines, clear rules, simple answers to complicated issues, and especially, for someone to blame. People who are unhappy or dissatisfied with their lives evidently need to attribute their problems and disappointments to some nefarious “other.” Black and brown people and “uppity women” are obvious targets.

I have hopes that your generation will be able to reverse this retreat into anti-intellectualism, bigotry and various kinds of fundamentalism. We humans flourish through constant learning, by opening ourselves to new perspectives, by reaching out and learning from those who are different.

And women only flourish in a society that understands that.
