A Perfect Storm

I woke up yesterday to the news that Trump’s Supreme Court–through its “Shadow Docket” and by a five-to-four margin–had effectively overturned what lawyers call “incorporation”–an odd term for the proposition that the Bill of Rights constrains state and local governments.

In a scathing dissent, Justice Sonia Sotomayor wrote: “The court’s order is stunning. Presented with an application to enjoin a flagrantly unconstitutional law engineered to prohibit women from exercising their constitutional rights and evade judicial scrutiny, a majority of Justices have opted to bury their heads in the sand.”

Actually, it’s worse than that. Much worse.

Not only does the Court’s increasing use of the Shadow Docket raise serious questions about the erosion of the judicial transparency fundamental to the rule of law; the decision to allow Texas’ empowerment of culture-war vigilantes also achieves a goal long held by “states’ rights” fundamentalists: a return to the days when state and local lawmakers could impose their preferred “morality” on their citizens–and not-so-incidentally decide which citizens were entitled to equal rights–without the pesky interference of the federal government.

As I noted yesterday, approval of Texas’ ploy opens a door to civil strife far removed from the abortion wars. State legislatures can now turn private citizens into “enforcers” of pretty much any goal–and not just conservative ones. The decision effectively approves a federalism on steroids, and the unraveling of the “United” States.

I used to explain to my students that one of the salutary effects of the incorporation of the Bill of Rights was that it ensured a “floor”–so that when someone moves from New York to Alabama or Texas, they don’t suddenly lose their right to religious liberty or free speech, or their protection against unreasonable search and seizure.

This case strikes a terrifying blow against that principle.

I titled this post “a perfect storm” because the Supreme Court’s abandonment of fifty years of precedent is only one of the truly existential challenges we currently face.

It is no longer possible to pretend that climate change is some sort of elitist, liberal theory that can safely be ignored. Fires in California (now threatening Nevada), increasingly powerful hurricanes not just battering Louisiana but causing flooding and chaos all the way to New England, continuing extinctions threatening to disrupt the global ecology…the list goes on. There are some valiant efforts underway to combat climate change, but the likelihood is that even if those efforts manage to moderate its effects, there will be enormous disruptions of global life–including famines and massive population movements.

Then, of course, there’s the pandemic. Two pandemics, actually–COVID and insanity. The insanity makes it highly likely that COVID won’t be the last disease to decimate populations around the world.

Speaking of insanity, Leonard Pitts reminds us of the rising tide of right-wing violence.

While it’s unlikely we’ll see regional armies clashing as they once did at Antietam and Shiloh, is it so hard to imagine the country descending into a maelstrom of conservative terrorism, the kind of hit-and-run asymmetric warfare — random bombings and shootings — that rocked Iraq and Afghanistan in the early 2000s? Certainly, the weapons and the sense of grievance are there.

On top of all of this, outdated elements of America’s legal architecture are impeding our ability to confront these challenges. In a recent, very important paper (about which I will have much more to say in future posts), Will Wilkinson of the Niskanen Center concluded his analysis of what he calls “The Density Divide” with a recitation of the mismatch between America’s population realities and that legal framework.

As Wilkinson notes, our Constitutional system has a strong small-state bias, “which effectively gives extra votes to topsoil in low-population states.” In a country where 50 percent of voters identify or lean Democratic and 42 percent identify or lean Republican–a Democratic advantage of some 18 million voters– the GOP has erected “an imposing fortification” through gerrymandering, voter ID laws, voter-roll purges…the list goes on.

Wilkinson underscores what many others have said: we desperately need structural reforms and especially strong new legislation protecting voting rights. What he doesn’t say–since his paper was written before the Court’s recent assault on the supremacy of the Constitution–is that such protection must be nationally enforceable.

This “perfect storm” has created a genuinely existential moment. It is no longer possible to ignore the fact that American governance by We the People is teetering on a dangerous edge. The question is: can a nation burdened with a substantial minority of QAnon-believing, MAGA-hat-wearing, Ivermectin-ingesting, Confederacy-loving citizens–many if not most of whom are White, racially resentful rural residents empowered by outdated electoral structures–rise to the challenge?

Comments

A New Social Contract

Time Magazine recently ran an interview with a top global economist, who has authored a book about what humans owe each other–in other words, about a new or perhaps renewed social contract. Several of her concerns mirror my own; as readers of this blog are aware, my last book, Living Together, was focused on the same question.

The modern notion of a social contract was articulated most influentially by John Locke, and his formulation became a foundation of American law and culture. The U.S. Constitution was heavily influenced by Enlightenment philosophers like Locke, who rejected the divine right of kings in favor of a theorized “contract” in which citizens grant government an exclusive right to the exercise of coercive power in return for an obligation to provide for their safety and welfare–the “law and order” required for civilization. Citizens could revoke government’s authority if government failed in its mission or breached the bounds of the contract.

Most European nations have subsequently adopted social contract theories that are considerably more expansive than the version embraced by most Americans. Those versions interpret government’s obligation to provide “social goods” broadly, including access to healthcare.

Several years ago, I collaborated with colleagues on an article intended to probe America’s limited view of the proper role for government in social welfare, and to demonstrate that the Affordable Care Act–and for that matter, single-payer health insurance–really was consistent with Locke’s view of a social contract. (We noted that a deficit of civic knowledge poses a significant barrier to efforts to revisit social contract theory–revisiting a theory is impossible for those who have never visited that theory in the first place.)

Take the contemporary debate over healthcare reform. This fight cannot be understood without recognizing the continued potency of the country’s foundational assumptions, and especially the continued relevance of social contract theory most directly attributed to John Locke. In this paper, we echo arguments made by historians and legal theorists like Daniel Boorstin and Louis Hartz, who noted that Americans who may never have heard of Locke or the Enlightenment have nevertheless internalized Locke’s philosophy in ways that make social inclusion and extensions of the social safety net particularly difficult. In a very real sense, John Locke doomed more comprehensive healthcare reform, at least in the short term, and made it far more difficult to extend unemployment benefits, increase payments under Temporary Assistance for Needy Families (TANF), or raise the minimum wage. If we are to have any success in changing the long-term prospects for these and similar reforms, we will need to go beyond the academic, moral, and fiscal arguments, no matter how persuasive some of us find them, and directly engage the need to update and expand our basic understanding of the social contract.

We were writing during the initial debates over the ACA, which we noted was yet another iteration of America’s deeply embedded conflict between Social Darwinism and the Social Gospel.

No matter how logical or effective, programs requiring extensive government involvement, or that include “mandates” of any sort, trigger an almost visceral reaction in those who tend more to Social Darwinism, a belief that “productive” people’s rights are thereby violated, and that such approaches are contrary to freedom, to “real” Americanism. In other words, at a basic—perhaps unconscious—level, many people believe that government involvement in healthcare, or government intervention via provision of a social safety net, is somehow un-American and therefore must be rejected. It does no good to point out how deeply government is already involved in providing a social safety net through Social Security, or in providing health care in particular (e.g., the Veterans Administration which is the largest integrated health care system in the country serving more than 8.75 million Veterans each year) — the issue is emotional, not factual. The passage of Medicare generated cries of socialism, and the New Deal—even in the midst of the Great Depression—was aggressively opposed. It is the rare social program that hasn’t had to contend with accusations of incipient communism.

Our article explored the reasons for that “emotional” response, and those of you with time and temperament to wade through its scholarship can agree or disagree with our analysis, but I think it is fair to say that the underlying issue has become considerably more salient.

Humans around the globe are faced not just with a pandemic, but with the existential threat posed by climate change.  Individuals are powerless to address those threats. Collective action is required, and government is our mechanism for collective action.

A workable social contract requires government to protect individual autonomy, provide a supportive social infrastructure and take decisive action to protect the common good.

I’m convinced John Locke would agree.

Comments

Return On Investment

I tend to get testy when I hear people intone that government should “be run like a business.” (Granted–I’m testy a lot…) Government is very different from business–its purposes (which do not include a profit motive) are distinct. Not recognizing the substantial differences between governance and enterprise marks those making that facile comment as–at best–uninformed.

That said, there is one concept fundamental to both business plans and investment decisions that should also guide governmental decisions: return on investment. Interestingly, however, many of the same folks who want more businesslike governance routinely ignore that calculation.

If I’m purchasing stock in a company, I want evidence that the shares I purchase will appreciate in value–or at least pay dividends. If I am a savvy/mature investor, rather than a gambler playing the market, I understand that such appreciation will likely not be immediate; I will invest “for the long haul.” 

That same calculation ought to determine America’s investments in social spending.  Although appropriate returns on government investment will not and should not be monetary, a number of studies confirm that a surprising number of programs actually do turn a fiscal profit for taxpayers.

Children who have been fed thanks to food stamps grow up into healthier, more productive adults than those who didn’t get enough to eat. That greater productivity means that government eventually recoups much of what it spent on those food stamps–and also saves money due to reduced spending on things like disability payments.

A recent study by Harvard economists found that many programs — especially those focused on children and young adults — made money for taxpayers, when all costs and benefits were factored in.

That’s because they improved the health and education of enrollees and their families, who eventually earned more income, paid more taxes and needed less government assistance over all.

The study, published in The Quarterly Journal of Economics, analyzed 101 government programs. In one way, it was a standard cost/benefit analysis–it looked at the government’s costs and the resulting benefits to the recipients. However, the researchers took an extra step–they calculated the “fiscal externalities: the indirect ways that a program affected the government’s budget.”

In other words, in addition to the upfront costs, they calculated the monetary return on taxpayers’ investment.

Consider one program: health insurance for pregnant women. In the mid-1980s, the federal government allowed states to expand Medicaid eligibility to more low-income pregnant women. Some, but not all, states took up the offer. Increased Medicaid coverage enabled women to receive prenatal care and better obstetric care, and to save on personal medical spending.

For the federal government, the most straightforward fiscal impact of this expanded coverage was increased spending on health insurance. The indirect fiscal effects were more complex, and could easily be overlooked, but they have been enormous.

First, newly eligible women had fewer uninsured medical costs. The federal government picks up part of the tab for the uninsured because it reimburses hospitals for “uncompensated care,” or unpaid bills. Thus, this saved the government some money. On the other hand, some of the women stopped working, probably because they no longer needed employer-provided private health insurance, and this cost the government money.

But the biggest indirect effects were not apparent until children born to the Medicaid-covered women became adults. As shown in a study by Sarah Miller at the University of Michigan and Laura Wherry at the University of California, Los Angeles, those second-generation beneficiaries were healthier in adulthood, with fewer hospitalizations. The government saved money because it would have paid for part of those hospital bills. The now-adult beneficiaries had more education and earned more money than people in similar situations whose mothers did not get Medicaid benefits. That meant higher tax revenue.

Data on other social programs yields similar results. Researchers have found that Medicaid expansion, for example, more than paid for itself, even after accounting for the fact that future benefits are “discounted”–i.e., worth less today. 
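That “discounting” is an ordinary present-value calculation. As a rough sketch (using hypothetical numbers for illustration, not figures from any of the studies cited), here is how a delayed fiscal payoff can be compared with an upfront cost:

```python
def present_value(amount, years_out, discount_rate=0.03):
    """Value today of a payment received years_out years from now,
    at the given annual discount rate."""
    return amount / (1 + discount_rate) ** years_out

# Hypothetical program: $10,000 spent per child today, returning
# $1,000 per year in extra taxes and reduced safety-net spending
# for 30 years, starting 20 years from now (when the child is an adult).
upfront_cost = 10_000
pv_of_returns = sum(present_value(1_000, t) for t in range(20, 50))

# Even after heavy discounting, the delayed returns in this
# illustration exceed the upfront cost.
breaks_even = pv_of_returns > upfront_cost
```

The point of the sketch is simply that a dollar received decades from now is worth less than a dollar today, yet a long enough stream of such dollars can still repay the original outlay, which is the shape of the result the researchers report.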

Businesspeople understand that it usually takes time to realize profit. With government social programs, too, the fiscal “payoff” generally is delayed. That doesn’t mean it is less substantial or less real. In the cited study, 16 of the social policies that the researchers examined either broke even or actually made a profit. 

I’m certainly not suggesting that government programs be limited to those with a positive financial return–government is most definitely not a business. I am suggesting, however, that we consider government social programs investments–and that the returns on those investments aren’t limited to improving the safety and security of the communities in which we all live, sufficient as that return would be. In many cases, taxpayers also get a positive monetary return on investment.

Just like well-run businesses.

Comments

This Isn’t Dunkirk

Longtime readers of this blog know that I rarely, if ever, post about foreign policy. There’s a reason for that–I am uninformed about most aspects of such policies, and I am deeply conflicted about America’s obligations vis-à-vis purely humanitarian concerns.

When it comes to warfare, I mostly agree with those who insist we should keep our cotton-pickin’ hands off unless there is a very clear American interest to be protected, or a humanitarian crisis of significant proportions that we are actually in a position to ameliorate. I will readily admit that the definition of American interests and the nature and extent of humanitarian crises are matters of considerable debate.

If I had been the person determining the parameters of America’s intervention in Afghanistan, I would have approved an initial intervention to root out Al Qaida and “get” Osama Bin Laden–but not the slog of the subsequent 18 years, during which we wasted trillions of dollars–not to mention the lives of thousands of soldiers and civilians.

But here we are.

President Biden has made what I consider the absolutely correct call–and the media and self-styled pundits, abetted by deeply dishonest Republicans sensing political advantage, are having a field day attacking him for, among other things, recognizing and admitting the obvious.

I think that Michael Moore, of all people, has it right in the following paragraphs. (I say “of all people” because I tend to find Moore tiresome–you usually know precisely what he’ll say because, like far too many people, he approaches all issues through an unshakable, pre-defined lens. Sometimes, of course, like that “stopped clock” he’s right; sometimes, not so much.)

In this case, I think he is “on point.” In his recent letter, Moore wrote about our departure from Afghanistan:

There is nothing here to celebrate. This should only be a monumental gut-check moment of serious reflection and a desire to seek redemption for ourselves. We don’t need to spend a single minute right now analyzing how Biden has or has not messed up while bravely handling the end of this mess he was handed — including his incredible private negotiations all this week with the Taliban leaders to ensure that not a single enemy combatant from the occupying force (that would be us; e.g., U.S. soldiers and spies and embassy staff) will be harmed. And Biden so far has gotten every American and foreign journalist out alive, plus a promise from the Taliban that those who stay to cover it will not be harmed. And not a single one has! Usually a force like the Taliban rushes in killing every enemy in sight. That has not happened! And we will learn that it was because of the negotiating skills and smarts of the Biden team that there was no mass slaughter. This is not Dunkirk.

Dozens of planes have safely taken off all week — and not one of them has been shot down. None of our troops in this chaotic situation have been killed. Despite the breathless shrieks of panic from maleducated journalists who think they’re covering the Taliban of the 1990s (Jake Tapper on CNN keeps making references to “beheadings“ and how girls might be “kidnapped” and “raped” and forced to become “child brides”), none of this seems to be happening. I do not want to hear how we “need to study” what went wrong with this Taliban victory and our evacuation because (switching to all caps because I can’t scream this loud enough): WE ARE NEVER GOING TO FIND OURSELVES IN A SITUATION LIKE THIS AGAIN BECAUSE OUR DAYS OF INVADING AND TAKING OVER COUNTRIES MUST END. RIGHT? RIGHT!!

Unfortunately, we probably will find ourselves in similar situations, because a substantial portion of our citizenry believes we have the right–indeed, the duty–to impose our will around the globe, irrespective of any threat to genuine American interests.

Is our exit from Afghanistan being accomplished smoothly? No. To the extent both the war and the exit were bungled, we’ll need sober analyses of those failures in order to inform future foreign policy decisions. But sober analyses are not what we’re getting–for that matter, even presumably straightforward eyewitness reports of what is occurring “on the ground” are wildly inconsistent. 

If people of good will are truly concerned about the fate of non-Taliban Afghans–especially Afghan women–under a fundamentalist religious regime, what they can and must do is extend a welcome to those who want to emigrate, and work to facilitate their speedy immigration and resettlement.

It is telling–but not surprising– that the monkeys throwing poo in hopes it sticks to the administration are unwilling to do that.

Comments

Maybe The Horse Isn’t Dead Yet…

My friend Morton Marcus–an Indiana columnist who was for many years the Director of the Indiana Business Research Center–used a recent column to weigh in on the plight of local journalism. As he noted, one of the major causes of the decline of local news outlets has been the displacement of private financing “from independent, local entrepreneurs to large corporate chains that ‘trimmed’ costs.”

“Trimmed” is a very nice word for the ferocious and destructive cost-cutting that has virtually killed local news–the very product those outlets were selling.

As Morton noted (I got this in an email, so no link–sorry):

Corporations behave like individuals; they seek to avoid the risks of change and the challenges of diversity. Therefore, editors who accept the risk of divergent views are best removed. Reporters who impede corporate strategy are best discharged. Radio and TV stations are bought and stripped of their distinctive local content.
Given lower costs of production, newspaper and magazine offices, TV and radio stations, housing older equipment, with their associated personnel, become unnecessary drags on profits. A conglomerate can morph an enterprise from news and reasoned commentary into a conveyor of entertainment and sensationalism. “Efficiency” of the corporation often outweighs the quality and nature of the product.

Lest you think Morton’s column was merely another flogging of that “dead horse” along the lines of my post yesterday, you will be happy to learn that he ended with some very good news: the introduction of companion measures in both the House and Senate titled “The Local Journalism Sustainability Act.”

The bill is intended to provide a “pathway to financial viability” for local news produced by newspapers–including all-digital ones–plus television and radio. The mechanism through which this is to be achieved is a combination of three tax credits: a credit aimed at incentivizing subscribers; a credit to provide news outlets an increased ability to hire and retain journalists; and a credit intended to encourage small businesses to advertise in these local news outlets.

The individual credit for subscribers is described as a five-year credit of up to $250 annually, available to individuals who either subscribe to a local newspaper or donate to a nonprofit news organization. It would cover 80% of those costs the first year, and 50% in four subsequent years.
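The arithmetic of that subscriber credit, as described, is simple enough to sketch. (This is an illustration of the mechanism outlined above, using a hypothetical subscription price; it is not drawn from the bill’s actual text.)

```python
def subscriber_credit(annual_cost, year):
    """Annual credit for a local-news subscription, per the description
    above: 80% of costs in year 1, 50% in years 2 through 5, capped at
    $250 per year. Illustrative sketch only, not the statutory formula."""
    if not 1 <= year <= 5:
        return 0.0  # the credit is described as lasting five years
    rate = 0.8 if year == 1 else 0.5
    return min(250.0, rate * annual_cost)

# A $300-per-year subscription: $240 back the first year, then $150 in
# each of the next four years, for $840 over the five-year life.
five_year_total = sum(subscriber_credit(300, y) for y in range(1, 6))
```

Note that the $250 cap binds only for pricier subscriptions; for the $300 example here, the percentage limits are what govern.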

The effort is billed as bipartisan, which–if accurate–should increase its chances of passage.

Will these tax credits work to stem the bleeding? Who knows? I have my concerns about the use of tax incentives, which tend to add to the complexity of America’s tax system, and where “goodies” intended to reward donors can be shielded from the light of day. On the other hand, there are–as I have recently noted–examples of the successful use of such incentives to prompt socially beneficial behaviors.

Perhaps the most significant positive aspect of this effort is that it signals recognition of the problem. If this particular measure doesn’t pass–or fails to stem or reverse the decline of local news–that recognition is a sign that other interventions are likely to be tried.

The importance of that–the importance of agreement over the existence of a problem–is hard to overstate.

There really is no problem we humans cannot address more or less successfully, once there is broad agreement on its existence and nature. We see this most vividly as we confront climate change and regret the years wasted–the years during which we might have avoided what is now unavoidable–because too many people refused to admit the existence and nature of the threat. We are seeing it again in the insistence of right-wingers who refuse to be vaccinated that COVID is a “hoax.”

We can’t solve problems we refuse to see.

What is most heartening about the Local Journalism Sustainability Act is its recognition of the importance of credible, comprehensive local news sources, and the determination to keep that horse alive.

Comments