God And Country, Redux

In 2007, I wrote a book titled God and Country: America in Red and Blue. It explored a question that had preoccupied me for years: how do religiously inculcated world-views affect our political behaviors? I was–and remain–convinced that a number of ostensibly secular policy positions are (consciously or unconsciously) rooted in religious ways of seeing the world.

In order to examine the religious roots of America’s cultural and policy divisions, I needed to do a lot of research. I was–and am–far from well-versed in my own tradition, which is Judaism, and I knew little or nothing of the 2,000-plus Christian denominations in the U.S., or of how religious beliefs affect socialization. Writing the book required a “deep dive,” and I remain very grateful to Christian friends–including a couple of clergy members (you guys know who you are!)–who patiently read drafts and checked my conclusions.

Those conclusions are detailed in the book (which is still available), and it is not my intent to recite them here. I mention that rather extensive research because it is the background with which I approached a recent column by John Pavlovitz and a New York Times guest essay about America’s rapidly growing secularism.

Pavlovitz is a writer, pastor, and activist from North Carolina, and a favorite among my Facebook friends, who share his posts rather frequently. He’s what I consider a “real” Christian (granted, deciding who is “real” is pretty arrogant coming from a non-Christian…). The column was titled “How You Know if You Have the Wrong Religion,” and what struck me was that his message–with which I entirely agreed–addressed the longstanding divide between faith and works. (Traditional Christian denominations are typically concerned with belief; Judaism prioritizes works.)

Growing up and later ministering in the Church, the elemental heart of spiritual community was the stated or implicit sense that we alone had cracked the God code; that we’d figured out what every other faith tradition (and many communities within our tradition) had not. Evangelism was less about sharing God’s love with the world around us but about getting the world to be as enlightened as we were by completely agreeing with us.

Believing the right thing was everything. The world was sharply divided between the saved and the damned and the greatest imaginable sin was to reject that idea. And it wasn’t enough to believe in God, you had to believe in the correct God, adopt the correct doctrine, and pray the correct prayers—or else your sincerity or judgment (not to mention, your eternal destination) were questioned.

Pavlovitz isn’t the only critic of those ostentatiously pious believers whose faith never quite translates into good works or even loving-kindness. There’s significant research suggesting that the growing exodus from churches and organized religion is a reaction to precisely that form of religiosity.

And that brings me to a New York Times guest essay by a Baptist pastor who is also a college professor. After charting the steady decline in American religiosity since 1988, he reports:

Today, scholars are finding that by almost any metric they use to measure religiosity, younger generations are much more secular than their parents or grandparents. In responses to survey questions, over 40 percent of the youngest Americans claim no religious affiliation, and just a quarter say they attend religious services weekly or more.

The partisan implication of that statistic, which he duly notes, is a reduction of support for the Republican Party, which is heavily dependent upon religiously observant Christians, including but not limited to Evangelicals. As he also points out,  however, Democrats will have to balance policy priorities “between the concerns of the politically liberal Nones and the more traditional social positions espoused by groups like Black and mainline Protestants.” 

Whatever the partisan consequences, Christians like Pavlovitz are offering a way forward that would significantly reduce  today’s religious tribalism–and ultimately, redefine what counts as genuinely religious.

If you claim to be a “God and Country” Bible-believing Evangelical, great. But if you have contempt for immigrants or bristle at white privilege or oppose safeguards in a pandemic, your Christianity is ineffectual at best and at worst, it’s toxic. You might want to rethink something.

If you believe that because you prayed a magic prayer to accept Jesus at summer camp when you were 13, you can inflict any kind of adult damage to the people and the world around you and you’ll still be golden, while gentle, loving, benevolent atheists and Muslims go to hell—you’re doing religion wrong.

So many of America’s problems stem from “doing religion wrong”…


How “Woke” Is Academia, Really?

We Americans harbor all sorts of prejudices about all sorts of things.

One of the problems with racism, anti-Semitism, and similar tribal bigotries is that such attitudes ignore individual differences. When you criticize “those people” (whoever “those people” are in your particular worldview), you inevitably sweep with far too broad a brush.

That impulse–to generalize from anecdotal experiences–isn’t limited to our attitudes about marginalized racial or ethnic groups. It has become a ubiquitous–and extremely annoying– element of America’s polarized political world-views. There is, for example, a widespread–and dangerously oversimplified–belief that America’s universities are bubbling cauldrons of “woke” indoctrination. That charge has become part of the Republican Party’s current war on evidence, science, and accurate history.

Before I retired from my position on a university faculty, I was periodically accused of being part of a liberal “brainwashing” enterprise by people who clearly knew very little about academic life, my particular campus, or –for that matter–the huge differences around the country among institutions of higher education.

I was reminded of those discussions when I read a rant on the subject that had been posted to Daily Kos.

The posted diatribe was triggered by a televised exchange between Andrew Sullivan and Bill Maher on the latter’s show, in which the two of them decried the “wokeness” of today’s colleges and universities.

I have likely spoken at more colleges in the past 15 years than Bill Maher and Andrew Sullivan put together. The difference is that they speak at select elite colleges and I speak everywhere else. For instance, next week I speak at Hastings College in Nebraska.

Most colleges really aren’t that woke. In fact, being too liberal when I speak is always in the back of my mind. For example, eleven years ago I spoke at New Mexico Tech. From my perspective, it was one of my best nights as a speaker. I performed two shows in a full theater and received a standing ovation. Nevertheless, the woman who booked me refuses to ever bring me back, because a few people walked out and complained. I actually noticed the walkouts, and they did it right after a section in my show where I talked about global warming and childish Republicans who renamed French fries “Freedom fries” and French toast “Freedom toast” in the congressional cafeterias to protest France not participating in the Iraq War.

Additionally, one religious college politely asked me not to speak about evolution, and another booked me under the condition that I not speak out against Donald Trump. Other colleges outright won’t book me because I’m too liberal (i.e. woke). That’s okay. I’m not going to whine about it. I mention it only because people like Bill Maher and Andrew Sullivan are clueless about what life is like in the real world.

He’s right.

According to the National Center for Education Statistics, there were 3,982 degree-granting postsecondary institutions in the U.S. as of the 2019-2020 school year. Believe me–and believe the author of the quoted rant–very few of them look like Harvard or Berkeley.

One of the numerous faculty committees on which I served was the admissions committee, where we reviewed applications for entry into our graduate programs. Those applications were frequently submitted by students from undergraduate institutions I’d never heard of.

When my husband and I are driving out of state, we constantly pass road signs announcing that such-and-such town is the home of such-and-such local college. These are almost always schools that– despite the fact I was “in the field”– I’d never heard of.

The sheer number of these small and often-struggling schools is intriguing; I sometimes wonder about their ability to attract competent instructors, the focus and breadth of their curricula, and the career prospects they offer their graduates.

It is likely that  the quality of these institutions varies widely. I would bet good money, though, that very few of them are “woke.” (To the extent that they are “indoctrinating,” it is likely to be with denominational religious views–and those are most unlikely to be “liberal.”)

Judging all post-secondary schools as if they were alike rests on the same fallacy that characterizes racial and religious bigotry–the notion that all Blacks or Jews or Muslims or immigrants–or Republicans or Democrats–are alike and interchangeable. One of the many, many defects of our current media environment is its tendency to find an example of something–generally an extreme example–and suggest that it represents an entire category that we should either embrace or reject.

Reality is more complicated than prejudice admits.


Memorializing History?

The angry battles over the propriety of statues commemorating Confederate soldiers, and the somewhat different arguments that regularly erupt over the design of war memorials and the like, are reminders that–as a species–we humans like to erect permanent (or at least, long-lasting) mementos of people and events we consider worth memorializing.

There are probably psychological studies probing the reasons for that evidently widespread impulse, and I thought about what that desire to commemorate might tell us when I read a request from Lester Levine, who regularly comments here. Lester asked me to post the following announcement/invitation to the community that converses (or lurks) on this site.

You are a creative bunch. So, as we all wallow in health and political turmoil, I would like to invite you and anyone you know to deeply immerse hands, minds and souls in an engaging project. It will require minimal artistic skills and “production” time.

In addition to my curmudgeonly comments on this blog, I am Lester Levine, the only person to read all 5,201 entries to the 2003 World Trade Center Memorial Competition. My 2016 book highlights the 100+ most innovative designs submitted. That research forever interested me in the role of memory/memorials in history and culture.

And so, as we approach 9/11/2021, I am struck by how the January 6, 2021 event has also been “named” by its date. Likewise, I can’t help but wonder about the artistic/intellectual challenge of imagining a physical marker for historical memory. It was not war; there were few killed. Yet, many think the event was a seminal moment for the United States and all that it stands for.
___________________________________________________________
Announcing the “Remembering 1/6 Design Competition”
Opens 8/17/21
Open to anyone, anywhere
Entry format, rules, questions – lester.levine@gmail.com
Entries due by midnight US Eastern time, 10/22/21 to lester.levine@gmail.com
Judged by creators of innovative 9/11 memorial entries
Winners announced on 12/6/21
____________________________________________________________

I will be interested to see what the people responding consider appropriate “markers” of that very troubling and ominous event.

I just hope that a hundred years or so from now there are still people around to see whatever monument is ultimately erected.


The Scales Of Justice

We are all familiar with “Lady Justice”–the statue of a blindfolded woman holding scales, intended to represent the dispassionate weighing of evidence leading to a just result.

The justice system has developed a number of rules governing how the “weighing and measuring” symbolized by those scales occurs. Rules against the consideration of hearsay, for example, are intended to exclude evidence that is essentially gossip–matters for which the person testifying cannot personally vouch.

Most people understand why courts disallow hearsay, or allow cross-examination. The reasons for other rules are less intuitive. As Paul Ogden has recently noted on his own blog, statutes of limitations fall into that latter category.

Ogden shares his concerns about a recent case brought by a woman against singer/songwriter Bob Dylan, alleging that he molested her when she was twelve years old– in 1965. 

Let’s think about this.

The “Me Too” movement ushered in a long-deferred accounting of the exploitation of women by men who were often in positions of power and/or privilege. “Me Too” has done a lot of good–but like so many overdue movements, it has had its share of excesses. The state of New York, in a recent effort to protect abused children (male or female), passed the New York Child Victims Act. Among other things, it temporarily extended, by decades, the statute of limitations within which child victims can bring civil lawsuits. It also protected the identity of those bringing such lawsuits from disclosure–presumably, even the identity of plaintiffs who are now adults.

On the surface, it might seem that allowing individuals much more time to bring a lawsuit would advance justice. But as Paul points out, there are sound reasons for statutes limiting the time periods within which suits can be filed. As he notes, in 1965 “Lyndon Johnson was President, man had not yet stepped on the moon (1969), and seat belts were not yet required in cars (1968).”

As Paul also notes, extending or eliminating statutes of limitations can put accused people at a distinct disadvantage. As time passes, memories fade, witnesses die, and evidence gets lost, destroyed, or simply buried by history. Statutes of limitations exist to ensure that claims are litigated while the evidence is relatively fresh and the material needed to prove or disprove them is still available.

In his post, Paul lists the specific allegations of the complaint and details the monumental difficulty of proving or disproving those allegations over 50 years later.

We can certainly debate the ideal time period within which lawsuits should be commenced, but declaring “open season” for such suits not only makes the achievement of certainty virtually impossible, it invites all sorts of mischief. Let’s say you were on a date in college, had a bit more to drink than was wise (but not enough to render you insensible), and had consensual sex that you later regretted. As the years pass, you “remember” the incident a bit differently–perhaps as date rape. If your “assailant” comes into a lot of money later in life (perhaps through fame, perhaps through hard work, perhaps through inheritance–whatever), how tempting would it be to use the justice system to confirm your now-sincere but somewhat “adjusted” recollection of the event?

I am absolutely not suggesting that tardy allegations are all–or even mostly– manufactured. I’m sure they aren’t. And I have no idea whether the plaintiff accusing Dylan was actually abused or not. She may well have been.

The point is, after the passage of a certain amount of time, it is absolutely impossible to know. 

Achieving justice requires according fundamental fairness to both the accuser and the accused. The rules governing “due process of law” are meant to ensure that injured people get their day in court, and that unfairly accused people have the means to demonstrate their innocence. 

Being as fair as possible to both parties means requiring an aggrieved person to sue within a reasonable period of time. Admittedly, what constitutes a reasonable time is debatable.

Research has shown that memory can be unreliable at pretty much any time, but requiring that litigation be pursued while witnesses are still likely to be alive and probative evidence is still likely to be obtainable seems only fair.

It’s the process that is “due.”


Defining Moderation

Remember our prior discussions of the Overton Window? The Overton Window is the name given to the range of policies that are considered politically acceptable by the mainstream population at a particular time. It’s named (duh!) for someone named Overton, who noted that an idea’s political viability depends mainly on whether it falls within that range.

The rightward movement of the Overton Window over the past few decades explains why the hand-wringing of the “chattering classes” over the disappearance of “moderation” is so misplaced.

I have noted previously that in 1980, when I ran for Congress as a Republican, I was frequently accused of being much too conservative. My political philosophy hasn’t changed (although my position on a couple of issues has, shall we say, “matured” as I became more informed about those issues)–and now I am routinely accused of being a pinko/socialist/commie.

My experience is anything but unique. I basically stood still; the Overton Window moved. Significantly.

As the GOP moved from center-right to radical right to semi-fascist, the definition of “moderation” moved with it. America has never had a true leftwing party of the type typical in Europe, but today, anything “left” of insane is labeled either moderate or “librul.” That makes these tiresome screeds about the lack of moderation dangerously misleading.

As Peter Dreier recently wrote at Talking Points Memo (behind the TPM paywall),

Here’s how the Los Angeles Times described how the infrastructure bill was passed: “After months of negotiation among President Biden, Democrats and a group of moderate Republicans to forge a compromise, the Senate voted 69 to 30 in favor of the legislation.”

The Times then listed the “ten centrist senators” — five Republicans and five Democrats — who worked to craft the bill: Republicans Rob Portman (OH), Bill Cassidy (LA), Susan Collins (ME), Lisa Murkowski (AK), and Mitt Romney (UT) and Democrats Jeanne Shaheen (NH), Jon Tester (MT), Joe Manchin (WV), Mark Warner (VA), and Sinema.

But by what stretch of the imagination are Cassidy, Portman, Romney, Collins and Murkowski “moderate” or “centrist” Republicans? None of them are even close to the “center” of America’s ideological spectrum. They all have opposed raising taxes on the wealthy, toughening environmental standards, expanding voting rights, adopting background checks for gun sales and limiting the sale of military-style assault weapons, and other measures that, according to polls, are overwhelmingly popular with the American public.

As the essay points out, there is no longer any overlap between America’s two major parties. There may be some overlap among voters, but not among elected officials.

What we have experienced is what political scientists call “asymmetrical polarization.” Over the past decades, as the scholarly literature and survey research make abundantly clear, Republicans have moved far, far to the right, while Democrats have moved slightly to the left.

Finding a center point between the far right and the center-left may be “splitting the difference,” but only in an alternate universe can it be considered “moderation.”

When I became politically active, people like Michael Gerson were considered quite conservative. But Gerson stopped well short of crazy, and he has been a clear-eyed critic of the GOP’s descent into suicidal politics. Gerson recently considered the spectacle of DeSantis and Abbott, who have been playing to the populist base of today’s Republican Party.

These governors are attempting, of course, to take refuge in principle — the traditional right not to have cloth next to your face, or the sacred right to spread nasty infections to your neighbors. But such “rights” talk is misapplied in this context. The duty to protect public health during a pandemic is, by nature, an aggregate commitment. Success or failure is measured only in a total sum. Incompetence in this area is a fundamental miscarriage of governing. Knowingly taking actions that undermine public health is properly called sabotage, as surely as putting anthrax in the water supply….

The problem for the Republican Party is that one of the central demands of a key interest group is now an act of sociopathic insanity. Some of the most basic measures of public health have suddenly become the political equivalent of gun confiscation. It’s as if the activist wing of the GOP decided that municipal trash pickup is a dangerous socialist experiment. Or chlorine in public pools is an antifa plot. There can be no absolute political right to undermine the health and safety of your community. Or else community has no meaning.

If “moderation” means finding middle ground between sociopathic insanity and common sense, language has lost any ability to inform or communicate. When presumably serious commentators misuse such terminology, it just makes it more difficult to cure–or even understand– our manifest political dysfunctions.
