About Those Statues…

In the last couple of days, I’ve gotten two messages from friends in different (Northern) states who are troubled about the efforts to remove statues of Civil War figures. 

Here’s the first:

I am in a quandary. I am an educated, white, privileged male.  I can understand, but not empathize with, the thoughts of those who wish to see the statues of Confederate officers removed.  As an English major, I also see the statues as art.  So what is next? Paintings, then books? Are the Holocaust museum displays too emotional, the paintings at the WWII museum too one-sided, the ceiling of the Sistine Chapel acceptably historical?  And who would decide?  

Shakespeare said a rose by any other name would smell as sweet.  Is Fort Bragg any less offensive to humanity than Fort Sherman?  

I don’t want to get too deep in the weeds with the idea, but I do see the opportunity for a slippery slope.  Maybe it’s just my white, privileged male quandary? 
I look forward to your thoughts.

Here’s the second:

I’ve been thinking a lot about the new wave of dismantling Confederate statues, not displaying the Confederate flag, dropping Gone with the Wind from Netflix, Lady Antebellum changing their name to Lady A, etc. I agree with a lot of this, but I wonder if we’re going too far? Where do we draw the line? I noted on Facebook that Washington and Jefferson were slave owners. Should we tear down their monuments while we’re at it? Is it rewriting history? I would love for you to write a blog about this and help me figure it out!

Both of these individuals are progressive, thoughtful and public-spirited. If they are uncomfortable with removing these monuments and renaming bases, I’m sure many other people are equally conflicted.

Here’s my “take” on the issue:

First of all, I see a profound difference between statues and monuments that honor historical figures, and museum and other displays that educate about those figures. The placement of statues in public places pretty clearly falls into the first category. (In a couple of instances, Confederate statues have been moved to museums rather than destroyed–an implicit recognition of the difference, and in my view, an entirely satisfactory resolution.) With respect to the names of military posts, same thing—we don’t name streets, buildings, etc. for “bad guys,” we reserve naming rights for figures we admire.

Germany doesn’t have statues of Hitler, but German history certainly hasn’t been lost.

The men who fought for the South in the American Civil War were defending slavery– an indefensible system–and they were traitors to their country. We should remember them, but we certainly shouldn’t honor them. (There’s also the fact that most of these monuments were erected long after the war, to signal white resistance to the civil rights movement.)

So I think removing Civil War statues is a relatively easy call. But I understand the concern about “slippery slopes.”

None of the historical figures we admire were perfect people. As the second message notes, Washington and Jefferson (among others) were slaveowners. But we don’t honor them for slave-holding; we honor them for their willingness to risk their “lives, fortunes and sacred honor” to bring a new nation into existence, and for their crafting of the Constitution and Bill of Rights.  

If being a flawed human being was reason to ignore significant contributions made by historical figures, there wouldn’t be many statues. (Maybe Mother Teresa, although who knows? There might be something in her past….)

Before we either defend or dismantle a monument, I think we need to ask why it exists, and what it is that we are honoring.

It’s pretty clear that the only reason there are statues of Robert E. Lee and other Civil War figures is that they were central figures in an uprising–a rebellion– against our country. We are honoring their decision to be traitors, and implicitly sending a message that although they lost, their “cause” was honorable.

In the case of figures like Jefferson, Madison, Washington, et al, we are honoring their undisputed service and the importance of their contributions–and those contributions are clearly worthy of honor.

Anyway–that’s my take on the issue. I welcome the perspectives of my readers.

Comments

When We Talk About “Systems” And “Structures”

Speaking of communication, as I did yesterday–sometimes it’s a good idea to define your terms.

I’m as guilty as anyone–I sometimes use terms without stopping to specify what I mean. In the interests of clarity, today I’m defining three structural labels that I use frequently–terms that identify aspects of America’s political environment that it’s past time to revisit and revise.

Federalism

Federalism is the name given to America’s division of authority among local, state and federal levels of government. That division recognizes realities of governance: state and federal governments have no interest in handing out zoning permits or policing domestic violence disputes, to cite just two examples. Increasingly, however, many original assignments of responsibility are no longer workable. State-level management of elections, for example, was necessary in the age of snail-mail registration and index cards identifying voters; in the computer age, it’s an invitation to chaos and misconduct.

Federalism also facilitates assertions of state sovereignty where there really is none. Federal highway dollars are conditioned on state compliance with federally mandated speed limits, and similar “strings” are attached to almost all of the federal funding that cities and states rely upon. There are also an increasing number of issues, including climate change and pandemics, that must be addressed globally.

Businesses need uniformity in state laws in order to operate efficiently across state lines. Problems with acid rain can’t be solved by municipal ordinance. The internet cannot be controlled by a state legislature–or even by Congress. Even in law enforcement, generally considered the most local of issues, multistate criminal enterprises justify an increased federal presence.
 
The world has changed since the Constitution was drafted. Today, where should authority for governmental responsibilities reside? What should federalism mean in the age of technological connectivity and globalism?

Home Rule

Despite the existence of an Indiana statute labeled “Home Rule,” efforts at self-government by Indiana municipalities are routinely pre-empted by the Indiana Legislature. Just in the past few years, lawmakers have prevented local governments from restricting the use of disposable plastic bags and dictated what modes of public transit cities are permitted to use and tax themselves for. In a particularly ironic ruling, a judge found that the state’s Home Rule statute itself blocked Ft. Wayne’s enforcement of a “good government” ordinance intended to restrict “pay for play” politics. The ordinance would have limited the amount of money owners of a company could give elected officials and still bid on city contracts.

In Indiana, the absence of genuine home rule means that decisions affecting residents of urban areas are routinely made by representatives of suburban and especially rural populations (see gerrymandering), whose grasp of the challenges and realities faced by elected officials in metropolitan areas is limited, at best.

Indiana is not unique. The Brookings Institution has described the extent to which state laws preempt local control over public health, economic, environmental, and social justice policy solutions. In 2019, state lawmakers made it illegal for locally elected officials to enact a plastic bag ban in Tennessee, raise revenues in Oregon, regulate e-cigarettes in Arkansas, establish minimum wages in North Dakota, protect county residents from water and air pollution produced by animal feedlots in Missouri, or protect immigrants from unjust incarceration in Florida.
 
There are clearly issues that should be decided at the state or federal level (see Federalism). Policy debates should center on what those issues are, and state lawmakers should allow local governments to make the decisions that are properly local. Right now, they can’t.
  

Gerrymandering

Every ten years, the Constitution requires that a census be taken; the results are used to reapportion congressional seats and redraw district lines to account for population shifts.

In our federalist system, redistricting is the responsibility of state legislatures. Gerrymandering, or partisan redistricting, occurs when the party that controls a statehouse manipulates district lines to be as favorable as possible to its own electoral prospects. “Packing” creates districts with supermajorities of the opposing party; “cracking” distributes members of the opposing party among several districts to ensure that it doesn’t have a majority in any of them; and “tacking” expands the boundaries of a district to include a desirable group from a neighboring district.

Gerrymandering takes its name from Elbridge Gerry, the Massachusetts governor whose 1812 redistricting map included a district so contorted it was said to resemble a salamander.

Studies have tied gerrymandering to the advantages of incumbency and to partisan rigidity, but by far its most pernicious effect has been the creation of hundreds of Congressional seats that are safe for one party. The resulting lack of competitiveness reduces the incentive to vote or otherwise participate in the political process, because the winner of the district’s dominant party primary is guaranteed to win the general election. Primary voters tend to be more ideologically rigid, and as a result, candidates in safe districts are significantly more likely to run toward the extremes of their respective parties. Gerrymandering is thus a major contributor to partisan polarization.

Thanks to the way gerrymandered districts have been drawn in Indiana, a majority of policymakers in the Statehouse represent predominantly rural areas. As a consequence, state distribution formulas that allocate funding for roads and education significantly favor rural areas over urban ones, and members of Indiana’s General Assembly are more responsive to rural than urban concerns.

When I use these words, this is what I mean.


Communicating?

Communication is hard work in the best of times–and we definitely don’t live in the best of times.

Academics who study communication tend to focus on barriers to understanding like cultural differences and different reactions to “nonverbal” cues and body language. (I await research on how Zoom interactions affect those nonverbal cues…) There’s a whole field of intercultural communication, established back in 1959 by an anthropologist named Edward Hall.

So what–I hear you asking–does any of this abstract scholarly research have to do with the people filling American streets clamoring for justice and change?

A lot, I think. An enormous amount of civic unrest is a result of failure to truly communicate.

A recurring discussion on this blog has focused on the extent to which our inability to understand each other has been rooted in the media environment we currently inhabit. It isn’t simply the propaganda promulgated by talk radio, Fox and Sinclair–it is also the relatively recent, well-meaning but misplaced effort of so-called “Legacy” media to be “balanced,” to be fair, to give even the fringiest points of view respectful treatment. As a result, even bizarre perspectives have been given a patina of respectability. This emphasis on “balance” plays directly into the narrative of the far Right–the publication of the Tom Cotton op-ed by the New York Times is just one recent example. Zuckerberg’s cowardly refusal to fact-check Republican lies on Facebook is another.

Heather Cox Richardson sees signs that such unearned respect may be eroding–and that Trump’s sinking poll results are evidence that he and his enablers are losing the benefits of that unduly deferential narrative.

Even more indicative that the national narrative is changing was the announcement yesterday that James Bennet had resigned as the editorial page editor of the New York Times. Bennet ran an op-ed last Wednesday by Arkansas Senator Tom Cotton titled (by the Times, not by Cotton) “Send in the Troops.” The inflammatory piece blamed “cadres of left-wing radicals like antifa” for an “orgy of violence” during the recent protests and claimed that “outnumbered police officers… bore the brunt of the violence.” Neither of these statements is true, and they clothe a false Republican narrative in what appears to be fact. Cotton’s solution to the protests was to send in the military to restore “law and order,” and he misquoted the Constitution to defend that conclusion.

The kerfuffle over this op-ed seems like it’s more than a normal media skirmish. For more than a century, American media has tried to report facts impartially….

Richardson pegs the start of talk radio to the abandonment of the Fairness Doctrine; it was the beginning of a propaganda barrage with which we are now all too familiar:  white taxpayers under siege by godless women and people of color. Fox News Channel wasn’t far behind. Fox’s greatest success was in equating “fair” with “balanced.”  Other media outlets became defensive; in order to protect themselves against charges that they were biased, they accepted the notion that media must show “both sides.”

Richardson thinks Bennet’s resignation over the Cotton op-ed “marks a shift in the media that has been building for months as newspapers and television chyrons increasingly check political falsehoods in favor of fact-based argument.”

If accurate, that is good news. It would be even better if the Left wasn’t–once again–engaging in communicative suicide.

Richardson is hardly the only commentator expressing frustration over the slogan “defund the police”–a phrase that suggests abolishing police departments. What is actually intended is perfectly reasonable–proponents want to shrink police responsibilities and decrease police budgets, investing instead in the community resources that have lost money as police budgets have exploded–but it is hard to imagine a stupider slogan or a more welcome gift to a GOP desperately trying to change the subject from a pandemic and massive protests.

They may not be able to govern, but one thing Republicans are good at is labeling, at carefully choosing terminology likely to resonate with the majority of voters who are not obsessively following political news or able to “deconstruct” political phrasing. Remember the “death tax”? Remember when “undocumented workers” became “illegal aliens,” the “social safety net” became “socialism,” and “national health care” became “socialized medicine”?

I think Richardson is right that the media is–slowly– jettisoning false equivalency for fact-based objectivity. That’s good news for “team blue”–and not an invitation to muddy the waters with yet another unforced communication error.

When you mean “reform policing,” say “reform policing,” or something similar. Don’t hand Trump a weapon with which to confuse and mislead. Communicate!


Collaboration

This month, the Atlantic published a lengthy article written by Anne Applebaum. It addressed what is perhaps the most difficult-to-understand aspect of our contemporary political reality, what she dubs “collaboration.” Why do some people go along with–or even genuinely support–what they must know to be wrong, or even evil, while others do not?

What’s the difference between Lindsey Graham and Mitt Romney?

Applebaum began the article with a story from Germany, a description of two similar East German officials. One defected; one collaborated. What made the difference?

Separately, each man’s story makes sense. But when examined together, they require some deeper explanation. Until March 1949, Leonhard’s and Wolf’s biographies were strikingly similar. Both grew up inside the Soviet system. Both were educated in Communist ideology, and both had the same values. Both knew that the party was undermining those values. Both knew that the system, allegedly built to promote equality, was deeply unequal, profoundly unfair, and very cruel. Like their counterparts in so many other times and places, both men could plainly see the gap between propaganda and reality. Yet one remained an enthusiastic collaborator, while the other could not bear the betrayal of his ideals. Why?

Applebaum cites a historian, Stanley Hoffmann, for his classification of Nazism’s French collaborators into “voluntary” and “involuntary.” Many people in the latter group had no choice, but Hoffmann sorted “voluntary” collaborators into two categories–those who rationalized collaboration (we have to protect the economy, or preserve French culture)–and the “active ideological collaborators.” These were people who believed that “prewar republican France had been weak or corrupt and hoped that the Nazis would strengthen it, people who admired fascism, and people who admired Hitler.”

Hoffmann’s description of the voluntary collaborators is more than a little relevant to today’s United States.

Hoffmann observed that many of those who became ideological collaborators were landowners and aristocrats, “the cream of the top of the civil service, of the armed forces, of the business community,” people who perceived themselves as part of a natural ruling class that had been unfairly deprived of power under the left-wing governments of France in the 1930s. Equally motivated to collaborate were their polar opposites, the “social misfits and political deviants” who would, in the normal course of events, never have made successful careers of any kind. What brought these groups together was a common conclusion that, whatever they had thought about Germany before June 1940, their political and personal futures would now be improved by aligning themselves with the occupiers.

There is much more in the article that deserves consideration and illuminates political and social realities, and I urge readers to click through and read it in its entirety. But the quoted paragraph could easily be a description of the Americans who continue to support Donald Trump.

It is impossible for any sentient person to observe Trump and conclude that he is fit for office (or even sane). So why does he still maintain the support of roughly 40% of Americans? Hoffmann’s two categories are explanatory: that “natural ruling class” that is being “unfairly deprived of power” describes the educated cohort of white “Christian” males who are mortally offended by the prospect of sharing social dominance with uppity women and people of color. And our Facebook pages and Twitter feeds are full of pictures and videos of the “social deviants”–waving Confederate flags, carrying assault weapons to government buildings to assert their “right” to infect their neighbors, attacking black joggers, and flourishing misspelled placards insulting the “libtards.”

Whatever either group had thought about Trump before November, 2016, they decided that their political and personal futures would now be improved by aligning themselves with him.

Describing the members of both categories is one thing. Figuring out why people become who they are is another–and much harder.

Why do some people grow up to model the virtues society preaches– compassion, empathy and self-reflection (or at the very least, human decency), while others enthusiastically reject and demean those values?

Why do some people work to make a better world, often at considerable risk to their own well-being, while others cheerfully collaborate with evil?


What’s The Same, What’s Different

If you had asked me in, say, 2003–as we were waging war in Iraq–whether I would ever look back on the Presidency of George W. Bush with anything less than disgust, I’d have suggested a mental health checkup. If someone had argued that, in retrospect, Richard Nixon had his good points, I’d have gagged.

But here we are.

George W. wasn’t–as the saying goes–the brightest bulb, and at times his religiosity tended to overcome his fidelity to the Constitution–but he listened to the people around him (granted, several were unfortunate choices), not his “gut,” and his faith was evidently sincere. His official performance left a lot to be desired, but when he left the Oval Office, the country was still standing. (Talk about a low bar–but still…) And he’s been a pretty decent former President.

Nixon was actually smart. True, he was paranoid and racist, but he was really good on environmental policy and worked (unsuccessfully) to improve the social safety net. As Paul Krugman recently wrote:

Donald Trump isn’t Richard Nixon — he’s much, much worse. And America 2020 isn’t America 1970: We’re a better nation in many ways, but our democracy is far more fragile thanks to the utter corruption of the Republican Party.

The Trump-Nixon comparisons are obvious. Like Nixon, Trump has exploited white backlash for political gain. Like Nixon, Trump evidently believes that laws apply only to the little people.

Nixon, however, doesn’t seem to have been a coward. Amid mass demonstrations, he didn’t cower in the MAGAbunker, venturing out only after his minions had gassed peaceful protesters and driven them out of Lafayette Park. Instead, he went out to talk to protesters at the Lincoln Memorial. His behavior was a bit weird, but it wasn’t craven.

 And while his political strategy was cynical and ruthless, Nixon was a smart, hard-working man who took the job of being president seriously.

His policy legacy was surprisingly positive — in particular, he did more than any other president, before or since, to protect the environment. Before Watergate took him down he was working on a plan to expand health insurance coverage that in many ways anticipated Obamacare.

As Krugman–and many others–have pointed out, the most relevant difference between “then” (the 60s) and now is the profound change in the Republican Party and the spinelessness and lack of integrity of the people the GOP has elected. Yes, Trump is a much worse human being than even Richard Nixon; but the real problem lies with his enablers.

Trump’s unfitness for office, his obvious mental illness and intellectual deficits, his authoritarian instincts and racial and religious bigotries have all been on display since he first rode down that ridiculous escalator. But aside from a small band of “Never Trumpers,” today’s Republican Party has been perfectly happy to abandon its purported devotion to the Constitution and the rule of law–not to mention free trade– in return for the power to enrich its donors and appoint judges who will ensure the continued dominance of white Christian males.

The good news is that the GOP is a significantly smaller party than it was in Nixon’s day.  According to Pew,

In Pew Research Center surveys conducted in 2017, 37% of registered voters identified as independents, 33% as Democrats and 26% as Republicans. When the partisan leanings of independents are taken into account, 50% either identify as Democrats or lean Democratic; 42% identify as Republicans or lean Republican.

The 8-percentage-point Democratic advantage in leaned partisan identification is wider than at any point since 2009, and a statistically significant shift since 2016, when Democrats had a 4-point edge (48% to 44%).

As utterly depressing as it is to see 42% of our fellow Americans still claiming allegiance to a political party that has shown itself to be unmoored from its principles and origins–and for that matter, antagonistic to fundamental American values–the fact remains that more people reject the party of white supremacy than embrace it.

Republicans who supported Nixon in the 60s rarely defend him these days. It will be interesting to see how today’s 42% remember their loyalties fifty years from now.

Assuming, of course, that we still have a country (and a planet) when the devastation wrought by this administration clears….
