The angry battles over the propriety of statues commemorating Confederate soldiers, and the somewhat different arguments that regularly erupt over the design of war memorials and the like, are reminders that–as a species–we humans like to erect permanent (or at least, long-lasting) mementos of people and events we consider worth memorializing.
There are probably psychological studies probing the reasons for that evidently widespread impulse, and I thought about what that desire to commemorate might tell us when I read a request from Lester Levine, who regularly comments here. Lester asked me to post the following announcement/invitation to the community that converses (or lurks) on this site.
You are a creative bunch. So, as we all wallow in health and political turmoil, I would like to invite you and anyone you know to deeply immerse hands, minds and souls in an engaging project. It will require minimal artistic skills and “production” time.
In addition to my curmudgeonly comments on this blog, I am Lester Levine, the only person to read all 5,201 entries to the 2003 World Trade Center Memorial Competition. My 2016 book highlights the 100+ most innovative designs submitted. That research forever interested me in the role of memory/memorials in history and culture.
And so, as we approach 9/11/2021, I am struck by how the January 6, 2021 event has also been “named” by its date. Likewise, I can’t help but wonder about the artistic/intellectual challenge of imagining a physical marker for historical memory. It was not war; there were few killed. Yet many think the event was a seminal moment for the United States and all that it stands for.

Announcing the “Remembering 1/6 Design Competition”
Opens 8/17/21
Open to anyone, anywhere
Entry format, rules, questions – [email protected]
Entries due by midnight US Eastern time, 10/22/21, to [email protected]
Judged by creators of innovative 9/11 memorial entries
Winners announced on 12/6/21
I will be interested to see what the people responding consider appropriate “markers” of that very troubling and ominous event.
I just hope that a hundred years or so from now there are still people around to see whatever monument is ultimately erected.
We are all familiar with “Lady Justice”–the statue of a blindfolded woman holding scales, intended to represent the dispassionate weighing of evidence leading to a just result.
The justice system has developed a number of rules governing how the “weighing and measuring” symbolized by those scales occurs. Rules against the consideration of hearsay, for example, are intended to exclude evidence that is essentially gossip–matters for which the person testifying cannot personally vouch.
Paul Ogden shares his concerns about a recent case brought by a woman against singer/songwriter Bob Dylan, alleging that he molested her when she was twelve years old–in 1965.
Let’s think about this.
The “Me Too” movement ushered in a long-deferred accounting of the exploitation of women by men who were often in positions of power and/or privilege. “Me Too” has done a lot of good–but like so many overdue movements, it has had its share of excesses. The state of New York, in a recent effort to protect abused children (male or female), passed the New York Child Victims Act. Among other things, it temporarily expanded by decades the statute of limitations for child victims to bring civil lawsuits. It also protected the identity of those bringing such lawsuits from disclosure–presumably, even the identity of plaintiffs who are now adults.
On the surface, it might seem that allowing individuals much more time to bring a lawsuit would advance justice. But as Paul points out, there are sound reasons for statutes limiting the time periods within which suits can be filed. As he notes, in 1965 “Lyndon Johnson was President, man had not yet stepped on the moon (1969), and seat belts were not yet required in cars (1968).”
As Paul also notes, extending or eliminating statutes of limitations can put accused people at a distinct disadvantage. As time passes, memories fade, witnesses die, and evidence gets lost, destroyed or simply buried by history. Statutes of limitations exist to ensure that claims are litigated while memories are relatively fresh and the evidence needed to prove or disprove those claims is still available.
In his post, Paul lists the specific allegations of the complaint and details the monumental difficulty of proving or disproving those allegations over 50 years later.
We can certainly debate the ideal time period within which lawsuits should be commenced, but declaring “open season” for such suits not only makes the achievement of certainty virtually impossible, it invites all sorts of mischief. Let’s say you were on a date in college, had a bit more to drink than was wise (but not enough to render you insensible), and had consensual sex that you later regretted. As the years pass, you “remember” the incident a bit differently–perhaps as date rape. If your “assailant” comes into a lot of money later in life (perhaps through fame, perhaps through hard work, perhaps through inheritance–whatever), how tempting would it be to use the justice system to confirm your now-sincere but somewhat “adjusted” recollection of the event?
I am absolutely not suggesting that tardy allegations are all–or even mostly–manufactured. I’m sure they aren’t. And I have no idea whether the plaintiff accusing Dylan was actually abused or not. She may well have been.
The point is, after the passage of a certain amount of time, it is absolutely impossible to know.
Achieving justice requires according fundamental fairness to both the accuser and the accused. The rules governing “due process of law” are meant to ensure that injured people get their day in court, and that unfairly accused people have the means to demonstrate their innocence.
Being as fair as possible to both parties means requiring an aggrieved person to sue within a reasonable period of time. Admittedly, what constitutes a reasonable time is debatable.
Remember our prior discussions of the Overton Window? The Overton Window is the name given to the range of policies that are considered politically acceptable by the mainstream population at a particular time. It’s named (duh!) for someone named Overton, who noted that an idea’s political viability depends mainly on whether it falls within that range.
The rightward movement of the Overton Window over the past few decades explains why the hand-wringing of the “chattering classes” over the disappearance of “moderation” is so misplaced.
I have noted previously that in 1980, when I ran for Congress as a Republican, I was frequently accused of being much too conservative. My political philosophy hasn’t changed (although my position on a couple of issues has, shall we say, “matured” as I became more informed about those issues)–and now I am routinely accused of being a pinko/socialist/commie.
My experience is anything but unique. I basically stood still; the Overton Window moved. Significantly.
As the GOP moved from center-right to radical right to semi-fascist, the definition of “moderation” moved with it. America has never had a true leftwing party of the type typical in Europe, but today, anything “left” of insane is labeled either moderate or “librul.” That makes these tiresome screeds about the lack of moderation dangerously misleading.
Here’s how the Los Angeles Times described how the infrastructure bill was passed: “After months of negotiation among President Biden, Democrats and a group of moderate Republicans to forge a compromise, the Senate voted 69 to 30 in favor of the legislation.”
The Times then listed the “ten centrist senators” — five Republicans and five Democrats — who worked to craft the bill: Republicans Rob Portman (OH), Bill Cassidy (LA), Susan Collins (ME), Lisa Murkowski (AK), and Mitt Romney (UT), and Democrats Jeanne Shaheen (NH), Jon Tester (MT), Joe Manchin (WV), Mark Warner (VA), and Kyrsten Sinema (AZ).
But by what stretch of the imagination are Cassidy, Portman, Romney, Collins and Murkowski “moderate” or “centrist” Republicans? None of them are even close to the “center” of America’s ideological spectrum. They all have opposed raising taxes on the wealthy, toughening environmental standards, expanding voting rights, adopting background checks for gun sales and limiting the sale of military-style assault weapons, and other measures that, according to polls, are overwhelmingly popular with the American public.
As the essay points out, there is no longer any overlap between America’s two major parties. There may be some overlap among voters, but not among elected officials.
What we have experienced is what political scientists call “asymmetrical polarization.” Over the past decades, as the scholarly literature and survey research make abundantly clear, Republicans have moved far, far to the right, while Democrats have moved slightly to the left.
Finding a center point between the far right and the center-left may be “splitting the difference,” but only in an alternate universe can it be considered “moderation.”
When I became politically active, people like Michael Gerson were considered quite conservative. But Gerson stopped well short of crazy, and he has been a clear-eyed critic of the GOP’s descent into suicidal politics. Gerson recently considered the spectacle of DeSantis and Abbott, who have been playing to the populist base of today’s Republican Party.
These governors are attempting, of course, to take refuge in principle — the traditional right not to have cloth next to your face, or the sacred right to spread nasty infections to your neighbors. But such “rights” talk is misapplied in this context. The duty to protect public health during a pandemic is, by nature, an aggregate commitment. Success or failure is measured only in a total sum. Incompetence in this area is a fundamental miscarriage of governing. Knowingly taking actions that undermine public health is properly called sabotage, as surely as putting anthrax in the water supply….
The problem for the Republican Party is that one of the central demands of a key interest group is now an act of sociopathic insanity. Some of the most basic measures of public health have suddenly become the political equivalent of gun confiscation. It’s as if the activist wing of the GOP decided that municipal trash pickup is a dangerous socialist experiment. Or chlorine in public pools is an antifa plot. There can be no absolute political right to undermine the health and safety of your community. Or else community has no meaning.
If “moderation” means finding middle ground between sociopathic insanity and common sense, language has lost any ability to inform or communicate. When presumably serious commentators misuse such terminology, it just makes it more difficult to cure–or even understand– our manifest political dysfunctions.
Opponents of (a dramatically-mischaracterized) Critical Race Theory are essentially arguing against the recognition of just how deeply racism has affected American law and culture. They argue–and some undoubtedly believe–that civil rights laws created a level playing field, and that it’s now up to minority folks to stop complaining and make use of their equal opportunities.
The problem with that belief–even if we leave aside the sociological effects of two hundred plus years of history–is that it is wrong.
As a society, we are just beginning to appreciate the extent to which racial animus has been baked into our laws and customs. (I was shocked to read The Color of Law, for example, which documented how deeply the federal government was implicated in redlining and the segregation of America.) Only because I was involved in an effort to modify plans for rebuilding Indiana’s interstates within Indianapolis’ downtown did I become aware of the degree to which the original placement of those highways was the result of racist motives and assumptions.
Fifty-plus years ago, when the interstate system was built, entire neighborhoods were razed to make room for the new roads. Homes, businesses, and urban amenities were destroyed, and the highways became barriers between neighborhoods, cutting people off from job opportunities and retail options.
Subsequent environmental studies have shown that air pollution from highways negatively impacts student outcomes in nearby schools.
All of these negative impacts fell most heavily on Black neighborhoods and businesses, and that was definitely not accidental. As an architect recently wrote in The Washington Post about North Claiborne Avenue, formerly a bustling corridor in New Orleans:
There were many masters on North Claiborne, and Black New Orleanians were the beneficiaries of their talents. There were doctors, lawyers, retailers, insurance agents, teachers, musicians, restaurateurs and other small-business owners. The avenue stretched across the Tremé and 7th Ward neighborhoods, and in the Jim Crow era, it served as the social and financial center of the Black community.
The government tore up the avenue nearly 60 years ago, burying the heart of Tremé and the 7th Ward so the Claiborne Expressway, part of Interstate 10’s transcontinental span, could run through the city. New Orleans wasn’t alone. The same kind of thing happened across the country; Black communities like those in St. Paul, Minn., Orlando, Detroit, Richmond, Baltimore, Oakland, Calif., and Syracuse, N.Y., were leveled or hollowed out to make way for federal highway building.

The Biden administration hopes to use the massive infrastructure bill now working its way through Congress to help remedy the harm done by these hideous scars, to “reconnect neighborhoods cut off by historic investments,” in President Biden’s words. It’s not clear how much of the trillion dollars that lawmakers are contemplating will actually make it to places like North Claiborne. But those places aren’t just abstract line items in a budget resolution to people like me; they’re lived realities — vivid examples of how racist planning destroyed communities of color in America.
Our aging infrastructure now requires repair and replacement, and a number of cities have recognized the harms done by those original siting decisions. They have also recognized how racist assumptions–and all too often, conscious racial animus–prompted those decisions, and have moved to ameliorate them. (Indiana’s DOT, it will not surprise you to learn, has thus far resisted similar efforts to fundamentally redesign those highways and reconnect neighborhoods.)
There are numerous reasons to rethink the country’s interstates, and most of those reasons have nothing to do with race. City centers have changed, historic districts have proliferated, we know more about the negative effects of highway pollution, etc. But we also shouldn’t forget why so many of those highways were built where they were.
As the author of the Post essay concluded:
I do not understand why we can’t look at these infrastructure relics the way we look at monuments to white supremacy, such as statues of Confederate heroes and obelisks apotheosizing the Lost Cause. The statues are hurtful reminders of the times when Black people and Native Americans were seen as commodities or nuisances that needed removal. But urban highways are more than a reminder; they continuously inflict economic, social and environmental pain on neighborhoods like mine. Like other monuments to racism, they must be removed. The nation has a chance to support the rebuilding of disenfranchised and fractured communities and make them whole. It won’t be easy, but I hope we will seize the moment.
We don’t look at highways as monuments to White Supremacy, because we don’t know–and haven’t learned–how White Supremacy influenced, and indeed determined, their placement. It’s just one more aspect of our current society whose origins we prefer not to understand.
In a recent column in the New York Times, Farhad Manjoo asked a question that has preoccupied me for several years: given the wide diversity of global humanity–given the sheer numbers of humans who coexist on this planet with drastically different beliefs, personalities, experiences and cultural conditioning–are genuine co-operation and a measure of community even possible?
As Manjoo puts it,
What if we’ve hit the limit of our capacity to get along? I don’t mean in the Mister Rogers way. I’m not talking about the tenor of our politics. My concern is more fundamental: Are we capable as a species of coordinating our actions at a scale necessary to address the most dire problems we face?
Because, I mean, look at us. With the Covid-19 pandemic and climate change, humanity is contending with global, collective threats. But for both, our response has been bogged down less by a lack of ideas or invention than by a failure to align our actions as groups, either within nations or as a world community. We had little trouble producing effective vaccines against this scourge in record time — but how much does that matter if we can’t get it to most of the world’s people and if even those who have access to the shots won’t bother?
As Manjoo points out, most of the multiple ways in which we are inter-related and interdependent aren’t immediately evident. (As he says, the way deforestation in the Amazon rainforest affects sea levels in Florida isn’t exactly obvious to the man on the street). But as he also notes, quite properly, the threat posed by the pandemic is another matter. Or at least, it should be.
Sometimes, though, our fates are so obviously intertwined, you want to scream. Vaccines work best when most of us get them. Either we all patch up this sinking ship or we all go down together. But what if lots of passengers insist the ship’s not sinking and the repairs are a scam? Or the richest passengers stockpile the rations? And the captain doesn’t trust the navigator and the navigator keeps changing her mind and the passengers keep assaulting the crew?
Can we ever put the common good of humanity above our individual and tribal commitments?
Research suggests that we humans do have the capacity to come together. Manjoo refers to groundbreaking work by Indiana University’s own Elinor Ostrom, a Nobel winner. Ostrom’s research went a long way toward debunking widespread belief in the “tragedy of the commons,” and showed “countless examples of people coming together to create rules and institutions to manage common resources.” Despite an enormous amount of neocon and rational-choice propaganda to the contrary, Ostrom demonstrated that most people aren’t profit-maximizing automatons– that humans really can act on behalf of the common good, even when that action requires personal sacrifice.
However, Ostrom also understood that the nurturing of community requires institutions supportive of the common good. And therein lies my own concern.
Someone–I no longer recall who–said “It’s the culture, stupid.” What far too few of us seem to recognize about human society is the absolutely critical role that is played by culture and paradigms/worldviews–widespread assumptions about “the way things are” supported by embedded systems and institutions and habits of socialization.
What we desperately need are institutions supporting a culture that facilitates an appropriate balance between “I” and “we”–an overarching construct that enables each individual to pursue his or her own idiosyncratic telos while still being supportive of a strong community–and a recognition of how capacious our understanding of community must be.
The tribes fighting it out today are grounded in race, religion, and other (essentially superficial) markers of human identity. At some point–and thanks to the existential threat posed by climate change, we may be at that point–we need to redefine “we” as the human race. At a minimum, we need to come together to do those things that are necessary to keep the planet we inhabit capable of sustaining human life.
At some point, we need to realize that humanity is our tribe.
Maybe it’s because I’m old, maybe it’s because I’ve seen too many instances of people who are bound and determined to pursue obviously destructive paths, but I worry that too many of us have lost the ability to see beyond “I” to “we” and to envision a healthy balance between the two.
The pandemic and climate change are tests, and so far, at least, we’re failing.