Where We Are

I’d planned to introduce today’s post with a rundown of either what we’ve narrowly escaped or what comes next after a disappointing midterm. I still don’t know where the results will land us, but the outcome is obviously neither a rout nor the Blue Wave I’d hoped to see.

The good news, as Heather Cox Richardson reminded us yesterday, is that many more Americans today are concerned about our democracy, and determined to reclaim it, than were even paying attention to it in 2016. As she pointed out, we see new organizations, new connections, new voters, and new efforts to remake the country better than it has ever been.

And new efforts to prevent a rightwing populist takeover.

In last Sunday’s New York Times book review, two recent books exploring the decline of democracy investigated “the F word”–fascism.

As the review noted, that epithet used to be reserved for extremist organizations like the Ku Klux Klan and the John Birch Society. No longer. Even mild-mannered Joe Biden has admitted what virtually any person familiar with politics and political history can see: the Republican “MAGA philosophy” is–if not full-on–at least “semi-fascism.”

If we look at the 1920s and ’30s versions of fascism, some things are different but other elements are frighteningly similar. As the reviewer noted, anti-democratic ultranationalism — one definition of fascism — looks different today, but overall, MAGA Republicanism “employs many of the rhetorical tropes of traditional fascist politics.” Those tropes include a focus on racial purity, a proud anti-intellectualism, and especially the invocation of “a mythic past and appeals to blood and soil.”

The two books reviewed by the Times that focused specifically on fascism were “How Fascism Works: The Politics of Us and Them” by Jason Stanley and “Strongmen: Mussolini to the Present” by Ruth Ben-Ghiat. Both authors emphasized the importance of “alternative facts”–the invocation of a mythical past and the absence of a shared factual reality.

The invocation of the past is politically strategic. “It is never the actual past that is fetishized,” Stanley writes. He notes that monuments to the Confederacy were erected long after the Civil War ended in part as propaganda to whitewash the horrors of slavery. Fascists, both authors suggest, want to destabilize the shared sense of reality that is necessary for democratic dialogue. They seek to create what one might call an air of QAnon-like unreality, in which elected officials and government institutions are targets of bizarre claims — including, for instance, that they are covers for child sex-trafficking rings.

And of course–as we have seen in the most recent electoral cycle–there is a constant drumbeat of “othering”–an insistence on the dramatic differences between “us” and “them.”

The classic debate between liberty and equality is distorted by fascists, who see equality as a denial of a natural law whereby some people are inherently more deserving of power than others. For fascists, democracy makes unequal people equal, and tries to equate “them” with “us.” Fascist rhetoric is designed to divide citizens into two distinct classes: the sons and daughters of the soil, who are the true citizens of the nation, and the “other” — the foreign, the rabble, the lawless.

I know my constant insistence on the importance of civic literacy can seem tiresome–the carping of an academic convinced of the supreme importance of her area of “expertise.” But a citizenry unfamiliar with the history of their country and unacquainted with the most basic premises of its system of government is uniquely vulnerable to the distortions that turn one American against another.

Just one example: Voters who don’t understand why the Founders separated Church from State are easy targets for revisionists who deny both the history that impelled that separation and the fact that the language of the First Amendment was intended to erect it. They are receptive to the fascist claim that their God has made them the rightful custodians of the country.

The philosopher Santayana warned that “Those who cannot remember the past are condemned to repeat it.”


The Stakes

I’ve always liked Joe Biden, but the words that come to mind when I think of him are words like “decency” and “competence.” He’s an essentially understated man; unlike with Obama, “eloquence” is not the first word that comes to mind in connection with him.

His speech this week on democracy, however, was nothing if not eloquent– and heartfelt. It was also an accurate and important reminder of where we are right now in this experiment we call America.

I’m linking to the transcript of that speech, and begging you to click through and read it. Completely.

Then vote BLUE NO MATTER WHO.


Are We On Self-Destruct?

I am still mulling over the attack that sent Paul Pelosi to the hospital.

You will note that I have not characterized that vicious assault as an attack “on” Paul Pelosi, because that would be inaccurate. The maniac who invaded the Pelosi home was clearly intent upon finding and injuring or killing Nancy Pelosi. It was only because she wasn’t home that he turned his fury and hammer on her 82-year-old husband.

It’s bad enough that the crazy media outlets have responded by doing what they do–inventing weird and exculpatory stories entirely remote from any evidence whatever. (One “explanation” making the rounds suggests that Nancy Pelosi attacked her husband and the entire episode as reported was a cover-up. Other rightwing fantasies are equally bizarre.) But coverage from the sources we like to believe produce legitimate journalism hasn’t been much better.

As several pundits have reminded us, this was an attempted assassination of the Speaker of the U.S. House–the person who is third in line for the U.S. Presidency. Think about that.

In his newsletter, Robert Hubbell minced no words, asserting that the attack “has struck at the heart of America’s political dysfunction and mass delusion.”

Major media outlets are going out of their way to caution that “the assailant’s motives are unknown” and limiting their description of what occurred to “an attack on Paul Pelosi” without acknowledging that the intended target was the person third-in-line for the presidency of the U.S. Right-wing media is in full conspiracy mode, trafficking in wild and baseless claims that are insulting, defamatory, and offensive to a grieving family and a severely wounded victim. Elon Musk inflamed the situation by tweeting and deleting a bogus “opinion” article from a media outlet known for peddling bizarre conspiracy theories, e.g., that Hillary Clinton died before the 2016 election and her “body double” debated Trump.

Apparently, Elon Musk tweeted a link to an “opinion” piece that was admittedly pure speculation about what “might” have happened. According to Hubbell, Musk deleted the tweet shortly thereafter, “but not before it was exposed to his 120 million followers.”

The damage was done. No amount of truth-telling or retractions by reckless Fox affiliates will overcome the momentum created by Musk’s tweet. See NYTimes, Elon Musk, in a Tweet, Shares Link From Site Known to Publish False News and WaPo, Paul Pelosi attack prompts Elon Musk and political right to spread misinformation.

In short order, Elon Musk and a reckless Fox affiliate converted a near-miss national tragedy into a cesspool of disinformation and delusion. In the process, the Pelosi family is being subjected to a second trauma that may be greater than the original assassination attempt and injuries suffered by Paul Pelosi.

So here we are. An estimated third of American citizens get their “information” from sources so distant from fact and reality that the term “propaganda” seems inadequate. If, as the Founders believed, democratic self-government requires an informed citizenry, the United States is in big trouble.

A commenter to a previous post on the state of our information environment pointed out that the ability to spread disinformation and confusion has grown with each “advance” in communication–newspapers, radio, television, movies, and now the Internet. True. The question we face is: what do we do about it? No serious person wants to abandon the First Amendment–and for that matter, we couldn’t totally suppress manufactured garbage if we tried.

And to be fair, it isn’t just America.

We are at a place in human history where a substantial portion of the population simply cannot cope with the realities, constant changes and uncertainties of modern life. Those humans are a ready-made, eager audience for the purveyors of hate and division–and so long as there is an audience, there will be self-promoters to prey on that audience, either to make money (Alex Jones) or acquire political power (Trump/fill in your favorite example).

My middle son has a theory that the reason we haven’t detected evidence of superior alien civilizations “out there” is because, at a certain point in the evolution of a civilization, it self-destructs. I hope he’s wrong, but the trajectory of humanity right now sure lends weight to that theory.

In less than a week, Americans will go to the polls and choose whether to continue down the path of conspiracy and theocracy–a path that will continue to facilitate the fascist fantasies being spread by Elon Musk, Fox News and their ilk, and will likely signal the end of the American Idea as we have understood it.

Even if we manage to avoid that result, we will be left with a conundrum: what do we do about the prevalence and appeal of invented realities–lies– and the people who believe and act on them?


The Essence Of The Argument

Okay–back to basics.

Morton Marcus and I are currently working on a book examining the causes and effects of women’s legal and social equality.

We understand that the movement toward equality is still a work in progress. We are also well aware that women’s progress has engendered considerable resistance–and that, in a very real way, that progress will be on the ballot November 8th.

As we approach a midterm election that will be crucial for women–not to mention American democracy–it seems appropriate to share some of that book’s relevant analysis. What follows is long, despite the fact that I am breaking those arguments into three posts, and you may wish to skip or skim it, but it represents my understanding of the barriers to women’s equality erected and defended by paternalism, religion and culture.

On November 8th, we will be voting on whether to keep or dismantle those barriers.

I have omitted the footnotes; if you want citations, ask me.

____________

Let’s begin with the obvious: there are genuine biological differences between men and women, and those differences profoundly and understandably shaped human cultures for thousands of years. Over time, science and technology have operated to minimize the social impact of those differences, although the differences themselves remain. In addition to changes in the job market that have made physical strength less important and inventions that significantly reduced the time spent on housework, women can now plan, defer or abstain from procreation without the necessity of celibacy, a reality that allows females to pursue educational and career choices that used to be available exclusively to males. Those choices have facilitated their ability to participate more fully in civic and political life.

Despite those advances, the drive for gender equity in the workplace and polity continues to be hindered by the persistence of attitudes and traditions more appropriate to bygone generations, and especially by religious beliefs that powerfully influence the country’s politics and culture. As the second section of this chapter will explain, a number of religious denominations work assiduously to impose their doctrinal beliefs about women (and what they believe to be the proper, subordinate place of females in society) through legislation applicable to everyone. Those theological positions support and strengthen a cultural patriarchy rooted in history, politics and privilege. As we will see, religious arguments are used to justify the still- significant resistance to women’s personal autonomy—and to motivate the increasingly frantic efforts of the political Right to reverse women’s social, legal and economic progress.

Biology and Destiny
For generations, there have been two major biological impediments to women’s equal participation in society and especially in the workforce: women’s relative lack of physical strength vis-à-vis their male counterparts, and the fact that women get pregnant. Those two realities have exerted a major effect on cultural attitudes about men and women. For a very long time, most jobs required manual labor—and often, brute strength—and most (although not all) females were physically unable to undertake such tasks. Over the years, as technology has improved, the job market has also changed and fewer jobs today require physical strength. An increasing number instead require education, intellect and/or particular skills, qualifications that are more evenly distributed between the genders and even, in some cases, are more likely to be possessed by women.

In 2020, Janet Yellen authored a report for the Brookings Institution that focused on the prior century’s history of women’s employment. As she noted, early in the 20th century, most women in the United States didn’t work outside the home, and the few who did were primarily young and unmarried. A mere 20 percent of all women were “gainful workers,” and only 5 percent of those were married. (Yellen did point out that those statistics understated the economic contributions of married women who worked from home in family businesses and/or in the home production of goods for sale. The statistics also obscured racial difference—African-American women were about twice as likely to participate in the labor force as White women at the time, and were more likely to remain in the labor force after marriage.) When women did work outside the home, it was often taken as evidence that the husband was unwilling or unable to support the household. As a result, men tended to view a wife’s paid employment as a shameful statement on the husband’s role as a breadwinner. As Yellen wrote,

The fact that many women left work upon marriage reflected cultural norms, the nature of the work available to them, and legal strictures. The occupational choices of those young women who did work were severely circumscribed. Most women lacked significant education—and women with little education mostly toiled as piece workers in factories or as domestic workers, jobs that were dirty and often unsafe. Educated women were scarce. Fewer than 2 percent of all 18- to 24-year-olds were enrolled in an institution of higher education, and just one-third of those were women. Such women did not have to perform manual labor, but their choices were likewise constrained.

As a result, as Yellen notes and many of us vividly remember, there was widespread sentiment against women, especially married women, working outside the home. Even in the face of severely limited opportunities, however, increasing numbers of women did continue to enter the labor force during this period. As a result, some 50 percent of single women worked by 1930, as did nearly 12 percent of married women. Mores and social attitudes were slowly changing, partly as a result of what is often referred to as the “first wave” of the women’s movement, which focused on suffrage and (to a lesser extent) temperance, and which culminated in the ratification of the 19th Amendment in 1920, giving women the right to vote.

Between the 1930s and mid-1970s, women’s participation in the economy—especially the participation of married women–continued to rise, spurred by several social changes. The growth of mass high school education was accompanied by a similar rise in graduation rates. New technologies led to an increased demand for clerical workers, and clerical jobs were seen as appropriate for women, because they tended to be cleaner and safer. And while there were still bizarre rules that kept many women out of the labor force—for example, female librarians in most cities could not be married, and female school teachers who became pregnant were dismissed once they “showed”—these restrictions were gradually removed following World War II, although it wasn’t until 1986 that United Airlines was ordered to pay $33 million in back pay and to reinstate 475 flight attendants who had been forced to quit in the mid-1960s because of a no-marriage rule.
By far the most consequential change, however—the development that eliminated the major impediment to women’s full participation in economic and civic life—was the introduction of reliable contraception, primarily although not exclusively the birth control pill.

Before the advent of reliable birth control, every sexual encounter carried the risk of pregnancy, and pregnancy generally meant the end of a woman’s economic independence. A pregnant woman was almost always unemployable; for that matter, a married woman in her childbearing years was similarly unemployable, since there was always the possibility of pregnancy and the resulting need to care for offspring, seen as a uniquely female responsibility. Most women were therefore economically dependent upon the men to whom they were married. (Refusing to marry was no panacea: unmarried women were routinely labeled “old maids,” and were objects of pity and/or derision.) If her marriage was unhappy, or worse, violent, a woman with children was literally enslaved; given the barriers she faced to participation in the workforce and her resulting inability to support herself and her offspring, she usually couldn’t leave. Absent charitable intervention or inherited wealth—or friends or relatives willing to house and feed her and her children—she was totally dependent on her husband’s earnings.

Access to reliable contraception–and in situations where that contraception failed, abortion–was thus absolutely essential to women’s independence. If women could plan when to procreate, they could also plan when not to procreate. They could choose to schedule or defer motherhood in order to pursue education and career opportunities. The availability of the birth control pill didn’t simply liberate millions of women, opening possibilities that had been foreclosed by reasons of biology; its availability and widespread use triggered enormous changes in social attitudes that in turn opened the door to legislation that advanced both women’s economic independence and their ability to participate more fully in the civic life of the nation.

A 2010 article in Forbes marking the fiftieth anniversary of the pill acknowledged its immense significance. The article began by noting the then-current workforce status of women:

For the first time in U.S. history, women have overtaken men in the workplace. More specifically, they’ve overtaken men in professional roles. As of 2009, women represented half of all U.S. workers and are the primary or co-breadwinners in nearly two-thirds of American households. That’s a far cry from 1967, when women made up only one-third of all U.S. workers.

Without the birth control pill, women would almost certainly not have made it into powerful senior positions. While the political and social will to bring a critical mass of women into the workplace was certainly there–the advent of the birth control pill coincided with the second wave of feminism and the fight for equal rights–the pill gave women a tangible tool to level the playing field with men. They no longer had to be mothers first and careerists second. The pill allowed for both their entrance–and ascendance–in the workplace.

To be sure, there’s no denying the pill triggered the sexual revolution for women as well. Because they no longer had to worry about getting pregnant, it freed them up to have sex outside of marriage. But it was the workplace where the pill made its most lasting impact.
Together with women’s new prominence in political and economic life, that sexual revolution, such as it was (the punditry continues to argue about its nature, extent and consequences), ran headlong into what is perhaps the most regressive element of American culture: fundamentalist religion.

Tomorrow: religion and women’s rights



The Problem With “Get Off My Lawn”

Okay–I’m going to indulge in yet another rant. (Then I’ll return this blog to its usual political preoccupations–promise!)

You will have to forgive me for these expressions of bile: as the midterms approach, and my patience and sanity continue to erode, I tend to get annoyed by things I would probably overlook if I were in a less fragile state of mind. But one commenter to this site–who often has very thoughtful and pertinent things to say–has a habit that has sent me over the edge.

That habit? Repeatedly denigrating the younger generation. Characterizing and dismissing all young people as cut from the same “me me” cloth.

You know who you are…

This wholesale dismissal of the younger generation–the tendency of us older folks to shout “get off my lawn!”– has been going on since the time of Socrates. (If you don’t believe me, here’s a compilation of insults directed at young people over the centuries.)

These sweeping denunciations were wrong when they were issued, and they’re wrong now.

First of all, there is really no difference between insisting that all Blacks or all Jews or all White Evangelicals are the same and [fill in your preferred negative label] and insisting that all members of a particular age cohort exhibit [fill in your preferred behavioral insult].  Bigotry isn’t limited to defamation based upon race, religion or sexual orientation.

Secondly, and more substantively, it’s inaccurate–and I don’t say that just because it is demonstrably inapplicable to my own children and grandchildren.

I taught classes filled with young people for 21 years–my students (I usually had a total of anywhere from 60 to 120 in a given semester) ranged in age from 18 to 35, depending on whether they were undergraduates or graduate students. My classrooms were diverse, and my students were pretty representative of their generation–I taught at an urban campus that drew students predominantly from central Indiana. I had some students who came from more privileged backgrounds, but the majority did not. A significant number were the first in their families to attend college.

And while there was some “self-selection” due to our programs preparing students for public and non-profit careers, our largest academic program was public safety–and it attracted mainly would-be police officers. So I feel confident that I saw a pretty good cross-section of young Americans.

I would turn this country over to them in a heartbeat.

Overall, my students were inclusive, caring and community-oriented. I saw very little evidence of bigotry or “me-ism” and considerable evidence of a firm–even passionate– commitment to social justice and legal equality. The papers they wrote for my classes were, overall, thoughtful, and reflected genuine concern for their communities and for the underprivileged people in those communities.

Granted, when students entered my classrooms they rarely came armed with knowledge of the Constitution, Bill of Rights or other aspects of America’s legal structure, but their attitudes had been shaped by what I like to call “The American Idea”–a belief in both individual liberty and civic equality.

And they acted on those commitments. They volunteered and organized. When it comes to political participation, the data confirms that youth turnout has been on the rise; in 2020, it hit 50%, an 11-point increase from 2016.

Recent surveys tell us that 59% of them plan to vote this year.

There is a lot wrong with America right now, and a lot of structural problems that make solving those thorny problems difficult. It’s tempting to look for scapegoats–but it is neither accurate nor helpful to blame an entire generation for the unpleasant or unhelpful behaviors of some of them.

Sorry to pick on a reader I really like, but I feel better now…
