Are We On Self-Destruct?

I am still mulling over the attack that sent Paul Pelosi to the hospital.

You will note that I have not characterized that vicious assault as an attack “on” Paul Pelosi, because that would be inaccurate. The maniac who invaded the Pelosi home was clearly intent upon finding and injuring or killing Nancy Pelosi. It was only because she wasn’t home that he turned his fury and hammer on her 82-year-old husband.

It’s bad enough that the crazier media outlets have responded by doing what they do–inventing weird, exculpatory stories remote from any evidence whatever. (One “explanation” making the rounds suggests that Nancy Pelosi attacked her husband and that the entire episode as reported was a cover-up. Other rightwing fantasies are equally bizarre.) But coverage from the sources we like to believe produce legitimate journalism hasn’t been much better.

As several pundits have reminded us, this was an attempted assassination of the Speaker of the U.S. House–the person who is third in line for the U.S. Presidency. Think about that.

In his newsletter, Robert Hubbell minced no words, asserting that the attack “has struck at the heart of America’s political dysfunction and mass delusion.”

Major media outlets are going out of their way to caution that “the assailant’s motives are unknown” and limiting their description of what occurred to “an attack on Paul Pelosi” without acknowledging that the intended target was the person third-in-line for the presidency of the US. Right-wing media is in full conspiracy mode, trafficking in wild and baseless claims that are insulting, defamatory, and offensive to a grieving family and a severely wounded victim. Elon Musk inflamed the situation by tweeting and deleting a bogus “opinion” article from a media outlet known for peddling bizarre conspiracy theories, e.g., that Hillary Clinton died before the 2016 election and her “body double” debated Trump.

Apparently, Elon Musk tweeted a link to an “opinion” piece that was admittedly pure speculation about what “might” have happened. According to Hubbell, Musk deleted the tweet shortly thereafter, “but not before it was exposed to his 120 million followers.”

The damage was done. No amount of truth-telling or retractions by reckless Fox affiliates will overcome the momentum created by Musk’s tweet. See the NYTimes, “Elon Musk, in a Tweet, Shares Link From Site Known to Publish False News,” and WaPo, “Paul Pelosi attack prompts Elon Musk and political right to spread misinformation.”

In short order, Elon Musk and a reckless Fox affiliate converted a near-miss national tragedy into a cesspool of disinformation and delusion. In the process, the Pelosi family is being subjected to a second trauma that may be greater than the original assassination attempt and injuries suffered by Paul Pelosi.

So here we are. An estimated third of American citizens get their “information” from sources so distant from fact and reality that the term “propaganda” seems inadequate. If, as the Founders believed, democratic self-government requires an informed citizenry, the United States is in big trouble.

A commenter to a previous post on the state of our information environment pointed out that the ability to spread disinformation and confusion has grown with each “advance” in communication–newspapers, radio, television, movies, and now the Internet. True. The question we face is: what do we do about it? No serious person wants to abandon the First Amendment–and for that matter, we couldn’t totally suppress manufactured garbage if we tried.

And to be fair, it isn’t just America.

We are at a place in human history where a substantial portion of the population simply cannot cope with the realities, constant changes and uncertainties of modern life. Those humans are a ready-made, eager audience for the purveyors of hate and division–and so long as there is an audience, there will be self-promoters to prey on that audience, either to make money (Alex Jones) or acquire political power (Trump/fill in your favorite example).

My middle son has a theory that the reason we haven’t detected evidence of superior alien civilizations “out there” is because, at a certain point in the evolution of a civilization, it self-destructs. I hope he’s wrong, but the trajectory of humanity right now sure lends weight to that theory.

In less than a week, Americans will go to the polls and choose whether to continue down the path of conspiracy and theocracy–a path that facilitates the fascist fantasies being spread by Elon Musk, Fox News and their ilk, and that will likely signal the end of the American Idea as we have understood it.

Even if we manage to avoid that result, we will be left with a conundrum: what do we do about the prevalence and appeal of invented realities–lies– and the people who believe and act on them?


The Essence Of The Argument

Okay–back to basics.

Morton Marcus and I are currently working on a book examining the causes and effects of women’s legal and social equality.

We understand that the movement toward equality is still a work in progress. We are also well aware that women’s progress has engendered considerable resistance–and that, in a very real way, that progress will be on the ballot November 8th.

As we approach a midterm election that will be crucial for women–not to mention American democracy–it seems appropriate to share some of that book’s relevant analysis. What follows is long, even though I am breaking the argument into three posts, and you may wish to skip or skim it, but it represents my understanding of the barriers to women’s equality erected and defended by paternalism, religion and culture.

On November 8th, we will be voting on whether to keep or dismantle those barriers.

I have omitted the footnotes; if you want citations, ask me.

____________

Let’s begin with the obvious: there are genuine biological differences between men and women, and those differences profoundly and understandably shaped human cultures for thousands of years. Over time, science and technology have operated to minimize the social impact of those differences, although the differences themselves remain. In addition to changes in the job market that have made physical strength less important and inventions that significantly reduced the time spent on housework, women can now plan, defer or abstain from procreation without the necessity of celibacy, a reality that allows females to pursue educational and career choices that used to be available exclusively to males. Those choices have facilitated their ability to participate more fully in civic and political life.

Despite those advances, the drive for gender equity in the workplace and polity continues to be hindered by the persistence of attitudes and traditions more appropriate to bygone generations, and especially by religious beliefs that powerfully influence the country’s politics and culture. As the second section of this chapter will explain, a number of religious denominations work assiduously to impose their doctrinal beliefs about women (and what they believe to be the proper, subordinate place of females in society) through legislation applicable to everyone. Those theological positions support and strengthen a cultural patriarchy rooted in history, politics and privilege. As we will see, religious arguments are used to justify the still-significant resistance to women’s personal autonomy—and to motivate the increasingly frantic efforts of the political Right to reverse women’s social, legal and economic progress.

Biology and Destiny

For generations, there have been two major biological impediments to women’s equal participation in society and especially in the workforce: women’s relative lack of physical strength vis-à-vis their male counterparts, and the fact that women get pregnant. Those two realities have exerted a major effect on cultural attitudes about men and women. For a very long time, most jobs required manual labor—and often, brute strength—and most (although not all) females were physically unable to undertake such tasks. Over the years, as technology has improved, the job market has also changed and fewer jobs today require physical strength. An increasing number instead require education, intellect and/or particular skills, qualifications that are more evenly distributed between the genders and even, in some cases, are more likely to be possessed by women.

In 2020, Janet Yellen authored a report for the Brookings Institution that focused on the prior century’s history of women’s employment. As she noted, early in the 20th century, most women in the United States didn’t work outside the home, and the few who did were primarily young and unmarried. A mere 20 percent of all women were “gainful workers,” and only 5 percent of those were married. (Yellen did point out that those statistics understated the economic contributions of married women who worked from home in family businesses and/or in the home production of goods for sale. The statistics also obscured racial differences—African-American women were about twice as likely to participate in the labor force as White women at the time, and were more likely to remain in the labor force after marriage.) When women did work outside the home, it was often taken as evidence that the husband was unwilling or unable to support the household. As a result, men tended to view a wife’s paid employment as a shameful statement on the husband’s role as a breadwinner. As Yellen wrote,

The fact that many women left work upon marriage reflected cultural norms, the nature of the work available to them, and legal strictures. The occupational choices of those young women who did work were severely circumscribed. Most women lacked significant education—and women with little education mostly toiled as piece workers in factories or as domestic workers, jobs that were dirty and often unsafe. Educated women were scarce. Fewer than 2 percent of all 18- to 24-year-olds were enrolled in an institution of higher education, and just one-third of those were women. Such women did not have to perform manual labor, but their choices were likewise constrained.

As a result, as Yellen notes and many of us vividly remember, there was widespread sentiment against women, especially married women, working outside the home. Even in the face of severely limited opportunities, however, increasing numbers of women did continue to enter the labor force during this period; by 1930, some 50 percent of single women worked, as did nearly 12 percent of married women. Mores and social attitudes were slowly changing, partly as a result of what is often referred to as the “first wave” of the women’s movement, which focused on suffrage and (to a lesser extent) temperance, and which culminated in the ratification of the 19th Amendment in 1920, giving women the right to vote.

Between the 1930s and mid-1970s, women’s participation in the economy—especially the participation of married women–continued to rise, spurred by several social changes. The growth of mass high school education was accompanied by a similar rise in graduation rates. New technologies led to an increased demand for clerical workers, and clerical jobs were seen as appropriate for women, because they tended to be cleaner and safer. And while there were still bizarre rules that kept many women out of the labor force—for example, female librarians in most cities could not be married, and female school teachers who became pregnant were dismissed once they “showed”—these restrictions were gradually removed following World War II, although it wasn’t until 1986 that United Airlines was ordered to pay $33 million in back pay and to reinstate 475 flight attendants who had been forced to quit in the mid-1960s because of a no-marriage rule.

By far the most consequential change, however—the development that eliminated the major impediment to women’s full participation in economic and civic life—was the introduction of reliable contraception, primarily although not exclusively the birth control pill.

Before the advent of reliable birth control, every sexual encounter carried the risk of pregnancy, and pregnancy generally meant the end of a woman’s economic independence. A pregnant woman was almost always unemployable; for that matter, a married woman in her childbearing years was similarly unemployable, since there was always the possibility of pregnancy and the resulting need to care for offspring, seen as a uniquely female responsibility. Most women were therefore economically dependent upon the men to whom they were married. (Refusing to marry was no panacea: unmarried women were routinely labeled “old maids,” and were objects of pity and/or derision.) If her marriage was unhappy, or worse, violent, a woman with children was effectively trapped; given the barriers she faced to participation in the workforce and her resulting inability to support herself and her offspring, she usually couldn’t leave. Absent charitable intervention or inherited wealth—or friends or relatives willing to house and feed her and her children—she was totally dependent on her husband’s earnings.

Access to reliable contraception–and in situations where that contraception failed, abortion—was thus absolutely essential to women’s independence. If women could plan when to procreate, they could also plan when not to procreate. They could choose to schedule or defer motherhood in order to pursue education and career opportunities. The birth control pill didn’t simply liberate millions of women by opening possibilities that biology had foreclosed; its availability and widespread use also triggered enormous changes in social attitudes that in turn opened the door to legislation advancing both women’s economic independence and their fuller participation in the civic life of the nation.

A 2010 article in Forbes marking the fiftieth anniversary of the pill acknowledged its immense significance. The article began by noting the then-current workforce status of women:

For the first time in U.S. history, women have overtaken men in the workplace. More specifically, they’ve overtaken men in professional roles. As of 2009, women represented half of all U.S. workers and are the primary or co-breadwinners in nearly two-thirds of American households. That’s a far cry from 1967, when women made up only one-third of all U.S. workers.

Without the birth control pill, women would almost certainly not have made it into powerful senior positions. While the political and social will to bring a critical mass of women into the workplace was certainly there–the advent of the birth control pill coincided with the second wave of feminism and the fight for equal rights–the pill gave women a tangible tool to level the playing field with men. They no longer had to be mothers first and careerists second. The pill allowed for both their entrance–and ascendance–in the workplace.

To be sure, there’s no denying the pill triggered the sexual revolution for women as well. Because they no longer had to worry about getting pregnant, it freed them up to have sex outside of marriage. But it was the workplace where the pill made its most lasting impact.

Together with women’s new prominence in political and economic life, that sexual revolution, such as it was (the punditry continues to argue about its nature, extent and consequences), ran headlong into what is perhaps the most regressive element of American culture: fundamentalist religion.

Tomorrow: religion and women’s rights



The Problem With “Get Off My Lawn”

Okay–I’m going to indulge in yet another rant. (Then I’ll return this blog to its usual political preoccupations–promise!)

You will have to forgive me for these expressions of bile: as the midterms approach, and my patience and sanity continue to erode, I tend to get annoyed by things I would probably overlook if I were in a less fragile state of mind. But one commenter to this site–who often has very thoughtful and pertinent things to say–has a habit that has sent me over the edge.

That habit? Repeatedly denigrating the younger generation. Characterizing and dismissing all young people as cut from the same “me me” cloth.

You know who you are…

This wholesale dismissal of the younger generation–the tendency of us older folks to shout “get off my lawn!”– has been going on since the time of Socrates. (If you don’t believe me, here’s a compilation of insults directed at young people over the centuries.)

These sweeping denunciations were wrong when they were issued, and they’re wrong now.

First of all, there is really no difference between insisting that all Blacks or all Jews or all White Evangelicals are the same and [fill in your preferred negative label] and insisting that all members of a particular age cohort exhibit [fill in your preferred behavioral insult].  Bigotry isn’t limited to defamation based upon race, religion or sexual orientation.

Secondly, and more substantively, it’s inaccurate–and I don’t say that just because it is demonstrably inapplicable to my own children and grandchildren.

I taught classes filled with young people for 21 years–my students (I usually had a total of anywhere from 60 to 120 in a given semester) ranged in age from 18 to 35, depending on whether they were undergraduates or graduate students. My classrooms were diverse, and my students were pretty representative of their generation–I taught at an urban campus that drew students predominantly from central Indiana. I had some students who came from more privileged backgrounds, but the majority did not. A significant number were the first in their families to attend college.

And while there was some “self-selection” due to our programs preparing students for public and non-profit careers, our largest academic program was public safety–and it attracted mainly would-be police officers. So I feel confident that I saw a pretty good cross-section of young Americans.

I would turn this country over to them in a heartbeat.

Overall, my students were inclusive, caring and community-oriented. I saw very little evidence of bigotry or “me-ism” and considerable evidence of a firm–even passionate– commitment to social justice and legal equality. The papers they wrote for my classes were, overall, thoughtful, and reflected genuine concern for their communities and for the underprivileged people in those communities.

Granted, when students entered my classrooms they rarely came armed with knowledge of the Constitution, Bill of Rights or other aspects of America’s legal structure, but their attitudes had been shaped by what I like to call “The American Idea”–a belief in both individual liberty and civic equality.

And they acted on those commitments. They volunteered and organized. When it comes to political participation, the data confirms that youth turnout has been on the rise; in 2020, it hit 50%, an 11-point increase from 2016.

Recent surveys tell us that 59% of them plan to vote this year.

There is a lot wrong with America right now, and a lot of structural impediments that make solving those thorny problems difficult. It’s tempting to look for scapegoats–but it is neither accurate nor helpful to blame an entire generation for the unpleasant or unhelpful behaviors of some of its members.

Sorry to pick on a reader I really like, but I feel better now…


Downtown

As regular readers have undoubtedly noticed, I frequently use this blog as a platform to vent–and that’s what I plan to do today. Usually, my rants are political, but despite political overtones, this one is personal.

First, some background.

I use Facebook primarily as a method for “pushing out” this blog–I very rarely post about personal matters, and because I am essentially viewing the site as a marketing tool, I have accepted lots of Facebook “friends” I’ve never met. Recently, one of them posted about the prosecutor’s race in my county, and that led to a string of dismissive (and easily disproven) comments about crime and downtown Indianapolis.

I have lived in downtown’s historic neighborhoods since 1980, and eighteen months ago, my husband and I downsized to an apartment in the heart of downtown’s central business district.

I now live barely two blocks from a Starbucks that the company is closing, a decision accompanied by pious declarations to the effect that closure was impelled by concerns for customer safety. Believing that excuse requires ignoring contemporaneous Starbucks closures in SIXTEEN other cities, and the fact that safety concerns seem not to have affected the other NINE downtown Starbucks locations. (Given the enormous number of competing coffee shops operating in just the Mile Square, my guess is over-saturation…)

Several people commenting on the post used the Starbucks closure to assert that downtown Indianapolis is not only unsafe, but–and I quote– a “shithole.”

Let me describe that “shithole” for those who don’t live in my neighborhood.

Saturday night, I attended an event at the downtown History Center. On my way home (four blocks), I passed restaurants filled to overflowing with diners inside and out (it was still a balmy evening, and downtown is blessed with numerous eateries offering outside dining). Throngs of young couples were strolling up and down Massachusetts Avenue–a revitalized stretch of street hosting bars, restaurants, retail shops and theaters–all of which my husband and I frequent.

On foot. We also walk two blocks to our preferred grocery, cleaners and hardware store…

Counterintuitively for a “shithole,” downtown Indianapolis attracts ongoing construction of apartment complexes and condominiums. People keep moving downtown to occupy them. (For the past few years, new construction has been so constant my husband and I joke as we pass a new complex: “Gee–that wasn’t there last Tuesday!”) As a recent report from the Indianapolis Star put it, renters and buyers continue to show a “high demand for Downtown living, a trend driven by amenities such as walkable streets, contemporary restaurants and bustling nightlife.”

If there’s a legitimate concern about downtown living, it’s the cost: a lot of people are paying top dollar to enjoy the ambience and amenities of our downtown “shithole.”

Visit the website for Downtown Indy and find lists of residential options (both affordable and “wow, that’s pricey”) along with lists of the dozens of festivals, venues and events enjoyed by the 30,000+ of us who currently live downtown–as well as the thousands who come down to attend them.

Indianapolis does have a crime problem–as most cities do–but it is primarily located in outer, impoverished neighborhoods. 

That said, I’m pretty sure I know what accounts for the ignorant accusations about downtown Indy. 

When I look at the throngs of people on the streets, most are young, and many are Black, Brown or Asian. A number of couples are interracial. Unfortunately, depressing numbers of Americans continue to equate nonwhite races with crime and decay. I’d be willing to bet good money that the people posting sneering comments about downtown Indianapolis hold stereotypes that equate “downtown” with “ghetto” and “scary.”

Prejudice can work both ways, of course.

For years, when my husband and I would drive past those grim, cookie-cutter, tree-less suburban developments that clearly require long commutes to work or shop, he would observe that “this is the environment people are willing to accept in order to avoid Black neighbors.” I would have to remind him that not every resident of suburbia or exurbia is a bigot–that there are non-racist reasons nice people might want a big yard or a quiet neighborhood.

I’ll end this screed by taking my own advice, and conceding that downtown living isn’t everyone’s cup of tea. The vitality, walkability and street life I treasure can be off-putting to others, and those differences are just differences–they don’t necessarily reflect ignorance or prejudice.

On the other hand, when someone describes the center of my city–my neighborhood–in demonstrably inaccurate, pejorative terms, I’m pretty confident I know where that opinion comes from. And it isn’t pretty.

It’s just one more data point demonstrating the prevalence and persistence of American racism.


If This Doesn’t Terrify You…

Little by little, media outlets have begun reporting on a variety of really horrifying “movements”–most embracing Neo-fascist and/or crypto-Christian beliefs–that have been accumulating large numbers of adherents despite their underground status.

One such movement is the New Apostolic Reformation.

On July 1, 2022, inside a packed Georgia arena, four religious leaders stood on stage as they recited a blood-chilling prayer declaration called the “Watchman Decree”:

Whereas, we have been given legal power from heaven and now exercise our authority, Whereas, we are God’s ambassadors and spokespeople over the earth. Whereas, through the power of God we are the world influencers. Whereas, because of our covenant with God, we are equipped and delegated by him to destroy every attempted advance of the enemy, we make our declarations: … 3. We decree that our judicial system will issue rulings that are biblical and constitutional. 4. We declare that we stand against wokeness, the occult, and every evil attempt against our nation. 5. We declare that we now take back our God-given freedoms, according to our Constitution. 6. We decree that we take back and permanently control positions of influence and leadership in each of the “Seven Mountains.”

Not only was the arena “packed,” the video of the recitation–which you can see at the link– was viewed more than 3 million times on Twitter alone.

The Watchman Decree is a product of something called the New Apostolic Reformation (NAR). The relatively few media outlets that have reported on the movement tell us that it is a rapidly accelerating worldwide Christian authoritarian movement, one that includes practices of faith healing and exorcism. It also promotes dominionism, the belief that Christians must take control of government, business and the culture before Jesus can return to earth.

The men on stage included NAR apostles Dutch Sheets (who wrote the decree) and Lance Wallnau, along with two close colleagues, pastors Mario Murillo and Hank Kunneman. The fifth man, pastor Gene Bailey, hosted the event for his show Flashpoint on Victory TV, a Christian network that platforms the NAR and pro-Trump Make America Great Again (MAGA) influencers.

Those relatively obscure individuals are joined in the NAR by Trump supporters with far more familiar names: former Trump National Security Advisor Michael Flynn, who has appeared on Flashpoint several times, and longtime Trump advisor Roger Stone. Stone, Flynn, and other MAGA influencers were announced as participants in a Pennsylvania tour called the “ReAwaken America Tour” (RAT). That tour was founded by a far-right podcast host from Oklahoma, and was sponsored by an NAR apostle through something called Charisma News.

Leading NAR apostles are blatantly pro-Trump, and claim their view is supported by God, whereas opposition to Trump is satanic. “Fighting with Trump is fighting God,” Wallnau declared in October 2020. “God does not want” Joe Biden to be president, Sheets claimed in December 2020. “All those witchcraft curses that did not land on Donald Trump are trying to take out his kids,” Wallnau raged in a 2017 video. In a 2017 tweet, he wrote, “Praying for the President-elect at Press Club in D.C. with Lou Engle. Prophetic location. Trump must keep wrecking media witchcraft.”

The NAR also opposes freedom of religion, teaching instead that Christians must exert dominion over all aspects of our society. The NAR isn’t the only movement that espouses dominionism, but it may be the most influential. As explained by C. Peter Wagner, who fathered the NAR:

“Dominion has to do with control. Dominion has to do with rulership. Dominion has to do with authority and subduing. And it relates to society. In other words, what the values are in Heaven need to be made manifest on earth. Dominion means being the head and not the tail. Dominion means ruling as kings.”

The specific pillars of society over which the NAR plans to “rule as kings” are seven-fold: 1. business, 2. government, 3. family, 4. religion, 5. media, 6. education, and 7. entertainment. NAR leaders call this the “Seven Mountains” mandate.

I find it hard to get my head around the fact that thousands–perhaps millions–of Americans can hold and act upon such beliefs in the 21st century. I can only speculate about the fears and/or resentments that might account for a person’s embrace of such a worldview. The fact that the NAR and its ilk have largely flown under the radar adds to the danger. These are people who believe they are privy to the will of a deity they have created out of their own inadequacies, and that they are entitled to exert “dominion” over the rest of us.

We live in very scary times.
