How “Woke” Is Academia, Really?

We Americans harbor all sorts of prejudices about all sorts of things.

One of the problems with racism, anti-Semitism, and similar tribal bigotries is that such attitudes ignore individual differences. When you criticize “those people” (whoever “those people” are in your particular worldview), you inevitably sweep with far too broad a brush.

That impulse–to generalize from anecdotal experiences–isn’t limited to our attitudes about marginalized racial or ethnic groups. It has become a ubiquitous–and extremely annoying–element of America’s polarized political worldviews. There is, for example, a widespread–and dangerously oversimplified–belief that America’s universities are bubbling cauldrons of “woke” indoctrination. That charge has become part of the Republican Party’s current war on evidence, science, and accurate history.

Before I retired from my position on a university faculty, I was periodically accused of being part of a liberal “brainwashing” enterprise by people who clearly knew very little about academic life, my particular campus, or–for that matter–the huge differences around the country among institutions of higher education.

I was reminded of those discussions when I read a rant on the subject that had been posted to Daily Kos.

The posted diatribe was triggered by a televised exchange between Andrew Sullivan and Bill Maher on the latter’s show, in which the two of them decried the “wokeness” of today’s colleges and universities. The author of the post responded:

I have likely spoken at more colleges in the past 15 years than Bill Maher and Andrew Sullivan put together. The difference is that they speak at select elite colleges and I speak everywhere else. For instance, next week I speak at Hastings College in Nebraska.

Most colleges really aren’t that woke. In fact, being too liberal when I speak is always in the back of my mind. For example, eleven years ago I spoke at New Mexico Tech. From my perspective, it was one of my best nights as a speaker. I performed two shows in a full theater and received a standing ovation. Nevertheless, the woman who booked me refuses to ever bring me back, because a few people walked out and complained. I actually noticed the walkouts, and they did it right after a section in my show where I talked about global warming and childish Republicans who renamed French fries “Freedom fries” and French toast “Freedom toast” in the congressional cafeterias to protest France not participating in the Iraq War.

Additionally, one religious college politely asked me not to speak about evolution, and another booked me under the condition that I not speak out against Donald Trump. Other colleges outright won’t book me because I’m too liberal (i.e. woke). That’s okay. I’m not going to whine about it. I mention it only because people like Bill Maher and Andrew Sullivan are clueless about what life is like in the real world.

He’s right.

According to the National Center for Education Statistics, there were 3,982 degree-granting postsecondary institutions in the U.S. as of the 2019-2020 school year. Believe me–and believe the author of the quoted rant–very few of them look like Harvard or Berkeley.

One of the numerous faculty committees on which I served was the admissions committee, where we reviewed applications for entry into our graduate programs. Those applications were frequently submitted by students from undergraduate institutions I’d never heard of.

When my husband and I are driving out of state, we constantly pass road signs announcing that such-and-such town is the home of such-and-such local college. These are almost always schools that–despite the fact I was “in the field”–I’d never heard of.

The sheer number of these small and often-struggling schools is intriguing; I sometimes wonder about their ability to attract competent instructors, the focus and breadth of their curricula, and the career prospects they offer their graduates.

It is likely that the quality of these institutions varies widely. I would bet good money, though, that very few of them are “woke.” (To the extent that they are “indoctrinating,” it is likely to be with denominational religious views–and those are most unlikely to be “liberal.”)

Judging all post-secondary schools as if they were alike rests on the same fallacy that characterizes racial and religious bigotry–the notion that all Blacks or Jews or Muslims or immigrants–or Republicans or Democrats–are alike and interchangeable. One of the many, many defects of our current media environment is its tendency to find an example of something–generally an extreme example–and suggest that it represents an entire category that we should either embrace or reject.

Reality is more complicated than prejudice admits.


Demographics And Politics

As the results of the Trump-delayed census have emerged, we’ve learned that American diversity is both growing and shifting.

The overview–the national picture–is considerably less White, and that reality is further enraging the all-too-plentiful White Supremacists among us. Their hysteria–not unlike a child’s tantrum–is likely to have some ugly political repercussions. We can only hope that, in the scheme of things and the sweep of history, those repercussions are temporary.

When the picture focuses on distribution rather than on aggregate numbers, things get more interesting. Charles Blow has provided a rundown of those numbers in a recent column, and the basic thrust is that Black people are moving out of what were dismissively called the “inner cities.”

The term “inner city” has long been used as a derisive euphemism for Black — poor, blighted and in distress. But many inner cities in the North and West are becoming less and less Black because Black people are moving out.

Black populations in what were considered to be Black strongholds have been dwindling, and that has been happening all over the country. There has been a reverse migration wave of Black people from the North and West moving back to the South. Blow goes through an extensive list of cities that have lost Black populations.

Among the most interesting:

Detroit, once the Blackest big city in America, home of Motown, dropped from 82 percent Black in 2010 to 77 percent Black in 2020. The Hispanic, white and Asian populations all grew in the city over that period.

New York City, with two million Black residents, more than any other city in America, saw its Black population fall by 4.5 percent over the past decade. This came on the heels of the Black population declining 5.1 percent the previous decade, the first drop in the number of Black residents in recent history.

Harlem, according to the 2020 census, is now just 37 percent Black. Harlem!

Chicago, Philadelphia, Los Angeles–Blow provides the numbers showing diminished Black population. He also shares the numbers showing the growth of Black population in the American South.

These shifts don’t mean that there are now fewer cities with Black majorities; the number is on the rise, as Brookings pointed out in 2019. It’s just that 90 percent of majority Black cities are now in the South. In fact, I think it would be safe to say that much of the municipal South is Black.

Ironically, Selma and Montgomery, Ala.; Jackson and Philadelphia, Miss.; Little Rock, Ark.; and Atlanta–places we associate with some of the worst episodes of America’s racist past–now have Black mayors.

All of this is politically relevant.

For a number of years, Republican Party leaders have been reacting to predictions that “demography is destiny”–fears that the growing diversity of America (and the decidedly Democratic lean of the country’s youth) will soon make the GOP electorally irrelevant. That fear is what has prompted the GOP’s extreme gerrymandering, vote suppression tactics, and efforts to control who counts the votes.

The movement of Black Americans out of easily demonized metropolitan centers changes the calculus. It’s harder to whip up fear of “those people” who live in the centers of large cities when so many of “those people” are White, young, upwardly mobile Starbucks drinkers. And as Stacey Abrams and her fellow activists showed in Georgia, the previously solid South, which could be counted on to vote White, whatever party was carrying the banner of White Supremacy at any given time, is no longer so solid. It isn’t just cultural change, welcome as that is. It’s demographic shift.

Blow didn’t include Texas cities in his recitation, and increasing demographic diversity there is likely due more to the growth of Latino populations than to an influx of Blacks. But when we think of states south of the Mason-Dixon Line currently headed by stubbornly reactionary Republicans, Texas certainly comes to mind. Whether Abbott’s frantic efforts to suppress minority votes in the face of demographic change will keep Texas in the Red column for another few years is anyone’s guess.

Blow says the new distribution of America’s Black population is producing “chocolate chip cities.” If–as sociologists tell us–bigotry is reduced by familiarity with members of previously marginalized populations, those “chocolate chip” cities bode well for civic amity.

Eventually.


A Philosophical Big Sort

I have previously cited Bill Bishop’s excellent 2008 book, The Big Sort, which focused on physical “sorting”–the increased geographical clustering of like-minded Americans choosing to live in areas populated by people who generally shared their political worldviews.

A very thoughtful book review by Ronald Aronson in The Hedgehog Review centered on a different type of American division–what one might call “philosophical sorting.”

The book being reviewed was The Upswing, written by Robert Putnam (he of “Bowling Alone” fame) and Shaylyn Garrett. The book looked at what it called the “I-We-I arc” through the lens of the last 125 years of American economic, political, social, and cultural history.

A remarkable assemblage of data and a compelling story about American history, The Upswing begins with the Gilded Age, the period of disintegration, conflict, and aggressive individualism after the Civil War. It was followed by seventy-five years of growth in equality and national community achieved first by the Progressive movement, then by the New Deal, and, under different conditions, by wartime solidarity. But then things went sour: “Between the mid-1960s and today—by scores of hard measures along multiple dimensions—we have been experiencing declining economic equality, the deterioration of compromise in the public square, a fraying social fabric, and a descent into cultural narcissism.” The last century’s upswing has been followed by the slide toward an unhappy collection of democratic ills: inequality, individualism, austerity, the domination of human needs by the “free market,” political polarization, and the blockage of economic and educational gains by African Americans.

According to the review, the book is replete with graphs that reveal a repeating arc: an inverted U. Until around 1970, the data shows an increasing sense of “community, equality, belongingness, and solidarity—a growing ‘we.’” After that, however, the graphs show a “sharp collapse into an individualistic and even conflictual assertion of ‘I’ in values and culture as well as politics and economics.”

This is a story that unfolds in four overlapping parts. First, the trend toward greater economic equality reversed sharply over the past fifty years. Second, political polarization, some of it rooted in the Civil War, gave way under the influence of the Progressive movement to a remarkable degree of political consensus by the 1930s. But then things turned in the other direction as the Civil Rights Act of 1964, supported by substantial majorities of both Republicans and Democrats, led to bitter party polarization that was accompanied by a steep decline in trust in government and a rise in cynicism. Third, social life became anemic as membership in clubs and associations declined (a main theme of Putnam’s Bowling Alone) and the social and cultural force of labor unions dramatically weakened. Fourth, as an indicator of the changing frequency of occurrence of certain words, Google Ngrams tell a parallel story of a rise and fall in values of community and individualism: “association,” “cooperation,” “socialism,” and the “common man,” as well as “agreement,” “compromise,” and “unity,” all showing the same inverted U-shaped curve, rising and then declining steeply, to where we are today.

I was particularly intrigued by the observation that many Whites come to champion the idea of individualism “…because it provides them with a principled and apparently neutral justification for opposing policies that favor Black Americans.” If racism is truly a major underpinning of the “I” portion of that I-We-I arc, I’m afraid the “upswing” Putnam and Garrett believe is on the horizon will be a long time coming.

Aronson is equally dubious about the prospects of an upswing. As he points out, if anything should have prompted a return to “we,” it should have been the pandemic. It didn’t. Americans “sorted” philosophically and politically.

Survey research tells us that 36 percent of Republicans–as opposed to 4 percent of Democrats–thought the 2020 shutdowns were too restrictive. Prominent Republicans insisted that COVID-19 was a hoax and that the death toll was exaggerated. The U.S. has 5 percent of the world’s population–and 20 percent of COVID deaths. Twelve of the fifteen hardest-hit states are governed by Republicans.

The Upswing was published in 2020, prior to the pandemic, and didn’t address it. Other omissions are less understandable.

Aronson points to the multiple social influences that are simply missing from the book’s analysis: the role played by American capitalism’s “outsourcing, deregulation, financialization, speculative bubbles, austerity, and neoliberalism;” globalization; the Vietnam War; “inflation, and American imperialism, including the Cold War and the post–Cold War military-industrial complex.” And as he says, “we must come back in the end to the crucial link between America’s coming apart and its deeply imbedded racism.” 

I am very much afraid that the continued existence of a White Supremacy Party–and the philosophical gulf between Americans that is symbolized by that continued existence–is incompatible with an imminent upswing.

I hope I’m wrong.


Return On Investment

I tend to get testy when I hear people intone that government should “be run like a business.” (Granted–I’m testy a lot…) Government is very different from business–its purposes (which do not include a profit motive) are distinct. Not recognizing the substantial differences between governance and enterprise marks those making that facile comment as–at best–uninformed.

That said, there is one concept fundamental to both business plans and investment decisions that should also guide governmental decisions: return on investment. Interestingly, however, many of the same folks who want more businesslike governance routinely ignore that calculation.

If I’m purchasing stock in a company, I want evidence that the shares I purchase will appreciate in value–or at least pay dividends. If I am a savvy/mature investor, rather than a gambler playing the market, I understand that such appreciation will likely not be immediate; I will invest “for the long haul.” 

That same calculation ought to determine America’s investments in social spending. Although appropriate returns on government investment will not and should not be monetary, a number of studies confirm that many such programs actually do turn a fiscal profit for taxpayers.

Children who have been fed thanks to food stamps grow up into healthier, more productive adults than those who didn’t get enough to eat. That greater productivity means that government eventually recoups much of what it spent on those food stamps–and also saves money due to reduced spending on things like disability payments.

A recent study by Harvard economists found that many programs — especially those focused on children and young adults — made money for taxpayers, when all costs and benefits were factored in.

That’s because they improved the health and education of enrollees and their families, who eventually earned more income, paid more taxes and needed less government assistance over all.

The study, published in The Quarterly Journal of Economics, analyzed 101 government programs. In one way, it was a standard cost/benefit analysis–it looked at what the government’s costs were and the resulting benefits to the recipients. However, the researchers took an extra step–they calculated the “fiscal externalities”: the indirect ways that a program affected the government’s budget.

In other words, in addition to the upfront costs, they calculated the monetary return on taxpayers’ investment.

Consider one program: health insurance for pregnant women. In the mid-1980s, the federal government allowed states to expand Medicaid eligibility to more low-income pregnant women. Some, but not all, states took up the offer. Increased Medicaid coverage enabled women to receive prenatal care and better obstetric care, and to save on personal medical spending.

For the federal government, the most straightforward fiscal impact of this expanded coverage was increased spending on health insurance. The indirect fiscal effects were more complex, and could easily be overlooked, but they have been enormous.

First, newly eligible women had fewer uninsured medical costs. The federal government picks up part of the tab for the uninsured because it reimburses hospitals for “uncompensated care,” or unpaid bills. Thus, this saved the government some money. On the other hand, some of the women stopped working, probably because they no longer needed employer-provided private health insurance, and this cost the government money.

But the biggest indirect effects were not apparent until children born to the Medicaid-covered women became adults. As shown in a study by Sarah Miller at the University of Michigan and Laura Wherry at the University of California, Los Angeles, those second-generation beneficiaries were healthier in adulthood, with fewer hospitalizations. The government saved money because it would have paid for part of those hospital bills. The now-adult beneficiaries had more education and earned more money than people in similar situations whose mothers did not get Medicaid benefits. That meant higher tax revenue.
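The arithmetic behind that kind of finding is simple enough to sketch. What follows is a minimal, back-of-the-envelope illustration in Python–not the researchers’ actual model. Every dollar figure and the discount rate in it are invented assumptions, loosely patterned on the Medicaid story above, meant only to show how indirect fiscal effects, discounted to present value, get netted against a program’s upfront cost.

```python
# A back-of-the-envelope sketch of the accounting described above.
# All dollar figures and the 3% discount rate are ILLUSTRATIVE
# assumptions, not numbers from the study or the Medicaid example.

def net_fiscal_cost(upfront_cost, fiscal_externalities, discount_rate=0.03):
    """Net cost to government per enrollee after fiscal externalities.

    fiscal_externalities: (years_from_now, amount) pairs, where a
    positive amount flows back to the government (extra taxes,
    avoided spending) and a negative amount is an added cost.
    """
    # Discount each future flow to present value, then net it
    # against today's direct spending.
    present_value = sum(
        amount / (1 + discount_rate) ** years
        for years, amount in fiscal_externalities
    )
    return upfront_cost - present_value

# Hypothetical flows loosely patterned on the prenatal-Medicaid story:
flows = [
    (1, 2_000),    # savings on reimbursed "uncompensated care"
    (1, -1_500),   # taxes lost when some mothers leave work
    (30, 9_000),   # fewer hospitalizations a generation later
    (30, 12_000),  # higher taxes from healthier, better-educated adults
]

cost = net_fiscal_cost(upfront_cost=10_000, fiscal_externalities=flows)
# A negative result would mean the program more than paid for itself.
print(f"Net cost per enrollee: ${cost:,.2f}")
```

The point of the exercise is the shape of the calculation, not the made-up numbers: benefits that arrive thirty years out are worth less today, yet they can still offset most–or all–of the upfront cost.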

Data on other social programs yields similar results. Researchers have found that Medicaid expansion, for example, more than paid for itself, even after accounting for the fact that future benefits are “discounted”–i.e., worth less today. 

Businesspeople understand that it usually takes time to realize profit. With government social programs, too, the fiscal “payoff” generally is delayed. That doesn’t mean it is less substantial or less real. In the cited study, 16 of the social policies that the researchers examined either broke even or actually made a profit. 

I’m certainly not suggesting that government programs be limited to those with a positive financial return–government is most definitely not a business. I am suggesting, however, that we consider government social programs investments–and that the returns on those investments aren’t limited to improving the safety and security of the communities in which we all live, sufficient as that return would be. In many cases, taxpayers also get a positive monetary return on investment.

Just like well-run businesses.


This Isn’t Dunkirk

Longtime readers of this blog know that I rarely, if ever, post about foreign policy. There’s a reason for that–I am uninformed about most aspects of such policies, and I am deeply conflicted about America’s obligations vis-à-vis purely humanitarian concerns.

When it comes to warfare, I mostly agree with those who insist we should keep our cotton-pickin’ hands off unless there is a very clear American interest to be protected, or a humanitarian crisis of significant proportions that we are actually in a position to ameliorate. I will readily admit that the definition of American interests and the nature and extent of humanitarian crises are matters of considerable debate.

If I had been the person determining the parameters of America’s intervention in Afghanistan, I would have approved an initial intervention to root out Al Qaeda and “get” Osama bin Laden–but not the slog of the subsequent 18 years, during which we wasted trillions of dollars–not to mention the lives of thousands of soldiers and civilians.

But here we are.

President Biden has made what I consider the absolutely correct call–and the media and self-styled pundits, abetted by deeply dishonest Republicans sensing political advantage, are having a field day attacking him for, among other things, recognizing and admitting the obvious.

I think that Michael Moore, of all people, has it right in the following paragraphs. (I say “of all people” because I tend to find Moore tiresome–you usually know precisely what he’ll say because, like far too many people, he approaches all issues through an unshakable, pre-defined lens. Sometimes, of course, like that “stopped clock,” he’s right; sometimes, not so much.)

In this case, I think he is “on point.” In his recent letter, Moore wrote about our departure from Afghanistan:

There is nothing here to celebrate. This should only be a monumental gut-check moment of serious reflection and a desire to seek redemption for ourselves. We don’t need to spend a single minute right now analyzing how Biden has or has not messed up while bravely handling the end of this mess he was handed — including his incredible private negotiations all this week with the Taliban leaders to ensure that not a single enemy combatant from the occupying force (that would be us; e.g., U.S. soldiers and spies and embassy staff) will be harmed. And Biden so far has gotten every American and foreign journalist out alive, plus a promise from the Taliban that those who stay to cover it will not be harmed. And not a single one has! Usually a force like the Taliban rushes in killing every enemy in sight. That has not happened! And we will learn that it was because of the negotiating skills and smarts of the Biden team that there was no mass slaughter. This is not Dunkirk.

Dozens of planes have safely taken off all week — and not one of them has been shot down. None of our troops in this chaotic situation have been killed. Despite the breathless shrieks of panic from maleducated journalists who think they’re covering the Taliban of the 1990s (Jake Tapper on CNN keeps making references to “beheadings“ and how girls might be “kidnapped” and “raped” and forced to become “child brides”), none of this seems to be happening. I do not want to hear how we “need to study” what went wrong with this Taliban victory and our evacuation because (switching to all caps because I can’t scream this loud enough): WE ARE NEVER GOING TO FIND OURSELVES IN A SITUATION LIKE THIS AGAIN BECAUSE OUR DAYS OF INVADING AND TAKING OVER COUNTRIES MUST END. RIGHT? RIGHT!!

Unfortunately, we probably will find ourselves in similar situations, because a substantial portion of our citizenry believes we have the right–indeed, the duty–to impose our will around the globe, irrespective of any threat to genuine American interests.

Is our exit from Afghanistan being accomplished smoothly? No. To the extent both the war and the exit were bungled, we’ll need sober analyses of those failures in order to inform future foreign policy decisions. But sober analyses are not what we’re getting–for that matter, even presumably straightforward eyewitness reports of what is occurring “on the ground” are wildly inconsistent. 

If people of good will are truly concerned about the fate of non-Taliban Afghans–especially Afghan women–under a fundamentalist religious regime, what they can and must do is extend a welcome to those who want to emigrate, and work to facilitate their speedy immigration and resettlement.

It is telling–but not surprising–that the monkeys throwing poo in hopes it sticks to the administration are unwilling to do that.
