Ideology, Meet Evidence

A few days ago, I shared a talk I gave to the Indianapolis Council on Women about the UBI–the theory behind efforts to replace much of America’s dysfunctional safety net with a Universal Basic Income.

There is, as I noted in that discussion, hysterical resistance to such a drastic change. We are, after all, a country that is politically unable to provide even universal access to healthcare. The cost of such a benefit would require us to look critically at America’s multiple wasteful subsidies, and it would require the uber-rich to pay their share of taxes.

Cost is a legitimate concern. Less legitimate–and far more potent–is the belief that poor people are “takers” who would cease productive labor, neglect their kids, and spend their stipends on booze and drugs. I realize that most of the ideologues who subscribe to this belief are impervious to evidence, but the evidence against it continues to accumulate. I cited the results of previous pilot projects in the talk I referenced, and since then, additional evidence has emerged.

After getting $500 per month for two years without rules on how to spend it, 125 people in California paid off debt, got full-time jobs and had “statistically significant improvements” in emotional health, according to a study released Wednesday.

The program was the nation’s highest-profile experiment in decades of universal basic income, an idea that was revived as a major part of Andrew Yang’s 2020 campaign for president.

Cynics had predicted that free money would eliminate the incentive to work, creating a population dependent on the state. The experiment in Stockton, California, that yielded these results was an effort to test that thesis. It was funded by private donations, including donations from a nonprofit led by Facebook co-founder Chris Hughes, who has been a longtime supporter of the UBI.

Run by a nonprofit founded by former Stockton Mayor Michael Tubbs, the program included people who lived in census tracts at or below the city’s median household income of $46,033.

A pair of independent researchers at the University of Tennessee and the University of Pennsylvania reviewed data from the first year of the study, which did not overlap with the pandemic. A second study looking at year two is scheduled to be released next year.

When the program started in February 2019, 28% of the people slated to get the free money had full-time jobs. One year later, 40% of those people had full-time jobs. A control group of people who did not get the money saw a 5 percentage point increase in full-time employment over that same time period, from 32% to 37%.

“These numbers were incredible. I hardly believed them myself,” said Stacia West, a researcher at the University of Tennessee who analyzed the data along with Amy Castro Baker at the University of Pennsylvania.

The money came once a month, and was distributed via a debit card. That allowed the researchers to track how people spent it. The largest expense each month was for food, followed by sales and merchandise, which included purchases at places like Walmart and Target, which also sell groceries. The next highest categories were utilities, automobile (gas and repairs) and services. Less than 1% of the money went to tobacco and alcohol.

Given America’s political culture, which valorizes individualism and looks askance at any suggestion that social support might increase–rather than disincentivize–individual ambition, the prospects for a UBI are pretty dim. But there are some signs that opposition may be softening.

Still, guaranteed income programs seem to be gaining momentum across the country. More than 40 mayors have joined Mayors for a Guaranteed Income, with many planning projects of their own. A proposal in the California Legislature would offer $1,000 per month for three years to people who age out of the state’s foster care system. And in Congress, Republican U.S. Sen. Mitt Romney of Utah has proposed expanding the child tax credit to send most parents at least $250 per month.

We’ll see how the evidence accumulates…..


Stop The World…But Then What?

Every once in a while, I come across an article or column that doesn’t convey anything particularly new or earth-shattering, but that sets out conventional wisdom in a way that makes a light bulb come on. I had that “aha” experience when I read an opinion piece in the New York Times titled “Trumpism Has No Heirs.”

The author, Jane Coaston, pointed out that–at least for the next two years–the Republican Party is ideally positioned.

As the opposition party, it will not be expected to offer solutions to the country’s myriad problems, much less introduce substantive legislation. It will not be expected to do anything except what it does best — oppose the Democratic administration and the Democratic Party.

Coaston’s observation isn’t new, of course–anyone who can spell “Mitch McConnell” or has followed national politics even superficially over the past few years will agree that “even when holding power, movement conservatism is fundamentally an opposition movement.”

However, Coaston suggests that this “spirit of opposition” is the GOP’s Achilles’ heel–a weakness that will doom Republican efforts to “move on” from Donald Trump. Over the past few years, “conservatism” has become an empty label; as she notes, although many people call themselves conservatives, they mostly agree about what conservatism isn’t. There is no consensus on what conservatism in the 21st Century is. And she says that Donald Trump’s candidacy and presidency exploited conservatism’s glaring lack of a central motivating vision.

The conservatism that was seemingly agreed upon by the Heritage Foundation and the American Enterprise Institute and National Review was not the conservatism that Mr. Trump sold to the American people.

Mitt Romney campaigned in 2012 on being “severely conservative” and lost. Mr. Trump campaigned on a self-serving redefinition of what it even means to be conservative and won. After all, as Mr. Trump told ABC News in early 2016, “this is called the Republican Party, it’s not called the Conservative Party.”

But what Mr. Trump was for, and what his voters supported, was not the populist nationalism generally associated with “Trumpism.” Populist nationalism has a long history in this country. Paleoconservatives like Pat Buchanan, the former Nixon assistant and political commentator, have espoused a blend of America First isolationist foreign policy rhetoric and distrust of perceived culture and political “elites” for decades.

Pundits who see Trumpism as a form of populist nationalism miss the fact that such nationalism doesn’t depend on any one individual. Trumpism does, which is why no one will pick up the “mantle.” There is no mantle, no program or philosophy of governance. Trumpism is simply the “middle finger to perceived enemies and the bulwark against real or imagined progressive assault.”

The central motivating impulse of today’s GOP is grievance and an overwhelming desire to “own the libs.” What Coaston has identified–and what I previously failed to focus on–is the essential weakness of using opposition as an organizing principle over time.

In the short term, of course, being against something or someone generates energy and turnout. (A significant portion of the 81 million Americans who voted for Joe Biden would have voted for Daffy Duck if Daffy were running against Trump.) But for the longer term, it’s not enough.

At some point, being against everything–having no programs, no coherent political philosophy, no vision–will fail to energize enough voters to keep a party in power. That recognition is behind the formidable assault the GOP is currently mounting against voting rights.

The question is: when does disillusionment kick in? Until 2022, being against everything the Democrats want to accomplish is likely to be seen by the Republican base as a valiant effort to stop the modernity and social change that so deeply threaten them. Only if Republicans succeed in retaking the House or Senate (or both) will citizens recognize that the party has nothing positive to offer.

And by then, it might be too late.


About A UBI…

I’m speaking today to a women’s group about proposals for a Universal Basic Income. Here’s what I’ll say. WARNING: It’s a lot longer than my usual posts.

_______________________________

I’ve recently been obsessing about what an updated social contract might look like. How would the realities of modern life alter the framework that emerged, after all, from the 18th Century Enlightenment? Is it possible to craft a governing structure that both respects individual liberty and provides basic material security? Actually, is anyone truly free when they face a daily struggle just to survive? And most important, at a time when we are recognizing how polarized Americans are, can government safety-net policies help to unify a quarrelsome and diverse population?

Social scientists are just beginning to appreciate the multiplicity of ways in which America’s obsessive focus on individual responsibility and achievement has obscured recognition of the equally important role played by the communities within which we are all embedded. A much-cited remark made by Elizabeth Warren during her first Senate campaign reminded us of the important ways social infrastructure makes individual success and market economies possible:

“There is nobody in this country who got rich on their own. Nobody. You built a factory out there – good for you. But I want to be clear. You moved your goods to market on roads the rest of us paid for. You hired workers the rest of us paid to educate. You were safe in your factory because of police forces and fire forces that the rest of us paid for. You didn’t have to worry that marauding bands would come and seize everything at your factory… Now look. You built a factory and it turned into something terrific or a great idea – God bless! Keep a hunk of it. But part of the underlying social contract is you take a hunk of that and pay forward for the next kid who comes along.”

The fact that Warren’s observation garnered so much attention (it evidently triggered an epiphany in many people) suggests that Americans rarely see individual success stories as dependent upon the government’s ability to provide a physical and legal environment–an infrastructure–within which that success can occur. It was a pointed rebuke of our national tendency to discount the importance of effective and competent governance.

The importance of hard work and individual talent certainly shouldn’t be minimized, but neither should it be exaggerated. When the focus is entirely upon the individual, when successes of any sort are attributed solely to individual effort, we fail to see the effects of social and legal structures that privilege some groups and impede others. When marginalized groups call attention to additional barriers they face, members of more privileged groups cling even more strongly to the fiction that only individual merit explains success and failure.

The problem is, when we ignore the operation of systemic influences, we feed pernicious stereotypes. We harden our tribal affiliations. That’s why the first priority of a new social contract should be to nurture what scholars call “social solidarity,” the ability of diverse citizens to see ourselves as part of an over-arching, encompassing American community.

Here’s the thing: Public policies can either increase or reduce polarization and tensions between groups. Policies intended to help less fortunate citizens can be delivered in ways that stoke resentments, or in ways that encourage national cohesion. Think about widespread public attitudes about welfare programs aimed at poor people, and contrast those attitudes with the overwhelming majorities that approve of Social Security and Medicare. Polling data since 1938 shows growing numbers of Americans who believe laziness and lack of motivation to be the main causes of poverty, and who insist that government assistance—what we usually refer to as welfare—breeds dependence. These attitudes about poverty and welfare have remained largely unchanged despite overwhelming evidence that they are untrue.

Social Security and Medicare send a very different message. They are universal programs; virtually everyone contributes to them and everyone who lives long enough participates in their benefits. Just as we don’t generally hear accusations that “those people are driving on roads paid for by my taxes,” or sentiments begrudging a poor neighbor’s garbage pickup, beneficiaries of programs that include everyone (or almost everyone) are much more likely to escape stigma. In addition to the usual questions of efficacy and cost-effectiveness, policymakers should evaluate proposed programs by considering whether they are likely to unify or further divide Americans. Universal policies are far more likely to unify, an important and often overlooked argument favoring a Universal Basic Income.

Attention to the UBI–a universal basic income–has increased due to predictions that automation could eliminate up to 50% of current American jobs, and sooner than we think. Self-driving cars alone threaten the jobs of the over 4 million Americans who drive trucks, taxis and delivery vehicles for a living–and those middle-aged, displaced workers aren’t all going to become computer experts. A UBI could avert enormous social upheaval resulting from those job losses–but there are many other reasons to seriously consider it.

A workable social contract connects citizens to an overarching community in which they have equal membership and from which they receive equal support. The challenge is to achieve a healthy balance—to create a society that genuinely respects individual liberty within a renewed emphasis on the common good, a society that both rewards individual effort and talent, and nurtures the equal expression of those talents irrespective of tribal identity.

What if the United States embraced a new social contract, beginning with the premise that all citizens are valued members of the American community, and that (as the advertisement says) membership has its privileges? In my imagined “Brave New World,” government would create an environment within which humans could flourish, an environment within which members—citizens—would be guaranteed a basic livelihood, including access to health care, a substantive education and an equal place at the civic table. In return, members (aka citizens) would pay their “dues”: taxes, a year or two of civic service, and the consistent discharge of civic duties like voting and jury service.

In my Brave New World, government would provide both a physical and a social infrastructure. We’re all familiar with physical infrastructure: streets, roads, bridges, utilities, parks, museums, public transportation, and the like; we might even expand the definition to include common municipal services like police and fire protection, garbage collection and similar necessities and amenities of community life. Local governments across the country understand the importance of these assets and services, and struggle to provide them with the generally inadequate tax dollars collected from grudging but compliant citizens.

There is far less agreement on what a social infrastructure should look like and how it should be funded. The most consequential element of a new social infrastructure, and by far the most difficult to implement, would require significant changes to the deep-seated cultural assumptions on which our current economy rests. Its goals would be to ease economic insecurities, restore workers’ bargaining power and (not so incidentally) rescue market capitalism from its descent into plutocracy. The two major pillars of that ambitious effort would be a Universal Basic Income and single-payer health insurance.

The defects of existing American welfare policies are well-known. The nation has a patchwork of state and federal efforts and programs, with bureaucratic barriers and means testing that operate to exclude most of the working poor. Welfare recipients are routinely stigmatized by moralizing lawmakers pursuing punitive measures aimed at imagined “takers” and “Welfare Queens.” Current anti-poverty policies haven’t made an appreciable impact on poverty, but they have grown the bureaucracy and contributed significantly to racial stereotyping and socio-economic polarization; as a result, a number of economists and political thinkers now advocate replacing the existing patchwork with a Universal Basic Income.

A UBI is an amount of money that would be sent to every U.S. citizen, with no strings attached–no requirement to work, or to spend the money on certain items and not others. It’s a cash grant sufficient to ensure basic sustenance; most proponents advocate $1000 per month. As Andy Stern has written,

“A basic income is simple to administer, treats all people equally, rewards hard work and entrepreneurship, and trusts the poor to make their own decisions about what to do with their money. Because it only offers a floor, people are encouraged to make additional income through their own efforts: As I like to say, a UBI gives you enough to live on the first floor, but to get a better view—for example, a seventh-floor view of the park—you need to come up with more money. Welfare, on the other hand, discourages people from working because, if your income increases, you lose benefits.”

As Stern points out, with a UBI, in contrast to welfare, there’s no phase-out, no marriage penalties, no people falsifying information. Support for the concept is not limited to progressives. Milton Friedman famously proposed a “negative income tax,” and F.A. Hayek, the libertarian economist, wrote “There is no reason why in a free society government should not assure to all, protection against severe deprivation in the form of an assured minimum income, or a floor below which nobody need descend.” In 2016, Samuel Hammond of the libertarian Niskanen Center noted the “ideal” features of a UBI: its unconditional structure avoids creating poverty traps; it sets a minimum income floor, which raises worker bargaining power without wage or price controls; it decouples benefits from a particular workplace or jurisdiction; since it’s cash, it respects a diversity of needs and values; and it simplifies and streamlines bureaucracy, eliminating rent seeking and other sources of inefficiency.
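
To make that contrast concrete, here is a minimal sketch (in Python, with entirely made-up numbers) of why a phase-out functions like a tax on additional earnings while a UBI does not. The $12,000 benefit and the 50-cent-per-dollar phase-out rate are hypothetical, chosen only to illustrate the mechanic Stern describes; they are not drawn from any actual program.

```python
# Illustrative only: compares total income under a hypothetical means-tested
# benefit that phases out as earnings rise versus a flat UBI of the same size.

def means_tested_income(earnings: float) -> float:
    """Earnings plus a $12,000/year benefit reduced by 50 cents per dollar earned."""
    benefit = max(0.0, 12_000 - 0.5 * earnings)
    return earnings + benefit

def ubi_income(earnings: float) -> float:
    """Earnings plus a flat $12,000/year grant that never phases out."""
    return earnings + 12_000

if __name__ == "__main__":
    for earnings in (0, 10_000, 20_000, 30_000):
        print(f"earned ${earnings:>6,}: "
              f"means-tested total ${means_tested_income(earnings):>7,.0f}, "
              f"UBI total ${ubi_income(earnings):>7,.0f}")
```

Under the hypothetical phase-out, a worker who earns an extra dollar keeps only fifty cents of it until the benefit disappears entirely; under the UBI, every extra dollar earned is kept.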

Hammond’s point about worker bargaining power is especially important. In today’s work world, with its dramatically-diminished unions and the growth of the “gig economy,” the erosion of employee bargaining power has been severe. Wages have been effectively stagnant for years, despite significant growth in productivity. In 2018, Pew Research reported that “today’s real average wage (that is, the wage after accounting for inflation) has about the same purchasing power it did 40 years ago. And what wage gains there have been have mostly flowed to the highest-paid tier of workers.” With a UBI and single-payer health coverage, workers would have the freedom to leave abusive employers, unsafe work conditions, and uncompetitive pay scales. A UBI wouldn’t level the playing field, but it would sure reduce the tilt.

It is also worth noting that a UBI would have much the same positive effect on economic growth as a higher minimum wage. When poor people get money, they spend it, increasing demand—and increased demand is what fuels job creation and economic growth. If nobody is buying your widgets, you aren’t going to hire people to produce more of them.

Several countries have run pilot projects assessing the pros and cons of UBIs, and American pilot projects are currently underway in Stockton and Oakland, California, and in Mississippi; Gary Mayor Jerome Prince just announced that Gary will be participating in one. A rigorous academic evaluation of an earlier experiment, in Kenya, found that—contrary to skeptics’ predictions—the money had primarily been spent on food, medicine and education, and that there was no increase in use or purchase of alcohol and tobacco. The study also identified “a significant positive spillover on female empowerment,” and “large increases in psychological well-being” of the recipients.

Psychologists have underscored the importance of that last finding. Families with few resources face barriers that can overwhelm cognitive capacities. The psychological impacts from scarcity are real and the outcomes are difficult to reverse. A 2017 article in Forbes reported that when Native Americans opened casinos along the Rio Grande and used the proceeds to deliver basic incomes to the tribal poor, child abuse and crime dropped drastically. Simply handing money to poor people was enormously helpful. Being trapped in poverty, with the stress and insecurities associated with that, is progressively debilitating.

Counter-intuitive as it may seem, a significant body of research supports the importance of a robust social safety net to market economies. As Will Wilkinson, vice-president for policy at the libertarian Niskanen Center, put it in the conservative National Review, contemporary arguments between self-defined capitalists and socialists both misunderstand economic reality. The left fails to appreciate the important role of capitalism and markets in producing abundance, and the right refuses to acknowledge the indispensable role safety nets play in buffering the socially destructive consequences of insecurity.

I may be a nerd, but I’m not delusional: even if a UBI sounds good, the enormous barriers to its adoption are obvious. Politically, shifting from a paternalistic and judgmental “welfare” system to one awarding benefits based upon membership in American society would require a significant culture change, and it would be vigorously opposed by the large number of companies and individuals whose interests are served by America’s dysfunctional patchwork of programs. State-level legislators would resist policy changes that moved decision-making from the state to either the federal or local level. And of course, voters are notoriously suspicious of change, even when it serves their interests. Nevertheless, if survey research is to be believed, public opinion is slowly moving in these directions. In time, and with sufficient moral and strategic leadership, change is possible. First, however, misconceptions must be confronted. (As the old saying goes, it isn’t what we don’t know that’s a problem, it’s what we know that isn’t so.)

Although Americans’ deeply-ingrained belief that people are poor because they made bad choices or didn’t work hard enough continues to be a barrier to a more generous and equitable social safety net, the most significant impediment to passage of a UBI is the same argument that has consistently and successfully thwarted universal healthcare: that America, rich as the country is, simply can’t afford it. This argument flies in the face of evidence from poorer countries with far more robust safety nets. Both the UBI and some version of Medicare-for-All could be funded by a combination of higher taxes, savings through cost containment, efficiencies and economies of scale, the elimination or reform of existing subsidies, and meaningful reductions in America’s bloated defense budget. (I should also note that government already pays some 70% of U.S. healthcare costs through a variety of programs and via coverage for government employees—and that’s without the substantial savings that a national system could achieve. According to one 2014 study, a single-payer system would save $375 billion per year just by removing inefficient administrative costs generated by multiple payers.) But back to the UBI.

First, taxes. I know—dirty word.

Interestingly, public debates over taxes rarely if ever consider the extent to which individual taxpayers actually save money when government taxes them to supply a service. If citizens had to pay out-of-pocket for privatized police and fire protection or private schooling, the expense would vastly exceed the amounts individual households pay in taxes for those services. Low-income citizens, of course, would be unable to afford them.

There is a reason that debates about taxes rarely include consideration of the saving side of the ledger; the American public is positively allergic to taxes, even when a majority financially benefits from them. If low-and-middle income American families did not have to pay out-of-pocket for health insurance, and could count on receiving a stipend of $1000/month, most would personally be much better off, even if some of them experienced tax increases.

Tax increases, of course, are levied against people capable of paying them. Americans used to believe in progressive taxation, and not simply to raise revenue. Taxes on the very wealthy were originally conceived as correctives, like tobacco taxes, that should be judged by their societal impact as well as their ability to generate revenue. High tax rates on the rich were intended to reduce the vast accumulations of money that serve to give a handful of people a level of power deemed incompatible with democracy. Of course, in addition to reducing inequality, progressive taxation does raise money. Elizabeth Warren proposed levying a 2 percent annual tax on household net worth above $50 million, with the rate rising to 3 percent on wealth over $1 billion. Warren’s plan would affect a total of just 75,000 households, but would raise $2.75 trillion over 10 years. Representative Alexandria Ocasio-Cortez has called for raising the marginal federal tax rate on annual incomes over $10 million. Both proposals reflect a growing consensus that the very rich are not paying their fair share.
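
For readers who like to see the arithmetic, here is a minimal sketch (in Python) of how a two-bracket wealth tax of the kind described above could be computed. It assumes the 2 percent rate applies only to the slice of net worth above $50 million and the 3 percent rate only to the slice above $1 billion; the function and the sample figures are illustrative, not a statement of the actual proposal’s mechanics.

```python
# Illustrative two-bracket wealth tax; thresholds and rates come from the
# description above, but the bracketed treatment is an assumption.

def annual_wealth_tax(net_worth: float) -> float:
    """Annual tax owed under the illustrative 2%/3% schedule."""
    tax = 0.0
    if net_worth > 50_000_000:
        # 2% on the slice of wealth between $50 million and $1 billion
        tax += 0.02 * (min(net_worth, 1_000_000_000) - 50_000_000)
    if net_worth > 1_000_000_000:
        # 3% on the slice of wealth above $1 billion
        tax += 0.03 * (net_worth - 1_000_000_000)
    return tax

if __name__ == "__main__":
    for worth in (40_000_000, 100_000_000, 2_000_000_000):
        print(f"net worth ${worth:,}: annual tax ${annual_wealth_tax(worth):,.0f}")
```

A household worth $40 million would owe nothing; one worth $100 million would owe roughly $1 million a year under these illustrative assumptions.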

There’s also growing anger directed at the generosity of various tax “loopholes” that allow immensely profitable corporations to reduce their tax liabilities (or escape them completely). In 2018, Amazon, which reported $11.2 billion in profit, paid no federal income tax and received a rebate of $129 million. The use of offshore tax havens and other creative methods of eluding payment devised by sophisticated tax lawyers employed by the uber-wealthy is an ongoing scandal.

Both economic research and real-world experiments like Governor Sam Brownback’s tax cuts in Kansas confirm that, contrary to the emotional and ideological arguments against imposing higher taxes on wealthy individuals, high marginal rates don’t depress economic growth and cutting taxes doesn’t trigger an increase in either job creation or economic growth. In 1947, the top tax rate was 86.45% on income over $200,000; in 2015, it was 39.60% on income over $466,950. Over that period, researchers have found very little correlation between economic growth and higher or lower marginal rates. In 2012, the Congressional Research Service published a study that rebutted the presumed correlation between tax rates and economic growth.

It isn’t just taxes that need to be adjusted. We need to significantly reduce fossil fuel subsidies, farm subsidies and our bloated military budget—and we need to stop subsidizing shareholders of immensely profitable companies like Walmart and McDonald’s. If a UBI allowed workers to cover basic essentials, taxpayers wouldn’t need to supplement the wages of low-wage workers. A Senate panel recently reported that nearly half of workers making less than $15 an hour currently rely on public assistance programs costing taxpayers $107 billion each year.

Climate change is already affecting America’s weather, increasing the urgency of efforts to reduce carbon emissions and increase the development and use of clean energy sources. Yet the United States spends twenty billion dollars a year subsidizing fossil fuels. That includes $2.5 billion per year specifically earmarked for searching out new fossil fuel resources, at a time when development of those resources is arguably suicidal. Permanent tax breaks to the US fossil fuel industry are seven times larger than those for renewable energy. Research tells us that, at current prices, the production of nearly half of all U.S. oil would not be economically viable but for federal and state subsidies.

The Obama administration proposed to eliminate 60% of federal fossil fuel subsidies. That proposal went nowhere–perhaps because during the 2015-2016 election cycle oil, gas, and coal companies spent $354 million on campaign contributions and lobbying. The industry received $29.4 billion in total federal subsidies in those same years–an 8,200% return on investment. We waste billions of dollars propping up an industry that makes climate change worse. Eliminating these subsidies would free up funds for other uses, including a UBI.

Farm subsidies represent another $20 billion annually. Arguments for and against terminating these subsidies are more complicated than for fossil fuel subsidies, but the case for means-testing them is strong. In 2017, the USDA released a report showing that approximately half the money paid out went to farmers with household incomes over $150,000. That means billions of dollars, every year, go to households with incomes nearly three times the median U.S. household income, which was $55,775 that year.

Farm subsidies were created during the Depression to keep family farms afloat and to ensure a stable national food supply. Since 2008, however, the top 10 farm subsidy recipients have each received an average of $18.2 million–that’s $1.8 million annually, $150,000 per month, or $35,000 a week–more than 30 times the average yearly income of U.S. families. Surely the formula governing distribution of those subsidies could be changed to ensure that millionaires aren’t benefitting from a program established to protect family farms during times of economic distress. According to Forbes, since 2008, the top five recipients of farm subsidies took in between $18.6 million and $23.8 million apiece. Some of us are old enough to remember that Richard Lugar consistently criticized farm subsidies as wasteful and even counterproductive and offered legislation to limit them; his legislation also went nowhere.

Making the case for eliminating fossil fuel subsidies or limiting farm subsidies is much simpler than advocating for strategic cuts in America’s bloated military budget. Most citizens understand why government should not be providing billions of dollars to support companies that make climate change worse, or adding to the bottom lines of massively-profitable corporate farms. Efforts to cut the military budget, enormous though it is, encounter genuine anxieties about endangering national security, as well as more parochial concerns from lawmakers representing districts with economies heavily dependent upon military bases or contractors. That may explain why U.S. military spending in 2017 was over 30% higher in real terms than it was in 2000. The United States spent $716 billion in 2019; annually, we spend more than twice what Russia, China, Iran and North Korea spend collectively.

Critics of the military budget make three basic arguments: the budget is much bigger than threats to U.S. security require; very little of the money appropriated supports efforts to fight the terrorist groups that pose the real threat in today’s world; and the countries that might threaten American interests militarily are historically few and weak. (Russia, for example, has an energy-dependent economy roughly the size of Italy’s. According to America’s intelligence community, Russian efforts to destabilize us are made through social media, assaults by “bots,” and hacks into vulnerable data repositories, not military action.)

Much of what America spends on its military goes to support bases and troops that aren’t even suited to the conduct of modern-day defense. It would also be worth investigating whether the existence of this enormous military capacity creates an incentive to substitute military intervention for the exercise of diplomacy and soft power (as the old adage warns, when the only tool you have is a hammer, every problem looks like a nail). We appear to be supporting a military establishment that’s prepared to fight the last war, not the next one. Several experts argue that the U.S. could safely cut the military budget by 25%.

We should address these subsidies in any event, but when it comes to paying for a UBI, there are a number of ways it might be funded, including “cashing out” all or most of the existing 126 welfare programs that currently cost taxpayers $1 trillion a year. The UBI would make most of these programs unnecessary.

America’s problem is a lack of political will to confront the special interest groups that currently feed at the government trough, not a lack of realistic funding mechanisms.

A girl can dream….


City And State

In the wake of John Kerry’s 2004 electoral defeat, the editors of The Stranger, an alternative newspaper published in Seattle, published a wonderful rant. The editors looked at the red and blue election map, and pointed to the (visually obvious) fact that even in the reddest states, cities were bright blue. America’s urban areas comprised what they called an “urban archipelago” that reflected political values and attitudes vastly different from those of rural America.

Academic researchers have since confirmed that observation: virtually every major city (100,000 plus) in the United States of America has a political culture starkly different from that of the less populous areas surrounding it. As I wrote in a post back then, the problem is that the people who live in densely populated cities have demonstrably less political voice than their country cousins. Most states don’t really have “one person, one vote,” and the result is that rural voters are vastly overrepresented. State taxes paid by city dwellers go disproportionately to rural areas, and the people who populate state legislatures have gerrymandered voting districts to keep things that way.

Representative government wasn’t genuinely representative then, and in 2021, the situation hasn’t improved.

Earlier this month, Governing Magazine noted the same problem, in an article titled “Why Cities Have More People But Less Clout.”

Gun violence is on the rise in Philadelphia. In January, homicides jumped by a third over the same month in 2020, which itself had been the deadliest in three decades. Non-fatal shootings increased last month by 71 percent.

City officials, wanting to address the issue, have repeatedly come up with gun control measures they believe will save lives. Their efforts, however, have gone nowhere. Pennsylvania, along with more than 40 other states, blocks localities from passing their own firearms regulations.

Last fall, Philadelphia sued the state to end its gun pre-emption law. “If the Pennsylvania General Assembly refuses to do anything to help us protect our citizens,” said Darrell Clarke, the president of the Philadelphia city council, “then they should not have the right to prevent us from taking the kinds of actions we know we need to keep our residents safe from harm.”

Good luck with that. Courts have repeatedly upheld Pennsylvania’s power to block local gun control laws. Across the country, states have consistently pre-empted localities on a broad range of issues, from minimum wage increases and paid sick leave requirements to bans on plastic bags or removal of Confederate monuments.

Sounds pretty familiar to us Hoosiers…

The article reports what most of us know–that the majority of the nation’s economic growth has been concentrated in major cities that are the primary economic engines of their states. You would think that would make them deserving of support–but state officials pretty consistently opt to keep money flowing from those cities to rural, less prosperous areas of the state. Cities send far more tax dollars to the state than they receive back in spending.

As cities are prospering (or at least were, before the pandemic and the great migration out of downtown offices), they have been moving in an increasingly progressive direction. Only three of the nation’s 25 largest cities have Republican mayors. Meanwhile, a majority of state legislatures are controlled by the GOP. That creates a disconnect that leads to frequent pre-emption, particularly in Republican states in the South, Southwest and Midwest.

It isn’t just a partisan political gap; the urban/rural divide “reflects and is reinforced by other overlapping differences, including cultural attitudes, education levels, class and race.”

Democrats can compete and win statewide in states including Michigan, North Carolina, and Wisconsin — and now Arizona and Georgia — but they’re shut out of power at the legislative level in all those places. Pennsylvania falls into this category as well.

The article acknowledges a long tradition of outstate resentment of the dominant city–a resentment made stronger by the partisan split.

“They don’t have any reason to take into account the interest of the urban population in making legislation, and they have a lot of interest in not doing so,” says Schragger, the UVA law professor. “Particularly on cultural issues and fiscal issues, it pays for these legislators to resist giving cities more home-rule powers, because their constituents tend to be opposed even to local policies that are contrary to national conservative positions.”

The article is further evidence of America’s undemocratic move to minority rule, buttressed by giving every state two senators, irrespective of population count (the recent Republican Senate majority, which refused to rein in Trump’s abuses after his first impeachment, was elected with 20 million fewer votes than the Democratic minority), and by the anti-majoritarian operation of the Electoral College.

How we give America’s urban majority at least an equal say with its rural minority is an increasingly critical question.


Texas

Early in my academic career, I really came to appreciate Texas. I taught Law and Public Policy, and on those rare occasions when Indiana failed to provide a “teachable moment”–an example of truly awful policy–I could always count on Texas.

I still remember Molly Ivins’ wonderful explanation of the logic of the Texas “lege.” She noted that when gun deaths exceeded highway deaths, Texas lawmakers sprang into action–and raised the speed limit.

Ted Cruz (known around Twitter and Facebook these days as “Fled” Cruz) is a perfect example of the sort of Republican Texas routinely elects–arrogant, bigoted, and thoroughly full of himself. While he took off for Cancun, Beto O’Rourke was setting up telephone outreach to elderly citizens who’d lost their power and access to clean drinking water, and AOC, the much-hated “socialist” who doesn’t live anywhere near Texas, was raising two million dollars for relief efforts. (I note that, as of yesterday, it was up to five million…)

To be sure, Cruz has lots of company. Former Governor and Energy Secretary Rick Perry (who was Governor in 2011 and ignored experts who recommended winterizing the power grid) insists that Texans prefer an occasional apocalypse to the indignity of federal regulation, and a “compassionate” Republican mayor had to resign after telling freezing people who’d lost power and water to stop whining and get off their lazy asses and take care of themselves.

I don’t think it is at all unfair to claim that these buffoons are perfect representatives of today’s GOP–a party that exhibits absolutely no interest in actual governing. I agree entirely with Ryan Cooper, who wrote in The Week that the blizzard nightmare is “Republican governance in a nutshell.”

After describing Cruz’s attempted getaway, Cooper wrote that

what Cruz did is emblematic of the Republican Party’s mode of governance. The reason Cruz felt comfortable leaving Texans to freeze solid on the sidewalks of Houston is the same reason the Texas power grid crumpled under the winter storm. Theirs is a party in which catering to the welfare of one’s constituents, or indeed any kind of substantive political agenda, has been supplanted by propaganda, culture war grievance, and media theatrics. Neither he nor anybody else in a leadership position in the party knows or cares about how to build a reliable power grid. They just want to get rich owning the libs….

People have known for decades how to winterize electrical infrastructure — after all, there is still power in Canada and Finland. The reason those investments haven’t been made in Texas is because it would have cost a lot of money, and nobody wanted to pay for it — especially because the deregulated Texas energy grid makes it hard to pay for upgrades or extra capacity.

The reason the Texas grid isn’t connected to the national system is pure GOP ideology; the grid was purposely kept within the state in order to avoid federal regulation. (It’s notable that a couple of small parts of the state that aren’t connected to the Texas-only grid–places that were subject to those hated regulations–have mostly been fine.)

Unfortunately for Texas politicians, it’s hard to blame the “libs” for this debacle, although they’ve tried; after all, Republicans have run the state since 1994–and they’ve pretty much been owned by the state’s fossil fuel companies–especially Governor Abbott.

When the Texas power grid buckled under the strain of worse-than-expected winter cold, Texas Gov. Greg Abbott (R) went on Fox News and blamed frozen wind turbines for what was mostly a problem with natural gas–fueled power supply. Then he savaged the Electric Reliability Council of Texas (ERCOT), which manages the Texas-only power grid. But he has notably “gone easier on another culprit: an oil and gas industry that is the state’s dominant business and his biggest political contributor,” The Associated Press reports.

Abbott, in office since 2015, has raised more than $150 million in campaign contributions — the most of any governor in U.S. history — and “more than $26 million of his contributions have come from the oil and gas industry, more than any other economic sector,” AP reports. In a news conference Thursday, Abbott mostly blamed ERCOT for assuring state leaders Texas could handle the storm.

ERCOT, of course, is managed by people appointed by Abbott…

Today’s Republicans may not be good at–or interested in–governing, but they are absolute masters of shamelessly lying and blaming others when they are threatened with the consequences of an ideology that translates into “let them eat cake.”
