Majority Rule

Majority rule in our democratic republic is more complicated than we like to think.

For one thing, our particular form of government carves out matters that are specifically insulated from what the Founders called the “passions of the majority”–the individual liberties enumerated and “reserved to the people” by various provisions of the Bill of Rights. For another, in those areas where majority opinion is supposed to count, our mechanism for determining what a majority of citizens really wants is the vote–and not every citizen entitled to cast a vote does so. (The differences between what popular majorities want and what gets enacted can often be seen by comparing polling and survey research with the legislation passed by victorious candidates.)

And don’t get me started on the Electoral College.

Then there’s the distortion regularly provided by the media–very much including Twitter, Facebook and the like. We too often assume that the loudest and most persistent voices reflect the opinion of majorities–and that is not a well-founded assumption.

Take, for example, the issue of vaccine mandates.

A recent report by the Brookings Institution’s William Galston suggests that requiring vaccination is a lot more popular than we might imagine if we only listened to the hysterical purveyors of misinformation and conspiracy theories. (Recently, those vaccine deniers were accurately–if intemperately–labeled “assholes” by the Mayor of West Lafayette, Indiana. I don’t know him, but I’m pretty sure I’d really like him.)

Galston did a deep dive into the data. Not surprisingly, he found that unvaccinated Americans were less concerned about COVID than those who’d had the sense to get vaccinated.

In the face of massive evidence to the contrary, more than half of unvaccinated adults regard getting vaccinated as a bigger risk to their health than is getting infected with the coronavirus. Only one in five of the unvaccinated say that the spread of the delta variant has made them more likely to get vaccinated. These data do not support hopes that the recent outbreak will suffice to increase vaccination rates enough to bring the pandemic under control.

The data also reflects surprisingly robust support for vaccine mandates.

Since the beginning in March 2020, government’s response to the pandemic has occasioned intense controversy, much of it along partisan lines. Although the level of conflict remains high, recent events have solidified public support for the most intrusive policy government can undertake—mandatory vaccinations. According to a survey conducted by the Covid States Project, 64% of Americans now support mandatory vaccinations for everyone, and 70% support them as a requirement for boarding airplanes. More than 6 in 10 say that vaccinations should be required for K-12 students returning for in-school instruction as well as for college students attending classes at their institutions. And the most recent Economist/YouGov survey found that more than 60% support mandatory vaccinations for frontline workers—prison guards, police officers, teachers, medical providers, and the military—and for members of Congress as well…

Solid majorities of every racial and ethnic group support vaccine mandates, as do Americans at all levels of age, income, and education.

The data also supports the growing recognition by sane Americans that the GOP has devolved into a cult of anti-science, anti-evidence, crazy folks: only 45% of Republicans support vaccine mandates, compared to 84% of Democrats.

When I sent my children to school, I was required–mandated–to provide evidence that they’d been vaccinated, and thus did not threaten the health and safety of the other children with whom they would be taught. When I was young myself, Americans lined up with gratitude to receive the polio vaccine that would allow them to avoid the alternatives–death, or imprisonment in an iron lung.

When providing for “the General Welfare” requires rules–mandates–a majority of us understand that such mandates not only do not infringe our liberties, but actually give us more liberty–allowing us to go about our daily lives without the danger of infection (or the need to wear a mask).

Vaccine mandates are supported by medical science, by law, by morality, and by a majority of Americans. We periodically need to remind ourselves that “loudest” doesn’t equate to “most”–and that a fair number of the hysterical people shouting about “personal freedom” can’t define it and don’t want their neighbors to have it.


The Robots Are Coming…

When I opened my email a few days ago, the first thing that popped up was an article from the Brookings Institution titled “The Robots Are Ready as the COVID-19 Recession Spreads,” predicting that a coronavirus-related downturn will increase the rate at which American industry invests in labor-replacing automation.

As I have previously argued, jobs don’t matter simply because most of us need to put food on the table. Having a job–even a job we dislike–gives most of us a sense of purpose and identity. (There is a reason so many people die shortly after retiring.) In “The Truly Disadvantaged,” William Julius Wilson noted the significant differences between neighborhoods where residents are poor but employed and neighborhoods where residents are poor and jobless.

The long-term trend was worrisome well before the advent of the coronavirus: American economic mobility and job creation had already begun to slow, largely as a result of policies favoring larger firms over the entrepreneurial start-ups that were once responsible for the creation of most new jobs. Numerous studies have documented what Brookings calls “a steady and significant increase in consolidation.” Thanks to anemic antitrust enforcement, the number of so-called “mega-mergers” has increased, and as the market power of these huge companies grows, competition decreases. The under-enforcement of antitrust laws has reduced entrepreneurship, increased predatory pricing and economic inequality, and concentrated economic growth in fewer hands.

Rather than the vigorous competition that characterizes healthy markets, we have increasingly moved from capitalism to corporatism, or crony capitalism, in which government shields favored industries and companies from competitive pressures rather than acting as the guarantor of a level playing field.

Until recently, people expressing concerns about job losses have focused their criticisms on the outsourcing of manufacturing to low-wage countries, ignoring what is by far the biggest contributor to job loss–automation, the replacement of workers by machines.

A 2018 study by Ball State University found that, since 2000, nine out of ten manufacturing job losses have been attributable to automation. The Pew Research Center, meanwhile, asked approximately 1,900 experts to opine on the impact of emerging technologies on employment; half of those questioned predicted the displacement of significant numbers of both blue- and white-collar workers by robots and digital agents, and warned that those displacements will lead to serious consequences: larger increases in income inequality, masses of people who are unemployable, and breakdowns in the social order.

Forecasts varied widely. One analysis, by the Organization for Economic Cooperation and Development, predicted that ten percent of the jobs in advanced economies will be automated, while scholars at Oxford University warned that 50% of American jobs are at risk. Obviously, no one can say with confidence how many jobs will be lost, or which workers will sustain those losses, but technologies now in development threaten millions.

Think about the numbers. There are 3.5 million professional truck drivers in the United States, and another 1.7 million Americans drive taxis, Ubers, buses and delivery vans for a living. Self-driving cars, which are already being road-tested, could put them all in the ranks of the unemployed.

Think skilled workers are immune? Think again. Reports show accelerating automation of jobs held by skilled knowledge workers engaged in data-driven decision-making. Between 2011 and 2017, Goldman Sachs replaced 600 desk traders with 200 coding engineers. Even medical professionals are at risk: in 2017, Enlitic, a medical start-up, reported that its AI algorithm “outperformed four radiologists in detecting and classifying lung nodules as either benign or malignant.” In 2016, the World Economic Forum projected a total loss of 7.1 million jobs to automation, including jobs in advertising, public relations, broadcasting, law, financial services and health care.

Automation will obviously create jobs as well as destroy them, but that will be cold comfort to that 55-year-old truck driver with a high-school education–he isn’t going to move into a new position in Informatics.

What does the current pandemic have to do with this long-term trend? According to Brookings:

Robots’ infiltration of the workforce doesn’t occur at a steady, gradual pace. Instead, automation happens in bursts, concentrated especially in bad times such as in the wake of economic shocks, when humans become relatively more expensive as firms’ revenues rapidly decline. At these moments, employers shed less-skilled workers and replace them with technology and higher-skilled workers, which increases labor productivity as a recession tapers off.

America wasn’t prepared for a pandemic, and we won’t be prepared for the civic unrest exacerbated by widespread joblessness. We are going to require skilled leadership, and that leadership will not be provided by the Party of the Past, led by a mentally ill ignoramus.


All Cost, No Benefit

Every city of any size, and every state, has a government agency charged with “economic development.” Economic development is almost always a euphemism for luring new employers to the city or state.

A productive discussion about what a genuine effort to improve the local economy should and should not entail is long overdue. Such a re-examination remains unlikely, but here and there, investigations of current practices do remind us that not everything we call an “incentive” deserves the name.

Which brings us to Wisconsin, Scott Walker and Foxconn. A report from the Brookings Institution recently described that embarrassing boondoggle:

In 2017, the state of Wisconsin agreed to provide $4 billion in state and local tax incentives to the electronics manufacturing giant Foxconn. In return, the Taiwan-based company promised to build a new manufacturing plant in the state for flat-screen television displays and the subsequent creation of 13,000 new jobs.

It didn’t happen. Those 13,000 jobs never materialized, and plans for the manufacturing plant have been consistently scaled back. Even if the project had gone through as planned, there is no way the Foxconn subsidy would have made money for the state, or provided earnings benefits for residents that exceed its costs. It now appears that few of Foxconn’s promises will be fulfilled, even though local governments have gone into debt over the project.

The Foxconn “deal” was widely panned at the time, but as Brookings reports, criticisms of that effort were mostly based on the enormous size of the incentives being offered, not on the underlying concept. Yet since 1990, the average size of these business incentives has tripled, threatening public services and the social safety net.

Even when the incentive being offered is comparatively modest, however, research doesn’t confirm the underlying assumptions of the approach. At least 75% of the time, the incentives don’t really affect the relocation decision one way or the other.

They’re all cost and no benefit. Furthermore, even when incentives do tip a location decision, they do not pay for themselves. They may create new jobs, but frequently they also bring in new workers from outside the city or state, which raises costs to public services that offset at least 90% of any increased revenue…On average, only 10-30% of new jobs go to state residents who are not already employed.

Are there incentives that would work? Brookings says there are, and offers the following checklist:

Do the incentives target the right businesses?

Will the business provide multiplier effects? When the business buys from local suppliers, it helps increase jobs at those companies. Workers employed at the business, too, will buy from local retailers, increasing those jobs.

Is the business “traded”—i.e., selling its goods and services outside of the state or community? Incentives to non-tradeable firms will just displace jobs at other local non-tradable firms.

Is the real job multiplier accurately calculated? Multipliers can be overstated if they ignore the increased local costs that accompany business growth.

Is the business locally owned? Locally owned firms spend more of their revenue locally, benefiting the hometown economy.

Do the incentives target the right areas?

Incentives should target economically distressed local areas, with more available labor that is not employed. That way, the share of new jobs that go to local residents can be two to three times as great, compared to already-booming areas.

Do the incentives target high-tech businesses in an area with an above-average high-tech base? High-tech businesses have additional multiplier effects because they support and spawn other local firms whose workers and ideas flow from one to another. But this only works when the area has a sufficiently large “cluster” of tech firms to build from.

Are they the right type of incentives?

Are they structured so cash incentives occur upfront? Upfront incentives are more cost-effective in affecting business location decisions, because they are more relevant to business decisionmakers who focus on the short term.

Do they include enticements/requirements to hire locally? For example, customized training programs can encourage firms to hire the local unemployed.

Do they include a healthy share of customized business services, or is it all cash giveaways? Business services such as job training, business advice to smaller businesses, and new transportation infrastructure can have job-creation effects per dollar that are five to ten times greater than tax or cash incentives.

Do the incentives avoid robbing Peter to pay Paul? If governments pay for incentives by decreasing public spending on education, training, or infrastructure, the negative economic development effects of those budget cuts may exceed any benefits from the incentives.

Finally, is there a decent model to accurately assess the impact of the incentive?

There are practical ways to evaluate incentives. We can compare assisted with unassisted firms, or assisted areas with unassisted areas. There are good estimates of how many location decisions will be swayed by a cash incentive package of a particular size, and how many jobs per dollar will be created by a high-quality customized job training program. State and local government researchers can combine these evaluation approaches with models of local labor markets and fiscal impact to see whether a specific incentive package’s benefits are likely to exceed its costs.

Finding the right answer depends on asking the right questions–not on constantly sweetening the pot.


Credit Where Credit Is Due

One of the unfortunate effects of our corrupt and paralyzed political structure is the “drowning out” effect, sometimes described as Washington “sucking the oxygen out of the room.” While our attention is fixated on the more dramatic consequences of our national government’s “brokenness,” we fail to notice the harms being done by the multitude of problems that government is simply not fixing.

One of those is the way creditworthiness is measured.

There’s no doubt that credit card companies charge excessive rates of interest. But as scholars at the Brookings Institution point out, simply legislating a cap would actually compound the problem.

When does the interest rate a lender charges cross the line from economically justified to immoral? Societies have struggled with this question since biblical times. Last week, Sen. Bernie Sanders (I-Vt.) and Rep. Alexandria Ocasio-Cortez (D-N.Y.) took a crack at this puzzle, proposing to cap credit card interest rates at 15 percent. They’re concerned that the U.S. credit system traps working families with unsustainable debt. We share their concern, but their proposal uses a blunt instrument to attack a nuanced problem.

The Loan Shark Prevention Act, as the new legislation is called, is likely to hurt the people it’s designed to help, driving the market away from consumers with low credit scores. Some people may have their interest rates reduced, but many would no longer have access to credit at any price. Banks have been clever in figuring out how to hide credit in fees, as anyone who has paid $35 for an overdraft knows.

Instead, the authors propose making affordable credit accessible to a much larger group, by fixing what they identify as “the flawed scoring system that allocates credit.”

Our current system decides who gets credit and at what price using algorithms that analyze a person’s credit history and calculate a credit score. FICO, the most common credit score, employs a range between 300 and 850. There is no universally accepted definition of what constitutes a prime or subprime credit score but, generally, people with scores above about 680 are rewarded with cheap credit and high borrowing limits. Those classified as either near-prime or subprime, whose scores largely fall below 680, have a tougher time accessing and paying for credit.

The apparent objectivity of the algorithm masks a whole host of issues. A peek behind the credit-scoring curtain reveals that, as in “The Wizard of Oz,” there are humans feeding imperfect information into the machine. You could be the most creditworthy person on the planet, but if you lack a credit history, are a young adult or a recent immigrant, or had financial hardship in the past five years, your score will be low. Credit reports are rife with errors: one out of five Americans has a material error on their score.

I recently encountered this precise circumstance with my granddaughter-in-law: she is young and had virtually no credit history. It wasn’t bad credit, it was no credit, because she had been prudent and avoided debt. No credit became a real problem when she and my grandson applied for a mortgage. (Even more maddening, one of the three reporting agencies kept telling the bank her credit was “frozen”–whatever that means–but continued to insist to her, during her multiple calls to correct the issue, that it wasn’t.)

The Brookings scholars write that “Congress should start examining this system and aggressively pushing for its improvement.”

Lawmakers should push for credit-scoring formulas that take a wider range of data into consideration. Paying a mortgage on time improves your credit score, but paying your rent on time does not, because mortgages are tracked and rents generally are not. That’s just not fair…

The Consumer Financial Protection Bureau estimates that 45 million Americans lack the data that credit bureaus use to create a credit score. If you don’t have a score, it can be very hard to get a loan, rent an apartment or persuade an employer to hire you. Credit scores have become an essential component of what Princeton sociologist Frederick Wherry calls “financial citizenship” — the ingredients necessary to participate fully in the economy and civil society.

If we had a functioning Congress, this is one of the multiple tasks to which they should attend. But of course, we don’t. Right now, Mitch McConnell (aka the most evil man in America) is preventing the Senate from even considering one hundred bills that have been passed by the House.

We have a legislature that is incapable of doing anything, and an Administration trying its best to undo what was accomplished in the past. We aren’t even a banana republic: we’re a failed state.


If Demographics Are Destiny…

The most encouraging headline I’ve come across lately was on a Brookings Institution study titled “Trump Owns a Shrinking Republican Party.”

It’s worth remembering the central point of the study when we read that a majority of Republicans remain adamant in their support of Trump–that’s a majority of a smaller and smaller number of voters.

The opening paragraphs of the report confront the puzzle of Trump’s indifference to what has typically been the first goal of political candidates and parties alike: expanding one’s base.

Most American presidents come into office seeking to expand their support beyond their most loyal voters. But among the many peculiarities of the Trump presidency is his lack of interest in expanding his base, a fact that is even more surprising for someone who lost the popular vote by nearly 3 million and carried his key electoral college states by less than 100,000 votes. The story of Trump and his base has two sides.

The first “side” is what is most often reported: the devotion of Trump’s base. These are the people who would vote for him even if he shot someone in broad daylight on Fifth Avenue, as he famously boasted.

Loyalty to Trump among the Republican base is looking so strong that it led Republican Senator Bob Corker (R-Tenn.), a Trump critic who is not running again, to tell reporters “It’s becoming a cultish thing, isn’t it?”

Indeed it is. (As regular readers of this blog know, I have some fairly strong and not at all complimentary opinions about why people join that cult.)

The other “side” of the equation is the continuing erosion of party identification, especially Republican identification.

As the following graph of Gallup polls indicates, both political parties find themselves less popular now than they did in 2004 with a substantial rise in those who identify as independents. For the Democrats, party identification peaked in Obama’s first term and then dropped in his second term. For Republicans, party identification took a sharp drop at the end of George W. Bush’s second term and never really recovered. The trend seems to have taken another drop after Trump’s election.

How can we explain what looks to be a long-term decline for the Republican brand? Age, for one thing. From the beginning of the Trump administration the oldest Americans, those aged 50 and over, have consistently given Trump his highest approval ratings while young people aged 18–29 have consistently given him his lowest approval ratings.

The study concludes–not unreasonably–that a political party unable to attract young people, especially when a generation is as big as the Millennial generation, is not a party with a very bright future.

But it isn’t only young people. We don’t have data–at least, I’m unaware of any–that gives us a handle on the numbers of disaffected “old guard” Republicans, the good-government, civic-minded folks I used to work with, who are horrified by what their party has become. The Steve Schmidts and other high-profile “never Trumpers” are only the tip of that iceberg.

Of course, the GOP establishment is aware of these demographics; those dwindling numbers are the impetus for the party’s constant efforts to rig the system–to gerrymander, impose draconian voter ID requirements, purge registration rolls and generally do whatever they can to suppress turnout.

They know that members of the cult will vote, no matter what. If the rest of us–however numerous–don’t, the current (profoundly un-American) iteration of what used to be a Grand Old Party will retain power.

You don’t have to love the Democrats to find that prospect a chilling one.
