Cities

A reader recently sent me an article from Governing addressing an issue near and dear to me: are people moving out of cities in significant numbers? Has the pandemic increased those numbers?

I’d seen a couple of New York Times articles about an exodus from New York City to “healthier” outlying areas, and of course, there is an ongoing debate about the sustainability of the national population shift from small-town America to the nation’s cities. The article addressed two highly pertinent questions: are lots of people really leaving cities, and why do people move anywhere?

As most readers of this blog have figured out by now, I’m a “city girl.” (Well, “girl” might be stretching things…) I’m a huge fan of urban life, and a believer in the social and intellectual benefits of density and diversity, so I was interested in an article that looked at what the evidence suggested, rather than what various theories have propounded. And the article actually started by distinguishing theory from reality:

There’s an old joke about economists that I’ve always liked. A junior professor goes to his senior colleague with a brilliant new idea. The older man dismisses it. “That may be fine in practice,” he sniffs, “but it will never work in theory.”

Economists are like that, at least many of them. They don’t like to have reality intrude on their abstractions. One of the best examples has to do with mobility. Years ago, I read an article by a prominent economist downplaying the problem of a small-town factory that spews out pollution. What’s the big deal, he asked. There must be another town nearby without a soot-belching factory. The residents of the first town could just move over there. Pretty soon the polluter would get the idea.

It works in theory. But it isn’t the way most people behave. They don’t like the idea of uprooting themselves. This may be because they don’t want to leave their friends and relatives, because they cling to hometown memories and traditions, or maybe because they just don’t feel like cleaning out the garage. In any case, they don’t move. Or if they do, they don’t go far away.

The article acknowledged the predictions that have been worrying me–the economic forecasts of an “outpouring of affluent Americans from virus-plagued cities to safer rural climes.” One libertarian predicted a flood of “fresh college graduates and new parents” lighting out for Mayberry, accompanied by employees no longer tethered to corporate buildings downtown. (This rosy scenario overlooks the fact that COVID is currently ravaging the nation’s “Mayberries.”)

So what does the evidence show?

There has been an outflow from many urban neighborhoods, but it hasn’t been very large. Last June, a careful study by the Pew Research Center found that 3 percent of Americans reported moving permanently or temporarily for reasons related to the coronavirus. In November, the number was up to 5 percent. That’s not a trivial number of people, but it’s far short of a national exodus…

It’s also interesting to see where those folks are going. The largest destination of people leaving San Francisco last year was across the bay, to Oakland and surrounding Alameda County. The three next most common destinations were all in the Bay Area as well. Other targets were Denver; Portland, Oregon; and Austin, Texas–not Mayberry.

Most people who did move cited economic reasons–job loss, especially–not the pandemic.

Most cities that lost population in 2020 didn’t lose it because of people leaving. They shed population because newcomers weren’t coming. In New York City, according to a McKinsey study, the ratio of arriving workers to departing ones was down 27 percent. This, too, is only common sense. Why would you move into New York when jobs were disappearing there? Similar numbers apply to Los Angeles, Boston and Seattle.

This has the makings of a significant event. Nearly all the big cities that gained or held onto population numbers in the past decade did so because of immigrants arriving from outside the United States. If they stop coming for an extended length of time, big-city populations could drop significantly even if the mass exodus continues to be a myth.

The racist assault on immigration has had an effect on cities. As the article notes, America’s most vibrant cities have become enclaves of affluent professionals and modestly paid service workers–the bulk of whom have been immigrants. If the immigrants stop coming, we’re likely to see a shortage of urban workers and a decline in demand for housing in many urban neighborhoods. That could make central cities more attractive, and not just to immigrants–it could fuel added arrivals by young professionals. Or…??

I’m sure economists will have a theory…

Comments

How The World Really Works…

A few days ago, a reader sent me a link to this article by Rutger Bregman from the now-defunct publication The Correspondent. It’s important.

His premise is that the world is in the midst of the biggest social shakeup since the Second World War–one that will mark the end of neoliberalism, and see the emergence of far more robust government.

As evidence of this impending change, the article quoted a 2020 editorial from the British-based Financial Times.

The Financial Times is the world’s leading business daily and, let’s be honest, not exactly a progressive publication. It’s read by the richest and most powerful players in global politics and finance. Every month, it puts out a magazine supplement unabashedly titled “How to Spend It” about yachts and mansions and watches and cars.

But on this memorable Saturday morning in April, that paper published this:

“Radical reforms – reversing the prevailing policy direction of the last four decades – will need to be put on the table. Governments will have to accept a more active role in the economy. They must see public services as investments rather than liabilities, and look for ways to make labour markets less insecure. Redistribution will again be on the agenda; the privileges of the elderly and wealthy in question. Policies until recently considered eccentric, such as basic income and wealth taxes, will have to be in the mix.”

Bregman points out that economic changes don’t emerge “out of the blue,” noting that there had been a time – some 70 years ago – when defenders of free market capitalism were the radicals. The system we have now (if you can dignify it by calling it a system) began with a small think tank established in the Swiss village of Mont Pèlerin by self-proclaimed “neoliberals” like Friedrich Hayek and Milton Friedman.

Friedman memorably wrote that “Only a crisis – actual or perceived – produces real change. When that crisis occurs, the actions that are taken depend on the ideas that are lying around.”

The thesis of the article is that economic crises of the 1970s ushered in neoliberalism (the ideas that were “lying around”) and a series of current crises–beginning with the fall of Lehman Brothers in 2008 and extending through COVID-19– will trigger changes based on the very different ideas that are now “lying around.”

Unlike the 2008 crash, the coronavirus crisis has a clear cause. Where most of us had no clue what “collateralised debt obligations” or “credit default swaps” were, we all know what a virus is. And whereas after 2008 reckless bankers tended to shift the blame to debtors, that trick won’t wash today.

But the most important distinction between 2008 and now? The intellectual groundwork. The ideas that are lying around. If Friedman was right and a crisis makes the unthinkable inevitable, then this time around history may well take a very different turn.

The new ideas have been planted by economists like Piketty and Zucman (example: Zucman and Saez’s “How the Rich Dodge Taxes and How to Make Them Pay”). And then there’s Mariana Mazzucato, who wrote The Entrepreneurial State.

Mazzucato demonstrates that not only do education and healthcare and garbage collection and mail delivery start with the government, but so do real, bankable innovations. Take the iPhone. Every sliver of technology that makes the iPhone a smartphone instead of a stupidphone (internet, GPS, touchscreen, battery, hard drive, voice recognition) was developed by researchers on a government payroll.

And what applies to Apple applies equally to other tech giants. Google? Received a fat government grant to develop a search engine. Tesla? Was scrambling for investors until the US Department of Energy handed over $465m. (Elon Musk has been a grant guzzler from the start, with three of his companies – Tesla, SpaceX, and SolarCity – having received a combined total of almost $5bn in taxpayer money.) ….

But maybe the example that best makes Mazzucato’s case is the pharmaceutical industry. Almost every medical breakthrough starts in publicly funded laboratories. Pharmaceutical giants like Roche and Pfizer mostly just buy up patents and market old medicines under new brands, and then use the profits to pay dividends and buy back shares (great for driving up stock prices). All of which has enabled annual shareholder payments by the 27 biggest pharmaceutical companies to multiply fourfold since 2000.

The article ends with an explanation of the Overton Window–and how it has shifted.

If there was one dogma that defined neoliberalism, it’s that most people are selfish. And it’s from that cynical view of human nature that all the rest followed – the privatisation, the growing inequality, and the erosion of the public sphere.

Now a space has opened up for a different, more realistic view of human nature: that humankind has evolved to cooperate. It’s from that conviction that all the rest can follow – a government based on trust, a tax system rooted in solidarity, and the sustainable investments needed to secure our future. And all this just in time to be prepared for the biggest test of this century, our pandemic in slow motion – climate change.

You really need to read the entire article.

Comments

From Hudnut And Lugar To…Pathetic

In a recent post, I explained my long-ago departure from the Republican Party by sketching the GOP’s transition from a political party into a ragtag collection of culture warriors, con men and moral pygmies.

I’m certainly not the only person who’s noticed: David Brooks–a conservative-but-not-insane columnist for the New York Times–recently bemoaned the fact that Republicans have abandoned principled policy debate in favor of fighting culture wars. And Yuval Levin of the American Enterprise Institute has wondered whether we may see a “policy realignment without a partisan realignment” because Republicans have found so many “cultural ways” to attract votes.

One of the many problems with Republicans’ metamorphosis from political partisans to culture warriors (a nice word for white supremacists) is the quality–or, more accurately, the absence of quality–of the political figures the party elevates.

Here in Indiana, the GOP is no longer the party of able, principled people like Bill Hudnut and Richard Lugar. Besmirching the legacy of Lugar’s long and honorable Senate service, we have Mike Braun enabling the worst of Trumpism, and Todd Young obediently protecting billionaires from taxation and gun crazies from regulation.

Both have voted in lockstep with other Trumpian “ditto heads” in the Senate–against the recent COVID relief package, against confirming Merrick Garland as Attorney General… essentially, against anyone or anything a Democratic President wants.

Neither of them can point to anything positive or important that they’ve accomplished. Their sole “platform” is that they have faithfully enabled the GOP’s descent into Trumpian bigotry and culture war.

Dick Lugar is rolling over in his grave.

And then we have Todd Rokita. I have previously posted about his effort to hold a second, well-paid job while purportedly acting as Indiana’s Attorney General–a full-time job. After the media highlighted that particular scam, Rokita quit the private-sector job–but it turns out that wasn’t the only con he was running.

According to the Indianapolis Star:

Indiana Attorney General Todd Rokita is getting paid $25,000 a year for advising a Connecticut-based pharmaceutical company on top of being compensated by at least one other company for similar work, IndyStar has discovered.

On Wednesday, Rokita filed a financial disclosure form with the Indiana inspector general’s office in which he described his ongoing involvement in 2020 with various companies. He acknowledged being paid by these companies, but his office declined to tell IndyStar how much. 

“We have provided all of the information required to be in compliance with the law,” spokesperson Molly Craft told IndyStar over email.

The financial disclosure comes weeks after Rokita faced scrutiny when it was reported that he was still working for the health care benefits firm Apex Benefits despite taking public office.

The paper reported that Rokita is being compensated by business accelerator Acel360, the Indianapolis-based transportation and logistics company Merchandise Warehouse and a pharmaceutical company, Sonnet BioTherapeutics. It was also able to confirm that he is being compensated by another pharmaceutical company called NanoViricides that he began working with in 2020 (in anticipation of his winning his campaign for AG?). 

Democrats have labeled Rokita a “walking conflict of interest,” and pointed out that–as a “well-known opponent of the Affordable Care Act”–he’s in a position to place not just his political ideology but the interests of his private-sector benefactors ahead of the duties his office demands.

Thanks to the fact that the Indiana General Assembly is part-time, members are allowed to keep their “day jobs.” (That situation has pluses and minuses–members arguably bring deepened understanding of many issues to legislative discussions, but the potential for conflicts of interest is greatly enhanced.) That same permissiveness doesn’t apply to statewide elected positions. Governor, Lieutenant-Governor, Superintendent of Public Instruction and Attorney-General are intended to be full-time jobs.

It would seem obvious to principled people that if they can’t “make it” on the state’s salary, they shouldn’t run for the office. Of course, it would also be obvious to principled people that throwing dishonest red meat to voters terrified of losing cultural hegemony is a dishonorable way to win an election.

But then, these aren’t principled people.

Comments

Bubbles

The current, extreme polarization of the American public obviously cannot be attributed to any one cause. Differences in race, religion, gender, education, culture, experience– all of those things contribute to the way any particular individual sees the world.

But if I were pressed to identify a single culprit–a single source of today’s dysfunction–I would have to point a finger at our fragmented “Wild West” information environment. And research supports that accusation.

Americans are divided – that much is obvious after a contentious presidential election and transition, and in the midst of a politicized pandemic that has prompted a wide range of reactions.

But in addition to the familiar fault line of political partisanship, a look back at Pew Research Center’s American News Pathways project finds there have consistently been dramatic divides between different groups of Americans based on where people get their information about what is going on in the world.

Pew’s Pathways project found–unsurprisingly–that Republicans who looked to former President Donald Trump for their news were more likely to believe false or unproven claims about the pandemic and the election.

And while Americans widely agree that misinformation is a major problem, they do not see eye to eye about what actually constitutes misinformation. In many cases, one person’s truth is another’s fiction.

The Pathways project explored Americans’ news habits and attitudes, and traced how those habits influenced what they believed to be true. The project focused on claims about the coronavirus and the 2020 election; it drew its conclusions from 10 different surveys conducted on Pew’s American Trends Panel, a nationally representative panel of U.S. adults. Each survey included roughly 9,000 or more U.S. adults, so the “n” (as researchers like to call the number of people participating in any particular study) was sufficient to produce very reliable results.
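For readers curious why an “n” of roughly 9,000 is considered reliable, here is a minimal sketch of the standard 95% margin-of-error formula for a survey proportion. This is my illustration of the general statistical idea, not Pew’s actual weighting methodology (which is more sophisticated than a simple random-sample calculation):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) with n = 9,000 respondents:
# the margin of error is only about +/- 1 percentage point.
moe = margin_of_error(9000)
print(f"{moe:.2%}")
```

With a few hundred respondents the margin would be several points; at 9,000 it shrinks to roughly one, which is why differences of the size Pew reports are statistically meaningful.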

Over the course of the year, as part of the project, the Center published more than 50 individual analyses and made data from more than 580 survey questions available to the public in an interactive data tool. We now have the opportunity to look back at the findings over the full course of the year and gather together the key takeaways that emerged.

The report that did emerge can be accessed at the link. It explored key findings in five separate areas: evidence pointing to media “echo chambers” on the left and the right, and the identity and characteristics of the Americans who consistently turned to those echo chambers; Trump’s role as a source of news; Americans’ concerns about and views of what constitutes misinformation; the distinctive characteristics of Americans who rely on social media for their news; and a final chapter tracing changes in these beliefs and attitudes over time.

The entire report is nuanced and substantive, as is most research from Pew, but the “take away” is obvious: Americans today occupy information “bubbles” that allow them to inhabit wildly different realities.

This most recent study builds on what most thoughtful Americans have come to recognize over the past few years, and what prior studies have documented. One study that has received wide dissemination found that watching only Fox News made people less informed than those who watched no news at all. That study found NPR and the Sunday morning television shows to be most informative.

There are fact-checking sites, and media bias sites that rate the reliability of news sources–but these sources are only useful when people access them. Ideologues of the Left and Right, who engage in confirmation bias, rarely do.

The Pew study builds on a number of others, and together they pose a critical question: since the law cannot draw a line between propaganda and truth without eviscerating the First Amendment, how do we overcome the vast informational trust chasm that the Internet has generated?

Comments

Ideology, Meet Evidence

A few days ago, I shared a talk I gave to the Indianapolis Council on Women about the UBI–the theory behind efforts to replace much of America’s dysfunctional safety net with a Universal Basic Income.

There is, as I noted in that discussion, hysterical resistance to such a drastic change. We are, after all, a country that is politically unable to provide even universal access to healthcare. The cost of such a benefit would require us to look critically at America’s multiple wasteful subsidies, and it would require the uber-rich to pay their share of taxes.

Cost is a legitimate concern. Less legitimate–and far more potent–is the belief that poor people are “takers” who would cease productive labor, neglect their kids, and spend their stipends on booze and drugs. I realize that most of the ideologues who subscribe to this theory are impervious to evidence, but evidence contrary to that belief continues to accumulate. I cited the results of previous pilot projects in the talk I referenced, and subsequently, additional evidence has emerged.

After getting $500 per month for two years without rules on how to spend it, 125 people in California paid off debt, got full-time jobs and had “statistically significant improvements” in emotional health, according to a study released Wednesday.

The program was the nation’s highest-profile experiment in decades of universal basic income, an idea that was revived as a major part of Andrew Yang’s 2020 campaign for president.

Cynics had predicted that free money would eliminate the incentive to work, creating a population dependent on the state. The experiment in Stockton, California, that yielded these results was an effort to test that thesis. It was funded by private donations, including a nonprofit led by Facebook co-founder Chris Hughes, who has been a longtime supporter of the UBI.

Run by a nonprofit founded by former Stockton Mayor Michael Tubbs, the program included people who lived in census tracts at or below the city’s median household income of $46,033.

A pair of independent researchers at the University of Tennessee and the University of Pennsylvania reviewed data from the first year of the study, which did not overlap with the pandemic. A second study looking at year two is scheduled to be released next year.

When the program started in February 2019, 28% of the people slated to get the free money had full-time jobs. One year later, 40% of those people had full-time jobs. A control group of people who did not get the money saw a 5 percentage point increase in full-time employment over that same time period, from 32% to 37%.

“These numbers were incredible. I hardly believed them myself,” said Stacia West, a researcher at the University of Tennessee who analyzed the data along with Amy Castro Baker at the University of Pennsylvania.

The money came once a month, and was distributed via a debit card. That allowed the researchers to track how people spent it. The largest expense each month was for food, followed by sales and merchandise, which included purchases at places like Walmart and Target, which also sell groceries. The next highest categories were utilities, automobile (gas and repairs) and services. Less than 1% of the money went to tobacco and alcohol.
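The employment figures quoted above amount to a simple difference-in-differences comparison. As a back-of-the-envelope illustration (my framing and variable names, not the researchers’ actual analysis; the numbers come from the article):

```python
# Full-time employment rates, February 2019 to one year later.
treatment_before, treatment_after = 0.28, 0.40  # stipend recipients
control_before, control_after = 0.32, 0.37      # non-recipients

treatment_gain = treatment_after - treatment_before  # +12 percentage points
control_gain = control_after - control_before        # +5 percentage points

# The difference in gains is the rough "effect" attributable to the stipend.
effect = treatment_gain - control_gain
print(f"{effect:+.0%}")  # prints +7%
```

In other words, recipients’ full-time employment grew about seven percentage points faster than the control group’s–the opposite of the “free money kills the incentive to work” prediction.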

Given America’s political culture, which valorizes individualism and looks askance at any suggestion that social support might increase–rather than disincentivize–individual ambition, the prospects for a UBI are pretty dim. But there are some signs that opposition may be softening.

Still, guaranteed income programs seem to be gaining momentum across the country. More than 40 mayors have joined Mayors for a Guaranteed Income, with many planning projects of their own. A proposal in the California Legislature would offer $1,000 per month for three years to people who age out of the state’s foster care system. And in Congress, Republican U.S. Sen. Mitt Romney of Utah has proposed expanding the child tax credit to send most parents at least $250 per month.

We’ll see how the evidence accumulates…

Comments