American Polarization

I have been attending a conference on American Political History, for which I prepared a paper. The following is my (abbreviated) presentation of that paper–still considerably longer than my daily posts, so be forewarned….

_____________________________________

America’s first motto was e pluribus unum, “out of many, one.” That motto has always been more aspirational than descriptive, but thanks to a number of factors–from residential sorting to the hardening of racial and religious attitudes–America now faces fissures in the body politic that call even the aspiration into question.

Humans are hard-wired to be tribal—to prefer those we see as our “own kind” to members of groups that register as “other.” Recognition of this aspect of human nature is hardly new; multiple studies deal with aspects of human tribalism, and an equally large literature details the various mechanisms through which humans express, reinforce and justify tribal prejudices. History records the frequently horrifying consequences of dehumanizing people deemed to be “other” during wars and other conflicts, and the equally appalling behaviors that stem from the demonizing of targeted minority populations by dominant majorities within a single country.

The term tribalism is shorthand for this human predisposition to divide the world into in-groups and out-groups. There is considerable evidence that some degree of in-group favoritism is an inescapable attribute of group membership. It is the favoritism that is problematic; the human need to be part of a family, clan or tribe is not in itself a negative. Just as our families and more extended clans provide us with emotional and material support, membership in a larger group with which one identifies has its benefits. The presence of what sociologists call bonding social capital provides people within the relevant groups with cultural norms, and (at least within one’s group) supports increased levels of interpersonal trust and reciprocity, assets that facilitate collaborative action.

It’s when identification with a tribe operates to exclude and demean anyone who isn’t a member—when it creates a world peopled by “us” (good) and “them” (bad)—that it becomes destructive. It becomes especially dangerous when the definition of “us” is narrow, dependent upon immutable characteristics or upon rigid adherence to a particular ideology or religious belief that excludes and distrusts others. When negative stereotypes of an out-group are endorsed by celebrities or political authority figures, the damage can be substantial; for example, researchers have linked Donald Trump’s anti-Muslim tweets to spikes in anti-Muslim hate crimes.

Identity
Although the terms “identity” or “identity politics” can mean different things depending on the context, for purposes of this analysis, the terms reference an individual’s group affiliation, or social identity. As noted, identification with others in a particular group or category can confer feelings of acceptance and provide role models: this is how “we” behave. When individuals conclude that “I am like the other people in this group” and “I am unlike those people in other groups,” that recognition can lead to a sense of belonging and a recognition of interdependence with others in the relevant “tribe.” That said, membership in a tribe is usually accompanied by some degree of suspicion of those who fall outside that tribe. Trouble starts when that suspicion is heightened, and members of other groups are seen as competitors, enemies, or threats that must be subdued or eliminated. When “we” are God’s chosen, and “they” are by definition abominations, tolerance of difference is simply not possible.

The United States has not been immune from tribal conflicts, and today’s citizens continue to struggle with their legacies. America may have abolished slavery, but racism has proved much harder to eradicate. Religious conflicts and anti-Semitism have been—and remain—a constant. Women continue to struggle against an attitudinal “glass ceiling” that works against genuine equality in both the home and workplace. It wasn’t until the 1960s that LGBTQ citizens began emerging from the closet in significant numbers, and homophobia, like racism, continues to characterize much of American culture. And the country is experiencing yet another eruption of the hostility with which we have repeatedly greeted successive waves of immigrants.

In much of America’s admittedly contentious past, except for individuals who were automatically categorized as “other” by virtue of an immutable characteristic like race or gender, American affiliations have tended to be cross-cutting, meaning that people often identified as a member of several different communities having limited overlap. Individuals with such heterogeneous affiliations are likely to interact on a regular basis with fellow citizens holding views contrary to their own, and less likely to stereotype and malign people with whom they disagree as a result. In his seminal study The Social Requisites of Democracy, Seymour Martin Lipset concluded that democratic stability is enhanced when individuals and groups have a number of cross-cutting, politically relevant affiliations. As a 2018 article in The Guardian noted, “[R]esearch has linked cross-cutting cleavages with toleration, moderation and conflict prevention.”

For a number of reasons, America’s “tribes” have become far more overlapping, meaning that people’s various identities have coalesced in ways that reinforce each other. As a result, and thanks also to the residential “sorting” documented by Bill Bishop in The Big Sort, most Americans have much less interaction with people who have opinions different from those of their tribes, and are less likely to engage with ideas and beliefs different from their own.

Historically, American tribal conflicts have centered upon identities and affiliations that were difficult or impossible to change: the ethnic, racial and religious differences that have been a source of human conflict for centuries. Those differences remain potent today. Racism, in particular, has re-emerged with a vengeance, and it isn’t limited to the White Nationalist movements that have become active across much of Europe and the United States. Longstanding racial and religious fault-lines have been deepened by the emergence of newer ideological and cultural cleavages, many of which are exacerbated by geography: in today’s U.S., for example, the worldviews of urban and rural inhabitants are frequently incommensurate. Research has documented deep differences in values and outlook between Americans who are well-to-do (or at least economically comfortable) and the poor, and between white people with a college education and white people without. Americans’ affiliations have become increasingly reinforcing rather than cross-cutting, enabling the growth of a toxic partisanship that sees the world in stark terms of black and white and right versus wrong. These world-views demand winners and losers.

Thanks to a variety of factors, significant numbers of Americans currently occupy “bubbles” populated largely by people who share and fortify their preferred worldviews. Even a cursory examination of the 21st Century media and policy environment allows identification of several of those worldviews, as well as the environments that created and nurture them. A caveat: the following list is not exhaustive, and due to time constraints I can only sketch each category here; the categories are described in far more depth in the paper.

  • Cosmopolitan and Parochial. Cosmopolitanism challenges the primacy citizens place on attachments to the nation-state and other parochial shared cultures. The cosmopolitan/parochial divide shares many attributes with classism.
  • Richer and Poorer. The economic divide between America’s rich and poor is now as damaging as it was during the Gilded Age. This dangerous and growing gap means that struggling Americans and the well-to-do have increasingly disparate life experiences and live increasingly segregated lives.
  • College Educated and Not. In the 2016 election, white voters divided sharply based upon their levels of education. Clinton carried counties with high numbers of educated voters, while even high-income, low-education counties voted for Trump.
  • Urban versus Rural. Urban Americans are more than three times more likely than their rural counterparts to say that religion isn’t particularly important to them, and attitudes on social issues reflect that difference. Rural Americans are also far more likely to be Republican.
  • Republican versus Democrat, Liberal versus Conservative. An individual’s self-identification as Republican or Democrat has come to signify a wide range of attitudes and beliefs not necessarily limited to support for a political party. Lilliana Mason notes that “A single vote can now indicate a person’s partisan preferences as well as his or her religion, race, ethnicity, gender, neighborhood and favorite grocery store.” Partisan identity has become a shorthand encompassing racial, professional and religious identities. Party identification now outweighs ideological commitments, as can be seen by the acquiescence of Republican lawmakers to Trump’s tariffs that are wildly at odds with longtime Republican positions.
  • Black, White, Brown. It is impossible to talk about tribalism, of course, without addressing the stubborn persistence of racism. Age-old racial hatreds have been fed by economic anxieties and by demographic changes that threaten white Christian Americans with loss of their long-time social dominance and privilege. The election of America’s first African-American President exacerbated long-simmering racial resentments, giving rise to the so-called “birther” movement, while Donald Trump’s overt appeals to racist sentiments have unleashed a sharp increase in racist, anti-Semitic, anti-Muslim and anti-immigrant assaults.

America’s cultural and political polarization has been facilitated by the Internet and the reduced reach of so-called “legacy” media that previously provided the country with (relatively homogenized) information. The current media landscape allows Americans to consult a multitude of news and opinion sites of widely varying credibility and to choose the “news” that accords with their partisan preferences. Social media has encouraged the sharing of dubious assertions and unfounded accusations. One result has been a widespread loss of confidence in our ability to know what is factual and what is not—to distinguish between journalism and propaganda. The widespread availability of disinformation is especially troubling because the American public has abysmally low levels of civic literacy.

Economic insecurity, the threatened loss of jobs to global trade and especially automation, and the rapidity of social and technological change have contributed to widespread fear and uncertainty. Too many political figures have appealed to those fears rather than trying to ameliorate them. There is also the increasing complexity of the national and international issues we face, and the failure to reform antiquated government structures that are increasingly inadequate to meet the challenges posed by changes in where and how today’s Americans live.

All of these developments and many others have been consequential. That said, it is impossible to analyze the ways in which these changes have been experienced and various tribes have been formed without recognizing the degree to which America’s historic struggle with racism has exacerbated the salience of all of them.

Whatever our beliefs about “American exceptionalism” today, it behooves us to recognize that the founding of this country was genuinely exceptional—defined as dramatically different from what had gone before—in one incredibly important respect: for the first time, citizenship was made dependent upon behavior rather than identity. In the Old World, the rights of individuals were largely dependent upon their identities, the status of their particular “tribes” in the relevant political order. (Jews, for example, rarely enjoyed the same rights as Christians, even in countries that refrained from oppressing them.) Your rights vis-à-vis your government depended largely upon who you were—your religion, your race, your social class, your status as conqueror or conquered.

The new United States took a different approach to citizenship. Whatever the social realities, whatever the disabilities imposed by the laws of the various states, any white male born or naturalized here was equally a citizen. We look back now at the exclusion of blacks and women and our treatment of Native Americans as shameful departures from that approach, and they were, but we sometimes fail to appreciate how novel the approach itself was at that time in history. All of what we think of as core American values—individual rights, civic equality, due process of law—flow from the principle that government must not treat people differently based solely upon their identity. Eventually (and for many people, very reluctantly) America extended that founding principle to gender, skin color and sexual orientation. Racism is thus a rejection of a civic equality that is integral to genuinely American identity.

When the nation’s leaders have understood the foundations of American citizenship, when they have reminded us that what makes us Americans is allegiance to core American values—not the color of our skin, not the prayers we say, not who we love—we emerge stronger from these periods of unrest. The political divisions that are so stark in our polarized time represent, at least in part, a clash between those who fear we are departing from that essential (if imperfectly recognized) commitment to equality and those who want to “return” to an imagined White Christian America.

Religious Liberty?

As America becomes more diverse, and White Christians face the loss of cultural hegemony, they have increasingly turned to the First Amendment’s Free Exercise Clause to make arguments about their right to an expansive and ahistorical “religious liberty.”

A bit of history for this history conference: What the phrase “Religious liberty” meant to the Pilgrims who landed at Plymouth Rock was the “liberty” to impose the correct religion on their neighbors. The idea that Church and State could even be separated would have been incomprehensible to the Puritans; the liberty they wanted was freedom to “establish” the True Religion, and to live under a government that would impose that religion on their neighbors.

The Puritans defined liberty as “freedom to do the right thing,” to impose the correct religion.

A hundred and fifty years later, however, the men who crafted our Constitution had a very different understanding of liberty. The philosophical movement we call the Enlightenment had given birth to science and empiricism, privileged reason over superstition, and caused philosophers to reconsider the purpose and proper role of government.

Liberty had come to mean the individual’s right to self-government, the right to decide for oneself what beliefs to embrace. Liberty now meant the right of individuals to live their lives in accordance with their own consciences, free of both state coercion and what the founders called “the passions of the majority,” so long as they did not harm others; the Bill of Rights limited what government could require even when a majority of citizens approved.

The problem is that, although America’s Constitution and legal framework were products of the Enlightenment, many American citizens remain philosophical Puritans.

Many of the fundamentalist Christians fearing loss of cultural hegemony are deeply Puritan: anti-science, anti-reason, anti-diversity. They are absolutely convinced of their own possession of the Truth, and like the original Puritans, absolutely convinced that a proper understanding of “religious liberty” should give them the right to make rules for everyone else.

Under the Constitution, Americans have the right to believe anything they want. They do not have an absolute right to act on those beliefs. (You can sincerely believe God wants you to sacrifice your first-born, but the law doesn’t let you do that.) Many people have trouble understanding that distinction.

Opponents of civil rights for LGBTQ citizens argue that rules preventing businesses from refusing to hire employees who offend their religious beliefs, or from firing or otherwise discriminating against such individuals, deny them religious liberty. (This is a variant of the argument that anti-bullying legislation infringes the “free speech rights” of those doing the bullying.) They argue that they should be able to discriminate against gay people—or black people, or women, or Muslims–if they claim a religious motivation. Of course, an exemption for discrimination based upon “religious motivation” would eviscerate civil rights laws.

This is the same argument that erupted when Congress enacted the 1964 Civil Rights Act. Opponents argued that being forced to hire or do business with women or people of color violated their liberty to act upon a “sincere religious belief” that God wanted women to be subordinate and the races to be separate. And it did limit their liberty. In a civilized society, the right to do whatever one wants is constrained in all sorts of ways: I don’t have the liberty to play loud music next to your house at 2:00 a.m., or drive my car 100 miles per hour down city streets. And so on.

Civil rights laws are an outgrowth of the social contract. The citizen who opens a bakery–or a shoe store or a bank or any other business–expects local police and fire departments to protect her store, expects local government to maintain the streets and sidewalks that enable people to get there, expects state and federal agencies to protect the country, to issue and back the currency used to pay for her products, and to ensure that other businesses and institutions are playing by the rules and not engaging in predatory behaviors that would put her out of business. People of all races, religions, genders and sexualities pay the taxes that support those government responsibilities, and in return, have a right to expect those who are “open for business” to provide cakes or shoes or other goods to any member of the public willing and able to pay for them.

The religion clauses of the First Amendment give religious folks the right to exclude those they consider “sinners” from their churches, their private clubs and their living rooms. That right does not extend to their hardware stores.

Today’s Americans live with over 330 million others, many of whom have political opinions, backgrounds, holy books, and perspectives that differ significantly from their own. The only way such a society can work–the only “social contract” that allows diverse Americans to coexist in reasonable harmony–is within a legal system and culture that respects those differences to the greatest extent possible. That means laws that require treating everyone equally within the public/civic sphere, while respecting the right of individuals to embrace different values and pursue different ends in their private lives. Only a legal system that refuses to take sides in America’s ongoing religious wars is able to safeguard anyone’s religious liberty.

History teaches us that social change that threatens the privileged status of dominant groups will be ferociously opposed by those groups. Throughout American history, when previously subordinated populations have demanded a seat at the civic table, those whose hegemony was threatened have resisted. That resistance may not completely explain today’s polarization, but it has massively contributed to it.

As Mark Twain is said to have observed, history may not repeat itself, but it does rhyme.

We live in rhyme time.

Comments

Pride In Indiana

Today is Pride Day in Indianapolis. The parade–which I always attend–will have well over 100 entrants, representing a wide variety of government agencies, educational institutions, churches and area businesses–a far cry from the few forlorn entries in the first such effort 25 years ago.

Among other things, Pride now celebrates the legal and social progress of the LGBTQ community, which has made great strides nationally over the last couple of decades. In Indiana, it will not surprise you to discover that such progress has been considerably more spotty; cities and towns have passed inclusive Human Rights ordinances, but the state as a whole is an embarrassment on this issue (as well as on so many others).

The very different politics of cities and rural areas with respect to LGBTQ rights has recently been highlighted by the effort of Jim Merritt–a longtime legislator now running for Mayor of Indianapolis–to “cozy up” to the gay community, and to distance himself from his “perfect” anti-gay record in Indiana’s Statehouse. Our legislature has been gerrymandered to create districts dominated by rural voters, and Merritt has pandered accordingly.

He is not alone. Indiana’s legislature has stubbornly refused to pass an inclusive bias crime bill. Efforts to add four little words–sexual orientation and gender identity– to the list of protected categories in the state’s civil rights law have gone nowhere.

Two years ago, on this blog, I posted some revelatory statistics about the legal disabilities of LGBTQ Hoosiers. The laws that facilitated those statistics haven’t changed. Here’s a smattering of what I wrote then:

Approximately 133,000 LGBT workers in Indiana are not explicitly protected from discrimination under state law….  If sexual orientation and gender identity were added to existing statewide non-discrimination laws, 61 additional complaints of discrimination would be filed with the Indiana Civil Rights Commission each year. Adding these characteristics to existing law would not be costly or burdensome for the state to enforce.

Recent polling discloses that 73% of Indiana residents support the inclusion of sexual orientation as a protected class under Indiana’s existing civil rights law. That’s 73% in Very Red Indiana.

Major employers in the state have worked with civil rights and civil liberties organizations in an effort to add “four little words” to the list of categories protected under the state’s civil rights statute:  sexual orientation and gender identity. So far, the legislature has exhibited zero interest in doing so.

The public outrage over Pence’s RFRA (Indiana’s Religious Freedom Restoration Act) led to a subsequent “clarification” (cough cough) that the measure would not override provisions of local Human Rights Ordinances that do proscribe discrimination on the basis of sexual orientation. A number of city councils around the state promptly added those protections to their Ordinances, which was gratifying.

The problem, as the research points out, is twofold: municipal ordinances in Indiana don’t have much in the way of “teeth.” They are more symbolic than legally effective. Worse, for LGBTQ folks who don’t live in one of those municipalities, there are no protections at all.

The result: Only 36% of Indiana’s workforce is covered by local non-discrimination laws or executive orders that prohibit discrimination based on sexual orientation and gender identity. And that discrimination occurs with depressing regularity.

– In response to the National Transgender Discrimination Survey, 75 percent of respondents from Indiana reported experiencing harassment or mistreatment at work, 30 percent reported losing a job, 21 percent reported being denied a promotion, and 48 percent reported not being hired because of their gender identity or expression at some point in their lives.

– Several recent instances of employment discrimination against LGBT people in Indiana have been documented in court cases and administrative complaints, including reports from public and private sector workers.

– Census data show that in Indiana, the median income of men in same-sex couples is 34 percent lower than that of men married to different-sex partners.

– Aggregated data from two large public opinion polls found that 79 percent of Indiana residents think that LGBT people experience a moderate amount to a lot of discrimination in the state.

Four little words. Why is that so hard?

Today, at the parade and the event itself, the community and its allies will celebrate the progress that has been made.

Monday morning,  opponents of bigotry need to go back to work.

Comments

Private Prisons And Perverse Incentives

Every once in a while, my city gives me something to brag about. Most recently, that’s the current administration’s approach to Criminal Justice.

A recent article from Fortune Magazine, of all places, sets it out.

When the city heads to Wall Street Thursday to borrow $610 million to build a jail and criminal justice complex on the site of an old coking factory, it’s betting it can better house criminals and rehabilitate them on its own. That means CoreCivic, which has run a Marion County jail for two decades, will lose the contract when the new one opens.

The decision to sever ties with CoreCivic is part of a shift in policy-making that seeks to address a cycle of recidivism that keeps sending repeat offenders back to jail. It joins other governments nationwide, including California, that are reconsidering a reliance on the private companies that stepped in as the war on drugs and mandatory minimum sentencing laws caused inmate populations to soar, leaving more than half of the states paying businesses to incarcerate their residents.

There is a mountain of data detailing what’s wrong with private prisons. (When my graduate students choose to write their research papers on the subject of for-profit prisons, their conclusions range from highly critical to horrified, and for good reason.) Zach Adamson, Vice-President of the Indianapolis City-County Council is quoted in the article with what may be the best summary of the problem with prisons for profit:

“The idea that there would be profit to be made through the imprisonment of our neighbors is something that’s abhorrent to a number of people—many of our constituents cannot process that,” said Zach Adamson, vice president of the council that oversees the consolidated government of Indianapolis and Marion County. “Criminal justice is not getting better as long as our primary concern is looking to cut corners and save costs.” (emphasis supplied)

In 2016, the city convened a task force to consider ways Indianapolis could cut crime and address jail overcrowding. The task force recommended addressing “underlying causes,” in an effort to reduce both crime and the $440 million Indianapolis spends on criminal justice each year–far and away the city’s biggest expense.

The issues facing Indianapolis are hardly unique: some 40 percent of people detained in the country’s jails are mentally ill, and up to 85 percent suffer from substance abuse. (With respect to those who are mentally ill, psychiatrists tell us that substance abuse is an effort at “self-medicating.”)

The complex will consolidate the courthouse, its jails, and rehabilitation operations in one modern site. The city-county council voted in April 2018 not to privatize the new lockup, dealing a blow to CoreCivic, which has managed a facility there since 1997.

“The goal of the jail system shouldn’t be to fill the beds,” said Andy Mallon, corporation counsel for the government. “We’re trying to reduce crime and reduce the number of people who are involved in crime.”

Mallon’s observation is at the heart of what’s wrong with privatizing these elements of the criminal justice system. Private prison companies are in business to fill beds, and to do so as cheaply as possible, not to rehabilitate offenders. Their lobbyists work to criminalize additional behaviors and increase prison terms for offenses already on the books–measures that feed their bottom lines.

Their goal isn’t public safety, it’s profit, and the big private prison companies donate generously to politicians in order to protect those profits.

During the Obama Administration, the Department of Justice and several state governments responded to the research, recognized the existence of the perverse incentives, and began terminating contracts with companies like GEO and CoreCivic. Then, of course, we got Trump, and headlines like these: “Trump’s First Year Has Been the Private Prison Industry’s Best” and “Trump’s Immigrant-Detention Plans Benefit Private Prison Operators.”

In Indianapolis, I am happy to say, the city has chosen to bring best practices to bear on its criminal justice problems, to evaluate those it incarcerates in order to determine appropriate interventions– and to stop paying for-profit companies to warehouse offenders.

Comments

Credit Where Credit Is Due

One of the unfortunate effects of our corrupt and paralyzed political structure is the “drowning out” effect, sometimes described as Washington “sucking the oxygen out of the room.” While our attention is fixated on the more dramatic consequences of our national government’s “brokenness,” we fail to notice the harms being done by the multitude of problems that government is simply not fixing.

One of those is the way creditworthiness is measured.

There’s no doubt that credit card companies charge excessive rates of interest. But as scholars at the Brookings Institution point out, simply legislating a cap would actually compound the problem.

When does the interest rate a lender charges cross the line from economically justified to immoral? Societies have struggled with this question since biblical times. Last week, Sen. Bernie Sanders (I-Vt.) and Rep. Alexandria Ocasio-Cortez (D-N.Y.) took a crack at this puzzle, proposing to cap credit card interest rates at 15 percent. They’re concerned that the U.S. credit system traps working families with unsustainable debt. We share their concern, but their proposal uses a blunt instrument to attack a nuanced problem.

The Loan Shark Prevention Act, as the new legislation is called, is likely to hurt the people it’s designed to help, driving the market away from consumers with low credit scores. Some people may have their interest rates reduced, but many would no longer have access to credit at any price. Banks have been clever in figuring out how to hide credit in fees, as anyone who has paid $35 for an overdraft knows.

Instead, the authors propose making affordable credit accessible to a much larger group, by fixing what they identify as “the flawed scoring system that allocates credit.”

Our current system decides who gets credit and at what price using algorithms that analyze a person’s credit history and calculate a credit score. FICO, the most common credit score, employs a range between 300 and 850. There is no universally accepted definition of what constitutes a prime or subprime credit score but, generally, people with scores above about 680 are rewarded with cheap credit and high borrowing limits. Those classified as either near-prime or subprime, whose scores largely fall below 680, have a tougher time accessing and paying for credit.

The apparent objectivity of the algorithm masks a whole host of issues. A peek behind the credit-scoring curtain reveals that, as in “The Wizard of Oz,” there are humans feeding imperfect information into the machine. You could be the most creditworthy person on the planet, but if you lack a credit history, are a young adult or a recent immigrant, or had financial hardship in the past five years, your score will be low. Credit reports are rife with errors: One out of 5 Americans has a material error on their score.
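The scoring bands the Brookings authors describe can be made concrete with a minimal sketch. A caveat: the 680 cutoff below is the article’s rough dividing line, not an official FICO boundary, and real scoring involves far more than a single threshold.

```python
def credit_tier(score: int) -> str:
    """Classify a FICO-style score into the rough bands described above.

    The 300-850 range is the standard FICO scale; the 680 cutoff is the
    approximate prime/subprime dividing line cited in the article, not an
    official boundary.
    """
    if not 300 <= score <= 850:
        raise ValueError("FICO scores range from 300 to 850")
    return "prime" if score > 680 else "near-prime/subprime"

# A 720 score lands in the cheap-credit band; a 640 score does not.
print(credit_tier(720))  # prime
print(credit_tier(640))  # near-prime/subprime
```

The point of the sketch is how little information the threshold encodes: someone with no history at all, like the young borrower described below, lands in the bottom band regardless of actual creditworthiness.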

I recently encountered this precise circumstance with my granddaughter-in-law: she is young and had virtually no credit history. It wasn’t bad credit, it was no credit, because she had been prudent and avoided debt. No credit became a real problem when she and my grandson applied for a mortgage. (Even more maddening, one of the three reporting agencies kept telling the bank her credit was “frozen”–whatever that means–but continued to insist to her, during her multiple calls to correct the issue, that it wasn’t.)

The Brookings scholars write that “Congress should start examining this system and aggressively pushing for its improvement.”

Lawmakers should push for credit-scoring formulas that take a wider range of data into consideration. Paying a mortgage on time improves your credit score, but paying your rent on time does not, because mortgages are tracked and rents generally are not. That’s just not fair…

The Consumer Financial Protection Bureau estimates that 45 million Americans lack the data that credit bureaus use to create a credit score. If you don’t have a score, it can be very hard to get a loan, rent an apartment or persuade an employer to hire you. Credit scores have become an essential component of what Princeton sociologist Frederick Wherry calls “financial citizenship” — the ingredients necessary to participate fully in the economy and civil society.

If we had a functioning Congress, this is one of the multiple tasks to which they should attend. But of course, we don’t. Right now, Mitch McConnell (aka the most evil man in America) is preventing the Senate from even considering one hundred bills that have been passed by the House.

We have a legislature that is incapable of doing anything, and an Administration trying its best to undo what was accomplished in the past. We aren’t even a banana republic: we’re a failed state.


Lessons From A Small French Town

It’s a truism among urbanists that small towns in the United States are dying.

Here in Indiana, the data confirms the bleak prognosis: Main Streets are filled with boarded-up stores, wig shops and formerly vibrant emporiums that have been turned into sad “museums.” Young people move away as soon as they can, leaving a graying and resentful population behind.

The phenomenon is not restricted to the United States; everywhere, cities are booming while small towns are on the same sad trajectory. So a recent report in the Guardian was eye-opening.

On a lane in what was once considered eastern France’s grimmest town, a street artist is up a ladder finishing a mural, the independent bookshop has a queue at the till, the organic cooperative is full of customers and Séverine Liebold’s arty independent tea shop is doing a brisk trade….

Just over a decade ago, Mulhouse, a town of 110,000 people near the German and Swiss borders, was a symbol of the death of the European high street. One of the poorest towns of its size in France, this former hub of the textile industry had long ago been clobbered by factory closures and industrial decline. It had high rates of poverty and youth unemployment, a shrinking population, and more than 100 shops empty or boarded up. The centre had become associated with gangs….

Today, Mulhouse is known for the staggering transformation of its thriving centre, bucking the national trend for high street closures.

In the past eight years, more than 470 shops and businesses have opened here. Mulhouse is unique in that 75% of new openings are independents, from comic book stores to microbreweries and organic grocers. It is one of the only places in France with as many independents as franchises. And it is one of very few places in France where more shops are opening than closing.

Mulhouse was only one of a large number of dying small towns in France.

French political powers woke up late to the problem of dying town centres. Outside the Paris region, an average of 11% of high street premises lie empty, similar to the UK. But France, which has a powerful hypermarket industry and lobby, has for decades hastened town centre decline by allowing out-of-town superstores to mushroom over kilometres of dull grey hangars on the outskirts of towns.

Leaders only recently turned to the issue, fearing boarded up shopfronts and vanishing services could help usher in Donald Trump-style populists. Polls showed that in small French towns, the fewer the services on offer – notably post offices – the higher the vote for the far right.

What caused this town’s turnaround? How did Mulhouse buck the trend?

The simple answer is public investment in public amenities.

Mulhouse set out to rebalance the housing mix. Generous subsidies for the renovation of building fronts expedited a facelift of more than 170 buildings. Security and community policing were stepped up. Transport was key – with a new tram system, bike schemes, shuttle buses and cheap parking.

But making the town’s public spaces attractive was just as important, with wider pavements, dozens of benches, and what officials deemed a “colossal budget” for tree planting and maintenance, gardening and green space. Local associations, community groups and residents’ committees were crucial to the efforts.

The idea was to create a town center where people could feel good, where they could congregate. The town re-appropriated its center as a kind of agora, a place where everyone could meet. Olivier Razemon, the author of a recent study called How France Killed Its Towns, says town centers should be seen as a theatrical backdrop to life’s encounters, with the understanding that: “People don’t go to the town centre just for shops, but because it’s pleasant, because they want to meet up.”

There were several other aspects to Mulhouse’s revitalization. An important element was emphasis upon independent, “home grown” enterprises offering wares not available in the big box stores.

The major driver of the town’s resurgence, however, was its substantial public investment in public amenities: public transportation, restoration of the built environment, generous plantings and landscaping that made the town’s public spaces attractive–all of the elements of what urbanists call “quality of life.”

In so many small towns, unwillingness to spend tax dollars on these “quality of life” elements creates a vicious cycle of disinvestment and abandonment. Mulhouse chose to invest heavily in them instead, creating a virtuous cycle. It worked.

There’s a lesson there.
