There is substantial agreement across constituencies that America’s political system is broken. That’s where the agreement ends; who you blame for the dysfunction will depend upon your political orientation and the source and degree of your disgust.
Wherever your finger is pointing, it is becoming increasingly clear that the toxic nature of our politics is threatening to turn off the young people who will be needed if we are ever to improve the system.
A couple of professors recently shared some troubling data in the Washington Post: high school students they had surveyed would rather do anything other than become involved in politics. They don’t want to run for office. They don’t want to be Mayors or Councilors or Governors. Serving as a Member of Congress was their dead-last occupational choice.
As the professors wrote, the fact that they do not want to run for office
“cannot be divorced from their perceptions of the political system. Eighty-five percent of our survey respondents did not think that elected officials want to help people; 79 percent did not consider politicians smart or hard-working; nearly 60 percent believed politicians are dishonest; and fewer than 30 percent said they thought candidates and elected leaders stand up for their convictions.”
There are more than 500,000 elected positions in the U.S. Given the attitudes of these young people, we can only imagine who will run to fill those positions.
As the writers concluded, it’s easy to blame young people for a lack of civic engagement, but this time, “the fault lies with a political system and political figures whose behavior has turned off an entire generation.”
The recent Duck Dynasty brouhaha has demonstrated once again that many Americans are totally clueless about what freedom of speech and religion are and aren’t. But apparently, a whole lot of Americans are equally ignorant of what’s really in that Bible people keep thumping. According to a recent article by one C.J. Werleman,
More than 95 percent of U.S. households own at least one copy of the Bible. So how much do Americans know of the book that one-third of the country believes to be literally true? Apparently, very little, according to data from the Barna Research group. Surveys show that 60 percent can’t name more than five of the Ten Commandments; 12 percent of adults think Joan of Arc was Noah’s wife; and nearly 50 percent of high school seniors think Sodom and Gomorrah were a married couple. A Gallup poll shows 50 percent of Americans can’t name the first book of the Bible, while roughly 82 percent believe “God helps those who help themselves” is a biblical verse.
The article notes that when Congressional Republicans quoted 2 Thessalonians (“Anyone unwilling to work should not eat”) to justify cutting food stamps, 90% of Christian respondents to a poll attributed the quote to Jesus. It was actually from a letter purportedly written by Paul to members of his church in Thessalonica, reminding them that if they didn’t help build the church, they wouldn’t be paid. (As it turns out, the letter wasn’t even from Paul; biblical scholars have dismissed it as a forgery.)
The actual texts of both the Bible and the Constitution can be inconvenient, even for determined cherry-pickers. Add in historical and cultural context and…well, I guess it’s just too much work to actually read the damn things.
Yesterday, I read two unrelated items that brought the Talmudic injunction to pursue justice rather forcefully to mind. The first was a line in an excellent review in The Atlantic of two books about Sholem Aleichem, sometimes called the “Jewish Mark Twain.” (Aleichem was the creator of Tevye, the inspiration for the central character in Fiddler on the Roof.) The sentence that struck me was this: “Jewish humor arises in the gap between reality and dreams, reality and justice.”
The other item was a story from the Huffington Post. The headline says it all: For the First Time Ever, a Prosecutor Will Go to Jail for Wrongfully Convicting an Innocent Man.
The prosecutor in the case, Ken Anderson, possessed evidence that would have cleared the defendant, including a statement from the crime’s only eyewitness that the defendant wasn’t the perpetrator. Anderson sat on that evidence and obtained a conviction of the accused, who remained in prison for the next 25 years. Meanwhile, Anderson’s career flourished, and he eventually became a judge.
As unjust as this situation was, as shocking to the conscience, what makes it newsworthy is the fact that Anderson actually was punished. As the story notes–and as most lawyers can attest–this is not an isolated case of malfeasance. Although most prosecutors and judges are ethical practitioners who take their obligations to the rule of law seriously, there are far too many who do not, and they are rarely, if ever, sanctioned. A recent study found prosecutorial misconduct in nearly a quarter of all capital cases in Arizona. Only two of those prosecutors have been reprimanded or punished in any way.
Evidence of gross misconduct leading to injustice isn’t limited to the legal system. It is increasingly impossible to ignore the corruption of our social and governing institutions. Over the past couple of decades, we’ve seen it everywhere–from rampant corporate misbehavior to major league sports doping to revelations of priestly child molestation to corrupt lobbyists to propagandists masquerading as journalists to Congressmen who cut food stamps for hungry children while fiercely protecting tax loopholes and corporate welfare for their patrons.
In this dismal ethical environment, justice isn’t the first word that comes to mind.
An unjust, unfair world invites–demands–political satire, and satire, at least, is thriving. You need only watch Jon Stewart for an example of Jewish humor that “arises in the gap between justice and reality.” It’s a big gap. The Sholem Aleichem reviewer suggests that the purpose of Jewish humor is to “give yourself some distance from your hopeless situation.” If that’s accurate, most humor these days is Jewish humor.
Gallows humor.
I enjoy a good laugh, but I’d prefer a more just world.
I’ve often argued that universal healthcare–Medicare for All–would spark an outpouring of entrepreneurship. If you want to open a shop, or go into the widget-making business, one significant barrier to doing so is the need to offer (very expensive) health insurance to your employees. Of course, you could decide not to provide that benefit, but you wouldn’t be very competitive in the market for good workers.
I understand, dimly, the historical reasons why the U.S. linked employment to health care, but it has always seemed to be a bad idea. What about people who don’t/can’t work? What about independent contractors? Why should an employer have to assume the costs–and risks–of employees’ health? Other countries do not couple jobs and insurance in this way–health insurance is provided as part of the social safety net, and the costs are spread much more widely.
Yesterday, in a Facebook post, a friend of mine explained why medical insurance provided through government–decoupled from employment–would boost the economy and make American businesses more competitive.
As he noted in his post, when you buy a product, all the costs of creating that product are reflected in the price: production, workers’ wages and benefits, materials. Most of the nations with whom we trade big-ticket items have had government-sponsored health care for decades, and at far lower cost. As a result, Saab and Mercedes, among others, are able to compete unfairly with American-made autos whose prices include a hefty private-sector health care premium. (I’ve seen numbers suggesting that this was one of the reasons GM and Chrysler went bankrupt; healthcare coverage for current and retired employees added over $2,000 to the average price of their cars.)
If we really cared about keeping U.S. businesses competitive–and the health insurance system comprehensible–we’d have Medicare for All, or at least for anyone who wanted it. Given our political environment, and the lobbying clout of Big Insurance and Big Pharma, that has never been in the cards.
Obamacare was (barely) politically feasible because it was originally the Republican alternative. With all its warts, it’s a step in the right direction, but if we want America to remain competitive, we will eventually need to separate access to health insurance from the vagaries of employment.
America’s love affair with the suburbs began after World War II, and has thrived for most of my adult life. I’ve never really understood the desire for a tract house on a quarter-acre lot, or a McMansion set even farther away from the nearest neighbor. I’m a congenitally urban person–and I’ve always been a bit envious of European cities, where those who can afford it live in the center of well-maintained and loved cities, and those who are less fortunate are relegated to the suburbs.
Only in America does such an enormous percentage of the middle-class live in such low densities on so much land.
I have always chalked up this predilection for grass to the “to each his own” category, and assumed my own urban preferences would always mark me as a minority. But if two books I read last week are to be believed, we may be seeing a welcome shift–an increased appreciation for the many charms and conveniences of city living.
In The End of the Suburbs: Where the American Dream is Moving, the authors point to several significant signs that times are changing. The most recent census data suggests that–after 50 years of pretty constant growth–the suburbs have stalled. Recently, cities and high-density suburbs have grown twice as fast as their low-density counterparts, with the largest cities growing faster than their suburbs for the first time in a hundred years.
Meanwhile, home values have inverted. During the Great Recession, housing values held up much better in urban centers than in suburban ones. And construction activity has reversed, with higher percentages of building permits being issued for “walkable, urbanized” locations and for multi-family developments than for traditional suburbia.
Household size has been steadily shrinking. People marry later, or not at all, and women wait longer to have fewer children. The suburbs were built for families with children, but Ozzie and Harriet have moved to an assisted living facility, and their grandchildren, according to the data, “hate the burbs.” Seventy-seven percent of Millennials express a preference for urban living. They also don’t care about driving: in 1980, 66% of all seventeen-year-olds had a driver’s license. In 2010, the figure was 47%. According to the data, they don’t want cars and they don’t want cul-de-sacs. Meanwhile, the price of oil continues to rise, and concerns about the environment have sparked an “anti-stuff” revolution.
Finally, the authors note that suburbs were poorly designed. They spread people far from each other, from their routine destinations, and from their jobs, making residents totally dependent on cars that get more expensive to operate every year. The suburbs’ low density complicates efficient provision of services, and fails to generate enough tax revenue to pay for the infrastructure needed to support them.
This book, together with The Metropolitan Revolution: How Cities and Metros are Fixing Our Broken Politics and Fragile Economy, provided plenty of thought-provoking data, and I’ll continue to share some of it in subsequent posts. Perhaps the most compelling finding, highlighted in both books, was the importance of public transportation in attracting new residents, jobs, and young people–and enabling economic development.
Both books shared lots of success stories. The common threads running through those successes included visionary leadership, collaborations between governments, nonprofits, universities and the business community, and good public transportation.
It won’t surprise you to find that Indianapolis wasn’t mentioned.