Our Current Mess—a recent rant

[The other night, I spoke to the Washington Township Democratic Club, and thought I’d post those remarks here.]                                               

 

When I labeled this talk “The Current Mess,” it was because I hadn’t decided what to talk about, and I figured “mess” covered pretty much anything I might choose—locally, I might be talking about our Mayor. (People tell me we do have one, although I’m dubious…). Or I could be talking about the state’s budget crisis, Mitch’s privatization fixation, or the multiple failures of what Harrison Ullmann used to call the World’s Worst Legislature. Nationally, there’s our economic meltdown, the fact that we are mired in two ill-conceived and mismanaged wars, the damage that has been done to civil liberties and the justice system…well, you all know the drill.

 

But when I thought about it, I decided that there is a deeper problem—one that is really at the root of all the others. That problem is Americans’ loss of confidence and trust in government. I don’t mean our longstanding political debates about what government ought to do; those are both inevitable and in my opinion, productive. I’m talking about the de-legitimization of the whole enterprise of government. It is one thing to say that government should or should not do X; it’s another to say, as Ronald Reagan did, that government is the problem, not the solution.

 

I think our multiple current messes all begin with that attitude, with that scorn for using government to address even the most challenging of our collective civic problems. In my most recent book, Distrust, American Style (now available at a bookstore near you!!), I spend a lot of time discussing why Americans lost trust in our governing institutions. I actually wrote the book to address a different issue: the country’s growing diversity.

 

Because America is—and has always been—a remarkably heterogeneous country, we have long been consumed by the question “what is it that holds us together?”  The proper answer to that question, in my own opinion, is what one writer has called “our American covenant” and what I have called “The American Idea”—allegiance to the ideals that gave rise to the Declaration of Independence, the Constitution and the Bill of Rights.

 

Many of you are familiar with Robert Putnam’s book “Bowling Alone.” Putnam was worried about what the decline of civic clubs and bowling leagues meant for civic engagement. Well, more recently, Putnam has published research that led him to an even more troubling conclusion: he found that people who live in more ethnically diverse communities are less trusting of their neighbors than are people living in more homogeneous neighborhoods. And he found that they are less trusting of everyone, not just of those who belong to other ethnic groups.

 

Putnam’s original findings were controversial, but this current research has set off an academic firestorm.  Opponents of immigration, multiculturalism, and interfaith dialogue have seized upon Putnam’s research as evidence for their most paranoid fears. The article has especially been cited by opponents of immigration as proof that a continued influx of “others” will corrode the social fabric and doom the civic enterprise. You can almost hear Pat Buchanan urging “real” Americans to dig that moat.

 

When I began my research, I wanted to investigate whether this decline in what scholars call “generalized social trust,” assuming it had occurred, was really an outcome of increased diversity, or whether other aspects of contemporary civic life might be equally—or more—responsible. I also wanted to research whether the kind of trust America requires at this particular juncture in our national evolution is different from the kind needed in simpler, more rural communities, and if so, why and how.   In simpler societies, for example, we could depend on reputation to decide who was trustworthy. Gossip actually used to be valuable because it gave people information about who they could trust—and who they couldn’t. The prospect of a bad reputation that would become the source of gossip often was all it took to discourage untrustworthy behaviors. In more complicated societies, however, trust itself becomes more complicated. 

 

Think about it. We deposit our paychecks and take for granted that the funds will appear on our next bank statement. We make a deposit with the gas company without worrying whether they’ll turn on our heat. We mail checks to payees on the assumption that the envelopes will reach their destination, intact and unopened (if not always on time). We call the fire department and expect their prompt response. We even engage in internet transactions with merchants who may be located halfway around the world, merchants we’ve never dealt with before, because we trust their representations that their sites are secure and their merchandise will be shipped—the volume of business done in cyberspace multiplies exponentially month after month.

 

That kind of trust not only allows necessary social mechanisms to function, it makes our lives immeasurably more convenient and comfortable. But that isn’t trust in our neighbors; that’s trust in our common social institutions. And that’s where government comes in. Government is the largest and most important—not to mention the most pervasive—of our collective social mechanisms.

 

As America has grown larger and more complicated, the government has had to assume added responsibilities. Especially after the Depression, we recognized that citizens needed an “umpire,” a trustworthy institution to police and regulate a variety of business practices. Even the most ardent contemporary advocate of limited government is likely to concede the need for FDA regulation of food quality, for example. (I’m pretty libertarian, but I personally do not want so much “freedom” that I have to test the chicken I buy at my local Kroger for E. coli. I prefer to trust the FDA.) Americans today rely on government agencies to ensure that our water is drinkable, our aircraft flyable, our roads passable, and much more.

 

It would be difficult to overstate the importance of our being able to trust government agencies to discharge these and similar functions properly. When America goes through a period when government is inept or corrupt, or both, as we have these past eight years, that confidence is shaken, and our skepticism and distrust affect more than just the political system. That is because trust in government institutions sets the tone for our confidence in all institutions. When we perceive that our government is not trustworthy, that perception infects the entire society. There was a reason the United States experienced so much upheaval and social discord in the wake of the Watergate scandal.

 

In urban communities and complex societies, we will never know most of our neighbors, even by sight. The informal mechanisms people employed in simpler social settings—reputation, gossip, identity—can no longer carry the information we require, cannot give us the guidance we need. We don’t have many places like the bar in Cheers, places where everyone knows your name. We have no alternative but to put our trust in the complex web of institutions we have created—the police and other government agencies, Better Business Bureaus, watchdog industry groups and the like—to discharge their responsibility for maintaining the trustworthiness of our economic and social systems.

 

In my book, I identified two culprits responsible for our loss of trust: one unwitting, and one just witless. The unwitting culprit is privatization, and I spend a whole chapter on the Goldsmith administration. (You’ll need to read the book to see the connection between institutional trust and privatization, but it’s only $14 at Amazon.com.) Now, advocates of government contracting aren’t intentionally trying to make government less trustworthy—that’s just an unintended consequence. That’s why I say the outcome is unwitting.

 

The witless culprit, of course, was the Bush Administration. Let me just read the introductory pages of the chapter I devote to Bush, titled “Betrayal of Trust.”

 

The past decade has produced an unremitting—and seemingly escalating—litany of unsettling news, emanating from virtually all the major sectors of American society. It sometimes seems as if each day brings a new challenge or scandal. We sustained a stunning attack on American soil, reminding us that the oceans no longer safeguard us from the hostility of others. We invaded another nation because we were told that it had weapons of mass destruction that made it an imminent threat, only to discover that no such weapons existed. News reports have brought daily warnings that our governing institutions are “off the track.” There has been visible, worrying erosion of our constitutional safeguards. Meanwhile, the imperatives of population growth and commerce, technology and transportation, as well as politics, have eroded local control and hollowed out “states’ rights,” leaving people powerless to change or even affect many aspects of their legal and political environments.

Old-fashioned corruption and greed have combined with political and regulatory dysfunction to undermine business ethics. Enron, WorldCom, Halliburton, the sub-prime housing market meltdown—these and so many others are the stuff of daily news reports. Newspapers report on the stratospheric salaries of corporate CEOs, often in articles running alongside stories about the latest layoffs, reductions in employer-funded health care and loss of pensions for thousands of retired workers. Throughout most of this time, business forecasters have insisted that the economy was in great shape—a pronouncement that met with disbelief from wage earners who hadn’t participated in any of the reported economic gains, and whose take-home pay in real terms had often declined. By 2007, the gap between rich and poor Americans was as wide as it had been in the 1920s.[1] Many of the business scandals were tied to failures by—or incompetence of—federal regulatory agencies; others were traced back to K Street influence-peddlers of whom Jack Abramoff is only the most prominent example.[2]

Meanwhile, American religious institutions have not exactly covered themselves with glory, heavenly or otherwise. Doctrinal battles over ordination of women and gays have split congregations. Revelations ranging from misappropriation of funds to protection of pedophiles to the “outing” of stridently anti-gay clergy have discouraged believers and increased skepticism of organized religion. In that other American religion, major league sports, the news has been no better. High profile investigations confirmed widespread use of steroids by baseball players. At least one NBA referee was found guilty of taking bribes to “shade” close calls, and others have been accused of betting on games at which they officiate. Football players seem habitually prone to wind up on the front pages; Atlanta Falcon Michael Vick’s federal indictment and guilty plea on charges related to dog fighting was tabloid fodder for several weeks. Even charitable organizations have come under fire; a few years ago, United Way of America had to fire an Executive Director accused of using contributions to finance a lavish lifestyle. Other charities have been accused of spending far more on overhead than on good works.

The constant drumbeat of scandal has played out against a background of gridlock and hyper-partisanship in Washington. And—more significantly, for purposes of the public mood—all of it has been endlessly recycled and debated by a newly pervasive media: all-news channels that operate twenty-four hours a day, talk radio, satellite radio, “alternative” newspapers, and literally millions of blogs (weblogs), in addition to the more traditional media outlets.[3] Political gaffes and irreverent commentaries find their way to YouTube, where they are viewed by millions; wildly popular political satirists like Jon Stewart, Bill Maher and Stephen Colbert have used cable television to engage a generational cohort that had not traditionally focused on political news. Everyone who leaves government service seems to write at least one book pointing an accusing finger or otherwise raising an alarm; their exposés join literally hundreds of other books (most of them alarmist) cranked out by pundits, political scientists and scolds playing to partisan passions. The political maneuvering, cozy cronyism and policy tradeoffs that used to be the stuff of “inside baseball,” of interest only to political players and policy wonks, are increasingly the stuff of everyday conversation at the local Starbucks. In this hyper-heated media environment, if you don’t like the news, you can run—but you really can’t hide. Even partisans who limit their news sources to those likely to validate their opinions hear about the latest controversies, if only from their chosen perspective.

When you add to this constant din of revelations, charges and counter-charges the highly visible and widely reported ineptitude of the current administration’s handling of Hurricane Katrina, the drawn-out, inconclusive war in Iraq, the even more nebulous and worrisome conduct of the so-called “War on Terror,” and mounting questions about the nature and extent of government surveillance, is it any wonder American citizens have grown cynical?  Furthermore, all these miscues and misdeeds—and many more—are taking place in an environment characterized by economic uncertainty and polarization, as well as accelerating social, technological and cultural change (including but certainly not limited to the growth of diversity). Add in the so-called “culture wars,” and it’s not hard to understand why generalized trust has eroded.

 

We are not the only country to have gone through periods of turmoil, corruption or worse. I know of none that have escaped episodes of poor—sometimes disastrous—leadership. And as anyone who follows the news knows, democracy is no guarantee that you won’t get leaders who are ill-equipped to govern. All governments are human enterprises, and like all human enterprises, they will have their ups and downs. In the United States, however, the consequences of the “down” periods are potentially more serious than in more homogeneous nations, precisely because this is a country based not upon identity but upon covenant. Americans do not share a single ethnicity, religion or race. We never have. We don’t share a worldview. We don’t even fully share a culture. What we do share is a set of values, and when the people we elect betray those values, we don’t just lose trust. We lose a critical part of what it is that makes us Americans.

 

Policy prescriptions and ten-point plans are all well and good, but at the end of the day, our country won’t work unless our public policies are aligned with and supportive of our most fundamental values. The people we elect absolutely have to demonstrate that they understand, respect and live up to those values.

 

As we in this room know, the word “values” means different things to different people. In the wake of the 2004 election, I remember pundits telling us that Bush voters had come out on Election Day to vote for “values.” What they meant by values—opposition to reproductive choice and equal rights for gays and lesbians, and nationalistic jingoism masquerading as patriotism—was the antithesis of the American ideals most of us really do value.

 

Let me be quite explicit about what I believe to be genuine American values—values that have been shaped by our constitutional culture, values that are shared by the millions of Americans who have been dismayed, enraged and dispirited by the revelations of the past eight years. Real American values are the values that infuse the Declaration of Independence, the Constitution and the Bill of Rights, the values that are absolutely central to the American Idea.

  • Americans value justice and civil liberties—understood as equal treatment and fair play for all citizens, whether or not they look like us, and whether or not we agree with them or like them or approve of their reading materials, religious beliefs or other life choices.
  • Americans value the rule of law. And we believe that no one is above the law—most emphatically including those who run our government. We believe the same rules should apply to everyone who is in the same circumstances, that allowing interest groups to “buy” more favorable rules or other special treatment with campaign contributions, political horse-trading or outright bribery is un-American.
  • Americans value our inalienable right to speak our minds, even when—perhaps especially when—we disagree with our government. We understand that dissent can be the highest form of patriotism, just as mindless affirmation of the decisions made by those in power can create untold damage. Those of us who care about America enough to speak out against policies that we believe to be wrong or corrupt are not only exercising our rights as citizens, we are discharging our most sacred civic responsibilities.
  • Americans believe that when politicians play to the worst of our fears and prejudices, using “wedge issues” to marginalize immigrants, or gays, or blacks, or “east coast liberals” (a time-honored code word for Jews) in the pursuit of political advantage, they are betraying American values.
  • Americans value reason and respect for evidence, including scientific evidence. We may go “off the reservation” from time to time, especially when the weight of the evidence points to results we don’t like, but eventually, Americans will place reason and compromise above denial and hysteria in the conduct of our collective affairs.
  • To use the language of the nation’s Founders, Americans value “a decent respect for the opinions of mankind” (even European mankind).  
  • Finally, Americans value the true heartland of this country, which is not to be found on a map. The real heartland is made up of all the Americans who struggle every day to provide for their families, dig deep into their pockets to help the less fortunate, and understand their religions to require goodwill and loving kindness. The men and women who make up that heartland understand that self-righteousness is the enemy of righteousness. They know that the way you play the game is more important, in the end, than whether you win or lose. And they know that, in America, the ends don’t justify the means.

 

Americans’ ability to trust one another depends to a very great extent on our ability to keep faith with those values.

 

Life in a liberal democratic system is never going to be harmonious. Harmony, after all, wasn’t the American Idea. Despite the dreams of the communitarians, we aren’t all going to share the same telos; at most, we will have what the philosopher John Rawls called an “overlapping consensus.” In a country that celebrates individual rights and respects individual liberty, there will always be dissent, differences of opinion, and struggles for power. But there are different kinds of discord, and they aren’t all equal. When we argue from within our constitutional culture—when we argue about the proper application of the American Idea to new situations or to previously marginalized populations—we strengthen our bonds and learn how to bridge our differences. When our divisions and debates pit powerful forces wanting to rewrite our most basic rules against citizens who don’t have the wherewithal to enforce those rules, we undermine the American Idea and erode social trust.

 

At the end of the day, diversity (however we want to define it) is not the problem. And that’s a good thing, because the fact is that increasing diversity is inescapable. The real issue is whether it is too late to restore our institutional infrastructure and make our government competent and trustworthy again—whether it is too late to reinvigorate the American Idea and make it work in a brave new world characterized by nearly instantaneous communications, unprecedented human mobility, and the twin challenges of climate change and international terrorism.

 

The election of Barack Obama was a very hopeful sign, but the damage done during the past eight years to our most important national values and institutions is going to be very hard to reverse. As we lawyer-types like to say, the jury is still out.

Going Galt

If you’ve been following the financial news (and these days, who hasn’t?), you’ve probably come across stories about various wealthy and well-connected folks who are so incensed about Obama’s intention to let Bush’s tax cuts for the wealthy expire that they are threatening to “go Galt.”

The reference is to John Galt, the hero of Ayn Rand’s monumental book, Atlas Shrugged. In the book, Galt and other highly productive members of society decide to simply withdraw from participation in an economic system that has—in Rand’s view—become corrupted. The economic environment in Atlas Shrugged is highly politicized, with the result that it takes from those who are productive and honorable, and gives to those who are intellectually dishonest and morally defective. Rand characterizes the latter as “looters” and as “pull-peddlers” (what we would call “influence peddlers”)—people who know how to work the system to gain advantage over those who play by the rules.

What is so ironic, of course, about these publicized threats to “go Galt”—which in this case means to cut back on work in order to keep one’s taxable income under $250,000—is that they are being made by folks who have a lot more in common with Rand’s “looters” than with John Galt. These rants and threats are coming from people who have been prospering by doing all the things Rand (and Galt) hated. They are the people who were born into privilege, the people whose companies benefited from favorable tax breaks, lax regulation, and the ability to hire lobbyists to skew the system in their favor, rather than through the production of anything of value. To those of us who have actually read the book, they look a lot more like James Taggart, the slimy, politically connected, perpetually whining brother of the heroine Dagny Taggart.

It isn’t only Atlas Shrugged that the “don’t raise the tax on my marginal income another three percent” folks are mischaracterizing. As the noted economist Amartya Sen pointed out in a recent essay in the New York Review of Books, these self-righteous, self-proclaimed “pro-business” types have also been playing fast and loose with Adam Smith and the “Wealth of Nations.”

As Sen points out, Smith viewed markets and capital as doing good work “within their own sphere,” but he also explicitly recognized that markets required “restraint and correction” by other institutions—including well-devised government regulations and state assistance for the poor—in order to prevent “instability, inequity and injustice.” Smith—who was not an economist, but a Professor of Moral Philosophy—also recognized that “commercial exchange could not effectively take place until business morality made contractual behavior sustainable and inexpensive—not requiring constant suing of defaulting contractors, for example.”

It’s bad enough that extremists on the political right have insisted upon highly selective readings of both the Bible and the Constitution. Now they are selectively reading both Ayn Rand and Adam Smith, as well.

Or maybe they haven’t actually read any of them. That really would explain a lot.

Confusing the Issue

Earlier in my academic career, I did research into what Americans erroneously call “privatization”—outsourcing government functions to for-profit and nonprofit organizations. (True privatization would require government to divest itself of that activity. Through outsourcing and grant-making, government essentially “hires” an outside entity to do the work, but still pays the bills and retains responsibility for providing the service.)

 

Outsourcing raises constitutional issues, because only government can violate the Bill of Rights, and outsourcing makes it difficult to tell when government has acted. Recent headlines remind us that blurring the lines between public and private raises other thorny issues as well. The mess at FSSA is one recent reminder that contracting can create as many problems as it can solve.

 

Outsourcing is a tool. Sometimes it is the appropriate tool, sometimes it isn’t. Government agencies aren’t alone in losing control over contractors or grantees; the practice of outsourcing mortgage processing contributed significantly to the current banking crisis.

 

One truly bizarre result of the increasingly complicated relationship between the public and private sectors was the recent invalidation of the mayoral election in Terre Haute. Duke Bennett had defeated former Mayor Kevin Burke, and Burke sued, alleging that Bennett was ineligible to hold the office.

 

Bennett was employed as Director of Operations at Hamilton Center, a nonprofit established primarily for the purpose of providing behavioral health services. In 2007, the Center also opened a Head Start program, supported partly by a grant from HHS. The grant was $861,631, of which $125,789 was for Head Start’s proportionate share of overhead (security, maintenance, liability insurance, etc.).

 

Burke sued to have Bennett declared ineligible under a law that applied the Hatch Act to Head Start grant recipients, and provided that such recipients should be “treated as a local government agency funded through Federal grants or loans.”

Bennett was responsible for providing and managing some of those overhead services, not simply for the Head Start program, but for all programs the Center operated. The Court found that $2,041—or 1.84% of Bennett’s salary and benefits for 2006-2007—came from the federal grant.

 

The court also found that “the violation was not willful or intentional,” that the issue hadn’t been raised during any of Bennett’s three prior election bids, and that his role with Head Start was essentially non-existent. Nevertheless, the court held that Bennett was effectively a government employee, and thus prohibited from running for office.

 

There are many things we could say about the insanity of this result—all negative. The ruling has already encouraged other losing candidates to sue, and promises to create electoral uncertainty across Indiana.

 

The Hatch Act was intended to prevent abuses of power, not to limit the pool of people willing to engage in the political process. Yet in smaller communities, where overlapping civic commitments are the norm, shrinking that pool will almost certainly be the result.

 

If we continue down this path, we may end by transforming every recipient of a government grant, however minimal or accidental, into a government employee.


Patronage versus Progress

Whoever said “The more things change, the more they stay the same” was probably thinking of Indiana.

Governor Mitch Daniels recently held a press conference at which he addressed the critical challenges now facing our state. He was flanked by former Governor Joe Kernan, a Democrat, and Indiana Chief Justice Randall Shepard, a Republican. The message was simple and direct: Indiana’s looming fiscal crisis makes adoption of the Kernan-Shepard Commission recommendations especially urgent.

The response of Indiana elected officials was dispiriting, to put it mildly. According to the Indianapolis Star, “County officials said they don’t want to give up their elected positions. School boards stressed that they oppose forced consolidation. And House Speaker B. Patrick Bauer said the General Assembly has more pressing matters to consider next year than ‘an academic’s view of how government should operate, without any consideration given to whether such ideas are practical, or even feasible, in the real world.’”

Bauer’s comment, in particular, reminded me why the late Harrison Ullmann used to call the Indiana General Assembly “The World’s Worst Legislature.” It also reminded me of a lengthy conversation I had some years ago with George Geib, Indiana’s pre-eminent political historian. As he told me then, what really drives Indiana’s political culture is not ideology, but patronage.

Patronage and political self-interest have kept Indiana’s government bloated, costly and inefficient. In fact, the only good thing you can say about our resistance to modernization is that the effort to keep state government mired in the late 1800s has been entirely bipartisan—a lonely example of co-operation in our otherwise polarized politics.

It is understandable that people whose jobs are on the line would resist efforts to bring Indiana into the 21st century. But it was Pat Bauer’s snide dismissal of the Kernan-Shepard recommendations as “academic” that provided us with a perfect example of what is wrong with the Indiana General Assembly.

Leaving aside the use of the word “academic” to mean nonsensical (okay, I’m a bit sensitive there!), how many overlapping units of government does Bauer’s “real world” need? Indiana has 3,100 units of government, run by 10,300 people paid for with our tax dollars. We have more counties than California. The reforms recommended by the Commission have long characterized government in most other states.

Maybe this slicing and dicing of jurisdictions into so many small units made sense when it took half a day (by horse) to reach the county seat. But in the “real world” I live in, it takes half an hour or less. Increasingly, I don’t need to travel at all; I can renew many permits and obtain needed information online.

The Kernan-Shepard Commission studied Indiana’s multiple levels of government, held hearings around the state, reviewed reforms instituted elsewhere in “the real world” and issued recommendations of 27 ways to cut waste, become more efficient, increase accountability and save tax dollars.

Government officials are supposed to work for us. Thanks to Indiana’s entrenched patronage, we seem to be working for them.

Health and Prosperity

In 2006, the Economist—hardly a left-wing publication—had this to say about the U.S. healthcare system:

“America’s health care system is unlike any other. The United States spends 16% of its GDP on health, around twice the rich country average, equivalent to $6,280 for every American each year. Yet it is the only rich country that does not guarantee universal health coverage. Thanks to an accident of history, most Americans receive health insurance through their employer, with the government picking up the bill for the poor (through Medicaid) and the elderly (through Medicare).

[…]

In the longer term, America, like this adamantly pro-market newspaper, may have no choice other than to accept a more overtly European-style system.” 

 

We have all heard the litany: Forty-six million Americans are uninsured. America spends more per person than any other country, but ranks 37th in overall quality of service. If our infant mortality rate were as good as Cuba’s—Cuba’s!—we would save the lives of an additional 2,212 babies every year. Duplicative paperwork wastes billions each year. People with pre-existing conditions are chained to jobs they don’t like. The list goes on, and I don’t intend to stand here and repeat it, or add to it. Most of you are already all too aware of the problems.

 

Instead, I want to make an economic development/American competitiveness argument for single-payer national health insurance—something along the lines of what United Senior Action has called, if I am not mistaken, “Medicare for all.”

 

As I analyze the situation, if the United States adopted a “single payer” health insurance system funded through tax revenues and administered through a single insurer, we could expect a number of positive economic results, in addition to better health and reduced social anxiety.


First of all, we would see an increase in economic development/job creation. The business sector currently spends an amount in excess of its net profits to provide health insurance for employees, and the cost of health insurance is the single largest “drag” on new job creation. The difference between what it costs an employer to create a new position and the amount the employee actually receives is sometimes called the employment “wedge.” As health costs and insurance premiums escalate, the wedge grows larger and inhibits the hiring of additional workers. In good economic times, that is troubling; in times like these, it can be catastrophic.


For the shrinking number of companies that can afford to offer health insurance, negotiating and administering medical benefits, and complying with the government regulations attendant to them, consumes untold hours of HR time. This is a drag on productivity—a generator of overhead costs that reduce profits and divert effort away from core business operations. Single-payer would remove those costs and that burden. If you don’t think that would be economically significant, let me share an example. In the case of our struggling auto industry, amounts paid for employee health care add somewhere between $1,800 and $2,000 to the price of each new car. No wonder American automakers find it difficult to remain competitive! It should be noted that in a single-payer system, doctors’ overhead would similarly decline: currently, medical offices spend considerable sums on personnel whose only job is dealing with insurers—confirming coverage, complying with insurer regulations, submitting claims on multiple different forms and collecting amounts due. The US could save millions of dollars each year JUST by standardizing insurance forms!


Smaller companies—the engines of economic growth and job creation—are increasingly unable to offer benefits, and that puts them at a competitive disadvantage when they try to hire good employees. If health coverage were de-coupled from employment, the United States would become a much more attractive location for new businesses, and incentives to outsource production to overseas workers would be reduced. (Tell Toyota/Canada story.)


We should also note that, if the burden of providing health care coverage were removed from employers, they could increase wages by some percentage of the amount currently being paid for insurance.


There are two predictable, immediate responses to suggestions that we provide national health insurance. The first is that we can’t afford it; the second, that quality of care would be compromised.


Let’s dispose of the question of costs first, because there is enormous public ignorance of the costs we already incur. It may surprise many of you to know that any additional tax revenues needed to accomplish universal basic coverage would be minimal, for the following reasons:

·         Government at all levels already expends huge amounts on health, through Medicaid, Medicare and other federally required programs (Mothers and Children, AIDS, etc.), through health care research grants, through insurance for public employees (universities, police, public school teachers, state and municipal workers, etc.), and through support for public hospitals like Wishard. By some estimates, American government at all levels already pays for over 60% of American health care. We just do it in the least efficient, most wasteful way imaginable. In single-payer countries, governments pay an average of 70% of all health costs.

·         Furthermore, the economies of scale available in a national system would allow us to effect significant savings. We could save money not just by standardizing paperwork, but also by lowering the costs of administration. It is estimated that between 25 and 30% of private U.S. healthcare expenditures are eaten up by administrative overhead. Medicare, on the other hand, keeps its overhead costs between 2 and 3%. It’s not just big salaries for the insurance executives—although that’s part of it. The biggest chunk is marketing, including the costs involved in “cherry-picking” (explain). We could save a very large percentage of these overhead costs by administering insurance through government, or even by doing as some European countries do—contracting with a few insurers to administer the program on government’s behalf, on condition that the premium structure eliminate the marketing costs that are now included.

·         An often-ignored benefit of a national system is that it would provide an incentive for more effective public health and prevention services—incentives our current, patchwork system lacks. Total costs decline when people are able to access routine medical care soon after the onset of symptoms, rather than visiting far more expensive emergency rooms when they can no longer ignore the problem.

·         A national system could—and should—save money by negotiating with drug manufacturers and other medical vendors for lower prices. Every other industrialized country does this, and to be honest, I was outraged when the Bush Administration prohibited such negotiations in the bill that expanded Medicare’s prescription drug coverage. That bill was a cynical give-away to drug and insurance companies. I used to believe the drug company argument that research and development would suffer if they couldn’t price new drugs at high levels. But that was before I understood how much medical and pharmaceutical research is underwritten by taxpayers through grants from institutions like the NIH. Frankly, we would all be better off if drug companies diverted some of the five billion dollars they spend each year on television ads for Viagra and the “purple pill” to research and development. 

·         Cost controls would also be enormously enhanced by eliminating the practice of cost-shifting by hospitals. That practice is increasingly irrational; as you all know, those of us who are hospitalized and who have insurance pay prices that have been inflated to cover the costs that cannot be recovered from those without. On the other hand, because insurance companies exercise considerable pressure on hospitals, those same providers will often charge uninsured but solvent patients more for the same procedures. There is no uniformity to these practices, and they make rational cost accounting difficult, if not impossible. (Dan Hodgkins story)


These savings are often identified by proponents of national health care. What is far less frequently recognized is that even if taxes did go up, individuals would save money as well.

·         Automobile and homeowners insurance premiums would decline significantly, because the underwriting would no longer need to take the costs of medical care into account.

·         The considerable percentage of citizens who are currently uninsured would not incur significant out-of-pocket costs attributable to illness or accident.

·         And of course, those who are currently paying for their own insurance would have that considerable expense lifted. I had a student a year or so ago who did not have employer-paid coverage. She worked for a nonprofit CDC, and she and her husband were paying over $12,000 a year for their family of two adults and two children. That is without co-pays and other out-of-pocket costs.


Those are just a few of the quantifiable, cash savings we could realize under a single-payer system. But there are also significant social costs associated with our current haphazard approach to healthcare. If all citizens had basic health coverage, America would arguably see a decline in the social costs associated with the current dysfunctional system. Let me just give you a few examples of what I mean:

·         Over 50% of personal bankruptcies are attributable to medical bills; those bankruptcies cost local businesses millions of dollars, and are a drag on the economy.

·         Employees with pre-existing conditions would no longer be chained to jobs they dislike.

·         Absenteeism could be expected to decline.

·         Immunizations would increase, and infant mortality decline.

·         Studies also suggest that violent crime rates decline and social trust climbs as social safety nets increase.

Even a small drop in crime yields huge savings and increases the quality of life.

While not quantifiable, these consequences are far from insignificant.


So much for costs. What about the argument that “socialized medicine” will cause a decline in the quality of American health care? That markets are the most efficient way to allocate services and costs?


I am a great believer in markets. But functioning markets require a willing buyer and seller, both of whom have access to adequate relevant information. They do not work in areas where there is unequal access to information and widely unequal bargaining power. Both of those situations characterize the sale and purchase of medical care.


Even if markets in medical services did work, however, we don’t have a market now, if we ever did. We already have socialized medicine, but we have the very worst possible system—we have socialized medical care through the private insurance companies. The result is that we have the worst of both systems. A recent study by the Commonwealth Fund found that 82% of Americans are dissatisfied with our current patchwork approach to health care, and believe the system should be fundamentally changed.


One in three adults reported their doctors had ordered a test that had already been done, or had recommended unnecessary treatment or care within the past two years.  Forty-seven percent had experienced poorly coordinated care—meaning they hadn’t been informed of test results, or had to call repeatedly to get them, or that important medical information had not been shared between doctors and nurses, or between primary care physicians and specialists.


We hear a lot about waiting times in nationalized systems, but wait times are a significant problem in the US right now. Nearly 3 out of 4 Americans reported difficulty getting timely doctor’s appointments, telephone advice, or after-hours care, and that included people with health insurance. There is a reason that America consistently ranks 37th or 38th in quality of health care, despite the fact that we spend over twice as much per person as the next most expensive system.


I do not pretend to have expertise in how we should proceed to overhaul our system, but I do know we are not limited to Canada and Great Britain as models. (Which is not to say those systems aren’t working; despite the criticisms we hear, I have friends in Canada and a granddaughter in Great Britain, and they are very happy with their care.) France and New Zealand, to name just two others, have widely praised systems. My son lived in France for three years (explain).


We have the luxury of learning from the history and performance of multiple other systems. All that stops us is ideology and a stubborn refusal to believe that other countries might have lessons to teach.


The one bright hope is that the bankrupt nature of our current system has become apparent to anyone who cares to look. Large employers like GM, which have historically been opponents of national health care, are now favorably inclined. Even the AMA has offered a plan—although it is a pretty flawed one. Doctors have largely come to recognize that their interests would be better served by a single-payer system, and groups like Physicians for a National Health Program are working hard to make that happen. There are rumors that Senator Kennedy is working feverishly on a plan, because he wants health care to be his legacy. And with the election of a President who actually understands the economics of our current situation, I am cautiously optimistic that the time for change may FINALLY have come.


It can’t happen a moment too soon!