Legal Nostalgia

A former student recently needed a copy of the syllabus I’d used in her graduate Law and Policy class back in 2010. When I reviewed it, I was struck by the changes effected by Trump, MAGA, and our current, corrupt Supreme Court majority. I became positively nostalgic for the legal environment of my time in the classroom–nostalgic for the “black-letter law” and for precedents that were considered settled by my cohort of lawyers and law professors.

In that syllabus, I explained the course as follows:

___________

This course will examine the response of the American legal system, with its historic commitment to individual liberty and autonomy, to the growth of the administrative state and to an increasingly complex social environment characterized by pluralism and professional differentiation. We will discuss conflicting visions of American government and different approaches to public administration, and consider how those differences have affected the formation and implementation of public policy within our constitutional framework. Throughout, we will consider the constitutional and ethical responsibilities of public service—the origins of those responsibilities and their contemporary application.

While relatively few people will become public officials or public managers, all Americans are citizens, and most citizens will participate in the selection of public officials and will take positions on the policy issues of the day. Accordingly, this course is intended to introduce all students to the constituent documents that constrain public action and frame policy choices in the American system. These explorations will inevitably implicate political (although not necessarily partisan) beliefs about the proper role of the state, the health of civil society, and the operation of the market. To the extent possible, these theoretical and philosophical beliefs will be made explicit and their consequences for policy and public sector behavior examined. The goal is to help students understand why certain policy prescriptions and/or public actions attract or repel certain constituencies, and to recognize the ways in which these deeply held normative differences impact our ability to forge consensus around issues of public concern.

In the course of these inquiries, we will consider the implications of the accelerating pace of social change for issues of governance: globalization, especially as it affects considerations of legal jurisdiction; the increasing interdependence of nations, states, and local governmental units; the blurring of boundaries between government, for-profit and nonprofit organizations, and the effect of that blurring upon constitutional accountability; the role of technology; and the various challenges to law and public management posed by change and diversity, including the impact and importance of competing value structures to the formation of law and policy.

By the end of the semester, students should be able to recognize legal and constitutional constraints on public service and policy formation, and to identify areas where public policy or administration crosses permissible boundaries. They should be able to recognize and articulate the impact of law and legal premises on culture and value formation, and to understand and describe the complex interrelation that results.

_________

During my years on the faculty teaching law and policy, it never occurred to me that I would live in an America where a President and virtually everyone in his administration would find the foregoing paragraphs incomprehensible–where individuals in positions of authority would reject–indeed, be unfamiliar with–the very concept of Constitutional restraints, let alone the existence and importance of civil society and/or competing arguments about the proper role of government.

I certainly wouldn’t have anticipated that so many of the ambitious politicians serving in the House and Senate–men and women presumably concerned for the national interest–would neuter themselves in slavish submission to a man whose ignorance of government and policy and whose intellectual and moral deficits were impossible to ignore even before the emergence of unmistakable dementia.

I would have rejected as fanciful the notion that a duly constituted United States Supreme Court would substitute partisan ideology and Christian nationalism for the rule of law, upending years of settled precedents and thoughtful, considered jurisprudence, not to mention the Separation of Powers that lies at the very heart of our constitutional architecture.

And yet here we are.

Forgive this somewhat whiny post, but coming across my old syllabus has made me nostalgic for the legal world I once inhabited. It wasn’t perfect, but it was infinitely preferable to our current reality, and we need to recover, reinstate, and improve it.


The Greeks Were Right

The early Greeks are said to have invented the idea of democracy, but that wasn’t their only contribution to the philosophy of governance. They also pioneered the concept of the “golden mean,” the mean between extremes.

Right now, we are experiencing an assault on both of those critically important concepts.

The assault on democracy is well understood; indeed, it preoccupies the political discourse. The importance of the “Golden Mean” is less understood. The Golden Mean was a core concept in Aristotelian ethics; Aristotle argued that virtue consists of finding the right balance in our behaviors and emotions. (For example, courage is a virtue that lies between the extremes of recklessness and cowardice. Generosity is a virtue that lies between stinginess and prodigality.)

American politics constantly wrestles with the proper balance between individualism and communitarianism. The country was founded on the principle that individuals are entitled to a generous zone of liberty–a zone that government should not invade until or unless that individual is harming the person or property of another. That principle gave rise to a very American, almost religious belief in individualism, and a corresponding suspicion of social programs and laws for the common good, which are inevitably opposed as un-American “socialism” or “communism.”

In the real world, of course, we are faced with finding a proper balance: what sorts of things really must be done communally, and when do government programs unnecessarily breach individual liberties? (I will ignore, for purposes of this discussion, the hypocrisy of MAGA folks who disdain “socialism” only when it benefits poor folks, and who have no problem with a corporatism that translates into socialism for the rich and a brutal capitalism for everyone else…)

What triggered the foregoing discussion was an article from the Guardian about–of all things–diet and exercise and long life. The article noted a decline in public health and life expectancies in rich countries, and posed the obvious question: what explains the gap between the public’s growing knowledge about living longer and its collective health going backwards?

The author of the essay is a public health scientist in Great Britain, whose job is to study the factors that affect how long we will live. As she wrote,

Most of these are out of individual control and have to do with the country and community we live in. The truth is, this “self-help” narrative doesn’t reflect the reality of how health works. In fact, the focus on personal responsibility and self-improvement has distracted us from the real issue – the impact that public policy, infrastructure and community make in affecting our health chances and longevity.

After citing the far better health and longevity outcomes in places like Japan, she writes that “What stands out about these places is that the people living there don’t just make individual choices that lead to better health – they live in places where healthy lives are normalised by government and culture.”

As I talk about in my new book, if I’m going to live to 100, I need more than fastidiously counting my calories and posting pictures of myself exercising on Instagram (which I am guilty of). I need to live in a world where health is a collective responsibility, not an individual one. This means supporting policies that make us all healthier – and politicians who prioritise the conditions for good health such as nutritious food especially for children, active cities, clean air policies, preventive healthcare and public provision of water, which should be at the core of what a government provides its citizens. There are lessons in how to improve life in all of these areas across the world: these are places where good health is built into daily life.

I confess that I have a strong libertarian streak, and a corresponding belief in the importance of the individual values of diligence, honesty, and hard work. But common sense requires recognition of the importance of the communities in which we live–the societies within which we are, in communitarian jargon, “embedded.” People cannot pull themselves up by their bootstraps if they don’t have boots. They cannot simply choose to breathe clean air and drink uncontaminated water. Poor people without health insurance cannot simply decide not to need medical care.

Whether politicians want to acknowledge it or not, there are major elements of our lives that can only be addressed communally, and most of those can only be accomplished through government. Our job is to craft a social infrastructure that is adequate, that supports without intruding–to find that elusive “Golden Mean.”

I don’t think MAGA is interested…..


The War On Knowledge

When citizens are subjected to a “flooding of the zone”–daily assaults on a wide variety of systems, beliefs and values that have long been an accepted part of our governing environment–we can be forgiven for a lack of focus. It’s hard enough just to keep track of what is happening, let alone to decide which attacks are most worrisome. But Adam Serwer makes a good case for putting the war on knowledge at the top of the list.

In The New Dark Age, Serwer writes,

The warlords who sacked Rome did not intend to doom Western Europe to centuries of ignorance. It was not a foreseeable consequence of their actions. The same cannot be said of the sweeping attack on human knowledge and progress that the Trump administration is now undertaking—a deliberate destruction of education, science, and history, conducted with a fanaticism that recalls the Dark Ages that followed Rome’s fall.

Serwer enumerates the Trump assaults: threats to withhold funding from colleges and universities that don’t submit to MAGA demands. Sustained attacks on the engines of American scientific inquiry–the National Science Foundation and the National Institutes of Health–and on repositories of America’s history, including the Smithsonian. Arts organizations and libraries are losing funding. Large numbers of government scientists have lost their jobs, and remaining researchers have been prevented from broaching forbidden subjects. “Entire databases of public-health information collected over decades are at risk of vanishing. Any facts that contradict the gospel of Trumpism are treated as heretical.”

These various initiatives and policy changes are often regarded as discrete problems, but they comprise a unified assault. The Trump administration has launched a comprehensive attack on knowledge itself, a war against culture, history, and science. If this assault is successful, it will undermine Americans’ ability to comprehend the world around us. Like the inquisitors of old, who persecuted Galileo for daring to notice that the sun did not, in fact, revolve around the Earth, they believe that truth-seeking imperils their hold on power.

Serwer describes the attacks on universities. He uses the example of West Point, and the administration’s purge of forbidden texts there, to reveal what MAGA’s “ideal university” might look like.

West Point initiated a schoolwide push to remove any readings that focused on race, gender or the darker moments of American history. A professor who leads a course on genocide was instructed not to mention atrocities committed against Native Americans, according to several academy officials. The English department purged works by well-known Black authors, such as Toni Morrison, James Baldwin and Ta-Nehisi Coates.

The Trump administration’s attack on knowledge is broad-based; it isn’t limited to academia. The administration has also singled out and fired government employees involved in research of multiple kinds.

These are people who do the crucial work of informing Americans about and protecting them from diseases, natural disasters, and other threats to their health. Thousands of employees at the Centers for Disease Control and Prevention have been let go, including most of those whose job it is to maintain workplace safety standards. Experts at the Food and Drug Administration including, according to the Times, “lab scientists who tested food and drugs for contaminants or deadly bacteria; veterinary division specialists investigating bird flu transmission; and researchers who monitored televised ads for false claims about prescription drugs” have been purged. Workers in the Department of Agriculture’s U.S. Forest Service research team, who develop “tools to model fire risk, markets, forest restoration and water,” have been targeted for layoffs. The Environmental Protection Agency’s entire research arm is being “eliminated.” The administration has made “deep cuts” to the Department of Education’s research division.

Serwer enumerates the nature of the cuts and their foreseeable consequences, especially for public health. As he notes, modern agriculture and medicine, and advances in information technology like the internet and GPS were built on the foundation of federally funded research.

For the past century, state-funded advances have been the rule rather than the exception. Private-sector innovation can take off after an invention becomes profitable, but the research that leads to that invention tends to be a costly gamble—for this reason, the government often takes on the initial risk that private firms cannot. Commercial flight, radar, microchips, spaceflight, advanced prosthetics, lactose-free milk, MRI machines—the list of government-supported research triumphs is practically endless.

MAGA’s racist fight against “wokeness” requires destroying huge swaths of scholarship and research, and distorting any American history that undercuts the administration’s goal: destroying the “ability to discover, accumulate, or present any knowledge that could be used to oppose Trumpism.”

You really need to click through and read the entire essay–and weep.

Welcome to a new Dark Ages.


The Perils Of Privatization

According to the Washington Post, Elon Musk and the Trump Administration are hauling out an “oldie but goodie” and promising that once they’ve hollowed out the federal government’s capacity to govern, they’ll turn any functions they deem necessary over to the private sector. They’ll privatize for “efficiency.” What could possibly go wrong?

Let me count the ways.

I spent a fair amount of my academic career researching what folks on the Right misleadingly call “privatization.” The first thing you need to know is that calling what Trump and Musk want to do “privatizing” is a misnomer. When Margaret Thatcher sold off government-owned industries to the private sector–where they made or lost money, paid taxes, and were left to sink or swim–that was privatization. In the U.S., the term is used to mean contracts between a government agency and a business or nonprofit organization to provide a government benefit or service. Government continues to pay for that service or benefit with tax dollars, and government remains responsible for its proper delivery.

Sometimes, contracting out makes sense. Sometimes it doesn’t. (It also shouldn’t be confused with procurement–government’s purchase of goods and services from the private market.)

Contracts with units of government are qualitatively different from contracts between private actors, and those differences make it far more likely that the “privatization” contracts ultimately negotiated will be unfavorable to taxpayers. Contracting out first became a fad at the state and local level some twenty-plus years ago, and the results weren’t pretty.

As I wrote back in 2013, mayors and governors who are considering privatization are operating under a different set of incentives than the corporate CEO who is charged with long-term profitability of his business. Long term to a politician means “until the next election.” Typically, the elected official is looking for immediate cash to relieve fiscal stress (and improve his immediate political prospects) and is much less concerned with the extended consequences of the transaction.

Furthermore–although it really pains me as a former Corporation Counsel to admit this–the lawyers who reviewed these deals for local governments tended to be far less sophisticated than lawyers acting on behalf of the contractors. That’s not because they aren’t good lawyers–most are. But the skills required to advise a municipality or state agency aren’t generally the same skills as those needed by practitioners of business transaction law.

In addition to the existence of unequal bargaining capacities, there is also–unfortunately–the very high potential for “crony capitalism,” the temptation to reward a campaign donor or political patron with a lucrative contract at taxpayer expense. Back in the bad old days, patronage meant that you volunteered for the party and if your party won, you–or maybe your brother-in-law–got a job with the city or state. With “privatization,” patronage meant that you made a meaningful contribution to the party and if it won, you got a cushy contract.

Ideally, the media would act as a watchdog in these negotiations, alerting the public when a proposed contract is lopsided or otherwise unfavorable. But the media has never been very good at providing this sort of scrutiny, because news organizations rarely employ business reporters able to analyze complex transactions. (In today’s media environment, of course, we’re lucky if we even know a deal is in the works.)

In that 2013 post, I warned that we shouldn’t be surprised when these transactions turn out to be unfavorable to the taxpayer–and in the years that followed, a great many of them proved to be very unfavorable indeed. (For one thing, it turned out that too many government agencies lacked the capacity to effectively monitor contractors.)

Worse, from an accountability standpoint, when services are delivered by an intermediary, citizens often fail to realize that those services are really being provided by government. That failure has constitutional as well as political implications. Only government can violate an individual’s civil liberties–that’s what lawyers call “state action”–so it’s important that we be able to distinguish actions taken by private actors from those that can be attributed to government. Privatization has significantly muddied that distinction.

Also, when contracting is extensive, it masks the true size of government. Today, there are approximately 3.7 million contract employees in addition to 2.1 million civil servants. Only the latter are being targeted by Musk.

Will the public fall for this replay of an expensive and discredited “reform”? Hopefully, our earlier, extensive negative experience with privatization will prevent folks from falling for this again, but as we know, simple prescriptions sell.

The plutocrats are undoubtedly salivating….


The Work Of Governing

An unfortunate side-effect of Americans’ fascination with celebrity is their accompanying confusion of fame with competence. That inability to understand the difference–especially when it comes to political campaigns–is largely a result of widespread ignorance of the day-to-day grunt-work of governing.

John Sweezy, the long-ago (now deceased) Republican chairman of my county party, used to say that every citizen should be required to serve two years in government, and prohibited from staying for more than four years. While I disagreed with his four-year edict, I completely understood the benefit of a two-year stint that would introduce citizens to the distinctly unglamorous realities involved.

I served as Corporation Counsel in Indianapolis for a bit over two years, many–many–years ago, and it was an education. I was disabused of the then-widespread notion that civil servants were largely folks who couldn’t find private sector jobs–my co-workers were some of the brightest and most hard-working people I’ve ever known. Most of all, I came to understand the realities of government service, along with the difficulties of weighing competing public interests.

In one of her recent Letters from an American, Heather Cox Richardson illuminated those lessons by recalling the efforts that averted a threatened Y2K calamity.

When programmers began their work with the first wave of commercial computers in the 1960s, computer memory was expensive, so they used a two-digit format for dates, using just the years in the century, rather than using the four digits that would be necessary otherwise—78, for example, rather than 1978. This worked fine until the century changed.

As the turn of the twenty-first century approached, computer engineers realized that computers might interpret 00 as 1900 rather than 2000 or fail to recognize it at all, causing programs that, by then, handled routine maintenance, safety checks, transportation, finance, and so on, to fail. According to scholar Olivia Bosch, governments recognized that government services, as well as security and the law, could be disrupted by the glitch. They knew that the public must have confidence that world systems would survive, and the United States and the United Kingdom, where at the time computers were more widespread than they were elsewhere, emphasized transparency about how governments, companies, and programmers were handling the problem. They backed the World Bank and the United Nations in their work to help developing countries fix their own Y2K issues.

Those of us who were adults in the run-up to the turn of the century still remember the dire warnings. Planes would fall out of the sky, computers would fail to work, the funds in your bank account would be inaccessible…on and on. Preachers of some religions predicted the end times.

None of that happened, not because the threat was unfounded, but because public servants worked for many months to correct the problem. As Richardson wrote,

In fact, the fix turned out to be simple—programmers developed updated systems that recognized a four-digit date—but implementing it meant that hardware and software had to be adjusted to become Y2K compliant, and they had to be ready by midnight on December 31, 1999. Technology teams worked for years, racing to meet the deadline at a cost that researchers estimate to have been $300–$600 billion. The head of the Federal Aviation Administration at the time, Jane Garvey, told NPR in 1998 that the air traffic control system had twenty-three million lines of code that had to be fixed.
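
The underlying arithmetic is easy to see. Here is a minimal sketch in Python, with made-up function names (nothing drawn from Richardson’s account), of why two-digit years broke at the century boundary and why the four-digit fix worked:

```python
# Pre-Y2K convention: store only the last two digits of the year
# ("78" for 1978). Subtraction works fine until the century rolls
# over and "00" is read as 1900, at which point results become nonsense.
def years_elapsed_two_digit(start_yy: int, current_yy: int) -> int:
    return current_yy - start_yy

# The fix Richardson describes: store and compare full four-digit years.
def years_elapsed_four_digit(start_year: int, current_year: int) -> int:
    return current_year - start_year

# A hypothetical account opened in 1978, checked on January 1, 2000:
print(years_elapsed_two_digit(78, 0))        # -78  (nonsense)
print(years_elapsed_four_digit(1978, 2000))  # 22   (correct)
```

Multiply that one-line fix across every date comparison in, say, the FAA’s twenty-three million lines of code, and the scale of the remediation effort becomes clear.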

Richardson followed her description of the problem and its solution with what I will label “the moral of the story.”

Crises get a lot of attention, but the quiet work of fixing them gets less. And if that work ends the crisis that got all the attention, the success itself makes people think there was never a crisis to begin with. In the aftermath of the Y2K problem, people began to treat it as a joke, but as technology forecaster Paul Saffo emphasized, “The Y2K crisis didn’t happen precisely because people started preparing for it over a decade in advance. And the general public who was busy stocking up on supplies and stuff just didn’t have a sense that the programmers were on the job.”

I don’t know how to make the majority of American voters understand that when they cast a ballot, they need to vote for someone with the skills or background to understand the job–someone who is competent to fix the sorts of problems governments encounter. When they vote for an entertainer, or culture warrior, or “outsider” who proudly claims to know nothing about politics or government, they get what they vote for–and governing suffers.

After all, most of us wouldn’t choose a doctor who’d never been to medical school…
