Food Fights

Deconstructing the escalating battles over food is anything but simple.

We have critics and foodies like Michael Pollan counseling us to avoid eating anything our grandmothers wouldn’t have recognized as food. We have “food activists” insisting on labeling foods containing any ingredient that has been genetically modified. And we have Oscar-nominated films like “Food, Inc.,” focusing on the practices of the huge corporate farms that have largely displaced the romanticized family farm. Those who have paid attention to the “natural food” movement (a cohort that would not include Paula Deen aficionados, or those cheering the return of the Twinkie) hardly know what they can safely eat.

I often quote my cousin the cardiologist, whose scientific expertise I respect. When Whole Foods announced that the company would be labeling genetically modified foods, he sent me an extensive tract arguing that fear of GMOs was ill-founded and the labeling movement dangerous.

As he pointed out, “genetic modifications” used to be called hybrids. Humans engaged in growing foods have spent generations selecting for desirable traits, and combining and propagating them. Historically, this has been a lengthy process. In many cases, genetic manipulation simply accelerates that process. (In other cases, however, the modifications may include the introduction of genes not native to that plant.)

He also pointed out that genetically modified plants promise to correct nutritional deficits in developing countries where the population depends primarily on a single foodstuff, like rice. Furthermore, the greater yields of modified crops keep many people in those countries from starvation. And it is true, as he noted, that foods derived from genetically modified crops have been consumed by hundreds of millions of people across the world for more than 15 years with no reported ill effects.

Or at least, none that can be reliably connected to such crops.

It is probably obvious that I am less sanguine than my cousin; although I agree that most GM crops are no different from the hybrids farmers have long produced, I harbor some concerns–for which I admittedly have absolutely no evidence–about the long-term effects of those modifications that involve the introduction of “new” genes to a plant’s DNA. (Somehow, I don’t get a warm and fuzzy feeling when I hear that Monsanto has modified seeds to withstand its own herbicides….)

That said, I think the uproar about GMOs distracts us from far more concrete dangers posed by factory farming.

For example, most of the beef produced in the United States has been fattened on corn, because corn is cheap, abundant, and allows cattle to come to market in 12-14 months. In order for cattle to be raised on corn instead of grass, however, the animals have to be given antibiotics in their feed, feminizing hormones, and often protein derived from other animal parts. Even if you overlook the inhumane conditions that have been amply documented, the large-scale production of chickens and pigs involves similarly unnatural processes. Unlike the situation with GMOs, there is substantial evidence that these practices pose health risks for consumers.

Another legitimate cause for concern is the increased and often indiscriminate use of pesticides that linger in our food, and that run off into our water supplies.

Unfortunately, the “food fights” we are engaged in tend to conflate these different issues, confusing consumers and policymakers alike.

What is “natural”? Breeding crops to be disease-resistant, or more nutritious, allows us to meet human needs. We’ve done that for generations, and so long as we don’t get carried away–so long as we don’t create new and strange “Frankenfoods”–we probably don’t have much to worry about. Medicating livestock with hormones and antibiotics so that they can be fed foods they did not evolve to eat, in order to fatten them more and more quickly, is much more troubling.

As with so many of the issues people fight about these days, it’s complex, and most of us lack the scientific knowledge to make sound judgments. We used to trust the FDA to ensure food safety, but thanks to over a quarter-century of being told that government can’t do anything right, we no longer trust anybody.

Welcome to the food fight!

Defining Our Terms

These days, you can’t engage in cocktail party chatter or turn to a “serious” television program without finding yourself in a conversation about education reform. Everyone has a theory, and almost everyone has a culprit–the sad state of education is due to (choose one or more) teachers’ unions, poor parenting, bloated administrations, corporate privatizers, or the ACLU and its pesky insistence on fidelity to the Establishment Clause.

I’m still waiting for one of those conversations to turn to a pretty basic question: just how are we defining education?

Make no mistake: in most of these conversations, we are talking past each other. There is a huge disconnect in what people mean when they criticize education or advocate for changes in education policy. All too often, parents view education as a consumer good–skills they want their children to learn so that they can compete successfully in the American economy. That parental concern is far more understandable than the obliviousness of legislators and educators who want to assess the adequacy of high schools and colleges by looking at how many graduates land jobs.

Let me be clear. There is nothing wrong with job training. But job training is not the same thing as an education. 

An op-ed in yesterday’s New York Times–“The Decline and Fall of the English Major”–detailed “a new and narrowing vocational emphasis in the way students and their parents think about what to study in college. As the American Academy report notes, this is the consequence of a number of things, including an overall decline in the experience of literacy, the kind of thing you absorbed, for instance, if your parents read aloud to you as a child. The result is that the number of students graduating in the humanities has fallen sharply.”

What many undergraduates do not know — and what so many of their professors have been unable to tell them — is how valuable the most fundamental gift of the humanities will turn out to be. That gift is clear thinking, clear writing and a lifelong engagement with literature.

Maybe it takes some living to find out this truth. Whenever I teach older students, whether they’re undergraduates, graduate students or junior faculty, I find a vivid, pressing sense of how much they need the skill they didn’t acquire earlier in life. They don’t call that skill the humanities. They don’t call it literature. They call it writing — the ability to distribute their thinking in the kinds of sentences that have a merit, even a literary merit, of their own.

As a college professor, I can confirm the abysmal writing skills of most undergraduates. And as a former high-school English teacher, I can also confirm that an inability to express a thought clearly is usually a good indicator of an inability to think clearly. (When a student says “I know what I mean, I just can’t say it,” it’s a safe bet that student does not know what he means.)

People learn to communicate clearly from reading widely. Reading widely introduces students to the human condition, to different ways of understanding, to the importance of literature and history and science, to the meaning of citizenship, to the difference between fact and opinion. Such people–educated people–are also more likely to succeed at whatever they choose to do. But that greater likelihood of success is a byproduct of genuine education, not its end.

Unless the conversation about education reform begins with a discussion of what we mean by “education,” unless we can agree on our goals for our schools, we will be unable to measure our progress.

We will keep talking past each other, and looking for someone to blame.

Litmus Test

It has been instructive watching the various reactions to the Paula Deen tragicomedy.

On one hand are the folks–including a number of “known liberals”–who see the Food Network’s decision not to renew her show as an excess of “political correctness.” Others, of course, have been far more judgmental.

Most people have reacted without bothering to go beyond the superficial. Had Deen only admitted to using the “N” word a couple of times in the past, it might be legitimate to find the response disproportionate. However, the admission came in the context of a number of other behaviors; the lawsuit in which she gave the deposition alleges long-standing workplace bigotry, including requiring black and white workers in one of her restaurants to use separate restrooms. A story in the New York Times quotes Deen herself describing a TV show taping in which she planned to make a hamburger she called a “Sambo” sandwich, until she was overruled by the producer. She was also quoted justifying other behaviors by “explaining” that “most jokes” are about Jews, blacks and gays. (If the lawsuit allegations are accurate, all of these groups were targets of her workplace behaviors.)

Some people who defend Deen are simply unaware of this backstory. But for others, that defense clearly has a personal element. Many of the comments posted to Facebook and the emails sent to the Food Network display the very ugly and persistent underside of our multi-cultural society.

We have a very long way to go when it comes to race. And the election of an African-American President has only exacerbated that very jagged social wound.

An Interesting Observation…

Yesterday, President Obama nominated a veteran of the Bush Administration to head up the FBI. There has been a lot of chatter about the choice–the nominee is apparently highly regarded on both sides of the aisle, something we don’t see much of these days. But I was struck by an observation posted to Maddowblog:

If the president and his team had any reason to worry at all about ongoing investigations casting the White House in a negative light — or worse — there’s simply no way Obama would choose a Republican lawyer with a history of independence to lead the FBI. Indeed… just the opposite is true — if Obama were the least bit concerned about any of the so-called “scandals,” he’d almost certainly look for a Democratic ally to lead the FBI.

But the president is doing the opposite — Comey is not only a veteran of the Bush/Cheney administration, he also donated to the McCain/Palin and Romney/Ryan campaigns, in the hopes of preventing Obama from getting elected. If the president thought the “scandals” might lead to the Oval Office, he’d never choose someone like Comey to take over the FBI right now.

Another indication that there is no “there” there.

The real problem with the persistent, hysterical efforts to prove that Obama’s Administration has done something criminal, of course, is that they have utterly distorted what ought to be the critique of this or any administration. Rather than focusing on the misplaced policies or bureaucratic inefficiencies that are always fair game, drummed-up (and sometimes wholly fabricated) accusations simply feed the appetites of GOP partisans who want to believe the worst. That in turn generates knee-jerk defensiveness among Democrats who might otherwise disagree with administration policy.

Partisan pissing contests have taken the place of potentially productive conversations about how we might govern ourselves better, or how we might grow the economy or improve education or balance the right to privacy against the needs of national defense.

We are governed by two-year-olds.

Religious Liberty? Hardly.

Historians tell us that the Establishment Clause of the First Amendment went through more than 20 drafts, with the Founders rejecting formulations like “there shall be no National Church.” The Establishment Clause prohibits the government from making any law “respecting an establishment of religion.” The courts have uniformly held that this language not only forbids the government from establishing an official religion or state Church, but also prohibits government actions that endorse or sponsor religion, favor one religion over another, or prefer religion to non-religion (or non-religion to religion).

In other words, government is prohibited from playing favorites–from either benefitting or burdening citizens based upon their beliefs or lack thereof.

There’s constitutional principle, and then, of course, there’s real life.

A woman named Margaret Doughty, who has lived in the U.S. for 30 years, recently applied for U.S. citizenship. One of the standard questions asked of applicants is whether they would be willing to take up arms to defend the country. According to Ed Brayton over at Dispatches from the Culture Wars, Doughty replied as follows:

“I am sure the law would never require a 64 year-old woman like myself to bear arms, but if I am required to answer this question, I cannot lie. I must be honest. The truth is that I would not be willing to bear arms. Since my youth I have had a firm, fixed and sincere objection to participation in war in any form or in the bearing of arms. I deeply and sincerely believe that it is not moral or ethical to take another person’s life, and my lifelong spiritual/religious beliefs impose on me a duty of conscience not to contribute to warfare by taking up arms…my beliefs are as strong and deeply held as those who possess traditional religious beliefs and who believe in God…I want to make clear, however, that I am willing to perform work of national importance under civilian direction or to perform noncombatant service in the Armed Forces of the United States if and when required by the law to do so.”

Seems like a heartfelt and entirely acceptable position to me, but no. The immigration service responded by demanding that she “submit a letter on official church stationery, attesting to the fact that you are a member in good standing and the church’s official position on the bearing of arms.” In other words, unless she can demonstrate an affiliation with an established church with an established position on the bearing of arms, this 64-year-old woman cannot become a citizen.

The official position of the immigration service, evidently, is that atheists cannot have moral objections to killing other humans. (Nor, presumably, can members of churches without “official positions” against violence. If you are a Quaker, okay; if you are a Presbyterian or a Jew, not so much.)

When the U.S. still had a military draft, this same approach imposed a real burden on conscientious objectors who could not claim membership in a pacifist congregation. Eventually, the courts agreed that personal moral positions would be deemed adequate–but only if the individual claiming conscientious objector status could “prove” that he had long harbored such compunctions. Members of religious congregations could simply verify that membership; non-members and non-believers had to provide “clear and convincing” evidence of their beliefs, by bringing in people who would testify to past conversations, letters they’d written expressing pacifist sentiments, or the like.

You might think about that, and about Margaret Doughty, the next time some right-wing pundit whines about the advance of the secular hordes, or the (non-existent) “war on Christianity.”
