Last night, I spoke to a group composed of equal numbers of high school students and adults at the northeast side Great Decisions series. I was especially impressed with the young students, who were all from Lawrence North, and who were attentive and (during the extended Q and A) engaged. Maybe there’s hope for humanity after all….
Anyway, in a departure from this blog’s usual snark, I thought I’d post my comments on “Promoting Democracy Abroad” from last night. (If lengthy posts aren’t your thing, this might be a good one to skip….)
……………………………………………………………………………
Let me introduce these remarks with a disclaimer: Foreign Policy is not my area of expertise. I think I was asked to address this particular reading because I wrote a book a couple of years ago called “God and Country: America in Red and Blue” in which I explored the religious roots of American policy debates. One of the areas I looked at was foreign policy—I tried to identify the worldviews, the intellectual “frames” within which we approach our relationships with other countries.
As your reading points out, conventional wisdom categorizes American foreign policy preferences as either “realist” or “Wilsonian idealist.” But if you take a brief tour through American history—and I am about to take you on that tour—I think you’ll find that “realist” and “idealist” are relative terms: both characterize approaches that fall within what I called in my book a moralistic worldview, an intellectual “lens” firmly rooted in Protestantism and founded in no small part on a belief in American exceptionalism.
In fact, one of those moralists, an unabashed and self-defined “patriot” named David Gelernter, insists that “Americanism is in fact a Judeo-Christian religion; a millenarian religion; a biblical religion.” While American religiosity is hardly the monolithic influence for good that Gelernter seems to believe it is, he is undoubtedly correct about the largely religious worldview within which almost all Americans approach foreign policy, whether they consider themselves idealists or realists.
As the historian Walter McDougall has explained, American realists and idealists do not represent a conflict between amoral pragmatism on the one hand and high moral principle on the other; they are advancing arguments about policy that are equally rooted in morality and religious culture, but they are drawing different conclusions—often based on differing analyses of the facts—about where that morality lies. Other historians and political scientists have observed that the narratives we have constructed to describe America’s place in the world—our national stories—have shaped a passionate nationalism that some critics argue has itself become “the most powerful religion in the United States.”
America’s “stories” tend to reflect our Puritan roots: a world neatly divided into good and evil; a world in which the United States is “called” to be the City on the Hill; a world where capitalism is God’s preferred economic system. Our various interpretations of the obligations imposed by our belief that we are a “chosen” nation shape our approach to multi-national institutions and unilateral action. Our national stories are important and their effects are by no means all negative; they can be thought of as a covenant between our self-image and our obligations as citizens, and an important part of the “glue” holding an increasingly diverse people together. But our stories are also susceptible of many meanings.
Like the bible, America’s constituent documents and national history teach different lessons to different readers.
Let me give you just one example of what I mean. Americans are almost universally devoted to liberty, but that devotion often obscures the fact that we define that term in very different ways. As McDougall rather dryly noted, to the Puritans, rooted in the Reformation rather than the Enlightenment, liberty meant “freedom from Rome and Canterbury, no more.” In other words, liberty meant the right to impose the CORRECT religion on one’s neighbors. To the Founding Fathers, who came 150 years later and were products of the Enlightenment, liberty meant the freedom to live one’s life free of government interference or coercion by popular majorities, so long as you did not thereby harm the person or property of others, and so long as you respected the equal rights of others.
Americans may have quite different pictures of what liberty looks like, but we have historically agreed that we are committed to it. To a somewhat lesser extent, history suggests that we have continued to see ourselves as “the chosen people,” whose destiny it is to lead others to a proper understanding of true religion, political liberty and enlightened self-government.
A sermon delivered in 1777 by Abraham Keteltas, a Massachusetts preacher, was an early expression of that (still potent) belief:
“Our cause is not only righteous but, most important, it is God’s own cause. It is the grand cause of the whole human race…The cause of the American Revolution is the cause of truth against error and falsehood, the cause of righteousness against iniquity, the cause…of benevolence against barbarity, of virtue against vice…In short, it is the cause of heaven against hell…”
In the years following the Revolutionary War, such rhetoric became an important part of the American “story.” It is important to emphasize, however, that this mission to lead the world was not originally understood to require action; America was supposed to lead by example, not by force of arms or by diplomatic or other intrusions into the internal affairs of other countries. Our Puritan forebears believed America should be a “light unto other nations”—an example. The Founding Fathers, products of the Enlightenment, were intent upon limiting the power of their new country’s government, giving that government only so much power as necessary to defend the new nation, but not enough to endanger liberty at home. Neither of those views was compatible with foreign adventurism.
The presence of two large oceans separating the new country from others was seen as great good luck; it meant that America could forge its own destiny, that it didn’t need to put its trust in alliances and allies. A passage from George Washington’s Farewell Address is illustrative:
“Our detached and distant situation invites and enables us to pursue a different course…Why forgo the advantages of so peculiar a situation? Why quit our own to stand upon foreign ground? Why, by interweaving our destiny with that of any part of Europe, entangle our peace and prosperity in the toils of European ambition, rivalship, interest, humor or caprice?”
The Founders believed the new country would be able to make decisions free of treaties and other international constraints; it would be free to concentrate on domestic liberty (however understood). As Alexander Hamilton argued, limiting American efforts abroad and insisting on national self-determination was required both by morality and self-interest.
On the other hand, it soon became pretty obvious that if America was to remain independent, avoid “entangling alliances” and act unilaterally to advance its own interests, it needed to protect its own sphere of influence. We wouldn’t meddle in the affairs of Europe, but we also needed to ensure that Europe didn’t meddle in our hemisphere—hence the formula announced by President James Monroe that came to be called the Monroe Doctrine. That doctrine revolved around three principles: no new colonization, no transfer of existing colonies and no reimposition of colonial rule. Monroe was careful to underscore the non-aggressive nature of this doctrine. As he said,
“Our policy in regard to Europe, which was adopted at an early stage of the wars which have so long agitated that quarter of the globe, nevertheless remains the same, which is, not to interfere in the internal concerns of any of its powers; to consider the government de facto as the legitimate government for us; to cultivate friendly relations with it, and to preserve those relations by a frank, firm and manly policy, meeting in all instances the just claims of every power, submitting to injuries from none.”
For many years, American presidents did in fact resist the impulse to intervene in the internal affairs of other nations, even when sympathetic to the ideals of one side in a conflict. By the mid-1840s, however, the United States had developed what some scholars have called a “full-blown civic faith” that nourished a new approach: the doctrine of Manifest Destiny. America’s role changed from providing an example—from being the “City on the Hill” that would be a “light unto nations”—to the view that God had chosen America to embrace a destiny. That “manifest” destiny included the right to extend influence not just through example, but through force. The term “Manifest Destiny” came from a frequently cited 1845 editorial by an influential editor named John L. O’Sullivan. O’Sullivan wrote of America’s
“manifest destiny to overspread and to possess the whole of the continent which Providence has given us for the development of the great experiment of liberty…It is a right such as that of the tree to the space of air and the earth suitable for the full expansion of its principle and destiny of growth.”
Manifest Destiny began with the American West, where it informed the policies of Andrew Jackson–policies that led to the virtual extinction of Native Americans and to the war with Mexico, among many other consequences. But it didn’t end there. Drawing on social Darwinism and a belief in the “white man’s burden,” the doctrine justified a number of imperialist adventures as God’s plan. There are still echoes of Manifest Destiny today; as recently as 1997, the Project for the New American Century issued a statement of principles calling for the United States to “shape a new century favorable to American principles and interests” and declaring its mission to “make the case and rally support for American global leadership.” There are plenty of other examples.
After the Civil War, the United States became a world power, thanks to enormous population growth, continuing industrialization, and improvements in mass transportation that facilitated the shipping of trade goods. But these changes also made continued detachment from the rest of the world more difficult. The Civil War had also provided evidence that benevolent intervention can have positive results, supporting the argument that armed conflict is sometimes necessary in order to liberate others from evil systems.
Evidence of a shift in America’s foreign policy emphasis came in 1898: Cuba fought a war to gain independence from Spain, and President McKinley wanted permission from Congress to protect American interests in the area. Congress passed a joint resolution recognizing Cuban independence and authorizing the use of force against Spain. The resolution expressly denied any interest in annexing Cuba, and justified intervention not on the basis of the Monroe Doctrine or the protection of American interests, but—in the words of one Senator—“for humanity’s sake.” McDougall dates the policy he calls “Progressive Imperialism” from that time: “[A] newly prideful United States began to measure its holiness by what it did, not just by what it was, and through Progressive Imperialism committed itself, for the first time, ‘to the pursuit of abstractions such as liberty, democracy or justice.’” That set the stage for Wilsonian idealism, sometimes called liberal internationalism.
The various perspectives on Woodrow Wilson fill volumes, and political scientists, historians and foreign policy experts continue to debate his legacy. Wilson was convinced that God had chosen him; he had contempt for “lesser races”; and he was a vocal critic of the limitations imposed on the Executive branch of government by the separation of powers. Yet he also apparently believed that a foreign policy based only on self-interest was unworthy. Wilsonian idealism drew much of its strength from the belief that America’s righteousness gave America the duty to refashion the world.
If that seems arrogant—and it does to me—the Depression, and then the Japanese attack on Pearl Harbor, dampened and tempered it, and paved the way for a new emphasis on realism in foreign policy. But even in trying times, the country never completely abandoned idealism: Roosevelt’s support of the United Nations, which clearly drew on Wilson’s devotion to the League of Nations, and Truman’s moral appeal for the Marshall Plan are both evidence that it remained alive and well in our foreign policy. Nevertheless, after WWII, the Wilsonian brand of idealism was increasingly diluted with healthy doses of realpolitik.
The emergence of the Soviet Union and the reality of a world divided by what Winston Churchill dubbed “the iron curtain” ushered in a new strategy: the policy of containment.
Containment had elements of all of the prior “isms”—it began, as American policies tend to do, with the battle between good and evil, in this case, the battle between liberty and collectivism. Strategically, it hearkened back to the older belief that the American example would ultimately be more powerful—and not incidentally, more prudent—than a preventive war. But it also included the idealist’s belief that Americans’ liberties both required and depended upon the extension of a similar liberty to others, and that America had a duty to protect weaker nations from falling to stronger ones. That idealism led to yet another doctrine, sometimes called Global Meliorism, the belief that Americans have a moral mission to make the world a better place. It is distinguished from Wilsonian Idealism, in one historian’s view, because “Wilson just hoped to make the world safe for democracy; Global Meliorists want to make the world democratic.” The most visible twentieth-century product of Global Meliorism was Vietnam.
There are obviously many other ways to analyze the historical evolution of American foreign policy, and how we got from there to here, but the broad outlines tend to be very similar.
So, what can this quick romp through history tell us about the promotion of democracy and related contemporary foreign policy debates? For one thing, it underscores our American tendency to conduct our arguments in shades of black and white. We are a very bipolar country! America is irredeemably evil, or America is uniformly good. Terrorism is evidence that “they” all hate us for our freedom, or it is evidence that we have systematically misunderstood and/or mistreated others who have every right to hate us. Such extremes are, to put it mildly, not very helpful.
In the words of one historian, writing at the turn of the century,
“The American political and cultural landscape at the year 2000 is torn between two incompatible ideas: one, that the West is globally triumphant and that the future of the world will be one of Westernization, in which all societies and cultures converge on a democratic and capitalist norm, with McDonalds in every town and Disney videos in every home. The other is that the West is an evil culture of exploitation, patriarchy and environmental degradation, and that certainly neither has nor deserves to have any future.”
The problem is, as political scientists have warned, when people act out of a certainty of their own moral righteousness, it is a very short step to sanctimony. In the wake of 9-11, President George W. Bush—paraphrasing Matthew 12:30—declared “Those who are not with us are against us.” This is a classic example of America’s impatience with nuanced arguments about good and evil, our historic tendency to cast conflicts in apocalyptic terms, and our firm belief that we are God’s Chosen People. In the wake of 9-11, a number of biblical literalists saw the attacks as God’s punishment for assorted sins, from “tolerating homosexuality” to “countenancing abortion;” others, equally moralistic, insisted that the attack was a foreseeable result of American imperialism and our failure to address all of the “legitimate grievances” of Arab populations.
This sort of absolutism is a particularly unhelpful approach to international relationships in a rapidly shrinking world. Globalization has presented policymakers with a number of unprecedented challenges, many of which have operated to intensify existing conflicts between Americans’ most deep-seated convictions. On the one hand, our commitment to market economics requires us to support efforts to ensure international stability and to create international institutions that have sufficient authority to regulate trade and mediate commercial disputes when they arise. On the other hand, our insistence on unilateralism and “going it alone,” and our reflexive recoil from any “entanglements” that might erode our absolute sovereignty, make us wary of treaties and multilateral institutions that we clearly need. In a 2003 speech, for example, former President Jimmy Carter despaired of the possibility of the United States ever ratifying the U.N. Convention on the Rights of the Child—a Convention that has been ratified by every other country except Somalia, which has no effective government able to ratify it. Carter was equally critical of the decision to pull out of the Kyoto Protocol; in his speech, he focused particularly on the need to understand the cultural, social and religious roots of opposition to these and other cooperative measures.
Our encounters with global pluralism are forcing us to face the same challenges we face in our ongoing domestic encounters with diversity. As the sociologist Peter Berger has put it, “it is very unlikely that over time most of the world will come to look like Cleveland.” And that observation brings us—finally—to the question posed by your reading: When and how should the United States intervene in the internal affairs of other countries in order to promote democracy?
That question leads to many others. Most of us would agree that the United States—or any country—has the right and obligation to protect its own interests. The “preliminary” questions we ask less frequently are: What are those interests? When are they significant enough to justify military action?
Beyond diplomatic or military action to protect America’s interests, what is our obligation to help others around the globe? Again, most of us would agree that we have humanitarian obligations—but the devil is in the details. Clearly, we should help out in cases of disasters—tsunamis, famines. A very strong case can be made for intervening to prevent genocide or similar atrocities. After that, it gets a lot less clear. Your reading assumes an obligation to promote democracy—I think the jury is still out on that one.
I am one of those people who vacillates about foreign policy decisions—I think different situations call for different approaches, and those approaches need to be prudent as well as idealistic. We need to distinguish between idealism and arrogance. Making the sorts of decisions described in your reading requires that we have a national discussion about our role in the world, and its limitations.
Through most of America’s history, foreign policy has not been a partisan issue; it’s been an ideological one. Even in today’s highly polarized political environment, we can see all the various approaches to foreign policy in the positions of the Republican candidates for President: at one extreme, Ron Paul would have us disengage from the rest of the world; at the other, Rick Perry (before he dropped out) advocated the sort of cowboy adventurism that I think Americans are very wary of right now.
The one thing that is inarguable is that the United States remains the world’s only superpower. We’re the “big kahuna.” How we use our power, how we behave in the world, will ultimately define us—and those decisions will be made by your generation. Mine has made a pretty big mess of it.
Good luck.