The Wall that Never Was

Hugh Heclo

Is religion a necessary part of American life?

A hundred years ago, advanced thinkers were all but unanimous in dismissing religion as a relic of mankind’s mental infancy. What’s being dismissed today is the idea that humanity will outgrow religion. Contrary to the expectations of Sigmund Freud, Max Weber, John Dewey, and a host of others, religion has not become a mere vestige of premodern culture. If anything, Americans at the dawn of the 21st century are more willing to contemplate a public place for religion than they have been for the past two generations. But what does it mean for religion to “reenter the public square”? What good might it do there—and what harm?

Mention religion and public policy in the same breath these days, and what will most likely spring to mind are specific controversies over abortion, school prayer, the death penalty, and stem cell research. All are issues of public choice that arouse the religious conscience of many Americans. They are also particular instances of a larger reality: the profound, troubled, and inescapable interaction between religious faith and government action in America.

For most of American history, the subject of religion and public policy did not need much discussion. There was a widespread presumption that a direct correspondence existed, or should exist, between Americans’ religious commitments and their government’s public-policy choices. When the oldest of today’s Americans were born (which is to say in the days of Bryan, McKinley, and Theodore Roosevelt), the “public-ness” of religion was taken for granted, in a national political culture dominated by Protestants. It was assumed that America was a Christian nation and should behave accordingly. Of course, what that meant in practice could arouse vigorous disagreement—for example, over alcohol control, labor legislation, child welfare, and foreign colonization. Still, dissenters had to find their place in what was essentially a self-confident Protestant party system and moralistic political culture.

Those days are long past. During the 20th century, religion came to be regarded increasingly as a strictly private matter. By midcentury, Supreme Court decisions were erecting a so-called wall of separation between church and state that was nationwide and stronger than anything known in the previous practice of the individual states’ governments. National bans on state-mandated prayer (1962) and Bible reading in public schools (1963) soon followed. In 1960, presidential candidate John F. Kennedy was widely applauded for assuring a convocation of Baptist ministers that his Catholic religion and his church’s teachings on public issues were private matters unrelated to actions he might take in public office. Intellectual elites in particular were convinced that the privatization of religion was a natural accompaniment of modernization in any society.

But even as Kennedy spoke and Supreme Court justices wrote, strong crosscurrents were at work. Martin Luther King, Jr., and masses of civil rights activists asserted the very opposite of a disconnection between religious convictions and public-policy claims. King’s crusade against segregation and his larger agenda for social justice were explicitly based on Christian social obligations, flowing from belief in the person of Jesus Christ. So, too, in antinuclear peace movements of the time atheists such as Bertrand Russell were probably far outnumbered by liberal religious activists. After the 1960s, the United States and many other countries witnessed a political revival of largely conservative fundamentalist religious movements. These, according to prevailing academic theories, were supposed to have disappeared with the steam engine. The horror of September 11, 2001, showed in the most public way imaginable that modernization had not relegated religion to an isolated sphere of private belief. On the contrary, religious convictions could still terrorize, as they could also comfort a nation and inspire beautiful acts of compassion.

It is nonetheless true that during much of the 20th century the dominant influences in American national culture—universities, media and literary elites, the entertainment industry—did move in the predicted secularist direction. What at midcentury had been mere embarrassment with old-fashioned religious belief had, by century’s end, often become hostility to an orthodox Christianity that believed in fundamental, revealed truth. Noting that the Indian subcontinent is home to the most religious society in the world and Sweden to the most secularized, sociologist Peter Berger has said, provocatively, that America might usefully be thought of as an Indian society ruled by a Swedish elite.

In contemporary discussions of religion and public affairs, the master concept has been secularization. The term itself derives from the Latin word saeculum, meaning “period of time” or “age” or “generation.” The idea of the secular directs our attention to the place and time of this world rather than to things religious and beyond time. It supposes a demarcation between the sacred and the profane. The social sciences of the 19th century developed theories of secularization that dominated much of 20th-century thinking. In the disciplines’ new scientific view of society, all human activities were to be analyzed as historical phenomena, rooted in particular places and times. Religion was simply another human activity to be understood historically, an evolutionary social function moving from primitive to higher forms.

The idea of secularization became tightly bound up with intellectuals’ understanding of modernization. As the 20th century dawned, leading modernists of the day saw the secularization that religious traditionalists condemned as a benign and progressive evolution of belief systems. Secular political organizations had already gone far in taking over the social functions (welfare and education, for example) of medieval religious institutions. As society modernized, science and enlightened humanitarianism would provide a creed to displace religion’s superstitions, and religion would retreat to private zones of personal belief. Policymaking would deal with worldly affairs in a scientific manner, indifferent to religious faith. Public religion was something humanity would outgrow. Religion in the modern state would go about its private, one-on-one soul work; public life would proceed without passionate clashes over religious truth.

The foregoing, in very crude terms, is what came to be known as the secularization thesis. To be modern meant to disabuse the mind of religious superstitions (about miracles, for example) and recognize the psychological needs that prompt humankind to create religious commitments in the first place. Modern society might still call things sacred, but it was a private call. In public life, the spell of enchantment was broken—or soon would be.

But something happened on the way to privatizing religion in the 20th century. For about the first two-thirds of the century, secularization seemed to prevail as a plausible description of public life. Then, in the final third, the picture changed considerably. Religion re-engaged with political history and refused to stay in the private ghetto to which modernity had consigned it. Witness the Islamic Revolution in Iran, the role of the Catholic Church in communist Eastern Europe, and the growth of the Religious Right in the United States. Of this resurgence of public religion, the sociologist José Casanova has observed: “During the entire decade of the 1980s it was hard to find any serious political conflict anywhere in the world that did not show behind it the not-so-hidden hand of religion. . . . We are witnessing the ‘deprivatization’ of religion in the modern world.”

This “going public” of religion, moreover, was not an expression of new religious movements or of the quasi-religions of modern humanism. Rather, it was a reentry into the political arena of precisely those traditional religions—the supposedly vestigial survivals of an unenlightened time—that secular modernity was supposed to have made obsolete.

Three powerful forces make this a particularly important time to take stock of the new status of religion in public life. The first is the ever-expanding role of national policy in Americans’ mental outlook. During the 20th century, struggles over federal policy increasingly defined America’s political and cultural order. In other words, conceptions of who we are as a people were translated into arguments about what Washington should do or not do. The abortion debate is an obvious example, but one might also consider our thinking about race, the role of women, crime, free speech, economic security in old age, education, and our relationship with the natural environment. After the 1950s, academics, and then the public, began to make unprecedented use of the term policy as a conceptual tool for understanding American political life. But the entire century nourished and spread the modern syndrome of “policy-mindedness”—an addiction to the idea that everything preying on the public mind requires government to do or to stop doing something. It’s a notion that allows almost any human activity to be charged with public relevance—from the design of toilets to sexual innuendo in the workplace (filigrees of environmental policy and civil rights policy, respectively). Like it or not, our cultural discussions and decisions are now policy-embedded. And this in turn inevitably implicates whatever religious convictions people may have.

The juggernaut of today’s scientific applications is a second development that now compels us to think hard about religion in the public square. Technological advances have brought our nation to a point where momentous public choices are inescapable. To be sure, scientific knowledge has been accumulating over many generations. But in the latter years of the 20th century, much of modern society’s earlier investment in basic research led to technological applications that will affect human existence on a massive scale. For example, though the human egg was discovered in 1827, the DNA structure of life did not become known until the middle of the 20th century. Since then, sweeping applications of that accumulated knowledge have cascaded with a rush. By 1978, the first human baby conceived in vitro had been born. By the 1990s, the first mammals had been cloned, the manipulation of genes had begun, and the first financial markets for human egg donors had developed. These and other scientific advances are forcing far-reaching decisions about the meaning of life forms, about artificial intelligence and reconstitution of the human brain, and even about the reconstruction of matter itself.

These choices at the microlevel have been accompanied by technology’s challenge to human destiny at the macrolevel. It was also in the mid-20th century that mankind became increasingly conscious that it held Earth’s very life in its own contaminating hands. At the onset of the 1960s, Rachel Carson did much to overturn generations of unbridled faith in scientific progress when she publicized the first dramatic charges about humankind’s disastrous impact on the environment in Silent Spring. By the end of the 1960s, people saw the first pictures from space of the fragile Earth home they share. The first Earth Day and the blossoming of the modern environmental movement soon followed. The point is not simply that issues such as ozone depletion, species extinction, global warming, and the like have not been thought about until now. It’s that people have never before had to deal with them as subjects of collective decision-making—which is to say, as public-policy choices.

Our modern technological condition thus represents a double historical climacteric. Today’s citizens must manage the first civilization with both the outward reach to bring all human societies within a common global destiny and the inward reach to put the very structures of life and matter into human hands. For more than 2,000 years, philosophers could talk abstractly about the problem of being and the nature of human existence. For 21st-century citizens, the problem of being is an ever-unfolding agenda of public-policy choices. The extent to which ethics leads or lags the scientific juggernaut will now be measured in the specific policy decisions our democratic political systems produce. And the decisions are saturated with religious and cultural implications about what human beings are and how they should live.

Even as the policy choices are forcing citizens into a deeper search for common understandings, doubts about a shared cultural core of American values are pushing in precisely the opposite direction. Those doubts are the third challenge to religion in the public square. Thanks to the homogenization produced by mass markets throughout the 50 states, 20th-century America experienced a marked decline in traditional geographic and class differences. But with the uniformity in material culture has come a greater insistence on and acceptance of variation in the realm of nonmaterial meanings and values. The widespread use of phrases such as “identity politics,” “culture wars,” “inclusiveness,” and “political correctness” reflects the extent to which affirmations of diversity have supplanted earlier assumptions about a cultural core. “Multiculturalism” is a label for a host of changes in mental outlook, group self-consciousness, and educational philosophy. And with Muslims almost as numerous as Presbyterians in today’s America, multiculturalism is more than faddish academic terminology. It’s true that self-identified Christians still outnumber all other faith categories of Americans by 8 to 1 (in 2001, 82 percent of the population reported themselves to be Christian, 10 percent non-Christian, and 8 percent nonbelievers). But the Christianity in the figures is often purely nominal, with little orthodox content. The cultural indicators show that, by and large, America is well on its way to becoming a post-Christian, multireligion society of personally constructed moral standards.

These, then, are three developments that compel attention to the interrelations between religion and public policy: A vast and powerful political society is defining itself to an ever greater extent through self-conscious policy decisions about what to do and what not to do; a technological imperative is driving that society’s policy agenda to raise ever more profound questions about the nature of life and the sustainability of our earthly existence; and an increasingly fragmented sense of cultural identity is taking hold among the self-governing people who are called upon to make and oversee these collective decisions.

In light of these developments, how are we to think about “public religion” in our self-proclaimed democratic world superpower (so much for Christian humility)? Ordinary Americans continue to profess a devotion to religion far greater than is found in other developed nations. At the beginning of the 21st century, some 90 percent of Americans say they believe in God and pray at least once a week. Sixty percent attend religious services at least once a month, and 43 percent do so weekly. Nonbelief is a distinctly minority position; it’s also a hugely unpopular position. Large majorities of Americans claim that they would vote for a presidential candidate who was female (92 percent), black (95 percent), Jewish (92 percent), or homosexual (59 percent)—but only 49 percent say they would do so for a candidate who was an atheist.

In the summer of 2002, a political firestorm greeted a federal appellate court’s decision that the words “under God” (which Congress added to the Pledge of Allegiance in 1954) violated the constitutional separation of church and state. As the hapless Ninth Circuit Court backed off, delaying the decision’s effect, polls showed that 87 percent of Americans supported keeping God in the pledge, and that 54 percent favored having government promote religion. These figures reflect a significant shift: After declining sharply between the mid-1960s and late 1970s, the proportion of Americans who say religion is very important to them grew to roughly two out of three by 2001. Seventy percent of Americans want to see religion’s influence on American society grow, and an even larger proportion, including two-thirds of Americans aged 17 to 35, are concerned about the moral condition of the nation. (And yet, by general agreement, the common culture has become more coarse and salacious. One may well wonder who’s left to be making the popular culture so popular.)

However, the meaning of religious belief has also been changing in recent decades. A great many Americans find that the search for spirituality is more important to them than traditional religious doctrines, confessional creeds, or church denominations. Although most Americans say they want religion to play a greater public role so as to improve the moral condition of the nation, only 25 percent say that religious doctrines are the basis for their moral judgments about right and wrong. Even among born-again Christians, fewer than half say that they base their moral views on specific teachings of the Bible. To claim that there are absolute moral truths (a view rejected by three out of four American adults at the end of the 20th century), or that one religious faith is more valid than another, is widely regarded as a kind of spiritual racism. The new cornerstone of belief is that moral truths depend on what individuals choose to believe relative to their particular circumstances. Human choice has become the trump value and judgmentalism the chief sin. Thus, three-fourths of that large majority of Americans who want religion to become more influential in American society say that it does not matter to them which religion becomes more influential. Similarly, between 80 and 90 percent of Americans identify themselves as Christians, though most of them dismiss some of the central beliefs of Christianity as it has traditionally been understood. Father Richard John Neuhaus, editor of the journal First Things, recently summed up the situation: “To say that America is a Christian nation is like saying it’s an English-speaking nation. There are not many people who speak the language well, but when they are speaking a language poorly, it is the English language they are speaking.”

What all this means for the intertwining of religious faith and the politics of government policymaking is something of a mystery. It’s mysterious because Americans both want and distrust religious convictions in the public arena. Thus, more than 60 percent want elected officials to compromise rather than to vote their religious beliefs, even on life-and-death issues such as abortion and capital punishment. And though Americans have become more open in recent decades to having religion talked about in the public arena, 70 percent of them also think that when political leaders talk about their faith, they’re just saying what people want to hear. Most people surveyed are willing to have religious leaders speak out more on public issues, but they also don’t care much whether they do so or not. In the spring of 2001, when President Bush’s faith-based initiatives were being publicized, three-quarters of Americans expressed strong support for the idea that faith-based groups should receive government funds to provide social services. But that same proportion opposed having government-funded religious groups hire only people who shared their beliefs. This amounts to support for religion as long as religion does not really insist on believing anything. Then again, most Americans opposed funding American Muslim or Buddhist groups, and they regarded even Mormon groups as marginal.

So never mind thinking outside the box. When it comes to religion—their own or others’—in the public square, today’s Americans have trouble thinking seriously even inside the box. The wonderfully rich history of religion and democracy in modern America has been ignored and even suppressed in public-school textbooks and university curricula, both of which were largely purged of “God talk” after the mid-20th century. The sowing of traditional religious information in the school system has been so sparse that one national researcher on the topic has called younger cohorts of Americans a “seedless” generation. A less polite term for their religiously lobotomized view of culture would be heathen. Here is crooked timber indeed for building a framework to support the culture-shaping interactions between religion and public policy.

When Americans do think, however imprecisely, about religion and public decision making, what do they commonly have in mind? Their predominant notion is probably of “a wall of separation between church and state.” Most citizens would be surprised to learn that the phrase is not in the Constitution; it comes from a letter Thomas Jefferson wrote in 1802. The image most of us carry is of a public square with a church building here, a government building there, distinctly separate institutions. And that, Americans have long believed, is as it should be. But even casual observation reveals that there’s more going on in that square. Perched atop the ostensible wall of separation between the structures of church and state, we watch a public forum where religion and politics are anything but separate. There are not two kinds of people in the forum, some of them religious and some political. There are only citizens. And as they interact, they express themselves both religiously and politically. Religious, nonreligious, and antireligious ideas are all at work in their heads when they define problems and choose measures to deal with them collectively, as a people. Religious and irreligious ideas commingle in programs enacted in behalf of this or that vision of a good social order. Church and state, religion and politics, and ideas and social action are crosscutting elements whose presence, even if poorly articulated, we often sense in the public square.

To put it another way, the major interactions between religion and public policy occur across three domains. The first is institutional and focuses on the way organized structures of religion and government impinge on one another—and together impinge on society. This institutional perspective comes most naturally to Americans because it’s encoded in their nation’s constitutional understanding of itself. The bland phrase “separation of church and state” conceals what was the most audacious and historically unique element of America’s experiment in self-government: the commitment to a free exercise of religion. In this institutional domain, one encounters, for example, groups claiming infringement on the unfettered exercise of their religious liberties and disputes over government sponsorship of religious organizations. Less obviously, it is where one also finds religious and public agencies jostling against one another as they pursue, for example, education and welfare policies.

The second domain where religion and public policy connect can be called behavioral. Here the term simply means that religious attachments move people to act in public ways (e.g., to vote, to organize within the community, to engage in other political activities). There’s a direct, though paradoxical, link between the first and second domains. The distancing between religious and government institutions has allowed religion in America to be an immense resource for the nation’s politics. Alexis de Tocqueville concluded that his American informants were correct in believing that the main reason religion held great sway over their country was the separation of church and state. He wrote that “by diminishing the apparent power of religion one increased its real strength.” Since his visit in 1831, Americans in religious associations have created and sustained public movements to promote slavery’s abolition, women’s rights, prison and asylum reform, child welfare and worker protection, mothers’ pensions, liquor regulation, racial desegregation, and civil rights legislation. (Such associations have also been an important source of less savory causes, such as anti-Catholic and anti-Jewish laws.) And citizens moved to more routine political action through religious affiliations have done much to shape America’s party system and election outcomes.

The third domain connecting religion and public policy is more difficult to describe, but one senses that something important is missing if we take account only of organized institutions and politically relevant behaviors. For lack of a better term, we might call the third domain philosophical. It reflects broad policy outlooks on the social order. At this intersection, ideas and modes of thought are expressed in programmatic courses of action. It’s the realm in which people operate when they speak about culture wars in the schools, the work ethic in welfare, or the need for moral clarity in foreign policy. It’s the basis on which some people cringe and others rejoice when a presidential candidate talks about his personal relationship with Jesus Christ. The cringers know that religion can mask all manner of hypocritical mischief in public affairs, and the joyful know that religion can yield a principled striving for a better world. Both groups are correct.

To see how the philosophical level is linked to the first two domains, we need only consider the philosophy that drove early American policy to separate church and state. The formal Constitution was constructed as a “godless” document that kept the new national government out of religious matters—not because religion was unimportant to the society, but because it was extraordinarily important. The Founders (acting in the behavioral dimension noted above) understood why the Constitution had best be essentially silent on matters of religion and God. For one thing, the memory of Europe’s religious wars kept vivid the dangers of political division based on religion. More importantly, in the context of 1787, they realized that the fragile coalition behind the proposed federal constitution would be endangered by any statements about religion and devotion to God that might compete with the abundant dealings by state and local governments on the subject.

In linking calculations of political behavior and institutional design, the Founders were also drawing on a deeper set of philosophical understandings present in the society—ideas about individual conscience that were identified with Protestantism. The ideas had been tested and refined during decades-long encounters among communities of religious believers and dissenters in colonies all along the Atlantic seaboard. Behind the separating of church and state there loomed an emerging cultural commitment to the free exercise of religion. Generally speaking, the champions of this norm were not secular philosophers but religious people who were convinced that religion could never be authentic if it were coerced, directly or indirectly, by government.

Though the three perspectives—institutional, behavioral, and philosophical—can be distinguished (and studied individually), they are lived together in the one life of the nation. That’s why it’s so vitally important to understand religion’s place in the public square. Some of us might wish to adopt a popular intellectual shortcut to escape the complexities of the matter—by invoking, for example, the simplistic formula that says religion and public policy really ought not to mix, or by striking a worldly pose and claiming that what’s at issue is merely religious groups’ self-interested struggles for political power. But the formula is an illusion and the pose a lie that obscures inescapable and growing religious implications in public policy choices.

Whether we like it or not, religion and government policy are unavoidably linked. Each claims to give authoritative answers to important questions about how people should live. Each is concerned with the pursuit of values in a way that imposes obligations. Each deals in “oughts,” and does so through commands, not suggestions. How so? It’s obvious that religion tells people how to live. Less obviously, public policy also issues directives for living, because it affirms certain choices, and not others, for society and backs up the choices with the coercive power of government. Modern government policy is in the business of mandating, promoting, discouraging, or prohibiting particular ways of life. Even adopting the comforting principle of “neutrality” is a way of taking sides, for it can amount to a de facto claim that something is not a moral issue.

The City of God and the City of Man are engaged in deep and substantive transactions with each other, and we cannot escape that fact. But there is also an all-important difference between the two spheres. Religion points toward matters of ultimate meaning for human beings; it’s concerned with what is timeless, unchanging, and holy. It’s about the absolute or it’s about nothing. By contrast, the policy courses pursued by government are societal engagements with the here and now. Their meanings are proximate. At least in democratic (as opposed to totalitarian) government, policymaking acknowledges itself to be contingent, potentially erroneous, and subject to change. Religion and policy mark a continuous flashpoint in public life because the two touch along that horizon where the great thing needful is to keep relative things relative and absolutes absolute.

Is serious religion necessarily intolerant because of its preoccupation with the absolute, and can only the godless be counted on to engage a diverse public audience by persuading them rather than by asserting dogma? Not at all. That way of parsing the subject obscures too many of the humane possibilities in religion and the inhumane possibilities in godless morality. It denies the possibility of a religious faith that, sensing the eternal and transcendent, renders relative and contingent all human institutions and claims to truth—including those made by religious and irreligious pharisees in the public square. Just as there can be humbling dimensions to lived religion, there can be absolutist assumptions surrounding supposedly contingent, secular policies. It’s precisely the prideful claim that mankind is the center of the universe and that the purposes of the Almighty coincide with our human purposes that orthodox religion denounces most ferociously.

In modern intellectual circles, a fashionable strategy for pursuing the decoupling of religion from policy debate has been to argue for “dialogic neutrality.” The concept means that, in democratic debate, religionists must learn to “translate” and make publicly accessible—that is, make secular—the reasons for their policy claims. As the philosopher Richard Rorty has put it, “The main reason religion needs to be privatized is that, in political discussion with those outside the relevant community, it is a conversation stopper.” For a religious person to say “Christian discipleship requires that I oppose abortion” is the equivalent of someone else’s saying “Reading pornography is the only pleasure I get out of life these days.” Rorty argues that both statements elicit the same response in the public sphere: “So what? We weren’t talking about your private life.”

Against Rorty are those who argue, with justification, that a commitment to translation, far from being dialogically neutral, amounts to a demand that religious believers be other than themselves and act publicly as if their faith were of no real consequence. If they accommodate only secular reasoning, religious believers will translate themselves out of any democratic existence and contribute to the creation of a shriveled public arena that cannot accommodate, respect, or even acknowledge theological grounds for discussing serious normative issues.

In this new debate, one encounters what is essentially a contest over the practical meanings of religion. At the beginning of the 20th century, William James anticipated the basic outlines of the contest in The Varieties of Religious Experience. He rejected the intellectually popular “survival theory” that religion is merely superstition held over from premodern times. He argued, rather, that religious life arises from an awareness of one’s personal connection with a transcendent reality mediated through the subconscious self. James went on to draw a sharp contrast between what he called “universalistic supernaturalism” and “piecemeal supernaturalism.” The former corresponds to a view that some today would call simply spirituality. It conceives of an abstract, ideal dimension separate from the world of phenomena, and, at most, it illuminates facts already given elsewhere in nature. Piecemeal supernaturalism conceives of a transcending but immanent power that is, in addition, a postulator of new facts in the world. This power bursts into the world of phenomena—enters into the flat level of historical experience and interpolates itself between distinct portions of nature with facts of its own. And those facts are obligatory for guiding practical conduct in the world.

The claim for dialogical neutrality fits comfortably with a view of religion as universalistic supernaturalism. Reflect, for example, on the way contemporary politicians use religion. To be sure, faith and even Jesus are invoked. The problem is that the God who’s brought into the public sphere doesn’t seem to count for much. No practical policy consequences follow from this God’s presence. By contrast, piecemeal supernaturalism insists on the scandal of doing religion in public. William James left no doubt where he stood in the contest over the meaning of religion: “In this universalistic way of taking the ideal world, the essence of practical religion seems to me to evaporate. Both instinctively and for logical reasons, I find it hard to believe that principles can exist which make no difference in facts. But all facts are particular facts, and the whole interest of the question of God’s existence seems to me to lie in the consequences for particulars which that existence may be expected to entail. That no concrete particular of experience should alter its complexion in consequence of a God being there seems to me an incredible proposition.” In other words, to orthodox believers God can never be dialogically neutral.

To think responsibly about religion and public policy, then, requires harkening to what might be taken as a prime commandment of all religions: “Pay attention.” We need to look past the surface of things and not assume that what meets the eye is all there is. Religion is sometimes full of hypocrisy and hate. And it is sometimes full of light and life.

Many people consider it disturbing, if not downright dangerous, to invoke religious commitments in matters of public policy. They have good reason to think so. Because religion makes claims about ultimate truth, compromise may be interpreted as sinning against God, and yet compromise is what makes peaceful politics possible. Mixing religion and policymaking opens the way to intolerance, persecution, and bloody-mindedness in the body politic. It stirs the modern mind’s vague but potent memories of Crusades, Inquisitions, and religious wars. It’s to this popular prejudice that John Lennon could appeal in urging his young listeners to imagine an ideal future: “Imagine there’s no countries/ It isn’t hard to do/ Nothing to kill or die for/ No religion too/ Imagine all the people/ Living life in peace.”

People may be less likely to recognize the danger that policy engagement poses for the religious. Politics can encourage expediency, duplicity, and hypocrisy, and high-minded religious individuals are easily exploitable. The historian Edward Gibbon observed that for philosophers all religions are false, for common people they are all true, and for politicians they are all useful. Mixing religion and policy may attract not only worldly knaves and naive saints but also the worst of crossbreeds: knaves who appear saints. A subtler and frequently overlooked danger for religious individuals is that policy engagement often reveals contradictions among “God’s people.” That, in turn, may shake the faith of believers, deepen doubts among religious skeptics, and supply ammunition to those who are actively hostile to all religion. It’s only prudent to recognize that, when religion touches politics, politics touches back.

And yet, despite all these historical risks, the greater risks in modern society run in the other direction: They derive from excluding public religion. In the history we’re writing today, there are sound reasons to welcome the mingling of religion and public policy. For one thing, there’s a benefit to serious religious faith—because influencing the policies that guide their society is an important way for believers to live out their beliefs. That’s true especially in a system of self-government where religious consciences can never be purely public or private. To claim that among a democratic people religious commitments should have little or no part in public policymaking is hardly a “neutral” position. It’s a formula for gutting both religion and democracy of real, practical meaning.

Nonbelievers and the wider society also gain from religion’s presence in today’s public square. Without religion in the policy debates, everyone runs the risk of falling for false claims about technological imperatives and moral “neutralism”—the too-easy reassurance that policy choices depend merely on technical knowledge or popular convenience. Neutralism is itself based on some fundamental assumptions, beliefs, and leaps of faith. Having to confront overtly religious individuals who set forth their fundamental assumptions in public can bring to light the hidden secular assumptions of policymaking.

But more is at stake than improving the forum for public debate. Religion also adds something vitally important to the content of what’s being said. It asserts that a transcendent purpose gives meaning to who human beings are and what they do. The religious voice insists that God-inspired standards be taken seriously when a society governs itself, and that questions of right and wrong are more than matters of passing opinion. By its nature, religion rejects faint-hearted stabs at real virtue and dismisses the easy excuse that no one is perfect. Authentic religion insists that leaders—and everyone else—measure up rather than adjust the yardstick.

The point is not simply that religion is a powerful foundation for moral behavior—which was the insistent view of America’s founders as they sought ways of preventing democratic liberty from descending into license. It’s that religion can increase the fundamental humaneness of society. A religious outlook (not the outlook of every religion, to be sure, but certainly of the religions that historically shaped America) contends with both the immortal grandeur and the tragic fallenness of humankind. It calls attention to the narrow ridge we must traverse between essential human worth and essential human humility. Looking down one side of the ridge, a religious outlook warns that if human beings live no more meaningfully than animals, and die as conclusively, they will be seen to lack any value that differentiates them intrinsically from animals. Looking down the other side, it warns that if humanity is its own god, what may it not do? Neither prospect is reassuring for the momentous choices posed by modern public policy.

Religion in the public square stirs up deep and troublesome issues. Yet it seems far healthier that our modern, policy-minded democracy endure the disturbance than dismiss it. If traditional religion is absent from the public arena, humanity will invoke secular religions to satisfy its quest for meaning. It’s not the history of the Crusades or 16th-century Europe’s religious wars that’s most relevant for us. It’s the history of the 20th century. Throughout that century a succession of antidemocratic and antitheist political ideologies exploited people’s yearning for meaning and social idealism. A godless faith in humanity as the creator of its own grandeur lay at the heart of communism, fascism, Maoism, and all the unnumbered horrors unleashed in that bloodiest of centuries. And adherents of traditional historical religions—individuals such as Martin Niemoller, G. K. Chesterton, C. S. Lewis, Dietrich Bonhoeffer, Karl Barth, Reinhold Niebuhr, and Martin Buber—warned most clearly of the tragedy that would come of humanity’s self-deification and the mad attempts to build its own version of the New Jerusalem on earth. Prophetic warning voices such as theirs will be at least as valuable in the high-tech, low-culture world of the 21st century.
