Do Ideas Matter in America?

Wilfred M. McClay

Americans like to think of themselves as a pragmatic people, with little use for professors and fancy ideas. Yet they also live and die for abstractions such as freedom and equality. That’s not just some inexplicable paradox but a key to understanding the American intellectual landscape.


In his classic study, Childhood and Society (1950), the psychologist Erik Erikson observed that “whatever one may come to consider a truly American trait can be shown to have its equally characteristic opposite.” Though a similar ambivalence can be found in many national cultures, traceable to a variety of causes, Erikson insisted that the bipolarity was especially pronounced in the modern American instance. In none of the other great nations of the world, he contended, were the inhabitants subjected to more extreme contrasts than in the United States, where tensions between individualism and conformity, internationalism and isolationism, open-mindedness and closed-mindedness, cosmopolitanism and xenophobia were powerfully felt.

Sweeping generalizations of that sort about “national character,” American or otherwise, have come to be regarded as artifacts of the 1950s. They’ve been superseded by doctrines that emphasize pluralism and social heterogeneity and stress the “inventedness” of the modern nation-state. But there is plenty of evidence for the cogency of Erikson’s dictum, which is nowhere more vividly illustrated than in the paradoxical role of ideas in American culture. One can make an equally plausible case that ideas are both nowhere and everywhere in America, that they have played a uniquely insignificant role in shaping America or a uniquely commanding one.
 
So which of the two assertions is the more accurate? At first blush, one would have to acknowledge that there is a strong basis for the familiar view that Americans are a relentlessly action-oriented people, constituents of a thoroughgoing business civilization, a culture that respects knowledge only insofar as it can be shown to have immediate practical applications and commercial utility. Alexis de Tocqueville voiced the theme early in the 19th century: To the extent that Americans cultivated science, literature, and the arts, he remarked, they invariably did so in the spirit of usefulness, not out of any high regard for the dignity of thought itself. His observation presaged what would become a consistent complaint of intellectuals from Walt Whitman to Matthew Arnold to Sinclair Lewis to George Steiner today: America is a philistine society, interested only in the arts of self-aggrandizement and enhanced material well-being, reflexively anti-intellectual, utterly lacking in the resources needed to support the high and disinterested curiosity that is the stuff of genuine cultural achievement.
 
A variation on this theme, also sounded early on by Tocqueville, was that America embodied a fresh and distinctive theory of government, even though its citizenry couldn’t begin to articulate what that theory was. Tocqueville claimed that there was no country in the civilized world where less attention was paid to philosophy—and yet Americans seemed enthusiastically committed to a particular, and very modern, philosophical method. Their country, he quipped, was the place where “the precepts of Descartes are least studied and best applied,” since it was the place where everyone believed that one should “seek the reason of things for oneself, and in oneself alone.” Yet the range of ideological possibilities in America was narrow, with comparatively little space between the supposed opposites of “left” and “right” and relatively little deviation from fundamental liberal principles. The very notion of intellectual debate as a process of public wrangling about alternative ideas of the political and social order tended to be regarded as anathema, even dangerous.
 
In 1953, the historian Daniel Boorstin went so far as to argue that the absence of debate over American political theory was one of the nation’s chief virtues. The unpremeditated “givenness” of American political institutions constituted for him the genius of American politics and defined the difference between the placid stability of American politics and the ideology-ridden horrors that had so recently erupted in European politics. “Our national well-being,” said Boorstin, is “in inverse proportion to the sharpness and extent of [our] theoretical differences.” So breathtaking a statement formulates on a national scale the powerful, if largely informal, everyday American social taboo against discussing either religion or politics in public. The taboo makes for a considerable measure of social peace, but it would be hard to imagine a deeper devaluation of the role of ideas.
 
Small wonder that Boorstin became one of the principal exponents of the “consensus” view of American history that was prevalent at midcentury: A rough but stable ideological homogeneity, built upon the cultural and economic premises of liberal capitalism, encompassed nearly the entire American people. Contemporaries of Boorstin who were associated with this view, such as Richard Hofstadter and the political scientist Louis Hartz, had a less positive regard for the alleged homogeneity—yes, they sighed, Americans all think the same way, and more’s the pity—but they did not challenge its basic outline. Nor, for that matter, was there much fundamental disagreement to be found in the writings of the consensus theorists’ immediate predecessors, such as historians Vernon Louis Parrington, Charles Beard, and Frederick Jackson Turner. Although all of these scholars saw conflict rather than consensus as the most salient characteristic of American history, they conceived the conflict primarily as one between rival material interests, in which ideas and ideologies played, at best, a supporting or derivative role.
 
Thus, a well-established tradition, shared by intellectuals and non-intellectuals alike, holds that ideas have been largely irrelevant to the nation’s practical concerns, and therefore tangential to the real business of American life. That belief has often figured in the tension-ridden relationship between America and Europe, of whose persistence we have recently been reminded. The contrast between America and Europe was thought to be the difference between youth and age, novelty and venerability, innocence and experience, purity and corruption, guilelessness and sophistication, naturalness and artificiality. But however distinct it may have wanted to be, America’s intellectual culture was from the start an extension of Europe’s, linked to European evaluative standards. Hence, that culture should be understood, at least in its early history, as an “American province,” to use historian David Hollinger’s apt term, with all the ambivalence it implies about Americans’ feelings of connection to, separation from, and competition with Europe. America was a province in the sense that its best and brightest minds yearned to be found worthy by the metropolitan arbiters of taste and sensibility. Their ultimate validation came not from Boston or New York, but from London or Paris. Yet the province yearned to breathe free, and it resented the cultural subordination in which it found itself becalmed long after political independence had been achieved.
 
The quest for originality, a passionate concern of such writers as Ralph Waldo Emerson and Walt Whitman, became one of the defining themes of 19th-century American culture. But the goal was more easily articulated than met. At the start of the 20th century, intellectuals were still complaining that the emergence of an authentic, indigenous American culture was being blocked by the smothering influence of artificial, European-derived, Anglo-Victorian notions of high culture. In their view, slavish imitation of the mother continent bespoke cultural immaturity and a lack of intellectual self-confidence and vigor. One of the most influential statements of this theme was made by the Spanish-born American philosopher George Santayana in a 1911 lecture titled “The Genteel Tradition in American Philosophy.” The lecture bequeathed to subsequent observers an indispensable term of analysis—“the genteel tradition”—to describe what was wrong with American art and expression. More than that, it offered a far-reaching diagnosis of a fault line in American culture. American intellectuals responded to Santayana’s critique with what one scholar has called “the rebellion against Victorianism,” a rebellion that would turn out to be one of the organizing principles of 20th-century intellectual activity, particularly in the realms of arts and letters.

“One-half of the American mind,” Santayana asserted, the part that was “not occupied intensely in practical affairs,” had remained “becalmed,” floating “gently in the backwater” of American life—prim, polite, refined, and irrelevant. Meanwhile, the other half of the mind—the part concerned with material innovation—“was leaping down a sort of Niagara Rapids,” surging ahead of the entire world “in invention and industry and social organization.” The division was between what was inherited and what was native born, between the legacy of Europe and the immediacy of America—or, in Santayana’s words, between “the beliefs and standards of the fathers” and “the instincts, practices, and discoveries of the younger generation.” The division could be neatly symbolized in architectural terms: The former resembled a colonial mansion, the latter a modern skyscraper. “The American Will inhabits the sky-scraper,” he continued, while “the American Intellect inhabits the colonial mansion. . . . The one is all aggressive enterprise, the other is all genteel tradition.” Each side needed, but lacked, the corrective of the other, and so both were diminished. The realm of ideas was so unsatisfactory in America because it was not nourished by the vital streams of the nation’s life.
 
Santayana’s brilliant analysis was further elaborated and popularized by literary critic Van Wyck Brooks in America’s Coming-of-Age (1915). For Brooks, the American mind was riven between highbrows and lowbrows, transcendent theory and “catchpenny realities,” “academic pedantry and pavement slang,” the pipe dreams of professors and the banal rhetoric of a William Jennings Bryan. The literary highbrows issued works of fastidious refinement and aloofness that were too prissy to have any salutary influence on the coarse and unkempt lowbrows. Yet the lowbrow culture, for all its vitality, was too vulgar and business-oriented to accommodate serious criticism or disinterested reflection. Between the two, Brooks lamented, “there is no community, no genial middle ground.”
 
Like Santayana, Brooks blamed this state of affairs on the early influence of Puritanism, less because of its specific tenets than because it forced upon early America an elaborate imported theology that was too arcane to address the real problems of men and women struggling to build a civilization in the wilderness. As a consequence, a parallel track of Yankee practicality developed, a streetwise mindset that valued skill in tinkering and shrewdness in business, and that would have nothing to do with the theological track issuing from Calvinism or other sources. American thought thereby divided into two distinct currents. The highbrow current of piety ran from Jonathan Edwards in the 18th century through the classic American writers of the 19th century, and led to the “final unreality” of most of what passed for official high culture in Brooks’s day. The lowbrow current of “catchpenny opportunism” found expression in the cracker-barrel maxims of Ben Franklin, the folksy tales of A. B. Longstreet and other American humorists, and finally in the vulgar and jocular atmosphere of the era’s business life.
 
Despite certain differences of emphasis, both Santayana and Brooks were driving at the same point: Americans had a disordered relationship to the realm of ideas, treating ideas as entirely abstract and ethereal, rather than as useful instruments for making sense of practical experience. The analysis struck a resonant note with intellectuals of the early 20th century. “A philosophy is not genuine,” Santayana remarked near the beginning of the “genteel tradition” lecture, “unless it inspires and expresses the life of those who cherish it”—the very things that the “hereditary philosophy” of America had signally failed to do. With those words, Santayana captured the spirit of what was to come: Opposition to the genteel tradition in American art and expression swelled into a generational rallying cry and set the tone for much of the century’s art and thought.
 
The rise of pragmatism—the school of thought so often touted as America’s chief contribution to philosophy—was a response to the same perceived philosophical inadequacy, and, as such, an effort to make ideas useful once again. Pragmatists asserted that ideas were most vital and valuable when they were understood as tools for adaptive living or blueprints for action, and when their truth or falsehood was judged not by arid deductive reasoning but by the real-world consequences that ensued from them. “What difference would it practically make to any one,” asked philosopher William James in Pragmatism (1907), “if this notion rather than that notion were true? If no practical difference whatever can be traced, then . . . all dispute is idle.” James (a colleague of Santayana’s at Harvard) sought to banish unprofitable metaphysical pursuits and bring the ordering principle of utility, and the empirical discipline of the natural sciences, to the notoriously misty and self-referential enterprise of philosophy. Anti-abstractionism and antiformalism became the defining marks of a whole generation’s worth of fresh contributions to American thought, from the 1880s to the 1920s, ranging from the sociologically informed jurisprudence of Oliver Wendell Holmes, Jr., to the economic contextualism of historian Charles Beard and social critic Thorstein Veblen, to the pragmatic experimentalism of John Dewey.
 
The bottom-line bluntness of James’s question—and answer—showed the influence of Darwinian biological science upon pragmatism, and how compatible both were with certain ingrained patterns of American thinking. Human cognition was to be understood as an adaptive tool in the struggle for evolutionary survival, like the wings of a bird or the claws of a cat, and ideas themselves were compared to species, whose viability can be evaluated only by testing their ability to survive in action. Thus, intellectual life was a ceaseless flow of challenges and adaptations, with absolutes nowhere in sight. Equally notable, however, was James’s willingness to talk frankly in the language of lowbrows—to evaluate ideas in terms of their “cash value” and their ability to produce fruitful outcomes. Although pragmatism is nearly always depicted as a radical departure, it did nothing, in fact, to challenge the contention that ideas matter only insofar as they have a demonstrable practical use.
 
The Progressive movement of the early 20th century became, at least in part, a political expression of the pragmatist outlook. It provided an environment in which ideas were esteemed to the degree that they could be generated and put into action by a vigorously activist government for purposes of social, political, and economic reform. Progressivism was an effort to institutionalize the usefulness of ideas, in the name of preserving the workings of democracy against the corruption threatened by the rise of industrial capitalism. In the Progressive vision, as articulated by thinkers such as Herbert Croly, Walter Weyl, and John Dewey, an army of middle-class specialists, trained in the “science” of governance by the burgeoning new universities and municipal research bureaus, could run the apparatus of government, where they would use ideas as blueprints for social reform and the protection and cultivation of the public interest. The experts would be disinterested arbiters, bound by the logic of science and the rational autonomy of professional organizations rather than by the vagaries of economic self-interest or power politics. Staffed with such workers, government would in time come to play the same flexible and organic role in the life of the nation that the mind plays in the life of an individual, adapting and reformulating its policies and initiatives in response to shifting circumstances.
 
A dispensation of this sort would seem to elevate ideas to a high level indeed, by making them essential partners in democratic governance. Yet the fact remains that ideas were handmaidens rather than equal partners in the arrangement, because they were bound to the practical logic of purposive activity and dedicated to the triumph, in Walter Lippmann’s 1914 formulation, of “mastery” over “drift.” In short, ideas had to deliver the goods. As with pragmatism, ideas had no value independent of their practical use. Thus, rather than assert the independent importance of ideas, pragmatism and progressivism actually codified their subordination. And in downplaying the “ideal” aspect of ideas, pragmatism also downplayed their value as points of friction with reality that enable us to stand in judgment of existing arrangements and recall us to higher purposes—in a word, that inspire us.
 
In the face of all this evidence, can one seriously entertain the opposite side of the Eriksonian paradox—that ideas have played a commanding role in American history? Indeed, one can, for the idea of America itself has remained powerful and alluring and multifaceted. If we do not readily perceive the pervasiveness of ideas in American history, it is for the same reason that deep-sea fish do not perceive the existence of water. Ideas provide the very medium within which American life is conducted. From the start, the nation’s history has been weighted with a sense of great destiny and large meaning, of visions found not on the fringes of the story but at its very core, actively shaping the minds of those who make the history and those who write about it. One cannot tell the story without reference to the visions.
 
Talk about “the idea of America” is likely to be dismissed these days as a species of “American exceptionalism.” But invoking that familiar catchphrase simply fails to do justice to the matter. The concept of America has always carried large, exceptional meanings. It even had a place in the European imagination long before Columbus. From Homer and Hesiod, who located a blessed land beyond the setting sun, to Thomas More and his Utopia, to English Puritans seeking Zion, to Swedish prairie homesteaders and Scotch-Irish hardscrabble farmers and Polish and Italian peasants who made the transatlantic voyage west in search of freedom and material promise, to Asian and Latin American immigrants who have thronged to these shores and borders in recent decades—for all of these, the mythic sense of a land of renewal, regeneration, and fresh possibility has remained remarkably deep and persistent.
 
Though almost everyone is convinced that America means something, there are disagreements—sometimes quite basic—about what that something is. For example, is the United States to be understood as a nation built upon the extension of European—and especially British—laws, institutions, and religious beliefs? Or is it more properly understood as a modern, Enlightenment-based, postethnic nation built on abstract principles such as universal individual rights rather than on bonds of shared tradition, race, history, conventions, and language? Or is it rather a transnational and multicultural “nation of nations,” in which diverse sub- or supranational sources of identity—race, class, gender, ethnicity, national origin, religion, sexual identity, and so forth—are what matter, and only a thin and minimal sense of national culture and obligation is required? Or is it something else again?
 
Each of those propositions suggests, in its own way, that American history has a distinctive meaning. Americans of years past actively sought a broad, expansive, mythic way to define what distinguished the nation. Consider this short list of appellations America has accumulated over the years: City upon a Hill, Empire of Reason, Novus Ordo Seclorum, New Eden, Nation Dedicated to a Proposition, Melting Pot, Land of Opportunity, Nation of Immigrants, Nation of Nations, First New Nation, Unfinished Nation, and, most recently, Indispensable Nation. Other nations sometimes earn names of this sort, but they are not so numerous, and they lack the universalistic implications that complicate the sense of American exceptionalism.
 
Nor should one neglect the religious dimension of Americans’ self-understanding, which continues to be powerfully present, even in the minds of the nonreligious. The notion that America is a nation chosen by God, a New Israel destined for a providential mission of world redemption, has been a near-constant element of the national experience. The Puritan settlers in Massachusetts Bay’s “city upon a hill” had a strong sense of historical accountability and saw themselves as the collective bearers of a world-historical destiny. That same persisting conviction can be found in the rhetoric of the American Revolution, in the vision of Manifest Destiny, in the crusading sentiments of Civil War intellectuals, in the benevolent imperialism of fin-de-siècle apostles of Christian civilization, and in the fervent speeches of President Woodrow Wilson during World War I.
 
Few presidents after Wilson cared to make a direct appeal to Americans’ sense of chosenness by God as a justification for foreign policy, and the disastrous intervention in Vietnam provided an especially severe chastening of such ambitions. But Wilson’s belief in America’s larger moral responsibility—particularly its open-ended obligation to uphold human rights, defend democracy, and impart American-style institutions, technologies, and values to the rest of the world—did not vanish with him. Indeed, by the time of the second American war against Iraq, that aspect of Wilson’s legacy had become the preferred position even of American conservatives, and in the oratory of George W. Bush, arguably the most evangelical president in modern American history, the legacy’s quasi-religious dimension seemed to have survived intact. “The advance of freedom is more than an interest we pursue,” the president declared last May. “It is a calling we follow. Our country was created in the name and cause of freedom. And if the self-evident truths of our founding are true for us, they are true for all.”
 
To be sure, other strains of American thought have operated, including a sober realist tradition grounded in John Quincy Adams’s famous assertion that the United States does not go abroad in search of monsters to destroy. There has been a reflexive, if shallow, reluctance in many quarters to “impose our values” on the rest of the world, although even in the protests of the critics it’s implicit that the temptation and the power to do so are somehow uniquely American. There’s a counterstrain, too, represented by the radical political views of the influential linguist Noam Chomsky, that envisions the United States as a uniquely pernicious force in human history, unexcelled in its guilt rather than its virtue. One could say that this is American exceptionalism with a vengeance—but exceptionalism nonetheless.
 
All of which goes to show how difficult it has been for Americans, and others, to think of the United States as “just another nation.” The idea of America, and of its national destiny, clings as tenaciously as ever to the nation’s self-consciousness, and if only by virtue of the influence of this one large idea, the second half of the Eriksonian paradox would seem to hold. That many Americans have believed steadily in their nation’s special mission is a fact of American history. In the 20th century it became a fact of world history, and by the early years of the 21st, it had almost come to seem a pattern for universal emulation.
 
How and when did this overwhelming sense of America as an idea take hold? Part of the answer can surely be found in the way the country has managed its changing ethnic and racial makeup. For much of the nation’s history, the primacy of English-speaking Anglo-Saxon Protestants was a reality to which immigrants and other minorities were expected to accommodate themselves. But by the beginning of the 20th century, that primacy had begun to weaken rapidly (by now it is largely gone). When commonalities of race, religion, ethnicity, history, and even language were eroded, they ceased being a basis for the cohesiveness of the nation, and abstract ideas moved in to take their place. America came to be defined increasingly in terms of the large ideas for which it stands—such as liberty, equality, and economic opportunity—rather than in terms of the history, geography, religion, customs, or culture that most Americans had shared to that point.
 
Hence the rise in the 20th century of concepts such as “pluralism” and “multiculturalism,” which had played no part in the thinking of the nation’s founders. Pluralism proposes that the national culture of the United States ought to be able to make room for robust and independent subcultures, usually based on race, ethnicity, religion, or country of origin—and often on all four at once. The philosopher Horace Kallen, a German-Jewish immigrant and chief proponent of the concept of “cultural pluralism,” asserted that American culture in the 1920s was best understood as a symphony orchestra, whose musical richness was enhanced precisely by the tonal distinctiveness of each of its members. The assimilationist ideal of a “melting pot” would, he believed, destroy that symphonic richness and substitute for it a bland unison. Of course there was a need for some kind of national culture, just as there was a need for a national government. But Kallen and other pluralists assumed that the national culture could be thin and limited in character, allowing the depth and richness of more particular affiliations to be preserved.
 
Pluralism proved surprisingly compatible with American patterns of thought and was embraced in schools and in public oratory. Perhaps the tension between particularistic and national identities bore some similarity to the tension between state and national identities that was built into the Constitution’s federal system. But pluralism has its limitations. A delicate concept, poised between particularism and cosmopolitanism, it partakes of both while wholly embracing neither. Some of the chief problems with pluralism have been passed along, and augmented, in the even more elusive idea of “multiculturalism,” which has come to mean almost anything one wants it to, from taking a generous view of ethnic foods and customs to believing in the inviolable separateness of the various subcultures that make up modern America. Among intellectuals, that latter form of multiculturalism has been an important expression of the mood of postmodernism, with its fondness for the politics of identity (based on attributes such as race, class, gender, and sexual orientation) and its programmatic suspicion of all overarching generalizations (except those occurring within certain approved groups). Because it offers no principle of national commonality in whose name disparate groups can be enjoined to cooperate, multiculturalism has become a recipe for the destructive factionalization the politics of identity can induce in the political and social order.
 
But even in its exaggerations, multiculturalism has raised useful questions that pluralism largely evaded: How does one simultaneously protect both the distinctiveness of racial and ethnic groups and their membership, both collectively and individually, in the polity? How much diversity can America stand and still maintain its cohesiveness as a nation? How much of a uniform national culture does American society really need? And what will be the source of that uniformity? The last two of these questions have found their way into policy debates over immigration and education, two areas of contention where platitudes about “celebrating diversity” inevitably come up against the hard fact that there does seem to be a core of Americanness in which all immigrants and children must be trained if we are to have reliably loyal and competent citizens. The quest to define America is by no means concluded—and the task is far more than an academic exercise. If America is to be presented as an idea or a set of ideas, we had better make sure they’re the right ones.
 
Far from being devoid of ideas, then, it would appear that the American scene is virtually awash in them. But that fact often goes unacknowledged, or is not given its rightful weight by Americans themselves, who continue to fancy themselves a largely practical people and still tend to see their way of life as the unforced fruit of certain self-evident and universal truths, a system that entails merely doing—in the words of Annie Oakley—what comes naturally. Hence, the Eriksonian paradox may, in the end, be more apparent than real, born less of practical-mindedness than of unreflectiveness.
Such a condition is very different from anti-intellectualism. But it is not unrelated to the peculiarly unfruitful relationship between American ideas and American life that the generation of Santayana and Brooks, and the generation of Emerson before them, identified and lamented—a condition remarkably resistant to correction. One can be utterly besotted with ideas, yet fail all the more readily to make use of them in the right way. In some respects, the emergence in the 20th century of an entire class of individuals who deal professionally in ideas—and the more ideas the merrier—has only deepened the problem. Thinking, like lovemaking, changes its character dramatically when it’s turned over to the professionals. And ideas, like currency, may be more readily devalued by abundance than by scarcity.
 
Perhaps, then, it would be helpful to step back a bit and attempt to identify some of the chief clusters of ideas that formed the general American outlook in the century just past. Three subjects in particular—science, culture, and liberalism—seem to have been especially influential in setting the horizons of American thought, organizing the inquiry, and propelling the discussion. Like magnets set down among iron filings, these topics established the field of forces that shaped much of the debate.
 
As it had been for much of the 19th century, science in the 20th century continued to be taken as a model by any field of human endeavor that made a claim to thoroughness, accuracy, reliability, and disinterested truth. There was a comparative purity about the scientific ideal, not to mention a more impressive record of achievement than that of its competition. Indeed, the steady advance of scientific knowledge and technological innovation throughout the 20th century—the perfecting of air and space travel, the unlocking of the atom, the discovery of the DNA double helix, the mapping of the human genome, and countless other fabulous accomplishments—has to be accounted one of the great success stories of human history.
That success inevitably gave rise to attempts to apply scientific methods to the solution of the full range of human problems, and no thinker of the 20th century more consistently epitomized the effort than John Dewey, whose links to pragmatism and progressivism make him a symbol of much of the century’s intellectual distinction. His astonishingly wide-ranging oeuvre, embracing subjects from aesthetics to education to epistemology to politics to metaphysics, was imbued with reverence for the tentative, provisional, experimental methods of science, which he regarded as both the highest expression of human intellectual striving and a model for democratic discourse. For Dewey, who described his own intellectual journey as “from absolutism to experimentalism,” most of the questions that had preoccupied Western philosophers in the past were no longer worthy of attention. What was needed was a fresh approach, building not upon inherited moral formulations or other forms of idealism but upon careful attention to experience and to the supple, nondualistic interplay of mind and world. Science was the most refined elaboration of that effort. And science, because it was inherently transparent and non-authoritarian, and therefore theoretically accessible to all, improved the prospects for participatory democracy by making scientific knowledge an equal opportunity good rather than a possession of priests and elites.
 
The social sciences came to be dominated by schools of thought that set out to be as rigorously scientific as possible, which often meant redefining the animating questions of a field—or even defining the old questions out of existence. For example, behaviorism in the field of experimental psychology and behavioralism in the field of political science decreed that only observable and measurable external behaviors were to be studied, and that the murky realms of introspection and values were to be eschewed entirely. Growing knowledge of genetics has fueled the rise of various forms of sociobiology, which seek to ground social thought in the physiological bases of human behavior.
 
The rallying cry of these and other academic disciplines was “to push back the frontiers of knowledge”; to do so credibly, they had to examine problems that were susceptible of being formulated in precise—and preferably quantitative—terms. Even the humanities, where the “frontiers of knowledge” metaphor has always been highly inappropriate, were affected by this development. Thus, in the spirit of Dewey, and in the wake of Ludwig Wittgenstein, Bertrand Russell, A. J. Ayer, and Rudolf Carnap, various forms of analytic philosophy sought to emulate the rigorous spirit of science and, in the process, often elected to dispose of philosophers’ perennial questions—the existence of God, for example, or the meaning of good and evil—by dismissing them as mere byproducts of linguistic confusion.
 
A culmination of sorts for this triumphalist strain of scientific thought came in the lavishly ambitious 1998 book Consilience: The Unity of Knowledge, by the distinguished entomologist Edward O. Wilson, a pioneer of the sociobiology movement. In Consilience, Wilson proposed that a common body of inherent principles underlies the entire human endeavor. The Enlightenment thinkers of the 17th and 18th centuries had it generally right, he believed: We live in a lawful, perfectible material world in which knowledge is unified across all the great branches of learning. Once all fields of inquiry are brought under the cope of science, we’ll be able to explain everything in the world through an understanding and application of a finite number of natural laws.
Needless to say, the reception given Wilson was not universally warm. The poet, novelist, and social critic Wendell Berry wrote a scathing rebuttal called Life Is a Miracle (2000), and such influential figures as the philosopher Richard Rorty brushed Consilience aside with faint contempt. Even many of Wilson’s fellow scientists found in it an element of overreaching. The critical response showed that even the prestige of science has its limits in American intellectual life.
 
Indeed, there has been a reaction in recent years against the hegemony of science. One reason for the reaction is a growing anxiety about some scientists’ alarming—and increasingly conspicuous—lack of social and moral responsibility. When atomic physicist J. Robert Oppenheimer famously described the development of sophisticated nuclear weapons as a “technically sweet” pursuit, in which one first invents and then decides what to do with the invention, he seemed to many Americans to epitomize the problem with science. Compared with the fevered pronouncements of some of today’s biotechnological wizards, Oppenheimer’s statement seems a paragon of responsible self-awareness. Small wonder that the general public is anxious, particularly given the paucity of evidence that the scientific community recognizes any moral limits within which it’s willing to constrain itself. Moreover, the sprawling organizational demands and staggering expense of modern science have given it a vastly more corporate, bureaucratic, and results-oriented character, depriving the work of much of the moral and intellectual heroism it once had.
 
The hegemony of science as a worldview has been challenged as well, albeit indirectly, by the continued flourishing of religious faith and practice in America. Of all the surprises the 20th century had in store, none was greater than the amazing persistence of piety. It was not what the sociologists had expected. To be sure, there was a renegotiation of the terms of separation between religion and public life, mostly favoring the strict separation of the two. But the grand predictions of secularization theorists that, in becoming modern, America would also become secular have not been borne out. The precise relationship between religion and science is not easily defined, and the two are by no means mutually exclusive. But at the very least, the persistence of religion shows the profound inadequacies of science, which can tell us a great deal about how the world works, but very little about how to live in it.
 
A third challenge to the dominant position of science attacked its claim to epistemological superiority and purity. Science was to be understood, instead, as something inherently social and political, the work of particular communities of inquiry (i.e., accredited groups of scientists), rather than as a heroic endeavor to disclose the objective truth about nature. In 1962, Thomas Kuhn proposed that new theories are accepted by a community of scientists not because they meet objective criteria of truth but because of the way those theories address the concerns of that particular community. Others took this position further. In contending that the “foundations” sought by “objective” science were illusory, Richard Rorty argued that the work of science is really no different from the work of any other intellectual community. There’s nothing special about scientific knowledge. It stands in a continuum with all other forms of knowledge, in being made “true” not by objective verification but by the assent of interested parties. So much for the disinterested heroism of science.
 
If the great project of modern science was the conquest of nature, the concept of “culture” might be said to have pursued that conquest too, though by quite different means. No word in the language was made to do more heavy lifting in the 20th century than culture. In the 19th century, culture had referred both to a process of intellectual and moral cultivation and to a body of distinguished works of arts and letters. In the 20th century, under the influence of cultural anthropology, the word came instead to denote the whole way of life of a particular people, and many things once taken to be dictates of nature—the respective social roles of men and women, for example—were instead understood as products of culture.
 
Thus, much of what we had thought of as natural was actually culturally conditioned (or, in the more recent usage, socially constructed). Such an idea, in optimistic and reform-minded America, tended to be used for progressive purposes, and cultural anthropology served as yet another tool in the rebellion against Victorianism. The field had perhaps its most powerful effects in the study of gender, sexuality, and family structure, underwriting a wholesale reconceptualization of those three subjects. The very use of the term gender, which was redefined in the 1970s to mark a distinction between the cultural and social characteristics ascribed to the sexes and those that are strictly biological, is an indication of the central role played by the concept of culture. Feminists had long seen biological determinism as their sworn enemy, and “culture” gave them a way of defeating it, by understanding human identity and relations as being more malleable than the concept of nature would permit.
 
The anthropological notion of culture enjoyed enormous influence, both scholarly and popular, and supported a certain vague but amiable relativism that became an essential marker of good intellectual breeding in the 20th-century West. But by the end of the century, the word culture, fraying from overuse, was in danger of turning into an empty signifier. In what sense could one speak of the culture of the United States, the culture of Microsoft, the culture of NASCAR, the culture of Harvard, the culture of PS 148, and the culture of poverty with any confidence that one was using a meaningful term? The cultures that make up American multiculturalism are clearly not all coherent or genuine cultures in the same sense as the cultures found in “native” countries. And a taboo in America against judging other cultures began to seem ridiculous in the context of grotesque violations of human rights and such practices as clitoridectomy and suttee in some parts of the non-Western world.
 
Thus did liberal (and Judeo-Christian) universalist notions begin to return to the mainstream of Western discourse, a development that one saw emerging with great strength and clarity in the wake of the 9/11 attacks on America and the subsequent debates about the Western response to Islamist terrorism. In addition, the newly vitalized biological disciplines—sociobiology, evolutionary psychology, and other emerging fields brimming with energy and confidence—mounted a fresh challenge to the predominance of culture as an explanatory tool. The battle between nature and nurture was being refought, and, as the century turned, culture was steadily losing ground.
 
In restoring the respectability of universal ideas, the events of 9/11 may also have had the effect of revitalizing liberalism, by presenting us with a vivid example of the very poisons that classical liberalism was devised to counter. But this could turn out to be a mixed blessing. The liberalism dominant in the United States at the end of the 20th century had its genuine vulnerabilities, deficiencies that were not likely to be addressed merely by invoking the negative example of murderous religious fanaticism.
 
By century’s end, liberalism had become essentially a rights-based political philosophy that subordinated considerations of the common good to the sovereign liberty of rights-bearing individuals and exalted the choice-making capacity of an autonomous self over all else, including commitments to family, community, nation, and religion. The older, progressive liberalism of Herbert Croly had emphasized the cultivation of social solidarity and national consciousness and regarded the individualistic tendencies of liberalism as a potential threat to other valuable goals. Not so the later liberalism, which proposed that individual rights trump all other considerations.
 
The shift that occurred in liberal thought led to its own excesses, including a corrosive tendency to reduce all questions of law and social policy to simplistic “rights talk,” and gave new plausibility to conservative alternatives, as represented by the political theory of philosophers Leo Strauss and Eric Voegelin, the historically informed social criticism of Richard Weaver and Russell Kirk, and the work of sociologist Robert Nisbet. In addition, the hypertrophy of rights talk precipitated a countermovement within liberalism called communitarianism, which incorporated conservative elements and established itself as a fresh alternative by the end of the 20th century.
 
Political scientist Michael J. Sandel, one of the most influential liberal-communitarian critics of rights-based liberalism, argued against what he called “the procedural republic”—a liberalism that makes government the referee of fair procedure and guarantor of individual rights, yet insists that government be scrupulously neutral when it comes to passing judgment upon the substantive ends that individuals elect to pursue. This liberal-neutralist philosophy, Sandel asserted, was inadequate to the needs of a democratic republic because it failed to inculcate the civic virtues and qualities of character necessary to sustain liberty and self-governance. In its place, he invoked the republican tradition of early American politics, which exalted participation in the civic life of the res publica—through specific institutions and practices to which contemporary rights-based liberalism gave too little attention.
Communitarianism thus addressed a central problem of late-20th-century thought: the inability to attend equally to both the liberal ideal of individual freedom and the various contexts—social, historical, cultural, linguistic, among others—in which the self is formed and embedded. The historian Thomas Haskell perceptively noted that, at century’s end, academic thought seemed simultaneously and equally committed, on the one hand, to rights talk and the inviolability of individual liberty and, on the other, to the thoroughgoing cultural “situatedness” and historicity of the self. Given such confusion, the public mind could count on little help from the ranks of political theorists.
 
The great underlying trend of intellectual life in the past century was the ascendancy of the academy. The modern research university has by now absorbed into itself an astonishingly large part of the nation’s intellectual life, with consequences yet to be fully grasped. One should acknowledge that, in many respects, this new institutional reality has been a good thing. The academy provided a haven for free and disinterested intellectual inquiry in a commercial society and gave those who work with ideas a place to make a decent living while plying their trade. The division of knowledge into academic disciplines and distinct communities of interpretation made for greater rigor and clarity in the production of studies in nearly all fields. Whatever its failings, the academy supplied a highly serviceable context for intellectual activity of a high order.
 
Yet there were costs, and they stemmed largely from success, not failure. The modern university still proceeded from Enlightenment assumptions about the nature of knowledge—that it can be objective, universal, progressive, and cumulative. Those assumptions worked fairly well for the natural sciences, moderately well for the hard social sciences, and not at all for the softer social sciences and the humanities, which found themselves in deepening crisis. The logic of specialization contributed to the problem by dividing inquiry into smaller and smaller subunits, each with its indigenous jargon and distinct community of interpretation, and each with little to communicate to the world beyond itself.
 
But this understates the extent of the problem. The abandonment of the general educated reader as a cultural ideal over the course of the century was, in fact, an intellectual, cultural, and moral calamity, and a betrayal of the nation’s democratic hopes. The situation at century’s end bore an uncomfortably close resemblance to what Santayana and Brooks had described nearly 100 years before. The split in the American mind still existed (as sharply etched as ever), and it still divided highbrows and lowbrows. But the highbrows became ponderous, impenetrable, professionalized academics, whose air castles of thought were surrounded by moats of jargon designed to keep the dabblers and dilettantes at bay. They were the true legatees and custodians of the genteel tradition, despite the disappearance of almost every trace of Victorian reticence and belletristic pretension. The lowbrows, meanwhile, were the manufacturers and purveyors of commercial mass entertainment, with debased aesthetic standards and a coarsening effect on the populace. Instead of being elevated by contributions from on high, political discourse was debased by the domination of the low.
As a result, the vital center of ideas still stood largely unoccupied. The leavening effect the two halves of the American cultural schism might have had upon one another—and occasionally did have—was hard to find, and harder to sustain. Those few hardy souls who were able to cross over—a Leonard Bernstein in music, a Tom Wolfe in literature, a David McCullough in history, an Andrew Wyeth in painting—won the scorn (often masking envy) of the illuminati and were dismissed as middlebrows, popularizers, and sellouts. Yet it is precisely in that vibrant democratic middle ground, where ideas drawn from elite and popular cultures mix and mingle, and where the friction between idea and lived reality is most powerful and productive, that the genius of American culture has been found in the past. Such was the hope of Emerson and Lincoln, whose uncommon eloquence sprang from the commonest of roots. Such was the promise of jazz, whose tangled and improvised mongrel beauty became the very image of modern America. The bifurcation of American culture, intensified by the heavy hand of the academy and the numbing effects of mass culture, has made it no easier for peculiarly American ideas of this sort, possessing both intellectual sophistication and wide democratic scope, to flourish and find a receptive audience. But an American artist or thinker can have no worthier goal than to reach that audience.
