Human history becomes more and more a race between education and catastrophe.
—H.G. Wells, 1920

On my campus is a handsome domed building with these words inscribed across its facade: erected for the students that religion and learning may go hand in hand and character grow with knowledge. Built about a hundred years ago, the structure is in reasonably good repair, but the purpose for which it was built (it still houses the office of the university chaplain) is in ruins. I’ve never met a student who knows what’s written above the door.
Religion and learning hand in hand? This is true at only a few sectarian institutions, and if you were to remind just about any university president that his or her own institution arose from this or that denomination—Congregationalism (Harvard and Yale); Presbyterianism (Princeton); Episcopalianism (Columbia)—you’d likely get the response of the proverbial Victorian lady who, upon hearing of Darwin’s claim that men descend from apes, replied that she hoped it wasn’t so, but if it were, that it not become widely known. As for the claim that “character” and knowledge should grow together, if you ask just about any student how college makes that happen, the most you’re likely to get is a raised eyebrow or a puzzled stare.
The truth is that today the domed building is a monument to an endangered idea: the college idea. In fact, that idea was already in trouble when the building went up. Even then the words were more commemorative than descriptive, concerned with reclaiming a threatened purpose (“erected for the students”) and connection (“character grow with knowledge”). The undergraduate college within Columbia University was still at least nominally focused on the task of developing Christian gentlemen, but was slipping into the encroaching shadow of a vast research enterprise that has since surrounded and, some would say, swallowed it.
Most of the new university buildings constructed during the first quarter of the twentieth century were meant for advanced study of a particular discipline such as mathematics or chemistry, or a profession such as law. Over the intervening century, the college survived and developed a post-Christian form of general education—the well-known Columbia Core Curriculum in which undergraduates still read great books from Homer to Dostoevsky, including parts of the Jewish and Christian scriptures. But that curriculum is subject to continual attack on grounds both ideological (too male and “Eurocentric”) and financial (discussion-sized classes are very expensive). On a campus mainly devoted to research and graduate training, those of us who focus on college teaching sometimes feel like Jonah in the belly of the whale.
This is not a local or peculiar story. Although the relation between college and university may be more disproportionate at Columbia than at comparable institutions (among the leading universities, Princeton has kept the balance best), it is a representative story. It begins with the fact that colleges and universities, even when they share a campus and a name, embody fundamentally different purposes. A college—whose students, ideally, live on or near campus—is about transmitting to young adults knowledge of the past so they may draw upon it as a living resource in the future. A university is a disparate array of research activities; it is about creating new knowledge with the aim of superseding the past.
Until the middle of the nineteenth century, education was mostly about the first and very little about the second. Colleges were small and sectarian, while universities barely existed. After the Civil War, everything changed. Many colleges, where students had typically taken a common curriculum leading to a capstone course on moral philosophy taught by the college president, went out of business. Those that survived had to come to terms with the fact that industrial society wanted specialists with technical knowledge more than generalists with Biblical or classical learning. To meet the need, Congress made grants of federal land in order to fund the establishment of public universities for the purpose of advancing agricultural techniques in the undeveloped West. In the East, private philanthropists funded new institutions modeled on the renowned German universities of the day, where academic freedom prevailed and research laboratories as well as graduate seminars attained their modern form. The first such distinguished American university, Johns Hopkins, initially took in no undergraduate students at all. And soon, old (by American standards) institutions such as Harvard and Columbia, though they retained their undergraduate colleges, began to turn themselves into universities, too.
The first president of Hopkins, Daniel Coit Gilman, defined the new educational objective as showing students “how to extend, even by minute accretions, the realm of knowledge.” The most influential advocate of the new pedagogy was Charles William Eliot, a chemist who became president of Harvard. In the 1870s he introduced an elective system by which college students could choose courses according to their talents and interests rather than, as in the past, follow a curriculum common to all. Eventually even conservative faculty came to appreciate the new system, since it freed them to teach subjects in which they were expert, and thus brought their teaching into conformity with their research. The distinction between undergraduate and graduate students, whose number was increasing, was breaking down.
Eliot thought that “concentration” (the word Harvard still uses today instead of the more common term “major”) should begin in college, whose job is “to store up the accumulated knowledge of the race” so that “each successive generation of youth shall start with all the advantages which their predecessors have won.” On this view, all of human history becomes a sort of relay race in which no runner is required to travel trodden ground. The efficacy of the principle is confirmed every day by the large number of undergraduates who now master at least the first stages of calculus—even though it took several millennia before mankind produced the two seventeenth-century geniuses (Isaac Newton and Gottfried Wilhelm Leibniz) who developed it independently. We take the fact for granted, but it is a stunning vindication of the scientific premise that knowledge is incremental and accretive: that once a new truth is discovered, it does not have to be rediscovered but can be passed on to those capable of grasping and, most importantly, extending it.
Given the evidence, it is hardly surprising that science is now the driving force in our universities. It has proven itself. It possesses a means—the experimental method—by which its claims can be tested. It has an obvious impact (not always benign, to be sure) on the lives of virtually everyone living and yet to be born. Transistors, lasers, computers, diagnostic machines, medical therapies, alternative sources of energy (we hope)—the list goes on and on of what constitutes, from one point of view, the “return” on public and private investment in education. If we include the “hard” social sciences under the rubric of science, then it can also be credited with devising rational principles for legal and financial systems, managing the infrastructure of transport and commerce, and promoting public health.
So what does all this mean for those of us in academia who, recognizing the achievements of the modern university and harboring no nostalgia for the days of official religion, still hold to the college principle “that character should grow with knowledge”? Science, unfortunately, is no help here. We cannot appropriate its principle of progress and say that Daniel Defoe’s A Journal of the Plague Year, published in 1722, or Albert Camus’s La Peste, published in 1947, tells us more about the disease than did Thucydides’ commentary on the plague at Athens, written in the fifth century BC. Or that James Joyce’s Ulysses, written at the outset of the twentieth century, gives a more complete account of experience than did the Odyssey, probably composed more than two and a half millennia earlier.
Science, moreover, tells us nothing about how to shape a life or how to face death, about the meaning of love, or the degrees of responsibility. It not only fails to answer such questions; it cannot ask them. Some people believe that someday it will do both—that in some future age of “consilience,” neuroscience will define and ensure happiness and prove or disprove the insights of religion into the nature of sin and salvation; biochemistry will distinguish truth from falsity among what today are mere opinions about sex and gender; all human choices will become susceptible to experimental testing and rational sorting. Maybe it will happen, but none of us will be around when it does, and I’m not sure I want to be.
Meanwhile, we are left with what are customarily called the humanities—literature, history, philosophy, and the arts—as the best means for coming to grips with such questions, which have always had a special urgency for young people of college age. In my ramblings through various college archives, I came across a diary kept by a student at a small Methodist school, Emory and Henry College, in southwest Virginia. On a winter evening in 1850 after attending a lecture—really a sermon—by the college president that left him troubled and apprehensive, he made the following entry in his journal. “Oh that the Lord would show me,” he wrote to himself, “how to think and how to choose.” That poignant sentence—poised somewhere between a wish and a plea—has an archaic sound, but I can’t imagine a better formulation, then or now, of what college should be: an aid to reflection, a place and process whereby young people take stock of their talents and passions and begin to sort out their lives in a way true to themselves and responsible to others.
For many if not most students, God is no longer the object of the plea, or if he is, they do not attend a college where everyone worships the same god in the same way. Religion today is so much a matter of private conscience and the number of punishable infractions on a college campus so few (even rules against the academic sin of plagiarism are only loosely enforced) that most contemporary students would be astonished if a teacher or dean presumed to intervene in their private lives with a doctrinal or moral correction. The era of spiritual authority belonging to the college is long and well gone. Yet that is all the more reason why colleges, committed as they are to toleration and diversity, ought to do everything they can to excite their students’ interest in works of literature and art that register their longings and yet exceed what they are able to articulate by and for themselves. Such encounters are among the indispensable experiences of the fulfilled life, and colleges have an undiminished—indeed, an augmented—obligation to coax and prod students in their direction.
[Image: Japanese girl practicing calligraphy, c. 1890.]
Yet they do so less and less. Practitioners of the humanities, as everyone knows, have lost much of their standing in the academy. Why has this happened? Partly, no doubt, because of changes in the larger culture that colleges can do little to arrest: particularly the decline of reading in an age saturated by digitized noise, where few people have the time or concentration to linger with a long or difficult book. And in a culture where affluence is both the norm and an elusive goal, students feel relentless pressure to please the professional school admissions committee or the corporate hiring committee with evidence of proficiency in some practical subject like economics, while subjects that were once at the center of a liberal education—literature, art, philosophy, music—are consigned to the peripheral category of “enrichment.”
But humanists cannot blame their fallen status entirely on that exculpatory abstraction, culture. They have damaged themselves by the desperate combination of emulation and repudiation with which they have reacted to the rise of science. At first they aped the scientists by trying to turn language and history into phenomena susceptible to objective description. When the new universities appeared in the late nineteenth century, humanists tried to claim a place for themselves by describing language as if it could be studied in much the way their scientific colleagues studied the properties of gas or light in the lab next door. The name of the new discipline, which arose in Germany, was philology. This was the era, too, of “scientific” history—also of German provenance—the idea that empirical investigation could establish laws that govern human behavior over time with no less consistency and predictability than the laws of physics.
Those days of chasing the phantom of objectivity are now far behind us. Humanists today are mostly to be found at the opposite extreme—denying the very idea of truth by asserting, with varying degrees of “postmodern” irony, that all putative truths are contingent and all values relative. But the result is the same: the humanities have marginalized themselves in the universities, which creates an inevitable trickle-down effect in the colleges, since today virtually all college faculty members must have earned a PhD in a research university, which provides them little training or encouragement to become truly effective—and affective—teachers.
All the skills and strategies that faculty need for rising in the academic hierarchy—publishing frequently, performing at conferences, and (most important for raising one’s salary) currying offers from rival institutions—have nothing to do, and are often at odds, with a commitment to undergraduate teaching. The waste and shame are increasingly evident in independent liberal arts colleges as well as in research universities. Some fifty years ago Clark Kerr, the man who supervised the vast mid-century expansion of the University of California—an expansion so large that he felt a new word, “multiversity,” was needed to describe it—pointed out the “cruel paradox that a superior faculty results in an inferior concern for undergraduate teaching.” Forty years earlier, Max Weber, in his famous lecture on “Science as a Vocation,” noted that “one can be a preeminent scholar and at the same time an abominably poor teacher.” And if we go back further still, to the 1840s, we find Ralph Waldo Emerson protesting against the self-replicating clubbiness of the Harvard faculty by remarking, “A college professor should be elected by setting all the candidates loose on a miscellaneous gang of young men taken at large from the street. He who could get the ear of these youths after a certain number of hours... should be the professor.”
What all these critics have in common is the conviction that the true teacher is a person inflamed, even possessed, in a way that can never be certified by any advanced degree. The philosopher William James—whose student George Santayana saw him as hesitant and halting in the classroom until “the spirit would sometimes come upon him, and leaning his head on his hand, he would let fall golden words, picturesque, fresh from the heart, full of the knowledge of good and evil”—spoke of “the Ph.D. Octopus,” as if the whole university apparatus were a kind of predatory monster. Of course we want our teachers to be informed and disciplined (James was both) as well as fervent, and there is no contradiction, in theory, between good scholarship and good pedagogy. In fact, it is when the two converge that the dividing line between university and college is erased—as when college students are afforded the chance to work in a laboratory with a “cutting-edge” scientist or to assist a scholar with “ground-breaking” work (terms of praise that would have been incomprehensible before the rise of the university). Nor should we imagine that the only effective form of teaching is the Socratic form of small-group discussion. “Truly speaking,” as Emerson put it, “it is not instruction but provocation that I can receive from another soul.” This is the hallmark of the great lecturer, who may not pause for Q & A, or even know the students’ names, but whose example of total immersion in the subject creates the moving spectacle of what Emerson called “Man Thinking.” It is for that reason that several generations of Columbia students—undergraduates and graduates alike—flocked to hear the great art historian Meyer Schapiro, whose glowing eyes and transported smile as he spoke of Paul Cézanne or Wassily Kandinsky led more than one student to say, “Whatever he’s smoking, I’ll have some.”
Whether the great books and works of art make their debut in the lives of students in a seminar room or a lecture hall, they have an incomparable power to make the past seem not only less remote but not even past. They are antidotes to loneliness. I remember how, when I first read the Iliad with a group of Columbia freshmen (we call them “first-years” now), there came a moment after all our discussion of Homeric similes and the formulaic structure of oral poetry and the mythic origin of national identity when suddenly we felt as if we were reading about ourselves—or at least, if we were male, about our childhood selves. It happened when we arrived at the image with which Homer describes the Trojan soldiers overrunning their Greek enemies: “They streamed over / in massed formation, with Apollo in front of them holding / the tremendous aegis, and wrecked the bastions of the Achaians / easily, as when a little boy piles sand by the sea-shore / when in his innocent play he makes sand towers to amuse him / and then, still playing, with hands and feet ruins them and wrecks them.” Apparently, little boys on the shores of the Aegean three thousand years ago did the same thing as little boys do today at Jones Beach or the Hamptons (social class is not, in this case, a salient variable). As to whether little girls are less keen to build, then smash—we can leave that question to the neuroscientists.
Whatever the explanation for such transhistorical truths, the humanities speak to us in a subversive whisper that says the idea of progress is a sham. The humanities tell us that the questions we face under the shadow of death are not new, and that no new technology will help us answer them. They are hard and serious questions. Does Achilles’ concept of honor in the Iliad retain any force for us today? Can one bear to live according to Thoreau’s ethic of minimal exploitation of nature? Is there a basis in experience for the Augustinian idea of original sin? Such questions do not admit of verifiable or replicable answers because the experiment to which we must subject them is the experiment of our own lives.
So what are the prospects that this sort of education will remain central to the mission of our colleges in the twenty-first century? There are strong forces working against it. In response to the challenge of global competition, major universities are expanding (Yale is developing a new research campus in West Haven; Harvard is building across the Charles River; Columbia in Harlem; Penn in an undeveloped area adjacent to its current campus), partly in order to accommodate the foreign students who will be the next generation of the international elite from whom American universities hope to draw contributions to the alumni fund.
Anyone who has passed through the regular gradations of a classical education, and is not made a fool by it, may consider himself as having had a very narrow escape.
—William Hazlitt, 1821

Sheer institutional growth has never been good for humanistic education. It makes small classes hard to sustain, not only because of the expense (low faculty-student ratios require high instructional budgets) but also because humanities faculties are not keeping pace with the sciences—a trend likely to quicken in part because students from abroad usually come for technical training rather than for the collegiate experience. Moreover, there is growing pressure, both because of faculty ideology and because growing numbers of students are foreign-born or of non-Western descent, to introduce texts from other cultures. This is all to the good when it is supplementary rather than substitutional. But given the zero-sum principle that applies to any curriculum (put something new in, take something old out), chances are that reading the constitutive books of the Western tradition will continue to decline.
Why, in the end, should we care? Anthony T. Kronman gives a good answer in his recent book, Education’s End: Why Our Colleges and Universities Have Given Up on the Meaning of Life. A former dean of Yale Law School, who now teaches in a Great Books program in Yale College, Kronman makes a passionate case for how important it is that students should grasp the fundamental ideas that constitute Western culture in its ideal form:
The ideals of individual freedom and toleration; of democratic government; of respect for the rights of minorities and for human rights generally; a reliance on markets as a mechanism for the organization of economic life and a recognition of the need for markets to be regulated by a supervenient political authority; a reliance, in the political realm, on the methods of bureaucratic administration, with its formal division of functions and legal separation of office from officeholder; an acceptance of the truths of modern science and the ubiquitous employment of its technological products: all these provide, in many parts of the world, the existing foundations of political, social, and economic life, and where they do not, they are viewed as aspirational goals toward which everyone has the strongest moral and material reasons to strive.
Surely, every college graduate ought to understand something about the genealogy of these ideas and practices, something also about the historical processes from which they have emerged, about the tragic cost when societies fail to defend them, and yes, about alternative ideas both within the Western tradition and outside it.
The proposition accords with the Jeffersonian argument that democracy, if it is to survive, requires an educated citizenry aware of the necessary balance between rights and responsibilities. Given the perennial presence of demagogues and would-be autocrats, the argument is as strong now as it was when Jefferson made it more than two centuries ago. In fact, in a world where science has conferred on us previously unimaginable powers—the power to poison or preserve the earth, to engineer genetic changes in our own species for better or worse—it is more urgent than ever that the rising generations learn how “to think and how to choose.” College in the endangered sense of the word is still the most likely place for that to begin to happen.