1.
Along with freedom, the other sacred word in today’s college is “diversity.” Nearly sixty years ago, the Harvard “Red Book”—the famous faculty report on general education published in 1945 when the end of World War II was in sight but not yet at hand—identified the coming challenge of postwar America in a chapter ominously entitled “Problems of Diversity.” By using that word, the authors were not exactly prophesying the impending influx of women, as well as racial and ethnic minorities, into historically white and male institutions; they had in mind no clear picture of demographic change, but they did anticipate what they called “differentiation” in the “inner sphere of ability and outlook” of future students. With a sigh of Brahmin realism, they conceded that the old economy in which “thousands of lighter jobs…used to call for a brisk young pair of hands” was disappearing, and that unprecedented numbers of young people would finish high school and want a chance to go to college.
This prediction was borne out in the postwar years, which saw enormous growth in the size and quality of public universities and the rise of a community college system that afforded educational opportunity to millions of first-generation college students. Anticipating the surge of democratization, the “Red Book” authors had asked prospectively, “How can general education be so adapted to different ages and, above all, differing abilities and outlooks, that it can appeal deeply to each, yet remain in goal and essential teaching the same for all?” With a flourish—but not without reason—they concluded that “the answer to this question…is the key to anything like complete democracy.”1
This question has never been answered. In fact, by tacit agreement, it has been dropped. The new diversity has exerted a necessary and salutary pressure to open the curriculum to non-Western and other nontraditional subjects. But there has been almost no parallel effort to establish courses that bring students of “differing… outlooks” together into productive discussion. As Louis Menand argued in these pages a few years ago, most of the high-sounding postwar talk about general education was “lip-service,” and the growth has been mainly in technical and practical education.2 The few institutions that still have compulsory “Great Books” programs—such as the University of Chicago, Columbia, and St. John’s College—adopted them well before World War II. I happen to teach at one of those institutions, which naturally expresses public pride in its rigor and wisdom, but as my former colleague the literary scholar Arnold Rampersad (now at Stanford) remarked a few years ago at the seventy-fifth anniversary of the Columbia Core Curriculum, the Core is like the interstate highway system: we are glad we have it, but we could never build it today.
The new diversity has tended to exert pressure on the curriculum to be more various at precisely the time when some measure of commonality is needed. Yet it is risky to raise any question—even a friendly one—about the educational consequences of diversity. Among the few who have done so is Peter Gomes, Harvard’s long-serving chaplain, who writes in his contribution to Distinctively American, Steven Koblik and Stephen R. Graubard’s collection of essays on the future of liberal arts colleges, that by the 1970s, diversity had become
a goal so frequently and fervently espoused as to take on the nature of a sacred cow, immune to criticism or examination. Those who did risk a challenge to the concept were consigned to the ranks of the sentimental or self-interested old guard, who refused to recognize the new demographics of America and longed for the old boy network of the past. The great conundrum of diversity, however, was not in its variety but in its purpose.
For what end was this new and diverse student population created? What purpose, other than statistical, was to be achieved by the new diversity?
These salient questions have been answered in essentially two ways during the decades-long debate over affirmative action. The first answer is the remedial argument that has created so much legal and political turmoil, since, as Anthony Kronman, the former dean of Yale Law School, puts it, any argument that emphasizes past group discrimination invites
a contest of right against right—a conflict between the defensible claim of minority applicants to a form of special treatment and the equally defensible claim of non-minority applicants to be judged by their individual qualifications alone.
The second argument, which has proven more persuasive to the courts, is that diversity (to be achieved not by quotas, but by considering race as one factor in admissions decisions) contributes to the purposes of liberal education, which Kronman summarizes as “expansion of the student’s powers of sympathetic imagination” through appreciation of “views, moods, dispositions and experiences other than his or her own.”3
Since colleges and universities are committed to this second argument, it would seem that they should do whatever they can to foster a genuine exchange among students of diverse backgrounds in their residential buildings, eating facilities, and classrooms. Some colleges do in fact match students from contrasting circumstances as roommates when they first arrive, but since relatively few colleges provide guaranteed or mandatory housing beyond the freshman year, and upperclass students expect to choose where and with whom they live, this approach has little lasting effect.4 On many campuses one sees dispiriting, if understandable, racial clusters—“Asian tables,” “Hispanic tables,” “Black tables”—in the dining halls. The one place where students might be compelled to listen to one another—“to educate ourselves by knowing opposite lives,” as Stover put it a long time ago—is the classroom. And yet small-group education is expensive and therefore increasingly rare, and universally required courses, where students of different backgrounds cannot avoid each other, are almost unknown. The human proclivity to stick to one’s own, especially in our age of diversity, is an argument for a shared general education, not against it.
With a few exceptions, academic leaders have been notably silent on these matters, beyond issuing the usual bromides about multicultural tolerance.5 They seem to be afraid of offending one or another of the interest groups to which they are beholden: trustees, faculty, donors, parents, politicians, and the students themselves. Derek Bok, former president of Harvard, defends presidential reticence by rightly pointing out that some of his predecessors “argued openly against trade unions, opposed marriage between different ethnic groups, and favored property qualifications for voting.” But being out of step with the times is no argument for self-censorship, as Bok himself exemplifies in his indignant book, Universities in the Marketplace, on the perils of commercialism.
2.
No one who spends much time on a college campus can fail to sense that it is a place rich—even ripe—with paradox. Although the antiquity of our elite institutions accounts for much of their prestige, the institutional past tends to be reviled as a dark age of prejudice. The old paternalistic college has been thrown out with a hearty good riddance, yet, as Gomes points out, today’s students “want to retain all their hard-won autonomy, while at the same time insisting that institutions assume a moral responsibility for protecting them from the consequences of that autonomy.” (Presumably, Gomes has in mind the fact that although students long ago rejected the college’s in loco parentis role, whenever trouble breaks out over some incendiary “hate speech,” college authorities tend to get blamed for not parentally stepping in.) To stroll around any venerable campus is to see the past flit by in mottoes that express repudiated ideals. Across the façade of the building that houses the campus ministry at my own university, for example, is written: “Erected for the Students that Religion and Learning May Go Hand in Hand and Character Grow with Knowledge.” But it is hard to know what tests of character today’s college employs, since it is almost as difficult to get expelled as it is (except in science) to get a “C.”
There is a nervous sense that something basic is missing—a nervousness that may account for the rise of compensatory institutions within the institutions, such as the Center for Human Values at Princeton, formerly directed by the political philosopher Amy Gutmann (now president of the University of Pennsylvania), or the Institute for Ethics at Duke. But what can it mean that thinking about ethics has become mostly an extracurricular activity?
It is sheer insouciance to pretend that nothing is wrong. Cheating, especially in the form of plagiarism, is rampant in today’s colleges. A recent article in The New York Times Book Review (“students are fuzzy,” the author explains, “on what’s cheating and what’s not”) reports that one of the many Web sites that offer term papers for sale has the winningly candid name CheatHouse.com. In his autobiography The Bridge, Ernest Poole (Princeton, class of 1902) writes that during his college days, only once did he see a cheating incident. Sitting behind the cheater during an exam was the class president, who, seeing his classmate “furtively…looking at notes,” whispered in his ear, “Tear up your paper and flunk this.” Suitably shamed, the culprit obliged. This kind of peer-group policing apparently no longer works at Princeton, where there is also a serious problem with vandalism of the golf carts used by disabled students to get around campus. Last May, the installation of surveillance cameras in the Princeton University store triggered a surge in shoplifting arrests. “Nobody wants to think,” said municipal prosecutor Marc Citron, “that a Princeton University student, a future secretary of state…would dare to commit shoplifting.” Perish the thought.6
Students have always learned by example, and while there is no hard evidence of a higher incidence of vandals or cheats among professors than in any other professional group, the fact is that the professorial example is not always an uplifting one. In his dean’s report to the Harvard faculty in 1991, Henry Rosovsky wrote, with uncommon candor, that the faculty had “become a society largely without rules, or to put it slightly differently, the tenured members of the faculty—frequently as individuals—make their own rules” with regard to teaching loads, outside business ventures, consulting time versus teaching time, etc.7 A me-first ethos, Rosovsky believed, was destroying what was left of an older civic attitude according to which “a professor’s primary obligation is to the institution—essentially students and colleagues—and that all else is secondary.” Professors were coming to regard the university as serving them rather than the other way around. A few years later, Donald Kennedy, the former president of Stanford, agreed that the idea of institutional citizenship was under siege, and wrote a whole book entitled Academic Duty in an effort to articulate the responsibilities that ought to go hand in hand with academic freedom. It has had no discernible effect.
As Derek Bok makes clear in Universities in the Marketplace, opportunities for institutional and personal profit have jumped because of growth in “technology-transfer” partnerships with corporate investors and government agencies, especially since the passage in 1980 of the Bayh-Dole Act, which permits both universities and individual researchers to share in profits from inventions or therapies developed with public funds. And there is, of course, the continual quest to enrich the university by bringing in money through its athletic programs, which elicit alumni contributions—though these efforts are usually futile, Bok says, since revenues are typically plowed back into the athletic programs themselves. He cites one case in which the University of Oregon spent $250,000 for a billboard in order to promote its quarterback for the Heisman Trophy.
The incursion of market values into the putatively pure academic world has been the subject of a host of recent books, all of which point in one way or another to the marginalization of undergraduate teaching.8 None of this, of course, is new—except in scale. In Owen Johnson’s 1912 novel, Stover at Yale, one of Stover’s buddies protests that “our universities are simply the expression of the forces that are operating outside. We are business colleges purely and simply, because we as a nation have only one ideal—the business ideal.”
3.
Stover at Yale is a book more cited than read. In fact, it is not a brief for the WASP world of snobbery and clubby pranks so much as it is an account of a privileged insider who comes to “a critical analysis of his own good fortune.” The Yale we meet in Stover is a place still tinged with the old Calvinist belief (Yale was founded by Puritans who thought that Harvard had gone theologically soft) that God dispenses grace by his inscrutable whim, and that those who find themselves smiled upon by God must live, when confronted by human suffering, with the humbling knowledge that “there but for the grace of God go I.”
Yale exerts a force on Stover like that of a guilt-dream; from its “misty walls and the elm-tops confounded in the night, a monstrous hand seemed to stretch down, impending over him” until the windows were “transformed into myriad eyes, set on him in inquisition.” The eyes of Yale follow Stover relentlessly, demanding that he subordinate himself to an “idea of sacrifice and self-abnegation.” No doubt many, if not most, of Stover’s classmates flunked out of this idealized “school for character,” and saw the world instead as Yale’s popular turn-of-the-century professor, the Social Darwinist William Graham Sumner, saw it—as a dog-eat-dog contest in which the fit prevail and the weak can go to hell. Sumner’s best-known book bore the implicitly interrogative title What Social Classes Owe to Each Other, to which the implied answer was: not a damn thing. According to this view, if you get to Yale by means of Daddy’s money, you nevertheless deserve the benefits that accrue.9
Yet even if Sumner’s was the real Yale, and Stover’s Yale a myth, it was a good myth, and it helped to produce good men. In Geoffrey Kabaservice’s book The Guardians: Kingman Brewster, His Circle, and the Rise of the Liberal Establishment, we get an account of how some well-born sons of the old tribal Yale worked to open it up to the larger world as an act of public service. The transformation of Yale into an institution with a heightened sense of obligation beyond its traditional constituency took place under Brewster (Yale ’41, and president from 1963 to 1977). The same progressive spirit was carried beyond New Haven by a number of Brewster’s friends and fellow patricians, including the industrialist and philanthropist J. Irwin Miller (Yale ’31), a pioneer in integrating the corporate world, and Paul Moore (Yale ’41), Episcopal Bishop of New York, who turned the Cathedral of St. John the Divine from a high WASP enclave into an ecumenical institution serving the Harlem community. These men—along with near contemporaries such as Cyrus Vance (’39), Sargent Shriver (’38), and John Lindsay (’44)—recognized their own privilege as blind luck at best or injustice at worst. The interesting paradox of Kabaservice’s book, which is both stirring and a little hagiographical, is that narrow old Yale was destroyed by large-minded old Yalies.
Is there any trace left of noblesse oblige indignation in the post–Old Boy university? If not, is there a way to rekindle it? From his Harvard pulpit, Peter Gomes answers no and no. In his essay “Affirmation and Adaptation: Values and the Elite Residential College,” he seems to say that universities have refused once and for all any responsibility for the moral education of undergraduates, but that perhaps it is not too late (it may even be timely) for small colleges to do something about it. “Taking up once again the cause of moral education,” Gomes writes, in the voice of a business consultant hired to help a small-cap company find its niche, “…might be a key strategic move in establishing for the residential elites a unique and marketable identity in contemporary American higher education.”
Gomes is doubtless right that the time is gone for restoring to Cambridge or New Haven anything like Stover’s world—which, on balance, is well lost. It was generally a world where gentlemen-in-training were expected to honor what the sociologist John Cuddihy called the “Protestant aesthetic”—the “inconsequential attitude” that one finds in Stover at Yale, “outwardly indifferent, as all Anglo-Saxons should” be. At Princeton, according to Fitzgerald, the ideal type was “the non-committal man”—a behavioral norm that only thinly cloaked an underlying bigotry and, in particular, anti-Semitism.
In his splendid book Liberal Education and the Public Interest, James Freedman, president of Dartmouth from 1987 to 1998, quotes Judge Learned Hand’s response to Harvard’s president Abbott Lawrence Lowell when Lowell proposed a Jewish quota in the 1920s in order to keep Harvard from being vulgarized by the influx of intellectually aggressive Jewish students. To Lowell’s suggestion that “a personal estimate of character” become part of the admissions process, Hand replied,
If anyone could devise an honest test for character, perhaps it would serve well. I doubt its feasibility except to detect formal and obvious delinquencies. Short of it, it seems to me that students can only be chosen by tests of scholarship, unsatisfactory as those no doubt are….
Hand concluded that there are only two choices for institutions like Harvard: “a college may gather together men of a common tradition, or it may put its faith in learning.”
Since there is no returning to the age of “common tradition”—whatever that might mean today—a college must, more than ever, put its “faith in learning.” Yet even the most boosterish accounts of contemporary academic life suggest a certain slackness of commitment to the kind of learning that Freedman calls “redemptive” and to which Judge Hand was willing to entrust his faith.
4.
There are, it should be said, some good signs. At Harvard, the first serious review of undergraduate education in years has gotten underway, and it would be a shame if the current conflict between president and faculty should be allowed to deflect their attention from that most important undertaking. At Columbia, the liveliest faculty meeting I have attended was devoted recently to discussion of an experimental new Core course designed by a group of faculty scientists, including a Nobel Prize winner, in order to develop in undergraduate students a degree of scientific literacy, and to convey to them how much the human future depends on how the power of modern science is deployed.
As similar curricular reviews go forward elsewhere, it may be helpful to acknowledge at least one sense in which students in our age of putative freedom have less freedom than ever before. Today’s students are pressured and programmed, trained to live from task to task, relentlessly rehearsed and tested until winners are culled from the rest. When Stover says, “I’m going to do the best thing a fellow can do at our age, I’m going to loaf,” he speaks from an immemorial past. Loafing has always been the luxury of the idle rich. But there is a more dignified sense in which it was also the colloquial equivalent of what Cardinal Newman meant when he described the true university as a place where “liberal knowledge… refuses to be informed (as it is called) by any end,” so that it may “present itself to our contemplation.” College is still a main protector of the waning possibility of contemplation in American life, and an American college is only true to itself when it opens its doors to everyone with the capacity to pursue and embrace the precious chance to think and reflect before life engulfs them.
Such a college, Jefferson wrote, must ensure
that those persons, whom nature hath endowed with genius and virtue, should be rendered by liberal education worthy to receive, and able to guard the sacred deposit of the rights and liberties of their fellow citizens, and that they should be called to that charge without regard to wealth, birth or other accidental condition or circumstance.
This idea of public trust, which motivated Jefferson to found the University of Virginia, applies no less to private than to public institutions; and the record of both in serving students “without regard to wealth, birth, or other accidental condition” is one of the impressive achievements of American culture—though that record has lately been imperiled.
Unfortunately, recent institutional commitment to liberal education has been less than impressive. As the sociologist E. Digby Baltzell wrote forty years ago in a Jeffersonian spirit, a true aristocracy is “based on the conviction that it is culture and not genes…which make an aristocrat.” Baltzell’s double premise was that a consensus exists about what constitutes culture, and that democratic society needs cultured “aristocrats”—in the Jeffersonian sense of the word—to save it from demagoguery and degradation. “America’s continuing authority in the world,” Baltzell believed, “depends on our ability to solve the problem of authority here at home.” In other words, we need cultivated citizens with a sense of civic responsibility—and we need them more than ever.
The challenge that lies behind these several new books about higher education has only gotten more exigent since Baltzell wrote: in our “postmodern” moment, we no longer have any consensus about what culture is or should be, yet the need for cultured authority has become more urgent. Perhaps the most remarkable sentence in all of these books—as remarkable for the fact that it appears in a footnote as for the fact that it is patently true—occurs in Bok’s Universities in the Marketplace. “Arts and Sciences faculties,” Bok tells us,
currently display scant interest in preparing undergraduates to be democratic citizens, a task once regarded as the principal purpose of a liberal education and one urgently needed at this moment in the United States.
It is not hard to imagine what other purposes the faculties have in mind. Preparation for well-paying jobs seems high among them, even if that goal is not explicitly stated. So does introduction to various academic disciplines as they are conceived by the guilds in charge of them. Any larger sense of purpose seems absent, and there are few signs that anyone is concerned about it.
The problem, of course, is easier to identify than to solve—though greater candor from academic leaders about the state of undergraduate education would be a start. Arts and Sciences faculties might take a cue from their counterparts in medicine and law, who have tried in recent years to address their own hyperprofessionalism. Wary of turning out specialists with too little sense of the human needs of their patients, medical schools have placed a new emphasis in the curriculum on primary-care medicine, while leading law schools have instituted a “pro bono” requirement that students perform public service such as working in a legal aid clinic or helping to defend indigent clients.
Prescribing morality is not the point.10 But deans could provide incentives for departments to hire faculty with a proven commitment to undergraduate teaching—teachers in philosophy departments who find excitement in engaging young people in debate over questions of justice, or in history departments who believe that thinking about the past is at least a partial antidote to smugness about the present, or in English departments who believe that literature can deepen the sympathetic imagination.
Department chairs could ensure that the graduate teaching assistants on whom the university depends have their classroom work guided and encouraged by teaching mentors, just as their scholarly work is evaluated by dissertation readers. And all Ph.D. candidates, who are, after all, future professors, could be asked to learn something about the history of higher education, so that they will not confuse the way things are today with the way they have always been, or must be.
These are modest proposals. Real change will come when academic leaders insist, even if it means requiring courses with no obvious professional utility or consumer popularity, that the BA degree reflect some awareness of how the ideas and institutions of modern liberal society came into being and of the ways they are under challenge. “One of a college president’s greatest opportunities,” writes James Freedman, “is to elevate the sights of undergraduates.” If the opportunity is to be seized, why not now?
1. Paul Buck et al., General Education in a Free Society (Harvard University Press, 1945), pp. 81, 93.
2. “College: The End of the Golden Age,” The New York Review, October 18, 2001.
3. “Is Diversity a Value in American Higher Education?” Florida Law Review, December 2000, pp. 40, 46.
4. There are some acerbic comments on this subject in a forthcoming book by Ross Gregory Douthat, Privilege: Harvard and the Education of the Ruling Class (Hyperion, 2005). The plush house and college residential halls constructed by Harvard and Yale in the 1920s and 1930s—complete with paneled dining halls, libraries, and common rooms—are lavish by most dormitory standards, but in fact they were conceived as a way to democratize student life. Before their construction, wealthy students lived in “gold coast” apartments while poor students boarded in or near slums.
5. Questions are also being raised about exactly what kind of diversity has been achieved in today’s top colleges. William Bowen’s call for preferential admissions for needy students is one challenge to current practice. Another comes from Henry Louis Gates Jr., who recently remarked, “I just want people to be honest enough to talk about it,” in calling attention to the fact that many students of color are not of African-American descent, but are West Indian or African immigrants or their children. As his Harvard colleague Mary C. Waters, chair of the sociology department, put it, “If it’s about getting black faces at Harvard, then you’re doing fine. If it’s about making up for 200 to 500 years of slavery in this country and its aftermath, then you’re not doing well.” Sara Rimer and Karen W. Arenson, “Top Colleges Take More Blacks, but Which Ones?” The New York Times, June 24, 2004.
6. Suzy Hansen, “Dear Plagiarists: You Get What You Pay For,” The New York Times Book Review, August 22, 2004, p. 11. The incidence of cheating is, of course, hard to measure, but one authority on the subject, Donald McCabe of Rutgers University, finds that the number of students reporting “cut and paste” plagiarism using Internet sources quadrupled between 1999 and 2001. McCabe also describes a sharp rise over the last four decades in the number of students reporting “unpermitted collaboration” (academicintegrity.org/cai_research.asp). Drawing on McCabe’s research, David Callahan, The Cheating Culture: Why More Americans Are Doing Wrong to Get Ahead (Harcourt, 2004), p. 217, estimates that serious cheating in college increased by 30 to 35 percent during the 1990s. On the golf cart vandalism at Princeton, a letter from concerned students describing the problem as “endemic” was posted last year at www.princeton.edu/usg/vandal.html. For the shoplifting story, see The Times of Trenton, May 23, 2004. There is no reason to assume that shoplifting is a Princeton specialty. The MIT Coop has a big problem too.
7. Henry Rosovsky, “Annual Report of the Dean of the Faculty of Arts and Sciences, 1990–1991,” Policy Perspectives, Vol. 4, No. 3 (September 1992), pp. 1b–2b.
8. Along with the book by Bok, these works include David L. Kirp, Shakespeare, Einstein, and the Bottom Line: The Marketing of Higher Education (Harvard University Press, 2003); Eric Gould, The University in a Corporate Culture (Yale University Press, 2003); Christopher Newfield, Ivy and Industry: Business and the Making of the American University, 1880–1980 (Duke University Press, 2003); Richard Ohmann, Politics of Knowledge: The Commercialization of the University, the Professions, and Print Culture (Wesleyan University Press, 2003); Jennifer Washburn, University Inc.: The Corporate Corruption of Higher Education (Basic Books, 2005); and James Engell and Anthony Dangerfield, Saving Higher Education in the Age of Money, forthcoming from University of Virginia Press.
9. Recently, President George W. Bush, a member of the Yale class of 1968 (to which he was admitted with the strikingly low verbal SAT score of 566), joined the discussion by saying that the only criterion in college admissions should be individual “merit.” Bush’s father was graduated from Yale in 1948, and his grandfather with the class of 1917, but the President assured a reporter at a conference of minority journalists that he is as much opposed to giving advantage to alumni children as to members of minority groups (reported by CNN, August 6, 2004).
10. See Dave Eggers, “Serve or Fail,” Op-Ed piece, The New York Times, June 13, 2004, for a case that all colleges and universities should institute a graduation requirement of public service. Such a general requirement runs counter to the spirit of voluntarism, but at some institutions there are elective courses that have a public service component directly related to their subject matter—a practice that merits wider adoption.