Seeking Asylum in the World: 1980-1993

Part Seven of the essay Professor Without Portfolio.

I was fully inured to my Kenyon professoriate (but increasingly devitalized by it) when I piled Ellie and our two teenage daughters (Sam was in his second year at Dartmouth) into the Volvo on an early summer morning in 1980 and headed for Chicago’s Gold Coast. A year’s service as chair of Kenyon’s Department of Political Science had convinced me to take at least that much time away from the College and our hilltop home in Gambier, Ohio. The opportunity to do so was supplied by a fellowship to lead a research seminar for undergraduates at the Newberry Library on Chicago’s Near North Side on “The Idea of Privacy in the Western Tradition”.


I had then been 13 years at Kenyon, had reached the rank of Professor, and—until the experience of the chairship—supposed that I would retire, die, and be buried and feted in Gambier. The chairship introduced me to the utterly strange world of academic administration. Suddenly the colleagues with whom I had designed and taught courses, criticized student performance, and negotiated tenure and promotions became my “patients”, each one suffering a distinctive and surprisingly advanced “illness”—egocentrism, rejection of institutional duty, demands for personal accommodations in compensation and perquisites that were not only inequitable but violations of collegiate policy. Their requests of me, their new chair, were very discreet and for “favors” that they knew were inequitable and unjust. Their discretion in making such requests extended to the public denial of their critical judgments of colleagues in tenure and promotion cases, thus leaving the confusing impression that negative recommendations carried the unanimous endorsement of the department. On three occasions, I stood alone before quizzical if not hostile students and colleagues to defend negative personnel decisions rendered by colleagues who had been uniformly positive in public.


In my haste to get a recess from the experience and the place, I encountered the severest imaginable criticism from my older daughter; she would graduate from The Francis Parker School in Lincoln Park, Chicago, instead of with her friends at “home”! She and I have survived the breach. She has not forgiven me the substitution of Francis Parker for the finishing year in Ohio—but we have each borne and forgiven more serious affronts from the other, both before and since, and we love each other all the more profoundly.


Although the seminar on Privacy turned out to be popular—it attracted 25 students from almost as many colleges across the Midwest—Chicago turned out to be more fascinating for me and for many of the students. At its end, I resigned my professorship at Kenyon, joined the First National Bank of Chicago as a trainee, and thus launched at mid-life a 12-year career in Commercial Banking and Corporate Finance.


I’m still not entirely sure why I did that. Chicago’s bluff urbanity in the place of Gambier’s fecund rumour-mill and precious rusticity had something to do with it. A year’s fresh experience anywhere would have dulled the prospect for me of returning to the college and my departmental responsibilities, but the considerable charms of “that toddlin’ town” entirely obliterated it.


A yet larger factor was the malaise that swept into the college on the heels of the American withdrawal from Vietnam. The anti-war protest had enlivened the civic and intellectual life of the campus just as I began my teaching there. As the nearest thing to a local expert on the war and the Sino-Soviet relationship, I found in the protest a platform on which to establish my professorial standing in Kenyon’s “highly selective” enclave of privately educated sons of east coast professionals. It diminished my “class” fear of both students and faculty, and gave me enough self-confidence eventually to notice that many of both were far less sure of themselves than they were anxious to seem. This was a critical discovery; it taught me something about myself that I might not have seen until much later in life.


My position on Vietnam—that American participation in the conflict amounted to “containment” and was in our national interest—was profoundly unpopular on campus and in academia. But the deliberative, even disputatious, culture at Kenyon in those days actually generated authority for those who espoused unpopular positions, so long as we did so with manifest care for the truth and civility.


Particularly at Kenyon, the discussion of the war occurred principally in the classroom rather than out on the quad, and the “teach-ins” were distinctly fair and respectful. It was the students who led the resistance to blockades, shout-downs, and building and office occupations, most of which were suggested by certain members of the faculty and staff who, ironically, regretted the braking effect of deliberation on precipitous action. In fact, an address by the student president to an assembly called to close the college after the infamous Kent State incident resulted in its staying open, almost alone among Ohio’s institutions of higher learning. He turned the faculty away from closing down by arguing that we had sought refuge in Gambier from the world that included the Kent State violence in order to find deliberative countermeasures to it. To give up the enterprise when it was most needed would be irresponsible foolishness. (I used the same argument from Augsburg’s chapel pulpit 30 years later, where I was preaching just as the events of Nine Eleven were breaking—supplemented by a newer personal awareness that actual evil coexisted in the world and there was no reason to be surprised by evidence of it.)


By suddenly substituting private for public affairs, and personal complaint for ringing rhetoric, in the aftermath of the war, the malaise took much of the civic life out of the place. Grade inflation set in, perhaps to help “our” graduates gain admission to the “best” law, medical and business schools. Graduation-credit equivalencies were suddenly easy to come by and were supplemented by an expansive list of “advanced placement” and off-campus study programs, perhaps to raise enrollment by making college more easily accessible. And these liberalities were accompanied by a major infusion of “soft” drugs, mostly in honor of Pleasure, newly enshrined as a legitimate objective of education, or to medicate a lengthening list of study-stopping or distracting psychic ailments. Those of us who had become legendary as “demanding” teachers were forced to lighten our assignments to retain adequate enrollments in our “elective” courses. Suddenly, there were few fora (and even those were ill-attended) on the vexing moral and ethical questions that had beckoned many of us to the academy in the first place. And there were no longer any acceptable ways to acknowledge true excellence; everything contributed to the transfiguration of “C’s” into “A’s” and true “A’s” into new “A’s”.


A simultaneous oversupply of employable Ph.D.’s deepened the malaise by suddenly lengthening the road to tenure and increasing the risk of being denied it by some bureaucratic bobble or vengeful colleague. This weakened collegiality and shifted institutional priorities to tenure and promotion processes and away from personal and professional achievements. Careers suddenly stopped progressing at home, and opportunities for fresh starts at other colleges or in related industries dried up. And those of us who had squeezed through the gates before the crush felt accidentally rather than deservedly privileged by good timing.


I would have fled half that hopelessness in any other industry. But those of us who professed the study of the human condition in and between such agglomerations as Polities, Tribes, Religions, Nation-States, and Cultures felt “stuck”. We resisted—unsuccessfully—the recasting of our Fields of Study as “Social Sciences”. Our fascination with Ideas and our skepticism of Data meant that we were lightly published, if at all, in the new spate of “refereed journals”; our scholarship was argumentative and interpretive rather than a matter of research reports and methodological essays. We were not as sure as our counterparts that mankind was either destined to progress morally and ethically or actually doing so.


Deep down, even the most ebullient of us suspected that because we had acquired our knowledge bookishly rather than experientially, and had certified its relevance for right living by reflection and discourse rather than by “practice”, we could not “do” and were left with only the ability to teach. (I traded that conviction after I left the classroom for one closer to the truth, namely, that such practical arts as banking and corporate finance serve private interest immediately, and the best academics begin, instead, with public things. Some of this insight was borne upon me by the conversations I had for years after I resigned my tenure and rank—with corporate figures who dreamt of professing their trade instead of practicing it, and academicians who dreamt of practicing a trade instead of professing it. One set of my interlocutors wanted to be better off economically, and the other morally. Very few, indeed, suspected that in our culture either dream excludes the other.)



The truth is that the last half of my 15-year tenure at Kenyon eroded the promise with which it had begun. I saw (invented?) that promise at my first Kenyon commencement. A legendary professor of history was feted that day upon his retirement. Decked out in his academic robes and slightly stooped, he stood before the College to receive its collective gratitude and an honorary Kenyon degree conferred by the Board of Trustees.  The whole place came to its feet in a spontaneous expression of institutional, even communal, self-congratulation. I saw myself moving down the enchanted paths I imagined he had trod, establishing a storied reputation as citizen-hero and transformational teacher, receiving the visits of grateful alumni at every Homecoming in the living room of my farm-ette—and eventually around my headstone in the local cemetery.


Even while enduring the persistent exploratory probes of Kenyon’s privileged undergraduate ruffians, the critical elements of this image took salutary shape. I surprised those early challengers by confessing and firmly defending my convictions concerning the war in Vietnam, and softened them, ironically, with the blunt candor of my resistance to the “new” morality that substituted drugs for reflection as the route to understanding. Most of all, I represented to them a genuine fascination with ideas and celebrated the insights that our classroom interaction quite frequently generated. Even the most cynical of them began to see that drugs armed the mind with a miasma that obscured its image of both God and man.


In the end, I won their respect by the only form of teaching my fear of them would permit: I listened to them carefully and long enough to confront them, eventually, on each of our disagreements. Many of the most challenging and thoughtful of them took to coming over to the house to continue a discussion that had begun in the classroom—and then invited me into the fraternities to get me on their ground (and to return the hospitality).


(The test that I nearly failed was of my chaperoning skills at a moveable dance early in my second year. The daughter of a colleague of mine joined a coterie of volunteer dance companions imported for such occasions from a women’s college in northern Ohio and came into my particular zone of responsibility draped drunkenly over the strong, willing and wobbly body of a manly senior. All evening, I had been asked to share a “smoke” with students by providing a light for cigarettes that, in many cases, resembled those my father had made so brilliantly with one hand while guiding a team of horses in some task with the other. The students’ creations that evening were less refined, and the “tobacco” smelt strange—and when the colleague’s daughter came through in such imminent danger of giving up her favors on the dance floor itself, I realized the conspiracy to which I (and perhaps she) was subject. I found the president of the fraternity among the audience loudly anticipating the fast-approaching catastrophe and compelled him to agree that our watches, as well as the wall clock, had somehow run about 15 minutes slow. With a loud and authoritative shout, I ended both the event and the spectacle. I learned the smell of “weed” that night and never again went near the dormitories on dance weekends.)


Having escaped that very close call by a widely admired deception, I saw my reputation for intelligence and prudence leap, and I emerged as something of a classroom and advising legend. But within less than a decade, the realities of my professorial life became caricatures of my original expectation. What I had originally assumed was mutual respect among professors turned out all too often to be back-biting envy or sharp-elbowed rivalry. The friendship for which I longed was possible, but only among the unranked and unelected. Many faculty seemed to suffer envy in the face of scholastic achievements by their colleagues. (A notice that I posted early in the 1970’s to the members of the department that a young colleague’s first book had been published by a major house elicited an anonymous “So what?” from one of them, and a “Who cares?” from another.) Sharp, personal repartee and quarrelsome public questioning greeted even departmentally sponsored guest lecturers, to say nothing of the presentations of Kenyon faculty. Whole departments showed up at public lectures either to cheer or to castigate the lecturer. The intellectual life of the college became a tournament rather than an inquiry, a rivalry rather than a deliberation.


So—the year off from my chairship of the department, and the temporary relief from the lonely burden of responsibility for its liberating mission in an academy less and less interested in (or capable of) Liberal Education, almost inevitably escalated into thoughts of resignation. Hence, at midlife I left the academy to pursue my academic calling more fully. I left the distracting trivialities of the ivory tower for what I hoped would be the gravities of the world. I abandoned colleagues in the hope of finding friends.

Developing a Self to Contend with the World

Part Six of the essay Professor Without Portfolio.

During the frequent station stops along the way from Ohio State through Hawaii to Kenyon, I learned that finding a job is a lot easier—and far less satisfying—than finding one’s place in the world. In fact, I now think that one’s place in the world does not even exist until one’s Self has developed enough perspicacity to recognize it. Known as “discernment” in the parlance of Vocation, such perspicacity seems to sharpen in deep reflection upon the events of one’s life—as though reflection were the stone, discernment the knife, and life the foot upon the pedal or the hand upon the wheel. And the life force rises and falls, respectively, during episodes of good or ill “fit”, good or regrettable work. (After the farm, my best fit (and worst work?) was practicing Journalism at Ohio State—and it entailed a distinctly maturing dose of suffering. Until the Augsburg presidency, my best “good work” (and least comfortable fit?) was professing Political Philosophy at Kenyon—and it revealed the dependency of good teaching on episodic learning, which disorients you just as you are becoming comfortable with the last course correction. Although good work and good fit came closer together for me at Augsburg than at any other time in my life so far, my best “good work” has never—and will never—escape the strictures of duty.)


Although the kind of dialectic under discussion here may seem a tautology—or at least a paradox—it is neither. Self takes shape through interaction with the world, and its role in subsequent such interactions is made larger thereby. Eventually, the Self becomes strong enough—independent enough—to actually contend with the world, even-Steven (but never better than that while we are in the world).


My particular Self certainly lacked the capacity to contend with the world—especially the world of the conspiracy cabals of the late ‘50’s at Ohio State and, in the ‘60’s and ever since, of the exclusive partisan clubs among faculty at every college and university in the land—until I was well into my professoriate at Kenyon. The thing that effected its eventual invigoration was the recognition that the dialectic of world and self couldn’t advance vocationally without a third element—a teleological element made accessible by education and close acquaintanceship with admirable persons who could draw the Self upward (rather than relying on lower forces to propel it in that direction from below). The Ancients called that element “the good” and thought it rooted in Nature; many of the “moderns” who acknowledge it think it a product of wishful thinking and believe it rooted, if anywhere, in History. Some very few call it “destiny” and think it a gift of the Divine. Whatever it is and whencesoever it comes, it leavens the dialectic and enables it to rise—morally and consciously.


Only a couple of years earlier, with the confusion and self-consciousness of adolescence hard upon me, I encountered a sudden, stultifying fear for which this dialectic offered a balm. Triggered perhaps by the failure of my first venture in romance (she was Catholic, and both of us “knew” that any such inter-sectarian relationship had no future whatsoever), the fear was of the prospect of being profoundly and permanently alone. I reacted ambiguously—by casting about for ways to ingratiate myself with any nearby and popular group, on the one hand, and by seeking a radical independence of all groups, on the other (i.e., solitude in the place of alone-ness). In the long run, the stronger of these was the latter—because it entailed freedom, both “from” the prevailing biases of the groups, and “to” a life chosen by my lights and “belonging” to me (as one’s place in the world should seem). Of the two, I preferred independence. My experience suggested it as “realistic”; even when he was around, my father’s disappointment in me constituted an abandonment. His death was the ultimate abandonment—and it sent me rather more in search of an independent (rather than servile) inclusion, and a proud (rather than lonely) solitude.


The first “independent inclusion” I sought was of my peers. By that time, I had become acceptable to my mother and the portion of polite life in that part of the country that was governed by females, and secondarily by my father and his farmer friends. I was heir apparent to the 160-acre Appalachian farm he had acquired at the very end of the war. As the inheritor of his bass voice—and blessed by my mother’s subjection of me to piano and choral lessons—I was regularly asked to read Biblical texts in church or sing solo there or at the Grange, Farm Bureau, and other farmer meetings in the region.


But among my contemporaries, my standing was shaky.  I was too skinny—even with sharp elbows I couldn’t block out rival rebounders; too Eastern—older of two offspring of a college-educated Philadelphia mother who became a demanding teacher in the Southeastern Ohio school district she helped found; too musical and literary—no locker-room tales of romantic dalliances, too few of risky adventures with comrades, and nowhere near enough badinage skills to qualify as the kind of raconteur that was admired in those parts in those times.


Obtaining full acceptance by the trend setters among my contemporaries without becoming the group wise-guy or comedian was best done by handsome fellows with first-team varsity ability in sport.  I was a good pitcher of softball, but that was after-school church stuff—not enough physical combat to rival the standing of football. (I came close to measuring up when I inadvertently fractured a batter’s ankle with a fastball.)


So—I chose “thinking” as my strategy. I had learned something of the art from my mother. She made good use of it in constructing the speeches that her education, accommodating disposition, civility and clever turns-of-phrase qualified her to make to school, church and farm audiences all over the region. Her father had been a Methodist minister known best for his graceful homilies. My father’s curse-laced speech and sharp wit were widely admired among his compatriots. So, thinking and fluent rhetoric had a good reputation in the family. Hence, I have always equated them, and have become a devotee of forensic rhetoric—of clear, fascinating and instructive expression in the classroom, from the podium, and on paper. (And a preference, in private, for “rich” speech, though free of obscenities.)


At the time, thinking seemed a distinctive strategy. It was non-confrontational. It didn’t require an assertion of “rights” or of what was “due to” me. It needed but a certain persistence, a constantly furrowed brow, and a self-effacing interest in the remarks of interlocutors who thought themselves particularly smart or learned. It played out in rhetorical gambits with which I could acceptably interrupt and enter the conversation of the moment: “But what about…?” Or “Don’t you think…?” Or “Didn’t I just read…?”


Although I didn’t realize it until greater maturity came to me, thinking had the added advantage of going straight to ideas. It didn’t pause over feelings. Whether the idea of the moment was “good” or “bad” didn’t matter half as much as whether it was “sound”. Thinking allowed for conversation with charlatans as well as angels, with people I secretly wanted to punch—or embrace. It allowed for a relational life outwardly devoid of love or alienation.


Although the strategy didn’t work so well for my immediate purposes—it didn’t lift me directly into the local “in” crowd—it did foster a peculiar form of reflectiveness useful for breaking free of the tyranny of popular opinion. (The other day I found notes from a week’s worth of “all-nighters” in my 19th year, devoted to electing “qualified” judges—one or two only, good and knowing friends who approved of me and my character—and to denying the power to flay my tender soul with a sharp word or obscene gesture to casual critics: store clerks to whom I had given too little or too much cash for my purchase, or drivers irritated by the abruptness or hesitation of my left turns.)


Beyond that election of those possessing the exclusive right to judge me, perhaps the earliest and most elementary realization of my search for independence and a place of my own was that writing is less a vocation than a modus operandi. That turns out to be true, also, of academic fields-of-study. Political Philosophy, for example, is a profession—defined by a code of conduct among practitioners rather than by their dedication to justice, civility or neighborliness. Only when one consciously chooses the aim of one’s political philosophy, as well as a realistic strategy for effecting it, does one have a love-thy-neighbor vocation—whether political, non-political or anti-political.