Category Archives: Development of Western Civilization

The Weight of this Sad Time

It’s not often that I’m at a loss for words, but I am this morning. This post from several months ago sort of captures my mood at the moment; as Wittgenstein wrote, “Whereof one cannot speak, thereof one must be silent.”

Speak what we feel, not what we ought to say. Shakespeare, King Lear

Last week I spent a couple of mornings and part of an afternoon participating in a faculty end-of-the-year workshop held annually for the honors faculty. It is always held the week after Commencement; with sabbatical just around the corner, I considered not attending this year. But the two morning seminars were on King Lear and Dietrich Bonhoeffer’s The Cost of Discipleship, two worthy texts that should be on everyone’s top whatever list. That, along with a reasonable stipend, was enough for me to sign up.

The King Lear seminar, led by a Shakespeare scholar from the English department, was a welcome return to a text that I find both strikingly dark and strangely compelling every time I read it. I love Shakespeare and find his plays more insightful about human nature and the human condition than any other author’s work (certainly more insightful than any philosopher I have read), but I had not read this particular tragedy for a couple of years. As it always does, the play blew me away, disturbed me, and left me wondering whether my colleagues might find some glimmers of hope and redemption that have always escaped me.

King Lear pushes to the limit a hypothesis that has a long and complicated pedigree: we live in a universe that is malign, or at the very least indifferent, and human life within this universe is brutal, wretched, and meaningless. Furthermore, Shakespeare sets the play in an early England that has not yet been “Christianized”—typical and familiar moralizing and redemptive language is as out of place here as it would have been in ancient Greece or Rome. As various nasty and morally awful characters—including Lear’s two older daughters—apparently prosper from their rejection of their father, those characters with even a shred of dignity, honor, or love—including Lear’s youngest daughter—are rejected and ultimately destroyed. By the end of the play, the stage is littered with the bodies of both the good and the bad, while a handful of dazed survivors are left to pick up the pieces. Naked in a driving storm in the middle of a heath, Lear rages that human beings are nothing but “poor, bare, forked animals,” living on a “great stage of fools.” Lear demands an answer to the question “Is man no more than this?” The blinded Gloucester despairingly directs his accusations heavenward:

As flies to wanton boys are we to th’gods;

They kill us for their sport.

My colleagues and I ended two morning hours of seminar and another afternoon hour by viewing the final act of the play on screen, in the 2008 version starring Ian McKellen as Lear. It is a stark production, with Beckett-like sparse staging toward the end. As character after character dies—Lear’s three daughters, the evil Edmund, and ultimately Lear himself—and the stage is littered with corpses, the play ends with Edgar’s final lines:

The weight of this sad time we must obey,

Speak what we feel, not what we ought to say.

Fade to black. The seminar leader asked us for our feelings, our impressions of what we had just viewed, and for the first time in thirty years in academia I heard something entirely new in the presence of twenty scholars: total silence. In obedience to Edgar’s directive, no one felt obligated to say anything that “should” be said; at least for a minute or two, we were not professors ready to discuss the next topic to death, but human beings stunned into silence by Shakespeare’s brilliant and disabling portrayal of a meaningless and hopeless world.

I was reminded of one of the final classes in my “Grace, Truth, and Freedom in the Nazi Era” colloquium this past semester. My colleague Ray was up front and ended the two-hour session with footage from the liberation of Auschwitz. The students were exposed to a variety of tough material, both in writing and on the screen, throughout the semester, but this particular footage was especially difficult to watch. There was no narration, no voice-over, just hundreds of emaciated dead bodies stacked like so much wood, rooms filled to the ceiling with eyeglasses, hair, or shoes. Bulldozers pushing piles of bodies into a pit for burial just as they would push garbage into a pit at a landfill. And perhaps most horrific of all were the close-ups of the faces of the just-liberated prisoners who were still barely alive. The haunting and empty gazes still float through my memory and probably will never leave. At the end of the several-minute montage there was dead silence in the room. Ray wisely made no comment and simply turned off the computer and AV system, then began gathering his books and notes. This was the cue for the rest of us to do the same, and we left the room in silence.

This would probably have been the appropriate conclusion to the King Lear seminar the other day as well. But after what seemed like a very long silence, someone made a comment, then someone else followed up, and pretty soon we were doing what academics do in every context and setting—talking. Several people referenced the silence that preceded the talking and began to analyze what it was about both the play and the film adaptation that caused us not to say anything. But with Edgar’s final lines in mind, our first reaction was most in keeping with “Speak what we feel”—except that our feelings were, at least for a few moments, deeper than words could express. Once we started putting what we felt into words, it was very easy to shift into “what we ought to say,” and the powerful moment was lost.

Sometimes the best thing to say is nothing. And, as Irving Greenberg writes in Cloud of Smoke, Pillar of Fire, if we feel that something must be said, we need to be very careful about what it is.

The Holocaust challenges the claims of all the standards that compete for modern man’s loyalties. Nor does it give simple, clear answers or definitive solutions. To claim that it does is not to take the burning children seriously…Let us offer, then, as a working principle the following: No statement, theological or otherwise, should be made that would not be credible in the presence of the burning children.

Living Without God

God would have us know that we must live as people who manage our lives without God. Dietrich Bonhoeffer

Perhaps it is a feature of teaching at a Catholic college, but I am frequently surprised by how many of my students are convinced that the only basis for being moral is belief in a God who will hold each of us responsible after we die for what we have done during this life. I am familiar with this attitude—I was raised with the Protestant version and believed that the primary reason to be a Christian is to gain an eternal fire-insurance policy. But anyone old enough to be a freshman or sophomore in college has undoubtedly encountered people who do not profess any sort of religious conviction and yet apparently have managed to develop working moral frameworks. When I ask my students whether it would be possible for an atheist to be moral, just about all of them admit that such a thing is possible—they just don’t know how. So I find myself faced with a continuing task each semester—exploring with my students the strange phenomenon of living a life of moral commitment and excellence without God. Or at least without the God they have in mind.

Next semester I will be team-teaching a colloquium entitled “Love Never Fails: Grace, Truth, and Freedom in the Nazi Era” with a friend and colleague from the history department; it will be our third time teaching the course. In previous semesters, my students’ expectations and preconceptions concerning the connections between moral commitment and religious faith have been challenged on a regular basis. These challenges were most pressing during the weeks that we studied Dietrich Bonhoeffer, the Protestant pastor and theologian who ultimately found himself in prison awaiting execution because of his involvement in an attempt to assassinate Adolf Hitler. In letters to his friend Eberhard Bethge, Bonhoeffer described the many ways in which his understanding of Christian commitment and action was changing. Lurking behind his ideas was one big question—where is God in all of this? In a letter written months before his death, he wrote

So our coming of age leads us to a true recognition of our situation before God. God would have us know that we must live as people who manage our lives without God. The God who is with us is the God who forsakes us. The God who lets us live in the world without the working hypothesis of God is the God before whom we stand continually.

My students found this passage challenging, to say the least. In online discussions, several expressed their sadness that this pastor, who had been such a beacon of Christian hope and light during very dark times, lost his faith in his final days of life. I responded, tentatively, that Bonhoeffer had not lost his faith—but this was a very different sort of faith than my students were accustomed to.

Bonhoeffer’s striking statement reminds me of the predicament that Michael Meade, a character in Iris Murdoch’s novel The Bell, finds himself in. Michael has an intense desire for God and the transcendent, seeking at various times to become a priest and, when that fails, to create the lay religious community that is at the heart of the novel. Throughout his life, Michael has considered himself “called” to service to God and has sought for patterns and signs that confirm his “calling.” Unfortunately, as with most of us, these signs and patterns turn out to be idolatrous projections of his own self-centered hopes and dreams. When the lay religious community fails and several of the members come to tragic ruin, including a man’s suicide for which Michael considers himself at least partially responsible, Michael is understandably on the brink of despair and suicide himself. As he seeks in the midst of ruin, for the first time in his life, to look at himself and at God cleanly and without preconceptions, he comes to hard conclusions.

The pattern which he had seen in his life had existed only in his own romantic imagination. At the human level there was no pattern. “For as the heavens are higher than the earth, so are my ways higher than your ways, and my thoughts than your thoughts.” And as he felt, bitterly, the grimness of these words, he put it to himself: there is a God, but I do not believe in Him.

Michael has come for the first time in his life to see the need for “dying to self,” for removing himself from the center of the universe and giving up his insistence that the world “make sense.” God’s existence has not been threatened by the deconstruction of Michael’s hopes and dreams, but the “belief system,” the vocabulary through which he has defined and described God, has been destroyed. Michael’s God, in other words, has died.

At the end of the novel, Michael reflects and takes stock. Rather than fill the resulting vacuum with yet another projection of himself onto the transcendent, Michael chooses to let the vast gap between himself and the Other remain, at least for the present, in all its power and rawness. God has not died, but Michael’s conception of God has. And at least for now, this is a good thing. The rituals that were once consoling and uplifting remain as a reminder of his true situation.

No sharp sense of his own needs drove him to make supplication. He looked about him with the calmness of the ruined man. But what did, from his former life, remain to him was the Mass. . . . The Mass remained, not consoling, not uplifting, but in some way factual. It contained for him no assurance that all would be made well that was not well. It simply existed as a kind of pure reality separate from the weaving of his own thoughts. . . . Whoever celebrated it, the Mass existed and Michael existed beside it. He made no movement now, reached out no hand. He would have to be found and fetched or else he was beyond help.

Sad? Yes. Regrettable? Undoubtedly. But Michael has chosen to see if, for at least a period of time, he can refrain from creating the transcendent in his own image. Perhaps when he begins again, he’ll be more aware of the contingency of all transcendent language.

When Bonhoeffer writes that “the God who is with us is the God who forsakes us,” he is recognizing, as Michael Meade recognized, that all of our imaginings about what God must be and will do are human constructs guaranteed to disappoint and fail. Living in the world “without the working hypothesis of God,” embracing God’s existence without confining God to the limits of human belief, may seem to leave commitment to moral principles and behavior without a foundation. But this need not be the case. Magda Trocmé, one of the leaders of the rescue efforts in the little village of Le Chambon, where thousands of refugees, Jewish and otherwise, were successfully hidden from the Gestapo and Vichy police during the dark years of World War Two, is a case in point.

Magda’s husband, André, was the dynamic Protestant pastor in Le Chambon whose powerful and eloquent sermons inspired his congregation to live out their faith in real time in the face of prison- and life-threatening dangers. Magda had no patience for theological niceties and regularly scoffed at the notion that her astounding generosity and fearless hospitality made her a “saint” or even morally special. She just did what needed to be done and facilitated the efforts of others to do the same, addressing every human need within her power to address, no matter who the human in need happened to be. I have studied the Le Chambon phenomenon a great deal and have used the story of this remarkable village in class many times. But it was not until last summer, while reading a new study of the village, that I encountered Magda saying anything about God. In her unpublished memoirs, now in the archives at Swarthmore College, Magda provides her definition of God:

If there weren’t somewhere a source of hope, justice, truth, and love, we would not have rooted in us the hope of justice, truth, and love that we find in every religion and every degree of civilization. It’s that source that I call God.

And that, for Magda, was sufficient for her to be one of the most remarkable moral exemplars I have ever encountered. And, I would argue, it is a sufficient foundation for moral goodness. Who knew it could be that simple?

Sorry for the Inconvenience

Dear Dr. Morgan: I’m writing to let you know that I won’t be in class today at 11:30. Our lacrosse match on campus that was scheduled for yesterday was rescheduled for today at 3:30. Our pre-game prep starts at 12:00, so I won’t be able to make class. I know that I have already missed a couple of classes this semester [four, as a matter of fact], but I’m hoping this won’t be a big problem. My academic advisor’s email address is xxxxxxxxx@providence.edu if you have any questions. Sorry for the inconvenience.

Dear Dr. Morgan: I’m very sorry, but I won’t be able to make Friday morning seminar. I’m in a wedding on Sunday back home; I had a Friday afternoon flight home, but my mother changed it to Thursday afternoon because she was able to find a better fare on that day. I’ll contact you next week to see if there’s anything I need to make up. Sorry for the inconvenience.

My response to each of the above student emails that I received last week was something along the lines of “Dear Student: It is your responsibility to do whatever is necessary to account for missed classes (check the syllabus for the course policy on attendance)—you are also responsible for whatever we work on in the class that you miss. Your missing class is not an inconvenience to me at all—the inconvenience is entirely yours. Dr. Morgan.”

In student/teacher communication, “Sorry for the inconvenience” has become the “go-to” comment with which to close an email containing information that you don’t want to take responsibility for. The sender is saying “I hope that maybe a half-hearted apology for making your life difficult will cause you to be merciful, even though I know that you don’t have to and that I should have handled the situation differently.” On the level of effectiveness, the “sorry for the inconvenience” strategy ranks just slightly above the ostrich strategy, which requires pretending that the situation never even happened. Used more broadly, “sorry for the inconvenience” could mean “I know what I just did or failed to do messed your day (week, month, year, life) up. That doesn’t mean that I’m going to do anything about it or try to set things right—but I wanted you to know that I am aware of the inconvenience I just caused you.” Sort of like “I just wrecked your car—sorry for the inconvenience,” or “I am sleeping with your significant other—sorry for the inconvenience.”

Human beings do not like being inconvenienced. Although we might not admit it, we love “convenience stores” and have made them a ubiquitous part of the American landscape, simply because they are “convenient.” Early in the 2000s, shortly after the Supreme Court decided to appoint George W. Bush as the 43rd President of the United States, Al Gore wrote a book, and both starred in and produced a documentary, about the dangers of global warming, with the wonderful title An Inconvenient Truth. I have often wondered why millions of people worldwide, but particularly in this country, are so vehement either in their denial that global warming is real or in their insistence that if it is real, human beings are not responsible, given the mountains of evidence and data that prove its reality and our complicity. The title of Gore’s documentary and book directly answers such questions—people often go to extremes in their efforts to avoid anything that, if accepted as true, would force them to adjust their attitudes and actions in uncomfortable ways. I’m reminded of what Vera Brittain once said that teachers should never forget—learning is an uncomfortable process, and “above all, human beings desire to be comfortable.” Beyond that, they desire not to be inconvenienced.

Which is what makes a familiar gospel reading from Mark so problematic. In response to Peter’s insistence that Jesus is not going to go to Jerusalem to die, Jesus first puts Peter in his place in his own inimitable style, then issues this attractive invitation to his would-be disciples:

If any want to become my followers, let them deny themselves and take up their cross and follow me. For those who want to save their life will lose it, and those who lose their life for my sake, and for the sake of the gospel, will save it.

To which Jesus might have added, Sorry for the inconvenience. Because what Jesus is describing is more than an inconvenient truth. He’s warning his would-be followers then and now that, as Dietrich Bonhoeffer wrote, “When Christ calls a man, he bids him come and die.” That’s an inconvenient faith.

There is another story in Mark’s gospel that caught my attention in one of my first posts on this blog almost two and a half years ago. A young man (called a “certain ruler” in Luke’s version of the story) approaches Jesus and asks “What shall I do that I may inherit eternal life?” Jesus answers that the young man knows very well what to do—he should keep the commandments, listing a few for the guy just in case he had forgotten them. But the young man replies “Teacher, all these I have done from my youth.” He’s not looking for a “good boy” pat on the head from Jesus; he’s already past the point of thinking that simply following the rules is good enough, or he wouldn’t have asked in the first place. The young man is looking for more.

We all know Jesus’ response—he tells him the inconvenient truth. “Go your way, sell whatever you have and give to the poor, and you will have treasure in heaven; and come, take up the cross, and follow me.” We also all know the end of the story—“He was sad at this word, and went away grieved, for he had great possessions.” Jesus had inconvenienced the rich young man beyond his toleration level. But what precedes Jesus’ sharing of this inconvenient truth is very interesting. Mark says that “Jesus, looking at him, loved him.” This is a man who wants more, Jesus knows it, and Jesus loves him for it. But this is an inconvenient faith—the thing that you cannot do, that’s the thing that is required. And it will be something different for each of us. This story isn’t about the incompatibility of wealth and following Jesus at all. It’s a story about being called to come and die. The God of love is not a cure for anything. The God of love is the greatest dispenser of inconvenience. “I did not come to bring peace but a sword,” and this is a sword that cuts deepest in those who are the most obsessed with knowing God.

These gospels are “hard sayings” because they run roughshod over our desire that our dealings with what is greater than us be similar to a convenience store transaction. “What do I need to do in order for X to happen, in order for Y not to happen, in order for Z to get a break?” are the sorts of questions we so often want answered, but they are always the wrong sort of question when directed toward the transcendent. While on sabbatical several years ago I heard the poet Michael Dennis Browne speak of an insight that unexpectedly came to him as he mourned the tragic death of his younger sister, a woman for whom family and friends had gone hoarse with their prayers and petitions for healing. And she died anyway. What the hell is going on? Browne said “It came to me that this is not a God who intervenes, but one who indwells.” That changes everything. The inconvenience of trying to believe in a God who never calls, writes, or tweets is transformed into the challenge of being God in the world.

Silence and Submission

During the past two weeks, reports concerning the attitudes and actions of one of the major party candidates for President of the United States toward women over the past few decades have dominated the news cycle. The attitudes and actions of the husband of the other major party candidate for President toward women have been part of the news cycle for decades as well. It’s difficult to imagine that there is a person in this country who either finds such attitudes acceptable or wants to hear yet another person’s opinions about them—so I won’t dig further into the details. Instead, I’m interested in why so many people, from every political and religious persuasion imaginable, have been surprised by the offensive, demeaning, and degrading attitudes and actions that have been illuminated over the past two weeks. Misogyny and prejudice toward women have been part of our social structure for centuries—and one of the most powerful sources of these attitudes and actions is the dominant religion in our culture: Christianity.

In the team-taught, interdisciplinary course that I teach in, we recently completed a unit called “The Other,” focusing on how the ancient Greeks and Romans understood and treated those who were different. During one seminar we considered ancient views of gender, with two of Aristophanes’ comedies and an assortment of excerpts from other authors as our texts. Some were remarkably equitable, including Plato’s insistence that both males and females are equally capable of being rulers of his idealistic and imaginary perfect community, and hence should be educated in the same ways. Other ancient voices were not as complimentary toward women. From Aristotle, for instance, we learned that women are “deformed males” and that “as regards the sexes, the male is by nature superior and the female inferior, the male ruler and the female subject.” And in the Apostle Paul’s first letter to the Corinthians we read that

I wish you to know that the head of every man is Christ, and the head of every woman is her husband . . . a man is the image and glory of God, but a woman is the glory of her husband. For man was not created from woman, but woman from man. And man was not created for woman’s sake, but woman for the sake of man . . . In all the churches of the faithful, let women be silent in the congregation, for it is not appropriate for them to speak. If they want to learn something, they should ask their own husbands at home . . .

But wait . . . that’s not all. A couple of Sundays ago, one of the readings was this from Paul’s first letter to Timothy:

Let a woman learn in silence with full submission. I permit no woman to teach or to have authority over a man; she is to keep silent. For Adam was formed first, then Eve; and Adam was not deceived, but the woman was deceived and became a transgressor. Yet she will be saved through childbearing, provided they continue in faith and love and holiness, with modesty.

After the lector finished, I leaned over the back of the pew in front of me and whispered to the couple sitting there, “Wow, I guess Paul was having a bad day when he wrote that!” “No shit!” the guy whispered back. I’ve often wondered what the experiential and/or psychological sources of Paul’s obvious problems with women might have been—I’m still wondering. But whatever the sources were, such attitudes, fully resonant with the majority of the philosophies of his day, had a powerful influence going forward—an influence that afflicts Western culture to this day.

I found that many of the dozen-and-a-half eighteen-year-old freshmen in each of my seminars on ancient perspectives on gender assumed that the attitudes toward women they encountered in the readings they prepared for seminar are no longer with us. We moderns are, fortunately, respectful of all and treat everyone equally, no matter what gender or sexual orientation. If only. I wish. It didn’t take very long or much encouragement, however, for a few female voices to start providing plenty of evidence that we not only have not moved far from ancient attitudes on gender, but in many cases still hold very much the same ones.

During that seminar I asked the students to start thinking about the ways in which we use gender to organize social structures by asking them to identify a job description for which one’s gender is truly relevant. They had a difficult time coming up with one, despite our culture’s history of making gender relevant to decision-making in everything from wages to educational opportunities, until someone said “I know one—priest!” I pointed out, first, that one of my best friends is both a woman and an Episcopal priest, so clearly it is only priests of a certain sort (Catholic) who must be male. The rules and traditions of the Catholic church notwithstanding, however, none of my students were able to identify any specific thing a Catholic priest does that could not be done equally well by a qualified male or female.

Given that it is difficult to find anything in the actual reported teachings of Jesus to support either treating men and women differently or assuming that men are superior to women, it is truly remarkable to observe just how thoroughly such attitudes and actions became entrenched in the religion that grew out of Jesus’ teachings. There is plenty of evidence that many members of Jesus’ inner circle were women and that women were important leaders in the early Christian communities. But the documents containing such evidence did not make the cut when the New Testament was officially assembled, and such evidence was suppressed and ignored as a male-dominated ecclesiastical hierarchy emerged. After two millennia there are signs that biases against women are changing in some Christian circles, but a great deal of progress remains to be made.

When misogyny and Neanderthal attitudes toward women rear their ugly heads, as they have with a vengeance during the past couple of weeks in the context of the Presidential campaign, we should not be surprised. This is the natural outcome of centuries of history in Western culture, a history in which Christianity has been a central driving force. Christians are in no position to take the high road and respond to such ugliness with moralistic tut-tutting and judgments. The truth of the matter is that Christian churches of all sorts have contributed to the embedded misogyny and sexism that still infects our world in many ways. If Christians truly intend for God’s will to “be done on earth as it is in heaven,” as we recite in the Lord’s Prayer every week, it is incumbent on us to put our house in order before casting stones elsewhere. There is a great deal of work to be done.

October Musings

Autumn is my favorite season of the year, and October is my favorite month. This is not surprising for a native New Englander, since turning leaves together with crisp, sunny, cool days are an attractive combination. Even on this particular middle-of-October day as I write, when it is unseasonably warm and humid with a threat of heavy rain later, a few typically beautiful fall days in the past week and the promise of more to come keep me weather-happy. I know that autumn bums out many people who live where the seasons change, because it means that winter is coming. But I like winter as well, or at least the idea of it. The older I get, the less I enjoy the actual fact of shoveling snow on occasion and having to warm the car up every morning, but I’ll take it over the Florida summer humidity and heat that my son and daughter-in-law profess to love for some unknown reason.

October not only means my favorite kind of weather, but also puts me in a reminiscent mood. October was an important month during my growing-up years because both my mother and my brother were born in October (my mother on Halloween, which meant that we usually ignored her birthday in exchange for more interesting activities). My brother’s birthday is in a few days; I am several months older now than the age at which my mother died. She died of cancer in October, just three weeks short of her sixtieth birthday, followed unexpectedly a couple of weeks later by the death of my father-in-law of only a few months. That was twenty-eight years ago; amazingly, sometimes it seems more like twenty-eight weeks.

October is a centrally important month every year for both students and faculty on college campuses—the first big papers and often the first significant exam of the semester (or perhaps the midterm exam) are usually October events. For students this means even more stress than usual; for faculty it means that the first few weeks of the semester, pleasantly free of tons of grading, are now at an end. Faculty love to bitch and moan about grading—I used to be great at such complaining until Jeanne asked me once, many years ago, at the end of my latest grading whine-fest, “Isn’t that part of your job?” Well yes, I guess it is. It’s the one part of my job that I hope I don’t have to do in my next life (because I still intend to be a college professor—there’s nothing better). Now I tend to think of October grading as a great opportunity to learn new things from my students.

For instance, my colleague on an interdisciplinary faculty team informed me by email a few days ago that she had just read the following in one of her freshman papers: “As Mr. Morgan talked about in lecture, during this time and culture, obeying god was the priority of every man, even if that means sacrificing your own son, which happened a lot in olden times.” My colleague wrote “I guess I must have missed that lecture.” I responded that “Mr. Morgan is my evil twin who gives lectures on off days for students who don’t come to the regularly scheduled lectures. I take no responsibility for anything Mr. Morgan says.” In one of my own papers (from the same assignment that produced my colleague’s paper), one of my freshmen began as follows: “According to Google, happiness is defined as . . .” I’m glad that I’m old enough that I won’t have to fully adjust to the brave new educational world that is just around the bend.

October also often brings important speakers to campus. Doris Kearns Goodwin, a rock star in Jeanne’s and my estimation, speaks in ten days. I remember a couple of years ago when my friend and best-selling author Kathleen Norris was resident scholar on my campus and gave a late afternoon talk. At the beginning of Q and A, Kathleen mentioned how much she used to enjoy Q and A sessions with second-graders to whom she was bringing poetry in North and South Dakota classrooms many years ago. “How old are you?” “How much do you weigh?” “Do you have a cat?” “How much money do you make?” “Do you have a bicycle?” The next time I am in attendance at a scholarly paper event, those are the questions I’m going to ask. Because those are the things I really want to know.

Even though the liturgical year is still slogging through endless weeks of “Ordinary Time,” October always brings welcome entertainment. Last Sunday we celebrated Saint Francis Sunday with “Blessing of the Animals.” I went to the early show with Frieda, who along with five other dogs held center stage and generally behaved themselves.

For several years running I was lector for Saint Francis Sunday and read the story of Balaam and his donkey from Numbers. My friend Marsue, who was rector of our little Episcopal church for those years, made sure I was scheduled as lector for this event every year because I always brought Frieda to the lectern so she could stare people down while I was reading.

During October the weekly readings are still stuck in Ordinary Time, where we have been since Pentecost. This year the readings from the Jewish scriptures have wandered through various prophets yelling at whoever would listen about various shortcomings. Last year we were walked through the familiar and fascinating stories of the patriarchs in Genesis and the dramatic escape of the children of Israel from Egyptian bondage in Exodus. In Exodus 32 Moses is up on Mount Sinai hanging out while God writes the Ten Commandments and everyone else figures he’s never coming back. So they make the Golden Calf, start a minor orgy, and you know how that worked out. Moses is pissed; God is even more pissed. “Jesus Christ!” God yells (he forgot what part of the Bible he was in for a moment). “Moses, can you believe this shit?? I’ve had enough of these clowns! Stand back, Moses, while I wipe them all out. Then I’ll begin again with a new bunch of people starting with you, sort of like I did with Abraham in the previous book.” Moses points out that this would make God look bad, given that he put so much effort and creative thought—from plagues to parting a sea—into getting these people out of slavery, only to kill them in the desert. God’s response to Moses’ point is my favorite verse in the Jewish Scriptures, perhaps in the entire Bible: And the Lord changed His mind. The implications are unlimited.

October also provides me with a yearly opportunity to introduce a bunch of innocent freshmen to my choice for the greatest philosopher in the Western tradition: Aristotle. His vision of the moral life, of the life of human flourishing, is second to none. I came across a beautiful description of such a life not long ago in Ian McEwan’s The Children Act:

Welfare, happiness, well-being must embrace the philosophical concept of the good life. She listed some relevant ingredients, goals toward which a child might grow. Economic and moral freedom, virtue, compassion and altruism, satisfying work through engagement with demanding tasks, a flourishing network of personal relationships, earning the esteem of others, pursuing larger meanings to one’s existence, and having at the center of one’s life one or a small number of significant relations defined above all by love.

Autumn is a time when I feel, at least a little bit, that such a life might be possible. Thanks, October.

Two Plus Two Makes Four

In J. M. Coetzee’s strange and fascinating novel The Childhood of Jesus, the precocious child David has a difficult time understanding numbers. Oh, he knows their names, but he is not inclined to put them in the order that the rules of mathematics specify. Nor is he inclined to accept the rules guiding any accepted human behavior—he wants to live in a world in which things are valuable and right to the extent that he likes them, and he is not willing to arrange numbers in the proper sequence that everyone agrees upon. After one too many patient attempts to steer David toward conformity, his guardian Simón sputters: “The answer to all your Why? questions, past, present and future is: Because that is the way the world is. The world was not made for our convenience, my young friend. It is up to us to fit in.”

This business of knowing when to fit in and when to creatively resist expectations is a lifelong challenge that all of us grapple with on a daily basis. At the heart of that challenge lie questions so fundamental that they literally shape our reality. Is the search for truth more like a treasure hunt or a creative, artistic process? Is meaning something to be found or to be made? Tentative answers to these questions frame one’s encounter with both oneself and the outside world. As Plato famously suggested, it is difficult to imagine meaning as the target of an open search, since I won’t know if I’ve discovered the goal of the search unless I already have a sense of what I’m looking for. But if meaning is something that each of us creates throughout the process of our lives, what hope is there for shared meaning, for truths that are not just mine but everyone’s in common?

Although both by nature and philosophical preference I am more of a “creative process” than “treasure hunt” sort of person when it comes to engagement with meaning and truth, I spent a recent semester exploring a seminal text in philosophy written by one of the most eloquent advocates of the “treasure hunt” model in the Western tradition. Plato’s Republic is, among many other things, an extended development of the idea that Truth is objective, that meaning is something to be found, not created, and that enlightenment is a life-long process of being freed from the clutches of our ego-driven subjective “truths” in order to slowly discover what “Truth” really is. Plato’s paradigm for Truth is mathematics, a discipline that with its objective principles and rules exposes the truth-seeker to a world in which what is true is not up to me but is available to those who are willing to commit themselves to “the sight of the Truth.” Plato makes an extended argument that moral values and virtues properly understood exhibit the precision, certainty and objectivity of mathematics. Indeed, mathematics is Plato’s exemplar of the nature of truth; he insisted that only those who love geometry could enter his Academy, because it is through study of mathematics that one becomes accustomed to the nature of all truth.

If my students in this class—actually, over the past twenty-five years—are an accurate sampling, Plato’s commitment to the objectivity of truth is strongly opposed to our contemporary intuitions. As I often do, I introduced the problem early in the semester with a simple question about a couple of basic truth claims. I wrote two sentences on the board:

A. Two plus two equals four.

B. The Mona Lisa is a beautiful painting.

then asked for observations about what makes these truth claims different. Within short order the students point out that A is objectively true (as are all mathematical truths), while B is subjectively true (as are all aesthetic claims). If someone denies the truth of A, we assume that either that person doesn’t know the basic rules of arithmetic, is deliberately being a contrarian, or simply is nuts. If someone denies the truth of B, however, no problem—there’s a reason why we say “beauty is in the eye of the beholder,” after all.

Then I move to the point of the exercise by writing a third truth claim on the board:

C. X is right (good) and Y is wrong (bad).

X and Y can be anything that people are inclined to make value judgments about. I simply ask “Is C more like A or like B?” When we venture into the realm of moral truth claims, in other words, have we entered a realm more like mathematics or art? Objective or subjective? Finding or creating? In twenty-five years of teaching, students have overwhelmingly given the same answer—moral truth claims and judgments are more like B than A. Morality is subjective rather than objective, in other words. In my Plato’s Republic class last semester, only two students out of the twenty-five present claimed that moral claims are objectively true—and they were both Catholic seminarians.

When I asked the other twenty-three students—many of whom were the products of Catholic primary and secondary education—why they bundled moral and value truth claims together with aesthetic claims as subjective, most zeroed in on the problem of moral disagreement. Essentially their argument was that since people disagree significantly across the board about every moral issue imaginable, and given the apparent absence of any authoritative perspective from which it could be judged who is right and who is wrong, moral disagreement looks a lot more like the Mona Lisa squabble than whether two plus two equals four or five. The real problem is that, unlike mathematics, there is no working and accepted objective standard to which one can appeal when trying to figure out who is right and who is wrong in a moral disagreement. Rather than do the difficult and challenging work of seeking objective standards, it is much easier to assume there are no such standards in morality (except perhaps extreme tolerance) and place moral truth claims in the subjective category. We get to create them ourselves without being answerable to an objective standard—because there isn’t any such standard. Let the discussion begin.

In The Plague, a central and early text in another one of my recent classes, Albert Camus raises the possibility that despite the apparent subjectivity of moral claims, there comes a time when one must hang on to moral commitments with the tenacity of two plus two equals four.

Again and again there comes a time in history when the man who dares to say that two and two make four is punished with death. And the question is not one of knowing what punishment or reward attends the making of this calculation. The question is that of knowing whether two and two do make four.

Here the narrator of The Plague is commenting on the “sanitation squads” in the novel who, rather than hiding from an apparently random and incurable plague that is sweeping across their city, taking the lives of hundreds of their fellow citizens per day, choose to embrace the basic moral task of facing the danger head on, putting their own lives at risk in the service of making the suffering of others slightly less intense and their environment slightly less dangerous. When asked why they have taken on such a thankless task, the members of the sanitation squad always answer with mathematical simplicity. Some things just need to be done. And sometimes what needs to be done is as obvious as the truth of two plus two equals four. “But what you are doing may very well lead to your death,” someone might object. “So be it.”

Camus’ point is strengthened significantly when one considers that The Plague is not just a powerful work of fiction but also a multi-layered allegory. Published in 1947, the novel was largely written during the Nazi occupation of France during World War II, with the various characters representing the different reactions of French citizens to totalitarianism, the loss of their freedoms, and the extermination of undesirables. Those who, like the sanitation squads, chose to address the Nazi plague in the face of overwhelming odds of failure are those who recognized that even in a moral world turned upside down, sometimes the truth and what is right are as obvious as a simple sum in arithmetic. We studied a number of such people during my “Grace, Truth, and Freedom in the Nazi Era” colloquium; many of them—Dietrich Bonhoeffer, the members of the White Rose, Maximilian Kolbe, and others—lost their lives for daring to insist that two plus two makes four, just as Camus described. But that doesn’t change the fact that even in the world of morals and values, some things are as clear as mathematical truths. Sometimes it really is that simple.

West of What?

The heart of Providence College’s core curriculum is the Development of Western Civilization (DWC) program, a sixteen-credit, four-semester, interdisciplinary and team-taught series of courses required of all freshmen and sophomores regardless of major. From its origin in the 1970s, DWC (or “Civ,” as many call it) has been a regular source of pride and an occasional source of controversy, both of which have been on display recently. I have taught in the program for sixteen of the twenty-one years I have taught at the college and directed the program for the four academic years before my 2015-16 sabbatical. At the end of August, on the first day of the current semester (my first day back in the classroom in fifteen months), my colleagues and I chose to spend our first ninety minutes with our new freshman students raising the question that many of them were probably thinking but were too afraid to ask: Why study Western civilization at all?

The very existence of a program such as ours is controversial in many quarters, where people wonder, in an increasingly diverse and globally connected world, whether the study of “dead white guys” is anything more than a lame and outdated exercise in cheerleading for white folks. But others seek to defend the primacy of Western civilization over all others. One of my colleagues began the class with a clip from the Republican National Convention a couple of months earlier. A talking-head panel was filling up air space waiting for the first speaker of the evening. After one person mentioned that the hall was filled with lots of angry white people, Iowa Congressman Steve King took offense.

King: This “old, white people” business does get a little tired. I’d ask you to go back through history and figure out, where are these contributions that have been made by these other categories of people that you’re talking about—where did any other subgroup of people contribute more to civilization?

Panel moderator: Than white people?

King: Than Western civilization itself. It’s rooted in Western Europe, Eastern Europe and the United States of America and every place where the footprint of Christianity settled the world. That’s all of Western civilization.

King’s comments astounded a couple of people on the panel and outraged a third—as the table exploded into shouting, the moderator wisely broke for a commercial.

The congressman’s comments reminded me of something brought to my attention a few weeks earlier. A person who gave a talk at my college over fifteen years ago recalled it in a recent column on a conservative website, noting that some of the people present were not happy with the content of his talk.

I am sure I noted the fact that some cultures are better than others and I might well have quoted (with enthusiastic approval if I did) William Henry’s observation, in his book In Defense of Elitism, that “It is scarcely the same thing to put a man on the moon as to put a bone in your nose.” I am very fond of Henry’s native proboscis image, not least because, like Saul Bellow’s question, “Who is the Tolstoy of the Zulus?”, it is so reliably productive of rage among pampered left-wing academics.

These comments could have been written or said by some of my colleagues on campus who often defend the existence of our DWC program by asking “Why wouldn’t we want to focus close-to-exclusive attention on Western Civilization? It’s the best civilization ever!”

After Congressman King’s clip from the RNC, my teaching colleagues began a conversation with our students by asking basic questions such as “What is Western Civilization?” The students seemed unsure how to answer other than to say “we are!” After a few minutes I made my first contribution. “This is going to sound like a stupid question,” I said, “but I’m sitting over here listening to all of you talk about ‘Western Civilization’ and I’m wondering: West of what? Are we talking geography here? What is Western Civilization west of?” None of us had thought to load a map onto PowerPoint, so I just asked the students to picture Europe, Asia, and Africa in their imaginations. “If we draw a dotted line from north to south to divide the East from the West, where would we draw it?”

The first suggestion was that it would go through the Middle East, in a way that would perhaps include Israel in the West but the rest of the Middle East in the East. “What about Turkey?” one of us asked. They weren’t sure. Of course, if the dotted line continues straight south from Israel, another question arises. “What about Africa? Is it in the West?” I asked. “No,” just about all of the students answered, meaning that our dotted line needs to take a sharp left turn from Israel, bisecting the Mediterranean Sea so that Europe to the north stays in the West (as everyone knows it should) and Africa to the south stays non-West (as everyone knows it should). Remembering Congressman King’s inclusion of the United States in Western civilization (even though it does not make an appearance in DWC until the beginning of sophomore year), the dotted line should cross the Atlantic so that the U.S. and Canada lie on the European (northern) side of the line. Mexico and Central America? Iffy. South America? Not sure. It was pretty clear that the line was intended to include those most like us in the West and exclude everyone else.

This exercise established pretty quickly that the answer to “West of what?” cannot be geographical. One student then helpfully suggested that “I think ‘Western’ refers to various ideas rather than geographical areas.” Asked for examples, students suggested things like democracy, human rights, capitalism, and monotheism. One of my colleagues, a theologian, jumped on that one. “Is Christianity a Western religion?” he asked. “Of course!” was the answer (Congressman King would have been proud). “What about Judaism?” The students weren’t as sure there, even when my colleague pointed out that both Christianity and Judaism started in the same place and that Jesus was a Jew. “What about Islam?” Most of the students thought not (Donald Trump would have been proud), prompting my colleague to point out that for a number of centuries during the Middle Ages the westernmost area of Europe—Spain—was Muslim. Trying to define “Western” by religion doesn’t work much better than defining it geographically.

Finally, one brave student said “I think that ‘Western’ for most people means pretty much the same thing as ‘White.’” And I suspect there’s a great deal of truth in that suggestion. When we study Western Civilization, we will undoubtedly end up studying predominantly dead white guys unless a concerted effort is made to engage with non-Western influences. I mentioned to the students, for instance, that most people don’t know that one of the greatest philosophers in the Western tradition—perhaps the greatest—Plato, was heavily influenced by thought from Persia and India. The dotted line is porous, in other words. Furthermore, one of my colleagues, who is of Italian ancestry, pointed out that one hundred years ago immigrants from southern European countries such as Greece and Italy were not considered by many in this country to be white.

After ninety or so minutes of this, I’m sure our students left largely in confusion. And that’s a good thing. Despite being the name of the course, “Western Civilization,” upon close scrutiny, turns out to be as much a reflection of our expectations and prejudices as anything else. This is good for seventeen- and eighteen-year-old young adults to understand at the outset of their forays into college education. We live in a polarized political and social climate in which labels and categories are used as if they were as objective and set in stone as the Ten Commandments. Only when we realize that all such dividing lines are at their roots arbitrary, porous, and fluid might we seize the opportunity to do better.

Embracing the Barbarian Invasion

Every year the world is invaded by millions of tiny barbarians. We call them “children.”  Hannah Arendt

One of the wonderfully gratuitous features of my early years as a college professor was the opportunity to teach regularly with a couple of master teachers. During the first decade of my teaching career at Providence College, I taught on an interdisciplinary Honors Development of Western Civilization team every year with two such colleagues. Rodney was a teaching icon from the English department who now, a few years after his untimely passing, has a tree on campus, a seminar room in the brand-new humanities building, and an annual lecture named after him. He was one of the most dynamic and engaging pedagogues I have ever encountered; I remember telling Jeanne, shortly after meeting Rodney in the mid-nineties during my first year at Providence College, that “when I grow up, I want to be Rodney.”

The other member of our teaching triumvirate, Ray, is an extraordinary professor out of the History department. He is also one of the flat-out finest human beings I have ever had the privilege of knowing. This coming spring Ray and I will be teaching a colloquium together for the third time in the past four years, a class fondly referred to by students as “Nazi Civ.” I am a far better teacher and human being for having spent so many years in the classroom in the company of these outstanding colleagues.

Because we spent so much time together in and out of the classroom, the three of us got to know each other’s business over the semesters a bit more than is typical between professional colleagues. We often spoke of our children; Rodney’s and Ray’s were young adults at the time, while mine were in high school and junior high. One morning before class as we were getting coffee in the break room, Rodney was bemoaning the fact that he had returned home from work the previous day at 5:00 in the afternoon at the very same time that his son, a twenty-something who was still living at home, emerged bleary-eyed from his basement bedroom for the first time that day. As we compared notes about the shortcomings and failures of our respective offspring, Ray, whom I had always pegged as the perfect father and husband, grew reflective. “I’ve heard so many parents talk about the wonders of parenthood, how raising children is such a privilege, how their children’s growing-up years were the best years of their lives,” he said. “I guess I must have missed that.” Preach it, Ray. For all of our politically correct claims about the wonders of child rearing, all parents know that Hannah Arendt’s “tiny barbarians” comment is absolutely true. Civilizing barbarians is hard work.

The word “barbarian” is from the Greek word βαρβαρος (barbaros), the term Greeks used to refer to anyone who was not Greek. To the refined but xenophobic Greek ear, the sounds coming out of a non-Greek speaker’s mouth sounded like “bar, bar, bar”—hence, “barbarian.” We would call such persons “blahblahblahrians.” The wider connotation of “barbarian” is simply someone or something that does not fit into the expected categories, abide by the accepted rules, or behave according to agreed-upon standards. That description certainly fits children and a lot more—I frequently call our dachshunds barbarians when they pee or take a dump in the middle of the floor, just as I would probably call a human being a barbarian (and worse) if they did the same thing.

And yet there is something exhilarating about having barbarians in our midst. A world without barbarians, without unfamiliar hordes pressing against the outer walls of our holy-of-holies comfort zones, is a world that would eventually stagnate into a smug status quo. I realized this past semester, as I do in varying degrees every semester, that one of the regular features of what I do as a teacher is to let the barbarians loose on the civilized yet unexamined thought processes of my students. Philosophy is an inherently barbarian discipline because its entire raison d’être is the challenge to consider that one’s most cherished beliefs might indeed need improvement, that the doors and windows to the inner sanctum might regularly be opened to let the smelly and scary barbarians in.

Several years ago, when I was still an untenured assistant professor and should have been keeping my mouth shut, I recall being involved in a conversation about this feature of philosophy during a philosophy department meeting. We were in the process of crafting a new “mission statement” for the department, an exercise guaranteed to generate disagreement. One of the older members who had been chair of the department for a couple of decades before my arrival, a Dominican priest, proposed that our mission statement read that “The mission of the philosophy department is to teach the Truth.” Period—and make sure that it’s a capital “T” on “Truth.” I, along with several others, suggested that this would presume that we possess the Truth with a capital T, a presumption that is directly contrary to the very spirit of the philosophical enterprise. In a condescending tone (or at least so it sounded to me), another priestly colleague said “Vance, some of us around here think we have the truth,” to which I replied “And here I thought we were a philosophy department.”

So how does one keep the pursuit of truth alive without it being sidetracked into defense of the Truth? Over the past several years in my teaching and writing, this question has been directed more and more toward the arena within which Truth rears its ugly head most often—religious belief. During my sabbatical semester at an ecumenical institute five years ago, I described my original book project as follows: “Is it possible to live a life of human excellence, of moral focus and spiritual energy, in a world in which the transcendent is silent, in which God is arguably absent?” As I led an afternoon seminar based on my early work on this project with a dozen fellow “resident scholars,” one of them—a Lutheran pastor—asked “But Vance, don’t you have to believe something with certainty if you’re going to call yourself a Christian?” To which I replied, “I don’t know—do I?” I had been wondering that for many years, but this was the first time I had said it aloud. And it was liberating. What would a faith in which no “truth” is a “Truth,” a faith in which no cows are sacred, look like?

As I’ve dug into these questions with new energy and focus over the past few years, several matters have become clear, beginning with the fact that the transcendent is not silent after all and God is definitely not absent. They just show up in entirely different places than where we have traditionally looked for them. And I am finding that, for me at least, a vibrant faith requires little in the way of defending the Truth, but rather a willingness to welcome the divine even when it arrives wrapped in unexpected packages. As James Carse writes,

This is Christianity’s strongest feature: it tirelessly provokes its members to object to prevailing doctrines without having to abandon the faith . . . Neither Christianity nor any of the great religions has ever been able to successfully erect barriers against the dreaded barbarian incursions of fresh ideas. 

Such barbarian incursions are not to be feared or defended against. They are to be invited and welcomed. Just as the millions of tiny barbarians who invade the world every year are actually the way in which the human species is renewed and regenerated, so the regular introduction of barbarian ideas into our civilized and supposedly completed belief systems will keep those beliefs from turning into idols. What would a faith in which no “truth” is a “Truth,” a faith in which no cows are sacred, look like? It would look a lot like Faith–the real thing.

Nice Work If You Can Get It

Not long ago, I read a Huffington Post article summarizing what a team of anthropologists found out about the daily work habits of university faculty.

What Do Professors Do All Day?

After spending two weeks with a non-random sample of sixteen faculty of different ranks at Boise State University, the researchers found that, on average, the faculty worked 51 hours during the work week and 10 hours on the weekend at a host of different tasks. Amazing. It took a study to find out that teachers have full-time jobs after all and actually do work-related things on the weekend (although the numbers for both week and weekend seem a bit low). I’m wondering how to square these remarkable results with an article I read a couple of years ago claiming that “University Professor” topped the list of “Least Stressful Jobs of 2013.” Of course I had to respond . . .

Those who read this blog regularly or even occasionally know that I believe I have the greatest job in the world. For those who are finding this out for the first time, let me repeat—I have the greatest job in the world. As a matter of fact, it is so great that I don’t consider it to be a job at all. For me, teaching is a vocation rather than a job, something that I truly believe I am called and was born to do. I raise eyebrows occasionally on the campus of the Catholic college at which I teach when I say that I consider teaching to be a vocation for me just as much as being a priest is the vocation of the guys who walk around campus in white robes. But even though I love what I do to an almost unhealthy extent, I was taken aback when I learned from a colleague via Facebook that “University Professor” is listed by CareerCast.com at number one in its top ten list of “Least Stressful Jobs of 2013.”

The Ten Least Stressful Jobs of 2013

Really? Or as one of my colleagues commented on Facebook, “Bullshit!!! [pardon my advanced-degree French].” I guess I must have failed to notice how non-stressful my job is during my 25-year career as a university professor.

Every person who teaches in higher education has a virtual file full of stories about how difficult it is to get a non-academic friend or family member to understand exactly what is involved in being a college professor. Most difficult is getting someone to understand that this is not a part-time job. For instance, Cousin Bob finds out that the typical teaching load for a faculty member at a teaching college or university is three or four three-credit courses per semester (or perhaps five at a two-year college), meaning that the faculty member is in class at most fifteen hours per week. “Must be nice to make a full-time salary at a part-time job!” Cousin Bob remarks. Early in my career I often patiently pointed out to the Cousin Bobs in my family that a good rule of thumb is that a teacher spends three to four hours outside of class (preparation, reading, grading, meeting with students, etc.) for every hour spent in class. “Really?” Cousin Bob replies. But he clearly is not convinced, since, as we all know, easy working hours are the main reason that a person becomes a teacher.

Then, of course, Cousin Bob wonders about all of the weeks of the calendar year that I am not in the classroom. Christmas break, spring break, the summer—teachers apparently get at least twenty weeks per year off. “Must be nice to make a full-time salary at a part-time job!” With what begins to feel like infinite patience, I explain to Cousin Bob that teaching is only one part of a university professor’s job. In order to advance through the ranks of promotion and, more importantly, in order to get tenure, one must research and publish on a regular basis. For most college faculty, the time and focus required for this aspect of the profession are not available during the semester, so the “breaks” are all about research, writing, and praying for publication. But I’m not in the classroom, right? “Must be nice to make a full-time salary at a part-time job!” You get the idea. A colleague once told me about his frustrated response to a relative after one too many such conversations. Upon hearing “Must be nice to make a full-time salary at a part-time job!” one too many times, my colleague replied, “It actually is really nice. If you were smart enough, you could have a job like that too.”

CareerCast’s explanation of why University Professor is the least stressful job of 2013 has more than a whiff of Cousin Bob behind it, just expressed in a slightly less aggressive fashion. For instance, the article explains that

University professors are at the pinnacle of the education field. Their students are largely those who choose the classes they attend, and thus want to be in class. Unlike elementary and secondary educators, the performance of college professors isn’t evaluated based on standardized tests. University professors also have the opportunity to earn tenure, which guarantees lifetime employment.

A full response would require something more like a book chapter than a blog post. Suffice it to say that the author of the article (1) has apparently never heard of core requirements if he thinks that college professors face “students . . . who choose the classes they attend, and thus want to be in class,” (2) is not aware that, despite the (usual) lack of standardized tests, college teachers are evaluated by their peers, are answerable for the results of student evaluations, and are under regular scrutiny in every aspect of what they do, and (3) needs to learn something about the tenure process (assuming that the faculty member is fortunate enough to be teaching at an institution that uses the tenure process).

Tenure. Such job security is certainly unusual in today’s job market, and tenure is an attractive “perk” of the academic life. Once one earns it, that is. Tenure-track positions are hard to come by in academia, more and more so as many institutions opt to hire year-to-year adjunct professors or special lecturers and then treat them as well-dressed slave labor (don’t get me started on that one). Should a teacher be fortunate enough to land a tenure-track position in today’s heavily buyer-skewed academic marketplace, the stress she or he will experience in the next several years leading up to the tenure decision will be sufficient to last a lifetime. As is undoubtedly the case in many workplace environments, the tenure decision is often as much or more about internal campus politics as it is about the qualifications of the candidate and those things that she or he can control. “The opportunity to earn tenure” is indeed that—an opportunity that, unfortunately, will never be available to many talented and qualified teachers.

Then there’s the money. The article author points out that

Harvard University pays full-time professors $198,400, with a 7:1 professor-to-student ratio, while University of Chicago professors receive $197,800 per year with a 6:1 ratio. Among public universities, the University of California at Los Angeles (UCLA) is highest paying, with an average wage of $162,600 for its full-time staff.

Really? All of them? At all levels? In all disciplines? Such “statistics” are useless without context, about as useless as telling a lawyer in a public defender’s office, working 80-100 hours per week and struggling to make ends meet, that the senior lawyers in the top Wall Street firms often make seven figures annually. Here’s an anecdote from the real world. At least a dozen years into my teaching career, still paying off the loans from the ten years of college it took to earn the PhD required to teach in my discipline at the college level, it occurred to me one day that the second-semester seniors in my Business Ethics class, most of whom would be entering the work force shortly after graduation in entry-level positions with undergraduate business degrees, would be starting at a monthly salary noticeably higher than my own. As I once told a younger colleague when I was chair of my department, in response to his complaints about how little he was being paid, “if you became a teacher for the money, you’re a moron.”

I have reached the level of experience and rank (tenured full professor) at which one might expect stress levels to diminish and ultimately disappear. But persons such as I are the ones who are tapped, appropriately, to commit themselves significantly to the third leg of the academic stool (along with teaching and research): service. After four years as chair of a department of 25 faculty, and having recently completed a four-year stint as director of the core academic program at my college, responsible for 80 faculty and upwards of 1700 students at any given time, I realize that one sort of stress just gets replaced by another.

And actually that’s fine, since it is all part of the vocation I was born to inhabit. There are many attractive features to the life of a university professor. I can think of no other profession in which one’s creativity is required more often or in which one has more autonomy and flexibility. But it is anything but stress-free. A teacher never leaves the office; your work goes with you everywhere. I realized early one December that, other than Thanksgiving Day, I literally had not had a day off since the middle of August. This is why I have recommended the teaching profession to no more than a half-dozen of my best students in more than twenty years of teaching. If you are looking for a profession that will fit nicely with your family obligations and other interests, don’t become a teacher. If you want to make a living wage at a stimulating 40-45 hour per week job, don’t become a teacher. If you want to “work to live” rather than “live to work,” you probably should not become a teacher. If you think of teaching as one among many equally interesting career possibilities, don’t become a teacher. But if you are incurably obsessed with the life of learning, if the dynamic of new ideas exhilarates you, if you suspect that you might have the heart of a teacher and nothing else will fulfill you, then this highly stressful but highly rewarding vocation might just be for you. It’s nice work if you can get it.

First Day of Classes

Today is the first day of the semester at my college. I have been anticipating it even more than I usually do because I have been on sabbatical and have not been in the classroom since May 2015—my longest stretch away from school since the mid-1980s. Everyone has stories, many of them of the horror variety, about the first day of school—mine were full of cognitive and emotional dissonance from my earliest years. On the one hand, I couldn’t wait for school to start—I’ve loved just about everything about the life of learning ever since I can remember. On the other hand, there was all that non-learning stuff that I was not good at. Meeting new people. Sports. Leaving my mother and my dogs at home every morning. Riding the bus for a half hour each way. Come to think of it, I probably would have loved home schooling. I doubt my parents would have.

For a college professor, the first day of the semester requires answering an eternal question—what to do with students whom you have never met, who are coming off either summer or Christmas vacation, who probably just met their roommate and hate her or him, who might have spent the previous night getting in a final party before attempting to get serious about their education for at least a week or so, and who are assuming that all we are going to do during our first class meeting is go over the syllabus and leave early? I have some professor colleagues who, assuming that all the above is true, descend to their new students’ expectations and dismiss class after a brief overview of the syllabus and the upcoming semester. I am not one of those professors. At the beginning of the first day of school, I give a reading quiz.

This, of course, means that my students have a reading assignment that they are required to complete for our first meeting of the semester. This is easy enough to arrange—I have the class roster available several weeks before the semester begins, so a simple email, followed by at least a dozen reminders in the following days before the semester begins, provides my students-to-be with the course syllabus, other pertinent data, and the first day’s reading assignment (with the promise of a quiz). I often notice a drop of five or six students in my class roster in the day or two after sending this email, which is a good thing (see below). I usually receive a bit of email pushback from my students-to-be (“My other classes aren’t requiring a reading or having a quiz on the first day”), so why do I do this? For a number of reasons.

  • One advantage, as noted earlier, is roster control. Unfortunately, because my classes tend to fill up quickly and are often of the core curriculum variety that students have to take, every student who leaves my roster for a more palatable first-day experience is immediately replaced by someone else (who doesn’t necessarily know about the first-day reading and quiz—hence the multiple email reminders).
  • The amount of classroom time I have in a given semester with my students is limited. This semester we meet for seventy-five minutes twice per week for a total of twenty-seven meetings. Throwing away the first of those meetings sends the message that either our time together is unlimited or that it isn’t important enough to squeeze value out of every scheduled moment. Neither of these is true.
  • Many of my courses (half of them this semester) are filled exclusively with freshmen. Nothing says “this isn’t high school or Kansas anymore” to a first-week college student like hitting the ground running—hard.

I’m happy to report that in the interdisciplinary course I teach every fall, my teammates have been happy to join me in assigning significant work for the first class. Two years ago, my colleague and I assigned the entire Iliad as freshman summer reading, building on it through the first two weeks of the semester. This year’s interdisciplinary course freshmen met for the first time today—my colleagues and I assigned fifty pages or so and gave a quiz. Everyone survived. It’s amazing what you can get freshmen to do without complaining.

Other than making the first day of school a difficult one for my students, what am I really up to as I return to the classroom this week? I reflected on this just as my sabbatical began sixteen months ago; as I get back into the teaching saddle, I return to what I was thinking then. As a professor, I am a facilitator of lifetime learning, a person who points students in fruitful directions, helping them identify and become skillful in the use of tools with which to construct their own moral frameworks intelligently. The liberally educated lifetime learner is equipped both to seek and to create meaning throughout her life. I take pride in playing a part in this process. I have thought a lot over the past twenty-five years about the day-to-day dynamic between professor and student; I continually return to the difference between an idol and an icon.

Idols and Icons

The point of a professor is to be Virgil to the student’s Dante, guiding the educational journey, relying on knowledge and experience to point out the pitfalls and attractions of a journey that each person must encounter individually. The professor helps the student learn to identify what is important and what is not in the perpetual sifting process of education. The professor is not the main attraction at any point in this process. The professor is an icon—something to look through or past, in other words—rather than an idol—something to look at. There is a reason that the Second Commandment is a prohibition against idolatry. Human beings are inveterate idolaters, more than happy to pattern themselves after someone or something else rather than to take on responsibility for themselves. For those who are interested in creatively addressing the undoubtedly real shift in higher education toward preparation for a good job and financial success that has been going on for a while now, I highly recommend iconography.