Category Archives: teaching

It’s Not a Holy Relic!

In Milos Forman’s 1984 Academy Award-winning film Amadeus, Holy Roman Emperor Joseph II, played by Jeffrey Jones of Ferris Bueller’s Day Off fame, is most of the time an enlightened ruler who makes his decisions after considering the advice of his cabinet entourage who accompany him wherever he goes. Yet he is an Emperor, after all, so there is often uncertainty about how to interact with this very powerful “first among equals.” Those who enter the Emperor’s presence often drop to their knees and kiss his hand, to which (after an appropriate few seconds of kissing) the Emperor often responds by withdrawing his hand and saying “Please, please! It’s not a holy relic!” supported by the sycophantic chuckles of his surrounding posse.

The Emperor is right—his hand isn’t a holy relic—but it also isn’t just a hand. When does a normal, everyday object become something more? When does the mundane become something special? Examples and possible answers abound. I have spent my professional life as a non-Catholic teaching at Catholic institutions of higher learning, and so have had frequent exposure to various aspects of the holy relic racket. I call it that because the whole idea of holy relics messes with my Protestant sensibilities, even though in the church of my youth we treated the Bible, which appears to be a mere book, with a reverence not to be outdone by the most dedicated Catholic holy relic aficionado. I remember, for instance, one summer when my cousin got turned in to the Bible camp authorities for moving a Bible from the seat of a glider swing and placing it on the grass nearby so he and I could operate the glider. I still remember the tone of voice with which the owner of the Bible yelled “YOU PUT THE WORD OF GOD ON THE GROUND!!!” before making a beeline for the director’s office.

Other faith traditions cast a much wider net when considering what might be a holy relic. I was reminded of this just a couple of days ago as I was reading the final entries in an intellectual notebook submitted by one of the students in my Honors colloquium “Beauty and Violence” two or three semesters ago (I will be repeating it this spring). One of the continuing themes of this colloquium was how to have a dynamic and mature faith in the face of all sorts of features of the world we live in that threaten to make such a faith impossible. It was one of the most enjoyable and satisfying classes I have ever taught, largely because I had the opportunity to facilitate the often uncomfortable but always fruitful process of challenging one’s beliefs with a dozen honors juniors and seniors. One of these students put it best during her oral exam at the end of the semester when she said “This class really messed me up!—in a good way.” My course syllabi have always included the statement that “my job is not to tell you what to think—it’s to get you to think.” In addition to that I will now include “my job is to mess you up—in a good way.”

The author of the intellectual notebook in question revealed herself early on in the semester, both in writing and in class, as a “devout Catholic.” Yet I could detect from the start that she had both the courage and the willingness to press her faith boundaries, which she did regularly in all sorts of ways. So I was a bit disappointed when in one of her last entries she described in some detail a visit to a holy relic site while studying abroad in Rome last spring.

I had the chance to visit Santa Croce in Gerusalemme where my class and I saw several Holy relics. Saint Helena, Constantine’s mother, was sent to Jerusalem to bring back the holy relics of the passion of the Christ. She found parts of the cross that Jesus was crucified on but she wasn’t exactly sure which cross was His. Saint Helena brought the crosses to an old, sick woman and placed each cross on top of her to see if she could identify the cross of Jesus. The woman was suddenly cured by the third cross. This cross now lies in Santa Croce as the cross of Jesus Christ along with several other holy relics such as the finger of St. Thomas which was placed in the wounds of the risen Christ, two thorns from Jesus’ crown, a nail, and a nameplate which was nailed to the cross stating “Jesus of Nazareth.”

Please, I thought. Are you fucking kidding me? How can anyone take any of this seriously? I was reminded of Martin Luther, an extremely vocal critic of the relic racket, who reportedly said that there were enough pieces of the true cross of Christ in the Europe of his day to have exhausted a German forest.

I was somewhat pleased to read further and discover that my student apparently had not needed to take my colloquium to at least think a little bit critically.

How much of these stories do I believe 100% to be true?  . . . Who wrote this story down and why should they be a credible source?  . . . Maybe someone planted all of these relics. Maybe they knew that as human beings we need concrete proof to believe. Maybe it was God planting these relics for us to find as the ultimate concrete proof that Jesus is the messiah—I don’t know. I don’t know.

Well I know, I thought. This stuff is all bullshit. I grew out of the idea that the Bible is a holy relic and the inerrant Word of God. You’ll grow out of this.

My student concluded her notebook reflection with this:

What I do know is that there was a feeling that came across me that is very hard to describe. There was a silence amongst all of us in the small room of Santa Croce as if the Holy Spirit was present right in front of our eyes. My heart dropped. I knew I was breathing but did not feel like I was in control of my breaths. I was frozen and soon felt a rush come over me like I wanted to cry. I did not ask myself “Is this real?” I knew it was real. This must have been my faith taking control of my body. It was exciting. I cannot say whether the historical facts of what I learned that day are accurate or not. It doesn’t matter, because I took away more than just a history lesson. I believe this is what the Holy Spirit wanted when guiding the writings of the gospel—a personal and unique experience.

In my comments I wrote “This is a very powerful paragraph, describing what my family would call a ‘Big Bird moment.’ This is something to remember and embrace. Don’t ever forget it.”

In the Gospel of John, Jesus compares the activity of the Spirit to the wind, which “blows wherever it pleases. You hear its sound, but you cannot tell where it comes from or where it is going.” There is a wonderful, holy randomness to all of this, unpredictable so that it cannot be packaged or formalized, and so powerful that it cannot be mistaken or forgotten. As Gerard Manley Hopkins wrote, “the world is charged with the grandeur of God.” Sacredness infuses everything, and anything can become a direct channel of the divine wind. Even random pieces of wood and bone.

Two Plus Two Makes Four

In J. M. Coetzee’s strange and fascinating novel The Childhood of Jesus, the precocious child David has a difficult time understanding numbers. Oh, he knows their names but is not inclined to put them in the order that the rules of mathematics specify. Nor is he inclined to accept the rules guiding any accepted human behavior—he wants to live in a world in which things are valuable and right to the extent that he likes them, and he is not willing to arrange numbers in the proper sequence that everyone agrees upon. After one too many patient attempts to steer David toward conformity, his guardian Simón sputters, “The answer to all your Why? questions, past, present and future is: Because that is the way the world is. The world was not made for our convenience, my young friend. It is up to us to fit in.”

This business of knowing when to fit in and when to creatively resist expectations is a lifelong challenge that all of us grapple with on a daily basis. At the heart of that challenge lie questions so fundamental that they literally shape our reality. Is the search for truth more like a treasure hunt or a creative, artistic process? Is meaning something to be found or to be made? Tentative answers to these questions frame one’s encounter with both oneself and the outside world. As Plato famously suggested, it is difficult to imagine meaning as the target of an open search, since I won’t know if I’ve discovered the goal of the search unless I already have a sense of what I’m looking for. But if meaning is something that each of us creates throughout the process of our lives, what hope is there for shared meaning, for truths that are not just mine but everyone’s in common?

Although both by nature and philosophical preference I am more of a “creative process” than “treasure hunt” sort of person when it comes to engagement with meaning and truth, I spent a recent semester exploring a seminal text in philosophy written by one of the most eloquent advocates of the “treasure hunt” model in the Western tradition. Plato’s Republic is, among many other things, an extended development of the idea that Truth is objective, that meaning is something to be found, not created, and that enlightenment is a life-long process of being freed from the clutches of our ego-driven subjective “truths” in order to slowly discover what “Truth” really is. Plato’s paradigm for Truth is mathematics, a discipline whose objective principles and rules expose the truth-seeker to a world in which what is true is not up to me but is available to those who are willing to commit themselves to “the sight of the Truth.” Plato makes an extended argument that moral values and virtues properly understood exhibit the precision, certainty and objectivity of mathematics. Indeed, mathematics is Plato’s exemplar of the nature of truth; he insisted that only those who love geometry could enter his Academy, because it is through the study of mathematics that one becomes accustomed to the nature of all truth.

If my students in this class—actually, over the past twenty-five years—are an accurate sampling, Plato’s commitment to the objectivity of truth is strongly opposed to our contemporary intuitions. As I often do, I introduced the problem early in the semester with a simple question about a couple of basic truth claims. I wrote two sentences on the board:

A. Two plus two equals four.

B. The Mona Lisa is a beautiful painting.

then asked for observations about what makes these truth claims different. Within short order the students pointed out that A is objectively true (as are all mathematical truths), while B is subjectively true (as are all aesthetic claims). If someone denies the truth of A, we assume that either that person doesn’t know the basic rules of arithmetic, is deliberately being a contrarian, or simply is nuts. If someone denies the truth of B, however, no problem—there’s a reason why we say “beauty is in the eye of the beholder,” after all.

Then I moved to the point of the exercise by writing a third truth claim on the board.

C. X is right (good) and Y is wrong (bad).

X and Y can be anything that people are inclined to make value judgments about. I simply ask “Is C more like A or like B?” When we venture into the realm of moral truth claims, in other words, have we entered a realm more like mathematics or art? Objective or subjective? Finding or creating? In twenty-five years of teaching, students have overwhelmingly given the same answer—moral truth claims and judgments are more like B than A. Morality is subjective rather than objective, in other words. In my Plato’s Republic class last semester, only two of the twenty-five students present claimed that moral claims are objectively true—and they were both Catholic seminarians.

When I asked the other twenty-three students—many of whom were the products of Catholic primary and secondary education—why they bundled moral and value truth claims together with aesthetic claims as subjective, most zeroed in on the problem of moral disagreement. Essentially their argument was that since people disagree significantly across the board about every moral issue imaginable, and given the apparent absence of any authoritative perspective from which it could be judged who is right and who is wrong, moral disagreement looks a lot more like the Mona Lisa squabble than whether two plus two equals four or five. The real problem is that, unlike mathematics, there is no working and accepted objective standard to which one can appeal when trying to figure out who is right and who is wrong in a moral disagreement. Rather than do the difficult and challenging work of seeking objective standards, it is much easier to assume there are no such standards in morality (except perhaps extreme tolerance) and place moral truth claims in the subjective category. We get to create them ourselves without being answerable to an objective standard—because there isn’t any such standard. Let the discussion begin.

In The Plague, a central and early text in another one of my recent classes, Albert Camus raises the possibility that despite the apparent subjectivity of moral claims, there comes a time when one must hang on to moral commitments with the tenacity of two plus two equals four.

Again and again there comes a time in history when the man who dares to say that two and two make four is punished with death. And the question is not one of knowing what punishment or reward attends the making of this calculation. The question is that of knowing whether two and two do make four.

Here the narrator of The Plague is commenting on the “sanitation squads” in the novel who, rather than hiding from an apparently random and incurable plague that is sweeping across their city, taking the lives of hundreds of their fellow citizens per day, choose to embrace the basic moral task of facing the danger head on, putting their own lives at risk in the service of making the suffering of others slightly less intense and their environment slightly less dangerous. When asked why they have taken on such a thankless task, the members of the sanitation squad always answer with mathematical simplicity. Some things just need to be done. And sometimes what needs to be done is as obvious as the truth of two plus two equals four. “But what you are doing may very well lead to your death,” someone might object. “So be it.”

Camus’ point is strengthened significantly when considering that The Plague is not just a powerful work of fiction but is also a multi-layered allegory. Published in 1947, the bulk of the novel was written during the Nazi occupation of France during World War II, with the various characters in the novel representing the different reactions of French citizens to totalitarianism, the loss of their freedoms, and the extermination of undesirables. Those who, as did the sanitation squads, chose to address the Nazi plague in the face of overwhelming odds of failure are those who recognized that even in a moral world turned upside down, sometimes the truth and what is right are as obvious as a simple sum in arithmetic. We studied a number of such people during my “Grace, Truth, and Freedom in the Nazi Era” colloquium; many of them—Dietrich Bonhoeffer, the members of the White Rose, Maximilian Kolbe, and others—lost their lives for daring to insist that two plus two makes four, just as Camus described. But that doesn’t change the fact that even in the world of morals and values, some things are as clear as mathematical truths. Sometimes it really is that simple.

West of What?

The heart of Providence College’s core curriculum is the Development of Western Civilization (DWC) program, a sixteen-credit, four-semester, interdisciplinary and team-taught series of courses required of all freshmen and sophomores regardless of major. From its origin in the 1970s, DWC (or “Civ,” as many call it) has been both a regular source of pride and an occasional source of controversy, and recently it has been both. I have taught in the program for sixteen of my twenty-one years at the college and directed it for the four academic years before my 2015-16 sabbatical. At the end of August on the first day of the current semester (my first day back in the classroom in fifteen months), my colleagues and I chose to spend our first ninety minutes with our new freshman students by raising the question that many of them were probably thinking but were too afraid to ask: Why study Western civilization at all?

The very existence of a program such as ours is controversial in many quarters, where people wonder, in an increasingly diverse and globally connected world, whether the study of “dead white guys” is anything more than a lame and outdated exercise in cheerleading for white folks. But others seek to defend the primacy of Western civilization over all others. One of my colleagues began the class with a clip from the Republican National Convention a couple of months earlier. A talking head panel was filling up air space waiting for the first speaker of the evening. After one person mentioned that the hall was filled with lots of angry white people, Iowa Congressman Steve King took offense.

King: This “old, white people” business does get a little tired. I’d ask you to go back through history and figure out, where are these contributions that have been made by these other categories of people that you’re talking about—where did any other subgroup of people contribute more to civilization?

Panel moderator: Than white people?

King: Than Western civilization itself. It’s rooted in Western Europe, Eastern Europe and the United States of America and every place where the footprint of Christianity settled the world. That’s all of Western civilization.

King’s comments astounded a couple of people on the panel and outraged a third—as the table exploded into shouting, the moderator wisely broke for a commercial.

The congressman’s comments reminded me of something brought to my attention a few weeks earlier. A person who gave a talk at my college over fifteen years ago remembered it in a recent column on a conservative website, recalling that some of the people present were not happy with the content of his talk.

I am sure I noted the fact that some cultures are better than others and I might well have quoted (with enthusiastic approval if I did) William Henry’s observation, in his book In Defense of Elitism, that “It is scarcely the same thing to put a man on the moon as to put a bone in your nose.” I am very fond of Henry’s native proboscis image, not least because, like Saul Bellow’s question “Who is the Tolstoy of the Zulus?”, it is so reliably productive of rage among pampered left-wing academics.

These comments could have been written or said by some of my colleagues on campus who often defend the existence of our DWC program by asking “Why wouldn’t we want to focus close-to-exclusive attention on Western Civilization? It’s the best civilization ever!”

After Congressman King’s clip from the RNC, my teaching colleagues began a conversation with our students by asking basic questions such as “What is Western Civilization?” The students seemed unsure how to answer other than to say “we are!” After a few minutes I made my first contribution. “This is going to sound like a stupid question,” I said, “but I’m sitting over here listening to all of you talk about ‘Western Civilization’ and I’m wondering: West of what? Are we talking geography here? What is Western Civilization west of?” None of us had thought to load a map onto PowerPoint, so I just asked the students to picture Europe, Asia, and Africa in their imaginations. “If we draw a dotted line from north to south to divide the East from the West, where would we draw it?”

The first suggestion was that it would go through the Middle East, in a way that would perhaps include Israel in the West but the rest of the Middle East in the East. “What about Turkey?” one of us asked. They weren’t sure. Of course if the dotted line continues straight south from Israel, another question arises. “What about Africa? Is it in the West?” I asked. “No,” just about all of the students answered, meaning that our dotted line needs to take a sharp turn left from Israel, bisecting the Mediterranean Sea so that Europe to the north stays in the West (as everyone knows it should) and Africa to the south stays non-West (as everyone knows it should). Remembering Congressman King’s inclusion of the United States in Western civilization (even though it does not make an appearance in DWC until the beginning of sophomore year), the dotted line should cross the Atlantic so that the U.S. and Canada lie on the European (northern) side of the line. Mexico and Central America? Iffy. South America? Not sure. It was pretty clear that the line was intended to include those most like us in the West and exclude everyone else.

This exercise established pretty quickly that the answer to West of What? cannot be geographical. One student then helpfully suggested that “I think ‘Western’ refers to various ideas rather than geographical areas.” Asked for examples, students suggested things like democracy, human rights, capitalism, and monotheism. One of my colleagues, a theologian, jumped on that one. “Is Christianity a Western religion?” he asked—“Of course!” was the answer (Congressman King would have been proud). “What about Judaism?” The students weren’t as sure there, even when my colleague pointed out that both Christianity and Judaism started in the same place and that Jesus was a Jew. “What about Islam?” Most of the students thought not (Donald Trump would have been proud), prompting my colleague to point out that for a number of centuries during the Middle Ages the westernmost area of Europe—Spain—was Muslim. Trying to define “Western” by religion doesn’t work much better than defining it geographically.

Finally, one brave student said “I think that ‘Western’ for most people means pretty much the same thing as ‘White’.” And I suspect there’s a great deal of truth in that suggestion. When we study Western Civilization, we will undoubtedly end up studying predominantly dead white guys unless a concerted effort is made to engage with non-Western influences. I mentioned to the students, for instance, that most people don’t know that one of the greatest philosophers in the Western tradition—perhaps the greatest, Plato—was heavily influenced by thought from Persia and India. The dotted line is porous, in other words. Furthermore, one of my colleagues, who is of Italian ancestry, pointed out that one hundred years ago immigrants from southern European countries such as Greece and Italy were not considered by many in this country to be white.

After ninety or so minutes of this, I’m sure our students left largely in confusion. And that’s a good thing. Despite being the name of the course, “Western Civilization,” upon close scrutiny, turns out to be as much a reflection of our expectations and prejudices as anything else. This is good for seventeen- and eighteen-year-old young adults to understand at the outset of their forays into college education. We live in a polarized political and social climate in which labels and categories are used as if they are as objective and set in stone as the Ten Commandments. Only when we realize that all such dividing lines are at their roots random, porous, and fluid might we seize the opportunity to do better.

The Right Niyyah

As I wait impatiently for my sabbatical book manuscript, now under contract with a publisher, to return from the editor, I’ve been thinking about some of my blog essays that “made the cut” in some sense to appear in revised form in my book-to-be. One of these essays is about the challenge of cultivating the right attitude with which to enter the world on a daily basis. I learned a lot about this from Rami Nashashibi when he was interviewed a year or so ago on Krista Tippett’s “On Being.”

Regular readers of this blog know that I am a fan of Krista Tippett’s radio program “On Being,” a show that I frequently catch several minutes of on Sunday mornings as I drive the fifteen minutes from our house to the early show at church. A few weeks ago, her guest was Rami Nashashibi, Founder and Executive Director of the Inner-city Muslim Action Network, or IMAN, in Chicago. He’s also a Visiting Assistant Professor in Sociology of Religion and Muslim Studies at Chicago Theological Seminary.

On Being: A New Coming Together

Tippett describes Nashashibi at the beginning of the interview as using

graffiti, calligraphy, and hip-hop in his work as a healing force on the South Side of Chicago. A Palestinian-American, he started his activism with at-risk urban Muslim families, especially youth, while he was still a college student. Now he’s the leader of a globally-emulated project converging religious virtues, the arts, and social action. And he is a fascinating face of a Muslim-American dream flourishing against the odds in post-9/11 America.

Not surprisingly, the conversation was wide-ranging, fascinating, and introduced me to a number of matters and issues that are well outside of my usual frame of reference. What particularly grabbed me, however, was a brief exchange toward the end of the interview, just as I was pulling into my usual parking spot at Trinity Episcopal.

Krista Tippett: I told you I was on your Twitter feed, and I love this. I think it was really recent. You wrote: “My 4-year-old discovers the spiritual power of her name as she looks over and seriously asks, ‘Daddy, do you have the right niyyah?’” What does niyyah mean?

Rami Nashashibi: So niyyah — in kind of Arabic-Muslim parlance — is spiritual intention. And oftentimes — it’s both the Swahili and Arabic word. And oftentimes, Muslims are always asked before they pray, before they do any act of service, before they engage in anything that has any kind of sense of worship associated with it, is it being done for the right niyyah? Is it being done for the right purpose? Are you attempting to get fame or credit? I think, yes, there was a song that had used her name in that way and the light went off in the middle of it and turned over to me on the couch and asked me that question. Honestly, I looked at her and I didn’t have an answer for her for I think a good 20 seconds. She nodded her head and she said, “No, probably not.”

And I said, “What?” We then had a conversation. I said, “Well, at least keep me in your prayers that I have.”

This four-year-old’s simple question—Do you have the right niyyah?—has stuck with me ever since. So has her response to her father’s lack of response—“No, probably not.” It’s hard enough to figure out what the right thing to do is on a daily basis; adding in that it should be done with the right intention, for the right reasons, seems like piling on. As a philosophy professor who has taught introductory ethics courses more times than I care to count over the past twenty-five years, I have thought about this a lot. When I ask my students “What is more important—what you do, or why you do it? Actions or intentions?” they usually split roughly down the middle.

And so do the great moral philosophers. There is the tradition of those who say that only results matter (since they can be observed and measured publicly) and that intentions are irrelevant. Then there is the other tradition, spearheaded by Immanuel Kant, which says that results are irrelevant—the true measure of the moral life is internal. Were your intentions pure? Was your heart in the right place? If so, then you are morally in the clear, even if the results of your intended action go “tits up” (to quote my brother-in-law).

My students are pretty smart, and it doesn’t take very long before they realize that the “results or intentions” question is a false dichotomy. Because in truth, normal human beings care about both. If morality is just about doing the right thing, then the person who identifies the things that should be done and does them—even if for all of the wrong reasons, such as self-righteous smugness or the praise of others—is morally in the clear. But Nashashibi’s four-year-old daughter is right—we want not only the right thing to be done, but for it to be done with the right niyyah, the right intention or reason. And that sucks, because it takes things straight into the human heart. For those who profess the Christian faith, it also takes things straight into the world of grace.

The first thing I ever learned from Scripture about the human heart as a young boy was from Jeremiah: “The heart is deceitful above all things and desperately wicked—who can know it?” Far less attention was paid to the Psalm that is recited in liturgical churches during the Ash Wednesday liturgy: “Create in me a clean heart, and renew a right spirit within me. Cast me not away from your presence, O Lord, and take not your Holy Spirit from me. Restore unto me the joy of your salvation, and renew a right spirit within me.” Straight from the Jewish scriptures come both the problem of and the solution for right intentions. As a flawed human being, I am incapable of doing things for the right reason, but help is available. Through divine grace the heart is changed and turned toward the good. Rami Nashashibi’s daughter is right when she doubts that her dad has the right niyyah, so long as that depends on his own energies and strength. But when the divine gets involved, everything changes.

The mystery of grace is exactly that—a mystery. Divine grace enters the world through flawed human beings, strangely enough, and there isn’t enough time to try to figure it out. Grace is something to be channeled, to be lived, not systematized and turned into dogma or doctrine. The poet Christian Wiman writes beautifully about this. Through many years of cancer treatments, he learned to hear God, then to channel God, in the most unlikely places, the very places where divine grace apparently lives. Wiman writes that

God speaks to us by speaking through us, and any meaning we arrive at in this life is composed of the irreducible details of the life that is around us at any moment. . . . All too often the task to which we are called is simply to show a kindness to the irritating person in the cubicle next to us, say, or to touch the face of a spouse from whom we ourselves have been long absent, letting grace wake love from our intense, self-enclosed sleep.

The right niyyah is not the result of struggle, training, or calculation. And as the author of Deuteronomy tells us,deuteronomy

Nor is it beyond the sea, that you should say, “Who will cross the sea for us to get it for us and make us hear it, that we may observe it?” But the word is very near you, in your mouth and in your heart, that you may observe it.

All I have to do to have the right niyyah is to open my heart, open my mouth, and let it out.

To experience grace is one thing; to integrate it into your life is quite another. What I crave now is that integration, some speech that is true to the transcendent nature of grace yet adequate to the hard reality in which daily faith operates. Christian Wiman

Tolerance on Steroids

What happens when a perfectly good virtue gets turned into not only the most important virtue, but in many cases the only virtue? I have come face to face with this question in the early weeks of this semester with fifty juniors and seniors in two ethics classes. I chose this past summer to organize my General Ethics course, usually a tour of several of the notable moral theories in the Western philosophical tradition (Aristotle, Kant, Mill, Sartre, etc.) that are then applied to the details of human experience, by starting instead with those messy details themselves. We find ourselves in a world of competing religious, moral, and political claims shouting at each other across various divides, claims that are both incompatible with each other and resistant to compromise. How in the midst of diverse perspectives that too often lead to violence are we to find a place of stability from which to plot the way forward?

I have discovered both from early class discussion and student writing reflections what I suspected—most of my young adult students have been taught for as long as they can remember that the “go-to” virtue that must be cultivated in order to wend one’s way through the minefield of incompatible beliefs and commitments is tolerance. It’s interesting that the granddaddy of virtue ethics, Aristotle, did not include tolerance in any of his lists of virtues—apparently such a virtue was not particularly useful in fourth-century BC Athens. Tolerance is also rejected by many contemporary people as a sign of weakness, of lacking commitment to one’s beliefs, and of a willingness to compromise too quickly. But for many in our culture, particularly those who might consider themselves “liberal” in some sense, tolerance is the proposed remedy for many of the things that ail us.

Don’t get me wrong—I have no problem with tolerance as a virtue. As a matter of fact, it probably plays as regular a role in my life on a daily basis as any virtue you could name. My concern about tolerance arises from intimate familiarity with how it often works in my own life. When I remove myself from an email list on campus because I’m sick to death of being inundated with what I consider to be the often petty concerns of my colleagues, it feels like tolerance. “Let them continue emailing about anything they want,” I think. “I just don’t want to be part of it.” When a Facebook conversation wanders into areas that I find either offensive or seriously misguided, my tendency is to withdraw from the conversation rather than insert my concerns. Tolerant, right? Not really.

I find in my own life, and I suspect I’m not unusual or unique in this, that “tolerance” is an umbrella term for “whatever.” “Different strokes for different folks.” “I disagree with you but I’ll defend to the death your right to say it” (although I almost certainly wouldn’t). In other words, one of the best safeguards against being judgmental and ethnocentric, a check against our natural human tendency to judge negatively those who believe, think, and act differently than we do simply because they are different, turns into a placeholder for laziness and a reluctance to engage even with what one most strongly disagrees with. When writing on topics related to diversity and difference, my students regularly include phrases such as “we all just need to accept people as they are” and “the world would be a better place if everyone would simply be more tolerant of differences.” Tolerance is not only the first virtue that gets mentioned in class discussion and assignments, but is often the only virtue in play. But is tolerance suitable as the primary virtue in a moral framework or ethic? And what if there are some things that must not be tolerated?

A brief but familiar story from the ancient Greek historian Herodotus provides a useful jumping-off point for asking uncomfortable questions about tolerance. In his Histories, Herodotus tells the story of King Darius of Persia, a (somewhat) enlightened ruler who was fascinated by the various customs of the different groups of people from the far-flung reaches of his empire who were part of his extended court. Darius noted, for instance, that two different groups of people—the Greeks and the Callatians (a tribe of people from what is now the Indian peninsula)—had strikingly different methods of dealing with the body of a person who died in their community. The Greek practice when someone died was to burn the dead body, while the Callatian practice was to eat the dead body.

Intrigued, Darius first asked representatives of the Greek community what he would have to pay or give them, what honors he would have to bestow on them, so that the next time someone died in their community they would eat the dead body instead of burning it, as was their custom. Appalled, the Greek representatives told Darius that no amount of riches or honors could possibly convince them to do such a horrible and immoral thing. Darius also asked a similar question of the Callatians—could I convince you to burn the next dead body you have to deal with in your community rather than eating it, as is your custom? Hell no! the Callatians said, insisting that nothing could convince them to do such a disgusting and immoral thing. Herodotus’s conclusion? “Custom is king.” What a person or group of people considers to be “right” or “moral” is what they are accustomed to, the practices of their family, their community, or their culture that they have been taught since their youth. Human nature causes us not only to embrace what we are most familiar with as morally right, but also to assume that it is right for everyone.

If “custom is king” and moral values are culturally defined, then the most important attitude to cultivate, the habit most likely to put up a firewall against unwarranted projection of one’s parochial practices and values on others, is undoubtedly tolerance. As Herodotus’ story is intended to illustrate, the best answer to the question “Who is right about the best way to dispose of a dead body—the Greeks or the Callatians?” is “Both, within the parameters of their culture.” Furthermore, there is no way to step outside one’s own culturally defined moral stance and be “objective.” There is no such objective standpoint. The only proper response to differences between groups, or perhaps even between individuals, is tolerance—the habit of accepting differences without judgment.

The problem, as a student quickly pointed out in each section of my ethics course, is that tolerance as an exclusive or primary virtue is not sufficient to account for many of our strongest moral intuitions. What if, for instance, the difference is about something more serious than the difference between eating or burning a dead body? What if the difference is between a culture that practices female circumcision and our culture that does not? Is tolerance appropriate in this instance? Are we to say “wow, I’m glad I don’t live in that culture, but for them that practice is morally right”? If our intuitions say that some practices cannot be tolerated, no matter what cultures adopt them, is this because our intuitions have been shaped by our own culture or because our intuitions are resonating with a moral absolute that transcends cultural differences?

Of such questions a great General Ethics class is made. But it appears that if we raise tolerance to primary virtue status, we at the same time take any commitment to moral principles that transcend cultural differences off the table. And that may not be a price worth paying. As I told my students the other day, a moral theory that does not account for our strongest moral intuitions is like trying to cover a queen-size mattress with a twin-size fitted sheet. It covers some of what needs to be covered, but not all of it. I, for one, am not ready to tolerate a theory like that.

Disturbing the Peace

Last Friday I attended a talk on campus by civil rights lawyer and law professor Greg Lukianoff on issues of free speech, trigger warnings, and a host of related matters on college and university campuses that are regularly in the news. He is the co-author of an article in The Atlantic a bit over a year ago that raised a lot of eyebrows and generated a lot of conversation. I wrote about it in the early weeks of my sabbatical last August:

I do not know how to teach philosophy without becoming a disturber of the peace. Baruch Spinoza

One of the lead articles in the most recent edition of The Atlantic magazine is “The Coddling of the American Mind.”

Lukianoff and Haidt: The Coddling of the American Mind

Co-authored by Greg Lukianoff and Jonathan Haidt, the teaser blurb for the article in the Table of Contents says “How a new strain of political correctness on campus is damaging higher education—and may be threatening students’ mental health.” It is an interesting read. Given Donald Trump’s current more-than-fifteen-minutes of fame, concerns about political correctness are in the news, but in this article Lukianoff and Haidt are drawing our attention to what might be called “political correctness with a twist”:

The current movement is largely about emotional well-being. . . . It presumes an extraordinary fragility of the collegiate psyche, and therefore elevates the goal of protecting students from psychological harm. The ultimate aim, it seems, is to turn campuses into “safe spaces” where young adults are shielded from words and ideas that make some uncomfortable. And more than the last, this movement seeks to punish anyone who interferes with that aim, even accidentally.

The authors’ argument is largely anecdotal, relying either on their own experiences or on recent anecdotal stories and essays from various campuses across the country. There is a great deal of speculation about the causes of this perceived seismic psychological shift among students over the past couple of decades, although virtually no data is provided to substantiate many of the authors’ claims.

In the first column of the article readers are introduced to two important terms that “have risen quickly from obscurity into common campus parlance”: microaggressions and trigger warnings. Microaggressions “are small actions or word choices that seem on their face to have no malicious intent but that are thought of as a kind of violence nonetheless.” Examples provided include asking an Asian American or Latino American “Where were you born?,” because this implies that she or he is not a real American. Trigger warnings are “alerts that professors are expected to issue if something in a course might cause a strong emotional response”; examples of texts deemed as needing trigger warnings on various campuses include Virginia Woolf’s Mrs. Dalloway (suicidal inclinations) and Ovid’s Metamorphoses (sexual assault). The many examples of these and related problems in the article are chosen and presented with the clear intention of “triggering” the reader into concluding “well that’s just stupid—political correctness, like a hydra, rears a new ugly head.” One of the authors’ primary concerns, repeated frequently throughout the article, is that such attention to words and actions that might possibly somewhere, somehow offend someone will leave students unprepared to live and work in a world that doesn’t give a crap about what makes them feel uncomfortable.

What are we doing to our students if we encourage them to develop extra-thin skin in the years just before they leave the cocoon of adult protection and enter the workforce? Would they not be better prepared to flourish if we taught them to question their own emotional reactions, and to give people the benefit of the doubt?

Even though I have twenty-five years of college teaching under my belt, my experience on college campuses is deep but narrow, given that I have taught at my current college home for twenty-one years and have shaped my teaching and professional life within the confines of its “105 acre, park-like campus.” Serious conversations about the negative power of language on students in various groups defined racially, economically, by gender or by sexual preference have been ongoing on my campus for some time now. In my own philosophy department regular, continuing, and often heated debates occur about what constitutes appropriate and inappropriate language in the classroom, in job candidate interviews, and in basic conversation with each other. What strikes some as obviously benign, scholarly, and insightful strikes others as ill-advised, insensitive, and downright offensive. That said, the tsunami described by Lukianoff and Haidt as drowning campuses nationwide has escaped my notice where I teach—at least in my classrooms. Perhaps this is because I have included this general “trigger warning” in every syllabus for every one of my courses for at least the past fifteen years:

In this course we will be considering some of the most important questions a human being can ask. Perhaps the most important feature of our considerations is learning to ask these questions clearly and precisely. Only then can possible answers be considered fairly. Although I have definite positions on the questions we will be addressing, my role as professor is not to tell you what to think. My role is rather to get you to think. Expect your assumptions to be challenged and comfortable ways of thinking to be disturbed. As the great 17th-century philosopher Spinoza once said, I do not know how to teach philosophy without becoming a disturber of the peace.

During an oral final exam a couple of semesters ago a student told me that “This class really messed me up—but in a good way!” Mission accomplished.

The new fall semester is just three weeks old—here’s a bit of advice related to safe spaces and learning for the incoming freshmen:

  1. Free speech dictates that everyone has the right to their opinion, but not all opinions are equal. One of the purposes of a liberal education is to help you become skillful at using the tools of lifetime learning; some of these tools, used properly, will help you learn how to distinguish a good argument from bullshit—even when it is your own argument. I often say that a liberally educated person earns the right to have an opinion. The process of earning that right begins with realizing that your opinion is not special just because it is yours, and without challenge and analysis it means nothing with regard to whether it is true (or even a defensible position).
  2. In the life of learning, comfort is vastly overrated. Expect to encounter people, ideas, situations and expectations that are both unfamiliar and well outside your comfort zone. You should be looking for these rather than trying to avoid them. If you manage to make it through your undergraduate college career without changing any opinion, belief, perspective or attitude, then your tuition dollars have been wasted.
  3. The world of adulthood into which you are making your first, tentative forays can be a tough, nasty place. The world out there is full of people, ideas, things, and events that couldn’t care less if they lie within your current comfort zone. As my wife would say, the world is what it is. Your years in college are not so much about your landing a well-paying job after you graduate as they are about the construction of a powerful and flexible moral and psychological framework of belief and commitment, from within which you will engage with what’s “out there” on a daily basis. It is not the world’s responsibility to provide you with comfort and security. It is your task to create and maintain a moral and psychological home for yourself in that world using all of the resources available to you, resources to sustain you on a life-long journey. By the way, you’ll be making significant renovations and additions to this home your whole life. Your professors are here to assist you in the construction of that home—good luck!

A liberal education, especially, inspires students to value struggle. By grappling with authors and ideas that demand the greatest level of intellectual intensity—and this is especially true in subjects that are difficult and uncongenial—students learn that they stretch themselves more through struggle, whether or not they win the match. Christopher Nelson

Embracing the Barbarian Invasion

Every year the world is invaded by millions of tiny barbarians. We call them “children.”  Hannah Arendt

One of the wonderfully gratuitous features of my early years as a college professor was the opportunity to teach regularly with a couple of master teachers. During the first decade of my teaching career at Providence College, I taught on an interdisciplinary Honors Development of Western Civilization team every year with two such colleagues. Rodney was a teaching icon from the English department who now, a few years after his untimely passing, has a tree on campus, a seminar room in the brand new humanities building, and an annual lecture named after him. He was one of the most dynamic and engaging pedagogues I have ever encountered; I remember telling Jeanne shortly after meeting Rodney in the middle nineties, during my first year at Providence College, that “when I grow up, I want to be Rodney.”

The other member of our teaching triumvirate, Ray, is an extraordinary professor out of the History department. He is also one of the flat-out finest human beings I have ever had the privilege of knowing. This coming spring Ray and I will be teaching a colloquium together for the third time in the past four years, a class fondly referred to by students as “Nazi Civ.” I am a far better teacher and human being for having spent so many years in the classroom in the company of these outstanding colleagues.

Because we spent so much time together in and out of the classroom, the three of us got to know each other’s business over the semesters a bit more than is typical between professional colleagues. We often spoke of our children; Rodney’s and Ray’s were young adults at the time, while mine were in high school and junior high. One morning before class as we were getting coffee in the break room, Rodney was bemoaning the fact that he had returned home from work the previous day at 5:00 in the afternoon at the very same time that his son, a twenty-something who was still living at home, emerged bleary-eyed from his basement bedroom for the first time that day. As we compared notes about the shortcomings and failures of our respective offspring, Ray, whom I had always pegged as the perfect father and husband, grew reflective. “I’ve heard so many parents talk about the wonders of parenthood, how raising children is such a privilege, how their children’s growing up years were the best years of their lives,” he said. “I guess I must have missed that.” Preach it, Ray. For all of our politically correct claims about the wonders of child rearing, all parents know that Hannah Arendt’s “tiny barbarians” comment is absolutely true. Civilizing barbarians is hard work.

The word “barbarian” is from the Greek word βαρβαρος (barbaros), the term Greeks used to refer to anyone who was not Greek. To the refined but xenophobic Greek ear, the sounds coming out of a non-Greek speaker’s mouth sounded like “bar, bar, bar”—hence, “barbarian.” We would call such persons “blahblahblahrians.” The wider connotation of “barbarian” is simply someone or something that does not fit into the expected categories, abide by the accepted rules, or behave according to agreed-upon standards. That description certainly fits children and a lot more—I frequently call our dachshunds barbarians when they pee or take a dump in the middle of the floor, just as I would probably call a human being a barbarian (and worse) if they did the same thing.

And yet there is something exhilarating about having barbarians in our midst. A world without barbarians, without unfamiliar hordes pressing against the outer walls of our holy-of-holies comfort zones, is a world that eventually would stagnate into a smug status quo. I realized this past semester, as I do in varying degrees every semester, that one of the regular features of what I do as a teacher is to let the barbarians loose on the civilized yet unexamined thought processes of my students. Philosophy is an inherently barbarian discipline because its entire raison d’être is the challenge to consider that one’s most cherished beliefs might indeed need improvement, that the doors and windows to the inner sanctum might regularly be opened to allow the smelly and scary barbarians in.

Several years ago, when I was still an untenured assistant professor and should have been keeping my mouth shut, I recall being involved in a conversation about this feature of philosophy during a philosophy department meeting. We were in the process of crafting a new “mission statement” for the department, an exercise guaranteed to generate disagreement. One of the older members, a Dominican priest who had been chair of the department for a couple of decades before my arrival, proposed that our mission statement read “The mission of the philosophy department is to teach the Truth.” Period—and make sure that it’s a capital “T” on “Truth.” I, along with several others, suggested that this would presume that we possess the Truth with a capital T, a presumption that is directly contrary to the very spirit of the philosophical enterprise. In a condescending tone (or at least so it sounded to me), another priestly colleague said “Vance, some of us around here think we have the truth,” to which I replied “And here I thought we were a philosophy department.”

So how does one keep the pursuit of truth alive without it being sidetracked into defense of the Truth? Over the past several years in my teaching and writing this question has been directed more and more toward the arena within which Truth rears its ugly head most often—religious belief. During my sabbatical semester at an ecumenical institute five years ago I described my original book project as follows: “Is it possible to live a life of human excellence, of moral focus and spiritual energy, in a world in which the transcendent is silent, in which God is arguably absent?” As I led an afternoon seminar based on my early work on this project with a dozen fellow “resident scholars,” one of them—a Lutheran pastor—asked “But Vance, don’t you have to believe something with certainty if you’re going to call yourself a Christian?” To which I replied, “I don’t know—do I?” I had been wondering that for many years, but this was the first time I had said it aloud. And it was liberating. What would a faith in which no “truth” is a “Truth,” a faith in which no cows are sacred, look like?

As I’ve dug into these questions with new energy and focus over the past few years, several matters have become clear, beginning with the fact that the transcendent is not silent after all and God is definitely not absent. They just show up in entirely different places than where we have traditionally looked for them. And I am finding that, for me at least, a vibrant faith requires little in the way of defending the Truth, but rather a willingness to welcome the divine even when wrapped in unexpected packages. As James Carse writes,

This is Christianity’s strongest feature: it tirelessly provokes its members to object to prevailing doctrines without having to abandon the faith . . . Neither Christianity nor any of the great religions has ever been able to successfully erect barriers against the dreaded barbarian incursions of fresh ideas. 

Such barbarian incursions are not to be feared or defended against. They are to be invited and welcomed. Just as the millions of tiny barbarians who invade the world every year are actually the way in which the human species is renewed and regenerated, so the regular introduction of barbarian ideas into our civilized and supposedly completed belief systems will keep those beliefs from turning into idols. What would a faith in which no “truth” is a “Truth,” a faith in which no cows are sacred, look like? It would look a lot like Faith—the real thing.

Good Morning, Psalms

Last Thursday, in just our second class of the semester, I had the opportunity to introduce my ethics students to the master of all things ethical. The key to Aristotle’s understanding of the life of human flourishing is that such a life depends on the formation of the best habits—the virtues—to guide one’s life. Aristotle conceived of the life of freedom and moral excellence as a life constructed out of the virtues, good habits that, when cultivated, incline a person to do the right thing as a matter of developed character rather than conscious choice. Habits are established by repetition and, once formed, are often very difficult to change. Accordingly, one should take great care that one’s moral habits are the right ones (virtues) and not the wrong ones (vices), since the wrong habits, once entrenched, will be next to impossible to replace with better ones.

I have taught Aristotle’s ethics for many years and believe that although Alfred North Whitehead was probably correct when he said that all of philosophy is a series of footnotes to Plato, the best thinking about ethics begins with Aristotle. And his insights concerning the importance of habits are relevant beyond the ethical realm. I find myself in the best physical shape of my life now in my early sixties because several decades ago my grudging daily trips to the gym somehow turned into a habit that I no longer had to talk myself into. Reading psalms with Benedictine monks in Minnesota three times a day during my 2009 sabbatical established a habit of reading the three or four psalms appointed for each day in the Book of Common Prayer that continued for several years after my sabbatical ended. Between my alarm at 5:15 AM and getting to the gym by its 6:00 opening time I read the day’s psalms aloud (or murmured them, lest I awaken the dogs and Jeanne). I am convinced that this simple habit helped transfer important changes in my life from sabbatical to real life, and also contributed to the preservation of my sanity as I juggled full-time teaching with the additional full-time duties of running a large academic program for four years.

But then I lost the habit, under the strangest of circumstances. My next sabbatical arrived, and with the prospect of unlimited time to rest, re-center, read, and write in front of me, somehow the daily regimen of early morning psalm reading fell by the wayside. I no longer needed to arise at 5:15, I rode my new bicycle obsessively instead of making daily trips to the gym, I applied myself energetically to my sabbatical writing project, and somehow my simple ten to fifteen minutes alone with the psalms every morning didn’t make the cut. I made no conscious decision to end the habit—I just did. If Aristotle is correct in saying that well-established bad habits are very difficult to break, it turned out—in my case at least—that good habits can be broken very easily. I didn’t even consciously realize for several weeks that my psalm reading habit had gone by the wayside; once I noticed its absence, I made a few half-hearted attempts to start again over the following months. But they didn’t take.

I returned to the classroom for the first time in fifteen months a week ago, and decided that along with a return to a 5:15 wake-up call, I would attempt to re-establish my psalm reading habit. With only a week under my belt, the returns are promising; coming back to the psalms has been like becoming reacquainted with very wise friends who have been away for a while. My renewed acquaintances include:

Monday, August 29: Psalm 139

The opening psalm on the list for my first day back was one that, depending on my mood and what’s going on in my life, has been either very disturbing or deeply comforting.

O LORD, you have searched me and known me.

You know when I sit down and when I rise up;

you discern my thoughts from far away . . .

Where can I go from your spirit? Or where can I flee from your presence?

If I ascend to heaven, you are there; if I make my bed in Sheol, you are there.

If I take the wings of the morning and settle at the farthest limits of the sea,

even there your hand shall lead me, and your right hand shall hold me fast . . .

For it was you who created my being, knit me together in my mother’s womb.

Sometimes this Psalm reads like a description of a divine stalker, but more often the mere improbability that the creator of the universe cares about lil’ ole me is overwhelming. If I were inclined to be an atheist, or at least an agnostic, it would probably be because of this very point—the idea that God cares about human beings in any specific sense at all. Most of what we observe and experience screams against it. Our obvious insignificance screams against it.

Psalm 139 offers hope in the face of insignificance. Perhaps there is one place where I do not need to be an impostor or be overwhelmed by my insignificance, a place where I am known better than I know myself and am valued more highly than I could ever manufacture. The other day at convocation, NY Times Pulitzer Prize-winning journalist Nicholas Kristof told the hundreds of students and faculty in attendance that at those times when one feels insignificant, like a single drop of water in a very large bucket, a drop that can’t possibly make a difference, we should remember that buckets are filled one drop of water at a time.

Tuesday, August 30: Psalm 146

The final entries in the collection of 150 poems are praises of various sorts; noon prayers at St. John’s Abbey in Minnesota, the place where I first learned to inhabit these ancient poems, include one of the final five psalms in rotation. I always looked forward to Psalm 146, which for me summarizes what God—and therefore those who profess to follow God—cares about the most.

It is the Lord who keeps faith forever, who is just to those who are oppressed.

It is God who gives bread to the hungry, the Lord, who sets prisoners free,

the Lord, who gives sight to the blind, who raises up those who are bowed down,

the Lord, who protects the stranger and upholds the widow and orphan.

When John the Baptist sends some of his followers from his prison cell to ask Jesus whether he is the Messiah, “the one who is to come, or should we look for another?” Jesus responds in the language of Psalm 146. Tell John that the blind see, the lame walk, the hungry are being fed, strangers are being welcomed, and those imprisoned are being set free. That’s how you can tell when the divine is in the house, when human beings are in tune with what is greater than themselves. Imagine how different our nation, our world, would be if the above lines were the defining touchstone for success.

Thursday, September 1: Psalm 1

The compilers of the Psalms chose to kick things off with a description of happy people, those who “delight in the law of the Lord.”

They are like a tree that is planted beside the flowing waters,

That yields its fruit in due season and whose leaves shall never fade;

and all that they do shall prosper.

I have always been fascinated with trees, but have come to love them in a deeper way over the past several years. Their stability, rootedness, and beauty have become iconic for me. I write about trees frequently in this blog: within the past few months I have written about Tolkien’s Ents, arboreal survival strategies, oaks of righteousness, and how the removal of a tree more than 150 years old on campus this summer was traumatic for all involved. In an interview with Krista Tippett, theologian Ellen Davis said that “anything in our world now that slows us down is to be valued and maybe as a gift and even a calling from God.” The fact that the first analogy in the Psalms for the person who “meditates on God’s law day and night” is a tree silently proceeding through its seasons of fruitfulness and prosperity confirms Davis’ insight. I may not meditate on God’s law day and night, but fifteen minutes a day is doable.


Nice Work If You Can Get It

Not long ago, I read a Huffington Post article summarizing what a bunch of anthropologists found out about the daily work habits of university faculty.

What Do Professors Do All Day?

After spending two weeks with a non-random sample of sixteen faculty of different ranks at Boise State University, the researchers found that on average the faculty worked 51 hours during the work week and 10 hours on the weekend at a host of different tasks. Amazing. It took a study to find out that teachers have full-time jobs after all and actually do work-related things on the weekend (although the numbers for both week and weekend seem a bit low). I’m wondering how to square these remarkable results with an article I read a couple of years ago claiming that “University Professor” topped the list of “Least Stressful Jobs of 2013.” Of course I had to respond . . .

Those who read this blog regularly or even occasionally know that I believe I have the greatest job in the world. For those who are finding this out for the first time, let me repeat—I have the greatest job in the world. As a matter of fact, it is so great that I don’t consider it to be a job at all. For me, teaching is a vocation rather than a job, something that I truly believe I am called and was born to do. I raise eyebrows occasionally on the campus of the Catholic college at which I teach when I say that I consider teaching to be a vocation for me just as much as being a priest is the vocation of the guys who walk around campus in white robes. But even though I love what I do to an almost unhealthy extent, I was taken aback when I learned from a colleague via Facebook that “University Professor” is listed by CareerCast.com at number one in its top ten list of “Least Stressful Jobs of 2013.”

The Ten Least Stressful Jobs of 2013

Really? Or as one of my colleagues commented on Facebook, “Bullshit!!! [pardon my advanced degree French].” I guess I must have failed to notice how non-stressful my job is during my 25-year career as a university professor.

Every person who teaches in higher education has a virtual file full of stories about how difficult it is to get a non-academic friend or family member to understand exactly what is involved with being a college professor. Most difficult is getting someone to understand that this is not a part-time job. For instance, Cousin Bob finds out that the typical teaching load for a faculty member at a teaching college or university is three or four three-credit courses per semester (or perhaps five at a two-year college), meaning that the faculty member is in class at most fifteen hours per week. “Must be nice to make a full-time salary at a part-time job!” Cousin Bob remarks. Early in my career I often patiently pointed out to the Cousin Bobs in my family that a good rule of thumb is that a teacher spends three to four hours outside of class (preparation, reading, grading, meeting with students, etc.) for every hour spent in class. Do the math, and those fifteen hours in class become sixty to seventy-five hours of work per week. “Really?” Cousin Bob replies. But he clearly is not convinced, since, as we all know, easy working hours are the main reason that a person becomes a teacher.

Then, of course, Cousin Bob wonders about all of the weeks of the calendar year that I am not in the classroom. Christmas break, Spring break, the summer—teachers apparently get at least twenty weeks per year off. “Must be nice to make a full-time salary at a part-time job!” With what begins to feel like infinite patience, I explain to Cousin Bob that teaching is only one part of a university professor’s job. In order to advance through the ranks of promotion and, more importantly, in order to get tenure, one must research and publish on a regular basis. For most college faculty, the time and focus required for this aspect of the profession is not available during the semester, so the “breaks” are all about research, writing, and praying for publication. But I’m not in the classroom, right? “Must be nice to make a full-time salary at a part-time job!” You get the idea. A colleague once told me about his frustrated response to a relative after one too many such conversations. Upon hearing “Must be nice to make a full-time salary at a part-time job!” one too many times, my colleague replied, “It actually is really nice. If you were smart enough you could have a job like that too.”

CareerCast’s explanation of why University Professor is the least stressful job of 2013 has more than a whiff of Cousin Bob behind it, just expressed in a slightly less aggressive fashion. For instance, the article explains that

University professors are at the pinnacle of the education field. Their students are largely those who choose the classes they attend, and thus want to be in class. Unlike elementary and secondary educators, the performance of college professors isn’t evaluated based on standardized tests. University professors also have the opportunity to earn tenure, which guarantees lifetime employment.

A full response would require something more like a book chapter than a blog post. Suffice it to say that the author of the article (1) has apparently never heard of core requirements if he thinks that college professors face “students . . . who choose the classes they attend, and thus want to be in class,” (2) is not aware that despite the (usual) lack of standardized tests, college teachers are evaluated by their peers, are answerable for the results of student evaluations, and are under regular scrutiny in every aspect of what they do, and (3) needs to learn something about the tenure process (assuming that the faculty member is fortunate enough to be teaching at an institution that uses the tenure process).

Tenure. Such job security is certainly unusual in today’s job market, and tenure is an attractive “perk” of the academic life. Once one earns it, that is. Tenure-track positions are hard to come by in academia, more and more so as many institutions opt for hiring year-to-year adjunct professors or special lecturers and then proceed to treat them as well-dressed slave labor (don’t get me started on that one). Should a teacher be fortunate enough to land a tenure-track position in today’s heavily buyer-skewed academic marketplace, the stress she or he will experience in the next several years leading to the tenure decision will be sufficient to last a lifetime. As is undoubtedly the case in many workplace environments, the tenure decision is often as much or more about internal campus politics as it is about the qualifications of the candidate and those things that she or he can control. “The opportunity to earn tenure” is indeed that—an opportunity that, unfortunately, for many talented and qualified teachers will never be available.

Then there’s the money. The article author points out that

Harvard University pays full-time professors $198,400, with a 7:1 professor-to-student ratio, while University of Chicago professors receive $197,800 per year with a 6:1 ratio. Among public universities, the University of California at Los Angeles (UCLA) is highest paying, with an average wage of $162,600 for its full-time staff.

Really? All of them? At all levels? In all disciplines? Such “statistics” are useless without context, about as useless as telling a lawyer in a public defender’s office working 80-100 hours per week and struggling to make ends meet that the senior lawyers in the top firms on Wall Street often make seven figures annually. Here’s an anecdote from the real world. At least a dozen years into my teaching career, still paying off the loans from ten years of college in order to earn the PhD required to teach at the college level in my discipline, it occurred to me one day that the second-semester seniors in my Business Ethics class, most of whom would be entering the work force shortly after graduation as entry-level employees with undergraduate business degrees, would be starting at a monthly salary noticeably higher than my own. As I once told a younger colleague when I was chair of my department, in response to his complaints about how little he was being paid, “if you became a teacher for the money, you’re a moron.”

I have reached the level of experience and rank (a tenured, full professor) at which one might expect stress levels to diminish and ultimately disappear. But people like me are the ones tapped, appropriately, to commit significant time to the third leg of the academic stool (along with teaching and research): service. After four years as chair of a department of 25 faculty and having recently completed a four-year stint as the director of the core academic program at my college, responsible for 80 faculty and upwards of 1700 students at any given time, I realize that one sort of stress just gets replaced by another.

And actually that’s fine, since it is all part of the vocation I was born to inhabit. There are many attractive features to the life of a university professor. I can think of no other profession in which one’s creativity is required more often or in which one has more autonomy and flexibility. But it is anything but stress-free. A teacher never leaves the office; the work goes with you everywhere. I realized at one point early one December that, other than Thanksgiving Day, I literally had not had a day off since the middle of August. This is why I have recommended the teaching profession to no more than a half-dozen of my best students in more than twenty years of teaching. If you are looking for a profession that will fit nicely with your family obligations and other interests, don’t become a teacher. If you want to make a living wage at a stimulating 40-45 hour per week job, don’t become a teacher. If you want to “work to live” rather than “live to work,” you probably should not become a teacher. If you think of teaching as one among many equally interesting career possibilities, don’t become a teacher. But if you are incurably obsessed with the life of learning, if the dynamic of new ideas exhilarates you, if you suspect that you might have the heart of a teacher and nothing else will fulfill you, then this highly stressful but highly rewarding vocation might just be for you. It’s nice work if you can get it.

You Are Free, Therefore Choose

Each of us can point to a time (or several times) in our past when we made a decision that, in retrospect, significantly shaped our lives going forward. Such decisions for me include getting married at one month past twenty, choosing to leave law school for a master’s program in philosophy, and deciding to commit for life in my early thirties to a person whom I had known for six weeks. I could have chosen differently in each of these cases, and my life would be much different now than it is. But could I really have chosen otherwise? The answer of many “experts” from psychology, science, philosophy, and more is “no.”

A week ago I wrote about how the “experts” have gradually but inexorably come to this conclusion, describing how evidence from Darwin to neuroscience supports the claim that everything about me, including all of my choices, is fully determined by both biological and environmental causes beyond my control.

They Will Never Take Our Freedom

The experts admit that I will undoubtedly continue to believe that some of my choices are free in the sense that I could have chosen otherwise, but that belief, they say, is based on an illusion. My choices may feel free, but they really aren’t. If true, the news that free will—the foundation of what most of us believe concerning morality, reward, punishment, praise, blame, and responsibility—is an illusion cannot be taken lightly. Nor, I would argue, need I believe, either as a philosopher or as a human being, that the “experts” are right about this. Free will is only an illusion if one accepts the starting assumptions that energize the argument against human beings having real free will, assumptions that include the belief that everything that exists is made of physical matter, that physical matter is governed by inexorable physical laws, and that we generally know what those laws are. These assumptions are so entrenched among the “experts” that challenging them is as uphill a battle as trying to argue that the earth is flat. But I’ll give it a shot.

I often tell my students that each person, among her or his beliefs, has a small handful of what Aristotle called “first principles.” Aristotle knew as much about cause and effect as anyone; indeed, he arguably invented our familiar system of logic that is built on the belief that we live in a world governed by cause and effect relationships. These relationships shape how our beliefs hang together as well. Consider the following conversation:

Me: I believe A is true.

You: Why?

Me: Because A depends on B, and I believe B is true.

You: Why?

Me: Because B depends on C, and I believe C is true.

You: Why?

There’s a pattern here. We all seek to support our beliefs by referring to connected and deeper beliefs on which they depend. There’s also a problem here, though. The chain of cause and effect has to end somewhere if we are to avoid the dreaded “infinite regress.” So eventually we get this:

Me: I believe that X is true because X depends on Z, and I believe Z is true.

You: Why? (you’re getting really annoying, by the way)

Me: Because I do.

In Aristotle’s terminology, I have just identified “Z” as one of my first principles. In order to avoid an infinite regress, eventually we arrive at a belief for which we seek no further justification than that we believe it. Such first principles vary from person to person—some common ones include “Human life is intrinsically valuable,” “Human beings are fundamentally equal,” and “Do unto others as you would have them do unto you.” For many, including myself, “Human beings have the capacity to choose freely, choices that are not entirely determined by matters outside their control” is another first principle which, if true, stands in direct opposition to what the “experts” claim the truth to be. And like it or not, no one wants to hear that a first principle is an illusion.

When I choose freely, I deliberate between available options, weigh the evidence for and against each, and choose the option that best satisfies my operational criteria. I cause the choice, in other words, influenced but not determined by any number of factors. This simple idea—that a human being can choose without the choice being fully determined—violates assumptions so prevalent among the “experts” that it is tantamount to heresy. And to be sure, this simple idea is indeed a radical one, for it claims that the freely choosing human being is an exception to the inexorable laws of matter, capable of starting her or his own chain of causation that would not have happened without the choice starting the chain. There are few beings in the annals of philosophy with this power. Aristotle called his creating force—what we might call “God”—the “unmoved mover” or “the first cause uncaused,” the place where the chain of causation begins (or ends if one is going backwards). In short, human beings act “in loco parentis,” with the causal power of the divine itself, when we make free choices. If one wants to go religious, it is this very creative power of free choice that the sacred texts are referring to when they claim that human beings are “created in the image of God.”

The position that truly free choices step meaningfully outside the laws of nature has been called “metaphysically peculiar” by some philosophers, simply “bullshit” by others. Free will deniers assume that any human capacity that purportedly steps outside the laws of physical matter must be an illusion, since we all know that everything is made of matter and that matter is governed by deterministic laws. To which I respond, as I often do, in the words of Shakespeare’s Hamlet: There are more things in heaven and earth . . . than are dreamt of in your philosophy. Presuming that everything in heaven and earth can be reduced to the confines of our current understanding of reality is hubris of breathtaking proportions. When a fundamental and definitive human ability is defined out of existence because of narrow assumptions, I choose to question the assumptions rather than the reality of the fundamental human ability. When answers to a question do not square with our strongest intuitions and beliefs about ourselves, change the question.

None of the above is very philosophical of me, at least not in the contemporary sense. I freely (yes, freely) admit that human free choice might be an illusion, but I see no reason to believe so based on an argument with questionable assumptions. I choose rather to embrace the mystery and miracle of being human and to believe, until better contrary evidence is provided and in keeping with the intuitions of billions of human beings, that at least some human choices make a difference—such a great difference that they make the world a different place than it would otherwise have been. And human beings are not just vehicles of that change—their choices cause that change. Maybe we just don’t know enough about reality to rule out abilities that don’t square with our current understanding of things. Maybe human beings are truly the crowning glory of creation, endowed with a spark of the divine that reveals itself in our most basic capacities. Maybe all of the above. Take your pick. As Jean-Paul Sartre used to say, “You are free, therefore choose.”