Category Archives: philosophy

West of What?

The heart of Providence College’s core curriculum is the Development of Western Civilization (DWC) program, a sixteen-credit, four-semester, interdisciplinary and team-taught series of courses required of all freshmen and sophomores regardless of major. From its origin in the 1970s, DWC (or “Civ,” as many call it) has been a regular source of pride and an occasional source of controversy; recently it has been both. I have taught in the program for sixteen of the twenty-one years I have taught at the college, and I directed it for the four academic years before my 2015-16 sabbatical. At the end of August, on the first day of the current semester (my first day back in the classroom in fifteen months), my colleagues and I chose to spend our first ninety minutes with our new freshman students raising the question that many of them were probably thinking but were too afraid to ask: Why study Western civilization at all?

The very existence of a program such as ours is controversial in many quarters, where people wonder, in an increasingly diverse and globally connected world, whether the study of “dead white guys” is anything more than a lame and outdated exercise in cheerleading for white folks. But others seek to defend the primacy of Western civilization over all others. One of my colleagues began the class with a clip from the Republican National Convention a couple of months earlier. A talking-head panel was filling up air space waiting for the first speaker of the evening. After one person mentioned that the hall was filled with lots of angry white people, Iowa Congressman Steve King took offense.

King: “This ‘old, white people’ business does get a little tired. I’d ask you to go back through history and figure out, where are these contributions that have been made by these other categories of people that you’re talking about—where did any other subgroup of people contribute more to civilization?”

Panel moderator: Than white people?

King: Than Western civilization itself. It’s rooted in Western Europe, Eastern Europe and the United States of America and every place where the footprint of Christianity settled the world. That’s all of Western civilization.

King’s comments astounded a couple of people on the panel and outraged a third—as the table exploded into shouting, the moderator wisely broke for a commercial.

The congressman’s comments reminded me of something brought to my attention a few weeks earlier. A person who gave a talk at my college over fifteen years ago remembered it in a recent column on a conservative website, recalling that some of the people present were not happy with the content of his talk.

I am sure I noted the fact that some cultures are better than others, and I might well have quoted (with enthusiastic approval if I did) William Henry’s observation, in his book In Defense of Elitism, that “It is scarcely the same thing to put a man on the moon as to put a bone in your nose.” I am very fond of Henry’s native proboscis image, not least because, like Saul Bellow’s question, “Who is the Tolstoy of the Zulus?”, it is so reliably productive of rage among pampered left-wing academics.

These comments could have been written or said by some of my colleagues on campus who often defend the existence of our DWC program by asking “Why wouldn’t we want to focus close-to-exclusive attention on Western Civilization? It’s the best civilization ever!”

After Congressman King’s clip from the RNC, my teaching colleagues began a conversation with our students by asking basic questions such as “What is Western Civilization?” The students seemed unsure how to answer other than to say “we are!” After a few minutes I made my first contribution. “This is going to sound like a stupid question,” I said, “but I’m sitting over here listening to all of you talk about ‘Western Civilization’ and I’m wondering: West of what? Are we talking geography here? What is Western Civilization west of?” None of us had thought to load a map onto PowerPoint, so I just asked the students to picture Europe, Asia, and Africa in their imaginations. “If we draw a dotted line from north to south to divide the East from the West, where would we draw it?”

The first suggestion was that it would go through the Middle East, in a way that would perhaps include Israel in the West but perhaps the rest of the Middle East in the East. “What about Turkey?” one of us asked. They weren’t sure. Of course if the dotted line continues straight south from Israel, another question arises. “What about Africa? Is it in the West?” I asked. “No,” just about all of the students answered, meaning that our dotted line needs to take a sharp turn left from Israel, bisecting the Mediterranean Sea so that Europe to the north stays in the West (as everyone knows it should) and Africa to the south stays non-West (as everyone knows it should). Remembering Congressman King’s inclusion of the United States in Western civilization (even though it does not make an appearance in DWC until the beginning of sophomore year), the dotted line should cross the Atlantic so that the U.S. and Canada lie on the Europe (northern) side of the line. Mexico and Central America? Iffy. South America? Not sure. It was pretty clear that the line was intended to include those most like us in the West and exclude everyone else.

This exercise established pretty quickly that the answer to West of What? cannot be geographical. One student then helpfully suggested that “I think ‘Western’ refers to various ideas rather than geographical areas.” Asked for examples, students suggested things like democracy, human rights, capitalism, and monotheism. One of my colleagues, a theologian, jumped on that one. “Is Christianity a Western religion?” he asked. “Of course!” was the answer (Congressman King would have been proud). “What about Judaism?” The students weren’t as sure there, even when my colleague pointed out that both Christianity and Judaism started in the same place and that Jesus was a Jew. “What about Islam?” Most of the students thought not (Donald Trump would have been proud), prompting my colleague to point out that for a number of centuries during the Middle Ages the westernmost area of Europe—Spain—was Muslim. Trying to define “Western” by religion doesn’t work much better than defining it geographically.

Finally, one brave student said “I think that ‘Western’ for most people means pretty much the same thing as ‘White’.” And I suspect there’s a great deal of truth in that suggestion. When we study Western Civilization, we will undoubtedly end up studying predominantly dead white guys unless a concerted effort is made to engage with non-Western influences. I mentioned to the students, for instance, that most people don’t know that one of the greatest—perhaps the greatest—philosophers in the Western tradition, Plato, was heavily influenced by thought from Persia and India. The dotted line is porous, in other words. Furthermore, one of my colleagues who is of Italian ancestry pointed out that one hundred years ago, immigrants from southern European countries such as Greece and Italy were not considered white by many in this country.

After ninety or so minutes of this, I’m sure our students left largely in confusion. And that’s a good thing. Despite the name of the course, “Western Civilization,” upon close scrutiny, turns out to be as much a reflection of our expectations and prejudices as anything else. This is good for seventeen- and eighteen-year-old young adults to understand at the outset of their forays into their college education. We live in a polarized political and social climate in which labels and categories are used as if they are as objective and set in stone as the Ten Commandments. Only when we realize that all such dividing lines are at their roots arbitrary, porous, and fluid might we seize the opportunity to do better.

The Right Niyyah

As I wait impatiently for my sabbatical book manuscript, which is under contract with a publisher, to return from the editor, I’ve been thinking about some of my blog essays that “made the cut” in some sense to appear in revised form in my book-to-be. One of these essays is about the challenge of cultivating the right attitude with which to enter the world on a daily basis. I learned a lot about this from Rami Nashashibi when he was interviewed a year or so ago on Krista Tippett’s “On Being.”

Regular readers of this blog know that I am a fan of Krista Tippett’s radio program “On Being,” a show that I frequently catch several minutes of on Sunday mornings as I drive the fifteen minutes from our house to the early show at church. A few weeks ago, her guest was Rami Nashashibi, Founder and Executive Director of the Inner-city Muslim Action Network, or IMAN, in Chicago. He’s also a Visiting Assistant Professor in Sociology of Religion and Muslim Studies at Chicago Theological Seminary.

On Being: A New Coming Together

Tippett describes Nashashibi at the beginning of the interview as using

graffiti, calligraphy, and hip-hop in his work as a healing force on the South Side of Chicago. A Palestinian-American, he started his activism with at-risk urban Muslim families, especially youth, while he was still a college student. Now he’s the leader of a globally-emulated project converging religious virtues, the arts, and social action. And he is a fascinating face of a Muslim-American dream flourishing against the odds in post-9/11 America.

Not surprisingly, the conversation was wide-ranging, fascinating, and introduced me to a number of matters and issues that are well outside of my usual frame of reference. What particularly grabbed me, however, was a brief exchange toward the end of the interview, just as I was pulling into my usual parking spot at Trinity Episcopal.

Krista Tippett: I told you I was on your Twitter feed, and I love this. I think it was really recent. You wrote: “My 4-year-old discovers the spiritual power of her name as she looks over and seriously asks, ‘Daddy, do you have the right niyyah?’” What does niyyah mean?

Rami Nashashibi: So niyyah — in kind of Arabic-Muslim parlance — is spiritual intention. And oftentimes — it’s both the Swahili and Arabic word. And oftentimes, Muslims are always asked before they pray, before they do any act of service, before they engage in anything that has any kind of sense of worship associated with it, is it being done for the right niyyah? Is it being done for the right purpose? Are you attempting to get fame or credit? I think, yes, there was a song that had used her name in that way, and the light went off in the middle of it, and she turned over to me on the couch and asked me that question. Honestly, I looked at her and I didn’t have an answer for her for I think a good 20 seconds. She nodded her head and she said, “No, probably not.”

And I said, “What?” We then had a conversation. I said, “Well, at least keep me in your prayers that I have.”

This four-year-old’s simple question—Do you have the right niyyah?—has stuck with me ever since. So has her response to her father’s lack of response—“No, probably not.” It’s hard enough to figure out what the right thing to do is on a daily basis; adding in that it should be done with the right intention, for the right reasons, seems like piling on. As a philosophy professor who has taught introductory ethics courses more times than I care to count over the past twenty-five years, I have thought about this a lot. When I ask my students “What is more important—what you do, or why you do it? Actions or intentions?” they usually split roughly down the middle.

And so do the great moral philosophers. There is the tradition of those who say that only results matter (since they can be observed and measured publicly) and intentions are irrelevant. Then there is the other tradition (spearheaded by Immanuel Kant) which says that results are irrelevant—the true measure of the moral life is internal. Were your intentions pure? Was your heart in the right place? If so, then you are morally in the clear, even if the results of your intended action go “tits up” (to quote my brother-in-law).

My students are pretty smart, and it doesn’t take very long before they realize that the “results or intentions” question is a false dichotomy. Because in truth, normal human beings care about both. If morality is just about doing the right thing, then the person who identifies the things that should be done and does them—even if for all of the wrong reasons, such as self-righteous smugness or the praise of others—is morally in the clear. But Nashashibi’s four-year-old daughter is right—we want not only the right thing to be done, but for it to be done with the right niyyah, the right intention or reason. And that sucks, because it takes things straight into the human heart. For those who profess the Christian faith, it also takes things straight into the world of grace.

The first thing I ever learned from Scripture about the human heart as a young boy was from Jeremiah: “The heart is deceitful above all things and desperately wicked—who can know it?” Far less attention was paid to the Psalm that is recited in liturgical churches during the Ash Wednesday liturgy: “Create in me a clean heart, and renew a right spirit within me. Cast me not away from your presence, O Lord, and take not your Holy Spirit from me. Restore unto me the joy of your salvation, and renew a right spirit within me.” Straight from the Jewish scriptures come both the problem of and the solution for right intentions. As a flawed human being, I am incapable of doing things for the right reason, but help is available. Through divine grace the heart is changed and turned toward the good. Rami Nashashibi’s daughter is right when she doubts that her dad has the right niyyah, so long as that depends on his own energies and strength. But when the divine gets involved, everything changes.

The mystery of grace is exactly that—a mystery. Divine grace enters the world through flawed human beings, strangely enough, and there isn’t enough time to try to figure it out. Grace is something to be channeled, to be lived, not systematized and turned into dogma or doctrine. The poet Christian Wiman writes beautifully about this. Through many years of cancer treatments, he learned to hear God, then to channel God, in the most unlikely places, the very places where divine grace apparently lives. Wiman writes that

God speaks to us by speaking through us, and any meaning we arrive at in this life is composed of the irreducible details of the life that is around us at any moment. . . . All too often the task to which we are called is simply to show a kindness to the irritating person in the cubicle next to us, say, or to touch the face of a spouse from whom we ourselves have been long absent, letting grace wake love from our intense, self-enclosed sleep.

The right niyyah is not the result of struggle, training, or calculation. And as the author of Deuteronomy tells us,

Nor is it beyond the sea, that you should say, “Who will cross the sea for us to get it for us and make us hear it, that we may observe it?” But the word is very near you, in your mouth and in your heart, that you may observe it.

All I have to do to have the right niyyah is to open my heart, open my mouth, and let it out.

To experience grace is one thing; to integrate it into your life is quite another. What I crave now is that integration, some speech that is true to the transcendent nature of grace yet adequate to the hard reality in which daily faith operates.

Tolerance on Steroids

What happens when a perfectly good virtue gets turned into not only the most important virtue, but in many cases the only virtue? I have come face to face with this question in the early weeks of this semester with fifty juniors and seniors in two ethics classes. I chose this past summer to organize my General Ethics course, usually a tour of several of the notable moral theories in the Western philosophical tradition (Aristotle, Kant, Mill, Sartre, etc.) that are then applied to the details of human experience, by starting instead with those messy details themselves. We find ourselves in a world of competing religious, moral, and political claims shouting at each other across various divides, claims that are both incompatible with each other and resistant to compromise. How in the midst of diverse perspectives that too often lead to violence are we to find a place of stability from which to plot the way forward?

I have discovered both from early class discussion and student writing reflections what I suspected—most of my young adult students have been taught for as long as they can remember that the “go-to” virtue that must be cultivated in order to wend one’s way through the minefield of incompatible beliefs and commitments is tolerance. It’s interesting that the granddaddy of virtue ethics, Aristotle, did not include tolerance in any of his lists of virtues—apparently such a virtue was not particularly useful in fourth-century BC Athens. Tolerance is also rejected by many contemporary people as a sign of weakness, of lacking commitment to one’s beliefs, and of a willingness to compromise too quickly. But for many in our culture, particularly those who might consider themselves “liberal” in some sense, tolerance is the proposed remedy for many of the things that ail us.

Don’t get me wrong—I have no problem with tolerance as a virtue. As a matter of fact, it probably plays as regular a role in my life on a daily basis as any virtue you could name. My concern about tolerance arises from intimate familiarity with how it often works in my own life. When I remove myself from an email list on campus because I’m sick to death of being inundated with what I consider to be the often petty concerns of my colleagues, it feels like tolerance. “Let them continue emailing about anything they want,” I think. “I just don’t want to be part of it.” When a Facebook conversation wanders into areas that I find either offensive or seriously misguided, my tendency is to withdraw from the conversation rather than insert my concerns. Tolerant, right? Not really.

I find in my own life, and I suspect I’m not unusual or unique in this, that “tolerance” is an umbrella term for “whatever.” “Different strokes for different folks.” “I disagree with you but I’ll defend to the death your right to say it” (although I almost certainly wouldn’t). In other words, one of the best safeguards against being judgmental and ethnocentric, a check against our natural human tendency to negatively judge those who believe, think, and act differently than we do simply because they are believing, thinking, and acting differently than we do, turns into a placeholder for laziness and a reticence to engage even with what one most strongly disagrees with. When writing on topics related to diversity and difference, my students regularly include phrases such as “we all just need to accept people as they are” and “the world would be a better place if everyone would simply be more tolerant of differences.” Tolerance is not only the first virtue that gets mentioned in class discussion and assignments, but is often the only virtue in play. But is tolerance suitable as the primary virtue in a moral framework or ethic? And what if there are some things that must not be tolerated?

A brief but familiar story from the ancient Greek historian Herodotus provides a useful jumping-off point for asking uncomfortable questions about tolerance. In his Histories, Herodotus tells the story of King Darius of Persia, a (somewhat) enlightened ruler who was fascinated by the various customs of the different groups of people from the far-flung reaches of his empire who were part of his extended court. Darius noted, for instance, that two different groups of people—the Greeks and the Callatians (a tribe of people from what is now the Indian peninsula)—had strikingly different methods of dealing with the body of a person who died in their community. The Greek practice when someone died was to burn the dead body, while the Callatian practice was to eat the dead body.

Intrigued, Darius first asked representatives of the Greek community what he would have to pay or give them, what honors he would have to bestow on them, so that the next time someone died in their community they would eat the dead body instead of burning it, as was their custom. Appalled, the Greek representatives told Darius that no amount of riches or honors could possibly convince them to do such a horrible and immoral thing. Darius also asked a similar question of the Callatians—could I convince you to burn the next dead body you have to deal with in your community rather than eating it, as is your custom? Hell no! the Callatians said, insisting that nothing could convince them to do such a disgusting and immoral thing. Herodotus’s conclusion? “Custom is king.” What a person or group of people considers to be “right” or “moral” is what they are accustomed to, the practices of their family, their community, or their culture that they have been taught since their youth. Human nature causes us not only to embrace what we are most familiar with as morally right, but also to assume that it is right for everyone.

If “custom is king” and moral values are culturally defined, then the most important attitude to cultivate, the habit most likely to put up a firewall against unwarranted projection of one’s parochial practices and values on others, is undoubtedly tolerance. As Herodotus’s story is intended to illustrate, the best answer to the question “Who is right about the best way to dispose of a dead body—the Greeks or the Callatians?” is “Both, within the parameters of their culture.” Furthermore, there is no way to step outside one’s own culturally defined moral stance and be “objective.” There is no such objective standpoint. The only proper response to differences between groups, or perhaps even between individuals, is tolerance—the habit of accepting differences without judgment.

The problem, as a student quickly pointed out in each section of my ethics course, is that tolerance as an exclusive or primary virtue is not sufficient to account for many of our strongest moral intuitions. What if, for instance, the difference is about something more serious than the difference between eating or burning a dead body? What if the difference is between a culture that practices female circumcision and our culture that does not? Is tolerance appropriate in this instance? Are we to say “wow, I’m glad I don’t live in that culture, but for them that practice is morally right”? If our intuitions say that some practices cannot be tolerated, no matter what cultures adopt them, is this because our intuitions have been shaped by our own culture or because our intuitions are resonating with a moral absolute that transcends cultural differences?

Of such questions a great General Ethics class is made. But it appears that if we raise tolerance to primary virtue status, we at the same time take any commitment to moral principles that transcend cultural differences off the table. And that may not be a price worth paying. As I told my students the other day, a moral theory that does not account for our strongest moral intuitions is like trying to cover a queen-size mattress with a twin-size fitted sheet. It covers some of what needs to be covered, but not all of it. I, for one, am not ready to tolerate a theory like that.

Disturbing the Peace

Last Friday I attended a talk on campus by civil rights lawyer and law professor Greg Lukianoff on free speech, trigger warnings, and a host of related matters on college and university campuses that are regularly in the news. He is the co-author of an article in The Atlantic a bit over a year ago that raised a lot of eyebrows and generated a lot of conversation. I wrote about it in the early weeks of my sabbatical last August:

I do not know how to teach philosophy without becoming a disturber of the peace. —Baruch Spinoza

One of the lead articles in the most recent edition of The Atlantic magazine is “The Coddling of the American Mind.”

Lukianoff and Haidt: The Coddling of the American Mind

Co-authored by Greg Lukianoff and Jonathan Haidt, the teaser blurb for the article in the Table of Contents says “How a new strain of political correctness on campus is damaging higher education—and may be threatening students’ mental health.” It is an interesting read. Given Donald Trump’s current more-than-fifteen-minutes of fame, concerns about political correctness are in the news, but in this article Lukianoff and Haidt are drawing our attention to what might be called “political correctness with a twist”:

The current movement is largely about emotional well-being. . . . It presumes an extraordinary fragility of the collegiate psyche, and therefore elevates the goal of protecting students from psychological harm. The ultimate aim, it seems, is to turn campuses into “safe spaces” where young adults are shielded from words and ideas that make some uncomfortable. And more than the last, this movement seeks to punish anyone who interferes with that aim, even accidentally.

The authors’ argument is largely anecdotal, relying either on their own experiences or on recent stories and essays from various campuses across the country. There is a great deal of speculation about the causes of this perceived seismic psychological shift among students over the past couple of decades, although virtually no data is provided to substantiate many of the authors’ claims.

In the first column of the article readers are introduced to two important terms that “have risen quickly from obscurity into common campus parlance”: microaggressions and trigger warnings. Microaggressions “are small actions or word choices that seem on their face to have no malicious intent but that are thought of as a kind of violence nonetheless.” Examples provided include asking an Asian American or Latino American “Where were you born?,” because this implies that she or he is not a real American. Trigger warnings are “alerts that professors are expected to issue if something in a course might cause a strong emotional response”; examples of texts deemed as needing trigger warnings on various campuses include Virginia Woolf’s Mrs. Dalloway (suicidal inclinations) and Ovid’s Metamorphoses (sexual assault). The many examples of these and related problems in the article are chosen and presented with the clear intention of “triggering” the reader into concluding “well that’s just stupid—political correctness, like a hydra, rears a new ugly head.” One of the authors’ primary concerns, repeated frequently throughout the article, is that such attention to words and actions that might possibly somewhere, somehow offend someone will leave students unprepared to live and work in a world that doesn’t give a crap about what makes them feel uncomfortable.

What are we doing to our students if we encourage them to develop extra-thin skin in the years just before they leave the cocoon of adult protection and enter the workforce? Would they not be better prepared to flourish if we taught them to question their own emotional reactions, and to give people the benefit of a doubt?

Even though I have twenty-five years of college teaching under my belt, my experience on college campuses is deep but narrow, given that I have taught at my current college home for twenty-one years and have shaped my teaching and professional life within the confines of its “105 acre, park-like campus.” Serious conversations about the negative power of language on students in various groups defined racially, economically, by gender or by sexual preference have been ongoing on my campus for some time now. In my own philosophy department regular, continuing, and often heated debates occur about what constitutes appropriate and inappropriate language in the classroom, in job candidate interviews, and in basic conversation with each other. What strikes some as obviously benign, scholarly, and insightful strikes others as ill-advised, insensitive, and downright offensive. That said, the tsunami described by Lukianoff and Haidt as drowning campuses nationwide has escaped my notice where I teach—at least in my classrooms. Perhaps this is because I have included this general “trigger warning” in every syllabus for every one of my courses for at least the past fifteen years:

In this course we will be considering some of the most important questions a human being can ask. Perhaps the most important feature of our considerations is learning to ask these questions clearly and precisely. Only then can possible answers be considered fairly. Although I have definite positions on the questions we will be addressing, my role as professor is not to tell you what to think. My role is rather to get you to think. Expect your assumptions to be challenged and comfortable ways of thinking to be disturbed. As the great 17th-century philosopher Spinoza once said, I do not know how to teach philosophy without becoming a disturber of the peace.

During an oral final exam a couple of semesters ago a student told me that “This class really messed me up—but in a good way!” Mission accomplished.

The new fall semester is just three weeks old–here’s a bit of advice related to safe spaces and learning for the incoming freshmen:

  1. Free speech dictates that everyone has the right to their opinion, but not all opinions are equal. One of the purposes of a liberal education is to help you become skillful at using the tools of lifetime learning; some of these tools, used properly, will help you learn how to distinguish a good argument from bullshit—even when it is your own argument. I often say that a liberally educated person earns the right to have an opinion. The process of earning that right begins with realizing that your opinion is not special just because it is yours, and without challenge and analysis it means nothing with regard to whether it is true (or even a defensible position).
  2. In the life of learning, comfort is vastly overrated. Expect to encounter people, ideas, situations and expectations that are both unfamiliar and well outside your comfort zone. You should be looking for these rather than trying to avoid them. If you manage to make it through your undergraduate college career without changing any opinion, belief, perspective or attitude, then your tuition dollars have been wasted.
  3. The world of adulthood into which you are making your first, tentative forays can be a tough, nasty place. The world out there is full of people, ideas, things, and events that couldn’t care less if they lie within your current comfort zone. As my wife would say, the world is what it is. Your years in college are not so much about your landing a well-paying job after you graduate as they are about the construction of a powerful and flexible moral and psychological framework of belief and commitment, from within which you will engage with what’s “out there” on a daily basis. It is not the world’s responsibility to provide you with comfort and security. It is your task to create and maintain a moral and psychological home for yourself in that world using all of the resources available to you, resources to sustain you on a life-long journey. By the way, you’ll be making significant renovations and additions to this home your whole life. Your professors are here to assist you in the construction of that home—good luck!

A liberal education, especially, inspires students to value struggle. By grappling with authors and ideas that demand the greatest level of intellectual intensity—and this is especially true in subjects that are difficult and uncongenial—students learn that they stretch themselves more through struggle, whether or not they win the match. (Christopher Nelson)

Good Morning, Psalms

Last Thursday, in just our second class of the semester, I had the opportunity to introduce my ethics students to the master of all things ethical. The key to Aristotle’s understanding of the life of human flourishing is that such a life depends on the formation of the best habits—the virtues—to guide one’s life. Aristotle conceived of the life of freedom and moral excellence as a life constructed out of the virtues: good habits that, when cultivated, incline a person to do the right thing as a matter of developed character rather than conscious choice. Habits are established by repetition and, once formed, are often very difficult to change. Accordingly, one should take great care that one’s moral habits are the right ones (virtues) and not the wrong ones (vices), since the wrong habits, once entrenched, will be next to impossible to replace with better ones.

I have taught Aristotle’s ethics for many years and believe that although Alfred North Whitehead was probably correct when he said that all of philosophy is a series of footnotes to Plato, the best thinking about ethics begins with Aristotle. And his insights concerning the importance of habits are relevant beyond the ethical realm. I find myself in the best physical shape of my life now in my early sixties because several decades ago my grudging daily trips to the gym somehow turned into a habit that I no longer had to talk myself into. Reading psalms with Benedictine monks in Minnesota three times a day during my 2009 sabbatical established a habit of reading the three or four psalms appointed for each day in the Book of Common Prayer, a habit that continued for several years after my sabbatical ended. Between my 5:15 AM alarm and getting to the gym for its 6:00 opening, I read the day’s psalms aloud (or murmured them, lest I awaken the dogs and Jeanne). I am convinced that this simple habit helped transfer important changes in my life from sabbatical to real life, and also contributed to the preservation of my sanity as I juggled full-time teaching with the additional full-time duties of running a large academic program for four years.

But then I lost the habit, under the strangest of circumstances. My next sabbatical arrived, and with the prospect of unlimited time to rest, re-center, read, and write in front of me, somehow the daily regimen of early morning psalm reading fell by the wayside. I no longer needed to arise at 5:15, I rode my new bicycle obsessively instead of working out daily at the gym, I applied myself energetically to my sabbatical writing project, and somehow my simple ten to fifteen minutes alone with the psalms every morning didn’t make the cut. I made no conscious decision to end the habit—I just did. If Aristotle is correct in saying that well-established bad habits are very difficult to break, it turned out—in my case at least—that good habits can be broken very easily. I didn’t even consciously realize that my psalm reading habit had gone by the wayside until several weeks had passed; once I noticed its absence, I made a few half-hearted attempts over the following months to start again. But they didn’t take.

I returned to the classroom for the first time in fifteen months a week ago, and decided that along with a return to a 5:15 wake-up call, I would attempt to re-establish my psalm reading habit. With only a week under my belt, the returns are promising; coming back to the psalms has been like becoming reacquainted with very wise friends who have been away for a while. My renewed acquaintances include:

Monday, August 29: Psalm 139

The opening psalm on the list for my first day back was one that, depending on my mood and what’s going on in my life, has been either very disturbing or deeply comforting.

O LORD, you have searched me and known me.

You know when I sit down and when I rise up;

you discern my thoughts from far away . . .

Where can I go from your spirit? Or where can I flee from your presence?

If I ascend to heaven, you are there; if I make my bed in Sheol, you are there.

If I take the wings of the morning and settle at the farthest limits of the sea,

even there your hand shall lead me, and your right hand shall hold me fast . . .

For it was you who created my being, knit me together in my mother’s womb.

Sometimes this Psalm reads like a description of a divine stalker, but more often the mere improbability that the creator of the universe cares about lil’ ole me is overwhelming. If I were inclined to be an atheist, or at least an agnostic, it would probably be because of this very point—the idea that God cares about human beings in any specific sense at all. Most of what we observe and experience screams against it. Our obvious insignificance screams against it.

Psalm 139 offers hope in the face of insignificance. Perhaps there is one place where I do not need to be an impostor or be overwhelmed by my insignificance, a place where I am known better than I know myself and am valued more highly than I could ever manufacture. The other day at convocation, New York Times Pulitzer Prize-winning journalist Nicholas Kristof told the hundreds of students and faculty in attendance that at those times when one feels insignificant, like a single drop of water in a very large bucket, a drop that can’t possibly make a difference, we should remember that buckets are filled one drop of water at a time.

Tuesday, August 30: Psalm 146

The final entries in the collection of 150 poems are praises of various sorts—noon prayers at St. John’s Abbey in Minnesota, the place where I first learned to inhabit these ancient poems, include one of the final five psalms in rotation. I always looked forward to Psalm 146, which for me summarizes what God—and therefore those who profess to follow God—cares about the most.

It is the Lord who keeps faith forever, who is just to those who are oppressed.

It is God who gives bread to the hungry, the Lord, who sets prisoners free,

the Lord, who gives sight to the blind, who raises up those who are bowed down,

the Lord, who protects the stranger and upholds the widow and orphan.

When John the Baptist sends some of his followers from his prison cell to ask Jesus whether Jesus is the Messiah, “the one who is to come, or should we look for another?” Jesus responds in the language of Psalm 146. Tell John that the blind see, the lame walk, the hungry are being fed, strangers are being welcomed, and those imprisoned are being set free. That’s how you can tell when the divine is in the house, when human beings are in tune with what is greater than themselves. Imagine how different our nation, our world, would be if the above lines were the defining touchstone for success.

Thursday, September 1: Psalm 1

The compilers of the Psalms chose to kick things off with a description of happy people, those who “delight in the law of the Lord.”

They are like a tree that is planted beside the flowing waters,

That yields its fruit in due season and whose leaves shall never fade;

and all that they do shall prosper.

I have always been fascinated with trees, but have come to love them in a deeper way over the past several years. Their stability, rootedness, and beauty have become iconic for me. I write about trees frequently in this blog: within the past few months I have written about Tolkien’s Ents, arboreal survival strategies, oaks of righteousness, and how the removal of a 150+ year old tree on campus this summer was traumatic for all involved. In an interview with Krista Tippett, theologian Ellen Davis said that “anything in our world now that slows us down is to be valued and maybe as a gift and even a calling from God.” The fact that the first analogy in the Psalms for the person who “meditates on God’s law day and night” is a tree silently proceeding through its seasons of fruitfulness and prosperity confirms Davis’ insight. I may not meditate on God’s law day and night, but fifteen minutes a day is doable.

You Are Free, Therefore Choose

Each of us can point to a time (or several times) in our past when we made a decision that, in retrospect, significantly shaped our lives going forward. Such decisions for me include getting married at one month past twenty, choosing to leave law school for a masters program in philosophy, and deciding to commit for life in my early thirties to a person whom I had known for six weeks. I could have chosen differently in each of these cases, and my life would be much different now than it is. But could I really have chosen otherwise? The answer of many “experts” from psychology, science, philosophy, and more is “no.”

I wrote about how the “experts” have gradually but inexorably come to this conclusion a week ago, describing how evidence from Darwin to neuroscience supports the conclusion that everything about me, including all of my choices, is fully determined by both biological and environmental causes beyond my control.

They Will Never Take Our Freedom

I undoubtedly, the experts admit, will continue to believe that some of my choices are free in the sense that I could have chosen otherwise, but that belief is based on an illusion. My choices may feel free, but they really aren’t. If true, the news that free will—the foundation of most of what we believe concerning morality, reward, punishment, praise, blame, and responsibility—is an illusion cannot be taken lightly. Nor, I would argue, need I believe, either as a philosopher or as a human being, that the “experts” are right about this. Free will is an illusion only if one accepts the starting assumptions that energize the argument against real human free will: that everything that exists is made of physical matter, that physical matter is governed by inexorable physical laws, and that we generally know what those laws are. These assumptions are so entrenched among the “experts” that challenging them is as uphill a battle as arguing that the earth is flat. But I’ll give it a shot.

I often tell my students that each person has, among her or his beliefs, a small handful of what Aristotle called “first principles.” Aristotle knew as much about cause and effect as anyone; indeed, he arguably invented our familiar system of logic, which is built on the belief that we live in a world governed by cause-and-effect relationships. These relationships shape how our beliefs hang together as well. Consider the following conversation:

Me: I believe A is true.

You: Why?

Me: Because A depends on B, and I believe B is true.

You: Why?

Me: Because B depends on C, and I believe C is true.

You: Why?

There’s a pattern here. We all seek to support our beliefs by referring to connected and deeper beliefs on which they depend. There’s also a problem here, though. The chain of cause and effect has to end somewhere if we are to avoid the dreaded “infinite regress.” So eventually we get this:

Me: I believe that X is true because X depends on Z, and I believe Z is true.

You: Why? (you’re getting really annoying, by the way)

Me: Because I do.

In Aristotle’s terminology, I have just identified “Z” as one of my first principles. In order to avoid an infinite regress, we eventually arrive at a belief for which we seek no further justification than that we believe it. Such first principles vary from person to person–some common ones include “Human life is intrinsically valuable,” “Human beings are fundamentally equal,” and “Do unto others as you would have them do unto you.” For many, including myself, “Human beings have the capacity to choose freely, to make choices that are not entirely determined by matters outside their control” is another first principle–one which, if true, stands in direct opposition to what the “experts” claim the truth to be. And like it or not, no one wants to hear that a first principle is an illusion.

When I choose freely, I deliberate between available options, weigh the evidence for and against each, and choose the option that best satisfies my operational criteria. I cause the choice, in other words, influenced but not determined by any number of factors. This simple idea—that a human being can choose without the choice being fully determined—violates assumptions so prevalent among the “experts” that it is tantamount to heresy. And to be sure, this simple idea is indeed a radical one, for it claims that the freely choosing human being is an exception to the inexorable laws of matter, capable of starting her or his own chain of causation, a chain that would not have happened without the choice that started it. There are few beings in the annals of philosophy with this power. Aristotle called his creating force—what we might call “God”—the “unmoved mover” or “the first cause uncaused,” the place where the chain of causation begins (or ends, if one is going backwards). In short, human beings act with the causal power of the divine itself when we make free choices. If one wants to go religious, it is this very creative power of free choice that the sacred texts refer to when they claim that human beings are “created in the image of God.”

The position that truly free choices step meaningfully outside the laws of nature has been called “metaphysically peculiar” by some philosophers, and simply “bullshit” by others. Free will deniers assume that any human capacity that purportedly steps outside the laws of physical matter must be an illusion, since we all know that everything is made of matter and that matter is governed by deterministic laws. To which I respond, as I often do, in the words of Shakespeare’s Hamlet: There are more things in heaven and earth . . . than are dreamt of in your philosophy. Presuming that everything in heaven and earth can be reduced to the confines of our current understanding of reality is hubris of breathtaking proportions. When a fundamental and definitive human ability is defined out of existence because of narrow assumptions, I choose to question the assumptions rather than the reality of the ability. When the answers to a question do not square with our strongest intuitions and beliefs about ourselves, change the question.

None of the above is very philosophical of me, at least not in the contemporary sense. I freely admit that human free choice might be an illusion, but I see no reason to believe so on the basis of an argument with questionable assumptions. I choose rather to embrace the mystery and miracle of being human and to believe, until better contrary evidence is provided, in keeping with the intuitions of billions of human beings, that at least some human choices make a difference—such a great difference that they make the world a different place than it would otherwise have been. And human beings are not just vehicles of that change—their choices cause it. Maybe we just don’t know enough about reality to rule out abilities that don’t square with our current understanding of things. Maybe human beings are truly the crowning glory of creation, endowed with a spark of the divine that reveals itself in our most basic capacities. Maybe all of the above. Take your pick. As Jean-Paul Sartre used to say, “You are free, therefore choose.”

The Joyful Owl

Just about seven years ago, on a beautiful summer morning very similar to the ones we are experiencing in Providence these days, I was just finishing a post-morning-prayer walk around beautiful Lake Sagatagan behind St. John’s Abbey on the campus of St. John’s University in Collegeville, Minnesota. I had been in Collegeville for the first four-and-a-half months of 2009 on sabbatical and was now back for a week of writing and relaxation while Jeanne participated in a workshop at the nearby Episcopal House of Prayer. The point of destination when walking the perimeter of the lake is Stella Maris Chapel on the shore opposite the Abbey, a lovely little chapel that contains an exquisitely unique statue of a pregnant Mary. St. John’s is situated on a national wildlife preserve; I learned during my months in residence never to walk the trails without a camera. On this particular morning, I noticed a dark shape in one of the massive trees just off the trail to the right. I stared at it for what seemed like several minutes. Just as I concluded that it must be a large abandoned nest or simply the remains of a long-ago fallen branch, the top third of the shape turned slowly 180 degrees and looked directly at me. It was an owl.

I have noted occasionally in this blog that I am obsessed with penguins, to the extent that I once dedicated a post exclusively to penguins.

Well-Dressed Birds

But I also love owls. They’re not quite as cool as penguins, but they come in a very close second. If penguins did not exist (a world I do not care to consider possible), my office would be full of owl paraphernalia instead of penguin stuff. And I could make a better case for an owl obsession than I can for penguins. Owls are iconic symbols of wisdom, something everyone wants (I think). Accordingly, philosophers should like owls. As a matter of fact, The Owl of Minerva is generally considered to be the best philosophy journal in the English-speaking world dedicated to the great nineteenth-century German philosopher Georg Wilhelm Friedrich Hegel, just in case you’re interested. The title of the journal is a reference to the owl being the favored bird of Athena, the Greek goddess of wisdom (Minerva is her Roman name)—who just happens to be my favorite resident of Mount Olympus. You have to take notice of someone who sprang fully grown, clothed in battle armor, directly from her father’s skull and immediately started giving him advice. So I chalked up my owl sighting as yet another gratuitous favor sent to me from the divine as confirmation that this place in middle-of-nowhere Minnesota is indeed a spiritual home. It could only have been topped by some penguins waddling down the path in my direction.

I don’t recall that owls were a favorite of mine as a child. My attraction to owls was most likely triggered during my first couple of years of teaching after graduate school. I was on the faculty at a small Catholic college in Memphis where they basically needed someone to teach business ethics to their business and engineering students. So I did—five sections per semester for three years. I always included a unit on environmental issues, and during the early 90s this invariably meant spotted owls. Native to the Pacific Northwest, the endangered spotted owls were very picky about where they nested and lived—which just happened to be in the middle of prime timber forest. Every time well-meaning people relocated the owls, they immediately moved back to their original section of the forest, the very section that various constituencies wanted to cut down and turn into useful items that people will pay money for. So the debate raged—tree huggers arguing that the forest must remain untouched so the spotted owls could live where they chose, and good capitalists crying foul over the idea that a stupid, useless bird that no one ever saw (because they only came out at night when everyone was asleep) could actually hold up progress and money-making. My students had many fine, spirited debates—so many that at the end of one semester they presented me with a stuffed spotted owl toy that, twenty years later, still perches proudly on top of one of my office bookcases.

Imagine my delight, upon taking the “What Animal Were You in a Past Life” quiz that popped up one day on Facebook, to find out that

What Animal Were You in a Past Life?

You were the Owl. Graceful, quiet, and majestic, you glide silently through the night. You are self-sufficient, independent, and make the most of everything around you. You are not very picky about what you like, and when you love something, it will be forever. You would make a wonderful parent, but in no way would you spoil your children; they would be taught how to look after themselves. You are a symbol of guidance.

It’s very interesting how these descriptions put a positive spin on features that aren’t all that attractive. For instance, “You are not very picky about what you like” is a reference to the fact that owls are birds of prey and will basically eat anything they find. The positive qualities listed are ones that I certainly aspire to, and I can almost remember “glid[ing] silently through the night” in my past life as an owl.

Owls are not funny. Here’s a typical example of owl humor:

An owl and a field mouse walk into a bar. The owl turns to the field mouse, but doesn’t say anything because owls can’t talk. Then the owl eats the field mouse, because owls are predatory birds.

Owls are serious predators of the night, wise and stealthy as they swoop about taking care of their nocturnal business. Nothing humorous there.

So I was confused a few days later when I took the “What is Your Spiritual Power?” quiz (I really do need to get a life) and was told that

What is Your Spirit Power?

You got Joy. You are the most joyful spirit around. The happiness within you never stops flowing. You’ve never kept it all for yourself either, you’ve always made others happy when they needed it most.

Joy? Really? This will come as a surprise to Jeanne. She’s the one who has Pharrell Williams’ song “Happy” as the ringtone on her phone.

I would be more likely to have the tune to “Leave me the fuck alone” on my phone. What would a happy, joyful owl be like? In the world of Photoshopping, all sorts of possibilities are available. But in the real world, owl joy is hard to detect. Take my word for it—it’s in there. As soon as I find it, I’ll let it out.

I AM smiling.


First Day of Classes

Today is the first day of the semester at my college. I have been anticipating it even more than I usually do because I have been on sabbatical and have not been in the classroom since May 2015—my longest stretch away from school since the mid-1980s. Everyone has stories, many of them of the horror variety, about the first day of school—mine were full of cognitive and emotional dissonance from my earliest years. On the one hand, I couldn’t wait for school to start—I’ve loved just about everything about the life of learning ever since I can remember. On the other hand, there was all that non-learning stuff that I was not good at. Meeting new people. Sports. Leaving my mother and my dogs at home every morning. Riding the bus for a half hour each way. Come to think of it, I probably would have loved home schooling. I doubt my parents would have.

For a college professor, the first day of the semester requires answering an eternal question: what to do with students whom you have never met, who are coming off either summer or Christmas vacation, who probably just met their roommate and hate her or him, who might have spent the previous night getting in a final party before attempting to get serious about their education for at least a week or so, and who are assuming that all we are going to do during our first class meeting is go over the syllabus and leave early? I have some professor colleagues who, assuming that all of the above is true, descend to their new students’ expectations and dismiss class after a brief overview of the syllabus and the upcoming semester. I am not one of those professors. At the beginning of the first day of school, I give a reading quiz.

This, of course, means that my students have a reading assignment that they are required to complete for our first meeting of the semester. This is easy enough to arrange—I have the class roster available several weeks before the semester begins, so a simple email, followed by at least a dozen reminders in the following days before the semester begins, provides my students-to-be with the course syllabus, other pertinent data, and the first day’s reading assignment (with the promise of a quiz). I often notice a drop of five or six students in my class roster in the day or two after sending this email, which is a good thing (see below). I usually receive a bit of email pushback from my students-to-be (“My other classes aren’t requiring a reading or having a quiz on the first day”), so why do I do this? For a number of reasons.

  • One advantage, as noted earlier, is roster control. Unfortunately, because my classes tend to fill up quickly and are often of the core curriculum variety that students have to take, every student who leaves my roster for a more palatable first day experience is immediately replaced by someone else (who doesn’t necessarily know about the first day reading and quiz—hence the multiple email reminders).
  • The amount of classroom time I have in a given semester with my students is limited. This semester we meet for seventy-five minutes twice per week for a total of twenty-seven meetings. Throwing away the first of those meetings sends the message that either our time together is unlimited or that it isn’t important enough to squeeze value out of every scheduled moment. Neither of these is true.
  • Many of my courses (half of them this semester) are filled exclusively with freshmen. Nothing says “this isn’t high school or Kansas anymore” to a first-week college student like hitting the ground running—hard.

I’m happy to report that in the interdisciplinary course I teach in every fall, my teammates have been happy to join me in assigning significant work for the first class. Two years ago, my colleague and I assigned the entire Iliad as freshman summer reading, building on it through the first two weeks of the semester. This year’s interdisciplinary course freshmen met for the first time today—my colleagues and I assigned fifty pages or so and gave a quiz. Everyone survived. It’s amazing what you can get freshmen to do without complaining.

Other than making the first day of school a difficult one for my students, what really am I up to as I return to the classroom this week? I reflected on this just as my sabbatical began sixteen months ago; as I get back into the teaching saddle, I return to what I was thinking then. As a professor, I am a facilitator of lifetime learning, a person who points students in fruitful directions, helping them identify, and become skillful in using, the tools they need to construct their own moral frameworks intelligently. The liberally educated lifetime learner is equipped both to seek and to create meaning throughout her life. I take pride in playing a part in this process. I have thought a lot over the past twenty-five years about the day-to-day dynamic between professor and student; I continually return to the difference between an idol and an icon.

Idols and Icons

The point of a professor is to be Virgil to the student’s Dante: guiding the educational journey, relying on knowledge and experience to point out the pitfalls and attractions of a journey that each person must make individually. The professor helps the student learn to identify what is important and what is not in the perpetual sifting process of education. The professor is not the main attraction at any point in this process. The professor is an icon—something to look through or past, in other words—rather than an idol—something to look at. There is a reason that the Second Commandment is a prohibition against idolatry. Human beings are inveterate idolaters, more than happy to pattern themselves after someone or something else rather than take on responsibility for themselves. For those interested in creatively addressing the undoubtedly real shift in higher education toward preparation for a good job and financial success, a shift that has been going on for a while now, I highly recommend iconography.


They Will Never Take Our Freedom

Although I read incessantly, I don’t read a lot of magazines. The only magazine I currently subscribe to is The Atlantic—I appreciate the excellent writing and quirky features, but don’t exactly wait by the mailbox for each monthly edition to show up. Instead, they tend to pile up on the little table next to my side of the bed, waiting to be perused when I am between authors in my novel reading. I’m currently in one of those spaces, having just finished my fourth consecutive Arturo Pérez-Reverte mystery a few days ago and not ready to start a new, large reading project just a week before the semester starts. Accordingly, I started plowing through the three summer editions of The Atlantic that have accumulated on my nightstand since June. Inside the June edition, whose cover includes two-thirds of Donald Trump’s head peeking in from the right side announcing a lead article entitled “The Mind of Donald Trump” (an oxymoron if I ever saw one), I found this: “There’s No Such Thing as Free Will—Here’s why we all may be better off believing in it anyway.”

Stephen Cave: There’s No Such Thing As Free Will

The article is by Stephen Cave, a philosopher who runs the Center for the Future of Intelligence at the University of Cambridge. His article is well-written and engaging—so much so that I suspect he may have had help with it. Trust me, I know whereof I speak. I have spent over twenty-five years learning to write in ways that make core philosophical issues accessible and interesting to non-philosophers—it ain’t easy. First, it’s important to clarify what philosophers usually are referring to when they use terms like “free will” or “freedom.” Just before the final battle in his 1995 epic “Braveheart,” Mel Gibson’s William Wallace screams to the Scottish army that They may take our lives, but they’ll never take our freedom!!

That sort of freedom, the kind enshrined in this country’s founding documents as “rights” that each citizen possesses and that must not be violated or taken away, is not what philosophers mean by freedom.

Instead, “free will” refers to the human ability to choose: to deliberate between options, eventually select one, and act on it, all the while knowing that she or he did not have to choose that option—in other words, she or he could have chosen otherwise. This vaunted ability to choose freely is, for many (including me), the fundamental and defining feature of what it means to be human. Stephen Cave points out that our legal systems, as well as our general beliefs concerning praise, blame, reward, punishment, and all things moral, depend on our basic belief in human free will. And it is under attack—scientists, psychologists, philosophers, and just about everyone “in the know” have been trying to take it away for decades.

The “free will issue” is a go-to problem in all philosophy courses, the philosophical version of the divine foreknowledge/free will problem in theology. Just as it is impossible to make room for free choice in a world governed by an omniscient deity, so in a world where everything that occurs is governed in a cause-and-effect manner by the physical laws of matter, there is no room for true human free will. Cave points out that at least since Darwin argued in The Origin of Species that everything about human beings—including our vaunted reasoning abilities, where the ability to choose is located—is the result of natural evolutionary processes rather than a mystical, magical, or divine “spark” that lies outside the physical laws of matter, science has reinforced the conclusion that whatever human consciousness and deliberate choice are, they are to be placed squarely in the material world—making it impossible, of course, to carve out the special place we desire for choice. Our choices may “feel” free, “as if” they are up to us, but Cave pulls no punches in describing the truth about us:

The contemporary scientific image of human behavior is one of neurons firing, causing other neurons to fire, causing our thoughts and deeds, in an unbroken chain that stretches back to our birth and beyond. In principle, we are therefore completely predictable. If we could understand any individual’s brain architecture and chemistry well enough, we could, in theory, predict that individual’s response to any given stimulus with 100 percent accuracy.

Experiments by psychologists and neuroscientists have shown that the brain’s neurons fire in new patterns, causing a specific action, before a person consciously “chooses” to act—indicating that my conscious “choice” is an illusion that doesn’t actually cause anything. Debates rage concerning how much of a human’s actions are caused by “nature”—one’s hardwiring—and how much by “nurture”—one’s environment—but there is general agreement that none of them are caused by conscious choice. We are determined through and through.

The ensuing discussion is often amusingly similar to conversations that couples considering a divorce might have: Should we tell the children, and if so, when? In the service of all truth all the time, some argue that non-philosophers and non-scientists should be made aware that free choice is an illusion and should stop believing in it. Others insist that such a revelation would be damaging to the basic human commitment to morality, law, reward, punishment, and all of the other cool things that rely on our apparently mistaken belief that our choices make a difference and that we are responsible for them. My own classroom experiences indicate that it doesn’t matter. I regularly use a very simple thought experiment with my students at the beginning of the “free will” unit on the syllabus:

Suppose that in the near future a super-duper computer can read your brain and physiology sufficiently to predict the rest of your life, from large events to the minutest second-to-second thoughts and feelings, from now until you die. For a nominal fee you can purchase a printout of every event, thought, and feeling that you will experience for the rest of your life. Some printouts will be yards in length, while others will be very short. Do you want to see yours?

In a typical class of twenty-five students, no more than one or two will say that they want to see it. Why? Because even with direct proof available that the rest of my history is determined down to the minutest level—including my “free” choices—I prefer to believe that my choices make a difference in my life and in the world around me. I prefer to embrace the illusion. It appears, in other words, that human beings are determined to believe that they are not fully determined.

On this particular issue I find myself swimming against the tide. I not only believe that human beings have the ability, at least on occasion, to make choices that are not entirely determined by their biology, history, and environment—I also believe that this ability is not an illusion. It’s real. The free will/determinism issue as contemporary philosophy defines it has its current shape because virtually everyone accepts a starting assumption—everything that exists is material stuff subject to inflexible physical laws. Given that assumption, the claim that human beings have the capacity to jump outside the limitations of matter and make choices that avoid the determinism of cause and effect makes no sense. But as I often tell my students, if the answers one is getting are unacceptable, change the question. If the ability to freely choose is fundamental to what a human being is, and if our current assumptions about how reality is constructed make no room for that ability, then perhaps instead of accepting that choice is an illusion we should challenge the assumptions that forced us to this acceptance. Be watching for “What Freedom Amounts To” next week, where I’ll describe a very different way to think about human choice!

They’re Baaack . . . .

We are in the business of believing in, and promoting, things that don’t yet exist. (Julie Schumacher, Dear Committee Members)

After the most beautiful June in my Rhode Island memory, July was warm and August has been abnormally hot. I hate heat—I will take zero over ninety-five degrees any day of the week. But August is one of my favorite months because I am a college professor. August is very quiet on campus—no classes, few hosted events, few visitors other than prospective students and their parents taking tours. I can work out at the gym without competing for equipment and enjoy observing the various construction sites on campus without dealing with tons of people. It’s all a wonderful period of solitude; but just as Louis XV reportedly commented in anticipation of the Revolution that would cost his grandson Louis XVI his head, August says to the academic “après moi, le déluge.” Before long, the floodgates will open. They’re baaack . . .!

Actually, this is great news. I can’t wait until next Monday when classes start—after a year-long sabbatical, I’m more than ready to be back in the classroom. I’m not one of those professors who regularly moan and complain about their students; they are the reason I am in the profession to begin with, they keep me young (at heart if not in outward appearance), and let’s be practical: for an academic, a world sans students would be a world sans paycheck. I got a fictional look at the dark side of academic attitudes about students, fellow faculty, administrators, and reality in general when reading Julie Schumacher’s Dear Committee Members not long ago. Billed by Amazon as “A novel that puts the pissed back in epistolary,” it follows Jason Fitger, Professor of Creative Writing at Payne University, from the beginning to the end of an academic year through the exclusive lens of letters and emails of reference and support written for current and former students, colleagues, and acquaintances from graduate school days. Billed by reviewers as “funny as hell,” “hilarious,” “fun-as-heck,” and “funny and lacerating,” I must confess that although I smiled occasionally, I found the novel more sad than anything else. Sad because I know that the never-ending bureaucratic and pedagogical challenges of the academic life can turn someone into a jaded, sarcastic, and cynical curmudgeon like Jason Fitger (he’s four years younger than I am), and even sadder because it doesn’t have to be that way.

Fitger is the graduate product of what he calls the “Seminar,” a graduate writing program that sounds a great deal like the Iowa Writers’ Workshop (which produced a few of my friends and colleagues—and none of them are like Fitger). After a reasonably successful first novel, followed by a couple more that fell still-born from the press, he finds himself in the final decade or so of a mediocre career at a mediocre public university in an English department whose infrastructure, both psychological and physical, is falling apart. As faculty lines are closed down and the plumbing in the men’s bathroom fails, the Economics Department on the floor directly above is being treated to a complete upgrade of facilities. Fitger is the embodiment of what is actually happening across the country in more universities and colleges than I care to consider—the neglect and downsizing of the humanities while departments and programs perceived as job-producers and money-makers receive the lion’s share of funding and attention.

I am extraordinarily fortunate and privileged to have spent the last twenty-two years (and hopefully the next fifteen or twenty) at an institution that consciously attempts to swim against that tide. Accordingly, Fitger is for me the fictional embodiment of what could have happened had I not been as fortunate. I have few cynical or negative bones in my body, but some might have been created had I lived the professional life of Jason Fitger. Of course, many of the most problematic aspects of Fitger’s life are self-created. He is hated across campus for various justifiable reasons, his marriage to a fellow professor on campus fell apart when she became aware of his continuing infidelities with an administrative assistant, and his affair with the administrative assistant ends when he accidentally hits the “Send All” button on an email intended for one individual, in which he expresses his continuing sexual attraction to his ex-wife, still strong even after five years of divorce.

I don’t know anyone on campus like Jason Fitger (although he might be lurking somewhere). But hidden like buried treasure underneath page upon page of sarcasm and nastiness are occasional and brief homages to the academic life that I was surprised and pleased to find. In an email to a former colleague from the Seminar, an epistle drenched in anger, regret, and bitterness, Fitger steps back for a moment.

The stately academic career featuring black-robed professors striding confidently across the campus square is already fading; and, though I’ve often railed against its eccentricities, I want to proclaim here that I believe our mission and our way of life to have been admirable and lovely, steeped with purpose and worth defending.

Amen to that. I only get to wear black robes a couple of times a year—although it would be sort of cool to wear them all of the time—and I agree that the eccentricities of the academic process and of academics themselves can be a pain in the ass. But what a wonderful profession. It’s the best thing going, not because of money, fame, or notoriety (which come to only a minuscule percentage), but because of the privilege of making a living in the midst of the most exciting environment imaginable—the life of continual learning. As I noted in my remarks at the dedication of our new humanities building almost three years ago, the Apostle Paul’s words ring true at this time of year for every academic: Old things are passed away; behold, all things are become new. And Jason Fitger knows it, in spite of himself:

There is nothing more promising or hopeful than the start of the academic cycle: another chance for self-improvement, for putting into practice what one has learned—or failed to learn—during the previous year.

They’re baaack . . . and I can’t wait. Bring it on.