
The Right Niyyah

As I wait impatiently for my sabbatical book manuscript, under contract with a publisher, to return from the editor, I’ve been thinking about some of my blog essays that “made the cut” in some sense to appear in revised form in my book-to-be. One of these essays is about the challenge of cultivating the right attitude with which to enter the world on a daily basis. I learned a lot about this from Rami Nashashibi when he was interviewed a year or so ago on Krista Tippett’s “On Being.”

Regular readers of this blog know that I am a fan of Krista Tippett’s radio program “On Being,” a show that I frequently catch several minutes of on Sunday mornings as I drive the fifteen minutes from our house to the early show at church. A few weeks ago, her guest was Rami Nashashibi, Founder and Executive Director of the Inner-City Muslim Action Network, or IMAN, in Chicago. He’s also a Visiting Assistant Professor in Sociology of Religion and Muslim Studies at Chicago Theological Seminary.

On Being: A New Coming Together

Tippett describes Nashashibi at the beginning of the interview as using

Graffiti, calligraphy, and hip-hop in his work as a healing force on the South Side of Chicago. A Palestinian-American, he started his activism with at-risk urban Muslim families, especially youth, while he was still a college student. Now he’s the leader of a globally-emulated project converging religious virtues, the arts, and social action. And he is a fascinating face of a Muslim-American dream flourishing against the odds in post-9/11 America.

Not surprisingly, the conversation was wide-ranging, fascinating, and introduced me to a number of matters and issues that are well outside of my usual frame of reference. What particularly grabbed me, however, was a brief exchange toward the end of the interview, just as I was pulling into my usual parking spot at Trinity Episcopal.

Krista Tippett: I told you I was on your Twitter feed, and I love this. I think it was really recent. You wrote: “My 4-year-old discovers the spiritual power of her name as she looks over and seriously asks, ‘Daddy, do you have the right niyyah?’” What does niyyah mean?

Rami Nashashibi: So niyyah — in kind of Arabic-Muslim parlance — is spiritual intention. And oftentimes — it’s both the Swahili and Arabic word. And oftentimes, Muslims are always asked before they pray, before they do any act of service, before they engage in anything that has any kind of sense of worship associated with it, is it being done for the right niyyah? Is it being done for the right purpose? Are you attempting to get fame or credit? I think, yes, there was a song that had used her name in that way and the light went off in the middle of it, and she turned over to me on the couch and asked me that question. Honestly, I looked at her and I didn’t have an answer for her for I think a good 20 seconds. She nodded her head and she said, “No, probably not.”

And I said, “What?” We then had a conversation. I said, “Well, at least keep me in your prayers that I have the right niyyah.”

This four-year-old’s simple question—Do you have the right niyyah?—has stuck with me ever since. So has her response to her father’s lack of response—“No, probably not.” It’s hard enough to figure out what the right thing to do is on a daily basis; adding in that it should be done with the right intention, for the right reasons, seems like piling on. As a philosophy professor who has taught introductory ethics courses more times than I care to count over the past twenty-five years, I have thought about this a lot. When I ask my students “What is more important—what you do, or why you do it? Actions or intentions?” they usually split roughly down the middle.

And so do the great moral philosophers. There is the tradition of those who say that only results matter (since they can be observed and measured publicly) and intentions are irrelevant. Then there is the other tradition (spearheaded by Immanuel Kant) which says that results are irrelevant—the true measure of the moral life is internal. Were your intentions pure? Was your heart in the right place? If so, then you are morally in the clear, even if the results of your intended action go “tits up” (to quote my brother-in-law).

My students are pretty smart, and it doesn’t take very long before they realize that the “results or intentions” question is a false dichotomy. Because in truth, normal human beings care about both. If morality is just about doing the right thing, then the person who identifies the things that should be done and does them—even if for all of the wrong reasons, such as self-righteous smugness or the praise of others—is morally in the clear. But Nashashibi’s four-year-old daughter is right—we want not only the right thing to be done, but for it to be done with the right niyyah, the right intention or reason. And that sucks, because it takes things straight into the human heart. For those who profess the Christian faith, it also takes things straight into the world of grace.

The first thing I ever learned from Scripture about the human heart as a young boy was from Jeremiah: “The heart is deceitful above all things and desperately wicked—who can know it?” Far less attention was paid to the Psalm that is recited in liturgical churches during the Ash Wednesday liturgy: “Create in me a clean heart, and renew a right spirit within me. Cast me not away from your presence, O Lord, and take not your Holy Spirit from me. Restore unto me the joy of your salvation, and renew a right spirit within me.” Straight from the Jewish scriptures come both the problem of and the solution for right intentions. As a flawed human being, I am incapable of doing things for the right reason, but help is available. Through divine grace the heart is changed and turned toward the good. Rami Nashashibi’s daughter is right when she doubts that her dad has the right niyyah, so long as that depends on his own energies and strength. But when the divine gets involved, everything changes.

The mystery of grace is exactly that—a mystery. Divine grace enters the world through flawed human beings, strangely enough, and there isn’t enough time to try to figure it out. Grace is something to be channeled, to be lived, not systematized and turned into dogma or doctrine. The poet Christian Wiman writes beautifully about this in My Bright Abyss. Through many years of cancer treatments, he learned to hear God, then to channel God, in the most unlikely places, the very places where divine grace apparently lives. Wiman writes that

God speaks to us by speaking through us, and any meaning we arrive at in this life is composed of the irreducible details of the life that is around us at any moment. . . . All too often the task to which we are called is simply to show a kindness to the irritating person in the cubicle next to us, say, or to touch the face of a spouse from whom we ourselves have been long absent, letting grace wake love from our intense, self-enclosed sleep.

The right niyyah is not the result of struggle, training, or calculation. And as the author of Deuteronomy tells us,

Nor is it beyond the sea, that you should say, “Who will cross the sea for us to get it for us and make us hear it, that we may observe it?” But the word is very near you, in your mouth and in your heart, that you may observe it.

All I have to do to have the right niyyah is to open my heart, open my mouth, and let it out.

To experience grace is one thing; to integrate it into your life is quite another. What I crave now is that integration, some speech that is true to the transcendent nature of grace yet adequate to the hard reality in which daily faith operates. —Christian Wiman

Tolerance on Steroids

What happens when a perfectly good virtue gets turned into not only the most important virtue, but in many cases the only virtue? I have come face to face with this question in the early weeks of this semester with fifty juniors and seniors in two ethics classes. This past summer I chose to organize my General Ethics course—usually a tour of several of the notable moral theories in the Western philosophical tradition (Aristotle, Kant, Mill, Sartre, etc.) that are then applied to the details of human experience—by starting instead with those messy details themselves. We find ourselves in a world of competing religious, moral, and political claims shouting at each other across various divides, claims that are both incompatible with each other and resistant to compromise. How, in the midst of diverse perspectives that too often lead to violence, are we to find a place of stability from which to plot the way forward?

I have discovered both from early class discussion and student writing reflections what I suspected—most of my young adult students have been taught for as long as they can remember that the “go-to” virtue that must be cultivated in order to wend one’s way through the minefield of incompatible beliefs and commitments is tolerance. It’s interesting that the granddaddy of virtue ethics, Aristotle, did not include tolerance in any of his lists of virtues—apparently such a virtue was not particularly useful in fourth-century BC Athens. Tolerance is also rejected by many contemporary people as a sign of weakness, of lacking commitment to one’s beliefs, and of a willingness to compromise too quickly. But for many in our culture, particularly those who might consider themselves “liberal” in some sense, tolerance is the proposed remedy for many of the things that ail us.

Don’t get me wrong—I have no problem with tolerance as a virtue. As a matter of fact, it probably plays as regular a role in my life on a daily basis as any virtue you could name. My concern about tolerance arises from intimate familiarity with how it often works in my own life. When I remove myself from an email list on campus because I’m sick to death of being inundated with what I consider to be the often petty concerns of my colleagues, it feels like tolerance. “Let them continue emailing about anything they want,” I think. “I just don’t want to be part of it.” When a Facebook conversation wanders into areas that I find either offensive or seriously misguided, my tendency is to withdraw from the conversation rather than insert my concerns. Tolerant, right? Not really.

I find in my own life, and I suspect I’m not unusual or unique in this, that “tolerance” is an umbrella term for “whatever.” “Different strokes for different folks.” “I disagree with you but I’ll defend to the death your right to say it” (although I almost certainly wouldn’t). In other words, one of the best safeguards against being judgmental and ethnocentric—a check against our natural human tendency to judge negatively those who believe, think, and act differently than we do, simply because they are believing, thinking, and acting differently than we do—turns into a placeholder for laziness and a reluctance to engage even with what one most strongly disagrees with. When writing on topics related to diversity and difference, my students regularly include phrases such as “we all just need to accept people as they are” and “the world would be a better place if everyone would simply be more tolerant of differences.” Tolerance is not only the first virtue that gets mentioned in class discussion and assignments, but is often the only virtue in play. But is tolerance suitable as the primary virtue in a moral framework or ethic? And what if there are some things that must not be tolerated?

A brief but familiar story from the ancient Greek historian Herodotus provides a useful jumping-off point for asking uncomfortable questions about tolerance. In his Histories, Herodotus tells the story of King Darius of Persia, a (somewhat) enlightened ruler who was fascinated by the various customs of the different groups of people from the far-flung reaches of his empire who were part of his extended court. Darius noted, for instance, that two different groups of people—the Greeks and the Callatians (a tribe of people from what is now the Indian peninsula)—had strikingly different methods of dealing with the body of a person who died in their community. The Greek practice when someone died was to burn the dead body, while the Callatian practice was to eat the dead body.

Intrigued, Darius first asked representatives of the Greek community what he would have to pay or give them, what honors he would have to bestow on them, so that the next time someone died in their community they would eat the dead body instead of burning it, as was their custom. Appalled, the Greek representatives told Darius that no amount of riches or honors could possibly convince them to do such a horrible and immoral thing. Darius also asked a similar question of the Callatians—could I convince you to burn the next dead body you have to deal with in your community rather than eating it, as is your custom? Hell no! the Callatians said, insisting that nothing could convince them to do such a disgusting and immoral thing. Herodotus’s conclusion? “Custom is king.” What a person or group of people considers to be “right” or “moral” is what they are accustomed to, the practices of their family, their community, or their culture that they have been taught since their youth. Human nature causes us not only to embrace what we are most familiar with as morally right, but also to assume that it is right for everyone.

If “custom is king” and moral values are culturally defined, then the most important attitude to cultivate, the habit most likely to put up a firewall against unwarranted projection of one’s parochial practices and values on others, is undoubtedly tolerance. As Herodotus’s story is intended to illustrate, the best answer to the question “Who is right about the best way to dispose of a dead body—the Greeks or the Callatians?” is “Both, within the parameters of their culture.” Furthermore, there is no way to step outside one’s own culturally defined moral stance and be “objective.” There is no such objective standpoint. The only proper response to differences between groups, or perhaps even between individuals, is tolerance—the habit of accepting differences without judgment.

The problem, as a student quickly pointed out in each section of my ethics course, is that tolerance as an exclusive or primary virtue is not sufficient to account for many of our strongest moral intuitions. What if, for instance, the difference is about something more serious than the difference between eating or burning a dead body? What if the difference is between a culture that practices female circumcision and our culture that does not? Is tolerance appropriate in this instance? Are we to say “wow, I’m glad I don’t live in that culture, but for them that practice is morally right”? If our intuitions say that some practices cannot be tolerated, no matter what cultures adopt them, is this because our intuitions have been shaped by our own culture or because our intuitions are resonating with a moral absolute that transcends cultural differences?

Of such questions a great General Ethics class is made. But it appears that if we raise tolerance to primary virtue status, we at the same time take any commitment to moral principles that transcend cultural differences off the table. And that may not be a price worth paying. As I told my students the other day, a moral theory that does not account for our strongest moral intuitions is like trying to cover a queen-size mattress with a twin-size fitted sheet. It covers some of what needs to be covered, but not all of it. I, for one, am not ready to tolerate a theory like that.

Disturbing the Peace

Last Friday I attended a talk on campus by civil rights lawyer and law professor Greg Lukianoff on issues of free speech, trigger warnings, and a host of related matters on college and university campuses that are regularly in the news. He is the co-author of an article in The Atlantic a bit over a year ago that raised a lot of eyebrows and generated a lot of conversation. I wrote about it in the early weeks of my sabbatical last August:

I do not know how to teach philosophy without becoming a disturber of the peace. —Baruch Spinoza

One of the lead articles in the most recent edition of The Atlantic magazine is “The Coddling of the American Mind.”

Lukianoff and Haidt: The Coddling of the American Mind

Co-authored by Greg Lukianoff and Jonathan Haidt, the article is an interesting read; its teaser blurb in the Table of Contents says “How a new strain of political correctness on campus is damaging higher education—and may be threatening students’ mental health.” Given Donald Trump’s current more-than-fifteen-minutes of fame, concerns about political correctness are in the news, but in this article Lukianoff and Haidt are drawing our attention to what might be called “political correctness with a twist”:

The current movement is largely about emotional well-being. . . . It presumes an extraordinary fragility of the collegiate psyche, and therefore elevates the goal of protecting students from psychological harm. The ultimate aim, it seems, is to turn campuses into “safe spaces” where young adults are shielded from words and ideas that make some uncomfortable. And more than the last, this movement seeks to punish anyone who interferes with that aim, even accidentally.

The authors’ argument is largely anecdotal, relying either on their own experiences or on recent stories and essays from various campuses across the country. There is a great deal of speculation about the causes of this perceived seismic psychological shift among students over the past couple of decades, although virtually no data is provided to substantiate many of the authors’ claims.

In the first column of the article readers are introduced to two important terms that “have risen quickly from obscurity into common campus parlance”: microaggressions and trigger warnings. Microaggressions “are small actions or word choices that seem on their face to have no malicious intent but that are thought of as a kind of violence nonetheless.” Examples provided include asking an Asian American or Latino American “Where were you born?,” because this implies that she or he is not a real American. Trigger warnings are “alerts that professors are expected to issue if something in a course might cause a strong emotional response”; examples of texts deemed as needing trigger warnings on various campuses include Virginia Woolf’s Mrs. Dalloway (suicidal inclinations) and Ovid’s Metamorphoses (sexual assault). The many examples of these and related problems in the article are chosen and presented with the clear intention of “triggering” the reader into concluding “well, that’s just stupid—political correctness, like a hydra, rears a new ugly head.” One of the authors’ primary concerns, repeated frequently throughout the article, is that such attention to words and actions that might possibly somewhere, somehow offend someone will leave students unprepared to live and work in a world that doesn’t give a crap about what makes them feel uncomfortable.

What are we doing to our students if we encourage them to develop extra-thin skin in the years just before they leave the cocoon of adult protection and enter the workforce? Would they not be better prepared to flourish if we taught them to question their own emotional reactions, and to give people the benefit of a doubt?

Even though I have twenty-five years of college teaching under my belt, my experience on college campuses is deep but narrow, given that I have taught at my current college home for twenty-one years and have shaped my teaching and professional life within the confines of its “105 acre, park-like campus.” Serious conversations about the negative power of language on students in various groups defined racially, economically, by gender, or by sexual preference have been ongoing on my campus for some time now. In my own philosophy department regular, continuing, and often heated debates occur about what constitutes appropriate and inappropriate language in the classroom, in job candidate interviews, and in basic conversation with each other. What strikes some as obviously benign, scholarly, and insightful strikes others as ill-advised, insensitive, and downright offensive. That said, the tsunami described by Lukianoff and Haidt as drowning campuses nationwide has escaped my notice where I teach—at least in my classrooms. Perhaps this is because I have included this general “trigger warning” in every syllabus for every one of my courses for at least the past fifteen years:

In this course we will be considering some of the most important questions a human being can ask. Perhaps the most important feature of our considerations is learning to ask these questions clearly and precisely. Only then can possible answers be considered fairly. Although I have definite positions on the questions we will be addressing, my role as professor is not to tell you what to think. My role is rather to get you to think. Expect your assumptions to be challenged and comfortable ways of thinking to be disturbed. As the great 17th-century philosopher Spinoza once said, I do not know how to teach philosophy without becoming a disturber of the peace.

During an oral final exam a couple of semesters ago a student told me that “This class really messed me up—but in a good way!” Mission accomplished.

The new fall semester is just three weeks old, so here’s a bit of advice related to safe spaces and learning for the incoming freshmen:

  1. Free speech dictates that everyone has the right to their opinion, but not all opinions are equal. One of the purposes of a liberal education is to help you become skillful at using the tools of lifetime learning; some of these tools, used properly, will help you learn how to distinguish a good argument from bullshit—even when it is your own argument. I often say that a liberally educated person earns the right to have an opinion. The process of earning that right begins with realizing that your opinion is not special just because it is yours, and without challenge and analysis it means nothing with regard to whether it is true (or even a defensible position).
  2. In the life of learning, comfort is vastly overrated. Expect to encounter people, ideas, situations and expectations that are both unfamiliar and well outside your comfort zone. You should be looking for these rather than trying to avoid them. If you manage to make it through your undergraduate college career without changing any opinion, belief, perspective or attitude, then your tuition dollars have been wasted.
  3. The world of adulthood into which you are making your first, tentative forays can be a tough, nasty place. The world out there is full of people, ideas, things, and events that couldn’t care less whether they lie within your current comfort zone. As my wife would say, the world is what it is. Your years in college are not so much about your landing a well-paying job after you graduate as they are about the construction of a powerful and flexible moral and psychological framework of belief and commitment, from within which you will engage with what’s “out there” on a daily basis. It is not the world’s responsibility to provide you with comfort and security. It is your task to create and maintain a moral and psychological home for yourself in that world using all of the resources available to you, resources to sustain you on a life-long journey. By the way, you’ll be making significant renovations and additions to this home your whole life. Your professors are here to assist you in the construction of that home—good luck!

A liberal education, especially, inspires students to value struggle. By grappling with authors and ideas that demand the greatest level of intellectual intensity—and this is especially true in subjects that are difficult and uncongenial—students learn that they stretch themselves more through struggle, whether or not they win the match. —Christopher Nelson


Repairing an Angel

I love The Onion. A couple of weeks ago they reported on a sad event at the Vatican:

Angel flies into window at the Vatican

The story reminded me of another damaged angel whom I wrote about not long ago . . .

As I sat at home last Tuesday, doing the things I would normally have been doing in my office on a Tuesday (thanks, Winter Storm Juno, for coming on a day I don’t have classes), I managed to avoid checking Facebook until early afternoon. When I did, I saw that my daughter-in-law Alisha had posted a link to a “What Color is Your Aura?” personality test. I hadn’t taken one in a while (they used to be a mindless and fun obsession) so I bit.

What Color Is Your Aura?

I had done this one a while ago (I think I got yellow) and was pleasantly surprised by the following: A white aura means you are intensely spiritual, possibly surrounded by angels. You are good, honest, quiet and a bit shy, but full of light. Congratulations! You are an amazing person. The usual online personality-test attempt to “pump you up”—but I like it. Of most interest was that I am “possibly surrounded by angels.” I’ve always found the very idea of angels, especially guardian angels, strangely attractive yet entirely outside the reach of reason and logic. Strangely, this reminded me of a place that I not only don’t like much but that is about as different from Juno-invaded Providence as possible: Memphis, Tennessee.

One of the few things I remember fondly about the city of Memphis, where we lived for three years in the early nineties, is “Memphis in May.” This is an annual event during which the city celebrates the culture, food and history of a country selected in advance. It was (and I presume still is) a big deal, providing us with a welcome window into the world beyond the Mid-South parochialism and Southern “hospitality” that we found so challenging. We arrived in Memphis in August 1991, just in time for the beginning of the 91-92 academic year at Christian Brothers University, the place the inscrutable gods of academia chose for me to begin my career as a philosophy professor. We were not amused. But a couple of months into 1992, we started hearing about “Memphis in May”—and the country of choice met with our strong approval.

Italy. I knew nothing about Italians or things Italian until Jeanne and I met; once we were together permanently by the end of 1987 (we had met a month earlier), it was a quick education. A girl from Bensonhurst, Brooklyn—Italian father, Irish mother. Youngest of five, with three older, large Italian brothers and one older sister. Jeanne often describes herself by saying “I look Irish but I act Italian”; the latter part of that description is true of all of her siblings as well. The nature of an Italian father together with the nurture of being raised in a Sicilian neighborhood pretty much clinched the deal. By the time we made it to Memphis, our stepfamily was still relatively new; none of us liked Memphis at all (with the inexplicable exception of my older son), and we gladly anticipated seeing what Southerners might do to celebrate Italy.

The celebration must not have been that great, because I remember absolutely none of it—except the poster. The central figure is a Raphael-esque angel in gold and earth tones, contemplatively smiling and holding a garland as she walks down stairs containing the notes of “Spring,” the opening movement from Vivaldi’s “The Four Seasons.” There is also a lute on the second stair and a random, oddly shaped chair at the top of the steps, with a palm, a fruit tree, and cedar trees in the background. I thought it was pretty, particularly because I thought the angel with its curly, reddish hair looked something like Jeanne. I spent more disposable money than we really had available to get it framed for Jeanne’s birthday—it has hung somewhere in our home for the last twenty-four years.

Our Italy-poster angel is not the only wall-hanging angel in our house. A few years ago (even elephant-memory Jeanne can’t remember when), we purchased a ceramic angel who has hung on our dining room wall ever since. Let’s call her Hannah. Hannah hung happily for a long time, attached by one of those wonderful Velcro contraptions that both hold things securely and can be removed from the wall without leaving a mark when necessary. One evening as I watched television in the close-by living room, I heard a crash. Usually such a noise is the effect of something one of the dogs has done, but not this time. Hannah had decided that she had hung in her particular spot long enough and fell five or six feet to the floor (she hadn’t flown for a while, so was out of practice), shattering into five or six pieces. Fortunately she did not shatter into dust—fitting the pieces together like a jigsaw puzzle, I thought “this is fixable.” “I’ll fix Hannah with Super Glue,” I told Jeanne when she returned home. This was a bold prediction.

I have a checkered history with Super Glue. Given Jeanne’s obsession with all things bovine, a decade or so ago I frequently purchased ceramic miniatures of the various “Cow Parade” cows that popped up in city after city. Soon we had more than a dozen of them; we even had a three-tiered display stand in the corner of the living room upon which these ceramic cows lived and grazed. That is, until the day that Stormy, my son’s cat who was living with us while Caleb and Alisha were residing in the basement for a few months after they moved to Providence from Colorado, did a typical feline thing and knocked the display stand over just for the hell of it. Tiny horns and legs snapped off each Cow Parade treasure (they weren’t cheap). I gathered the parts and said “I’ll fix them with Super Glue.” As it turns out, Super Glue is great when you can clamp the things being glued together for thirty seconds (impossible when one of the items is a couple of molecules in length). It is also great when the gluer’s fingers are not larger than the tube of glue and the things being glued. After many mishaps in which the only things being glued effectively were the tips of my fingers, I gave up on the repair in despair. Jeanne took pity on me, put all the broken bovines into a box, and put the box into the attic, where they still reside. Two of the less damaged ones are still in the living room, one missing a horn and one missing a hoof.

So my plan to repair the fallen angel with Super Glue flew in the face of my track record. But Hannah is larger than a Cow Parade figure, and her five or six pieces fit together nicely. Amazingly enough, the glue held, Hannah was deposited back on the wall (with more Velcro devices), and there she hung for a year. Until we decided to repaint the dining room over Christmas Break a month ago. I detached Hannah carefully in one piece from the wall and laid her, along with a number of other items (including the Italy angel poster), in the book room while we painted the dining room. It turned out beautifully; the day came to put everything back on the wall. That morning, as I arose from reading in a book room chair next to where Hannah was lying, my clumsy foot touched her just directly enough to snap her trumpet and both of her hands off, each severed hand holding half of her broken trumpet. “No biggie,” I thought—“I’ll fix Hannah with Super Glue,” as I had the last time. But the detached pieces were eerily reminiscent in size of the tiny bovine items I had failed to repair in the past, and all of a sudden I was reliving the frustration of trying to repair midget cows. After several failed efforts, I said (loudly) “I’M ABOUT READY TO SHOVE THIS TRUMPET UP YOUR ANGELIC ASS!” and started thinking about what an angel with no hands and no trumpet might look like on the wall. Maybe nobody would notice.

Then I remembered that between my cow failures and now I have learned something about peace, avoiding frustration, and things angelic (sort of). Repeating the phrase that regularly calms and centers me when needed—“Truly I have set my soul in silence and in peace”—I returned to the handless and trumpetless Hannah. Suddenly it didn’t seem so impossible to hold two tiny ceramic pieces together solidly without wiggling for a full minute. Suddenly it occurred to me to slide a book of just the right thickness under her newly attached trumpet and hands so they could meld with full Super Glue strength to the rest of Hannah without being threatened by gravity. I calmly left the room and did not check on her until the next day. Sure enough, Hannah was once again whole, a cooperative effort between Super Glue and peacefully centered me. Hannah now presides over the archway between the dining room and the kitchen. I don’t know if real angels ever need repair. But if they do, I recommend Super Glue and lots of Psalm 131.

Invading the Impossible

A couple of Sundays ago the gospel reading from Luke prompted our rector and my friend Mitch to suggest that Jesus is not someone you would ever want to invite to dinner. Why? Because Jesus’ behavior and the stories he told indicate that he had little interest in or patience with the way things are “supposed to be done.” For instance, he suggests that when you throw a dinner party, you should not invite your best friends and closest family, the people who you know and love the most and whose presence is guaranteed to make the evening a success (they also are the people who are likely to extend a return invitation to you in the future). Rather, “invite the poor, the crippled, the lame, and the blind . . . because they cannot repay you.” In Providence, and I suspect in many locations, it has seemed over this past summer that every busy intersection has a person or two standing with a container and a homemade sign that says something like “Homeless—anything helps. God bless you.” There has been a lot of chatter in various places about where all these people came from, whether they are really homeless or this is actually an organized scam, and so on. Jesus not only would not ask those questions, but he would also bring all of these folks along to your house for a meal if you invite him to dinner. So think carefully before you invite him—there’s no telling what he might do or say.

A few days later at the opening of the semester mass that also officially kicked off my college’s 100th anniversary year, the gospel for the day was from earlier in Luke. This time Peter’s mother-in-law is sick with a high fever, Jesus heals her, “and immediately she arose and served them.” The word gets around town, of course, that the healing man is here and as evening falls everyone with anything wrong with them either makes their way or is brought to Jesus. Throughout the night he heals them all. As one might expect, he’s exhausted by the time morning arrives and, as introverts will do, “he departed and went into a deserted place.” But showing a typical lack of respect for an introvert’s need for solitude and battery recharging, “the crowd sought him and came to him, and tried to keep him from leaving them.” Just a normal twenty-four hours in the life of the Son of God.

So what are we to make of such stories if one professes to be a follower of Jesus and to be at least on the fringes of Christianity? My natural and immediate reaction from my earliest years has always been twofold. First, this guy was strange. Second, his being both human and divine equipped him to do stuff that normal human beings can’t do. Neither of those reactions is profound or unusual; it’s difficult to know what one is supposed to make of the gospel stories, particularly if they are intended to provide us with guidance for how to live a human life. But not long ago I came across an “out of left field” observation about Jesus in action that jerked me up short.

Jeanne spent three weeks in June at an extended conference and workshop in Pennsylvania at a place called “Global Awakenings,” returning with much to be thankful for and much to share. All of the speakers and teachers she spent the weeks with can be listened to on-line, so over the summer I spent a good deal of time listening to and becoming acquainted with what these folks are up to. I’ve enjoyed and learned a great deal from my listening, but I resonated particularly with one fellow named Bill Johnson. A few days after we listened together to one of his talks, Jeanne said “I have something from one of Bill’s books that I want to read to you.” Here’s what she read:

Jesus could not heal the sick. Neither could he deliver the tormented from demons or raise the dead. To believe otherwise is to ignore what he said about himself, and more importantly, to miss the purpose of his self-imposed restriction to live as a man. Jesus said of himself, “the Son can do nothing.” He had no supernatural capabilities whatsoever. He chose to live with the same limitations that man would face once he was redeemed. He made that point over and over again. Jesus became the model for all who would embrace the invitation to invade the impossible in his name. He performed miracles, signs, and wonders as a man in right relationship to God . . . not as God. If he performed miracles because he was God, then they would be unattainable for us. But if he did them as a man, I am responsible to pursue his lifestyle. Recapturing this simple truth changes everything.

“Wow!” I said—“Holy shit!” I thought—“That’s really out there.” One of several endorsements at the beginning of the book describes the author, Bill Johnson, as “one of the nicest persons I know, and one of the most dangerous.” That’s not an overstatement. Because if what he writes about Jesus is true, then there is no place for those who profess to follow Jesus to hide.

One of the great theological and doctrinal debates in the early Christian church had to do, not surprisingly, with how we are supposed to understand Jesus. Human? God? Both? The winner in the debate, as embedded in the Nicene Creed that Christians in many churches recite every week, was “Both.” Which is, of course, very confusing. Various groups have tended to emphasize one aspect over the other ever since. My own tendency has always been to embrace the human side of Jesus rather than the divine, a tendency that over the past several years has evolved into a strong resonance with incarnation, the divine choice to be in the world in human form. I’m convinced that this was not a one-time deal. God continues to be in the world in human form, in you and in me. The passage from Bill Johnson’s book resonates fully with a strong embrace of incarnation. So far so good.

But like many, I tend to waffle when it comes to the miracles of Jesus. Amazing things happen in his wake everywhere he goes; all he has to do is show up. It’s easy simply to say “Well of course—he was the Son of God.” Bill Johnson’s argument is controversial, first and foremost, because it takes this “out” off the table. His argument also makes a lot of sense—it’s just that most followers of Jesus, including me, aren’t ready to hear it. Athanasius once provocatively said that “God became man so that man might become God,” exactly what Bill Johnson is arguing. Jesus is an example and model of what a human being attuned to the divine is like, of what is possible for those of us who take our faith seriously. The idea of incarnation, of God working in the world in and through human beings, is a beautiful one—but it is also intensely challenging. Jesus told his followers that they would do greater things than he did, and that includes us. Are we sure that we are ready to “invade the impossible”?

Tired of Hating People—Thoughts on the Anniversary of 9/11

Everyone beyond a certain age can remember clearly what they were doing fifteen years ago tomorrow when they heard the news. I was in my college’s main cafeteria getting coffee and noticed something weird happening on the Today Show broadcast on a television hanging from the ceiling in the corner. At that point all anyone knew was that one of the Twin Towers was on fire, apparently because an airplane had crashed into it. I had scheduled office hours that morning, so I listened to live radio reports on NPR of the second tower being hit and the collapse of both towers. There was a surreal air to the broadcast—I wanted to believe that it wasn’t true, some sort of elaborate hoax along the lines of Orson Welles’ “War of the Worlds” broadcast many decades earlier. But it was true.

Classes were encouraged to meet and decide individually how best to deal with the day’s events. Several students in my first class of the day at 12:30 had family and friends who lived and/or worked in Manhattan—it was clear that the best thing for these students to do was to continue their frantic attempts to contact their loved ones. About half the class stayed and shared their thoughts—what they said and the nature of our conversation is difficult to recall. I know that many students (as well as many of my colleagues) were understandably angry and wanted retribution; as we gathered our things to leave about halfway through the class period I said “the one thing I’m feeling is that my best response to what has happened is to become a better person. A better teacher, husband, father, friend. That’s all I’ve got right now.”

There will be any number of retrospective reports throughout the day and evening today. Neither Jeanne nor I lost any immediate family or close friends in that day’s terrible events, although in a few cases it was only “luck” that spared someone we know well. A decade and a half removed, when I think about 9/11 and its aftermath as I have been over the past few days, I think of patriotism, wars that seem never to end, and the realization that with the swift passage of time I will soon be teaching students who, first, will not remember 9/11 and then, two or three years later, will not have been born when 9/11 occurred. But most of all, the lasting effect in this country of the terrorist attacks on that day has been a persistent atmosphere of fear and suspicion—as well as of the hatred that fear and suspicion produce.

Just about a year ago the theme of the weekly “TED Radio Hour” on NPR was “Transformation—stories and ideas about becoming a completely different person.” The first story up that day was titled “How Did the Son of a Terrorist Choose Peace?”

How did the Son of a Terrorist Choose Peace?

The storyteller, Zak Ebrahim, is a peace activist and the author of The Terrorist’s Son: A Story of Choice. Ebrahim’s father, El-Sayyid Nosair, for a number of years plotted with other radicals to attack a number of New York City landmarks, including tunnels, synagogues and the United Nations headquarters. Many of these planned attacks were thwarted by an FBI informant, but one of the attacks—the 1993 bombing of the World Trade Center—was not. Nosair and his fellow terrorists were convicted of placing a van filled with 1,500 pounds of explosives in the sublevel parking lot of the North Tower; the subsequent explosion killed six people and injured over a thousand others. Ebrahim was seven years old at the time of his father’s conviction and incarceration—Nosair was sentenced to life imprisonment plus fifteen years.

Ebrahim’s father had become radicalized in the early years of his son’s life; in his TED talk Ebrahim describes how shortly before his father was arrested he took Ebrahim, along with several of the men who turned out to be co-conspirators, to a shooting range for Ebrahim’s first lessons in using a rifle. Even after Nosair’s arrest, the impact of his worldview on his young son continued to be strong.

Growing up in a bigoted household, I wasn’t prepared for the real world. I had been raised to judge people based on arbitrary measurements, like a person’s race or religion. He would just talk about Jews being evil. And I would hear similar things from the men that were with him. You know, gay people being evil and them wanting to turn you gay so that you would go to hell too. And just gay people being all-around terrible people and a bad influence. And he used to say things like, a bad Muslim is better than a good non-Muslim. That’s pretty much what indoctrination is. You have authority figures around you telling you that the world is one way and you don’t get to see another perspective.

This radical indoctrination began to crumble when Ebrahim, as a teenager, began through school to be exposed to some of the people he had been taught to hate. One of his fellow group members at the National Youth Conference in Philadelphia leading up to the 2000 Presidential election was Jewish. Ebrahim did not learn that his new friend was Jewish until several days after their friendship had started developing; he says that “I felt a sense of pride in having been able to overcome a barrier that, for most of my life, I had been led to believe was insurmountable.” That summer he found a job at a Busch Gardens amusement park and for the first time had the opportunity to meet some gay people performing in one of the park’s shows. “I soon found that many were the kindest, least judgmental people I had ever met.”

One day I had a conversation with my mother about how my worldview was starting to change. And she said something to me that I will hold dear to my heart for as long as I live. She looked at me with the weary eyes of someone who’d experienced enough dogmatism to last a lifetime, and said “I’m tired of hating people.” In that instant, I realized how much negative energy it takes to hold that hatred inside of you.

On one level it’s easy to hate because a world made of “Us” vs. “Them” is simple to define and make judgments from within. On a deeper level, though, Ebrahim is right—the negative energy of fear and hate is psychologically exhausting, an exhaustion that is symptomatic of our culture. It’s almost as if it isn’t natural for humans to hate.

A few moments of attention to the level of discourse in the current Presidential campaign are sufficient to hear the tones of fear and anger that pervade our national conversation about almost everything. It is a season of intolerant and fear-mongering language. That such attitudes exist is nothing new; what is new is that hatred and intolerance have found a new foothold in the public square and conversation. And even for those who seek a moderate position that avoids anger and fear, the current atmosphere is infectious. A character in Eric Bennett’s new novel A Big Enough Lie explains the dynamic well:

There are people in the world whose opinions differ from yours so much that the difference implies violence, urges it, supplies a will for it. And if you stand on the side of moderation, this implication, this will to violence, upsets you even more than the mere difference of opinion itself. Because you are complicit in it—you become complicit in extremism by loathing extremism. You are reduced by your enemy to what you despise in your enemy. The world excuses only saints and lunatics from its economy of hatred, is what you realize. Pick a side.

On this fifteenth anniversary of one of the darkest days in American history, my hope is that we as a nation, as a culture will decide, as Zak Ebrahim’s mother did, that we are tired of hating people. Tired of dividing our tiny little universes up into “Us” and “Them” as we vilify those who do not look like, act like, or believe the same as those in our self-defined groups of specialness do, often in the name of rigidly dogmatic beliefs that cannot accommodate the complex and shades-of-grey world in which we live. As Zak Ebrahim discovered, the best cure for fear and hatred is simple experience. But such experience can only happen if each of us has the courage to step outside our ossified comfort zones and dare to meet the most frightening thing in the universe—someone who is not the same as me.


Nice Work If You Can Get It

Not long ago, I read a Huffington Post article summarizing what a bunch of anthropologists found out about the daily work habits of university faculty.

What Do Professors Do All Day?

After spending two weeks with a non-random sample of sixteen faculty of different ranks at Boise State University, the researchers found that on average the faculty worked 51 hours during the work week and 10 hours on the weekend at a host of different tasks. Amazing. It took a study to find out that teachers have full-time jobs after all and actually do work-related things on the weekend (although the numbers for both week and weekend seem a bit low). I’m wondering how to square these remarkable results with an article I read a couple of years ago claiming that “University Professor” topped the list of “Least Stressful Jobs of 2013.” Of course I had to respond . . .

Those who read this blog regularly or even occasionally know that I believe I have the greatest job in the world. For those who are finding this out for the first time, let me repeat—I have the greatest job in the world. As a matter of fact, it is so great that I don’t consider it to be a job at all. For me, teaching is a vocation rather than a job, something that I truly believe I am called and was born to do. I raise eyebrows occasionally on the campus of the Catholic college at which I teach when I say that I consider teaching to be a vocation for me just as much as being a priest is the vocation of the guys who walk around campus in white robes. But even though I love what I do to an almost unhealthy extent, I was taken aback when I learned from a colleague via Facebook that “University Professor” is listed by CareerCast.com at number one in its top ten list of “Least Stressful Jobs of 2013.”

The Ten Least Stressful Jobs of 2013

Really? Or as one of my colleagues commented on Facebook, “Bullshit!!! [pardon my advanced degree French].” I guess I must have failed to notice how non-stressful my job is during my 25-year career as a university professor.

Every person who teaches in higher education has a virtual file full of stories about how difficult it is to get a non-academic friend or family member to understand exactly what is involved with being a college professor. Most difficult is getting someone to understand that this is not a part-time job. For instance, Cousin Bob finds out that the typical teaching load for a faculty member at a teaching college or university is three or four three-credit courses per semester (or perhaps five at a two-year college), meaning that the faculty member is in class at most fifteen hours per week. Must be nice to make a full-time salary at a part-time job! Cousin Bob remarks. Early in my career I often patiently pointed out to the Cousin Bobs in my family that a good rule of thumb is that a teacher spends three to four hours outside of class (preparation, reading, grading, meeting with students, etc.) for every hour spent in class. “Really?” Cousin Bob replies. But he clearly is not convinced, since as we all know, easy working hours are the main reason that a person becomes a teacher.

Then, of course, Cousin Bob wonders about all of the weeks of the calendar year that I am not in the classroom. Christmas break, Spring break, the summer—teachers apparently get at least twenty weeks per year off. Must be nice to make a full-time salary at a part-time job! With what begins to feel like infinite patience, I explain to Cousin Bob that teaching is only one part of a university professor’s job. In order to advance through the ranks of promotion and, more importantly, in order to get tenure, one must research and publish on a regular basis. For most college faculty, the time and focus required for this aspect of the profession is not available during the semester, so the “breaks” are all about research, writing, and praying for publication. But I’m not in the classroom, right? Must be nice to make a full-time salary at a part-time job! You get the idea. A colleague once told me about his frustrated response to a relative after one too many such conversations. Upon hearing Must be nice to make a full-time salary at a part-time job! one too many times, my colleague replied “It actually is really nice. If you were smart enough you could have a job like that too.”

CareerCast’s explanation of why University Professor is the least stressful job of 2013 has more than a whiff of Cousin Bob behind it, just expressed in a slightly less aggressive fashion. For instance, the article explains that

University professors are at the pinnacle of the education field. Their students are largely those who choose the classes they attend, and thus want to be in class. Unlike elementary and secondary educators, the performance of college professors isn’t evaluated based on standardized tests. University professors also have the opportunity to earn tenure, which guarantees lifetime employment.

A full response would require something more like a book chapter than a blog post. Suffice it to say that the author of the article (1) has apparently never heard of core requirements if he thinks that college professors face “students . . . who choose the classes they attend, and thus want to be in class,” (2) is not aware that despite the (usual) lack of standardized tests, college teachers are evaluated by their peers, are answerable for the results of student evaluations, and are under regular scrutiny in every aspect of what they do, and (3) needs to learn something about the tenure process (assuming that the faculty member is fortunate enough to be teaching at an institution that uses the tenure process).

Tenure. Such job security is certainly unusual in today’s job market, and tenure is an attractive “perk” of the academic life. Once one earns it, that is. Tenure-track positions are hard to come by in academia, more and more so as many institutions opt for hiring year-to-year adjunct professors or special lecturers, then proceed to treat them as well-dressed slave labor (don’t get me started on that one). Should a teacher be fortunate enough to land a tenure-track position in today’s heavily buyer-skewed academic marketplace, the stress she or he will experience in the next several years leading to the tenure decision will be sufficient to last a lifetime. As is undoubtedly the case in many workplace environments, the tenure decision is often as much or more about internal campus politics as it is about the qualifications of the candidate and those things that she or he can control. “The opportunity to earn tenure” is indeed that—an opportunity that, unfortunately, for many talented and qualified teachers will never be available.

Then there’s the money. The article author points out that

Harvard University pays full-time professors $198,400, with a 7:1 professor-to-student ratio, while University of Chicago professors receive $197,800 per year with a 6:1 ratio. Among public universities, the University of California at Los Angeles (UCLA) is highest paying, with an average wage of $162,600 for its full-time staff.

Really? All of them? At all levels? In all disciplines? Such “statistics” are useless without context, about as useless as telling a lawyer in a public defender’s office working 80-100 hours per week and struggling to make ends meet that the senior lawyers in the top firms on Wall Street often make seven figures annually. Here’s an anecdote from the real world. At least a dozen years into my teaching career, still paying off the loans from ten years of college in order to earn the PhD required to teach at the college level in my discipline, it occurred to me one day that the second-semester seniors in my Business Ethics class, most of whom would be entering the work force shortly after graduation as entry-level persons with undergraduate business degrees, would be starting at a monthly salary noticeably higher than my own. As I once told a younger colleague when I was chair of my department, in response to his complaints about how little he was being paid, “if you became a teacher for the money, you’re a moron.”

I have reached the level of experience and rank (tenured full professor) at which one might expect stress levels to diminish and ultimately disappear. But people like me are precisely those who are tapped, appropriately, to commit themselves significantly to the third leg of the academic stool (along with teaching and research): service. After four years as chair of a department of 25 faculty, and having recently completed a four-year stint as director of the core academic program at my college, responsible for 80 faculty and upwards of 1,700 students at any given time, I realize that one sort of stress simply gets replaced by another.

And actually that’s fine, since it is all part of the vocation I was born to inhabit. There are many attractive features to the life of a university professor. I can think of no other profession in which one’s creativity is required more often or in which one has more autonomy and flexibility. But it is anything but stress-free. A teacher never leaves the office; your work goes with you everywhere. I realized early one December that, other than Thanksgiving Day, I literally had not had a day off since the middle of August. This is why, in more than twenty years of teaching, I have recommended the profession to no more than a half-dozen of my best students. If you are looking for a profession that will fit nicely with your family obligations and other interests, don’t become a teacher. If you want to make a living wage at a stimulating 40–45-hour-per-week job, don’t become a teacher. If you want to “work to live” rather than “live to work,” you probably should not become a teacher. If you think of teaching as one among many equally interesting career possibilities, don’t become a teacher. But if you are incurably obsessed with the life of learning, if the dynamic of new ideas exhilarates you, if you suspect that you might have the heart of a teacher and nothing else will fulfill you, then this highly stressful but highly rewarding vocation might just be for you. It’s nice work if you can get it.

You Are Free, Therefore Choose

Each of us can point to a time (or several times) in our past when we made a decision that, in retrospect, significantly shaped our lives going forward. Such decisions for me include getting married at one month past twenty, choosing to leave law school for a master’s program in philosophy, and deciding in my early thirties to commit for life to a person whom I had known for six weeks. I could have chosen differently in each of these cases, and my life would be much different now than it is. But could I really have chosen otherwise? The answer of many “experts” from psychology, science, philosophy, and more is “no.”

A week ago I wrote about how the “experts” have gradually but inexorably come to this conclusion, describing how evidence from Darwin to neuroscience supports the claim that everything about me, including all of my choices, is fully determined by biological and environmental causes beyond my control.

They Will Never Take Our Freedom

The experts admit that I will undoubtedly continue to believe that some of my choices are free in the sense that I could have chosen otherwise, but that belief is based on an illusion. My choices may feel free, but they really aren’t. If true, the news that free will—the foundation of what most of us believe concerning morality, reward, punishment, praise, blame, and responsibility—is an illusion cannot be taken lightly. Nor, I would argue, need I believe, either as a philosopher or as a human being, that the “experts” are right about this. Free will is only an illusion if one accepts the starting assumptions that energize the argument against real human free will, assumptions that include the beliefs that everything that exists is made of physical matter, that physical matter is governed by inexorable physical laws, and that we generally know what those laws are. These assumptions are so entrenched among the “experts” that challenging them is as uphill a battle as trying to argue that the earth is flat. But I’ll give it a shot.

I often tell my students that each person, among her or his beliefs, has a small handful of what Aristotle called “first principles.” Aristotle knew as much about cause and effect as anyone; indeed, he arguably invented our familiar system of logic, which is built on the belief that we live in a world governed by cause-and-effect relationships. These relationships shape how our beliefs hang together as well. Consider the following conversation:

Me: I believe A is true.

You: Why?

Me: Because A depends on B, and I believe B is true.

You: Why?

Me: Because B depends on C, and I believe C is true.

You: Why?

There’s a pattern here. We all seek to support our beliefs by referring to connected and deeper beliefs on which they depend. There’s also a problem here, though. The chain of cause and effect has to end somewhere if we are to avoid the dreaded “infinite regress.” So eventually we get this:

Me: I believe that X is true because X depends on Z, and I believe Z is true.

You: Why? (You’re getting really annoying, by the way.)

Me: Because I do.

In Aristotle’s terminology, I have just identified “Z” as one of my first principles. In order to avoid an infinite regress, eventually we arrive at a belief for which we seek no further justification than that we believe it. Such first principles vary from person to person; some common ones include “Human life is intrinsically valuable,” “Human beings are fundamentally equal,” and “Do unto others as you would have them do unto you.” For many, including me, “Human beings have the capacity to choose freely, choices that are not entirely determined by matters outside their control” is another first principle, one that stands in direct opposition to what the “experts” claim the truth to be. And like it or not, no one wants to hear that a first principle is an illusion.

When I choose freely, I deliberate between available options, weigh the evidence for and against each, and choose the option that best satisfies my operational criteria. I cause the choice, in other words, influenced but not determined by any number of factors. This simple idea—that a human being can choose without the choice being fully determined—violates assumptions so prevalent among the “experts” that it is tantamount to heresy. And to be sure, this simple idea is indeed a radical one, for it claims that the freely choosing human being is an exception to the inexorable laws of matter, capable of starting her or his own chain of causation, a chain that would not have happened without the choice that started it. There are few beings in the annals of philosophy with this power. Aristotle called his creating force—what we might call “God”—the “unmoved mover” or the “uncaused first cause,” the place where the chain of causation begins (or ends, if one is going backwards). In short, when we make free choices, human beings act “in loco parentis”—in the place of the divine Parent—with the causal power of the divine itself. If one wants to go religious, it is this very creative power of free choice that the sacred texts are referring to when they claim that human beings are “created in the image of God.”

The position that truly free choices step meaningfully outside the laws of nature has been called “metaphysically peculiar” by some philosophers, and simply “bullshit” by others. Free will deniers assume that any human capacity that purportedly steps outside the laws of physical matter must be an illusion, since we all know that everything is made of matter and that matter is governed by deterministic laws. To which I respond, as I often do, in the words of Shakespeare’s Hamlet: There are more things in heaven and earth . . . than are dreamt of in your philosophy. Presuming that everything in heaven and earth can be reduced to the confines of our current understanding of reality is hubris of breathtaking proportions. When a fundamental and defining human ability is defined out of existence because of narrow assumptions, I choose to question the assumptions rather than the reality of the ability. And when the answers to a question do not square with our strongest intuitions and beliefs about ourselves, change the question.

None of the above is very philosophical of me, at least not in the contemporary sense. I freely admit that human free choice might be an illusion, but I see no reason to believe so on the basis of an argument with questionable assumptions. I choose rather to embrace the mystery and miracle of being human and to believe—until better contrary evidence is provided, and in keeping with the intuitions of billions of human beings—that at least some human choices make a difference, such a great difference that they make the world a different place than it would otherwise have been. And human beings are not just vehicles of that change—their choices cause that change. Maybe we just don’t know enough about reality to rule out abilities that don’t square with our current understanding of things. Maybe human beings are truly the crowning glory of creation, endowed with a spark of the divine that reveals itself in our most basic capacities. Maybe all of the above. Take your pick. As Jean-Paul Sartre used to say, “You are free, therefore choose.”


First Day of Classes

Today is the first day of the semester at my college. I have been anticipating it even more than I usually do because I have been on sabbatical and have not been in the classroom since May 2015—my longest stretch away from school since the mid-1980s. Everyone has stories, many of them of the horror variety, about the first day of school—mine were full of cognitive and emotional dissonance from my earliest years. On the one hand, I couldn’t wait for school to start—I’ve loved just about everything about the life of learning ever since I can remember. On the other hand, there was all that non-learning stuff that I was not good at. Meeting new people. Sports. Leaving my mother and my dogs at home every morning. Riding the bus for a half hour each way. Come to think of it, I probably would have loved home schooling. I doubt my parents would have.

For a college professor, the first day of the semester requires answering an eternal question—what to do with students whom you have never met, who are coming off either summer or Christmas vacation, who probably just met their roommate and hate her or him, who might have spent the previous night getting in a final party before attempting to get serious about their education for at least a week or so, and who are assuming that all we are going to do during our first class meeting is go over the syllabus and leave early? I have some professor colleagues who, assuming that all of the above is true, descend to their new students’ expectations and dismiss class after a brief overview of the syllabus and the upcoming semester. I am not one of those professors. At the beginning of the first day of school, I give a reading quiz.

This, of course, means that my students have a reading assignment that they are required to complete before our first meeting of the semester. This is easy enough to arrange—I have the class roster available several weeks before the semester begins, so a simple email, followed by at least a dozen reminders in the days before the semester begins, provides my students-to-be with the course syllabus, other pertinent information, and the first day’s reading assignment (with the promise of a quiz). I often notice a drop of five or six students in my class roster in the day or two after sending this email, which is a good thing (see below). I usually receive a bit of email pushback from my students-to-be (“My other classes aren’t requiring a reading or having a quiz on the first day”). So why do I do this? For a number of reasons.

  • One advantage, as noted earlier, is roster control. Unfortunately, because my classes tend to fill up quickly and are often of the core curriculum variety that students have to take, every student who leaves my roster for a more palatable first day experience is immediately replaced by someone else (who doesn’t necessarily know about the first day reading and quiz—hence the multiple email reminders).
  • The amount of classroom time I have in a given semester with my students is limited. This semester we meet for seventy-five minutes twice per week for a total of twenty-seven meetings. Throwing away the first of those meetings sends the message that either our time together is unlimited or that it isn’t important enough to squeeze value out of every scheduled moment. Neither of these is true.
  • Many of my courses (half of them this semester) are filled exclusively with freshmen. Nothing says “this isn’t high school or Kansas anymore” to a first-week college student like hitting the ground running—hard.

I’m happy to report that in the interdisciplinary course I teach in every fall, my teammates have been happy to join me in assigning significant work for the first class. Two years ago, a colleague and I assigned the entire Iliad for freshman summer reading, building on it through the first two weeks of the semester. The freshmen in this year’s interdisciplinary course met for the first time today—my colleagues and I assigned fifty pages or so and gave a quiz. Everyone survived. It’s amazing what you can get freshmen to do without complaining.

Other than making the first day of school a difficult one for my students, what really am I up to as I return to the classroom this week? I reflected on this just before my sabbatical began sixteen months ago; as I get back into the teaching saddle, I return to what I was thinking then. As a professor, I am a facilitator of lifetime learning, a person who points students in fruitful directions, helping them identify and become skillful in the use of tools that will help them construct their own moral frameworks intelligently. The liberally educated lifetime learner is equipped both to seek and to create meaning throughout her life. I take pride in playing a part in this process. I have thought a lot over the past twenty-five years about the day-to-day dynamic between professor and student; I continually return to the difference between an idol and an icon.

Idols and Icons

The role of a professor is to be Virgil to the student’s Dante: guiding the educational journey, relying on knowledge and experience to point out the pitfalls and attractions of a journey that each person must make individually. The professor helps the student learn to identify what is important and what is not in the perpetual sifting process of education. The professor is not the main attraction at any point in this process. The professor is an icon—something to look through or past, in other words—rather than an idol—something to look at. There is a reason that the Second Commandment is a prohibition against idolatry. Human beings are inveterate idolaters, more than happy to pattern themselves after someone or something else rather than to take on responsibility for themselves. For those interested in creatively addressing the undoubtedly real shift in higher education toward preparation for a good job and financial success, a shift that has been going on for a while now, I highly recommend iconography.


They Will Never Take Our Freedom

Although I read incessantly, I don’t read a lot of magazines. The only magazine I currently subscribe to is The Atlantic—I appreciate the excellent writing and quirky features, but I don’t exactly wait by the mailbox for each monthly edition to show up. Instead, they tend to pile up on the little table next to my side of the bed, waiting to be perused when I am between authors in my novel reading. I’m currently in one of those spaces, having just finished my fourth consecutive Arturo Pérez-Reverte mystery a few days ago and not yet ready to start a new, large reading project just a week before the semester starts. Accordingly, I started plowing through the three summer editions of The Atlantic that have accumulated on my nightstand since June. Inside the June edition, whose cover includes two-thirds of Donald Trump’s head peeking in from the right side and announces a lead article entitled “The Mind of Donald Trump” (an oxymoron if I ever saw one), I found this: “There’s No Such Thing as Free Will—Here’s why we all may be better off believing in it anyway.”

Stephen Cave: There’s No Such Thing As Free Will

The article is by Stephen Cave, a philosopher who runs a “Center for the Future of Intelligence” at the University of Cambridge. His article is well-written and engaging—so much so that I suspect he may have had help with it. Trust me, I know whereof I speak: I have spent over twenty-five years learning to write in ways that make core philosophical issues accessible and interesting to non-philosophers—it ain’t easy. First, it’s important to clarify what philosophers usually are referring to when they use terms like “free will” or “freedom.” Just before the final battle in his 1995 epic “Braveheart,” Mel Gibson’s William Wallace screams to the Scottish army, “They may take our lives, but they’ll never take our freedom!”

That sort of freedom, the kind enshrined in this country’s founding documents as “rights” that each citizen possesses and that must not be violated or taken away, is not what philosophers mean by freedom.

Instead, “free will” refers to the human ability to choose: the ability to deliberate between options and eventually choose, then act on, one of them, all the while knowing that she or he did not have to choose that option—in other words, that she or he could have chosen otherwise. This vaunted ability to choose freely is, for many (including me), the fundamental and defining feature of what it means to be human. Stephen Cave points out that our legal systems, as well as our general beliefs concerning praise, blame, reward, punishment, and all things moral, depend on our basic belief in human free will. And that belief is under attack—scientists, psychologists, philosophers, and just about everyone “in the know” have been trying to take it away for decades.

The “free will issue” is a go-to problem in all philosophy courses, the philosophical version of theology’s divine foreknowledge/free will problem. Just as it is impossible to make room for free choice in a world governed by an omniscient deity, so in a world where everything that occurs is governed in a cause-and-effect manner by the physical laws of matter, there is no room for true human free will. Cave points out that at least since Darwin argued in The Origin of Species that everything about human beings—including our vaunted reasoning abilities, where the ability to choose is located—is a result of natural evolutionary processes rather than a mystical, magical, or divine “spark” that lies outside the physical laws of matter, science has reinforced the conclusion that whatever human consciousness and deliberate choice are, they are to be placed squarely in the material world—making it impossible, of course, to carve out the special place we desire for choice. Our choices may “feel” free, “as if” they are up to us, but Cave pulls no punches in describing the truth about us:

The contemporary scientific image of human behavior is one of neurons firing, causing other neurons to fire, causing our thoughts and deeds, in an unbroken chain that stretches back to our birth and beyond. In principle, we are therefore completely predictable. If we could understand any individual’s brain architecture and chemistry well enough, we could, in theory, predict that individual’s response to any given stimulus with 100 percent accuracy.

Experiments by psychologists and neuroscientists have shown that the brain’s neurons fire in the patterns that cause a specific action before a person consciously “chooses” to act—indicating that my conscious “choice” is an illusion that doesn’t actually cause anything. Debates rage concerning how much of a person’s behavior is caused by “nature”—one’s hardwiring—and how much by “nurture”—one’s environment—but there is general agreement that none of it is caused by conscious choice. We are determined through and through.

The ensuing discussion is often amusingly similar to conversations that couples considering a divorce might have: Should we tell the children, and if so, when? In the service of all truth all the time, some argue that non-philosophers and non-scientists should be made aware that free choice is an illusion and should stop believing in it. Others insist that such a revelation would damage the average person’s commitment to morality, law, reward, punishment, and all of the other cool things that rely on our apparently mistaken belief that our choices make a difference and that we are responsible for them. My own classroom experience indicates that it doesn’t matter. I regularly use a very simple thought experiment with my students at the beginning of the “free will” unit on the syllabus:

Suppose that in the near future a super-duper computer can read your brain and physiology sufficiently to predict the rest of your life, from large events to the minutest second-to-second thoughts and feelings, from now until you die. For a nominal fee you can purchase a printout of every event, thought, and feeling that you will experience for the rest of your life. Some printouts will be yards in length, while others will be very short. Do you want to see yours?

In a typical class of twenty-five students, no more than one or two will say that they want to see it. Why? Because even with direct proof available that the rest of my history is determined down to the minutest level—including my “free” choices—I prefer to believe that my choices make a difference in my life and in the world around me. I prefer to embrace the illusion. It appears, in other words, that human beings are determined to believe that they are not fully determined.

On this particular issue I find myself swimming against the tide. I not only believe that human beings have the ability, at least on occasion, to make choices that are not entirely determined by their biology, history, and environment—I also believe that this ability is not an illusion. It’s real. The free will/determinism issue as contemporary philosophy defines it has its current shape because virtually everyone accepts a starting assumption: everything that exists is material stuff subject to inflexible physical laws. Given that assumption, the claim that human beings can jump outside the limitations of matter and make choices that escape the determinism of cause and effect makes no sense. But as I often tell my students, if the answers one is getting are unacceptable, change the question. If the ability to choose freely is fundamental to what a human being is, and if our current assumptions about how reality is constructed make no room for that ability, then perhaps instead of accepting that choice is an illusion we should challenge the assumptions that forced us to that acceptance. Be watching for “What Freedom Amounts To” next week, where I’ll describe a very different way to think about human choice!