

The Designer God Project

Voltaire once said that if God did not exist, we would have to invent him. In truth, we invent God all the time, often with seeming disregard as to whether the God we have invented actually exists or not. Anne Lamott suggests that we can be pretty sure that we have created God in our own image if it turns out that God likes all the people and things that we like and dislikes all the people and things that we dislike. So how am I, or how is any God-believer, supposed to tell whether the God I believe in exists in reality, or exists simply as a figment of my self-obsessed imagination? I’m having the opportunity to explore these issues with my students early this semester, and the process has been both dynamic and illuminating.

The texts for an early seminar not long ago in the interdisciplinary program I both teach in and direct were the first twenty-five chapters of Genesis and the first twenty-four of Exodus. It is often a challenge to get freshmen to discuss anything in seminar in the early weeks of their first semester; getting a bunch of eighteen-year-olds, most of whom are products of twelve years of parochial education, to talk about the Bible is even more difficult. But I’ve been doing this for a while and have a lot of tricks. After assuring them that no one has ever been struck dead in any of my classes for speaking honestly about their reactions to what they’ve read in a “sacred text,” a few brave souls began to admit that the God of these Old Testament stories is quite different from the God they had been taught to believe in. This God frequently seems insecure, petty, unfair, and arbitrary—what’s up with that??

After a few minutes, it occurred to me that a thought experiment was in order. I said “Okay, if you don’t like the God of Genesis and Exodus, let’s work for a while on what we do want God to be and to act like. Let’s create a ‘Designer God’—you get to create God from scratch. Write in your notebooks for ten minutes on the following topic: Any God worth believing in will have the following characteristics. Come up with three characteristics and explain why any God worth believing in would have to have them. Go.”

After the writing portion of the thought experiment, the students compared notes and found that the God they had just designed individually was pretty similar from person to person. As they offered their favored divine characteristics, I wrote the list on the board:

Any God worth believing in will have the following characteristics:

  • Fair and just
  • Loving (at least to those who deserve to be loved)
  • Powerful
  • Dependable
  • Not a micromanager

As we discussed selected characteristics on the list, a number of issues were revealed.

Fairness and justice: The biggest problem the students had with the Old Testament God is that this is a God who plays favorites. Any God worth believing in should treat everyone the same. “Why?” I asked. “Do all of you treat everyone the same? Do you like the seven billion plus people in the world the same? Do you even like the few dozen people who you know really well the same?” They had to admit that they didn’t. “Then why do you expect God to do something that you make no attempt to do?” I wondered. The students struggled for an answer other than that God is God and we’re not—the divine should be held to a higher standard than we are, although where that standard would come from other than God they weren’t sure.

Love and forgiveness: At first, the idea was that any God worth believing in should be loving. Period. “Even mass murderers, drug dealers and child abusers?” I asked. Well, several thought, we need to qualify this love thing a bit. God should love those who deserve it, or those who believe in God, but not everyone indiscriminately. Love that is equally spread everywhere without qualification is cheapened somehow. God’s love is transactional, in other words. I do this, God responds with love.

Power: Omnipotence turned out to be a big one—no God worth believing in is wimpy or weak. “But God in Genesis and Exodus is powerful and has no problem exhibiting that power on a regular basis. And you didn’t like that,” I reminded them. As it turned out, Designer God should be powerful but should not be all about using that power all the time. “When is it appropriate for God to use that divine power?” “Whenever I or my group is in trouble or needs something” was the most common response. So you want God to be like a 9-1-1 operator or a lifeline on “Who Wants to be a Millionaire?” That didn’t sound right, but maybe so. That led to another Designer God must-have trait.

Dependability: God needs to “be there” was the way that many students put it. “Dependable” and “trustworthy” were synonyms in this discussion. “Being there” means on call, though—the students clearly were not interested in a proactive God that demanded much of them. When things are going badly, listen up and answer my prayers. When things are going well, leave me the hell alone. The students were largely in agreement when I reframed this trait as a requirement that God not be a micromanager. An overall plan for my life is fine, but I want to have a great deal of choice in terms of how I choose to find out about and pursue that plan (even freedom not to follow that plan if I so choose).

As we entered the last half hour of seminar, I asked everyone to take a mental step back and look at the list of Designer God characteristics that we had been discussing. Truth be told, they looked like the characteristics of a combination of a non-interfering Santa Claus and my students’ parents on a good day. Or the personality traits of the pleasant, vanilla God they had been taught to believe in. The question to ask, I suggested, was “What evidence is there that the God you have just designed actually exists?” Is there any evidence that these are the character traits of the divine, or are these simply a projection of what we want to believe in? A careful and clear consideration of the world we actually live in reveals that for every piece of evidence supporting the existence of the Designer God, an equally obvious piece of evidence suggests either the Designer God’s non-existence, or—perhaps more challenging—that whatever God is, God is something quite mysterious, exhibiting characteristics not on our list, and well outside our comfort zones. The Designer God Project was a two-hour exercise in creating God in our own image. And maybe that’s where most of us would like to stay. We’re like the Israelites in Exodus who get the shit scared out of them when God actually talks to them directly. They are very uncomfortable with the noise, the lightning, the fire, and the obvious power. Their response? “Moses, you go talk to God and tell us what God wants. We can deal with you, but we don’t want to deal with that.”

The writer of Hebrews suggests that it is a terrifying thing to fall into the hands of a living God. Something created in my own image is far more comfortable, predictable, and manageable. The uncomfortable thing about adventuring with a real God rather than hanging out with a projection of myself is that it opens the door to continual growth and surprise and blows the doors off my comfort zone. Walking with God is like Forrest Gump’s box of chocolates. You never know what you’re going to get.

Insufficient Evidence

Bertrand Russell was a British philosopher, logician, mathematician, historian, writer, social critic, political activist and Nobel laureate. He was one of the most important and recognizable public figures in the English-speaking world during the first half of the twentieth century. He was also an avowed atheist. The story is told that at the end of one of his public lectures in which his atheism was on full display, a furious woman stood up during the question and answer period and asked, “And Lord Russell, what will you say when you stand in front of the throne of God on judgment day?” Russell replied, “I will say: ‘I’m terribly sorry, but you didn’t give us enough evidence.’”

I was reminded of this story when the author of an article I assigned in my ethics classes the other day included it at the beginning of his discussion of how people use evidence to support the different sorts of things we claim to be true. For instance, the author claimed, verifiable and objective evidence serves as the foundation for truth claims in the sciences, but in religious belief—not so much. Indeed, the author continued, religious belief is easy and available to everyone because evidence is not required—just faith (whatever that means). The author identifies himself as an atheist who is fascinated by the phenomenon of religious belief—his conclusions make me wonder if he has ever actually met a person of faith.

As I work with fifty students in two sections of General Ethics through our current unit, “Does God have anything to do with ethics?”, I have found regularly that my junior and senior students—the majority of whom are products of Catholic primary and secondary education—are often no more informed about the relationship of evidence to faith than the atheist author of that day’s article. A few weeks ago I provided them with my “go to” definition of faith: Faith is the substance of things hoped for, the evidence of things not seen. Some were surprised to learn that this definition, which does not refer to either God or religion, is from the Book of Hebrews in the New Testament. Whatever else faith is, the author of Hebrews is claiming that evidence has something to do with it.

I decided, as I often do, to get at this tricky issue obliquely and through the back door. “How many of you have ever been to Japan?” I asked. No hands went up in either class section, including mine. “How many of you believe in the existence of Japan?” I asked next—all hands went up. “How does that work?” How do we come to believe in the existence and/or truth of something when we lack direct supporting evidence? Because clearly the preponderance of things that each of us believes to be true—thousands upon thousands of items of all sorts—are beliefs that lack direct empirical evidence to support them. “How many of you believe that Abraham Lincoln was assassinated by John Wilkes Booth in Ford’s Theater in 1865?” I asked next. All hands went up. “How many of you were there when it happened?” No hands went up—I assured them that even I was not old enough to have been an eye-witness. So once again we have an example of a claim that everyone believes to be true even though we lack “concrete evidence” (as my students like to call it) to support our belief. Or do we?

Upon being asked to list what sorts of evidence we do have to support our beliefs concerning the existence of Japan and what happened in Ford’s Theater in April of 1865, my students came up with several suggestions:

  • Testimony—The word of others, eyewitnesses when possible, counts as particularly strong indirect evidence. Even though I have not been to Japan, I know people who have visited there and have even met people from Japan. It is, of course, possible that all of these people are lying to me, but the more that the testimonies I gather are consistent with each other, the more likely it is that they are pointing toward something true. This doesn’t work, of course, when considering events where no eyewitnesses are available, such as what happened at Ford’s Theater, but fortunately the spoken word is not the only way in which we are able to gather relevant testimony.
  • Texts—Those who have not been to Japan have seen pictures of it and have read descriptions of it. These are indirect and second-hand, but become part of accumulating indirect evidence. Textual evidence for historical events for which there are no longer any eyewitnesses is the bread-and-butter of the historian’s trade. The great Doris Kearns Goodwin gave a talk on campus a couple of evenings ago—when she spoke of Abraham Lincoln, it never occurred to me to wonder if what she was saying was true. As she described the meticulous ways in which she gathered evidence for her book Team of Rivals from letters, diaries, newspapers, and other first-hand accounts from over a century-and-a-half ago, I was reminded that the only evidence we have of the truth of anything occurring more than a hundred years ago requires both investigative strategies and an inherent trust in the results of such investigations.
  • Traditions—Often all we have to rely on to bolster various beliefs is what has been passed down from generation to generation as stories and traditions intended to capture the essence of an individual, a family, or a culture. We use such stories and traditions as evidence to support our best guesses concerning what might have happened; they form an important part of the foundation of belief that gets passed from generation to generation.

It was clear as the discussion proceeded that in some of the most important parts of our daily lives, both as we engage with the present and as we consider the past, our “certainties” are built on a foundation of uncertainty—a foundation that we eventually depend upon as if it were certain once our collection of indirect evidence reaches an imaginary tipping point.

We are now half-way through the semester and my students are familiar with my teaching strategies, so no one was surprised when I stepped back from the board where I had listed their examples of indirect evidence and asked, “How might these types of evidence work in another area of belief where we have a difficult time accessing direct evidence—belief in God?” As it turns out, the same sorts of indirect evidence that we use on a daily basis to bolster our belief in all kinds of things are available when we enter the arena of faith. Sacred texts are used as sources of evidence of all sorts; regardless of whether a person considers such texts to be divinely inspired or not, they contain evidence of how people have engaged with issues of God and faith over the centuries. Such texts purportedly contain testimonial evidence to the existence and nature of God as the divine reportedly interacts with human beings, and interpretations of these testimonies become centerpieces of traditions that develop into religions.

We also have the same sort of indirect testimony concerning faith-related issues that my students used when identifying the sources of their belief in the existence of Japan even though none had ever been there. For instance, I know many people who report personal experiences that they believe can only be explained by an invasion of or intervention into their life by something greater than themselves. Often such reports can also be accounted for by explanations other than a direct encounter with God, but in such cases one must always consider both the reliability of the person giving the testimony and whether other similar reports have been given by other reliable sources. And as always, whether or not a person believes such reports is going to be a direct function of her or his predisposition to believe anything. How much evidence is enough? What sorts of testimony will I never believe, no matter who it comes from? To claim that evidence is ever free of all sorts of psychological and personality-driven biases is to ignore how we actually gather and use evidence in real time.

My answer to Bertrand Russell’s complaint that “you didn’t give us enough evidence” is that perhaps Lord Russell needed to consider what sort of evidence would have counted as “enough,” as well as what even counted as evidence for him at all. If his claim is that there is not enough “concrete evidence” to establish the existence of God, I might agree. But I would remind Russell that the vast majority of the many things that we believe to be true are not supported by such evidence either. When considering issues as important as faith, there is no need to change the rules of the evidence game—faith is, after all, “the evidence of things not seen.”

A Grownup Faith

If we’re grownups about faith, then why can’t we all get together and lament the fact that there is no God? Christian Wiman

Recently my ethics students and I have been discussing the dangers of moral certainty. For many of them, this has been a counterintuitive conversation, given that moral principles are commonly thought to be only as good as they can be proved to be universally applicable and unassailable. Why wouldn’t we want certainty in our moral beliefs? one might ask. Because, as several of the authors assigned for class discussion have noted, many of the worst atrocities that human beings have done to each other over the course of human history have been done in the name of various claims to certainty. The Holocaust. The Crusades. Terrorism of all sorts. In an article assigned for a recent class, Simon Critchley writes that “Human knowledge is personal and responsible, an unending adventure at the edge of uncertainty. Insisting on certainty leads ineluctably to arrogance and dogma based on ignorance.”

Nowhere is certainty more problematic than in the life of faith. As poet Christian Wiman said in a recent interview,

Doubt is so woven in with what I think of as faith that it can’t be separated. I am convinced that the same God that might call me to sing of God at one time might call me at another to sing of godlessness. Sometimes when I think of all of this energy that’s going on, all of these different people trying to find some way of naming and sharing their belief, I think it may be the case that God calls some people to unbelief in order that faith can take new forms.

If my own experiences and struggles with faith are at all typical, Wiman is on to something. There are times when I find it very difficult to tell the difference between faith in God and faith in a figment of my imagination. This is why, as I wrote last Friday, a person of faith can learn a lot from atheism.

Evangelical Atheism

This is not an unusual idea. For centuries, voices from within the camp of Christianity have called for something sounding very close to atheism. Meister Eckhart wrote that “We pray to God in order to be free of God”; from his prison cell Dietrich Bonhoeffer predicted that the future of faith would be found in a “Religionless Christianity”; and Simone Weil wrote that “the absence of God is the most marvelous testimony of perfect love.”

In each of these instances, the person of faith is asked to move beyond the traditional notion of God as something outside ourselves, a picture of the divine that for many has lost its meaning. I often find myself thinking, as I listen to various descriptions of God being thrown around in different venues, that “if that was what God amounted to, I would be an atheist.” This is where the passage from Christian Wiman quoted earlier comes in. The only way for faith to evolve and take new forms is for old models and paradigms to change. As Wiman writes in My Bright Abyss, “This is why every single expression of faith is provisional—because life carries us always forward to a place where the faith we’d fought so hard to articulate to ourselves must now be reformulated, and because faith in God is, finally, faith in change.”

This can be very disconcerting, because old paradigms change only with great difficulty. When life gets even more challenging than usual, the person of faith is often tempted to fall back on “tried and true” methods of getting the divine’s attention. More prayer, more church attendance—but there comes a time when such methods are regularly met with deafening silence. This silence can lead either to a deepening crisis of faith or an entirely new faith altogether, a new faith that is infused with healthy doubt, and an openness to possibilities from sources that one never even considered as places where truth might reside. Wiman again:

To say that one must live in uncertainty doesn’t begin to get at the tenuous, precarious nature of faith. The minute you begin to speak with certitude about God, he is gone. We praise people for having strong faith, but strength is only one part of that physical metaphor: one also needs flexibility.

A grownup faith is one that is both strong enough to look for God in places that have traditionally been off-limits and honest enough to realize that certainty is the greatest threat to faith of all.

One of the traditionally strongest arguments from atheists against belief in God is particularly effective against a supposed God who lives outside the reach of human investigation, effectively immune from supporting evidence and critical argumentation. When non-theists mock disagreements among religious folks as simply being various competitions about whose imaginary friend is better, it is this sort of God whose existence is being questioned. An evolving faith, however, tends to move from the “out there” model to the “right here” model when looking for the divine. If God’s immanence is at least as important as God’s transcendence, then we should expect to find glimmers and traces of the divine in the most mundane features of reality, although it takes a great deal of patience and imagination to perceive these traces. Persons of all faiths, in moments of doubt and uncertainty, can honestly share their faith experiences without the burden and bondage of doctrine and dogma, since in the trenches of faith, pristinely certain articles of faith tend to be irrelevant and meaningless. And atheists can join in the conversation, because trying to live a life of meaning and purpose without a safety net is a challenge for all of us, regardless of whether God is or is not a piece of the puzzle.

Faith steals upon you like dew: some days you wake and it is there. And like dew, it gets burned off in the rising sun of anxieties, ambitions, distractions. Christian Wiman

Horton in a Nutshell

Dr. Seuss was a regular in our house when my sons were young—my thirty-something sons still occasionally mention how much they both loved Green Eggs and Ham in particular. Theodor Geisel’s creatively madcap work has occasionally made it into this blog over the past four years, from the star-bellied Sneetches in an early essay on heresy

Dr. Seuss and Heresy

to the environmentally-minded Lorax during an on-campus controversy over the demise of a 150-year-old oak this past summer.

I Speak for the Trees

The most recent Dr. Seuss classic to cross my radar screen involves a gentle elephant who believes that “a person’s a person no matter how small,” a couple of kangaroos with bad attitudes, and other jungle animals dedicated to making the elephant’s life difficult. Early last summer someone at a conference Jeanne was attending told her that she should read Horton Hears a Who; by late summer a large orange-covered copy had arrived at our house. I paid no attention to it until a couple of weeks ago.

As I drove across town headed for minor oral surgery (a phrase that has turned out to be oxymoronic), I listened to an interview on Boston Public Radio with Ian McEwan, one of my favorite contemporary novelists.

Ian McEwan on Boston Public Radio

McEwan was in town on a book tour promoting his new novel, Nutshell. On the basis of the interview, I ordered the book on Amazon as soon as I got home. It is a reworking of Hamlet with a few twists, including that it is narrated by an unborn child hanging upside-down in his mother’s uterus. A la Hamlet, the unborn child knows that his mother is having an affair with his uncle and that they are plotting to kill his father. The fetus is urbane, sophisticated, listens to music and podcasts vicariously through his mother, has developed a connoisseur’s picky tastes in wine, and wonders whether there is life after the uterus. This is going to be fun.

Toward the end of the segment, the two interviewers asked McEwan to read a passage from Nutshell.

Certain artists in print or paint flourish, like babies to be, in confined spaces. Their narrow subjects may confound or disappoint some: courtship among the 18th century gentry, life beneath a sail, talking rabbits, sculpted hares, fat people in oils, dog portraits, horse portraits, portraits of aristocrats, reclining nudes, nativities by the millions, and crucifixions, assumptions, bowls of fruit, flowers in vases, and Dutch bread and cheese, with or without a knife on the side. Some give themselves in prose merely to the self. In science, too, one dedicates his life to an Albanian snail, another to a virus. Darwin gave eight years to barnacles, and in wise later life, to earthworms. The Higgs boson, a tiny thing, perhaps not even a thing, is the lifetime’s pursuit of thousands. To be bound in a nutshell, see the world in two inches of ivory, in a grain of sand. Why not, when all of literature, all of art, of human endeavor, is just a speck in the universe of possible things? And even this universe may be a speck in a multitude of actual and possible universes. So, why not be an owl poet?

In the wonderfully random way that things often connect together, this passage made me think of the book that had been lying on our coffee table for the last couple of weeks: Horton Hears a Who.

Horton the Elephant, while splashing in a pool, hears a small speck of dust talking to him. He comes to realize that the voice is coming from a small person who lives on the speck; indeed, the speck is actually a tiny planet, home to a community called Whoville, populated by microscopic creatures called Whos. The Whos know that they are vulnerable and exposed to possible harm in a dangerous world; the mayor of Whoville asks Horton for protection, which Horton happily agrees to provide. He places the Who-planet on a clover that he proceeds to carry in his trunk as carefully as a waiter carrying a tray of crystal champagne glasses. Horton has come to the same realization as the pre-born narrator of Nutshell: each existing thing is the center of its own universe of interests, desires, and concerns—but each existing thing is “bound in a nutshell,” “just a speck in the universe of possible things.”

The apparent insignificance of human existence prompted seventeenth-century mathematician and philosopher Blaise Pascal to write, “The eternal silence of these infinite spaces frightens me.” Yet in Psalm 139 we are told that

You have formed my inward parts;

You have formed me in my mother’s womb . . .

My frame was not hidden from you,

When I was made in secret

And skillfully wrought in the lowest parts of the earth.

As both Horton and McEwan’s upside-down narrator realize, everything is at the same time both insignificant and unique. The challenge is to keep both of these in mind simultaneously.

As Horton’s story proceeds, his fellow jungle animals refuse to believe that the Whos exist, believing rather that Horton is nuts. In scenes reminiscent of grade school playgrounds, various animals ridicule Horton, eventually managing to steal his Who-bearing clover and hide it from him in a large field of clovers. After a long search, Horton rescues the frightened and shaken Whos; at his prompting, they finally prove their existence to the still skeptical jungle animals by making as much collective noise as possible until everyone can hear them. Now convinced of the Whos’ existence, all the animals vow to help Horton protect the tiny community.

Each of us is both insignificant and infinitely precious, no matter what current circumstances might indicate. From within the confines of his mother’s womb, McEwan’s narrator gains insights into a world he’s not sure he will ever reach. The Whos, upon discovering just how vulnerable and fragile their world is, are discovered by someone greater than themselves, someone willing to put himself on the line again and again to preserve their special existence. It’s a wonderful retelling of a story that generations have embraced. “A person’s a person no matter how small,” after all.

Disturbing the Peace

Last Friday I attended a talk on campus by civil rights lawyer and law professor Greg Lukianoff on issues of free speech, trigger warnings, and a host of related matters on college and university campuses that are regularly in the news. He is the co-author of an article in The Atlantic a bit over a year ago that raised a lot of eyebrows and generated a lot of conversation. I wrote about it in the early weeks of my sabbatical last August:

I do not know how to teach philosophy without becoming a disturber of the peace. Baruch Spinoza

One of the lead articles in the most recent edition of The Atlantic magazine is “The Coddling of the American Mind.”

Lukianoff and Haidt: The Coddling of the American Mind

Co-authored by Greg Lukianoff and Jonathan Haidt, the teaser blurb for the article in the Table of Contents says “How a new strain of political correctness on campus is damaging higher education—and may be threatening students’ mental health.” It is an interesting read. Given Donald Trump’s current more-than-fifteen-minutes of fame, concerns about political correctness are in the news, but in this article Lukianoff and Haidt are drawing our attention to what might be called “political correctness with a twist”:

The current movement is largely about emotional well-being. . . . It presumes an extraordinary fragility of the collegiate psyche, and therefore elevates the goal of protecting students from psychological harm. The ultimate aim, it seems, is to turn campuses into “safe spaces” where young adults are shielded from words and ideas that make some uncomfortable. And more than the last, this movement seeks to punish anyone who interferes with that aim, even accidentally.

The authors’ argument is largely anecdotal, relying either on their own experiences or on recent stories and essays from various campuses across the country. There is a great deal of speculation about the causes of this perceived seismic psychological shift among students over the past couple of decades, although virtually no data is provided to substantiate many of the authors’ claims.

In the first column of the article readers are introduced to two important terms that “have risen quickly from obscurity into common campus parlance”: microaggressions and trigger warnings. Microaggressions “are small actions or word choices that seem on their face to have no malicious intent but that are thought of as a kind of violence nonetheless.” Examples provided include asking an Asian American or Latino American “Where were you born?,” because this implies that she or he is not a real American. Trigger warnings are “alerts that professors are expected to issue if something in a course might cause a strong emotional response”; examples of texts deemed as needing trigger warnings on various campuses include Virginia Woolf’s Mrs. Dalloway (suicidal inclinations) and Ovid’s Metamorphoses (sexual assault). The many examples of these and related problems in the article are chosen and presented with the clear intention of “triggering” the reader into concluding “well, that’s just stupid—political correctness, like a hydra, rears a new ugly head.” One of the authors’ primary concerns, repeated frequently throughout the article, is that such attention to words and actions that might possibly somewhere, somehow offend someone will leave students unprepared to live and work in a world that doesn’t give a crap about what makes them feel uncomfortable.

What are we doing to our students if we encourage them to develop extra-thin skin in the years just before they leave the cocoon of adult protection and enter the workforce? Would they not be better prepared to flourish if we taught them to question their own emotional reactions, and to give people the benefit of a doubt?

Even though I have twenty-five years of college teaching under my belt, my experience on college campuses is deep but narrow, given that I have taught at my current college home for twenty-one years and have shaped my teaching and professional life within the confines of its “105 acre, park-like campus.” Serious conversations about the negative power of language on students in various groups defined racially, economically, by gender or by sexual preference have been ongoing on my campus for some time now. In my own philosophy department regular, continuing, and often heated debates occur about what constitutes appropriate and inappropriate language in the classroom, in job candidate interviews, and in basic conversation with each other. What strikes some as obviously benign, scholarly, and insightful strikes others as ill-advised, insensitive, and downright offensive. That said, the tsunami described by Lukianoff and Haidt as drowning campuses nationwide has escaped my notice where I teach—at least in my classrooms. Perhaps this is because I have included this general “trigger warning” in every syllabus for every one of my courses for at least the past fifteen years:

In this course we will be considering some of the most important questions a human being can ask. Perhaps the most important feature of our considerations is learning to ask these questions clearly and precisely. Only then can possible answers be considered fairly. Although I have definite positions on the questions we will be addressing, my role as professor is not to tell you what to think. My role is rather to get you to think. Expect your assumptions to be challenged and comfortable ways of thinking to be disturbed. As the great 17th century philosopher Spinoza once said, I do not know how to teach philosophy without becoming a disturber of the peace.

During an oral final exam a couple of semesters ago a student told me that “This class really messed me up—but in a good way!” Mission accomplished.

The new fall semester is just three weeks old–here’s a bit of advice related to safe spaces and learning for the incoming freshmen:

  1. Free speech dictates that everyone has the right to their opinion, but not all opinions are equal. One of the purposes of a liberal education is to help you become skillful at using the tools of lifetime learning; some of these tools, used properly, will help you learn how to distinguish a good argument from bullshit—even when it is your own argument. I often say that a liberally educated person earns the right to have an opinion. The process of earning that right begins with realizing that your opinion is not special just because it is yours, and without challenge and analysis it means nothing with regard to whether it is true (or even a defensible position).
  2. In the life of learning, comfort is vastly overrated. Expect to encounter people, ideas, situations and expectations that are both unfamiliar and well outside your comfort zone. You should be looking for these rather than trying to avoid them. If you manage to make it through your undergraduate college career without changing any opinion, belief, perspective or attitude, then your tuition dollars have been wasted.
  3. The world of adulthood into which you are making your first, tentative forays can be a tough, nasty place. The world out there is full of people, ideas, things, and events that couldn’t care less whether they lie within your current comfort zone. As my wife would say, the world is what it is. Your years in college are not so much about your landing a well-paying job after you graduate as they are about the construction of a powerful and flexible moral and psychological framework of belief and commitment, from within which you will engage with what’s “out there” on a daily basis. It is not the world’s responsibility to provide you with comfort and security. It is your task to create and maintain a moral and psychological home for yourself in that world using all of the resources available to you, resources to sustain you on a life-long journey. By the way, you’ll be making significant renovations and additions to this home your whole life. Your professors are here to assist you in the construction of that home—good luck!

A liberal education, especially, inspires students to value struggle. By grappling with authors and ideas that demand the greatest level of intellectual intensity—and this is especially true in subjects that are difficult and uncongenial—students learn that they stretch themselves more through struggle, whether or not they win the match. Christopher Nelson

Tired of Hating People–Thoughts on the anniversary of 9/11

Everyone beyond a certain age can remember clearly what they were doing fifteen years ago tomorrow when they heard the news. I was in my college’s main cafeteria getting coffee and noticed something weird happening on the Today Show broadcast on a television hanging from the ceiling in the corner. At that point all they knew was that one of the Twin Towers was on fire, apparently because an airplane had crashed into it. I had scheduled office hours that morning, so I listened to live radio reports on NPR of the second tower being hit and the collapse of both towers. There was a surreal air to the broadcast—I wanted to believe that it wasn’t true, some sort of elaborate hoax along the lines of Orson Welles’ “War of the Worlds” broadcast many decades earlier. But it was true.

Classes were encouraged to meet and decide individually how best to deal with the day’s events. Several students in my first class of the day at 12:30 had family and friends who lived and/or worked in Manhattan—it was clear that the best thing for these students to do was to continue their frantic attempts to contact their loved ones. About half the class stayed and shared their thoughts—what they said and the nature of our conversation is difficult to recall. I know that many students (as well as many of my colleagues) were understandably angry and wanted retribution; as we gathered our things to leave about halfway through the class period I said “the one thing I’m feeling is that my best response to what has happened is to become a better person. A better teacher, husband, father, friend. That’s all I’ve got right now.”

There will be any number of retrospective reports throughout the day and evening today. Neither Jeanne nor I lost any immediate family or close friends in that day’s terrible events, although in a few cases it was only “luck” that spared someone we know well. A decade and a half removed, when I think about 9/11 and its aftermath as I have been over the past few days, I think of patriotism, wars that seem never to end, and the realization that with the swift passage of time soon I will be teaching students who, first, will not remember 9/11 and then, two or three years later, will not have been born when 9/11 occurred. But most of all, the lasting effect in this country of the terrorist attacks on that day has been a persistent atmosphere of fear and suspicion—as well as of the hatred that fear and suspicion produce.

Just about a year ago the theme of the weekly “TED Radio Hour” on NPR was “Transformation—stories and ideas about becoming a completely different person.” The first story up that day was titled “How Did the Son of a Terrorist Choose Peace?”

How did the Son of a Terrorist Choose Peace?

The story teller, Zak Ebrahim, is a peace activist and the author of The Terrorist’s Son: A Story of Choice. Ebrahim’s father, El-Sayyid Nosair, plotted for a number of years with other radicals to attack several New York City landmarks, including tunnels, synagogues, and the United Nations headquarters. Many of these planned attacks were thwarted by an FBI informant, but one of the attacks—the 1993 bombing of the World Trade Center—was not. Nosair and his fellow terrorists were convicted of placing a van filled with 1,500 pounds of explosives in the sublevel parking lot of the North Tower; the subsequent explosion killed six people and injured over a thousand others. Ebrahim was seven years old at the time of his father’s conviction and incarceration—Nosair was sentenced to life imprisonment plus fifteen years.

Ebrahim’s father had become radicalized in the early years of his son’s life; in his TED talk Ebrahim describes how shortly before his father was arrested he took Ebrahim, along with several of the men who turned out to be co-conspirators, to a shooting range for Ebrahim’s first lessons in using a rifle. Even after Nosair’s arrest, the impact of his worldview on his young son continued to be strong.

Growing up in a bigoted household, I wasn’t prepared for the real world. I had been raised to judge people based on arbitrary measurements, like a person’s race or religion. He would just talk about Jews being evil. And I would hear similar things from the men that were with him. You know, gay people being evil and them wanting to turn you gay so that you would go to hell too. And just gay people being all-around terrible people and a bad influence. And he used to say things like, a bad Muslim is better than a good non-Muslim. That’s pretty much what indoctrination is. You have authority figures around you telling you that the world is one way and you don’t get to see another perspective.

This radical indoctrination began to crumble when Ebrahim, as a teenager, began through school to be exposed to some of the people he had been taught to hate. One of his fellow group members at the National Youth Conference in Philadelphia leading up to the 2000 Presidential election was Jewish. Ebrahim did not learn that his new friend was Jewish until several days after their friendship had started developing; he says that “I felt a sense of pride in having been able to overcome a barrier that, for most of my life, I had been led to believe was insurmountable.” That summer he found a job at a Busch Gardens amusement park and for the first time had the opportunity to meet some gay people performing in one of the park’s shows. “I soon found that many were the kindest, least judgmental people I had ever met.”

One day I had a conversation with my mother about how my worldview was starting to change. And she said something to me that I will hold dear to my heart for as long as I live. She looked at me with the weary eyes of someone who’d experienced enough dogmatism to last a lifetime, and said “I’m tired of hating people.” In that instant, I realized how much negative energy it takes to hold that hatred inside of you.

On one level it’s easy to hate because a world made of “Us” vs. “Them” is simple to define and make judgments from within. On a deeper level, though, Ebrahim is right—the negative energy of fear and hate is psychologically exhausting, an exhaustion that is symptomatic of our culture. It’s almost as if it isn’t natural for humans to hate.

A few moments of attention to the level of discourse in the current Presidential campaign are sufficient to hear the tones of fear and anger that pervade our national conversation about almost everything. It is a season of intolerant and fear-mongering language. That such attitudes exist is nothing new; what is new is that we have reached the point where hatred and intolerance have found a new foothold in the public square and conversation. And even for those who seek a moderate position that avoids anger and fear, the current atmosphere is infectious. A character in Eric Bennett’s new novel A Big Enough Lie explains the dynamic well:

There are people in the world whose opinions differ from yours so much that the difference implies violence, urges it, supplies a will for it. And if you stand on the side of moderation, this implication, this will to violence, upsets you even more than the mere difference of opinion itself. Because you are complicit in it—you become complicit in extremism by loathing extremism. You are reduced by your enemy to what you despise in your enemy. The world excuses only saints and lunatics from its economy of hatred, is what you realize. Pick a side.

On this fifteenth anniversary of one of the darkest days in American history, my hope is that we as a nation, as a culture will decide, as Zak Ebrahim’s mother did, that we are tired of hating people. Tired of dividing our tiny little universes up into “Us” and “Them” as we vilify those who do not look like, act like, or believe the same as those in our self-defined groups of specialness do, often in the name of rigidly dogmatic beliefs that cannot accommodate the complex and shades-of-grey world in which we live. As Zak Ebrahim discovered, the best cure for fear and hatred is simple experience. But such experience can only happen if each of us has the courage to step outside our ossified comfort zones and dare to meet the most frightening thing in the universe—someone who is not the same as me.

Embracing the Barbarian Invasion

Every year the world is invaded by millions of tiny barbarians. We call them “children.”  Hannah Arendt

One of the wonderfully gratuitous features of my early years as a college professor was the opportunity to teach regularly with a couple of master teachers. During the first decade of my teaching career at Providence College, I taught on an interdisciplinary Honors Development of Western Civilization team every year with two such colleagues. Rodney was a teaching icon from the English department who now, a few years after his untimely passing, has a tree on campus, a seminar room in the brand new humanities building, and an annual lecture named after him. One of the most dynamic and engaging pedagogues I have ever encountered, I remember telling Jeanne shortly after meeting Rodney in the middle nineties in my first year at Providence College that “when I grow up, I want to be Rodney.”

The other member of our teaching triumvirate, Ray, is an extraordinary professor out of the History department. He is also one of the flat-out finest human beings I have ever had the privilege of knowing. This coming spring Ray and I will be teaching a colloquium together for the third time in the past four years, a class fondly referred to by students as “Nazi Civ.” I am a far better teacher and human being for having spent so many years in the classroom in the company of these outstanding colleagues.

Because we spent so much time together in and out of the classroom, the three of us got to know each other’s business over the semesters a bit more than is typical between professional colleagues. We often spoke of our children; Rodney’s and Ray’s were young adults at the time, while mine were in high school and junior high. One morning before class as we were getting coffee in the break room, Rodney was bemoaning the fact that he had returned home from work the previous day at 5:00 in the afternoon at the very same time that his son, a twenty-something who was still living at home, emerged bleary-eyed from his basement bedroom for the first time that day. As we compared notes about the shortcomings and failures of our respective offspring, Ray, whom I had always pegged as the perfect father and husband, grew reflective. “I’ve heard so many parents talk about the wonders of parenthood, how raising children is such a privilege, how their children’s growing up years were the best years of their lives,” he said. “I guess I must have missed that.” Preach it, Ray. For all of our politically correct claims about the wonders of child rearing, all parents know that Hannah Arendt’s “tiny barbarians” comment is absolutely true. Civilizing barbarians is hard work.

The word “barbarian” is from the Greek word βαρβαρος (barbaros), the term Greeks used to refer to anyone who was not Greek. To the refined but xenophobic Greek ear, the sounds coming out of a non-Greek speaker’s mouth sounded like “bar, bar, bar”—hence, “barbarian.” We would call such persons “blahblahblahrians.” The wider connotation of “barbarian” is simply someone or something that does not fit into the expected categories, abide by the accepted rules, or behave according to agreed-upon standards. That description certainly fits children and a lot more—I frequently call our dachshunds barbarians when they pee or take a dump in the middle of the floor, just as I would probably call a human being a barbarian (and worse) if they did the same thing.

And yet there is something exhilarating about having barbarians in our midst. A world without barbarians, without unfamiliar hordes pressing against the outer walls of our holy-of-holies comfort zones, is a world that eventually would stagnate into a smug status quo. I realized this past semester, as I do in varying degrees every semester, that one of the regular features of what I do as a teacher is to let the barbarians loose on the civilized yet unexamined thought processes of my students. Philosophy is an inherently barbarian discipline because its entire raison d’être is the challenge to consider that one’s most cherished beliefs might indeed need improvement, that the doors and windows to the inner sanctum might regularly be opened to allow the smelly and scary barbarians in.

Several years ago, when I was still an untenured assistant professor and should have been keeping my mouth shut, I recall being involved in a conversation about this feature of philosophy during a philosophy department meeting. We were in the process of crafting a new “mission statement” for the department, an exercise guaranteed to generate disagreement. One of the older members who had been chair of the department for a couple of decades before my arrival, a Dominican priest, proposed that our mission statement read that “The mission of the philosophy department is to teach the Truth.” Period—and make sure that it’s a capital “T” on “Truth.” I, along with several others, suggested that this would presume that we possess the Truth with a capital T, a presumption that is directly contrary to the very spirit of the philosophical enterprise. In a condescending tone (or at least so it sounded to me), another priestly colleague said “Vance, some of us around here think we have the truth,” to which I replied “And here I thought we were a philosophy department.”

So how does one keep the pursuit of truth alive without it being sidetracked into defense of the Truth? Over the past several years in my teaching and writing this question has been directed more and more toward the arena within which Truth rears its ugly head most often—religious belief. During my sabbatical semester at an ecumenical institute five years ago I described my original book project as follows: “Is it possible to live a life of human excellence, of moral focus and spiritual energy, in a world in which the transcendent is silent, in which God is arguably absent?” As I led an afternoon seminar based on my early work on this project with a dozen fellow “resident scholars,” one of them—a Lutheran pastor—asked “But Vance, don’t you have to believe something with certainty if you’re going to call yourself a Christian?” To which I replied, “I don’t know—do I?” I had been wondering that for many years, but this was the first time I had said it aloud. And it was liberating. What would a faith in which no “truth” is a “Truth,” a faith in which no cows are sacred, look like?

As I’ve dug into these questions with new energy and focus over the past few years, several matters have become clear, beginning with the fact that the transcendent is not silent after all and God is definitely not absent. They just show up in entirely different places than where we have traditionally looked for them. And I am finding that, for me at least, a vibrant faith requires little in the way of defending the Truth, but rather a willingness to welcome the divine even when wrapped in unexpected packages. As James Carse writes,

This is Christianity’s strongest feature: it tirelessly provokes its members to object to prevailing doctrines without having to abandon the faith . . . Neither Christianity nor any of the great religions has ever been able to successfully erect barriers against the dreaded barbarian incursions of fresh ideas. 

Such barbarian incursions are not to be feared or defended against. They are to be invited and welcomed. Just as the millions of tiny barbarians who invade the world every year are actually the way in which the human species is renewed and regenerated, so the regular introduction of barbarian ideas into our civilized and supposedly completed belief systems will keep those beliefs from turning into idols. What would a faith in which no “truth” is a “Truth,” a faith in which no cows are sacred, look like? It would look a lot like Faith–the real thing.

You Are Free, Therefore Choose

Each of us can point to a time (or several times) in our past when we made a decision that, in retrospect, significantly shaped our lives going forward. Such decisions for me include getting married at one month past twenty, choosing to leave law school for a master’s program in philosophy, and deciding to commit for life in my early thirties to a person whom I had known for six weeks. I could have chosen differently in each of these cases, and my life would be much different now than it is. But could I really have chosen otherwise? The answer of many “experts” from psychology, science, philosophy, and more is “no.”

I wrote about how the “experts” have gradually but inexorably come to this conclusion a week ago, describing how evidence from Darwin to neuroscience supports the conclusion that everything about me, including all of my choices, is fully determined by both biological and environmental causes beyond my control.

They Will Never Take Our Freedom

I undoubtedly, the experts admit, will continue to believe that some of my choices are free in the sense that I could have chosen otherwise, but that belief is based on an illusion. My choices may feel free, but they really aren’t. If true, the news that free will—the foundation of what most of us believe concerning morality, reward, punishment, praise, blame, and responsibility—is an illusion cannot be taken lightly. Nor, I would argue, need I, either as a philosopher or as a human being, believe that the “experts” are right about this. Free will is only an illusion if one accepts the starting assumptions that energize the argument against human beings having real free will, assumptions that include the belief that everything that exists is made of physical matter, that physical matter is governed by inexorable physical laws, and that we generally know what those laws are. These assumptions are so entrenched among the “experts” that challenging them is as uphill a battle as trying to argue that the earth is flat. But I’ll give it a shot.

I often tell my students that each person, among her or his beliefs, has a small handful of what Aristotle called “first principles.” Aristotle knew as much about cause and effect as anyone; indeed, he arguably invented our familiar system of logic that is built on the belief that we live in a world governed by cause-and-effect relationships. These relationships shape how our beliefs hang together as well. Consider the following conversation:

Me: I believe A is true.

You: Why?

Me: Because A depends on B, and I believe B is true.

You: Why?

Me: Because B depends on C, and I believe C is true.

You: Why?

There’s a pattern here. We all seek to support our beliefs by referring to connected and deeper beliefs on which they depend. There’s also a problem here, though. The chain of cause and effect has to end somewhere if we are to avoid the dreaded “infinite regress.” So eventually we get this:

Me: I believe that X is true because X depends on Z, and I believe Z is true.

You: Why? (you’re getting really annoying, by the way)

Me: Because I do.

In Aristotle’s terminology, I have just identified “Z” as one of my first principles. In order to avoid an infinite regress, eventually we arrive at a belief for which we seek no further justification than that we believe it. Such first principles vary from person to person–some common ones include “Human life is intrinsically valuable,” “Human beings are fundamentally equal,” and “Do unto others as you would have them do unto you.” For many, including myself, “Human beings have the capacity to choose freely, choices that are not entirely determined by matters outside their control” is another first principle which, if true, stands in direct opposition to what the “experts” claim the truth to be. And like it or not, no one wants to hear that a first principle is an illusion.

When I choose freely, I deliberate between available options, weigh the evidence supporting and against each, and choose the option that best satisfies my operational criteria. I cause the choice, in other words, influenced but not determined by any number of factors. This simple idea—that a human being can choose without the choice being fully determined—violates assumptions so prevalent among the “experts” that it is tantamount to heresy. And to be sure, this simple idea is indeed a radical one, for it claims that the freely choosing human being is an exception to the inexorable laws of matter, capable of starting her or his own chain of causation that would not have happened without the choice starting the chain. There are few beings in the annals of philosophy with this power. Aristotle called his creating force—what we might call “God”—the “unmoved mover” or “the first cause uncaused,” the place where the chain of causation begins (or ends if one is going backwards). In short, human beings act “in loco parentis,” with the causal power of the divine itself, when we make free choices. If one wants to go religious, it is this very creative power of free choice that the sacred texts are referring to when they claim that human beings are “created in the image of God.”

The position that truly free choices step meaningfully outside the laws of nature has been called “metaphysically peculiar” by some philosophers, simply “bullshit” by others. Free will deniers assume that any human capacity that purportedly steps outside the laws of physical matter must be an illusion, since we all know that everything is made of matter and that matter is governed by deterministic laws. To which I respond, as I often do, in the words of Shakespeare’s Hamlet: There are more things in heaven and earth . . . than are dreamt of in your philosophy. Presuming that everything in heaven and earth can be reduced to the confines of our current understanding of reality is hubris of breathtaking proportions. When a fundamental and definitive human ability is defined out of existence because of narrow assumptions, I choose to question the assumptions rather than the reality of the fundamental human ability. When answers to a question do not square with our strongest intuitions and beliefs about ourselves, change the question.

None of the above is very philosophical of me, at least not in the contemporary sense. I freely admit that human free choice might be an illusion, but I see no reason to believe so based on an argument with questionable assumptions. I choose rather to embrace the mystery and miracle of being human and believe, until better contrary evidence is provided, in keeping with the intuitions of billions of human beings, that at least some human choices make a difference—such a great difference that they make the world a different place than it would otherwise have been. And human beings are not just vehicles of that change—their choices cause that change. Maybe we just don’t know enough about reality to rule out abilities that don’t square with our current understanding of things. Maybe human beings are truly the crowning glory of creation, endowed with a spark of the divine that reveals itself in our most basic capacities. Maybe all of the above. Take your pick. As Jean-Paul Sartre used to say, “You are free, therefore choose.”


They Will Never Take Our Freedom

Although I read incessantly, I don’t read a lot of magazines. The only magazine I currently subscribe to is The Atlantic—I appreciate the excellent writing and quirky features, but don’t exactly wait by the mailbox for each monthly edition to show up. Instead, they tend to pile up on the little table next to my side of the bed, waiting to be perused when I am between authors in my novel reading. I’m currently in one of those spaces, having just finished my fourth consecutive Arturo Pérez-Reverte mystery a few days ago and not ready to start a new, large reading project just a week before the semester starts. Accordingly, I started plowing through the three summer editions of The Atlantic that have accumulated on my nightstand since June. Inside the June edition, whose cover includes two-thirds of Donald Trump’s head peeking in from the right side announcing a lead article entitled “The Mind of Donald Trump” (an oxymoron if I ever saw one), I found this: “There’s No Such Thing as Free Will—Here’s why we all may be better off believing in it anyway.”

Stephen Cave: There’s No Such Thing As Free Will

The article is by Stephen Cave, a philosopher who runs a “Center for the Future of Intelligence” at the University of Cambridge. His article is well-written and engaging—so much so that I suspect he may have had help with it. Trust me, I know whereof I speak. I have spent over twenty-five years learning to write in ways that make core philosophical issues accessible and interesting to non-philosophers—it ain’t easy. First, it’s important to clarify what philosophers usually are referring to when they use terms like “free will” or “freedom.”  Just before the final battle in his 1995 epic “Braveheart,” Mel Gibson’s William Wallace screams to the Scottish army that They may take our lives, but they’ll never take our freedom!!

That sort of freedom, the kind enshrined in this country’s founding documents as “rights” that each citizen possesses and that must not be violated or taken away, is not what philosophers mean by freedom.

Instead, “free will” refers to the human ability to choose, for a person to deliberate between options and eventually choose, then act on one of the options, all the time knowing that she or he did not have to choose that option—in other words, she or he could have chosen otherwise. This vaunted human ability to freely choose is, for many (including me), the fundamental and defining feature of what it means to be human. Stephen Cave points out that our legal systems, as well as our general beliefs concerning praise, blame, reward, punishment, and all things moral depend on our basic belief in human free will. And it is under attack—scientists, psychologists, philosophers, and just about everyone “in the know” have been trying to take it away for decades.

The “free will issue” is a go-to problem in all philosophy courses, the philosophical version of the divine foreknowledge/free will problem in theology. Just as it is impossible to make room for free choice in a world governed by an omniscient deity, so in a world where everything that occurs is governed in a cause-and-effect manner by the physical laws of matter, there is no room for true human free will. Cave points out that at least since Darwin argued in On the Origin of Species that everything about human beings—including our vaunted reasoning abilities, where the ability to choose is located—is a result of natural evolutionary processes rather than a mystical, magical, or divine “spark” that lies outside the physical laws of matter, science has reinforced the conclusion that whatever human consciousness and deliberate choice are, they are to be placed squarely in the material world. Making it impossible, of course, to squeeze out the special place we desire for choice. Our choices may “feel” free, “as if” they are up to us, but Cave pulls no punches in describing the truth about us:

The contemporary scientific image of human behavior is one of neurons firing, causing other neurons to fire, causing our thoughts and deeds, in an unbroken chain that stretches back to our birth and beyond. In principle, we are therefore completely predictable. If we could understand any individual’s brain architecture and chemistry well enough, we could, in theory, predict that individual’s response to any given stimulus with 100 percent accuracy.

Experiments by psychologists and neuroscientists have shown that the brain’s neurons fire in new patterns causing a specific action before a person consciously “chooses” to act—indicating that my conscious “choice” is an illusion that actually doesn’t cause anything. Debates rage concerning how much of a human’s actions is caused by “nature”—one’s hardwiring—and how much is caused by “nurture”—one’s environment—but there is general agreement that none of them are caused by conscious choice. We are determined through and through.

The ensuing discussion is often amusingly similar to conversations that couples considering a divorce might have: Should we tell the children, and if so, when? In the service of all truth all the time, some argue that non-philosophers and non-scientists should be made aware that free choice is an illusion and should stop believing in it. Others insist that such a revelation would be damaging to humanity’s basic commitment to morality, law, reward, punishment, and all of the other cool things that rely on our apparently mistaken belief that our choices make a difference and that we are responsible for them. My own classroom experiences indicate that it doesn’t matter. I regularly use a very simple thought experiment with my students at the beginning of the “free will” unit on the syllabus:

Suppose that in the near future a super-duper computer can read your brain and physiology sufficiently to predict the rest of your life, from large events to the minutest second-to-second thoughts and feelings, from now until you die. For a nominal fee you can purchase a printout of every event, thought, and feeling that you will experience for the rest of your life. Some printouts will be yards in length, while others will be very short. Do you want to see yours?

In a typical class of twenty-five students, no more than one or two students will say that she or he wants to see it. Why? Because even with direct proof available that the rest of my history is determined down to the minutest level—including my “free” choices—I prefer to believe that my choices make a difference in my life and in the world around me. I prefer to embrace the illusion. It appears, in other words, that human beings are determined to believe that they are not fully determined.

On this particular issue I find myself swimming against the tide. I not only believe that human beings have the ability, at least on occasion, to make choices that are not entirely determined by their biology, history, and environment—I also believe that this ability is not an illusion. It’s real. The free will/determinism issue as contemporary philosophy defines it has its current shape because virtually everyone accepts a starting assumption—everything that exists is material stuff subject to inflexible physical laws. Given that assumption, the claim that human beings have the capacity to jump outside the limitations of matter and make choices that avoid the determinism of cause and effect makes no sense. But as I often tell my students, if the answers one is getting are unacceptable, change the question. If the ability to freely choose is fundamental to what a human being is, and if our current assumptions about how reality is constructed make no room for that ability, then perhaps instead of accepting that choice is an illusion we should challenge the assumptions that forced us to this acceptance. Be watching for “What Freedom Amounts To” next week, where I’ll describe a very different way to think about human choice!


I’m Working On It

Any caring human being asks the question What is the right thing to do? on a regular basis. As a philosophy professor who teaches ethics regularly, I am aware that in the minds of many, the whole purpose of thinking systematically and rigorously about the moral life is to provide reliable and confident answers to that very question. Moral philosophers from Immanuel Kant to Iris Murdoch, from Aristotle to Alasdair MacIntyre, have provided frameworks within which to answer the question. But each framework is different, they are often incompatible with each other, and philosophers do not agree on which aspects of the moral life are most important in a moral analysis. Some focus on the consequences of an action, others stress the reasons behind one’s actions, still others argue that the character of the person making the choices and doing the actions is most important of all. In short, philosophy’s answer to “What is the right thing to do?” is, at least partially, “Which philosopher are you currently studying?”

Such disagreement among those who are supposed to be the experts often leads to the conclusion that moral certainty must be sought elsewhere—in religion, for instance. If one is convinced that God not only exists but has bothered to let human beings know the divine preferences for human behavior, then faith promises to provide a far more reliable foundation for knowing the right thing to do than anything pointy-headed philosophers might come up with. But scratch the surface of the religious option and a whole bunch of additional questions pop up. Which God? Which sacred text? What about conflicting claims within the same tradition or the same text? Those from outside the camp of religious faith consider these awkward and essentially unanswerable questions to provide strong evidence that atheism, or at least agnosticism, is the way to go, while those who cling to their faith tend to get defensive and judgmental toward those who disagree.

I have spent the past several weeks preparing my syllabus, assignments, and lesson plans for the two sections of introductory ethics that I’ll be teaching this fall. In my twenty-five years of professorhood, I have taught at least fifty sections of introductory or upper division ethics courses, and there is nothing that I enjoy more than throwing students headfirst into the deep end of the “What is the right thing to do?” pool. By the time they are eighteen years old, just about every human being has been exposed to just enough of a possible set or two of answers to this question to assume that they’re all set and have the moral life generally figured out. My job as a philosophy professor is to disturb the peace starting on the first day of the semester. There is nothing more gratifying than to hear at the end of the semester, as I did from a student at her final oral exam two or three years ago, that “this course really messed me up—but that’s a good thing!” Mission accomplished.

But I’m not just a philosophy professor—I’m a regular human being as well. My professional training and natural disposition make me generally skeptical of any claims to moral certainty—I frequently tell anyone who will listen, from the classroom to the blogosphere, that certainty is vastly overrated. (A quick search just revealed that I have used that very phrase eight times in blog essays.) But I am also a person of faith, raised in a religious tradition that supposedly equipped me with the tools (scripture, prayer, authority, guilty conscience, and more) to provide definitive guidance when wondering about what the right thing to do is. How do I make being a philosopher and a person of faith work together, or at least not be in perpetual tension? As my youngest son Justin likes to say when challenged concerning important things: I’m working on it. This very issue is the central theme of this blog—after four years of hanging my struggles out for public display, I’m working on it. My sabbatical book, which is under contract and will be out early next year, is all about this. I’m working on it. When pressed for a summary of where my working on it stands in real time, two passages come to mind.

The first is from Simone Weil, the strange and beautiful woman who, for the past two decades, has been a model for me of intellectual rigor as well as integrity to one’s faith commitments. In one of her dozens of notebooks, she writes:

The will of God. How to know it? If we make a quietness within ourselves, if we silence all desires and opinions and if with love, without formulating any words, we bind our whole soul to think “Thy will be done,” the thing which after that we feel sure we should do (even though in certain respects we may be mistaken) is the will of God. For if we ask him for bread he will not give us a stone.

There is enough in this passage to justify many essays—what currently strikes me most strongly is Weil’s conviction that the knowledge each of us seeks is within us. Philosophers and theologians err when they tell us, implicitly or explicitly, that seeking the answer to “What is the right thing to do?” is like a treasure hunt, a search that, if successful, will once and for all provide us with proper guidance in all circumstances. Rather, as both the Pentateuch and the Apostle Paul tell us, the word is within you. It is within me. Believing this requires an act of faith that, at least at first look, is astoundingly optimistic. What reasons are there to believe that the universe, God, reality, or anything, is so attuned to what Catherine of Genoa called “my deepest me” that I can trust that this deepest me holds the answers to my most pressing questions? No reasons that can fully stand up to logical scrutiny, but in matters this important perhaps logic is as overrated as certainty. I choose not to believe that my desire for bread will inevitably produce rocks, that my deepest cries will go unheard. So sue me.

Then there is a similar sentiment from Archbishop Desmond Tutu. When asked for his own insights concerning the will of God and how to know one is doing the right thing, he replied that

There is no shaft of light that comes from heaven and says to you “Okay, my son or my daughter, you are right.” You have to hold on to it by the skin of your teeth and hope that there’s going to be vindication on the other side.

Morality by the skin of your teeth. Tenacity and hope, along with faith, love, goodness, and as many other desirables that you care to list, are essential for even rudimentary answers to “What is the right thing to do?” This is a lot more challenging, but also a lot more fulfilling, than looking it up in a book or memorizing answers. I’m working on it.