Category Archives: politics

Tolerance on Steroids

What happens when a perfectly good virtue gets turned into not only the most important virtue, but in many cases the only virtue? I have come face to face with this question in the early weeks of this semester with fifty juniors and seniors in two ethics classes. I chose this past summer to organize my General Ethics course, usually a tour of several of the notable moral theories in the Western philosophical tradition (Aristotle, Kant, Mill, Sartre, etc.) that are then applied to the details of human experience, by starting instead with those messy details themselves. We find ourselves in a world of competing religious, moral, and political claims shouting at each other across various divides, claims that are both incompatible with each other and resistant to compromise. How, in the midst of diverse perspectives that too often lead to violence, are we to find a place of stability from which to plot the way forward?

I have discovered both from early class discussion and student writing reflections what I suspected—most of my young adult students have been taught for as long as they can remember that the “go-to” virtue that must be cultivated in order to wend one’s way through the minefield of incompatible beliefs and commitments is tolerance. It’s interesting that the granddaddy of virtue ethics, Aristotle, did not include tolerance in any of his lists of virtues—apparently such a virtue was not particularly useful in fourth-century BC Athens. Tolerance is also rejected by many contemporary people as a sign of weakness, of lacking commitment to one’s beliefs, and of a willingness to compromise too quickly. But for many in our culture, particularly those who might consider themselves as “liberal” in some sense, tolerance is the proposed remedy for many of the things that ail us.

Don’t get me wrong—I have no problem with tolerance as a virtue. As a matter of fact, it probably plays as regular a role in my life on a daily basis as any virtue you could name. My concern about tolerance arises from intimate familiarity with how it often works in my own life. When I remove myself from an email list on campus because I’m sick to death of being inundated with what I consider to be the often petty concerns of my colleagues, it feels like tolerance. “Let them continue emailing about anything they want,” I think. “I just don’t want to be part of it.” When a Facebook conversation wanders into areas that I find either offensive or seriously misguided, my tendency is to withdraw from the conversation rather than insert my concerns. Tolerant, right? Not really.

I find in my own life, and I suspect I’m not unusual or unique in this, that “tolerance” is an umbrella term for “whatever.” “Different strokes for different folks.” “I disagree with you but I’ll defend to the death your right to say it” (although I almost certainly wouldn’t). In other words, one of the best safeguards against being judgmental and ethnocentric, a check against our natural human tendency to negatively judge those who believe, think, and act differently than we do simply because they are believing, thinking, and acting differently than we do, turns into a placeholder for laziness and a reticence to engage even with what one most strongly disagrees with. When writing on topics related to diversity and difference, my students regularly include phrases such as “we all just need to accept people as they are” and “the world would be a better place if everyone would simply be more tolerant of differences.” Tolerance is not only the first virtue that gets mentioned in class discussion and assignments, but is often the only virtue in play. But is tolerance suitable as the primary virtue in a moral framework or ethic? And what if there are some things that must not be tolerated?

A brief but familiar story from the ancient Greek historian Herodotus provides a useful jumping off point for asking uncomfortable questions about tolerance. In his Histories, Herodotus tells the story of King Darius of Persia, a (somewhat) enlightened ruler who was fascinated by the various customs of the different groups of people from the far-flung reaches of his empire who were part of his extended court. Darius noted, for instance, that two different groups of people—the Greeks and the Callatians (a tribe of people from what is now the Indian peninsula)—had strikingly different methods of dealing with the body of a person who died in their community. The Greek practice when someone died was to burn the dead body, while the Callatian practice was to eat the dead body.

Intrigued, Darius first asked representatives of the Greek community what he would have to pay or give them, what honors he would have to bestow on them, so that the next time someone died in their community they would eat the dead body instead of burning it, as was their custom. Appalled, the Greek representatives told Darius that no amount of riches or honors could possibly convince them to do such a horrible and immoral thing. Darius also asked a similar question of the Callatians—could I convince you to burn the next dead body you have to deal with in your community rather than eating it, as is your custom? Hell no! the Callatians said, insisting that nothing could convince them to do such a disgusting and immoral thing. Herodotus’s conclusion? “Custom is king.” What a person or group of people considers to be “right” or “moral” is what they are accustomed to, the practices of their family, their community, or their culture that they have been taught since their youth. Human nature causes us not only to embrace what we are most familiar with as morally right, but also to assume that it is right for everyone.

If “custom is king” and moral values are culturally defined, then the most important attitude to cultivate, the habit most likely to put up a firewall against unwarranted projection of one’s parochial practices and values on others, is undoubtedly tolerance. As Herodotus’s story is intended to illustrate, the best answer to the question “Who is right about the best way to dispose of a dead body—the Greeks or the Callatians?” is “Both, within the parameters of their culture.” Furthermore, there is no way to step outside one’s own culturally defined moral stance and be “objective.” There is no such objective standpoint. The only proper response to differences between groups, or perhaps even between individuals, is tolerance—the habit of accepting differences without judgment.

The problem, as a student quickly pointed out in each section of my ethics course, is that tolerance as an exclusive or primary virtue is not sufficient to account for many of our strongest moral intuitions. What if, for instance, the difference is about something more serious than the difference between eating or burning a dead body? What if the difference is between a culture that practices female circumcision and our culture that does not? Is tolerance appropriate in this instance? Are we to say “wow, I’m glad I don’t live in that culture, but for them that practice is morally right”? If our intuitions say that some practices cannot be tolerated, no matter what cultures adopt them, is this because our intuitions have been shaped by our own culture or because our intuitions are resonating with a moral absolute that transcends cultural differences?

Of such questions a great General Ethics class is made. But it appears that if we raise tolerance to primary virtue status, we at the same time take any commitment to moral principles that transcend cultural differences off the table. And that may not be a price worth paying. As I told my students the other day, a moral theory that does not account for our strongest moral intuitions is like trying to cover a queen-size mattress with a twin-size fitted sheet. It covers some of what needs to be covered, but not all of it. I, for one, am not ready to tolerate a theory like that.

Tired of Hating People–Thoughts on the anniversary of 9/11

Everyone beyond a certain age can remember clearly what they were doing fifteen years ago tomorrow when they heard the news. I was in my college’s main cafeteria getting coffee and noticed something weird happening on the Today Show broadcast on a television hanging from the ceiling in the corner. At that point all they knew was that one of the Twin Towers was on fire, apparently because an airplane had crashed into it. I had scheduled office hours that morning, so I listened to live radio reports on NPR of the second tower being hit and the collapse of both towers. There was a surreal air to the broadcast—I wanted to believe that it wasn’t true, some sort of elaborate hoax along the lines of Orson Welles’ “War of the Worlds” broadcast many decades earlier. But it was true.

Classes were encouraged to meet and decide individually how best to deal with the day’s events. Several students in my first class of the day at 12:30 had family and friends who lived and/or worked in Manhattan—it was clear that the best thing for these students to do was to continue their frantic attempts to contact their loved ones. About half the class stayed and shared their thoughts—what they said and the nature of our conversation is difficult to recall. I know that many students (as well as many of my colleagues) were understandably angry and wanted retribution; as we gathered our things to leave about halfway through the class period I said “the one thing I’m feeling is that my best response to what has happened is to become a better person. A better teacher, husband, father, friend. That’s all I’ve got right now.”

There will be any number of retrospective reports throughout the day and evening today. Neither Jeanne nor I lost any immediate family or close friends in that day’s terrible events, although in a few cases it was only “luck” that spared someone we know well. A decade and a half removed, when I think about 9/11 and its aftermath as I have been over the past few days, I think of patriotism, wars that seem never to end, and the realization that with the swift passage of time soon I will be teaching students who, first, will not remember 9/11 and then, two or three years later, will not have been born when 9/11 occurred. But most of all, the lasting effect in this country of the terrorist attacks on that day has been a persistent atmosphere of fear and suspicion—as well as of the hatred that fear and suspicion produce.

Just about a year ago the theme of the weekly “TED Radio Hour” on NPR was “Transformation—stories and ideas about becoming a completely different person.” The first story up that day was titled “How Did the Son of a Terrorist Choose Peace?”

How Did the Son of a Terrorist Choose Peace?

The storyteller, Zak Ebrahim, is a peace activist and the author of The Terrorist’s Son: A Story of Choice. Ebrahim’s father, El-Sayyid Nosair, plotted for a number of years with other radicals to attack New York City landmarks, including tunnels, synagogues, and the United Nations headquarters. Many of these planned attacks were thwarted by an FBI informant, but one of the attacks—the 1993 bombing of the World Trade Center—was not. Nosair and his fellow terrorists were convicted of placing a van filled with 1,500 pounds of explosives in the sublevel parking lot of the North Tower; the subsequent explosion killed six people and injured over a thousand others. Ebrahim was seven years old at the time of his father’s conviction and incarceration—Nosair was sentenced to life imprisonment plus fifteen years.

Ebrahim’s father had become radicalized in the early years of his son’s life; in his TED talk Ebrahim describes how shortly before his father was arrested he took Ebrahim, along with several of the men who turned out to be co-conspirators, to a shooting range for Ebrahim’s first lessons in using a rifle. Even after Nosair’s arrest, the impact of his worldview on his young son continued to be strong.

Growing up in a bigoted household, I wasn’t prepared for the real world. I had been raised to judge people based on arbitrary measurements, like a person’s race or religion. He would just talk about Jews being evil. And I would hear similar things from the men that were with him. You know, gay people being evil and them wanting to turn you gay so that you would go to hell too. And just gay people being all-around terrible people and a bad influence. And he used to say things like, a bad Muslim is better than a good non-Muslim. That’s pretty much what indoctrination is. You have authority figures around you telling you that the world is one way and you don’t get to see another perspective.

This radical indoctrination began to crumble when Ebrahim, as a teenager, began through school to be exposed to some of the people he had been taught to hate. One of his fellow group members at the National Youth Conference in Philadelphia leading up to the 2000 Presidential election was Jewish. Ebrahim did not learn that his new friend was Jewish until several days after their friendship had started developing; he says that “I felt a sense of pride in having been able to overcome a barrier that, for most of my life, I had been led to believe was insurmountable.” That summer he found a job at a Busch Gardens amusement park and for the first time had the opportunity to meet some gay people performing in one of the park’s shows. “I soon found that many were the kindest, least judgmental people I had ever met.”

One day I had a conversation with my mother about how my worldview was starting to change. And she said something to me that I will hold dear to my heart for as long as I live. She looked at me with the weary eyes of someone who’d experienced enough dogmatism to last a lifetime, and said, “I’m tired of hating people.” In that instant, I realized how much negative energy it takes to hold that hatred inside of you.

On one level it’s easy to hate because a world made of “Us” vs. “Them” is simple to define and make judgments from within. On a deeper level, though, Ebrahim is right—the negative energy of fear and hate is psychologically exhausting, an exhaustion that is symptomatic of our culture. It’s almost as if it isn’t natural for humans to hate.

A few moments of attention to the level of discourse in the current Presidential campaign are sufficient to hear the tones of fear and anger that pervade our national conversation about almost everything. It is a season of intolerant and fear-mongering language. That such attitudes exist is nothing new; what is new is that we have reached the point where hatred and intolerance have found a new foothold in the public square and conversation. And even for those who seek a moderate position that avoids anger and fear, the current atmosphere is infectious. A character in Eric Bennett’s new novel A Big Enough Lie explains the dynamic well:

There are people in the world whose opinions differ from yours so much that the difference implies violence, urges it, supplies a will for it. And if you stand on the side of moderation, this implication, this will to violence, upsets you even more than the mere difference of opinion itself. Because you are complicit in it—you become complicit in extremism by loathing extremism. You are reduced by your enemy to what you despise in your enemy. The world excuses only saints and lunatics from its economy of hatred, is what you realize. Pick a side.

On this fifteenth anniversary of one of the darkest days in American history, my hope is that we as a nation, as a culture will decide, as Zak Ebrahim’s mother did, that we are tired of hating people. Tired of dividing our tiny little universes up into “Us” and “Them” as we vilify those who do not look like, act like, or believe the same as those in our self-defined groups of specialness do, often in the name of rigidly dogmatic beliefs that cannot accommodate the complex and shades-of-grey world in which we live. As Zak Ebrahim discovered, the best cure for fear and hatred is simple experience. But such experience can only happen if each of us has the courage to step outside our ossified comfort zones and dare to meet the most frightening thing in the universe—someone who is not the same as me.

Good Morning, Psalms

Last Thursday, in just our second class of the semester, I had the opportunity to introduce my ethics students to the master of all things ethical. The key to Aristotle’s understanding of the life of human flourishing is that such a life depends on the formation of the best habits—the virtues—to guide one’s life. Aristotle conceived of the life of freedom and moral excellence as a life constructed out of the virtues, good habits that, when cultivated, incline a person to do the right thing as a matter of developed character rather than conscious choice. Habits are established by repetition and, once formed, are often very difficult to change. Accordingly, one should take great care that one’s moral habits are the right ones (virtues) and not the wrong ones (vices), since the wrong habits, once entrenched, will be next to impossible to replace with better ones.

I have taught Aristotle’s ethics for many years and believe that although Alfred North Whitehead was probably correct when he said that all of philosophy is a series of footnotes to Plato, the best thinking about ethics begins with Aristotle. And his insights concerning the importance of habits are relevant beyond the ethical realm. I find myself in the best physical shape of my life now in my early sixties because several decades ago my grudging daily trips to the gym somehow turned into a habit that I no longer had to talk myself into. Reading psalms with Benedictine monks in Minnesota three times a day during my 2009 sabbatical established a habit of reading the three or four psalms appointed for each day in the Book of Common Prayer that continued for several years after my sabbatical ended. Between my alarm at 5:15 AM and getting to the gym by its 6:00 opening time I read the day’s psalter aloud (or murmured it, lest I awaken the dogs and Jeanne). I am convinced that this simple habit both helped transfer important changes in my life from sabbatical to real life, and also contributed to the preservation of my sanity as I juggled full-time teaching with the additional full-time duties of running a large academic program for four years.

But then I lost the habit, under the strangest of circumstances. My next sabbatical arrived, and with the prospect of unlimited time to rest, re-center, read, and write in front of me, somehow the daily regimen of early morning psalm reading fell by the wayside. I no longer needed to arise at 5:15, I rode my new bicycle obsessively instead of daily workouts at the gym, I applied myself energetically to my sabbatical writing project, and somehow my simple ten to fifteen minutes alone with the psalms every morning didn’t make the cut. I made no conscious decision to end the habit—I just did. If Aristotle is correct in saying that well-established bad habits are very difficult to break, it turned out—in my case at least—that good habits can be broken very easily. I didn’t even realize consciously that my psalm reading habit had gone by the wayside for several weeks; once I noticed its absence, I made a few half-hearted attempts to start again over the following months. But they didn’t take.

I returned to the classroom for the first time in fifteen months a week ago, and decided that along with a return to a 5:15 wake-up call, I would attempt to re-establish my psalm reading habit. With only a week under my belt, the returns are promising; coming back to the psalms has been like becoming reacquainted with very wise friends who have been away for a while. My renewed acquaintances include:

Monday, August 29: Psalm 139

The opening psalm on the list for my first day back was one that, depending on my mood and what’s going on in my life, has been either very disturbing or deeply comforting.

O LORD, you have searched me and known me.

You know when I sit down and when I rise up;

you discern my thoughts from far away . . .

Where can I go from your spirit? Or where can I flee from your presence?

If I ascend to heaven, you are there; if I make my bed in Sheol, you are there.

If I take the wings of the morning and settle at the farthest limits of the sea,

even there your hand shall lead me, and your right hand shall hold me fast . . .

For it was you who created my being, knit me together in my mother’s womb.

Sometimes this Psalm reads like a description of a divine stalker, but more often the mere improbability that the creator of the universe cares about lil’ ole me is overwhelming. If I were inclined to be an atheist, or at least an agnostic, it would probably be because of this very point—the idea that God cares about human beings in any specific sense at all. Most of what we observe and experience screams against it. Our obvious insignificance screams against it.

Psalm 139 offers hope in the face of insignificance. Perhaps there is one place where I do not need to be an impostor or be overwhelmed by my insignificance, a place where I am known better than I know myself and am valued more highly than I could ever manufacture. The other day at convocation, New York Times Pulitzer Prize-winning journalist Nicholas Kristof told the hundreds of students and faculty in attendance that at those times when one feels insignificant, like a single drop of water in a very large bucket, a drop that can’t possibly make a difference, we should remember that buckets are filled one drop of water at a time.

Tuesday, August 30: Psalm 146

The final entries in the collection of 150 poems are praises of various sorts—noon prayers at St. John’s Abbey in Minnesota, the place where I first learned to inhabit these ancient poems, include one of the final five psalms in rotation. I always looked forward to Psalm 146, which for me summarizes what God—and therefore those who profess to follow God—cares about the most.

It is the Lord who keeps faith forever, who is just to those who are oppressed.

It is God who gives bread to the hungry, the Lord, who sets prisoners free,

the Lord, who gives sight to the blind, who raises up those who are bowed down,

the Lord, who protects the stranger and upholds the widow and orphan.

When John the Baptist sends some of his followers from his prison cell to ask Jesus whether Jesus is the Messiah, “the one who is to come, or should we look for another?” Jesus responds in the language of Psalm 146. Tell John that the blind see, the lame walk, the hungry are being fed, strangers are being welcomed, and those imprisoned are being set free. That’s how you can tell when the divine is in the house, when human beings are in tune with what is greater than themselves. Imagine how different our nation, our world, would be if the above lines were the defining touchstone for success.

Thursday, September 1: Psalm 1

The compilers of the Psalms chose to kick things off with a description of happy people, those who “delight in the law of the Lord.”

They are like a tree that is planted beside the flowing waters,

That yields its fruit in due season and whose leaves shall never fade;

and all that they do shall prosper.

I have always been fascinated with trees, but have come to love them in a deeper way over the past several years. Their stability, rootedness, and beauty have become iconic for me. I write about trees frequently in this blog: within the past few months I have written about Tolkien’s Ents, arboreal survival strategies, oaks of righteousness, and how the removal of a 150+ year old tree on campus this summer was traumatic for all involved. In an interview with Krista Tippett, theologian Ellen Davis said that “anything in our world now that slows us down is to be valued and maybe as a gift and even a calling from God.” The fact that the first analogy in the Psalms for the person who “meditates on God’s law day and night” is a tree silently proceeding through its seasons of fruitfulness and prosperity confirms Davis’ insight. I may not meditate on God’s law day and night, but fifteen minutes a day is doable.

Nice Work If You Can Get It

Not long ago, I read a Huffington Post article summarizing the results of what a bunch of anthropologists found out concerning the daily work habits of university faculty.

What Do Professors Do All Day?

After spending two weeks with a non-random sample of sixteen faculty of different ranks at Boise State University, the researchers found that on average the faculty worked 51 hours during the work week and 10 hours on the weekend at a host of different tasks. Amazing. It took a study to find out that teachers have full-time jobs after all and actually do work-related things on the weekend (although the numbers for both week and weekend seem a bit low). I’m wondering how to square these remarkable results with an article I read a couple of years ago claiming that “University Professor” topped the list of “Least Stressful Jobs of 2013.” Of course I had to respond . . .

Those who read this blog regularly or even occasionally know that I believe I have the greatest job in the world. For those who are finding this out for the first time, let me repeat—I have the greatest job in the world. As a matter of fact, it is so great that I don’t consider it to be a job at all. For me, teaching is a vocation rather than a job, something that I truly believe I am called and was born to do. I raise eyebrows occasionally on the campus of the Catholic college at which I teach when I say that I consider teaching to be a vocation for me just as much as being a priest is the vocation of the guys who walk around campus in white robes. But even though I love what I do to an almost unhealthy extent, I was taken aback when I learned from a colleague via Facebook that “University Professor” is listed by CareerCast.com at number one in its top ten list of “Least Stressful Jobs of 2013.”

The Ten Least Stressful Jobs of 2013

Really? Or as one of my colleagues commented on Facebook, “Bullshit!!! [pardon my advanced degree French].” I guess I must have failed to notice how non-stressful my job is during my 25-year career as a university professor.

Every person who teaches in higher education has a virtual file full of stories about how difficult it is to get a non-academic friend or family member to understand exactly what is involved with being a college professor. Most difficult is getting someone to understand that this is not a part-time job. For instance, Cousin Bob finds out that the typical teaching load for a faculty member at a teaching college or university is three or four three-credit courses per semester (or perhaps five at a two-year college), meaning that the faculty member is in class at most fifteen hours per week. Must be nice to make a full-time salary at a part-time job! Cousin Bob remarks. Early in my career I often patiently pointed out to the Cousin Bobs in my family that a good rule of thumb is that a teacher spends three to four hours outside of class (preparation, reading, grading, meeting with students, etc.) for every hour spent in class. “Really?” Cousin Bob replies. But he clearly is not convinced, since as we all know, easy working hours is the main reason that a person becomes a teacher.

Then, of course, Cousin Bob wonders about all of the weeks of the calendar year that I am not in the classroom. Christmas break, Spring break, the summer—teachers apparently get at least twenty weeks per year off. Must be nice to make a full-time salary at a part-time job! With what begins to feel like infinite patience, I explain to Cousin Bob that teaching is only one part of a university professor’s job. In order to advance through the ranks of promotion and, more importantly, in order to get tenure, one must research and publish on a regular basis. For most college faculty, the time and focus required for this aspect of the profession is not available during the semester, so the “breaks” are all about research, writing, and praying for publication. But I’m not in the classroom, right? Must be nice to make a full-time salary at a part-time job! You get the idea. A colleague once told me about his frustrated response to a relative after one too many such conversations. Upon hearing Must be nice to make a full-time salary at a part-time job! one too many times, my colleague replied “It actually is really nice. If you were smart enough you could have a job like that too.”

CareerCast’s explanation of why University Professor is the least stressful job of 2013 has more than a whiff of Cousin Bob behind it, just expressed in a slightly less aggressive fashion. For instance, the article explains that

University professors are at the pinnacle of the education field. Their students are largely those who choose the classes they attend, and thus want to be in class. Unlike elementary and secondary educators, the performance of college professors isn’t evaluated based on standardized tests. University professors also have the opportunity to earn tenure, which guarantees lifetime employment.

A full response would require something more like a book chapter than a blog post. Suffice it to say that the author of the article (1) has apparently never heard of core requirements if he thinks that college professors face “students . . . who choose the classes they attend, and thus want to be in class,” (2) is not aware that despite the (usual) lack of standardized tests, college teachers are evaluated by their peers, answerable for the results of student evaluations, and are under regular scrutiny in every aspect of what they do, and (3) needs to learn something about the tenure process (assuming that the faculty member is fortunate enough to be teaching at an institution that uses the tenure process).

Tenure. Such job security is certainly unusual in today’s job market, and tenure is an attractive “perk” of the academic life. Once one earns it, that is. Tenure-track positions are hard to come by in academia, more and more so as many institutions opt to hire year-to-year adjunct professors or special lecturers, then proceed to treat them as well-dressed slave labor (don’t get me started on that one). Should a teacher be fortunate enough to land a tenure-track position in today’s heavily buyer-skewed academic marketplace, the stress she or he will experience in the next several years leading to the tenure decision will be sufficient to last a lifetime. As is undoubtedly the case in many workplace environments, the tenure decision is often as much or more about internal campus politics as it is about the qualifications of the candidate and those things that she or he can control. “The opportunity to earn tenure” is indeed that—an opportunity that, unfortunately, for many talented and qualified teachers will never be available.

Then there’s the money. The article author points out that

Harvard University pays full-time professors $198,400, with a 7:1 professor-to-student ratio, while University of Chicago professors receive $197,800 per year with a 6:1 ratio. Among public universities, the University of California at Los Angeles (UCLA) is highest paying, with an average wage of $162,600 for its full-time staff.

Really? All of them? At all levels? In all disciplines? Such “statistics” are useless without context, about as useless as telling a lawyer in a public defender’s office working 80-100 hours per week and struggling to make ends meet that the senior lawyers in the top firms on Wall Street often make seven figures annually. Here’s an anecdote from the real world. At least a dozen years into my teaching career, still paying off the loans from ten years of college in order to earn the PhD required to teach at the college level in my discipline, it occurred to me one day that the second semester seniors in my Business Ethics class, most of whom would be entering the work force shortly after graduation as entry-level persons with undergraduate business degrees, would be starting at a monthly salary noticeably higher than my own. As I once told a younger colleague when I was chair of my department, in response to his complaints about how little he was being paid, “if you became a teacher for the money, you’re a moron.”

I have reached the level of experience and rank (a tenured, full professor) at which one might expect stress levels to decrease and ultimately disappear. But persons such as I are those who are tapped, appropriately, to commit themselves significantly to the third leg of the academic stool (along with teaching and research): service. After four years as chair of a department of 25 faculty and a recently completed four-year stint as the director of the core academic program at my college, responsible for 80 faculty and upwards of 1700 students at any given time, I realize that one sort of stress just gets replaced by another.

And actually that’s fine, since it is all part of the vocation I was born to inhabit. There are many attractive features to the life of a university professor. I can think of no other profession in which one’s creativity is required more often or in which one has more autonomy and flexibility. But it is anything but stress-free. A teacher never leaves the office. Your work goes with you everywhere. I realized at one point early one December that, other than Thanksgiving Day, I literally had not had a day off since the middle of August. This is why I have recommended the teaching profession to no more than a half-dozen of my best students in more than twenty years of teaching. If you are looking for a profession that will fit nicely with your family obligations and other interests, don’t become a teacher. If you want to make a living wage at a stimulating 40-45 hour per week job, don’t become a teacher. If you want to “work to live” rather than “live to work,” you probably should not become a teacher. If you think of teaching as one among many equally interesting career possibilities, don’t become a teacher. But if you are incurably obsessed with the life of learning, if the dynamic of new ideas exhilarates you, if you suspect that you might have the heart of a teacher and nothing else will fulfill you, then this highly stressful but highly rewarding vocation might just be for you. It’s nice work if you can get it.


They Will Never Take Our Freedom

Although I read incessantly, I don’t read a lot of magazines. The only magazine I currently subscribe to is The Atlantic—I appreciate the excellent writing and quirky features, but don’t exactly wait by the mailbox for each monthly edition to show up. Instead, they tend to pile up on the little table next to my side of the bed, waiting to be perused when I am between authors in my novel reading. I’m currently in one of those spaces, having just finished my fourth consecutive Arturo Pérez-Reverte mystery a few days ago and not ready to start a new, large reading project just a week before the semester starts. Accordingly, I started plowing through the three summer editions of The Atlantic that have accumulated on my nightstand since June. Inside the June edition, whose cover includes two-thirds of Donald Trump’s head peeking in from the right side announcing a lead article entitled “The Mind of Donald Trump” (an oxymoron if I ever saw one), I found this: “There’s No Such Thing as Free Will—Here’s why we all may be better off believing in it anyway.”

Stephen Cave: There’s No Such Thing As Free Will

The article is by Stephen Cave, a philosopher who runs a “Center for the Future of Intelligence” at the University of Cambridge. His article is well-written and engaging—so much so that I suspect he may have had help with it. Trust me, I know whereof I speak. I have spent over twenty-five years learning to write in ways that make core philosophical issues accessible and interesting to non-philosophers—it ain’t easy. First, it’s important to clarify what philosophers usually are referring to when they use terms like “free will” or “freedom.” Just before the final battle in his 1995 epic “Braveheart,” Mel Gibson’s William Wallace screams to the Scottish army, “They may take our lives, but they’ll never take our freedom!”

That sort of freedom, the kind enshrined in this country’s founding documents as “rights” that each citizen possesses and that must not be violated or taken away, is not what philosophers mean by freedom.

Instead, “free will” refers to the human ability to choose, for a person to deliberate between options and eventually choose, then act on one of the options, all the time knowing that she or he did not have to choose that option—in other words, she or he could have chosen otherwise. This vaunted human ability to freely choose is, for many (including me), the fundamental and defining feature of what it means to be human. Stephen Cave points out that our legal systems, as well as our general beliefs concerning praise, blame, reward, punishment, and all things moral depend on our basic belief in human free will. And it is under attack—scientists, psychologists, philosophers, and just about everyone “in the know” have been trying to take it away for decades.

The “free will issue” is a go-to problem in all philosophy courses, the philosophical version of the divine foreknowledge/free will problem in theology. Just as it is impossible to make room for free choice in a world governed by an omniscient deity, so in a world where everything that occurs is governed in a cause-and-effect manner by the physical laws of matter, there is no room for true human free will. Cave points out that at least since Darwin argued in The Origin of Species that everything about human beings—including our vaunted reasoning abilities, where the ability to choose is located—is a result of natural evolutionary processes rather than a mystical, magical, or divine “spark” that lies outside the physical laws of matter, science has reinforced the conclusion that whatever human consciousness and deliberate choice are, they are to be placed squarely in the material world. This makes it impossible, of course, to squeeze out the special place we desire for choice. Our choices may “feel” free, “as if” they are up to us, but Cave pulls no punches in describing the truth about us:

The contemporary scientific image of human behavior is one of neurons firing, causing other neurons to fire, causing our thoughts and deeds, in an unbroken chain that stretches back to our birth and beyond. In principle, we are therefore completely predictable. If we could understand any individual’s brain architecture and chemistry well enough, we could, in theory, predict that individual’s response to any given stimulus with 100 percent accuracy.

Experiments by psychologists and neuroscientists have shown that the brain’s neurons fire in new patterns causing a specific action before a person consciously “chooses” to act—indicating that my conscious “choice” is an illusion that actually doesn’t cause anything. Debates rage concerning how much a human’s actions are caused by “nature”—one’s hardwiring—and how much is caused by “nurture”—one’s environment—but there is general agreement that none of them are caused by conscious choice. We are determined through and through.

The ensuing discussion is often amusingly similar to conversations that couples considering a divorce might have: Should we tell the children, and if so, when? In the service of all truth all the time, some argue that non-philosophers and non-scientists should be made aware that free choice is an illusion and should stop believing in it. Others insist that such a revelation would damage basic human commitments to morality, law, reward, punishment, and all of the other cool things that rely on our apparently mistaken belief that our choices make a difference and that we are responsible for them. My own classroom experiences indicate that it doesn’t matter. I regularly use a very simple thought experiment with my students at the beginning of the “free will” unit on the syllabus:

Suppose that in the near future a super-duper computer can read your brain and physiology sufficiently to predict the rest of your life, from large events to the minutest second-to-second thoughts and feelings, from now until you die. For a nominal fee you can purchase a printout of every event, thought, and feeling that you will experience for the rest of your life. Some printouts will be yards in length, while others will be very short. Do you want to see yours?

In a typical class of twenty-five students, no more than one or two students will say that she or he wants to see it. Why? Because even with direct proof available that the rest of my history is determined down to the minutest level—including my “free” choices—I prefer to believe that my choices make a difference in my life and in the world around me. I prefer to embrace the illusion. It appears, in other words, that human beings are determined to believe that they are not fully determined.

On this particular issue I find myself swimming against the tide. I not only believe that human beings have the ability, at least on occasion, to make choices that are not entirely determined by their biology, history, and environment—I also believe that this ability is not an illusion. It’s real. The free will/determinism issue as contemporary philosophy defines it has its current shape because virtually everyone accepts a starting assumption—everything that exists is material stuff subject to inflexible physical laws. Given that assumption, the claim that human beings have the capacity to jump outside the limitations of matter and make choices that avoid the determinism of cause and effect makes no sense. But as I often tell my students, if the answers one is getting are unacceptable, change the question. If the ability to freely choose is fundamental to what a human being is, and if our current assumptions about how reality is constructed make no room for that ability, then perhaps instead of accepting that choice is an illusion we should challenge the assumptions that forced us to this acceptance. Be watching for “What Freedom Amounts To” next week, where I’ll describe a very different way to think about human choice!

They’re Baaack . . . .

We are in the business of believing in, and promoting, things that don’t yet exist. Julie Schumacher, Dear Committee Members

After the most beautiful June in my Rhode Island memory, July was warm and August has been abnormally hot. I hate heat—I will take zero over ninety-five degrees any day of the week. But August is one of my favorite months because I am a college professor. August is very quiet on campus—no classes, few hosted events, few visitors other than prospective students and their parents taking tours. I can work out at the gym without competing for equipment and enjoy observing the various construction sites on campus without dealing with tons of people. It’s all a wonderful period of solitude; but just as Louis XV reportedly commented in anticipation of the Revolution that would cost his grandson Louis XVI his head, August says to the academic “après moi le déluge.” Before long, the floodgates will open. They’re baaack . . .!

Actually, this is great news. I can’t wait until next Monday when classes start; after a year-long sabbatical, I’m more than ready to be back in the classroom. I’m not one of those professors who regularly moan and complain about their students; they are the reason I am in the profession to begin with, they keep me young (at heart if not in outward appearance), and let’s be practical: for an academic, a world sans students would be a world sans paycheck. I got a fictional look at the dark side of academic attitudes about students, fellow faculty, administrators, and reality in general when reading Julie Schumacher’s Dear Committee Members not long ago. Billed by Amazon as “A novel that puts the pissed back in epistolary,” it follows Jason Fitger, Professor of Creative Writing at Payne University, from the beginning to the end of an academic year through the exclusive lens of letters and emails of reference and support written for current and former students, colleagues, and acquaintances from graduate school days. Billed by reviewers as “funny as hell,” “hilarious,” “fun-as-heck,” and “funny and lacerating,” the novel struck me as more sad than anything else, although I smiled occasionally. Sad because I know that the never-ending bureaucratic and pedagogical challenges of the academic life can turn someone into a jaded, sarcastic, and cynical curmudgeon like Jason Fitger (he’s four years younger than I am), and even sadder because it doesn’t have to be that way.

Fitger is the graduate product of what he calls the “Seminar,” a graduate writing program that sounds a great deal like the Iowa Writers’ Workshop (which produced a few of my friends and colleagues—and none of them are like Fitger). After a reasonably successful first novel, followed by a couple more that fell still-born from the press, he finds himself in the final decade or so of a mediocre career at a mediocre public university in an English department whose infrastructure, both psychological and physical, is falling apart. As lines are closed down and the plumbing in the men’s bathroom fails, the Economics Department on the floor directly above is being treated to a complete upgrade of facilities. Fitger is the embodiment of what is actually happening across the country in more universities and colleges than I care to consider—the neglect and downsizing of the humanities while departments and programs perceived as job-producers and money-makers receive the lion’s share of funding and attention.

I am extraordinarily fortunate and privileged to have spent the last twenty-two years (and hopefully the next fifteen or twenty) at an institution that consciously attempts to swim against that tide. Accordingly, Fitger is for me the fictional embodiment of what could have happened had I not been as fortunate. I have few cynical or negative bones in my body, but some might have been created had I lived the professional life of Jason Fitger. Of course, many of the most problematic aspects of Fitger’s life are self-created. He is hated across campus for various justifiable reasons, his marriage to a fellow professor on campus fell apart when she became aware of his continuing infidelities with an administrative assistant, and his affair with the administrative assistant ends when he accidentally hits the “Send All” button on an email intended for one individual, in which he expresses the sexual attraction for his ex-wife that remains strong even after five years of divorce.

I don’t know anyone on campus like Jason Fitger (although he might be lurking somewhere). But hidden like buried treasure underneath page upon page of sarcasm and nastiness are occasional and brief homages to the academic life that I was surprised and pleased to find. In an email to a former colleague from the Seminar, an epistle drenched in anger, regret, and bitterness, Fitger steps back for a moment.

The stately academic career featuring black-robed professors striding confidently across the campus square is already fading; and, though I’ve often railed against its eccentricities, I want to proclaim here that I believe our mission and our way of life to have been admirable and lovely, steeped with purpose and worth defending.

Amen to that. I only get to wear black robes a couple of times a year—although it would be sort of cool to wear them all of the time—and I agree that the eccentricities of the academic process and of academics themselves can be a pain in the ass. But what a wonderful profession. It’s the best thing going, not because of money, fame, or notoriety (which come to only a minuscule percentage), but because of the privilege of making a living in the midst of the most exciting environment imaginable—the life of continual learning. As I noted in my remarks at the dedication of our new humanities building almost three years ago, the Apostle Paul’s words ring true at this time of year for every academic: Old things are passed away; behold, all things are become new. And Jason Fitger knows it, in spite of himself:

There is nothing more promising or hopeful than the start of the academic cycle: another chance for self-improvement, for putting into practice what one has learned—or failed to learn—during the previous year.

They’re baaack . . . and I can’t wait. Bring it on.

One Nation, Under God

I’m troubled by those who say so much about what God says so little, and so little about what God says so much. William Barber

In early 2014, during an interview with the Global Evangelism Television Network, former Texas congressman Tom Delay had the following diagnosis concerning various problems facing the United States:

I think we got off the track when we allowed our government to become a secular government. When we stopped realizing that God created this nation, that he wrote the Constitution, that it’s based on biblical principles.

Tom Delay interview

Sigh. I vaguely remember Delay saying something like this but dismissed it as yet another ludicrous statement from any number of elected officials from the South to whom I pay no attention. But when I bumped into an article about the interview the other day on my Facebook news feed, I decided it would be entertaining to put the link on my wall, commenting only “And I always thought that God wrote the Ten Commandments.” Sure enough, in short order the comments started rolling in, none of them complimentary. Some suggested that Delay had been dropped on his head several times as a baby, others drew attention to the legal problems that led to Delay’s leaving Congress a decade ago. One person suggested that if God wrote the Constitution, there are some inexplicable passages.

  • Interesting that God put in the part about the government making no law about an establishment of religion, and the part about never having a religious test for any office or public trust.

No need for Mr. Khan to lend that guy a copy of the Constitution—he seems to be familiar with it. Several others used the strategy I often use when pushing back against ideas such as Delay’s: looking at the historical evidence.

  • Delay has no clue about the confessional chaos that existed at that convention. Tell me with a straight face that a Catholic is going to trust an Anglican, or a Puritan is going to trust a Deist, to write laws for everyone?
    • Me: I thought everyone trusted Anglicans!
      • Only if you’re serving my ale, my friend . . .
    • At one point, Ben Franklin said “Hey, we forgot to open this convention with a prayer! We better correct that!” The motion wasn’t carried. Madison wrote that everybody was kind of annoyed.
    • Thomas Jefferson, for one, was a Deist. Delay wouldn’t know that from apple butter. And James Madison was no church lady. These were men of the Enlightenment who had a distrust of theocrats and religious governance and its bloody ruin in Europe’s Hundred Years’ War.
    • “The Government of the United States of America is not in any sense founded in the Christian religion.” –1797 Treaty of Tripoli signed by Founding Father John Adams.

To get a sense of the chaos, compromise, and principled hard work that went into the shaping of our Constitution, read James Madison’s Notes on the Federal Convention of 1787. It should be required reading for all citizens of the United States. Franklin, Washington, and Jefferson were all Deists, as were many other Founding Fathers; the tenets of Deism are pretty simple: there is a creating force we call God, what we do in this life matters, and we will be held responsible in some way for it. When one takes the traditional Christian God and strips away those characteristics that cannot be argued for using reason and logic alone, one gets the Deist God—a God too disengaged from the everyday workings of creation to get involved with writing a founding document for a bunch of successful rebels.

So why do so many people, particularly various sorts of Protestants, insist in the face of a massive amount of contrary evidence that this country was not founded on secular principles but rather essentially as a theocracy? A comment from my cousin was most insightful.

  • Unfortunately, it is very common for fundamentalist protestants (the “born-again” crowd who simply call themselves “Christians” as if they were the only ones) to view all mentions of God or “the Almighty” within their framework only. For that reason, they actually believe that the US was founded as a Christian nation because of oblique references to the Almighty or the Creator in our founding documents. Given my fundamentalist background, I know whereof I speak.
    • Me: We were raised as insiders!
      • You’ve got that right!

From seventh grade through high school, my cousin and I virtually lived in each other’s houses. We experienced together—and evolved from—exactly the sort of Christianity that sharply divides those who are in from those who are out, a religious form of the nativism that frequently rears its ugly head in our national discourse. This type of Christianity separates those favored by God from those who are not, just as nativism separates “us” from “them” in various ways. Tom Delay has simply taken the additional step of merging these two forms of exclusivity together.

Politicians often compete with each other as they seek to establish who is more “Christian” than their opponent. During my lifetime it is the Republican party that has owned the mantle of “most Christian,” particularly since the rise of the Moral Majority during the 1980s. But during our current election cycle, it feels like an alternative universe. The Republican nominee for President said nothing about God, faith, or religious values during his acceptance speech at their convention, while the Democratic nominee referred explicitly to how her Methodist upbringing has shaped her life of public service. The patriotic energy of the Democratic convention was reminiscent of a Republican convention in any other Presidential election cycle.

And then there was this. William Barber, the head of North Carolina’s NAACP and leader of that state’s Moral Monday movement, demonstrated clearly in his ten-minute speech at the Democratic convention how it is possible to bring one’s faith-based values into the world without insisting that everyone must sign on to a particular religious worldview.

One person commented on YouTube: “I’m an atheist, but I’ll go to service every week wherever he preaches. Just amazing.” This is how one can bring whatever one believes God to be into the public square without assuming that every person in that square means the same thing by “God” as you do. Barber’s comments are an inspiring and eloquent expression of what I mean when I frequently say and write that I am a liberal because I am a Christian. In our nation dedicated, among other things, to the separation of church and state, one should not bring one’s faith into political debates and become, as the Apostle Paul put it, “a stumbling block and an offense” to those who do not share your version of your faith or to those with no faith at all. Rather, one should enter the public arena as the person one’s faith has caused one to become.

P.S. for those who appreciate gospel music and good singing—Rev. Barber’s final reference is to two lines from an old Baptist hymn: “Revive Us Again.”

Revive us again, fill each heart with thy love

Let each soul be rekindled with fire from above

I know this hymn well—various church congregations in my youth sang it with gusto on a regular basis. If you’re interested in what a cappella singing is supposed to sound like, enjoy this recording of the hymn—the verse Reverend Barber quotes begins at 1:11. If you have no interest in or reject the theology in the lyrics (which I do, at least partially), at least enjoy the beauty of the human voice!

The Rule of the Best

The best argument against democracy is a five-minute conversation with the average voter. Winston Churchill

I had a fascinating conversation on Facebook the other day (imagine that!). You may have noticed that we are in the middle of a very polarized political campaign—a Facebook acquaintance posted some data identifying the demographic that is most favorable to Donald Trump and most problematic for Hillary Clinton: white men with no degree. At the time the article was published, Hillary was doing 14% worse with this group of voters than President Obama did four years ago.

Hillary Clinton and white men without a degree

My Facebook acquaintance and I have never met in person, but we share a couple of important characteristics. Both of us are college professors, and both of us earned our bachelor’s degree from the Great Books curriculum at St. John’s College. We have “liked” each other’s posts before—this time, I took the opportunity to throw something out there that I have frequently taught in the classroom and written about—voting should be considered as a privilege that one earns rather than a right that one is entitled to.

  • Me: The elitist in me thinks that the white men no degree problem could be solved by voting being considered as a privilege rather than a right. Everyone should be required to get at least a 70% on the written civics test given to those seeking citizenship in order to earn the privilege of voting.
  • Facebook Acquaintance (FA): As an educator, I sadly do not have faith that a civics test would improve voting.

Where did I get this ridiculous idea that voting should be an earned privilege rather than a right? It is rooted in the thought of perhaps the greatest philosopher in the Western tradition, as I explained to FA.

  • Me: As an educator as well, I get your point. My problem is there is a part of me that thinks Plato is on to something in the Republic. His critique of democracy is that it pretends that everyone is equal—but we know this isn’t true. Very un-American, I know. And I wouldn’t say that only the elite would be voting. Rather, those who have bothered to earn the privilege of voting would be voting. I know many “educated” people who would not pass the test.

Plato was of the opinion that the proper education qualified a person to participate in government, implying that many people are not capable of successfully completing such an education. FA thought that the problem might lie elsewhere.

  • FA: I am not sure that many people would care to earn the privilege. Sadly. It is easier to complain and watch the reality TV Trump show. Have you seen the movie “Idiocracy”?
  • Me: I haven’t, but can guess from the title what it’s about. I agree that not many people would care to earn the privilege. Which might mean that Plato is right again. Aristocracy in its true meaning—the rule of the best—is the best form of government.

FA’s suggestion that many people might lack the drive or interest to take my proposed voting test illustrates—intentionally or not—one of Plato’s most important points in the Republic. Human beings are not created equal. Some are worthy of being educated to be full participating citizens and some are not. Whether because of lack of intelligence, drive, character, or a combination of these, some people are not capable of being full citizens. This is Plato’s fundamental critique of democracy—it is rooted in the ludicrously false assumption that all human beings are equal in all relevant ways. They aren’t.

So what does Plato advocate as the best form of government? Aristocracy, understood not as the passing on of power through blood lines as we think of when we hear “aristocracy,” but understood in its original and pure form. Aristocracy simply means “the rule of the best (aristos).” One of the major thrusts of the Republic is a meticulous construction of the perfect community, a community in which each person performs the tasks for which she or he is most naturally suited and which is ruled by the best people in the community. The rulers are identified early in their lives as potential leaders and educated with a view to actualizing the excellence that is latent in them. The potentials of others are similarly identified early in their lives; accordingly, each person is trained to be the person she or he is most naturally fit to be.

FA was not having it.

  • FA: I don’t think Plato is right. I think people don’t care to earn the privilege because they don’t believe the system will work for them, and because they are so poorly educated because they are poor. It is not a matter of individual failings, but of people being shaped by the system.

There’s a lot in this response. Suffice it to say that FA could be completely right without Plato being wrong. Our current system that has clearly produced millions of disaffected and disillusioned voters is a product of the democratic system, a system that Plato rejects. FA’s insight is that the “system” (society, if you will) shapes the individual—Plato would entirely agree. Our problem is that we have the wrong “system.”

Thirty seconds later FA sent an additional comment that changed the whole discussion.

  • FA: As a counterargument to Plato’s Republic, I give you Euripides’ Medea. She definitely would have wanted a vote.
  • Me: Good point.

Touché. That’s what I get for getting into this sort of conversation with someone who knows the ancient classics. The title character in Euripides’ Medea is brilliant, powerful, insightful, and effective—exactly what one would want in an informed electorate. She is also vindictive, manipulative, and murders three people (including her two sons) in the play. She’s a bad person, in other words. FA’s point is that what we need is not a test for how informed one is about current events and how government works. In an aristocracy, what is needed is a test to determine who is aristos. And before that, someone to define what aristos even means. Who do we trust to do that and to create the test? I have suggested to my students that I would be willing to do it, but they didn’t seem strongly supportive of my offer.

Winston Churchill famously said that “Democracy is the worst form of government, except for all those other forms that have been tried from time to time.” Context is important here. After he courageously led Great Britain as Prime Minister through the dark days of World War II, voters rewarded Winston by voting him out of office in 1945. The famous comment was made in Parliament two years later. Democracy is the messiest imaginable way to run things—it might even facilitate the election of Donald Trump as President in November. But every time I work with students to try and devise a better way of doing things, we always come back to the same conclusion. Democracy is a mess, but it is our mess. What are you going to do?

Believing What I Think

Last week at the Republican National Convention, the Republicans nominated as their candidate for President of the United States a person so outside the norm, so iconoclastic in every way, that even the most experienced observers of American politics—insiders and outsiders alike—are scratching their heads. How did this happen? I suspect that it will take years for answers to fully develop, but there is one contributing factor that I have been hearing both through traditional and social media on a regular basis. Supporters of this candidate often say something along the lines of “He’s saying things that many of us have been thinking for years but have, for any number of reasons, not been able to say. He speaks for us.” Which raises the question—how much of what we believe to be true is simply a projection of what we want to be true? After all, as a bumper sticker I saw the other day insightfully pointed out, “You don’t have to believe everything you think.”

Fall classes begin in a few weeks and I have started planning my two sections of General Ethics in earnest. Over my twenty-five-plus years of teaching, ethics has always been my favorite course—because of administrative duties, then sabbatical, this will be the first time in five years that I have taught it. I chose several weeks ago to make the class as contemporary as possible—with two exceptions, every assigned text was written within the past ten years. One of the exceptions will come early in the semester from one of my three or four favorite philosophers—Michel de Montaigne. Montaigne lived in a polarized religious world that reminds me strongly of our current, equally polarized political situation. Sixteenth-century France was not a pretty place—in the aftermath of the Protestant Reformation, Christians were killing each other with regularity and abandon, all in the name of Christ. Catholics and Protestants were each certain that they were right; energized by such certainty, each was willing to kill the other in the name of truth and right belief. When I heard delegates in Cleveland last week regularly chanting that the soon-to-be-official nominee of the other major political party should be locked up or worse, I thought of Montaigne’s constant efforts to convince his readers that certainty and unwarranted conviction can be deadly.

In the second week of classes, my students and I will be working on perhaps Montaigne’s most famous essay—“On Cannibals.” Reflecting on the visit to France of several Brazilians from cannibal tribes, Montaigne notes that just as the cultured Europeans of his day were appalled by various Brazilian tribal practices, so the visitors were just as confused and appalled by certain European cultural norms. Stepping back, Montaigne argues first that a stronger case for barbarism could be leveled against the Europeans than against the cannibals, then puts his finger on an issue that is remarkably relevant to our contemporary world.

We all call barbarous anything that is contrary to our own habits. Indeed, we seem to have no other criterion of truth and reason than the type and kind of opinions and customs current in the land where we live. There we always see the perfect religion, the perfect political system, the perfect and most accomplished way of doing everything.

Our own myopia and parochialism are usually defined by something other than national borders, but Montaigne’s point is clearly as true now as it was in his day. We tend to believe that what we are most accustomed to is true, without ever wondering how we came to be accustomed to these things in the first place. We resonate most strongly with those who mirror back to us what we are already thinking.

Parochialism and attachment to what we think we know is not a problem exclusive to any particular set of beliefs or experiences. All of us, from conservative to progressive, from atheist to dedicated religious believer, assume that the way we think is not only the epitome of common sense, but also the standard of reason well used. Yet as Adam Etinson, a contemporary commentator on Montaigne’s “On Cannibals,” observes,

Moral reasoning is generally something we use merely to convince others of long-held beliefs that we are unwilling to abandon . . . often, no amount of persuasive reasoning, clear argument or exposed contradiction can shake us from what we already believe.

Why are we so inclined to hang on to our most entrenched thoughts, even in the face of evidence that our most deeply held beliefs are rooted in anything but experiential evidence supported by logical reasoning?

The most obvious answer is that adopting the thoughts and beliefs of one’s culture and family is easy, while critically challenging one’s default settings and perhaps even changing them is hard work. Cultural centrism is evidence of both our intellectual laziness and our fallibility—the ever-present possibility that our beliefs might be wrong. One effective way that I have found to bring the randomness of our deepest convictions to light is simply to ask my students the following: “How many of you think that you would be a very different person today if you had been born in rural Tibet instead of where you were actually born?” All hands go up. “Why?” Because, as everyone knows, we are shaped early and often by features of our existence—our society, family, location, social status—that we do not choose. Yet we often wander unreflectively through life relying on these foundations that we did not choose, as if we had magically been given the universal truth about all important issues at birth. The fact that our most deeply held beliefs would be different had we been born elsewhere on the planet (or even, sometimes, to different parents farther down the street) should disconcert us. In every course syllabus I include Spinoza’s observation that “I do not know how to teach philosophy without disturbing the peace.” The “What if you were born elsewhere?” exercise is one of the more effective peace disturbers in my teaching arsenal.

Of all the things I deeply believe, those that I have come to through challenging preconceptions and previously unchallenged assumptions are the ones that are now most definitive of who I am. All of us should regularly reexamine our beliefs and practices, become alert to weaknesses and inconsistencies in our own thinking, discover something plausible in another’s point of view and, in so doing, become better than the parochial and myopic creatures that we naturally are. After all, none of us needs to believe everything that we think.

A Gnawing Suspicion

A mystic is anyone who has the gnawing suspicion that the apparent discord, brokenness, contradictions, and discontinuities that assault us every day might conceal a hidden unity. Lawrence Kushner

A while ago Jeanne and I were in the car listening to the hourly news update on NPR. As usual, they were trying to stuff as much horrible news as possible into a three-minute segment. Ebola, ISIS, Zika, Palestinians, Israel, Istanbul, Russia, illegal immigrants, racial discrimination—one of us said “they’re never going to figure this out.” I forget which of the above items the comment was referring to, but it could have been any of them. I know few people who are more naturally optimistic than I am, but what evidence is there that we human beings are up to the challenge of solving our problems long-term in a sustainable way? The history of our species provides ample evidence to the contrary.

So what impact should this depressing and dour news have on a person not inclined toward cynicism or despair? I must admit that I would find it very difficult to avoid cynicism in general, overcome only by dogged attempts to make my little corner of the world a bit better on a daily basis, were it not that I am convinced that the often sad and grubby human story that is trumpeted at us 24/7 through multiple media outlets is not the only story in town. There’s something bigger going on. In other words, I believe in God. So sue me.

For many the conversation stops right there. How on earth can an educated, relatively intelligent person with working senses possibly believe in the existence of God in the face of the massive evidence to the contrary that threatens to overwhelm us daily? Please note, though, that I said that I believe in God, not that I believe in the existence of God. This is a gradual, seismic internal shift that has been going on for a while, one that I have frequently taken note of in various ways during the almost-four years of this blog’s existence (and for a lot longer than that). Two short books, Marcus Borg’s Convictions and Lawrence Kushner’s Kabbalah: A Love Story, have crystallized this shift in unexpected ways. Let me explain.

The “does God exist?” question never had much philosophical interest for me (I don’t think any of the arguments designed to answer the question positively actually work very well); over time I have lost interest in it just about entirely. The God whose existence is almost always in question is a being separate and distinct from the universe, a supreme being who created the universe a long time ago. This description usually goes on to add personality traits such as omnipotence, omniscience, and omnibenevolence to God’s resume; God thus described is often imagined with authoritarian and parental attributes, with all of the accompanying positive and negative baggage. Marcus Borg calls belief in the existence of this being “Supernatural Theism.” For non-theists who deny the existence of God, it is almost always the God of Supernatural Theism whose existence is being denied; it is this God that is the target of the impassioned attacks of the “New Atheists.” Borg notes that when someone tells him that she or he does not believe in God, he “learned many years ago to respond, ‘Tell me about the God you don’t believe in.’ It was always the God of supernaturalism.” Borg professes that he stopped believing in that God when he was in his twenties (he passed away in his seventies about a year ago). I don’t believe in that God either.

It isn’t that I now believe in the existence of a divine being with a different resume. It’s rather that I think “does God exist?” is the wrong question. Because the issue of God for me is not existential—it’s not about whether there is another being out there in addition to the universe. The issue of God is experiential. Scripture says “taste and see that the Lord is good,” and tasting and seeing are not arguments, rationalizations, or proofs. Borg describes the shift I have in mind well:

There is a cloud of witnesses, Christian and non-Christian, for whom God, the sacred, is real, an element of experience, not a hypothetical being who may or may not exist and whom we can only believe in.

Both Borg and Kushner call this orientation “mysticism,” and both refer to experiences that might be described as “mystical” that helped bring them to this experiential conclusion. I’m not crazy about calling myself a “mystic” for a number of reasons, but I do resonate with Kushner’s definition at the beginning of this post, just as I resonate with Borg’s adjustment of what the word “God” refers to:

A theology that takes mystical experience seriously leads to a very different understanding of the referent of the word “God.” The word no longer refers to a being separate from the universe, but to a reality, a “more,” a radiant and luminous presence that permeates everything that is.

Kushner refers to the “gnawing suspicion” that there is a hidden unity underlying all of the mess that we find ourselves in. “Suspicion” is a well-chosen term, because a reorientation from Supernatural Theism to Mystical Theism (as Borg calls it; Kushner calls it “mystical monism”) is difficult to talk about and impossible to provide convincing arguments for. Words fail me, although I keep trying to find them. More often than not I fall back on the evidence of a “changed life” and “come and see,” finding strength in the fact that those who have also experienced the sacred and have not just thought about it resonate with me on a level deeper than words. They just “know” what I am trying to convey.

Working out the implications of where this takes me on all sorts of issues is a continuing effort in these pages. Returning briefly to where I began, what might mystical theism say about the fractured and disjointed world in which we live? Trying to square such a world with the God of Supernatural Theism gives rise to the problem of evil, perhaps the most intractable philosophical/theological problem of all. But as Kushner suggests, there is a different orientation available.

If you are a mystic, saying you believe in God means that you have an abiding suspicion that everything is a manifestation of God, and no matter how horrific it might be, it is still, somehow, filled with holiness.

The only evidence for that is experiential, and even such experience is iffy and enigmatic. I have not had the “road to Damascus” sorts of experiences that have changed the lives of many. My reorientation has been more gradual, which for me means it is likely to have the permanence that a “once for all” experience might lack. As I sat for many weeks in daily prayer with Benedictine monks several years ago, the reorientation began as I noticed a slow opening of peaceful spaces inside and a new way of seeing what is around me. This does not conflict with my intellect, my mind, or my philosophy—it holds them in place. And when I run out of convincing words, I plan to remember this passage that I just read from Lawrence Kushner:

Why is it that you cannot simply tell someone a great religious truth without a whole rigmarole of questions and hints, allusions and mysteries? It is because that is the way God made the world.