Saturday, August 20, 2016

Privilege Was Not the Original Sin, Arrogance Was

The forbidden fruit taken by Adam and Eve in the Garden of Eden certainly seems like it had a sour taste. For indulging in a little knowledge, the two brought sin and death into the world, got kicked out of paradise, and were each assigned their own specially frustrating labor projects. Bible commentators down through history have noted an obvious lesson here: sometimes what we learn through experience is pretty bitter. Thanks to the actions of Adam and Eve, we all have been corrupted and stand in need of salvation... or so the story goes.

Some right-leaning academics and activists have likened the concept of privilege to that of Original Sin. Both are things we are born into, that we cannot escape, and they are best dealt with by a confessional or penitent approach. James A. Lindsay and Peter Boghossian draw this comparison in their article, "Privilege: The Left's Original Sin," published at AllThink.com. There is no greater sin in the eyes of the left, they claim, than "having been born an able-bodied, straight, white male who identifies as a man but isn't deeply sorry for this utterly unintentional state of affairs."1

Interesting similarities do exist between privilege and Original Sin, as noted. Yet concepts like apostasy, faith, and religion are frequently associated with secular ideas in ways that are more tenuous than they are convincing. The more the sacred retreats from the latter half of that equation, the weaker the analogy seems. If the mere association with religion is meant to be an indictment of talk of privilege, then Boghossian's unabashed borrowing from the evangelizing pages of Christian ministry in A Manual for Creating Atheists is no less guilty. Presumably, though, the main complaint is not the religious connection, but how privilege and Original Sin have both been used as shaming devices.

Certainly, privilege talk can be used to try to control or stop conversation. In that sense it is quite like Original Sin as it has been employed by brazen preachers spreading a message of hellfire and brimstone. But where many on the right have interpreted privilege in terms of personal attacks, many on the left have been endorsing it with the aim of calling attention to broader social issues. Mychal Denzel Smith, writing for The Nation, observes that when "people with privilege hear that they have privilege, what they hear is not, 'Our society is structured so that your life is more valued than others.' They hear, 'Everything, no matter what, will be handed to you. You have done nothing to achieve what you have.'"2 Apology and repentance are not the goals for those who partake of the language of privilege – social reform is the goal.

Discrimination is offered in the article as a better alternative to privilege. This may be splitting hairs, but it may also underscore a valuable point. Discrimination has a history behind it, especially a legal one, and it has often been addressed on an isolated, individual-case basis. To suggest that there are more systemic problems in our courts, in our neighborhoods, and in our society, a bigger word seems necessary. Privilege stings. It evokes an air of elitism, of undeserved benefit, and it plays off the anti-magisterial sentiments that have long been a part of American culture. Privilege is less visible than we imagine discrimination to be. It saturates and it structures, as Maggie Nelson has written.

Granted, privilege has its conceptual flaws, too. It's been argued that it associates the advantages of privilege with luxuries rather than with rights. Others have suggested that it's not very conducive to understanding differences among various minority groups. Of course, these are conversations worth having civilly, and they have been ongoing in many areas of social justice for some time now. Boghossian and Lindsay are also willing to give a modest bit of credit to the term, conceding that it does describe something real and problematic. What they object to is how privilege helps to "glorify" the struggles of certain identities lucky enough to be born into the right group, while serving as a club to beat on those born into the wrong group. If social reform is what privilege talk is about, then these concerns are themselves among the things reformers aim to change.

What if these common problems with Original Sin and privilege are actually due to a confrontational attitude rather than to any conceptual similarity? There are Christians for whom Original Sin is not a weapon with which to persecute unbelievers, but a reminder to be humble and forgiving towards others. In Romans 3, Paul considers the standing Jews and Gentiles have before God. "Do we have any advantage?" he asks. "Not at all!" No one is righteous, not even one, as he goes on to declare in verse ten. Could privilege not serve as a similar reminder to be humble?

Oddly, after explaining that "everybody is privileged," and that Original Sin and privilege are identical except in that they inhabit different moral universes, Lindsay and Boghossian contend that the label of privilege is the more contemptible of the two because it's seen as a hindrance to the less fortunate among us. But if everybody is privileged, who can rightly take the moral high ground? Some might still claim it, yet the authors give no real explanation for why such posturing should be tolerated more in the case of privilege than in that of Original Sin. Fighting privilege doesn't mean forcing repentance.

Now, it's true that no analogy is perfect, but Boghossian and Lindsay are ambiguous enough in their use of the term privilege that it presents a problem for their argument. Let's take a definition of privilege by Sian Ferguson at Everyday Feminism. Ferguson says, "We can define privilege as a set of unearned benefits given to people who fit into a specific social group."3 This doesn't tell us anything about most of what Lindsay and Boghossian attribute to privilege, such as its being an accident of birth, being inescapable, applying to everyone, or demanding atonement. That's because these are ancillary ideas about the function of privilege in society. Just as the concept of sin differs from the doctrine of Original Sin, the concept of privilege differs from the political and philosophical theorizing that has surrounded it.

The problem is that if we're going to bring in these ancillary ideas about privilege in drawing a connection to Original Sin, why stop there? Boghossian and Lindsay try to conceal the breakdown of their analogy with the line of qualification stating that Original Sin and privilege inhabit different moral universes. It allows them a little leeway to conveniently gloss over major incongruities like the importance of power systems for understanding privilege, or the supernatural nature of sin. Privilege functions between oppressors and the oppressed, whereas Original Sin doesn't really recognize anyone as being "in power," oppressing us sinners. Lindsay and Boghossian almost note this difference when remarking on how the labeling of another person as privileged is sometimes taken as a personal hindrance to us. Sin, on the other hand, isn't just a moral or interpersonal affliction, it's a spiritual one, and the "mechanism" by which it's passed down is frankly mysterious – not at all like the way that privilege persists through oppressive social structures. There is also the fact that, unlike sin, privilege actually represents a goal to aspire to. Sin can be viewed as a disease in need of healing, but the point of social justice is not to eliminate the privileges some people enjoy, it's to extend them to more people.

I'm not sure why we should feel persuaded by the criteria of similarity raised by Boghossian and Lindsay. They seem somewhat cherry-picked, but their significance can be questioned, too. Death is something we have no say over, it cannot be escaped, and it's been said that all of us are dying from the moment we're born. Yet we might question the purpose of comparing death to Original Sin, or to privilege, on such grounds. It could be claimed that death isn't as comparable for some reason or other, but we have just seen a few ways in which privilege isn't as comparable, either.

It's admittedly somewhat amusing that privilege is denounced primarily when it's treated as a tool for shaming. Boghossian and Lindsay have both written in defense of ridicule when it suits their purposes, and they inhabit their own universe with other champions of ridicule like John Loftus and Jerry Coyne. They've advocated for shutting down academic studies like philosophy of religion and biblical scholarship when they dare to defend Christian beliefs, and they're quite fond of conceptualizing faith as a virus, not to mention defining it so as to be basically synonymous with irrationality. So why does privilege shaming draw their ire? One would think they'd be chomping at the bit for the chance to attack Christian privilege in such terms, which they more or less do in other language.

When it comes to our own privilege, we typically aren't exactly eager to own up to things. I can honestly admit that I still struggle with this. As Parul Sehgal eloquently observes: "It's easier to find a word wanting, rather than ourselves. It's easy to point out how a word buckles and breaks; it's harder to notice how we do."4 The first sin wasn't being born into a certain class or identity. It wasn't being part of a majority group that benefits from the marginalization of others. The first sin was arrogance. It was selfish pride that motivated disobedience, as Thomas Aquinas said in his Summa Theologica.

I agree wholeheartedly with Boghossian and Lindsay that more perspective, kindness, and charity are needed. However, it seems to me that their critique of privilege has missed the mark in a number of ways. There is room for improvement, especially in how we talk to and treat the disadvantaged, but the encouragement given to "focus more on the positive qualities" you want to instill in others rings a bit hollow. It makes it sound once again like everyone else is the problem. Perhaps this is where the critic has more in common with religion than he likes to think. It would be an understatement to say that monotheistic religions haven't had very good track records of protesting privilege. On the contrary, they've often put in a great amount of effort defending their own privilege against so-called heretics and apostates.

Perspective, kindness, and charity seem mismatched to the disdain for what Lindsay and Boghossian call the religion of identity politics. It's telling where all the faith-based imagery is located in the picture painted by the two authors, and their contempt for religion is more than evident from their own writings, one of which bears the charitable title of Everybody is Wrong About God. "You don't get to denounce identity politics," as Sincere Kirabo points out, "when your monomaniacal depreciation of all things religious is literally grounded in homage to the politics of your most treasured identity: atheism."5 Not everything religion has taught is worthy of derision – especially when it comes to the idea that change must begin with ourselves. There is likewise nothing patently religious about seeing ourselves as benefiting from certain social structures that disadvantage others. We should reject this claim just as we reject similar claims declaring morality to belong to the special domain of religion. Privilege talk that fails to recognize the need for humility and compassion is talk that is rightly criticized. At the same time, a critique of privilege that cloaks its main argument in anti-religious and politically conservative rhetoric is not doing anyone the favors its writers think it's doing.


References

1. James A. Lindsay and Peter Boghossian, "Privilege: The Left's Original Sin," AllThink.com (May 24, 2016).
2. Mychal Denzel Smith, "No One Cares If You Never Apologize for Your White Male Privilege," The Nation (May 5, 2014).
3. Sian Ferguson, "Privilege 101: A Quick and Dirty Guide," Everyday Feminism (Sept. 29, 2014).
4. Parul Sehgal, "How 'Privilege' Became a Provocation," The New York Times (July 14, 2015).
5. Sincere Kirabo, "Navigating Critical Thinking, Intersectionality, and Identity Politics in the Secular Movement," TheHumanist.com (July 6, 2016).

Thursday, August 4, 2016

5 Religious Controversies in Video Games

Aristotle famously thought that art imitates life. As video game technology has evolved beyond depicting simple shapes and movements, its ability to represent aspects of our world has increased exponentially. Game designers, like all artists, often draw inspiration from their environment, and thus a broad range of subjects and concepts find their way into many titles. Over the years, we have seen countless games comment on music, politics, cultural norms, ethics, science, art, literature, relationships, and much, much more. Not surprisingly, religion is another real-life influence that can appear in video games, although its ties to the medium are arguably the most strained of the lot.

Here are five examples, in no particular order, of controversial religious content in video games.

5. LittleBigPlanet and the Qur'an

 
In 2008, the puzzle platform game LittleBigPlanet had a delayed release after it was brought to Sony's attention that a song licensed in the game contained spoken verses from the Qur'an. Translated from the Arabic, the verses say: "Every soul shall have the taste of death" and "All that is on earth will perish." Admittedly, these are odd choices for a children's game, but are they offensive enough to merit their removal?
 
The original notice came from a poster on the PlayStation community forums, who explained: "We Muslims consider the mixing of music and words from our Holy Quran deeply offending," and asked that the song be removed. Sony complied and replaced the track in the game. However, some Muslims reacted against this, including the American Islamic Forum for Democracy, which criticized the censorship of the song in LittleBigPlanet. "Muslims cannot benefit from freedom of expression and religion," the group said, "and then turn around and ask that anytime their sensibilities are offended that the freedom of others be restricted."
 
The composer of the music in question, Toumani Diabaté, also considers himself a devout Muslim.

4. Brahmin in Fallout 3

One of the common inhabitants of the irradiated post-apocalyptic Capital Wasteland in Fallout 3 is the species of two-headed mutated cow known as Brahmin. These cuddly critters don't do much in the game aside from grazing, transporting goods, and occasionally attacking those who disturb their peaceful existence. Yet that existence is apparently so controversial that Fallout 3 was not released on any platform in India, with "cultural sensitivities" cited as the reason.

A detailed explanation was not provided, but it has generally been assumed that the Brahmin are the culprits. There is a caste of Hindu priests and scholars in India known as the Brahmin, and the name is also similar to Brahman and Brahma in Hinduism - the former being considered the highest or ultimate reality, and the latter a creator god. It has additionally been speculated that the belief in the sanctity of cows in India is a further reason why the Brahmin of Fallout could be responsible for the game's cancellation there.

Of course, it's difficult to know what exactly motivated the decision. Interestingly, Fallout 4 did see a release in India, with the Brahmin remaining in, and at least one complaint about their presence in the game has since been made on a gaming forum. It seems that although war never changes, concern for cultural sensitivities does.
 
3. The Fire Temple in Zelda

The controversy surrounding the Fire Temple in The Legend of Zelda: Ocarina of Time is one of the earliest examples of religious controversy in a video game that I can remember hearing about. Well, aside from the general outrage over violent, evil, or allegedly un-Christian games that some religious groups used to love participating in. The Fire Temple thing wasn't just the usual "video games are corrupting the youth" nonsense. It was different and more surprising, given Nintendo's family-friendly image and their longstanding policy of keeping religion out of their games.

Initial copies of Ocarina of Time featured music in the Fire Temple that was changed in later versions of the game. Nintendo has openly stated that the switch was due to an Islamic prayer chant being used in the original music (listen to the differences here). While no one had yet complained, the track was replaced to stay consistent with Nintendo's image. Allegedly, the chant was taken from a sound library, which was how it slipped under the radar.

Strangely, though, the Gerudo Symbol found on blocks, switches, and the Mirror Shield in Ocarina also looks quite like the crescent moon and star of Islam. In later releases, the design was altered drastically, but it's pretty curious how a Muslim prayer chant and a symbol very similar to an Islamic symbol could accidentally show up in the same game.

As a matter of fact, the Zelda games have a history of religious references that goes beyond Ocarina. The first game, The Legend of Zelda for the NES, famously had a dungeon designed in the shape of a manji, the Buddhist symbol of good fortune. Link's shield bore the image of a cross as well, and the Book of Magic was even called the Bible in the Japanese version, complete with its own crucifix on the cover. Zelda II has the "Cross" as an item, which enables Link to see invisible enemies on his way through the Valley of Death. The Sanctuary in A Link to the Past is known as the Church in the original Japanese version, which makes sense of bizarre promotional artwork that shows Link praying before a cross in the place (prayer seems to likewise be how you enter the Desert Palace later in the game).

So if Link is a Christian crusader of sorts in the earliest Zelda games... how weird is it that Islam suddenly pops up in Ocarina?

2. Baptism in BioShock Infinite


At the beginning of BioShock Infinite, you must undergo baptism in order to advance the story. This apparently upset one player enough to prompt them to request a refund, and the religious themes in the game reportedly even bothered some of the team members who worked on it. Unlike the other games on this list, Infinite intentionally comments on real world religion, especially the sort that gets wrapped up tightly with American exceptionalism. It isn't the main focus of the game, but with all the questions of free will, redemption, suffering, and so forth that it raises, bringing politics and religion to the stage could almost be considered inevitable.

If religious sensibilities are why LittleBigPlanet, Ocarina of Time, and many other games have revised their content, then Infinite makes no apologies in directly confronting and challenging those sensibilities. The baptism scene at the start is not disrespectful or mocking, nor does it make light of the ritual. It plays a part in posing problems many Christians already ponder, about false prophets, going through the motions, the mundane nature of evil, the reach of salvation, and more. In some ways, the game is meant to be controversial, but what it draws attention to in the course of its beautiful tale of Booker and Elizabeth should be disturbing for plenty of reasons other than "blasphemy."

For those interested, I have written a longer review of religion in this game, exploring more of its ideas and controversy in greater depth.

1. Hitman 2 and the Sikh Temple

The Hitman games may just be some of the worst games to look to for any kind of religious deference. I mean, we're talking about games that simulate contracted murder, not games for kids or for generally sensitive folks. Even so, three missions in 2002's Hitman 2: Silent Assassin drew the ire of Sikhs, who argued that they bore a striking similarity to the tragic massacre that took place at Harmandir Sahib, or the Golden Temple, in 1984. The level description on the game's website (which has since been amended) spoke of an "ancient Gurdwara", or Sikh temple, and noted that an "uprising in this region in the mid 80's was ruthlessly cracked down on by government-issued troops, and many innocents were killed." A number of turban-wearing Sikh assassins are your enemies in the level, referred to at one point as "towelheads" by a contact you meet early on in the Temple City Ambush mission.

Eidos responded by removing offensive material from both its website and the game, but most of the changes seem to have been cosmetic, such as censoring or altering words and images. Considering that this game was released fairly shortly after September 11th, the controversy may appear very different now, looking back almost 14 years later. Still today, American Sikhs continue to experience violence and bigotry perpetrated by ignorant individuals who mistake them for Muslim-Americans. Hitman 2 didn't help by contributing to these misunderstandings in its depiction of Sikhs as "cult" members, assassins, and terrorists, regardless of whether the location in the missions is actually meant to be the Golden Temple.

Contra Aristotle, Oscar Wilde remarked that life imitates art far more than art imitates life. Likely the concern of many who object to religious controversies is that they can provoke other, potentially more harmful forms of discrimination. There does seem to be something to this, and it's probably one reason why most avid gamers find games like Hatred and Ethnic Cleansing abhorrent. On the other hand, censorship is almost never the best solution, not only because it limits the free expression of others, but because it can also diminish the attention that something troubling deserves.

As forms of artistic expression, video games should experiment in the provocative and controversial, and should largely be free to do so. But where we should draw the line and how we ought to respond to offensive material are also questions worth asking - ones that may be productively taken up by the various religious, political, philosophical, and social communities in our diverse world.

Thursday, June 2, 2016

The Devaluing of Higher Education in American Society

I have been a college student now for several years, and I've finally got just one more year to go for a B.A. It's been a difficult journey to get here, and it seems to only be getting harder and harder with each semester. This isn't because of tougher classes, although I've had my share of those. It's because of a persistent problem that every student and every professor seems to recognize. It's talked about among friends, among family, on the news, in entertainment, in politics, by social scientists, by employers, by corporations, by unions - it's virtually everywhere in the United States. Yet the general reaction often continues to be one of reluctant surrender. Let's have a moment of silence for our poor students.

Poor indeed. But this kind of poverty isn't all it's frequently imagined to be. It's not the sort of scrape-by-on-the-skin-of-your-teeth poverty that challenges you while it builds character. It's not voluntary poverty for the sake of some greater good. It's not even simple financial poverty. Many students face something much deeper - a poverty of needs, values, aspirations, opportunities, and experiences.

Before you dismiss my use of the P-word as ridiculous exaggeration, allow me to make a point that should be, but rarely is, obvious. A lot of students live below the poverty line. Using data collected between 2009 and 2011, the Census Bureau reports that 63.3% of college students live at home with their parents or relatives. The same paper notes that although 15.2% of the U.S. lives in poverty, a staggering 51.8% of college students who live off-campus without parents or relatives are below the poverty line. Another recent study found that low-income students graduate at lower rates, with 51% of Pell Grant recipients graduating nationwide, compared to 65% of those who didn't receive the grants. Along these lines, attending a high school with a higher poverty rate has been shown to be a strong predictor of poor college performance.

Some people would like to chalk up some of these statistics to work ethic. I won't pretend there aren't students who try to take the lazy way. I've been in classes with plenty of them. What I will say is that these types of excuses for inaction never take into account other factors that should be relevant, like everything else going on in a student's life. Loved ones die, family members get sick and need care, cars break down, people get evicted, people struggle with physical and psychological disorders, and over the several years a student attends college, numerous issues like these can arise. Good professors try to talk to students to find out these sorts of details, and they work with them when there are genuine difficulties involved, but the general public usually doesn't do this at all when explaining away calls for reform.

What's really tough to dismiss here are the alarming facts about suicide on college campuses. A 2014 study documented that about 31% of students have contemplated suicide, a figure that has risen by 6% in the span of only five years. According to Emory University, more than 1,000 suicides occur on campuses every year. While the factors contributing to this problem are numerous and complex, there are similar and recurring concerns voiced by the 48.7% of American college students who attend counseling for their mental health. Anxiety, depression, and stress top their worries.

I belong to the one-third of students in college living off-campus without family. I am also in the rapidly growing minority of students with a full-time job. Of course, I am grateful to have what I have, but working a full-time job presents another layer of challenges to receiving an education. It means I have to work an alternate schedule, which can vary dramatically depending on the time frames of the classes I need. Building operation restrictions further impact the hours I can work. Many full-time employers make allowances for educational leave, but a 2014 publication by The Council of Economic Advisers notes that there are "large disparities" in access to paid leave, as well as an overall need for more of both paid and unpaid leave.

Let's do a little math. You're taking four classes, each meeting three days a week for 50 minutes. That comes to around 10 hours per week spent sitting in class. Now add in the amount of time it takes to study for your classes. Conventionally, it's recommended that you spend at least two hours studying for each hour in class. So add 20 hours of study time (this is not far off the norm, according to a recent study). But your campus is a 30-minute drive away, parking takes time, and depending on where you park, you may have to plan on catching the bus as well. Each class day then involves an hour of driving round trip, and perhaps another hour for parking and walking, or riding the bus. This conservative calculation already brings the total time devoted to attending school to 36 hours a week.

Now imagine your employer gives you a maximum of four hours of educational leave per week. The HR handbook lists up to 8, but they figure that an hour per class is reasonable. Taking other kinds of leave to use for school is frowned upon. You can't reduce your work hours to accommodate school, either, since you're in a salaried position. Basically, a 40-hour job minus those four hours of leave means you have to pull 36 hours a week at work and another 36 in that same week handling school. Subtract 72 hours from a 120-hour five-day week and that leaves you with 48 hours, or 9.6 hours a day for sleep. Of course, that's if you do absolutely nothing but attend college, work, and sleep. It doesn't include trying for a social life, visiting family, or, you know, having fun.
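For the curious, here's that same arithmetic as a quick Python sketch. Every figure is just an assumption from the hypothetical scenario above, not data:

# Back-of-the-envelope weekly time budget for the hypothetical student above
class_hours = 4 * 3 * (50 / 60)   # four classes, three 50-minute meetings each
study_hours = 2 * class_hours     # two hours of study per hour in class
commute_hours = 3 * 2             # about two hours of travel per class day
school_total = class_hours + study_hours + commute_hours   # 36.0 hours
work_hours = 40 - 4               # full-time job minus four hours of educational leave
week_hours = 24 * 5               # the 120-hour five-day week
sleep_per_day = (week_hours - school_total - work_hours) / 5
print(school_total, work_hours, sleep_per_day)   # 36.0 36 9.6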

Then picture all that interrupted by any of the struggles I noted above. Death in the family? Loved one sick and needing your care? Personal medical condition? You lose sleep. And the more sleep you lose, the rougher everything becomes. Stress and anxiety heighten, more sleep is lost. You probably ought to go see a counselor, maybe even a doctor, or at least take some time off. But where will you find the time or the money to do that? You have rent and bills to pay. Not to forget about tuition.

It's no secret that college tuition has risen substantially over the last few decades, and in a manner that goes well beyond merely adjusting for inflation and increases in income. With fees and everything accounted for, tuition at my university costs nearly $5,000 a semester. Even with my annual untaxed income at approximately five times this figure, I can't afford college without grants. The only problem is that since I'm sufficiently above the poverty level, I no longer receive much financial aid. For two years now, I've done all I can to avoid loans, including using payment plans, but it appears unavoidable at this point that I will have to start relying on student loans, even while just one year away from a B.A. The scenario I gave above is not fiction, it's a story I've been living, and one I know some of my fellow students are living, too.

One of the most popular arguments against making college free is that it "devalues" the education you'd be getting. Yet when we look at countries like Norway, Sweden, and Finland, where taxpayers absorb the burden of tuition, college completion rates differ very little from those in the United States. Germany and Denmark are exceptions, but the U.S. still leads in the percentage of unemployed college graduates. These other nations also outperform the U.S. in the percentage of college students graduating in the fields of science, computer science, engineering, and mathematics, according to research from 2011. To be clear, though, I am not advocating for free tuition in this post. Even if it is true that free education loses its worth, that is no argument for the exorbitant tuition rates we see at many American universities.

I believe there is devaluing of higher education already going on in our nation from a diverse assortment of sources. Many of us are aware of the pervasive anti-intellectualism that has been growing throughout the country, covered thoroughly by the likes of Chris Mooney and Susan Jacoby. Donald Trump's presidential campaign is merely the latest incarnation of this distinctly homegrown 'patriotic' flavor of brash ignorance. In the eyes of some Americans, it's as if the First Amendment guarantees their right to resist even the slightest semblance of critical thought regarding their own beliefs. But are these folks the culprits or the consequences of the decline in education?

It's all too easy to attach blame to the loudest and most belligerent of voices, which provide a useful distraction from the systemic problems that are not as visible. What does it say about us as a nation when we define the value of an education by expense alone? Sure, it's a very capitalist thing to do, but it's pretty questionable if this is the best way to assess an education, particularly when, as noted, the U.S. finds itself with many unemployed graduates. What good is an expensive education if you're not seeing a return on your investment? There is research showing that a measly 27% of graduates are working a job in their major, and, even worse, the majority of graduates work jobs that don't require a degree at all. This may be one reason why college enrollment has been declining over the last several years.

So why not save your money and opt out of college? Well, college graduates still earn more than non-graduates even in jobs that are outside their field. With the cost of living being what it is, this is a strong motivation to obtain a degree. And if you want that degree, you'll likely find yourself in a situation like I am now in, being forced into debt. I've applied for and received various grants, even won a scholarship, but at best these solutions cover just a fraction of tuition. "Change your major, then," I hear some people say, "and go where the money is." But this brings us back again to the big question. Is monetary value really the most important thing about higher education?

Part of why I haven't consistently attended college up until the past two years or so is that I used to place such an emphasis on income. When I began school, I was fortunate enough to have parents who paid for it and let me live at home. But I still had to find money for car repairs, for gas to get to school, for food when I would go out, and for really any social activities. I worked part-time jobs to make that happen, though these jobs were often surprisingly demanding of their employees. The food service industry is well known for having very high turnover, and retailers are trying to compensate for a rising crisis of their own. Turnover isn't necessarily high because these people are finding better jobs, and even in cases like mine, a better-paying job can severely limit one's options for college participation. I told myself that education was for finding a well-paying career, so I switched majors a few times, searching for that one lucrative employment field that wouldn't feel miserable.

The nasty reality is that many high-paying positions carry with them a high amount of stress, and consequently a high amount of unhappiness. Some disturbing facts that aren't always talked about in conversations on career planning are the high suicide rates of physicians and lawyers, the massive debt incurred by dentists, and the astronomical work hours of investment bankers. We imagine a great salary will relieve us of the problems in our lives - many of them financial - but it's still true, if not terribly cliche, that more money can produce more problems.

The price tag on college and the focus on schooling as a means to a certain standard of living are devaluing higher education in the United States. They are a burden not only on students, but on parents and families, and they affect our communities and culture as a whole. I've already mentioned some of the ways this happens, but let me mention one more. Debt has become practically synonymous with the American dream. Several of the big things we recognize as important to living the American lifestyle are almost unattainable now without loans. Cars, homes, and college are the most familiar. These purchases often involve credit checks, too, and credit is itself a form of enforced debt meant to show personal "responsibility." Credit cards have replaced cash in many transactions. It's true that there are other options to some of these things, like used cars and renting a room in a house, but there are drawbacks in certain cases, and these are generally treated as temporary options. That our economy is increasingly pushing us towards a greater reliance on debt is just about indisputable, though.

It's not hard to see why, either, since debt enables people and businesses to make easy money without the need to produce some actual, tangible product that can deteriorate or change hands. A lot of people do not read the fine print, don't understand tricky interest rates, and may otherwise lack the necessary education for making an informed decision. Debt is great for business, and it preys especially on the disadvantaged, as was quite evident during the mortgage crisis that began in 2007. Considering that 8 in 10 Americans are in debt, and that mortgages are the most common kind of debt, dismissing all this as an instance of "buyer beware" seems both hypocritical and naive.

We live in a country that many believe is founded on Christian values. Were this true, it seems like there would be more believers upset at how we conduct business in the U.S. In the gospels, Jesus makes numerous references to debt, and specifically to debt forgiveness.

Then the master called the servant in. ‘You wicked servant,’ he said, ‘I canceled all that debt of yours because you begged me to. Shouldn’t you have had mercy on your fellow servant just as I had on you?’ In anger his master handed him over to the jailers to be tortured, until he should pay back all he owed.
Matthew 18:32-34

But love your enemies, do good to them, and lend to them without expecting to get anything back.
Luke 6:35

Two people owed money to a certain moneylender. One owed him five hundred denarii, and the other fifty. Neither of them had the money to pay him back, so he forgave the debts of both.
Luke 7:41-42

Perhaps most famously, the Lord's Prayer in Matthew 6:9-12 has the line asking God to "forgive us our debts, as we also have forgiven our debtors."

But these passages are speaking of spiritual debt, right? Actually, that's not as clear as some would like it to be. The Greek word for "debts" in Matthew 6:12, for example, is used in other ancient Greek texts in basically the literal sense of financial obligation. Another passage, Mark 12:17, where Jesus delivers his renowned "render unto Caesar" line, is regarded by plenty of scholars as a statement on taxation, which suggests that Jesus likely was putting forward some teachings with political and financial ramifications. Even if we go with the spiritual debt interpretation, it's odd to view this idea in a way that excludes extending that spirit of forgiveness to material matters as well as immaterial ones.

A lot of Christians do recognize debt as a problem for the debtor, and this should provide all the more reason to denounce economic practices that force people into debt. However, our "Christian nation" seems much less bothered by this institutionalization of sin than it is by, say, the alleged sanctity of marriage.

Debt is something we have grown accustomed to in America. I'm not interested in arguing that all debt is equally bad, or that no debt should exist, but isn't there something especially insidious about tremendously burdensome debt that's required for an education - to have the tools to get by in the workplace, in society, and in a culture of debt? I think so. I think it creates tension in families and produces stress in many students, I think it drives a lot of people away from college, and I think it highlights just how deep the for-profit plague goes in the United States.

I've known some adults who love to remind young people that "education is its own reward." What they appear to mean by this isn't so much that we should all see education as a good in itself as it is that we should quit whining and work hard no matter the situation. I do tend to think education is its own reward, but that's also why I'm opposed to excessive tuition and excessive student debt. And it's why I feel like another contributing influence to the devaluation of higher education is the workforce.

Now, don't get me wrong: a lot of employers offer benefits that do help aspiring students. But not all employers do, and their benefits often come with strings attached. 54% of employers offer undergraduate tuition assistance, according to a 2014 study. Yet some of those 54% want you to attend school at online universities with which they have partnered. Some won't pay for a class until after you've completed it, and some will only pay the full amount if you make a certain grade. Many employers specify an acceptable area of study and will not provide assistance to other majors. Then there is the issue of employers expecting strenuous hours from students. Research has shown that a 10-15 hour work week is ideal for student engagement, but there are few businesses willing to employ staff for those hours, let alone at any kind of reasonable wage.

Even worse, there is a significant divergence between how well graduates think college has prepared them for the world and how well-prepared employers think they are. Whether one sides with the students or with the businesses here, this calls attention to a potential problem of prejudice in how some major companies perceive college and the college-educated. Interestingly, a national poll has found similar dissatisfaction among the public with college preparation for the workforce. There seems to be a distrust of higher education that leads some employers to adopt strict limitations on the benefits they give students, while other employers simply offer no benefits.

I was once told by a former supervisor that my primary commitment had to be to the job, not to college. This didn't come from any unprofessional behavior on my part, but just from attempting to negotiate more cooperation with my employer about working hours. What astounds me about it is that some businesses don't seem to have an inkling of how the stresses of a full-time job and full-time schooling affect job performance when little provision is made with a person's health in mind. Getting the hours and the work out of you is always the priority. Apparently a lot of companies give HR benefits mainly for PR reasons.

Universities do not exist to pump more bodies into the job market. The goal of higher education should be to produce educated and upstanding citizens. It can be debated if modern colleges serve this goal, but the expectation and insistence that education's most important purpose is to solve unemployment is another way in which education is being devalued in our society. Why is it not obvious to people that employers have a vested interest in implicating virtually everyone but themselves in the cause of unemployment? Let's not talk about the minimum wage, the obsolete 40-hour work week in the Information Age, the absurd Facebook-scouring anti-privacy practices of certain employers, the shoddy health benefits provided by many companies, or any number of the long list of problems that point to the real bottom-line priorities of countless businesses. Let's point the finger elsewhere, please. Of course, no amount of diligent preparation will help much if the fault is actually on the employer's end.

There is definitely a suspicion of institutions of higher learning, though. Americans have long had a distrust of elitism, even when it's more perceived than real. A 2015 Gallup poll shows that our nation places greater confidence in the military, in small business, in the police, in organized religion, in the presidency, and even in the Supreme Court than it places in public schools. This does include K-12 schools, but the report still exposes the overall distrust in public education, which has grown about threefold since the early 1970s. Another study that is specifically centered on higher education notes that while most Americans do have hope for college, the vast majority think universities should change to meet today's needs.

Anti-intellectualism is not necessarily incompatible with some of these attitudes towards college, either. Bringing education "down to earth," to something practical like finding a good job, may be somewhat of an anti-intellectual approach depending on the context. Certain majors have evoked disdain and ridicule as if they are degrees detached from reality, guilty of head-in-the-clouds thinking. I know this from personal experience as a Philosophy student, and I've heard majors in Psychology, Music, Sociology, and many other fields receive similar criticism. Some people act as if there's no point in learning anything that won't put food on the table or money in your bank account. But this is a value, an ideal, and there's a strong case to be made that part of the purpose of education should be to examine and challenge the values we are raised with.

Politicians play into the distrust of higher education, too. Scott Walker proposed a $300 million budget cut for the University of Wisconsin System last year, and Governor Matt Bevin trimmed 4.5% of the budget for Kentucky's public universities, promising to go as high as 9% in the following two years. Louisiana is in such bad shape that many colleges there are being forced to consider privatization in order to keep their accreditation. The cuts frequently affect Liberal Arts programs and involve reasoning precisely like that just described, claiming to be removing funding in the name of serving economic needs.

To be fair, there are legitimate criticisms of higher education. There remains widespread inequality of access among minority groups, some professors indoctrinate rather than educate, and retention rates are often unimpressive. I don't have any simple solution to the problems mentioned in this article. These are complicated issues, and I know that there are smart and well-intentioned people working on them at this very moment. I do think there are wrong ideas about higher education, however, and I've tried to explain and support my thoughts on this. They may not be convincing to everyone, and someone will likely accuse me of having a "liberal bias" merely for not taking every available opportunity to trash their personal definition of liberalism, but this is at least one frustrated (but hardworking) student's observations.

I value my education. I value it more than the cost of tuition, or the salary it might one day provide me. I value the help I've received from my parents, from the rest of my family, from the schools I've attended, from the grants and scholarships I've received, and from the jobs I've worked. I believe the more we turn education into an instrumental value in this country - a means to some other end - the more it loses its lasting value, the more its power to inspire is diminished, and the worse off it will be. This is not some ideological prediction, it's something we're witnessing now and have been watching unfold over the last few decades. Because of all this, I think the rant is worth making.

Thursday, April 14, 2016

The Minimal Facts for the Resurrection: Why a Skeptic Doesn't Believe

The following is an excerpt from my critical review of Rice Broocks' Man, Myth, Messiah: Answering History's Greatest Question. Rice Broocks is a Christian apologist and the man who has inspired the God's Not Dead films. Man, Myth, Messiah is his 'companion' book to God's Not Dead 2.
_________________________________________________________________________________

Since 1975, Gary Habermas has been cataloging scholarly sources on the resurrection of Christ to establish certain trends, or ‘minimal facts’, accepted by most historians. In Man, Myth, Messiah, the number of these sources is given as “more than 2,200,” pulled from the 2007 book The Case for the Real Jesus. Just two years earlier, in a paper published in the Journal for the Study of the Historical Jesus, Professor Habermas numbered his sources at “more than 1400”.1 In the three decades Habermas spent compiling those initial 1400, he averaged around 47 publications a year. Yet afterwards, in a mere two years, he managed to survey a whopping 800 additional sources for his list. Of course, some may point out the qualified use of “more than” in both of the total figures, but this ambiguity actually exposes a general problem with Habermas’ research. As Richard Carrier has noted, Habermas has not released his data – which is already quite selective in its reliance on sources written only in English, German, and French – and so the trends he extrapolates from it are greatly presumptive.2

Against this flawed backdrop, we come to the alleged facts in chapter two that “even skeptics believe.” These facts are built on solid historical criteria, according to Broocks, like multiple and independent attestation, a close proximity to the events in question, and the presence of details too embarrassing to have been invented. With the exception of the last criterion, I find this standard reasonable. Embarrassment is a sticky issue in many ways, particularly because of how it rests on judgments about the sorts of things that would’ve been contrary to the purposes of an author living in the very distant past. Perhaps in some cases where we have a good deal of information on the norms in a given society, it can be plausible to make an argument from embarrassment as a supplementary defense of historicity, but even then there are challenging questions about individual attitudes and ‘hierarchies’ of tolerable to intolerable embarrassments.

Before laying out his first minimal fact, our author sets his sights on Jesus mythicism. Denying the historical existence of Jesus is a “pop culture”, “blogosphere” thing, a “tabloid” level absurdity, says Rice, while suggesting a visit to Jerusalem would sway most rational minds. “And you don’t need a scholar or historian. Any tour guide can set you straight.” Although I am not a mythicist, I have to admit I find ridiculing mythicism to be unproductive as well as uncharitable. Broocks aspires to always be prepared to give an answer for his faith with gentleness and respect, per 1 Peter 3:15-16, but on more than one occasion he opts instead for resorting to strawmen and ad hominem attacks on his opponents. “The real motivation for skeptics to deny that Jesus really lived is not a lack of evidence,” he claims. “They often desire to attack Christianity in any way possible because of the evil perpetrated by self-proclaimed Christians.” (p. 28) Claims like these, whether or not they’re true of some mythicists, seem spectacularly inadequate at dealing with mythicists like New Testament scholar Robert Price, Dominican priest Thomas L. Brodie, or historian Richard Carrier.

Fact #1: He Was Crucified

Historical sources are even part of the supporting case for the first minimal fact, making it especially unnecessary to wage such a verbal war on mythicism. Josephus, Tacitus, Lucian, and the Talmud are cited as evidence for the crucifixion of Jesus, and all have been used to endorse historicity, too. While there are issues with each of these sources that leave them open to objections, I think at least the first two are fairly reliable, for the same reasons I gave in my review of God’s Not Dead. It’s worth stating that this first fact, crucifixion, is really not an argument for the resurrection in itself, but more of a precondition for it. Naturally, it could be that Christ was crucified and remained dead after; the crucifixion is part of the minimal facts case mainly to deter the objection that Jesus appeared alive later because he had never actually died. Since I don’t make that objection, I will not offer a critique of the first fact.

Fact #2: His Tomb Was Found Empty

The second minimal fact is the discovery of Jesus’ empty tomb by a group of his women followers. Saying that all four gospels depict women as the first to arrive at the tomb, Broocks notes that the testimony of women was “usually dismissed in ancient trials. So no first-century author would have ever made the story up.” (p. 31) Here is an example of the embarrassment criterion in action. The unstated assumption is that women were so distrusted in those days that their presence in the resurrection narrative, when they could’ve been omitted or replaced, makes the story more likely to be true. However, even the historian Josephus hung his entire accounts of the incidents at Gamala and Masada on the testimony of women.3 The fact that Rice is careful to say that “usually” the testimony of women was dismissed is also important. If there were instances in which women were treated as reliable sources – including by one of the most prominent historians of the era – then why should we think women in the resurrection story were too embarrassing a detail for the empty tomb to have been made up?

Another argument made in favor of this alleged fact is that the Roman and Jewish authorities could easily have squashed the Christian movement by producing the body of Jesus. Since they did no such thing, it must have been because the body was missing. Again, though, this is quite an assumption. The New Testament itself claims that the disciples did not begin preaching the risen Christ until about seven weeks after the crucifixion (see Acts 2), at which point the corpse was likely decayed beyond recognition. Add to that the small size of the early Christian sect, as well as the fact that the earliest Christian writings come 20-25 years after the death of Jesus, and it just doesn’t seem the Romans or Jews would have had the motive to hound Christians over what they were not exactly forthcoming with in the first place.

Skeptics of the empty tomb have often claimed it is unlikely that Jesus would have received a proper burial. In what may be one of his stronger counter-arguments in the chapter, Broocks responds to this objection by contending that leaving the body on the cross would have violated Roman laws urging respect for occupied peoples. “Jewish law expressly commanded bodies of the condemned be buried so that the land would not be defiled,” he states on page 32. Supporting these claims are two sources: Josephus and the Digesta Iustiniani.

As Leonard Rutgers explains, Josephus mentions certain religious freedoms the Romans did extend to the Jews, such as to “gather freely in thiasoi, observe the Sabbath and the Jewish festivals, send money to the Temple in Jerusalem, and enjoy autonomy in their communal affairs,” as well as being “absolved from compulsory enrollment in the Roman military.”4 But to call the Romans tolerant of Jewish customs would seem to be a step too far. Rutgers goes on to say that, “Roman laws of the first century C.E. that relate to Jews give the impression that tolerance or intolerance was nothing but a by-product in the formulation of a given policy. Conscious efforts to be tolerant or intolerant do not seem to have been frequently made.” Indication of this even comes from Josephus, who notes in book 2, chapter 9 of The Jewish War how Pilate spent money from the sacred treasury to build an aqueduct, and then sent undercover soldiers to disrupt the mob of protest that ensued.

In book 48, title 24 of the Digesta Iustiniani (Digest of Justinian), we read: “The bodies of persons who have been punished should be given to whoever requests them for the purpose of burial.” Broocks cites New Testament scholar Craig Evans saying that burial would have been expected in the time of Jesus. In a paper commenting on the Digesta, Evans notes that most of the text is drawn from Roman jurist Ulpian, who lived from about 170-223 C.E. “Ulpian,” writes Evans, “goes on to say that ‘the bodies of those who have been punished are only buried when this has been requested and permission granted’. A statement in the lex Puteolana (at II.13) gives the impression that Romans, as did Jews in Israel, had burial pits reserved for criminals and others buried without honor.”5 Evans refers to a book by J.G. Cook that discusses the lex Puteolana. “Some of the corpses were denied burial,” Cook remarks, “apparently at the discretion of the magistrate,” and common burial pits “‘were in use already in the second century BC.'”6 Cook and Evans both mention a particular passage in the Digest that specifically states that permission for burial is not always given, “especially where persons have been convicted of high treason.” (48.24.1). Evans argues in his essay that the mention of treason does not apply to Jesus, but the passage appears to give treason as an extreme example rather than the only exception.

Bart Ehrman names a number of historical sources describing how crucifixion victims were often left to rot on the cross:

The Roman author Horace says in one of his letters that a slave was claiming to have done nothing wrong, to which his master replied, “You shall not therefore feed the carrion crows on the cross” (Epistle 1.16.46-48)… Artemidorus, writes that it is auspicious for a poor man in particular to have a dream about being crucified, since “a crucified man is raised high and his substance is sufficient to keep many birds” (Dream Book 2.53)… there is a bit of gallows humor in the Satyricon of Petronius, a one-time advisor to the emperor Nero, about a crucified victim being left for days on the cross (chaps. 11-12).7

There are a few important things to take from all this. First, there is reasonable doubt that Roman officials in the first century respected Jewish practices and beliefs as a matter of habit. We have seen both scholarly argument and a historical example for this, and it is perhaps further instructive to consider the Jewish-Roman war that arose just a little over three decades after the purported death of Jesus. Second, while it seems that some crucifixion victims were allowed to be buried in special cases, others were denied burial. Although it’s not entirely clear what all the circumstances were that could lead to a denial, the “stereotyped picture that the crucified victim served as food for wild beasts and birds of prey,” as conservative Christian Martin Hengel once remarked,8 suggests that being left on the cross was not a punishment reserved for only the worst of traitors. As a third point, there is a lack of clarity in this material about what kind of burial was allowed in which cases. It’s fair to assume that since giving the body to “relatives” is mentioned by Ulpian, the relatives would likely bury their beloved in a family tomb or something of the sort. Yet when the body is that of a troublemaker or perceived criminal who supposedly had a lot of enemies among the Jewish leaders, the law in Deuteronomy 21:22-23 could have been respected and the Sabbath could have been honored simply by burial in a common grave. Since Pastor Broocks’ main objection to the improper burial thesis is Roman respect for Jewish law, this possibility, conceded by both Evans and Cook, poses a significant problem.

Surprisingly, the “unanimous” early church tradition on the site of Christ’s grave is another supporting argument made in defense of the empty tomb. “Custom required Jesus to be buried outside the walls,” Mr. Broocks states, “so the tradition for the site’s location had to go back to within ten years of the resurrection.” (p. 32) The Church of the Holy Sepulchre is the earliest known site to be identified with the tomb of Jesus. Eusebius reports in his Life of Constantine that the Romans had built a pagan temple over the tomb to “obscure the truth.” Under Constantine, the temple was demolished and replaced by a church. Constantine’s own mother allegedly found the “true cross,” which proved its power by restoring a corpse to life. Curiously, though, there is no evidence prior to the 4th century linking the location to the resurrection story. In her book Christians and the Holy Places: The Myth of Jewish-Christian Origins, historian Joan E. Taylor argues that Constantine chose the site as part of his campaign to Christianize paganism, and that the temple he demolished was never constructed to conceal the tomb of Christ. The absence of early veneration of any alleged site of Jesus’ grave, especially given Paul’s trip to Jerusalem, is a strong argument against the empty tomb legend.

It’s worth noting that Gary Habermas does not include the empty tomb among his minimal facts because a quarter of the scholars he surveyed are skeptical of it, which Rice notes as well. Our author tries to dismiss the divergence: “This drop is likely due to the profound implication of an empty tomb. If Jesus were buried after His death, then the empty tomb would be a decisive additional piece of evidence for the disciples encountering a physical Jesus.” (p. 31) However, we have just seen numerous reasons, all grounded in historical evidence, why the empty tomb is a questionable ‘fact.’ In addition, if the empty tomb is so critical to the Christian faith, then the very same reasoning could be turned against Broocks and other believers to suggest that bias is why they favor an empty tomb.

Fact #3: His Disciples Believed He Appeared to Them

As certain as Christ’s crucifixion, Broocks says, is the fact that his followers had experiences of him after his death. We find these appearances mostly in Paul and John, but Acts is also included with the caveat that the historical reliability of the text is disputed. 1 Corinthians 15:3-8, often regarded as an early Christian creed by scholars, reads:

For what I received I passed on to you as of first importance: that Christ died for our sins according to the Scriptures, that he was buried, that he was raised on the third day according to the Scriptures, and that he appeared to Cephas, and then to the Twelve. After that, he appeared to more than five hundred of the brothers and sisters at the same time, most of whom are still living, though some have fallen asleep. Then he appeared to James, then to all the apostles, and last of all he appeared to me also, as to one abnormally born.

Rice calls this a “credible list” of witnesses, and makes special mention of how Paul and James were skeptics before their conversions. What exactly about this list is credible? The appearances to the five hundred are frequently talked about by apologists as if we have five hundred eyewitness accounts, when all we really have is this one account saying that five hundred people saw the risen Jesus. No gospel or other early Christian document tells of these unnamed, mysterious five hundred. Most of the individuals identified in this list have left us no first-hand account of their experiences. Notice also that there is nothing said about women being the first to find the tomb – in fact, a tomb isn’t even mentioned at all.

There is an unexplained dissimilarity between the experiences of Paul and James. The story of Paul’s conversion is that he was a Jew persecuting Christians up until his vision on the road to Damascus. Thus, Paul was a skeptic converted by an appearance. James, on the other hand, is considered a skeptic merely because of biblical references to divisions in Jesus’ family (e.g. Mark 3:21, John 7:5), and we are given no information about when James became a believer – whether before or after the alleged appearance discussed in 1 Corinthians 15. Christian scholar James F. McGrath shares this view, explaining that “even if there were antagonism or otherwise soured relations between Jesus and James, this does not in any way lead to the conclusion that the estrangement lasted until Jesus’ death.”9 This matters because, as apologists like Broocks assert, the conversion of a skeptic by a post-resurrection appearance is more surprising than a devout believer’s report of witnessing a miracle.

So what about Paul? The vision described in Acts stands out in some ways from the other appearances. Paul hears a voice and sees a light so blinding that he falls to the ground and loses his sight. In Acts 9, his companions hear a voice but see nothing; in Acts 22, they see the light but don’t hear a voice. What’s odd about labeling this a postmortem appearance is that Paul had never met Jesus while Jesus was alive, and in his vision Paul doesn’t see Jesus – only a bright light. He just knows (or assumes) it’s Jesus based on the voice. The experience sounds much like a hallucination, and what’s stranger still is that the early creed does not distinguish it in any way from the other supposed appearances.

Could multiple people have hallucinated the same thing, or something quite like it? Broocks declares the Christian message “is not based on some corporate self-delusion triggered by the disciples’ grief over having lost their beloved leader; such a scenario would have required a much longer period of time to develop.” (p. 38) But why think this? 

On the hallucination theory, philosopher Keith Parsons writes:

In fact, the article “Hallucinations” in the second edition of the Encyclopedia of Psychology, says that 1/8 to 2/3 of the normal population experiences waking hallucinations… Causes of hallucinations in normal persons include social isolation, rejection, and severe reactive depression. The disciples were very likely to be experiencing a strong sense of rejection, isolation, and depression after the execution of Jesus. Further, it is very common for the bereaved to experience visual or auditory hallucinations of their deceased loved ones.10

Nor do all hallucinations require a long time to develop, as Matthew McCormick explains.

When people lose someone they love, it is quite common for them to have hallucinations of the person (or even a pet) shortly after the loss. The phenomenon is now well documented and is known as a bereavement hallucination. In one study, a remarkable 80 percent of elderly widows reported having hallucinations – either visual or auditory – up to one month after the spouse had died… And these are not just fleeting glimpses or vague feelings that these widows and widowers are experiencing. They report seeing or hearing the lost person in some familiar environment, being visited in their dreams, or having complete conversations with them while being wide awake.11

We’ve already seen that not all experiences of the risen Jesus are equal. Paul’s vision in Acts is very different from the appearances in John 20. Other appearances – to the five hundred, to James, or to unspecified apostles – are so devoid of description that imagining what those experiences might have been like would be sheer speculation. Worse yet, since Paul is thought to have relayed an early creed concerning appearances to some of the same people we find in the gospels, this creed raises doubts about whether our sources are truly independent. Perhaps John and Acts relied on the same material Paul relied on. Given this ambiguity, there just don’t seem to be any reasonable grounds for claiming that the postmortem appearances were shared by so many people that hallucination is out of the question. To make that argument, we would need more and better evidence for the array of alleged experiences.

Fact #4: Proclaimed Early

For the fourth fact, Broocks offers the earliness of the preaching of the resurrection. Because “creeds require time to become standardized, the original teaching had to have originated years earlier” (p. 37). Habermas is cited as claiming that such teaching must go back to within five years of the death of Jesus. This is said to be a consensus view even among critical scholars, but we have previously seen the flaws in the survey approach used in Habermas’ resurrection research. Nonetheless, if we assume that the resurrection was preached this early after the crucifixion – which I am actually willing to grant – is this a fact supporting the historical reality of Christ’s resurrection?

This is where the trouble with assessing miracles through historical method becomes especially apparent. The reports of Joseph Smith’s vision of the angel Moroni date from very close to the time he supposedly had the vision. Likewise, as Matt McCormick argues, there is substantial evidence surrounding the Salem witch trials:

…hundreds of people were involved in concluding that some of the accused were witches. Eyewitnesses testified in courts, signed sworn affidavits, and demonstrated their utter conviction that those on trial were witches. Furthermore, the accusers came from diverse backgrounds and social strata, including magistrates, judges, the governor of Massachusetts, respected members of the community, husbands of the accused, and so on.

…The trials were part of a thorough, careful, and exhaustive investigation. The investigators deliberately gathered evidence and made a substantial attempt to view it objectively and separate truths from falsehoods, mistakes, and lies. In the court trials, they took great care to discern the facts. The accusers must have become convinced by their evidence…

The witch trials were historically recent, so we have hundreds of the actual documents that were part of the evidence. We have the signed, sworn testimonies of the eyewitnesses claiming to have seen the magic performed – not as it was repeated and relayed for decades to others, but immediately after it occurred. We have whole volumes written by witnesses to the trials, such as those by Cotton Mather and John Hale.12

Should we then believe Joseph Smith really was a prophet, or that those convicted in the witch trials really were witches? I should say not. The reason why involves a lot of what has already been covered. What we know (or don’t know) of those reporting the event, of the time and place in which they lived, and of the subsequent developments and advances in our general knowledge has to play a significant role in our approach, beyond a basic consideration of criteria like multiple and independent attestation, closeness in time, or embarrassment.

Further Facts

The chapter presents additional facts that have, in one way or another, already been touched on at this point. These involve Paul, James, the growth of the early church, and the baptism of Jesus by John the Baptist. To say a brief word on the latter two: the absence of any figures or statistics on the growth of the church makes such a ‘fact’ indefensible, and the purportedly embarrassing nature of Jesus’ baptism hangs on the incredibly thin supposition that it “could” be seen as implying the superiority of John. In my review of God’s Not Dead, I mention a study by Keith Hopkins which argues that Christians composed only 10% of the Roman population by the year 300. If the early church exploded in the miraculous way many Christian apologists claim it did, these are the kinds of studies that need to be produced to substantiate that claim. As for John and Jesus, Mark 1:7 effectively eviscerates any notion of embarrassment: “And this was [John’s] message: ‘After me comes the one more powerful than I, the straps of whose sandals I am not worthy to stoop down and untie.'”

As a conclusion to my (very long) review of this chapter, I want to make one last argument regarding these minimal facts – one which I believe greatly reduces their persuasiveness on top of all that’s been said so far. Throughout the chapter, Pastor Broocks makes frequent note of how “all four gospels” mention the crucifixion, the women at the tomb, Joseph of Arimathea requesting the body, John baptizing Jesus, and “supernatural confirmations of Jesus’ ministry” (p. 30, 31, 41). These remarks are misleading in that they give the impression that such details are independently and multiply attested by more sources than is likely the case. The Two-Source Hypothesis in New Testament scholarship, the consensus view among even most conservative scholars, holds that Mark was a primary source for both Matthew and Luke. This is even addressed somewhat in the very next chapter of the book, and it is a little suspicious that something so obviously pertinent to the historical criteria is placed after the minimal facts case. The importance of this is that something which appears in all four gospels may really be independently and multiply attested in only two gospels once parallel passages are taken into account.

Let’s take the women at the tomb as our example. After stating that this detail is found in all four gospels, Rice says, “This fact is significant because the testimony of women was usually dismissed in ancient trials.” The significance the author sees here is not just the reporting of women at the tomb, but clearly also the reporting of women at the tomb in four sources. Yet when we look at Mark 16:1-8, Matthew 28:1-8, and Luke 24:1-12, we find a number of similarities and parallels, from the two Marys to the fear of the women to the presence of men/angels (one in Mark) in white clothing, and more. We even find some plausible spots where the authors of Matthew and Luke changed Mark’s text, such as Matthew 28:8, which adds that the women were not just afraid but “filled with joy,” and so ran to tell the disciples what they’d found – quite an improvement over Mark’s original ending, where the women “said nothing to anyone, because they were afraid.” None of these details occurs in John’s gospel. This illustrates how Matthew and Luke relied on Mark, and it reduces the attestation for the women at the tomb from four sources to two. If Helmut Koester is right, though, that Mark and John shared a passion narrative source also represented in the Gospel of Peter, then the evidence for women at the tomb comes down to a lonely single attestation.13

When we move outside the realm of guesswork based on a questionable survey and engage the problems and arguments that historians actually deal with, the picture becomes far more complicated with respect to which sources are reliable and for what reasons. The minimal facts case faces objections not only from a methodological perspective, for inferring a supernatural explanation from historical data, but also from an evidential one.



Sources:
1. Gary R. Habermas, Resurrection Research from 1975 to the Present, GaryHabermas.com (2005). Retrieved April 12, 2016.
2. Richard Carrier, Innumeracy: A Fault to Fix, Freethought Blogs (Nov. 26, 2013). Retrieved April 12, 2016.
3. Josephus, Jewish War 4.81, 7.399.
4. Leonard Rutgers, "Roman Policy towards the Jews: Expulsions from the City of Rome during the First Century C.E.," Classical Antiquity, Vol. 13, No. 1 (Apr. 1994), p. 57.
5. Craig Evans, "Roman Law and the Burial of Jesus," Matthew and Mark Across Perspectives (Bloomsbury, 2016), ed. Kristian Bendoraitis and Nijay Gupta, p. 57.
6. John G. Cook, Crucifixion in the Mediterranean World (Mohr Siebeck, 2014), p. 385-386.
7. Bart D. Ehrman, How Jesus Became God (HarperCollins, 2014), p. 158.
8. Ibid., p. 158.
9. James F. McGrath, Early Converted Skeptics?, Exploring Our Matrix (Aug. 7, 2009). Retrieved April 14, 2016.
10. Keith Parsons, in The Empty Tomb: Jesus Beyond the Grave (Prometheus, 2005), ed. Jeffery Jay Lowder and Robert M. Price, p. 441.
11. Matthew S. McCormick, Atheism and the Case Against Christ (Prometheus, 2012), p. 84-85.
12. Ibid., p. 58-59.
13. Helmut Koester, Ancient Christian Gospels (Trinity Press, 1990), p. 253-255. 

Tuesday, January 26, 2016

Walter Kaufmann on Courtroom-Style Religious Apologetics

If the smash-hit success of the Making a Murderer series on Netflix is a testament to anything, it might be that everyone loves a good courtroom drama. Over the course of almost any criminal trial, there is suspense, intrigue, and excitement as each side builds its case and new evidence is presented, eventually leading up to a verdict. Depending on the circumstances involved, many such trials can also be deeply emotional, eliciting anger, disgust, sadness, or sometimes joy even in people not in any way affiliated with the case. So-called "trials of the century," like those of Charles Manson or O.J. Simpson, have garnered massive public attention in modern times, thanks largely to press coverage. Skilled attorneys, unexpected discoveries, and undecided jurors help to make some trials into gripping roller coaster rides of anticipation.

Christian apologists have published a number of best-selling books modeled on this format, most notably Josh McDowell's Evidence That Demands a Verdict (1972) and Lee Strobel's The Case for Christ (1998). More recent is Cold Case Christianity (2013), written by homicide detective J. Warner Wallace, as well as the upcoming film God's Not Dead 2, which teases a "court case" showdown that threatens to "expel God from the classroom." The idea of defending the faith in a legal setting even goes back to Jesus himself, who defends his ministry before the Jewish and Roman authorities in John 18.

But how fruitful, really, is this approach in attempting to justify the truth of Christianity? Part of its appeal is likely that it tries to reduce bias by working on a more neutral ground of debate – a secular ground, arguably. Another part of the appeal is that it seems to allow for an evaluative contrast: the case is made so strongly, by such demanding standards, that we ought to believe it tells the truth. This is what Lee Strobel implies by noting in the final chapter of his book, "I had seen defendants carted off to the death chamber on much less convincing proof!"[1] If Christ comes out favorably even by the high standards of the same justice system to which we entrust countless human lives, then shouldn't we trust Christ?

Years before the publication of the aforementioned texts, philosopher Walter Kaufmann offered an insightful critique of this particular apologetic style:

An attitude often encountered among religious people and exemplified professionally by a great many preachers and theologians is that of the counsel for the defense. Here is an attitude toward truth quite different from the scientist's or the historian's, but no less methodical and disciplined and moral. Only it is governed by a different morality.

In many countries the counsel for the defense is expected to use all his ingenuity as well as passionate appeals to the emotions to gain credence for a predetermined conclusion - namely, that his client is innocent. He may ignore some of the evidence if he can get away with it, and he is under no obligation to carry out investigations which are likely to discredit his conclusion. If, after all that, he cannot convince the jury of the truth of his position, he will saddle his opponent with the burden of disproof; and if necessary he will rest content with a reasonable doubt that his position might be true.

Common though this attitude is toward religion, it is indefensible outside the courtroom, and it does not indicate a second type of truth.

In the first place, some unusual conditions obtain in the courts where this attitude is legitimate. The very fact of the indictment creates some presumption, psychologically, that the accused is guilty. Then, the prosecutor is an official of the government and aided by its vast resources, ranging all the way from its prestige to its police. Against such formidable odds the defense requires a handicap; and that is one reason why it is conceded the liberties that have been mentioned. In the case of religion, the situation is more nearly the opposite. Its advocates are aided by the government's prestige and by voluble testimony from officeholders and would-be officeholders; and the case for all kinds of religious propositions is proclaimed not only from the pulpits but in our most popular magazines, too, and in the press, and over radio and television, while the case against these propositions never gets a comparable hearing. If the courtroom analogy could be extended to the case of religion, the prerogatives mentioned should be granted to its critics to redress the balance.

Secondly, a jury is not asked to come up with the most likely story or even the most likely culprit. The jury is confronted with a single suspect, and truth is not the highest consideration. Better let two guilty men go free than punish one who is innocent.

Suppose that the major philosophic positions were haled into court, one at a time, each defended by a brilliant advocate. Surely, these attorneys - it could even be the same lawyer every time - would succeed time and again in raising a reasonable doubt in the mind of the jury that the position might be true. The attorney might not even have to try very hard if the prosecution were under pressure to pull its punches, as it is in the case of religion. Position after position would be acquitted. But such acquittal of a philosophy or a religion creates no presumption whatsoever that the position is probably true. In the end, those who care for a considered choice would still have the whole field to choose from.

We have here two different attitudes toward truth, but not two different types of truth. The second attitude, unlike the first, subordinates questions of truth to other questions of a moral kind. In fact, it might be argued that the verdict of the jury, "We find the accused not guilty," is not so much a determination of fact as it is a deceptively phrased recommendation for action. In line with this, the records show that when juries know that a finding of "guilty" makes the death penalty mandatory they will find the accused guilty much less often.

There is no need here to distinguish legal truth from other kinds of truth: such a distinction only prompts confusion. Consider a case that happens occasionally: some of the evidence against the accused has been obtained illegally or was not legally admissible in court, and the judge therefore directs the jury to find the accused not guilty. There is no point whatsoever here in introducing any conflict between types of truth. Clearly, the truth is in this case subordinated to respect for civil rights. And the situation can be explained perfectly in terms of the one and only kind of truth we have encountered so far.

"Guilty" and "not guilty" are, in the mouth of a jury, elliptical expressions which are only apparently identical with these phrases in other contexts. In a verdict they mean "proved guilty (or not proved guilty) in accordance with the special rules of evidence and argument that govern court procedure." Thus the accused may well be guilty in the ordinary sense but not guilty in this more restricted sense.

A jury operates under unusual conditions and is not expected to decide more than the special question whether the accused has been proved guilty in accordance with a certain set of rules. Neither the jury's attitude nor that of the counsel for the defense is at all appropriate when we are asked if a religious proposition is true or not true.[2]

Could the gospels or the resurrection hold up in a court of law, as apologists like McDowell, Strobel, and Wallace have claimed? One is tempted to respond: so what if they could? Our legal system does not establish truth. Moreover, it's not even clear what charges could reasonably be leveled against a faith like Christianity, nor is it clear why subjecting that faith's beliefs to standards developed for judging human social behavior at this specific time and place in history should be appropriate, let alone impressive, in the event that everything stacks up well.


Citations
1. Lee Strobel, The Case for Christ (Zondervan, 1998), p. 264.
2. Walter Kaufmann, Critique of Religion and Philosophy (Princeton, 1990), p. 105-107.