Sunday, May 14, 2017

Why I've Rethought Calling Out Fallacies

© Existential Comics

When I was a young atheist, having just lost my faith, I used to feel like fallacies were so obvious in religious views and religious arguments that it was a wonder more people didn't notice them. From there, it was an easy step to concluding that most believers must just be ignorant. And why not? A lot of them don't seem to know science as well as they think they do, so we shouldn't expect them to know logic, either.

This kind of condescension is not limited to young atheists, though, nor is it limited to non-professionals. I've seen some very intelligent, highly credentialed, and otherwise well-informed people hastily call out fallacies as if they're ordering off a drive-thru menu. I've sat in many a classroom where the first instinct of some students (undergraduate or graduate level) itching to dispute a claim is to label it a fallacy. And sometimes they're right.

I would never suggest that there are no fallacies behind religious arguments and religious beliefs. I wouldn't even say they're uncommon. But they certainly aren't omnipresent. Some can be more complicated to identify than we suppose, too.

All we skeptics and atheists need to do to realize this last point is think about the times we've engaged with theists who've wrongly accused us of committing a fallacy. How many times have you heard that you're appealing to authority simply by referencing an argument made by someone else? What about red herrings? Appeals to emotion? Non sequiturs? Especially online, these sorts of back-and-forth shouting matches become black holes of productivity.

Now, remember the Dunning-Kruger effect: the less skilled we are at something - spotting fallacies included - the more prone we are to overestimate our own performance. Why presume that religious believers are the only ones who ever abuse fallacies? There's a point here that I think is worth really digesting, and not just blithely assenting to. That urge to dismiss another person's claim comes from something in us that isn't religiously, politically, economically, culturally, or otherwise situated. We make mistakes. We let our emotions get the best of us. There are a million and one ridiculous memes mocking each and every side for what we're all guilty of at one time or another.

Personally, I've tried to take this to heart more recently by simply avoiding calling out fallacies whenever possible. There are a few reasons I think this is a good idea.

First, it's frequently the case that we can make our point in some other way - perhaps even a clearer way - without name-dropping fallacies. It can get messy and confusing to people when conversation dives into what defines a fallacy, whether this particular claim is an example of that fallacy, and what we're meant to take away from seeing the claim in that manner. People who are experienced in philosophy and logic may have a good understanding of all these issues, but the average person won't, and even the philosopher might wish to steer clear of a needlessly complex topic in normal conversation.

As for alternative ways of making one's point, I like to employ analogies, since I've found that they tend to help discussion regardless of the audience. They can be misused, too, of course, as anything can. Socratic-style questions can prompt reflection and open up dialogue. Then there is simply explaining your disagreement: instead of rattling off "straw man" at an opponent, you can restate your argument in other words. Sometimes this feels like the best option, since the real source of confusion may just be that your point needs a little more clarity.

Second, focusing on what label to give an argument can distract us from the stronger points it does make. Once we start to think that something merely looks like a fallacy, we may have a tougher time carefully examining its premises. This seems to happen all the time online, and, yes, it happens often in internet atheist circles. Sometimes if there's even a hint of concession to plausibility in a theistic argument, in swarm the accusations of fallacy after fallacy. Philosophers tend to attribute the best version of an argument to their opponent, because it's not only the charitable thing to do, but it also strengthens their own case when they venture to dispute it. Plus, if there is an easy, non-fallacious way to reconstruct an argument, spending all your time criticizing the fallacy and celebrating triumphantly will look very premature and, well, pretty silly.

Third, calling out a fallacy can frequently be a conversation stopper. I'm aware that this shouldn't be true, but that doesn't mean it isn't usually true. For most folks, labeling something a fallacy carries negative connotations, implying they don't know what they're talking about, they don't know how to think properly, and so on. Even if that's kind of true, people get defensive, particularly when it comes to their deeply held beliefs. If you come at them with hostility, blurting out weird-sounding terms and calling their claims irrational, that's likely to end fruitful and civil discussion.

It's not just about laypeople, either. You can often tell how serious a person is about engaging with you by what it is they resort to first. Many people like to use the label "nonsense" to dismiss a claim even where there's been little actual consideration, and name-dropping fallacies is a similarly dismissive tactic. Not to mention how it provides an opportunity to show off. When someone's first response to me makes it obvious that all they want is to slap a label on what's 'wrong' with my position, before sliding it off neatly to the side, I have to doubt it's worth my time to engage with them.

Again, though, please don't misunderstand this post. I still do point out fallacies, and I think it's important to do this in discussions on a lot of subjects, religion included. I've just given a few reasons why I think we need to be careful about how we do it. There is greater value in acquiring, having, and exhibiting virtues like charity, clarity, openness, and humility than there is in being right that a claim fits under the moniker of some fallacy. So I try to save the fallacy-naming for last, if it comes up at all, when it's hopefully become clear that I value the things I mention and desire a dialogue that is productive and provocative.

If the aim behind noting fallacies is not a shallow sort of victory dance, but a pursuit of the virtues - in ourselves as well as in others - then we will be concerned not just with addressing wrong thinking, but also with what approach we choose to adopt in realizing that end.

Tuesday, April 18, 2017

It's Not Really About Free Speech

There was a time - it feels like forever ago now - when I was one of many Americans happy to rant continuously about the dangers of political correctness. Long before it was popular to deride people on the left as "social justice warriors", or to mockingly refer to your opponent as "triggered" over even the slightest of disagreements, it was a familiar talking point to suggest that political correctness is the road to tyranny. It just felt like common sense that everyone should be free to speak their mind without fear of reprisal from the state. I still feel very strongly about this, although I have become increasingly skeptical of how these accusations are often thrown about.

One topical example is the debate over gender-neutral pronouns and the ways people choose to identify themselves. I have heard more than a few men (and occasionally women) complain about this societal shift away from black-and-white categories. Frequently, the line is: "You can call yourself whatever you like, but don't expect me to call you that." Taken charitably, the person making such a statement probably isn't aware of the hostility it expresses. In their mind, all should be well because you're free to be you, and they're free to be them.

Identity is something deeply personal, though. The questions of who we are and who I am are two of the biggest questions in the history of the West. We spend enormous chunks of our brief time on Earth trying to figure out what we want from life and where we factor into the whole grand picture of things. We have even created the term identity crisis to describe what is usually the most difficult and challenging form of this experience of self-examination. I rather like the synonym "soul-searching" because I feel it gives us a great idea of both the intensity and the elusiveness experienced during an identity crisis.

Now I can imagine someone suggesting that we should acknowledge this is a solo project. That word "self" is there for a reason, right? Well, this brings up the old Nature vs. Nurture debate. Some argue that we are born certain ways, while others argue that our environment determines who we are. I side with those who think the truth lies somewhere in between, but what I want to call attention to in all of this is the fact that our identities are not endowed to us whole-cloth by our genetics. We pick up some things from our parents, our other family, from our culture, and so on. Recognizing that identities are socially constructed does not mean abdicating our responsibility or denying that some traits are inherited.

It should be abundantly obvious that if someone tells you how they identify, you are not respecting them by deliberately going against their wishes. Not only are you communicating that what this other person wants isn't really important to you, but you are basically telling them no, I think you are this. And by "this" I don't mean anything like dishonest, bigoted, hateful, heartless, selfish, or the like. Those terms might suggest things about a person's behavior, but in telling someone their very identity, you're telling them who they are at their most intimate, deep down inside. It's understating things to say that you are in no real position to tell someone else such a thing, especially when you are barely an acquaintance.

"You can call yourself whatever you like, but don't expect me to call you that."

Stopping to think for a moment about the meaning behind this line reveals an expressed sense of identity. Don't tell me what to say, it communicates, because I have the freedom to say what I like. Important aspects of this person's identity, then, are things like their free will, their independence, and their individuality. It's no coincidence that these ideas are also heavily emphasized and cherished in a culture like that of the United States. In fact, they have become so deeply ingrained in us that we don't always remember or appreciate the sources that have consistently upheld them as the norm for centuries.

There's an interesting web of issues here that is well beyond the scope of this post, but it's worth some exposure. Most of us are seeking to find ourselves, to know who we are, and to be respected as individuals. If we take individuality seriously, we can come to see that what works for one of us isn't going to be everyone's cup of tea, and this can spark a rough agreement on generic freedom. You do you, I'll do me. However, the flexibility of identity keeps us on our toes. We encounter people who are not like us, sometimes radically so, at the same time that we question what separates us. Because we don't know the doubts, insecurities, or struggles the other person might have, while being intimately familiar with our own, they can feel threatening to us. They may appear to our minds as stronger examples of identity and individuality.

Of course, this changes as we get to know other people and learn who they really are. We find out they're not so different from us, and this makes them feel less threatening to us. The unknown always seems to tantalize us at the same time it instills fear in us. And when we feel threatened, sometimes we become defensive. In this context, what else could our reaction be except to attack the identity of the other person or fall back on reasserting our own prized identity? My free speech is at stake. Respect my rights. To some, it even becomes a conspiracy where the great boogeyman against individuality, totalitarianism, is looming on the horizon and threatening all our individual liberties.

Jordan Peterson, a professor at the University of Toronto, made headlines last year for declaring that he won't use "preferred pronouns" as part of his crusade against Bill C-16 in Canada. He wrote an article for the Toronto Sun purporting to explain his decision, and why we should all join him, that predictably throws in derogatory comments about "social justice warriors", political correctness, and Marxism, but is very light on either facts or compelling arguments. Peterson presents the bill as a measure to regulate what can be said on his campus, but many have pointed out that he is just wrong on this matter. Bill C-16 actually adds gender identity and expression to the prohibited grounds of discrimination under the Canadian Human Rights Act, and to the Criminal Code provisions on advocating genocide and publicly inciting hatred against an identifiable group.

Peterson is a psychology professor and not a legal expert, it bears mentioning, and his article comes across as highly alarmist. So what - other than his obvious defensive need to safeguard his own identity - could possibly be so unbelievably urgent? Likely it would be argued that the wording is vague and ambiguous enough that it poses some hypothetical risk, but it's a wonder then why Peterson expends all his energy blasting the 'radical left' instead of proposing language that might be better suited for the bill. It sure seems like if his goal is actually to defend free speech, he could do so perfectly fine without any of the ideologically-motivated liberal-bashing that adorns his writing like lights on a Christmas tree.

All this would look terribly bad for the critics if folks like Peterson were merely being socially condemned for choosing not to acknowledge another person's identity. It's tough to justify outrage on that premise alone, but if you can find a piece of legislation to whip yourself into a frenzy about, it no longer looks like you're just upset that people have stopped giving you undue deference. As it happens, though, people who whip themselves into a frenzy tend to have an exceedingly hard time concealing what it is that really bothers them.

I use the pronouns I use because everyone else does. That’s how language works. When suddenly put on the spot with regards to exactly why I do that, and not something else, I am rendered speechless. Justification of this sort has never been required previously. It’s convention, and it is not a simple manner to understand the evolution of or rationale for convention.

One thing I will say is that we all need to acknowledge that we're capable of making mistakes. There needs to be more understanding, regardless of how we affiliate. This part of Peterson's article gives the impression that he is tired of being criticized for how he speaks to people. I do understand this, and I get that there are those on the left who can be unfair in the way they interact with people.

But there is a side of the excerpt above that feels like something of an excuse. The push by some on the left to socially encourage gender-neutral language rests precisely on the recognition that this is how language works. The attempt is to change the language we use to be more inclusive, not through legal mandate, but through the social tools of praise and condemnation, among others. And yes, being criticized sucks. That's kind of the whole point here. If justification is being asked of Professor Peterson, it may not be for any initial use of certain pronouns, but for his repeated insistence on refusing to respect the wishes of his students after they have expressed how they want to be identified.

Without question, the immediate reply to this is probably that students today are too coddled as it is, or that the professor isn't there to be their friend or family member. But such a reply may be doing more harm than good. Peterson uses the analogy of a bank teller in his article, talking about how their interaction doesn't require her to reveal anything about her personal life to him. "To do her job," he writes, "she has to dress in a relatively innocuous manner, and present herself in [a] way that enables particularized, efficient and relatively shallow interactions."

I can't help making note of how amazingly apt this specific analogy is for raising a particular point. Paulo Freire and others have criticized what's called the banking model of education, which views students as passive receptors waiting to have the knowledge of the professor 'deposited' into them. For all sorts of reasons, this approach to education is regarded by many modern educators as outdated, too focused on memorization instead of real learning, and as having a less than helpful view of the role of students. Problem-posing education, which Freire advocated, gives students a more active role in their education, where the teacher facilitates discussion rather than dominates it, and where knowledge is presented in terms of questions for critical study instead of passing down a laundry list of facts through mere recitation and memorization. It certainly sounds from his analogy as if Peterson just wants more passive and less interactive students in his classes.

Peterson also works a job. Why shouldn't his job description be just as open to change as anyone else's? If a university wants to hire professors that are friendlier to students of different identities, what would be so wrong with that? It never ceases to baffle me how some conservative educators, who otherwise loudly defend the rights of employers and businesses, suddenly want special treatment when it comes to their own standing with their university. If your attitude towards a student disliking your use of a pronoun is essentially, "Tough, deal with it or transfer", then I'm not sure how you can become indignant when a school expects you to abide by certain codes of conduct, too.

I've used this example of Professor Peterson because I think it illustrates a lot of what I've seen and heard from those who take a firm stand against so-called identity politics. Free speech is very often a red herring, just as it is when Christians with a persecution complex lay claim to it in defense of their discrimination against homosexuals. It isn't really that the right is being suppressed; it's that someone wants to be able to speak their mind without facing any social reprisal. Like I said, I support free speech in the context of federal and state powers, and nowhere have I even implied that legislation or a university code of conduct ought to enforce what language we use. But it's not even clear that such enforcement happens anywhere close to as often as some on the right allege. The bigger takeaway is that we have to stop pretending that criticism and expressed disapproval from our peers (or students) is somehow a violation of our free speech.

Political correctness is frequently spoken of in this way, too. "You want to tell everyone what they can and can't say!" There is a world of difference between attempting to persuade others through reason and argument and attempting to force your will on them by legal means. This gets obscured so much in these types of debates that it makes one wonder why. It is as if some among us feel so threatened by and averse to critique that they see the two as one and the same.

We have to be careful what we buy into. Some like to throw out extreme analogies, such as asking whether we'd appreciate someone demanding we call them a word or name we really wouldn't want to use. Of course, that isn't at all like the idea behind respecting the way people honestly want to identify. And when I say that I wouldn't disrespect a friend or family member by calling them someone else's name, the analogy doesn't need to be a one-to-one correspondence for the point to be clear. We have no good reason for refusing to defer to another person on how they want to self-identify.

Wednesday, March 8, 2017

Emma Goldman on Atheism

Emma Goldman was a Russian Jewish immigrant to the United States, a passionate anarchist and atheist, as well as an advocate for the rights of women. While she was critical of some of the aims of first-wave feminism, she was also a vocal defender of contraception, free love, and homosexuality. Goldman was jailed several times for handing out information on birth control. "I demand the independence of woman," she wrote in 1897*, "her right to support herself; to live for herself; to love whomever she pleases, or as many as she pleases. I demand freedom for both sexes, freedom of action, freedom in love and freedom in motherhood."

In her eyes, anarchism was as much about liberating the individual from religion as it was about liberating her from the control of the state. Capitalism leads to exploitation and suffering, she believed, rather than to the social order and economic opportunities she found in her vision of anarchism. In 1923, she would publish My Disillusionment in Russia in reaction to her firsthand experiences with the aftermath of the Russian Revolution of 1917.

Goldman's atheism, like that of Nietzsche or Marx, was focused on this mortal life here on Earth, and tended to view religion as an impediment to human development. Her short essay, The Philosophy of Atheism, published in 1916, shares many of her thoughts on this subject. "The philosophy of Atheism expresses the expansion and growth of the human mind," according to Goldman. "The philosophy of theism, if we can call it philosophy, is static and fixed... Atheism has its root in the earth, in this life; its aim is the emancipation of the human race from all God-heads, be they Judaic, Christian, Mohammedan, Buddhistic, Brahministic, or what not... Man must break his fetters which have chained him to the gates of heaven and hell, so that he can begin to fashion out of his reawakened and illumined consciousness a new world upon earth."

The concept of God has changed substantially over time, she notes, but it originated from our fear of the unknown. As we discover ourselves and learn to shape our own destinies, theism becomes increasingly superfluous, the gods being transformed into something ever more indefinite, obscure, and nebulous. Goldman prematurely celebrates the decline of religion and the rise of atheism, though some of her observations here may sound all too familiar to us today, over a century later.
More and more, the various concepts "of the only true God, the only pure spirit, the only true religion" are tolerantly glossed over in the frantic effort to establish a common ground to rescue the modern mass from the "pernicious" influence of atheistic ideas. It is characteristic of theistic "tolerance" that no one really cares what the people believe in, just so they believe or pretend to believe.
Noting the injustice in the world, and the inaction of a supposedly loving deity, Goldman says that humankind alone can undertake the task of achieving justice on the earth. However, under the promises of an everlasting heaven and an omnipotent god, the human being has become "a will-less creature". "Again and again," she writes, "the light of reason has dispelled the theistic nightmare, but poverty, misery and fear have recreated the phantoms - though whether old or new, whatever their external form, they differed little in their essence." The triumph of atheism is its resistance to the paralyzing effects of religion.

For Goldman, atheism is not only the negation of gods, but the affirmation of man and woman, and in this it is the affirmation of life.

 *Cited in Alice Wexler, Emma Goldman: An Intimate Life, p. 94.

Tuesday, February 7, 2017

The Trump Administration and Plato's Prophetic Critique of Democracy

In the run-up to the presidential election last year, it felt like it became commonplace at some point to hear Trump referred to as a demagogue. Arguably, there is no better word for it, given the way he appealed to fear and patriotic bravado to mobilize his supporters while neglecting to give the slightest semblance of an argument in defense of the vast majority of his views. For months prior to Election Day, you could find references to Trump's demagoguery in Time, The Atlantic, The Washington Post, and elsewhere. Now that Kellyanne Conway has introduced the much criticized concept of "alternative facts", there hardly seems room left to hide from the accusation. Particularly if, as we suspect, these are facts in the same sense that the Bowling Green massacre is a fact.

What's interesting is that Plato kind of warned us this would happen. In The Republic, he critiques the direct democracy that existed in his time, and although this form of government differs in some ways from what we have in America today (or what we had in the past), we may nevertheless find a number of the criticisms are still quite relevant. Democracy understood as the rule of free people governing themselves in their own interests leads, in Plato's view, to demagogues and tyrants. In their emphasis on freedom and equality, democracies face the problem of corruptibility, and can fall either into anarchy or despotism.

Of course, Plato doesn't hold much regard for equality, probably in part because his favored form of society is class-based. On this we might rightly fault him, but we need not follow his lead into abandoning ideals of economic or political equality, for instance. We can instead understand his criticisms in the manner Robert Kane articulates them in his book Through the Moral Maze.

1. Democracies encourage mediocre leadership

Elected officials have to keep courting the favor of the people in order to maintain their place in positions of power. This tends to be a popularity contest more than any kind of election based on qualifications, experience, or intelligence. Pandering thus becomes a commonality as those in power try to stay in power the best way they know how: not by applying their own expertise or by doing what they think is right, but by appealing to the desires (and worries) of the masses. Unfortunately, one thing this can often mean is that our elected officials may be as ordinary and unexceptional as those that put them into office.

Considering how much attention has been devoted to the incredible lack of qualifications and experience in the Trump cabinet nominations, this critique looks to be pretty dead on. However, we can find further support in the very reasons Trump voters gave for why they voted as they did. "I feel like I know where I stand with Trump," says Rachel, in an article for The Guardian. "Trump is exactly what you get," Paul, another Trump voter, states. Most telling, perhaps, is Arlene's comment, who says that "Donald Trump might not have political experience but I truly believe he has the American people’s interest at heart."

Trump's win has also heralded what some consider to be the return of populism. Whether or not this is an entirely accurate characterization, it does speak to Plato's criticism. On the one hand, we may want our representatives to be "people like us" because those are generally the people we trust the most to make decisions on our behalf, but on the other hand, it is very likely true that most of us are not people especially capable of running branches, institutions, and systems of government. If we elect "people like us", we could well be electing people just as uninformed as us.

2. Democracies tend to focus on the short term rather than the long term

Because of how our leaders are elected, pandering to the wishes of the electorate typically means planning for the present and not for the future. Kane astutely notes that this problem is "at the expense of the long-term needs of society." It isn't always the case that taking no thought for the morrow is harmless. Sometimes failing to plan for the future has significant and long-lasting consequences. Along with this comes the all-too-familiar habit of giving the people what they want now, and passing the financial burden on to future generations.

What can't we say about this criticism? Let's just start with the foreboding moves the Trump administration has been making with respect to the EPA and climate change. Altering the EPA's climate website, suspending contracts, and teasing a case-by-case review of climate science work are fairly concerning signs of denial. Betsy DeVos' nomination as Secretary of Education raises additional worries, not just about science education specifically, but also about the seriousness with which the new administration takes the issue of providing an affordable, effective education to future generations.

Then we have the Wall as an example of a potential financial burden for American taxpayers, now that Mexico has predictably refused, once and for all, to be bullied into paying for it. The immigration ban has been estimated to cost U.S. colleges $700 million. Fully repealing Obamacare is said to come at a price of a whopping $350 billion, not to mention all the jobs that stand to be lost as well.

Getting people to understand the value of holding out for something better is notoriously difficult, particularly in a nation that prides itself on individualism, the myth of the self-made man, and instant gratification. By no means is Trump's administration unique in this, but we may see things as more pronounced here than they have been in many other administrations.

3. Image politics comes to dominate the electorate

The rulers in Plato's ideal society function in part to safeguard the values at the heart of society. Such an important task could not be trusted to the average person, Plato thought, but had to be something specially reserved for those who could be trained and educated in the proper ways. Democracy, he argued, usually devolves into politics focusing on appearances rather than on the things that really matter. How a candidate looks and sounds comes to be more important to people than what they say.

Again, we need not look far to see this criticism alive and well within the present administration. Trump has said some simply awful things and behaved in reprehensible ways toward women, yet his supporters haven't seemed all that perturbed by any of it. The Guardian article referred to above has the opinions of several Trump voters who state how impressive Trump's confidence, forthrightness, and business savvy are to them. We could indeed view this as an image issue, oddly winning out over even the moral concerns of some Americans.

It might be an understatement to say political debate has become superficial in the era of Trump. From remarks about penises during the campaign to the constant allegations of "fake news" thrown at legitimate news outlets (and at almost any reporting the new administration merely seems to dislike), there is little question that this election and its aftermath have taken political discourse to another level. It may not be entirely new or unprecedented, but behavior that would previously have been roundly criticized as grossly immature now seems to survive and escape such immediate hostility in today's political environment.

A society that focuses on images instead of issues is easy prey for manipulative personalities.

4. Democracies are prone to factionalism of special interests

Lobbyists, special interest groups, and money politics have been problems in the U.S. for a good while now, and they are certainly not limited to the current administration. But Trump's accusations against Hillary Clinton, as being controlled by special interests, can strike one as an instance of telling your neighbor to remove a splinter from her eye while you have a plank lodged in your own. Trump once said he'd disavow all Super PACs, shortly before he reversed his decision as soon as his party nomination was in the bag. His plan to "drain the swamp" apparently did not extend to keeping lobbyists out of his transition, either.

Plato believed that democracies lead to factionalism, as certain groups try to influence leaders for their own private interests. Because of this, democracies are also in danger of becoming a tyranny of the majority. As Plato saw it, the worst forms of government participate in a sort of natural evolution: oligarchy gives rise to democracy, and democracy gives rise to tyranny. Yet the tyrant won't be campaigning on a platform of dictatorship. Instead, he'll present himself as the champion of the people. It is "the insatiable desire" for freedom, Socrates says, "and the neglect of other things [that] introduces the change in democracy, which occasions a demand for tyranny."

Tyranny of the majority was a significant concern of some of the Founding Fathers, such as James Madison and Alexander Hamilton. In his classic text Democracy in America, first published in 1835, Alexis de Tocqueville expressed this same worry:
When a man or a party suffers from an injustice in the United States, to whom do you want them to appeal? To public opinion? That is what forms the majority. To the legislative body? It represents the majority and blindly obeys it. To the executive power? It is named by the majority and serves it as a passive instrument. To the police? The police are nothing other than the majority under arms. To the jury? The jury is the majority vested with the right to deliver judgments.
While some might argue that Obama, Big Business, or another culprit is behind the true tyranny of the majority, what matters most here is the role and rhetoric of populism. I think we have seen this manifested in Trump's campaign and presidency in ways that simply dwarf most comparable examples. This promise of giving power back to the people is what Trump ran on and what his supporters continue to call for today, above and beyond many other issues and concerns.

5. A loss of shared values

Democracies that emphasize freedom and liberty make the individual the focus of value. This produces a gradual dissolution of common values, since people think more about themselves and their desire to do their own thing than they think of others. So responsibility to others and the common good are sacrificed to the right to do as you please, according to Plato. This in turn creates distrust of authority, social disorder, and rising crime rates. Another consequence, Kane writes, is a large generation gap, because the young do not necessarily share their parents' values, and want the same right to do their own thing that everyone else has.

It's worth starting with that last point, because one interesting statistic that emerged from the 2016 election was the factor of age difference. "Young adults preferred Clinton over Trump by a wide 55%-37% margin," the Pew Research Center notes, while "Older voters (ages 65 and older) preferred Trump over Clinton 53%-45%." Distrust of authority and social disorder might be more than familiar to us, too, considering how both featured in the presidential debates. The crime rate is a somewhat thornier issue, though: there is evidence of an increase in violent crime from 2014 to 2015, for example, but the rate was still lower than it was in 2011 or 2006.

What ought to stand out most, however, is the theme of division. Trump supporters like those quoted in The Guardian article above have voiced their opinion that our nation is fractured, hurting, and headed in the wrong direction. After the election, it seems that many of those who voted against Trump feel much the same way. Even before the presidential race, though, social conflict and domestic tensions were hardly absent from Americans' worries, as subjects such as immigration and Black Lives Matter would highlight.

It almost seems undeniable that there has indeed been a loss of shared values. How we should respond to this problem is what remains a matter of intense debate.

Is this the end of democracy?

A.N. Whitehead famously described the Western philosophical tradition as a series of footnotes to Plato. This may appear as only the mildest of exaggerations to those familiar with Anglo-European philosophy. Plato's take on democracy, as we've seen, levies some fairly powerful criticisms that we are still wrestling with over 2,300 years later.

Does this mean that democracy is hopeless? I have heard many declare its death in the wake of the election, but the meaning of this death, and how we move forward, are challenging questions to answer. I don't pretend to have the solution, and it's pretty clear that Plato didn't have it, either. We might be reminded of Winston Churchill's quip that democracy is the worst form of government except for all the others. One promising start, I think, would be a revival of the sort of understanding of democracy held by someone like John Dewey, with its vital emphasis on the social nature of democracy as consisting of shared common interests and cooperative interaction among a plurality of groups.

On the other hand, this revival may be too unlikely to be a practical hope. The next four, eight, or however many years will tell. My point in this post has not been to spell out the doom of democracy. Plato's criticisms speak to the flaws of democracy, and I believe we are seeing these loud and clear now in 2017, although they have actually been present for a long time. This doesn't have to mean that democratic governments are inevitable failures, but it should cause us to recognize the problems that do exist, and it should motivate us to seek out solutions.

Some Americans probably think this is what they've done in electing the man who now sits in the White House. But we have seen how the Trump administration, in just its first month in power, has aligned itself more clearly with the flawed side of democracy, the side that, as Plato explained, is much closer to descending into tyranny. It's hard to see how the tendencies discussed here could set us back on the right track, and there are strong arguments, made by one of the greatest philosophers to have lived, that we are heading for serious trouble.

This may not be the end of democracy full stop, but it just might be the end of democracy as we know it.