Our Foundering Fathers: On Moral Facts And The Moral Imagination
Yesterday the Times published an opinion piece by Professor Justin P. McBrayer entitled “Why Our Children Don’t Think There Are Moral Facts.” Like many similar pieces of modern philosophy, it begins with a debate between Professor McBrayer and his second-grade son. Professor McBrayer takes a Socratic approach:
Me: “I believe that George Washington was the first president. Is that a fact or an opinion?”
Him: “It’s a fact.”
Me: “But I believe it, and you said that what someone believes is an opinion.”
Him: “Yeah, but it’s true.”
Me: “So it’s both a fact and an opinion?”
The blank stare on his face said it all.
I think we can be fairly sure that the blank stare said the following (I’m paraphrasing here): “Dad, sometimes, you are a real pain in the ass.”
McBrayer is ultimately trying to discredit the Common Core curriculum, a new mainstay in our public schools, by ridiculing its curricular distinction between fact and opinion. This is a political attack, not a philosophical inquiry. He implies that the Common Core is an offshoot of “the presence of moral relativism in some academic circles,” summoning those straw men so beloved of the American Right: left-wingers with tenure, spouting fashionable nonsense from atop their ivory towers. (McBrayer’s source for information about this new epidemic of relativism? The New American, published by The John Birch Society.) He claims that the Common Core, and others of its kind, lead to “rampant cheating on college campuses.” He draws a somber distinction between the halls of learning, where anything goes, and the “world beyond grade school,” where “the stakes are greater.”
McBrayer’s argument is a train wreck. When the Common Core defines “beliefs” as “opinions,” it is using the word “belief” in the sense of “trust, faith, or confidence in someone or something.” On the other hand, when McBrayer says he “believes” that George Washington was America’s first President, he is indicating his “acceptance that a statement is true.” (I’m quoting the built-in definitions of “belief” supplied by Google Search. You don’t even need to follow a link.) Obviously, there is a difference between stating a fact and accepting that it is true, so we use two different words.
I’m not a relativist, yet I am comfortable calling value judgments and strongly held beliefs by their rightful names. I am part of a global community where murder is illegal and free speech is protected. I do not see how invoking “moral facts” does anything new to protect or honor the lives of the artists murdered in France. When McBrayer invokes them, he is exploiting them, not protecting others who may be in danger.
When McBrayer blames academic dishonesty on academic “doublethink,” he is blaming the students for a system-wide decline. Behavioral economist Dan Ariely studied cheating and dishonest behavior, and found that creativity was correlated with cheating:
In all five of Gino and Ariely’s experiments, creativity was clearly correlated with increased dishonesty. And though they are not yet fully able to demonstrate it, both Gino and Ariely feel like creativity increased dishonesty precisely because it allowed people to genuinely see credible rationalizations where others could not.
“If you are a creative person, all of a sudden you can go through the same amount of evidence and find many more links to justify the position that you want to have to start with.”
The study produced these results because all of the tests consisted of arbitrary assessments completed for a monetary reward. The more the participants cheated, the more money they received. In other words, the more they cheated, the better they did on the assessment. It is a huge mistake to separate the payout from the participant’s “actual results” on the assessment, since the whole exercise is a unified experience for the participant. Furthermore, the participant’s interests are not aligned with those of the researchers. The participant wants money; the researchers want data. The researchers lied to the participants, telling them that the self-scoring portion “accidentally” revealed correct answers (giving the participants the chance to cheat). The participants lied back, as well they might — and this is exactly what happens at college.
College students attend college in order to earn a degree. They get their degrees in order to earn more money. Cheating clearly helps them accomplish their goals. The institution, meanwhile, worries about cheating because it is trying to protect its own reputation. That is ultimately a financial concern too: if a school has a reputation for being too easy, its diplomas lose value, which makes it harder for the school to compete in the academic marketplace. The school doesn’t care about its existing students; they’re locked in. It cares about its future. Can we really blame students for feeling the same way?
When I was in college, I didn’t cheat. I had too much invested in my own “intelligence” and “creativity.” It would have been depressing to download an essay from the Internet, because it would have been tantamount to admitting that I had no ideas of my own, and probably never would. I wasn’t motivated by the “moral fact” that cheating was wrong. I was motivated by a different story — about who I was, and what talents I possessed — that trumped any possible rationalization I could devise. Creative people don’t like sacrificing opportunities to create. Framing assignments that way is the way to keep them honest.
The same is true of intelligence. If a person considers herself smart, and thinks she can get smarter, she will not cheat. The mobile game “Trivia Crack” is an excellent example. Almost every question in the game could be answered by cheating, and yet cheating appears to be rare. (I play regularly, against a dozen random opponents at a time, and have yet to encounter any serious abuses.) The point of the game is to test and improve your skills, so nobody cheats; without accurate results, players can’t know where they stand.
In this sense, there is very little difference between “self-interest” and “enlightened self-interest.” When I hold up my end of a social contract, I do so for two reasons: (a) because I can successfully imagine the consequences of my actions, and (b) because I imagine my actions either helping, or harming, someone with whom I share a common interest. Thus moral action requires just as much “rationalization” as immoral action. It requires “moral imagination,” a term popularized quite recently by mediation expert John Paul Lederach.
It is natural that a mediator would be at the forefront of contemporary ethical investigations, because mediators are trained to resolve conflicts between parties that have both competing and common interests. Without a common interest there is nothing to mediate. College students can show up to their classes for reasons that transcend personal gain, but the college has to embrace those same values too, and all the evidence suggests it does not. Colleges are “streamlining” course offerings, exploiting cheap labor, eliminating departments, and increasing class sizes. The students are not fooled.
Remember that recent meme about the dress? Somebody wore some dress to a wedding. The photo was then run through a couple of digital filters. One made the dress look blue/black, and the other made it look white/gold. The point of the meme was not to “figure out” what the dress “really” looked like. That was pretty obvious — it looked like neither, but was closer to white/gold. The point of the meme was to run around squawking excitedly about the picture: “Which one is it! Which one is it! Which do you think? Take a look!” The point was also to pretend that maybe, just maybe, you didn’t think the three pictures were all created from the same original, even though they clearly were.
The meme accomplishes two important things. First of all, it reinforces our sense that you can’t trust the media because, as Meghan Trainor puts it, “we know” that they are “working that Photoshop.” You can’t trust the pictures in a magazine, but you should be obsessed with them anyway, just like you are with the enigma of the dress. Second, the mundane photo and the florid filters combine to suggest that, hey, don’t worry, anybody with a smartphone can look equally transformed, and nobody will be able to tell what’s real and what isn’t.
This is the sophistic attitude of the college student who tells himself that if the administration is motivated by greed, then cheating is justified. He thinks he is just as empowered as the college to pursue his own agenda, but he is not, and college instructors catch plagiarists all the time.
More generally, this is the attitude of a person like McBrayer, who thinks that if big phenomena like the Common Core are “confusing” us about the difference between facts and opinions, then we are totally on our own, and can reach any conclusions we want. He says that’s “a hard thing to do,” but it is a very easy thing to do, if you don’t trouble yourself about the consequences of your actions. Values aren’t facts, which is why there are no “moral facts.” Values matter, but only indirectly, through the laborious, endless work of trying to understand how best to promote the general welfare. If you go around assuming that good intentions are sufficient, you really have no idea at all what your actions mean in a world so much larger than yourself. In the realm of ethics, replacing moral imagination with moral facts undermines tolerance and retards critical inquiry. McBrayer thinks the filter makes the dress:
Not all of our reasons for belief are epistemic in nature. Some of our reasons for belief are prudential in the sense that believing a certain thing advances our personal goals. When it comes to belief in God, the most famous formulation of a prudential reason for belief is Pascal’s Wager. And although Pascal’s Wager fails, its failure is instructive. Pascal’s Wager fails because it relies on unjustified assumptions about what happens in the afterlife to those who believe in God versus those who do not. A renewed wager can avoid this difficulty by relying solely on well-documented differences between those who believe in God versus those who do not. Social scientists have put together an impressive set of data that shows that theists do better in terms of happiness, health, longevity, interpersonal relationships, and charitable giving. Hence, most people have a strong reason to believe in God regardless of the evidence.
-Justin McBrayer, “The Wager Renewed: Believing In God Is Good For You”
There you have it, folks. God is a fact, regardless of the evidence. Behold the moral fact: the fact par excellence, vaster than opinion, and freed from all sordid burdens of proof!
Until next time, this is Kugelmass saying…