
PBH310: Moral Psychology & Abortion

Introduction

Heart Apologetics & Moral Psychology

Apologetics is about effective communication and persuasion, fundamentally about rhetoric (in the classical, non-pejorative sense): How can we change what people think about an issue, and offer a defence, in this case, of the pro-life stance that is persuasive?

But effective apologetics is not purely cerebral. Aristotle teaches that effective communication requires ethos, pathos, and logos: build a bridge, touch the heart, then deliver the message. To be effective in apologetics, we need not just to win a debate but to truly reach the person - to change both the heart and the mind.

That's where heart apologetics comes in: Heart apologetics is about the emotional layer of apologetics, about responding to emotional blocks to accepting an argument. Fundamentally, heart apologetics is applied moral psychology. Moral psychology is about understanding how we think morally. Heart apologetics is being sensitive to moral psychology in our apologetics, about applying the insights from moral psychology to our conversations in order to be effective communicators to both head and heart.

Note that moral psychology is about how the mind actually works, not how it ought to work. Moral philosophy (or theology), and the field of normative ethics in particular, is about what is actually right or true. Here, in moral psychology, we are looking at functionalist and descriptive definitions - moral psychology looks at how the mind operates when thinking about morality (not at questions of objective right and wrong). This is not moral relativism - it's not a denial that there is an objective right and wrong, it's just not the focus of moral psychology. Moral psychology is about the “how” of moral thinking, not the truthfulness.

The Righteous Mind

For a guide through the field of moral psychology, I'm going to turn to Jonathan Haidt - whose name you'll see all over the Wikipedia article on Moral Psychology - and his landmark 2012 book The Righteous Mind: Why Good People Are Divided by Politics and Religion, which draws on 25 years of groundbreaking research. I think the book is full of wisdom for pro-life activists - a sort of textbook for a 300-level course in heart apologetics.

(I first encountered Jonathan Haidt's concepts from the book in late 2020, in the midst of the pandemic, the 2020 presidential election debates, and conspiracy thinking. My friend Katie bought me the book a year later, at the end of 2021, and I read it slowly through 2022, 2023, and early 2024. I knew from my first encounter that it held deep insights into heart apologetics, but as I worked my way through the book, I became more and more convinced that it would be helpful to run a seminar on this.)

I'm going to pull out the core insights from Haidt's work that apply to pro-life activism, and leave aside philosophical bones to pick and worldview differences. In the first half, we'll look at the Elephant and the Rider; in the second half, the six taste receptors and the hive switch.

Part 1: The Elephant and the Rider

Activate moral intuitions

(Show, don't tell, that moral intuitions come first.)

Haidt opens the first chapter by asking the reader to consider this story, and whether or not the people in it did anything morally wrong:

A family's dog was killed by a car in front of their house. They had heard that dog meat was delicious, so they cut up the dog's body and cooked it and ate it for dinner. Nobody saw them do this.

:?: Did the people in the story do anything morally wrong?

Or take this story:

A woman is cleaning out her closet and she finds her old American flag. She doesn't want the flag anymore, so she cuts it up into pieces and uses the rags to clean her bathroom. (Or if you're anti-American, insert the flag of your favourite country)

:?: Did the woman do anything wrong?

Haidt constructed many stories1) in his studies to produce a kind of moral dumbfounding (“I know it's wrong, but I can't explain why…”).

  • The well-educated people in Haidt's studies typically feel an initial flash of disgust, but then hesitate before saying the family had done anything morally wrong. After all, the dog was dead already, so they didn't hurt it, right?
  • But if you're not a liberal Westerner - like most people on the planet - you likely believe that some actions are morally wrong even if they don't hurt anyone, even if you can't explain why they are wrong
  • People would invent victims: “What if someone saw her do it?” One kid said, “Well, the flag might clog up the toilet and cause it to overflow.” Of the dog meat, someone said the family might get sick from eating it.
  • Moral reasoning is often the servant of moral emotions. Gut feelings can sometimes drive moral reasoning. Moral reasoning is sometimes a post hoc fabrication.
  • More on the particular moral feelings in part 2, but for now, let's think about the method here: intuitions first, reasoning second

The Elephant and the Rider

Through much of his research, Haidt found that: Intuitions come first, strategic reasoning second. He sums this up with his analogy of the elephant and the rider.

Stop at 2:02

Jonathan Haidt illustrates this with a personal example:

On February 3, 2007, shortly before lunch, I discovered that I was a chronic liar. I was at home, writing a review article on moral psychology, when my wife, Jayne, walked by my desk. In passing, she asked me not to leave dirty dishes on the counter where she prepared our baby's food. Her request was polite but its tone added a postscript: “As I have asked you a hundred times before.”

My mouth started moving before hers had stopped. Words came out. Those words linked themselves up to say something about the baby having woken up at the same time that our elderly dog barked to ask for a walk and I'm sorry but I just put my breakfast dishes down where I could. In my family, caring for a hungry baby and an incontinent dog is a surefire excuse, so I was acquitted.

Jayne left the room and I continued working. I was writing about the three basic principles of moral psychology. The first principle is intuitions come first, strategic reasoning second. … So there I was at my desk, writing about how people automatically fabricate justifications of their gut feelings, when suddenly I realized that I had just done the same thing with my wife. I disliked being criticized, and I had felt a flash of negativity by the time Jayne had gotten her third word (“Can you not…”). Even before I knew why she was criticizing me, I knew I disagreed with her (because intuitions come first). The instant I knew the content of the criticism (“ … leave dirty dishes on the …”), my inner lawyer went to work searching for an excuse (strategic reasoning second). It's true that I had eaten breakfast, given Max his first bottle, and let Andy out for his first walk, but these events had all happened at separate times. Only when my wife criticized me did I merge them into a composite image of a harried father with too few hands, and I created this fabrication by the time she had completed her one-sentence criticism (“… counter where I make baby food?”). I then lied so quickly and convincingly that my wife and I both believed me.

It's the moral flash I want you to recognize, your moral intuitions. This is your elephant.

Think about your own experience talking to other people, but more importantly, reflect on yourself and your own moral psychology. What happens when you hear these words: vaccines, evolution, climate change, transgenderism, abortion, condoms, Donald Trump, Justin Trudeau, COVID-19, immigration, guns? What I want you to feel is the elephant - that strong affective response, the “gut feeling.”

In study after study, Haidt finds that moral judgment is far from a purely cerebral affair in which we're consciously reasoning (the rider); in fact, “moral judgment is mostly done by the elephant.” For example, researchers hooked people up to fMRI scanners and presented them with trolley-problem-style moral dilemmas, and it was the emotional-processing part of the brain that immediately lit up and that corresponded with the moral judgments made - not the conscious, logical part of the brain.

Does this not match our experience talking to people about abortion? This is why we have heart apologetics - it's not always a purely rational conversation. Does this not match your experience in discussions on a wide variety of issues? There are so many proxy battles being fought in typical moral and political discussions… FIXME elaborate

Haidt puts the elephant and rider into more academic terms with the social intuitionist model.

The Social Intuitionist Model

This is an academic, evidence-based explanation of heart apologetics, through the insight of the elephant and the rider: (Walk through each of the arrows, one by one.)

:!: Haidt talks about the application of the social intuitionist model for moral persuasion:

The social intuitionist model offers an explanation of why moral and political arguments are so frustrating: _Moral reasons are the tail wagged by the intuitive dog._ A dog's tail wags to communicate. You can't make a dog happy by forcibly wagging its tail. And you can't change people's minds by utterly refuting their arguments. […] If you want to change people's minds, you've got to talk to their elephants. You've got to use links 3 and 4 of the social intuitionist model to elicit new intuitions, not new rationales.

Therefore, if you want to change someone's mind about a moral or political issue, talk to the elephant first. If you ask people to believe something that violates their intuitions, they will devote their efforts to finding an escape hatch - a reason to doubt your argument or conclusion. They will almost always succeed.

Haidt offers a lot of insight into the dynamic of moral debates with this model, and the elephant and the rider analogy, and how to be persuasive and avoid fuelling motivated reasoning:

When does the elephant listen to reason? The main way that we change our minds on moral issues is by interacting with other people. We are terrible at seeking evidence that challenges our own beliefs, but other people do us this favour, just as we are quite good at finding errors in other people's beliefs. When discussions are hostile, the odds of change are slight. The elephant leans away from the opponent, and the rider works frantically to rebut the opponent's charges.

But if there is affection, admiration, or a desire to please the other person, then the elephant leans *toward* that person and the rider tries to find the truth in the other person's arguments. The elephant may not often change its direction in response to objections from its *own* rider, but it is easily steered by the mere presence of friendly elephants [social persuasion] or by good arguments given to it by riders of those friendly elephants [reasoned persuasion].

There are even times when we change our minds on our own, with no help from other people. Sometimes we have conflicting intuitions about something, as many people do about abortion or other controversial issues. Depending on which victim, which argument, or which friend you are thinking about at a given moment, your judgment may flip back and forth as if you were looking at a Necker cube. (FIXME)

And finally, it is possible for people simply to reason their way to a moral conclusion that contradicts their initial intuitive judgment, although I believe this process is rare.

In studies on IQ and moral psychology, researchers found that IQ was by far the biggest predictor of how well people argued - but it predicted only the number of “my-side” arguments. Smart people make really good lawyers and press secretaries, but they are no better than others at finding reasons on the other side. “People invest in buttressing their own case rather than in exploring the entire issue more fully and evenhandedly.”

On motivated reasoning:

The social psychologist Tom Gilovich studies the cognitive mechanisms of strange beliefs. His simple formulation is that when we want to believe something, we ask ourselves, “Can I believe it?” Then, we search for supporting evidence, and if we find even a single piece of pseudo-evidence, we can stop thinking. We now have permission to believe. We have a justification, in case anyone asks.

In contrast, when we don't want to believe something, we ask ourselves, “Must I believe it?” Then we search for contrary evidence, and if we find a single reason to doubt the claim, we can dismiss it. You only need one key to unlock the handcuffs of must.

Now, Haidt says the elephant is sometimes open to reason: the rider serves the elephant not like a slave serving a master, but like a lawyer serving a client. The elephant is far more powerful, but it is not an absolute dictator.

  • In a lawyer/client relationship, the client is usually calling the shots - but it is sometimes possible for the lawyer to persuade the client to change their mind
  • In a healthy relationship - i.e., in cultivating moral virtue - I think the rider can gradually train the elephant, so that the two work in harmony and the elephant of a virtuous person receives direction from the rider. But the point is that this is not easy or immediate; it takes years of careful training because of the sheer power of the elephant. Our moral judgments are still elephant-responses first - they may just be the responses of a trained and well-formed elephant, if we work to train the elephant over the course of our lives

For pro-life activism:

  • If we can become aware of the elephant/rider dynamic in ourselves, reflecting on our own elephant is hugely helpful for developing empathy for other people's elephants
  • When we are speaking to other people, we need to be conscious of speaking to the elephant and not the rider if we want to be persuasive and reach the whole person

And this leads into the second half, on how we can apply further lessons from moral psychology on how to be persuasive to other people's elephants.

FIXMEs

FIXME I need some good stories and experiments that speak to conservatives, not liberals, e.g. flag burning or religious desecration (maybe in reverse order, starting with secular religion first…)

FIXME I need some key images and graphs: the moral matrices from Part III, maybe the graphs from Part II as a warm up to that…

The Six Taste Receptors

There's more to morality than harm and fairness.2)

FIXME combine parts II and III into a second half, where Part II is descriptive and Part III is prescriptive (in parts of the pro-life application)

From YourMorals.org:

  1. Care: Concerns regarding care and protecting individuals from harm
  2. Equality: Equality is a psychological motive for balanced reciprocity, equal treatment, equal say, and equal outcome
  3. Loyalty: Concerns regarding loyalty to others, self-sacrifice, and patriotism
  4. Authority: Concerns regarding respect to authority and rejection of insubordination
  5. Purity: Concerns regarding maintaining purity and preventing degradation
  6. Proportionality: Proportionality is a psychological motive for rewards and punishments to be proportionate to merit and deservingness and benefits to be calibrated to the amount of contribution

FIXME need better definitions and understanding, and… can this just be combined with the second section? The lessons are in Part III, the discovery is in Part II

The Hive Switch

Morality binds and blinds. We are 90 Percent Chimp and 10 Percent Bee.

FIXME points to bring out:

  • How political teams form, how people gravitate to the left or right
  • For pro-life activism
    • First, there's a lot of wisdom here in internal community-building - activate the hive switch
    • Second, there's a lot of wisdom here in being effective communicators
      • We need to understand how abortion advocates may be thinking, working off different moral foundations, and we need to have empathy and be able to speak their language
      • We also need to understand how abortion advocates may see us if they misunderstand our moral foundations, and be prepared to speak to the elephant in order to build connection and help lower their defences, etc
1)
p. 22, typically focused on disgust and disrespect
2)
This is directed at a liberal audience…