Good Reason

It's okay to be wrong. It's not okay to stay wrong.

Category: cognition

Do we have free will?

I recently interviewed neuroscience professor Thalia Wheatley for an episode of Talk the Talk, and at the tail end of the interview I threw her a curveball: I asked her about free will. I’ve been trying to understand this for a long time. Do we choose to do something, or does our brain just… do it? And if it’s the latter, what does that mean?

Here’s that part of our conversation.

[Audio: the free will segment of the interview]

If you want some prep, here’s a video of Thalia with actor Alan Alda.

The rest of our interview will be appearing in an episode this August. Watch for it!

Pareidolia of the Daylia: God moves in eggplanty ways

It’s not just Christians and Muslims who imagine religious images in food. Now Hindus are getting in on the act.

Believers are flocking to a Leicestershire temple to pray twice a day to a vegetable that looks like a Hindu god.

The divine aubergine was discovered among a box from a wholesalers and has been worshipped by more than 80 people so far.

Hindus: Behold your god!

I’m sure that many Hindus would think this is silly, just as many Christians think that Toast Jesus is silly. But according to the article, about 80 people have come to the temple to pray. For every one of those people, their religion has short-circuited the part of their brain that helps them realise that it’s stupid to venerate an eggplant. And that’s a terrible thing.

The danger is that, by worshipping an eggplant, they might accidentally be paying homage to the Eggplant God, and that’d really piss Ganesha off. Tramplings would ensue. You don’t want to make Ganesha mad — he never forgets.

Talk the Talk: Language and the Pirahã

Doing the podcast is my dream job. Not only do I get to talk about language every week, but I also get to talk about language with some of my linguistic idols. Dr Daniel Everett is definitely on the list. I’ve talked about his work with the Pirahã people of the Amazon many times in my classes, but here I got to ask him about what it all means.

Now everyone on my interview list can move up one. What linguistic types should I go after next?

First episode: Here
Second episode: Here
Subscribe via iTunes: Here
Show notes: Here

Show tunes:

‘Sunchemical’ by O Yuki Conjugate
from the album Equator

‘Crawling by Numbers’ by Lali Puna
from the album Faking the Books

‘Now It’s On’ by Grandaddy
from the album Sumday

Is it right to try to influence others in the Church?

Steve Bloor is an ex-Mormon, an ex-bishop, and a much nicer guy than I am. He’s so nice, he actually cares about other people’s feelings. I wish I could be like that, but it would require a soul transplant, and since I don’t have a soul, there’s no place to put a new one. Ah, me.

His latest blog post has got me thinking. It’s titled: Is it right to try to influence others in the Church?

My first impulse is to say, “Of course. Why wouldn’t you? Next!” But it’s more complex than that, and it gets into the reasons why I write things here and there, so let me unpack it.

I like where Steve’s coming from. He knows that people are harmed by their involvement in the church, and that a rational worldview is the most helpful basis for living one’s life. I agree — piling a layer of supernatural fictional goo on top isn’t going to help anyone think better or be a better person, at least not in any way worth mentioning. Maybe some people can be frightened into acting good for a while, but there are going to be some costs involved (and the church profits in the end), so I don’t see any real benefit.

Something Steve doesn’t mention is that as ex-Mormons, we sometimes try very hard not to ‘appear evil’. I certainly did; I didn’t want to confirm every rotten stereotype I’d been fed about people who leave the church. And one of the ‘evil’ things ex-Mormons do is talk about their experience, so sometimes people hesitate to do that. Seriously, I’ve run into a lot of people in the initial stages of their deconversion who ‘don’t wish to harm others’ with their knowledge. Isn’t that strange? Yes, deconversion is disruptive and difficult, but it’s not Jedi death rays.

So when Steve asks:

Should I try to raise awareness of the potential problems with their belief system?

Does my attempt at raising awareness actually achieve anything? Or does it create a feeling of being threatened & create fear in my TBM [true believing Mormon] friends and family? In the end is my attempt futile or counter-productive?

the answer is: Don’t worry! The ex-Mormon power has a unique attribute: it only works on people who are ready for it. It leaves True Believers entirely unscathed. There is no way you can ‘harm’ anyone’s faith. True Believers have a whole range of psychological defence mechanisms to protect their faith, including communal reinforcement, confirmation bias, and sudden-onset deafness. Don’t worry about ‘harming their faith’; because it isn’t built on reason, it’s very robust against rational attacks. Unless they take science and reason seriously, as good things — then it’s another story.

The other part of Steve’s question is a really good one:

Does my approach to this increase my own happiness & wellbeing, or does it cause me angst & emotional fatigue?

It’s a good point; if it’s pissing you off, you’re doing it wrong. Remember, we’re the ones who escaped! We’re not bound down by artificial guilt or arbitrary restrictions. We have permission to be happy. If the discussions are causing you angst, maybe you’re taking the need to convert others too seriously. (It’s a common holdover from the evangelical mindset.) Accept that some of your friends and family will never deconvert, and will stay in the Church their entire lives. Relax; they won’t go to hell or one of the lower kingdoms. Eternal punishment for the crime of misbelief is no longer one of your beliefs. Realise that long-term social and religious patterns are trending in our direction, even if the people you care about aren’t part of it. Once you’re free of the idea that you can deconvert people, it’s very liberating. You can accept your TBM friends, even if they can’t fully accept you.

So if we can’t deconvert them, and they may not even hear us, why speak out at all? Why be public? We do it because there are people around us who are deconverting, and if we’re visible, they’ll come to us for help. I have had several people from my mission contact me. They’d begun thinking thoughts that couldn’t be unthunk. So I’ve been able to talk to them and encourage them, and it’s been a great experience.

It’s perfect, really — those who are ready find us, and those who aren’t ignore us. Influence others? They influence themselves. All we have to do is stay visible.

Denser is slower

A linguistic tidbit from the ‘Obvious in Retrospect’ file:

A recent study of the speech information rate of seven languages concludes that there is considerable variation in the speed at which languages are spoken, but much less variation in how efficiently languages communicate the same information.

Dr. Pellegrino outlined the major findings of the team’s research: “Languages do need more or less time to tell the same story – for instance in our study, the texts spoken in English are much shorter than their Japanese counterparts. Despite those variations, there is a tendency to regulate the information rate, as shown by a strong negative correlation between the syllabic rate and the information density.” In other words, languages that are spoken faster (i.e., that have a higher syllabic rate) tend to pack less information into each individual syllable (i.e. have a lower information density).

In other other words, the more information packed into each syllable, the slower those syllables have to be delivered. Across languages, those two factors balance each other.

It makes sense because human brains have a cognitive limit, and they’ll only put up with so much throughput. Still, nice to see this result in black and white.
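
To make the trade-off concrete, here’s a toy sketch in Python. The numbers are made up for illustration (they’re not the study’s measurements); the point is just that a high syllable rate paired with a low per-syllable information density can multiply out to roughly the same overall information rate as a slower, denser language.

```python
# Toy illustration only: hypothetical figures, not data from Pellegrino's study.
# Information rate = syllables per second x information per syllable.
# If the two quantities trade off against each other, the product stays roughly flat.

languages = {
    # (syllables per second, bits of information per syllable) - invented values
    "fast, low-density language":  (7.8, 0.55),
    "medium language":             (6.2, 0.70),
    "slow, high-density language": (5.1, 0.85),
}

for name, (syllable_rate, info_density) in languages.items():
    info_rate = syllable_rate * info_density  # bits per second, roughly constant
    print(f"{name}: {info_rate:.2f} bits/sec (hypothetical)")
```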

Inappropriate brand identification

There’s enjoyment and there’s investment.

Let’s take the band Gomez as an example. I noticed the other day that I have a lot of Gomez albums, and I like them, but I wouldn’t call myself a Gomez fan. There’s some level at which I haven’t identified with them.

On the other hand, when I first heard the Leisure Society or Seabear, it was more than just liking their stuff. I connected with them in some way that made me say “I can get behind this.” I reserved a tiny part of myself for them, and made them a part of my social identity (because listening to music is as much about social alignment as musical enjoyment).

But defining yourself in terms of musical taste might not be such a great idea. What happens if ‘your special band’ releases a disappointing second album (as the Leisure Society and Seabear both did)? Will you be able to update, or will that be too threatening to your self-image? Maybe you’ll just never listen to the new stuff, and keep thinking they’re great.

What I’m talking about is the perils of Fanboi Syndrome, and it’s the topic of this study (thanks to Kuri). Except this is about brands, not bands.

You may think you’re defending your favorite platform because it’s just that good. But, according to a recently published study out of the University of Illinois, you may instead be defending yourself because you view criticisms of your favorite brand as a threat to your self image. The study, which will be published in the next issue of the Journal of Consumer Psychology, examines the strength of consumer-brand relationships, concluding that those who have more knowledge of and experience with a brand are more personally impacted by incidents of brand “failure.”

The researchers performed two experiments, one on a group of 30 women and another on 170 undergraduate students, in order to see whether the subjects’ self esteem was tied to the general ratings of various brands. Those who had high self-brand connections (SBC)—that is, those who follow, research, or simply like a certain brand—were the ones whose self esteem suffered the most when their brands didn’t do well or were criticized. Those with low SBC remained virtually unaffected on a personal level.

Boy, do I hear this. I used to be an Apple fanboi. Well, I still kind of am, partly because I think their stuff is good, and partly because of the thousands of happy hours I’ve spent computing on the MacOS. But a little tiny part of me is heavily invested in Apple, to the extent that I have to try not to feel personally affronted if AppleHaterz bag it, and I’m likely to write off their opinion.

I used to be worse. You should have seen me in the 90s, when the Mac was an endangered species. But brand identification is something of a danger. It’s one more kind of bias that keeps us from seeing clearly. Companies shouldn’t have that kind of hold on us.

Young children are especially trusting of things they’re told

From the Journal of Obvious Results: Little kids will believe anything you tell them.

In one experiment, an adult showed children a red and a yellow cup, then hid a sticker under the red one. With some children, she claimed (incorrectly) that the sticker was under the yellow cup; with other children, she placed an arrow on the yellow cup without saying anything. The children were given the chance to search under one of the cups and allowed to keep the sticker if they found it. This game was repeated eight times (with pairs of differently colored cups).

The children who saw the adult put the arrow on the incorrect cup quickly figured out that they shouldn’t believe her. But the kids who heard the adult say the sticker was under a particular cup continued to take her word for where it was. Of those 16 children, nine never once found the sticker. Even when the adult had already misled them seven times in a row, on the eighth chance, they still looked under the cup where she said the sticker was. (At the end of the study, the children were given all the stickers whether or not they’d found any of them.)

“Children have developed a specific bias to believe what they’re told,” says Jaswal. “It’s sort of a short cut to keep them from having to evaluate what people say. It’s useful because most of the time parents and caregivers tell children things that they believe to be true.”

Useful, yes, but then some of us have religious parents — good people who love us, no question — who give us hours and hours of religious indoctrination, filling our heads with appalling rubbish. It short-circuits our logic and makes us believe things are true if the group says they’re true. Our thinking skills now subverted, we’re sitting ducks for all kinds of crazy ideas. Or as Dawkins said in ‘The God Delusion’:

Natural selection builds child brains with a tendency to believe whatever their parents and tribal elders tell them. Such trusting obedience is valuable for survival: the analogue of steering by the moon for a moth. But the flip side of trusting obedience is slavish gullibility. The inevitable by-product is vulnerability to infection by mind viruses.

But eventually the influence of parents diminishes. Then you believe it, not because your parents keep telling you it’s true, but because you keep telling yourself it’s true. Your own mind takes over the work that your parents began. At that point, it’s very difficult to escape.

Jesus (that jerk) knew what he was talking about when he said you’d need to be like a little child to be a believer. Undeveloped reasoning skills, and complete reliance on authority figures. Yep, that sums it up.

Even now, when I think of the time I spent on superstition, I feel quite cranky. How much farther ahead I’d be now if I’d been taught (or taught myself) to reason well.

And then every once in a while, I’ll see someone who says, “Even as a little kid, I knew religion was crap.” I look on these people with a kind of awe and envy. It sure wasn’t me.

The role of disgust in opinion-forming

How do we go about forming opinions? As for me, when a moral or political decision comes up, I rationally sit down, weigh up the pros and cons of the options, and take the view that I think is best based on the evidence.

No, just kidding. I probably do it the other way around like everyone else. Form a snap opinion, and then hunt around for evidence to justify it. I don’t like the idea that this is how we operate, but it’s probably true all the same.

My first experience with political opinion-forming was the US election in 1972. My entire Republican family was voting for Nixon, but I thought I’d vote for McGovern. I didn’t even know what voting was. I’d seen the primaries, and I thought that when you voted, you had to go and stand next to your candidate so they could count you. There I imagined my family, standing with Nixon (with his fingers in ‘V for Victory’ pose), while on the other side of the room it was just George and five-year-old me. Why did I take the view I did? Why did they? I don’t know, but it is funny that no one in my family has changed voting patterns since then.

Sometimes my opinions follow on from prior opinions, or from values that I hold, but where did those come from? I can’t say it’s anything more conscious than my ‘voting’ for McGovern all those years ago. I’ve often suspected that my opinions are based on some tendency, a leaning one way or the other that tips other decisions. But what tendency? Looking out for the in-group versus sympathy for the out-group? Fearful or fearless? Authoritarian or democratic? Or something more primal?

New research highlights the role of simple ordinary disgust.

This is the argument that some behavioral scientists have begun to make: That a significant slice of morality can be explained by our innate feelings of disgust. A growing number of provocative and clever studies appear to show that disgust has the power to shape our moral judgments. Research has shown that people who are more easily disgusted by bugs are more likely to see gay marriage and abortion as wrong. Putting people in a foul-smelling room makes them stricter judges of a controversial film or of a person who doesn’t return a lost wallet. Washing their hands makes people feel less guilty about their own moral transgressions, and hypnotically priming them to feel disgust reliably induces them to see wrongdoing in utterly innocuous stories.

Psychologists like [Jonathan] Haidt are leading a wave of research into the so-called moral emotions — not just disgust, but others like anger and compassion — and the role those feelings play in how we form moral codes and apply them in our daily lives. A few, like Haidt, go so far as to claim that all the world’s moral systems can best be characterized not by what their adherents believe, but what emotions they rely on.

Primal emotions as atoms in the periodic table of our moral chemistry. Maybe these simple reactions are too simple to explain the complex range of opinions that grow out of them, but if opinion-forming goes back to something simpler, then disgust seems like a good candidate. I’ll be looking forward to more of this research.

Illusion of the Year 2010

How do you get a ball to roll uphill?

This fascinating device won first prize in the Best Illusion of the Year contest, held by the Neural Correlate Society. The other illusions are great too.

I love optical illusions. They make me say, “Wow, I must have had some really bad assumptions back there.” We do the best we can with our pretty-good brains.

Deconversion stories: Why so long?

Why did it take so long for me to leave religion?

I keep coming back to this question, in fact kicking myself over it — all that time and energy gone. Then I cut myself some slack. I remember that it’s hard to get out of a system you’re born into, and one that you’ve believed and invested so much in.

Still, all that aside, why did it take me so long to recognise the now-obvious absurdities and contradictions in Mormon doctrine — actually, in all of theism? And Mormon doctrine is full of absurdities. Translating out of a hat? Pouring oil on someone to heal them from diseases? God living on a planet near the star Kolob? Having to memorise and repeat words and signs to get into heaven? Ridiculous in retrospect. Why did it seem so plausible at the time?

Of course, we can turn to the standard set of devices that humans use to believe the implausible: communal reinforcement, childhood indoctrination, confirmation bias. But recently I realised a little something extra that probably helped keep my belief afloat: It’s very difficult to critique a religion effectively when you still accept some religious ideas. Meeting on Saturday might seem arbitrary, but really, meeting on Sunday is equally so. Believing in chakras is not so absurd when you believe in spirits. Why would it be a problem for a ghost to tell Nephi to kill Laban, when David killed Goliath? And so on. Religious beliefs don’t seem absurd in contrast with other religious beliefs. What we’re able to question depends on what we already accept as true.

In other words, the only solid ground from which to criticise religion is atheism. But how likely is someone to question the whole kit-n-kaboodle all at once? What’s more likely to happen is that we’ll try to preserve as much of the original belief as we can. Much less painful that way. But when you do that, you’re unlikely to question that one little assumption that allows the whole structure to stand: that there’s a god who can do magical things when it wants to. If you accept that one idea, then you can magic your way around any contradiction.

Once you step outside of that bubble and question the idea of a god, then all the absurdities become transparently obvious. But that’s an advanced move, and probably one that people only try when all other options are exhausted. No wonder it can take so long.
