Thursday, 29 May 2014

Arguing with stupid people

Changing your mind in response to an overwhelming argument is as fine a quality as it is a rare one. I was reminded of this fact yesterday in a discussion with the current Mrs Ant, and again today when a friend posted a Mark Twain quote I like, which I thought might make an interesting blog post.

"Never argue with stupid people, they will drag you down to their level and then beat you with experience."

Mental models

We all have a mental model of the world in our brain: our version of the world, our place in it (the centre), and our friends and acquaintances as satellites orbiting us. This model is super useful, as it allows us to play out scenarios in our heads before trying them out in the real world. For example, say I am in a room with my boss and his boss. My boss makes a suggestion I think is stupid. Rather than blurting out that fact and suffering the consequences, I can run a series of scenarios in my head before commenting.

  1. Hey Bob, 1963 called, they want their shit idea back, oh and November the 22nd says it wishes it was you instead of Kennedy.
  2. Yea, nice one, or we could do x, might be quicker?

Now we run the simulations and go through the likely outcomes.

  1. Bob loves a good roasting and respects opinionated people. Bob's boss looks like a hard businessman, just the type to take the hard line.
  2. It's a pleasing answer, but it puts across two points at once, which makes me come across as uncertain.

It may seem unnecessary to state that this exchange all goes on inside my head, and that the Bob referred to here is my mental model of Bob, which may not, in fact, represent the real Bob in every aspect. Yet the world is a complex place, and we are often surprised when an interaction goes completely differently to the simulation. Unfortunately, we don't think very much about how poor a representation of the real world our mental model is. This is largely because we need to have faith in our simulations in order to act without undue hesitation.

Changing the model

Each time one of our simulations leads us awry, it causes us to restructure our mental model. This is fine for non-weight-bearing parts of the model, for example, realising you actually do like olives. For most people, there's not too much re-shifting of heavy mental furniture required to accommodate this new fact. For more structural changes, though, the shifting of mental furniture takes a great deal of mental effort.

To stretch the analogy further, imagine that changing my mental model, say switching my opinion of a co-worker, is like moving the sofa in my living room. As well as expending the energy to move the sofa, I have to adjust all of the things associated with it. My mental model must remain consistent, useful and operational, in the same way my living room does. I can't just turn the sofa to face a wall and continue using it as is. I need to be able to see the TV and ensure I get enough light to read.

This effect is even greater when we are asked to accommodate new information. This is why some of the most persuasive arguers will take the time to help you build the mental framework into which they want you to place their new idea, long before they ever get to the idea itself. In his excellent books, Douglas Hofstadter takes us on wonderful walks through Gödel's theorem, Escher's art and Derek Parfit's transporter thought experiment before he gets to the subject of consciousness. Those amazing Eureka! moments are where the last piece of the puzzle slots gracefully into your newly formed mental model of the world.

Mentally lazy

Obviously, some people are quite energetic mentally; to them, it's fun to listen to someone and say "Hey, you're right, if I knock a wall down here and add another window there I'll get so much more light in and have room for that chaise longue!" (some people have weird taste in thoughts). Others are not, and laziness can turn a deaf ear to even the simplest idea. This is why you can be demonstrably correct and still not get someone to listen to you. It is also why people sometimes get angry with you for being right: accepting your right idea requires too much mental restructuring. Some beliefs are simply so entrenched (so foundational) as to be literally unchangeable.

Entrenched beliefs may not be as detrimental as they first sound. If my mental model is right most of the time, but you have shown me an obscure edge case where it is wrong, what is the cost/benefit of changing my entire model? Remember, all of our mental models are incomplete; their value is in how well they help us predict our future, not in how logically consistent they are. This is nicely illustrated by Philip K. Dick, who was a barking mad drug hound, yet nevertheless operated at the highest intellectual level, giving talks on reality at universities. This was because, despite believing we were still living in biblical times, 50 years after the death of Christ, his mental model made correct predictions.

When I was younger I read Richard Dawkins' The Selfish Gene and the other books in that series. They moved my mental furniture through reasoned argument, and I was a big Dawkins fan for nearly 30 years. I now see him arguing from a position of complete entrenchment, as a die-hard atheist/materialist, against equally entrenched religious people in what are supposed to be serious debates. I know he can't possibly expect to change his opponents' views (my internal model refuses to shift to allow me to see him as a fool), which leaves me thinking that he supposes I am a sufficient fool to be influenced by this sham. Sigh...

People's mental models of the world are important to them and serve as the foundation of who they see themselves as. The key to successfully changing someone's opinion is to help them see the benefit of shifting to your model rather than proving theirs is wrong. I guess it also makes you a nicer person to understand when being right doesn't really matter.