Simler and Hanson on Our Hidden Motivations in Everyday Life

[Update: December 17, 2017, with comments from Hanson]

Kevin Simler and Robin Hanson recently published The Elephant in the Brain: Hidden Motives in Everyday Life. It’s a book that might best be understood as a 400-page-long elaboration of the following blog post by Hanson, written nearly ten years ago:

Food isn’t about Nutrition
Clothes aren’t about Comfort
Bedrooms aren’t about Sleep
Marriage isn’t about Romance
Talk isn’t about Info
Laughter isn’t about Jokes
Charity isn’t about Helping
Church isn’t about God
Art isn’t about Insight
Medicine isn’t about Health
Consulting isn’t about Advice
School isn’t about Learning
Research isn’t about Progress
Politics isn’t about Policy

The book is about the elaborate dance between the pleasant-sounding, prosocial, altruistic motives we project to the world and the selfish motives that often underlie our behavior.

I’ve long enjoyed the writing of both Simler and Hanson, so I will confess that I was predisposed to like the book. I was not disappointed. It was a thoroughly enjoyable and easily digestible read on a difficult subject.

The book is an excellent survey of the literature on evolutionary biology and the biology of self-deception. The authors draw on the research of Trivers, Tooby, Haidt, and others.

The key thesis is not just that we are blind to our motives, but that we are strategically blind to them: we have evolved to produce post-hoc rationalizations for why we do what we do, because it is often not in our best interests to fully know our real motives. As the authors put it:

First, we’re suggesting that key human behaviors are often driven by multiple motives—even behaviors that seem pretty single-minded, like giving and receiving medical care. This shouldn’t be too surprising; humans are complex creatures, after all. But second, and more importantly, we’re suggesting that some of these motives are unconscious; we’re less than fully aware of them. And they aren’t mere mouse-sized motives, scurrying around discreetly in the back recesses of our minds. These are elephant-sized motives large enough to leave footprints in national economic data.

As an example, imagine someone who gives to charity. If the real reasons for that giving include not only genuine care for others but also a desire to look good in the community, then, according to the authors, the best way to sell the prosocial story is to actually believe that the real reason for giving is genuine care for others.[1]

The authors quote Trivers, who says, “We deceive ourselves the better to deceive others.”

Politics is about coalition building rather than pure policy. Art is about showing off how much leisure time we can spend on challenging, hard-to-replicate tasks rather than about beauty. Religion is about norm enforcement and hard-to-escape community bonds rather than divine inspiration. Education is about conformity, day care, and socialization rather than learning.

Nearly all of our social activities have hidden subtexts that are about more than what we politely discuss in public. These are our hidden motives in everyday life.

When I talked about this book with my wife, she said, “That’s interesting and probably at least partially true, but what do we do with that information?”

It’s a good question. It’s probably the question most people will ask themselves as they read the book.

Funny she should ask. It just so happens that this question is the central focus of the book’s final chapter and conclusion.

This was also what I considered the weakest part of the book.[2]

The authors’ primary answer to the question is “situational awareness.”

That’s all well and good when the goal is to detect others’ bullshit, but an alarm went off in my head in the “Physician, Heal Thyself” sub-chapter.

After all, if one of the book’s main theses is that self-deception is strategic, and that our lack of self-awareness about our own motivations serves a critical evolutionary purpose, how can situational awareness of that self-deception also be strategic?

We cannot “deceive ourselves the better to deceive others” and simultaneously benefit strategically from doing the opposite.

This seems flatly contradictory. If the very trait that is strategic in its absence can also be strategic in its presence, then neither trait would be strategic. The whole book argues not-P, and then the last chapter says, “But P!” The Elephant in the Brain is an anti-self-help book, and that’s okay. It might be the best anti-self-help book I’ve read. But in the last chapter it reverses course and goes into full-on self-help mode.

The correct answer to the question of “what do we do with this information?” is probably “situational awareness of our self-deception, though interesting, might not be that helpful in terms of our own behavior. That’s why we were designed with this lack of self-awareness.”

But that’s not what the authors say. Instead, they try to rationalize why this brand of situational awareness is helpful, and how it can be used in our personal life and in business.

The authors state that “Savvy institution designers must therefore identify both the surface goals to which people give lip service and the hidden goals that people are also trying to achieve.”

If taken literally, this is horrible advice! Savvy institution designers will do no such thing. Elon Musk would not be a better entrepreneur if he were aware, and openly stated, that his real motivations for building his companies were not just the betterment of the human race but also the glorification of his own ego and the raising of his own status.

If Stanford and other elite institutions advertised that their education was available for free to everyone and that the real value of a degree lay in bald, zero-sum, elitist credentialism; if churches advertised that the real reason for their elaborate ceremonies and overwhelming institutional demands was to demonstrate shared commitment and enforce community norms rather than divine inspiration; if companies acknowledged that the real purpose of the business is the ego-glorification and wealth-creation of the owners, rather than whatever garbage is spouted off in the mission statement; if a political party admitted “what we’re really trying to do is raise the status of these groups and lower the status of those groups,” then all of these institutions would immediately and irrevocably unravel.

Such rational instincts make for bad coalition building. And weak coalitions make weak institutions. Tooby says:

People whose coalitional membership is constituted by their shared adherence to “rational,” scientific propositions have a problem when—as is generally the case—new information arises which requires belief revision. To question or disagree with coalitional precepts, even for rational reasons, makes one a bad and immoral coalition member—at risk of losing job offers, one’s friends, and one’s cherished group identity. This freezes belief revision.

Savvy institutions have dogma. Savvy institutions have mission statements. Savvy institutions have mottoes, creeds, and fight songs.

Savvy institutions do not acknowledge their own inconsistencies.

Institutions that acknowledge their own weaknesses, biases, and inconsistencies are weak institutions.

This is why rationalists struggle to organize a meetup of 20 people in a metro area of two million, while the Mormon Church and Islam keep growing as fast as they do. This is why you’ll never meet a third-generation Unitarian.

It would appear that the authors fell into their own trap—wishing for a pretty benefit to ascribe to our awareness of our hidden motivations, when the rest of the book tells us that the opposite is true.

Either way, this doesn’t take away from the greatness of the book as a whole; it is still well worth reading. If any of these concepts are new to you, reading this book will make it hard to look at much of anything you do in the same way again.

[Update: Hanson replied to this post twice on Twitter. I’ll give him the last word.]

“Did you see us say: ‘Even when we simply acknowledge the elephant to ourselves, in private, we burden our brains with self-consciousness and the knowledge of our own hypocrisy. These are real downsides, not to be shrugged off.'”

and

“You ask ‘how is it that situational awareness of that self deception can also be strategic?’ We didn’t mean to suggest that the gains from situational awareness will usually outweigh these harms. We just said ‘There are benefits'”

[1] The authors would probably acknowledge that charity is at least partially about the selfless act of giving, but they would emphasize that we are programmed to play up the pleasant-sounding, selfless aspects of our giving while concealing our more selfish desires beneath the surface.

[2] I’m not normally inclined to focus on what I believe to be the most negative aspects of an author’s work. But in this case, Hanson claims that he prefers direct, frank criticism. So here goes.