Odysseus and Intentionally Inconvenient Design

You perhaps know the famous story of Odysseus and the Sirens.

In Greek mythology, the Sirens were famous for their gorgeous singing. So beautiful was their singing that any sailor who heard their songs would stop sailing to listen. With no one steering the ship, it would crash, and everyone who had heard the Sirens’ songs would die.

In the great Greek epic the Odyssey, the hero Odysseus avoided this trap by ordering his men to block their ears with wax and then tie him to the mast, so that he could hear the Sirens’ songs without doing anything stupid, like crashing the ship just to hear more.

Lo and behold, it worked. By planning ahead of time to avoid temptation, Odysseus managed to experience what was most beautiful in his voyage home without killing himself.

Given that this was written about 2,500 years ago, it’s clear that the human struggle between what tempts us and what we know is best for us is nothing new.

But I’m of the opinion that the struggle is harder now than it has ever been. Capitalism is an evolutionary process that continues to push the boundaries of what tempts us, and to make those temptations ever harder to resist. Every new day, we are faced with the most potent forms of temptation known in human history.

This makes Odysseus-like skills at avoiding temptation more important than ever.

***

Last year, I was listening to one of my favorite podcasts, 99% Invisible, and I learned about the concept called “unpleasant design.”

According to the show:

Benches in parks, train stations, bus shelters and other public places are meant to offer seating, but only for a limited duration. Many elements of such seats are subtly or overtly restrictive. Arm rests, for instance, indeed provide spaces to rest arms, but they also prevent people from lying down or sitting in anything but a prescribed position. This type of design strategy is sometimes classified as “hostile architecture,” or simply: “unpleasant design.”

Unpleasant design is structure intentionally built to make you uncomfortable. What a brilliant idea! Using discomfort by design to deter certain behavior.

***

Given the challenges I have in keeping my attention focused when I want to focus, I have often thought about tying myself to the mast when designing my own workspaces. To do work, you have to go online. But when you’re online, the entire information age, with all of its increasingly well-cultivated attention traps, is there to tempt you. It’s not easy for me to get what I need to get done without hearing the Sirens’ song of web temptation and crashing into a rabbit hole of internet distraction.

So I’m constantly working on better ways to filter the internet I need to use from the internet that is taking me away from what I need to do.

What I’ve been working on is a concept I call “intentionally inconvenient design.” It’s stolen from the idea of unpleasant design, and the basic thought process is this: I need to be able to search for things online, but the internet now proliferates with sites and attractions that make it hard to do just the bare minimum, which means I tend to spend far longer online than I need to.

My solution is to make it difficult to go online, particularly to sites that are not designed for productivity.

This is still a work in progress. But here are a few things I have done to improve my own work environment:

  • I use a service called “StayFocusd” that gives me a nuclear option to block all sites except for certain allowed sites while I am online.
  • I’ve created impossible-to-remember 25-30 character passwords for Twitter and Facebook. I then printed them out and put them in my sock drawer, and I don’t keep them anywhere on my computer or other devices. That way, if I really want to access social media, I can. But it’s not easy. I have to really want to do it or it’s not going to happen.
  • I created a slightly less challenging password that I must manually input into my computer every time I want to use an internet browser.
  • I have a sit-stand desk with a ball. I alternate between sitting on a ball that’s tiring to sit on for too long and standing, which is also tiring to do for too long. That way, my work time is more concentrated and intentional.
  • I try to never use the internet while sitting comfortably.
  • I store my iPad in the least convenient place I can, in a closet in the least-used part of my house.
  • I have a 4-year-old iPhone with cracked glass and an internet browser that does not work well. The phone will not be replaced until the “phone” feature stops working.
  • I put all desserts in a crawl space only accessible by a rickety ladder and in a 24-inch stainless steel safe that is protected by padlock.[1]
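For the password mechanism above, here is a minimal sketch in Python of what generating such a password might look like (the 28-character default is my own choice within the 25-30 range, and the function name is hypothetical):

```python
import secrets
import string

def sock_drawer_password(length: int = 28) -> str:
    """Generate a password long enough that memorizing it is hopeless.

    Uses the `secrets` module rather than `random`, since `secrets`
    draws from a cryptographically secure randomness source.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Print it once, stash the hard copy in the sock drawer,
# and never store it digitally.
print(sock_drawer_password())
```

Run it once, print the output, set it as the social media password, and delete any digital trace.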

I appreciate that all of these mechanisms are kind of ridiculous. But as silly as they seem, I think they help. It’s much easier to go online and let my attention wander than to do the things I know will make my life better. And, without prior restraint—true to Zipf’s Principle of Least Effort—that’s what I do. Unless I tie myself to many different metaphorical masts throughout my day, I waste much of it.

The whole world of information and entertainment is right there, at our fingertips, every minute of every day. That’s gotta be every bit as tempting as the Sirens’ song, right?

[1] This one isn’t true.

First Ascents, Google Glass, and Thalidomide

Recently I was climbing a remote mountain near my home.

The route I took up this particular remote mountain that day was ill-advised. It didn’t look all that bad from the bottom; it just looked like the most direct route up. But after I ascended the first ridge, it became clear that I had put myself in a bad place. I’m a veteran mountain runner with nearly 30 years of experience, but the route I chose that day was a serious mistake.

It’s a safe bet that every mountain in Colorado has been climbed many times. There’s no such thing as a first ascent on Colorado peaks anymore. But on this day there were a couple of points where I wondered whether any human being had ever been in precisely the position I was in at that moment.

If you find yourself in a place like this in 2018, chances are you’re in a very remote, exciting place. And your life is probably in danger. Because today, with 7 billion-plus people on the blue orb, if no human has been where you are, there’s probably a reason for it.

***

I remember hearing an adage that plane crashes don’t happen because of just one mistake. There are fail-safes in place to ensure that a single mistake does not cause a crash. Crashes happen when pilots and their co-pilots make a series of novel mistakes that no other pilots have ever made before, in a sequence that the safety mechanisms do not anticipate and therefore cannot prevent.

I think the same is often true of other accidents, including mountain deaths. Mountain deaths don’t happen when you make just one mistake. It’s usually when you make a series of mistakes. You climb a mountain that is beyond your experience level. And then you get off-route, and then you decide to take a “short cut” down. And then you find yourself in a series of places where the only way out is on a cliff with unstable talus or loose scree.

I try to look at each decision on the mountain as unique. I try not to let a prior bad decision influence my decision-making about what to do next. But on this particular day, I just kept making bad decision after bad decision. I was path breaking in a way I’m not normally accustomed to, and in a way I prefer to avoid.

I like to go to remote places in the mountains, but I don’t like to take huge risks when I do. I like to explore, but not to the extreme where there is anything more than a vanishingly small risk of serious danger.

***

Venturing off the beaten path gets good PR. Perhaps undeservedly good PR.

I think this is because of survivorship bias. People tell heroic tales of those who take great risks and are rewarded. It doesn’t always work out that way.

A few years ago, I got a pair of Google Glass. For a brief shining moment, this was the cool new thing in tech. People would stop me on the street and ask to use it. I thought I was being novel and ahead of the game. I was new to the community of startups and technology, and I wanted to signal that I was adventurous and “in the know” when it came to tech.

I should have bought Bitcoin instead.

On a dime, the world turned on Google Glass and decided that this was no longer the coolest thing. It became apparent that the device was buggy and wasteful, a grandiose symbol of the worst forms of perpetual tech distraction. People who wore them became known as “Glassholes.”

My fancy (and expensive) new toy quickly came to signify the opposite of what I had wanted to signal.

I wanted to signal that I was not afraid to take a risk. I took a gamble on being the first to adopt a new technology, and I ended up wasting my money on a technology that wasn’t helpful or useful to me at all. Plus, it made me look like a fool.

***

And then there is the lesson of Thalidomide.

According to a 2009 article by Bara Fintel, Athena T. Samaras, and Edson Carias:

Thalidomide first entered the German market in 1957 as an over-the-counter remedy, based on the maker’s safety claims. They advertised their product as “completely safe” for everyone, including mother and child, “even during pregnancy,” as its developers “could not find a dose high enough to kill a rat.” By 1960, thalidomide was marketed in 46 countries, with sales nearly matching those of aspirin.

Around this time, Australian obstetrician Dr. William McBride discovered that the drug also alleviated morning sickness. He started recommending this off-label use of the drug to his pregnant patients, setting a worldwide trend….

However, this practice can also lead to a more prevalent occurrence of unanticipated, and often serious, adverse drug reactions. In 1961, McBride began to associate this so-called harmless compound with severe birth defects in the babies he delivered. The drug interfered with the babies’ normal development, causing many of them to be born with phocomelia, resulting in shortened, absent, or flipper-like limbs. A German newspaper soon reported 161 babies were adversely affected by thalidomide, leading the makers of the drug—who had ignored reports of the birth defects associated with it—to finally stop distribution within Germany. Other countries followed suit and, by March of 1962, the drug was banned in most countries where it was previously sold.

Obviously, this is horrible and a parent’s worst nightmare. Well-intentioned parents took medication they were told was safe and ended up giving their children life-altering and often fatal birth defects.

***

It’s perhaps callous to compare these three distinct phenomena, but they all—with varying degrees of seriousness—touch on the question: when is it safe and wise to try something new?

There is no universal rule for how to deal with novel risks, of course. You could be the first person to try a new computer game and the risk of serious harm would be next to nil, and you could be the millionth person to try base-jumping and you could still get splattered against a mountainside.

But while there is no universal rule, there are some game-theoretic insights that may apply. While on the mountain, and afterward, I was reminded of Kevin Laland’s fantastic book, Darwin’s Unfinished Symphony: How Culture Made the Human Mind.

According to the book, for all the praise many people want to give to those who innovate, it is those who strategically observe and exploit that fare the best.

Being the first to go somewhere or adopt a new technology is highly overrated. As a general rule, the author says, watch where others go, see which strategies seem to work, and then copy based on what you observe. That’s the meta-strategic path.

Standing on the side of a cliff, that thought resonated with me. It’s fun to explore novel and interesting places. But in the moment where you are truly exploring and trail blazing where no one has gone before, you appreciate more than ever why some prior strategic reconnaissance would have been helpful.

Sitting and Moving

Sitting and moving. That’s what I spend the majority of my free time doing.

Stated another way, I spend a lot of time meditating and running. Between those two activities, I probably average between three and four hours a day. That may seem like a lot of time to spend on activities with no obvious practical purpose. But to me, they are the most fundamental things I do every day.

It’s hard for me at this point to imagine doing anything different with my life. But almost every other person on this planet chooses to spend their time differently, which means that my choices here are unique. So I figured I’d write a brief homage to why I enjoy these two activities so much.

The moving is beneficial because it is exercise. Evolutionary history tells me that we humans are animals first and cognitive creatures second. If we get caught up too much in our cognition, the machinery stalls out. By moving every day, I bring myself back to the most basic aspects of my physical being—navigating my natural environment. That navigating helps me remind myself every day of the way that I physically inhabit the space around me. It is powerful to move. Without that movement, the other aspects of my day feel less real.

The sitting is beneficial because it stops me from getting too caught up in the things I have going on in my life. The tasks of a professional can seem very important in the moment you are doing them, but on a cosmic scale, it is hard to think that they are. Sitting is a daily reminder of the fragility of our temporary forms of existence. It is a reminder to let go of small things and to remember one’s place in the broader scheme of the universe. That sounds like heady stuff, but in the moment of sitting, there is truly nothing to do. Sitting is not an addition process; it is a subtraction process. Just sit and let go of everything else.

On the surface, one might think that running would be the harder of the two—much harder than sitting. But I certainly find the sitting to be much more challenging than the moving. For me, unless I’m pushing myself to the extremes of my fitness level, running for an hour or even two isn’t that much of a challenge. On the other hand, sitting still for an hour is incredibly difficult—and I almost always fail to sit completely still for the entire time that I set out to sit.

So in a sense, I’m a better mover than a sitter. Or perhaps sitting still is inherently more difficult than moving. Either way, it’s the sitting that is the harder of the two for me.

Also, the more I meditate, the more I find commonality in what previously seemed like disparate experiences. I practice a few different kinds of meditation, some of which involve deep attempts at concentration and some that are not as active, but rather attempts to let go of the intention to control one’s attention. Initially, the latter experience didn’t feel much like meditation at all, but rather just an out-of-control wandering. But the more I meditate, the less different the focused meditation and the non-intentional meditation seem to be. Often, after twenty or thirty minutes, there is no difference, or there does not seem to be.

And so too with sitting and moving. The more I meditate and run, the more I find that returning to the breath can be as useful in running as it is in sitting. And after an extended time running, I sometimes notice similar sensations to those I feel when sitting—the challenge of staying focused in the moment, an internal voice suggesting to do something else, even a form of mini-nausea arising from the intensity of the experience.

The more that I sit and move, the more the experiences start to seem alike. Perhaps there might be a lesson in there about the universality of all experience, but I doubt I’m qualified to render such an opinion. What I can attest to is the power of the intentional sitting and moving. It’s a power that feels as deep as anything I have encountered.

Nudging to Non-Distraction

Cass Sunstein has been one of the most influential legal scholars of the 21st century. One of the most influential economists of the 21st century is recent Nobel Prize winner Richard Thaler. In 2008, the two got together to write Nudge, a book about a series of methods and techniques for influencing people’s behavior—without legal coercion—toward healthier lifestyles.

There are many examples of how simple nudges can lead to better behavior and better life outcomes. For example, it’s possible to influence whether students eat healthy food in school cafeterias by rearranging the placement of the food so that the healthy options are more prominent and the unhealthy options less prominent. It’s also possible to get workers to save more for retirement by making saving a default choice rather than a choice that requires them to opt in. Nudges take a variety of forms, from subtle and seemingly invisible to transparent and educative.

Nudges have been shown to be effective at improving outcomes in the contexts of health, savings, highway safety, employment, discrimination, the environment, and consumer protection (Sunstein 2013; Halpern 2015).

One Finnish academic paper notwithstanding, comparatively little attention has been paid to using nudges to limit the distractions in our lives. I think this is a shame.

I believe the following statements to be true:

  • There is more information now than ever before.
  • There are more sources of distraction now than ever before.
  • Those sources of distraction are getting more effective at distracting us.
  • Perhaps more than ever before, people are distracted and unable to focus in a way that prevents them from having healthy relationships and reaching their personal and professional potential.
  • Nudges are a relatively popular and effective way to influence social behavior.
  • Nudges could be used to reduce mental clutter and distractions in our lives.
  • It is worth exploring ways to do this effectively.

There is nothing magical about nudges. Nudges won’t solve these problems immediately or permanently. But carefully crafted nudges could help us design less distracted communities and improve our well-being.

There is ample literature about nudges and their effectiveness and benefits; I think it’s time to start seeing if we can apply this non-coercive policy tool to help make us less distracted.

This is a topic I expect to read and write about much more in the upcoming months and years.

The Feck Off Gates

“Around here,” my uncle said, accentuating with a pause, in his thick, West Cork accent, “those are known as the ‘Feck Off Gates.'”

The Irish do have a way with words.

It was the day after a cousin’s wedding in Ireland, and I was at my uncle’s house (the father of the groom), watching a hurling match between Cork and Limerick. This was the All-Ireland semifinal, and since all of my father’s side of the family is from County Cork, all eyes were on the match. And since this was the groom’s family’s house and it was near the hotel where the reception had been held the night before, most of those eyes were in this one particular room.

Imagine a dozen or so Irish men, and a few Irish women (plus me and my wife), crammed into a smallish room, sitting on couches, all exhausted from the prior night, screaming at young men playing a sport with sticks and a ball.

Hurling is an Irish sport. And the sport is governed by an organization called the GAA, or the Gaelic Athletic Association. It’s hard to describe in a few sentences what the GAA means to Irish communities. But it’s possible that outside the church, the GAA is the most influential organization in many Irish towns and small cities. And in some towns and cities, it might be even more important than the church.

The GAA is about the sports, but, like most social endeavors, it’s mostly an excuse for the community to get together. To talk about what’s happening in the community.

At half time, after the habitual and expected armchair punditry about the match, the topic of conversation, as often happens with Irish conversations, switched to local news and gossip. My father, his brothers, and other relations bantered back and forth about who recently had been married, who had died, who was having children, and who had moved where.

In this community, it’s an understatement to say that everyone knows everyone.

If I go back to visit family in the small West Cork town of Kilbrittain, where my father is from, by the end of the third day, seemingly everyone in town will know that there’s an American in town, that I’m Mick the Manager’s grandson, that my wife who may be of Mexican descent is with me, and that we’re staying in town for a week. They probably even know whether they can expect to see me at Mass on Sunday.

So when a new house is built, or someone new moves to town, it’s news.

With that in mind, my father asked my uncle, “Who owns that huge new house by the strand in [name intentionally omitted]?”

“An American,” my uncle replied.

“Really?” my Dad said. “I didn’t know there were any Americans living around here.”

Upon hearing about the new house, a few other relations commented on the size of the house, and perhaps more newsworthy, the size of the gates in front of the house.

And this was when my uncle dropped the line about the “Feck Off Gates.”

If you’ve never been to Ireland or been close to an Irish person, “feck” is a slightly more polite way of saying fuck. And the fact that this was coming from this particular uncle was a bit of a shock. While most of my family has no compunction about swearing or cursing, this particular uncle is known as the saint of the family. He’s never had a drink in his life (an impressive accomplishment for an Irish man), and he almost never says a bad word about anyone. And he doesn’t really swear.

But clearly this particular American, and that particular house, had made an impression.

I was thinking to myself, the guy who bought the house is probably an Irish-American. He’s probably achingly proud of his Irish roots. At some point, he visited this area and fell in love with it. Maybe his grandparents or great-great grandparents were from the area. I imagine he wanted to feel more connected to it, and since he had done well in life, he probably thought to himself, “you know what, how great would it be if I had a second (or third or fourth or fifth) home around here? I could wake up next to the ocean, go for a walk along the same beach my grandpa (or great-great-great grandpa) used to walk on, go into the local pub. It’ll be great.”

So he bought the land, spent millions of dollars on a gorgeous house with magnificent gates, and he built it on one of the best parcels of land in West Cork. This was the status symbol that was supposed to connect him back to his Irish roots—to show everyone that he’d come back, and that he’d made it in life.

But as with most status symbols, it’s a hard thing to get right. That’s what the whole humblebrag phenomenon is all about. Everyone wants status and respect, but we have to be subtle about how we go about getting it.

Go about it the wrong way, and you’re the guy with the “Feck Off Gates.”

Completeness of Experience

So much of the focus in western culture is on the novelty, intensity, and variety of our experiences.

Where have you traveled? What restaurants have you been to? Have you ever done a full Ironman? Have you ever had this grapefruit-infused IPA? What about the truffle-oil infused pork belly? Have you gone zip-lining in Costa Rica? How about bike-packing in Colombia?

Have you done a 100-mile race? How about a 200-mile race?

And of course when the goal or focus is on having the greatest variety of intense experiences, there is always a worry. Did I pick the best dish at this restaurant? Is this job the best job I could possibly have at this moment? Am I doing enough?

I think that’s the source of the ubiquitous “fear of missing out.” The fear that there is a better experience somewhere than the one we are currently having. And this creates a pervasive sense that what we are doing now is somehow not quite good enough. And then the feeling that we are not quite good enough generates a desire to perpetually optimize for better experiences.

But the mind that is perpetually optimizing is a mind that is never at rest. The perpetually optimizing mind may struggle to appreciate the present experience, because it is always searching for a better one.

In my 20s and 30s, I was always worried about how to acquire a variety of novel and intense experiences. Now I try to focus more on the completeness of any given experience.

This is a perpetual challenge.

If you are focused on the completeness of your experience, rather than its novelty, intensity, or variety, it doesn’t matter if you are taking out the trash or washing the dishes. It can still be a source of pleasure and peace. But if you are not focused on the completeness of your experience, you could be sipping on a cocktail on a tropical beach and still be deeply dissatisfied or outright miserable (“That private beach over there looks nicer”; “I should have gone to Aruba instead of Jamaica”; “The view here is blocked by those trees”; “This humidity is oppressive”; “I should have ordered that Piña Colada instead of this Bahama Mama”).

This isn’t to say there is anything wrong with good food, travel, or intense experiences. It’s just to say that without a sense of appreciation, and the feeling that the present moment is good enough, all of our experiences may seem incomplete. But if we do our best to appreciate the present moment, it brings a sense of completeness and fulfillment to whatever we do, no matter how mundane or ordinary.

Defaulting to Distraction

Recently, during a minor surgery for a loved one, I was waiting in the hospital lobby. The main waiting area adjacent to the operating rooms was a tight space with about a dozen chairs. All of the chairs faced an extremely large television. The television was turned on, and the volume was up very high.

I looked around and thought, “Does everyone else around me really want to be watching The Golden Girls at full blast at 7:35 in the morning?”

I suspect the answer was no. No one seemed to be enjoying it (say what you will about how hip Betty White is these days, the humor did not age well). Eventually I, and almost everyone else waiting in the lobby, sought out and moved to quieter areas of the hospital.

It seemed so obviously ridiculous. But it was still happening, and I suspect it probably happens the other 364 days of the year, too. And the same exercise is repeated at hotels, airports, restaurants, laundromats, and other public spaces around the world.

In so much of the public sphere, the default setting is one of intense distraction. And as was the case this morning, sometimes the distraction is loud, in your face, and almost impossible to avoid.

Many of the smartest policy influencers like to think about nudges and choice architecture, and how that architecture can affect whether we choose to smoke, whether we invest in our retirement, or whether we donate our organs. I’m a fan of this conscious and thoughtful choice architecture as a low-impact, soft, and non-coercive way to influence people’s decision-making.

I know there are some people who have thought about such things, but perhaps this could be a greater point of emphasis. I’m sure the hospital administrators felt they were providing a service by playing the TV all hours of the day and night. I mean, they spent money on the damned TVs, and I’m sure some people like having the noise to distract them. Maybe for some, even the lowest quality, over-the-top distraction is better than being alone with their thoughts, particularly when we know a loved one may be in distress or danger.

But perhaps we can do better for our default social setting than to force-feed 30-year-old sitcoms on everyone around us. Those anxious times in waiting rooms might also be the moments when we could have the most important, tender, and meaningful conversations of our lives. When we could tell the friend or brother that we often take for granted that we love them.

But it’s hard to do that when you have a laugh track cackling at full volume eight feet from your head.

When I was a kid, people smoked on planes. When I was in college, you couldn’t go out to most restaurants or bars without coming home reeking of cigarette smoke. Over time, we decided that second-hand smoke was gross and that the default setting should be a smoke-free environment.

No one decided smokers shouldn’t be allowed to smoke. But we decided we didn’t want to make non-smokers breathe that dirty air everywhere they go. I’m not suggesting we ban The Golden Girls completely (well, maybe). But it’s not strictly necessary to force everyone to watch TV in shared public spaces as a default social norm, either.

Maybe, over time, we could adjust our default settings of distraction. We could decide that the default setting for public spaces could be one with no distractions, and let the TV-watchers adjourn to a TV lounge in some remote part of the building.

Here’s to hoping.

Reflections on the Passing of Anthony Bourdain

I don’t feel envious of many people, but I was definitely a little envious of Anthony Bourdain.

I remember thinking every time I watched one of his shows, “now that guy has the greatest job on the planet.” Traveling around the world to its most beautiful and exotic places, connecting with its best chefs, eating its best food. Meeting many of its most interesting people. And then crafting fantastic and creative narratives around those experiences. His shows were brilliant.

Who wouldn’t have traded places with Anthony Bourdain?

All the epicurean delights this planet has to offer. Status. Success. Autonomy, Mastery, and Purpose. Eudaimonia. PERMA. By what seemed like every external measure of what a person could want, he had it.

Turns out, it wasn’t enough.

I cannot pretend to know his thoughts or his inner demons. But I think it’s safe to say that, at least for him, having one of the best jobs and one of the most interesting lives of any person in the history of this planet wasn’t enough. And if that life wasn’t enough, what could be?

The answer, sadly, is almost certainly nothing.

There is no thing that will ever satisfy you.

Nothing will ever be enough. There is no thing that, once acquired, will make life’s problems go away. We just weren’t designed to be happy. We were designed to feel perpetual dissatisfaction and to think we need and want more. When we see the Instagram photo, Facebook post, or celebrity snapshot of what seems to be a better or more glamorous life than our own, we’ll never know what deep suffering or sadness may be lurking beneath it.

So if I take away one thing from what I believe to be a great man’s passing, it is this: Try not to worry about what you think you want. Let go of the pursuit of the things that are supposed to be the things that will make you happy. They will never be enough.

Who knows if that sentiment could have saved Anthony Bourdain? Probably not. He was well traveled enough that he had likely heard something like it before. We will never know if anything could have saved him.

But what I take from his passing is that all the wonderful things this life has to offer will not be enough to save you, if you are not already at peace with this life.

How Obsession with Success May Lead to Failure

This may sound obvious, but success isn’t generic. To be successful, you have to be very good at some specific thing.

But there seems to be a burgeoning industry in the study of success as a sort of meta-discipline. As if knowledge of how various people succeed at a variety of things might lead us to become more successful at any given thing.

But it’s all based on a dubious premise.

By treating success as an object in itself, we’ve taken an incredibly hard problem, namely, the study of how to achieve at the highest levels in a specific discipline, and converted it into what strikes me as a nigh-impossible one: the study of how to achieve the highest levels of success in any discipline.

What makes someone a great basketball player, long-distance runner, venture capitalist, attorney, writer, or pianist is not the same. There is no evidence that obsessing over the meta-habits and traits of people in a variety of disciplines provides any tangible benefit to those looking to outperform others in any specific field.

Achieving the highest levels of success requires a single-minded obsession with craft (or crafts). In every discipline, people compete over novel strategies specific to that discipline to achieve recognition and status within it. Most will never get there. Not because they’re stupid or misguided, but because life is competitive, and many talented people are vying for the same status and recognition.

LeBron James is the best basketball player on earth because he’s awfully talented and he continues to hone his basketball skills. Everyone who listens to Tim Ferriss’s podcast knows the habits and routines of the successful people he interviews, but no amount of meta-trait analysis will make them the next LeBron James (or Magnus Carlsen, Richard Feynman, Elon Musk, or even Tim Ferriss).[1]

I suspect that LeBron James, Magnus Carlsen, and Elon Musk, on the other hand, are all totally oblivious to the traits and meta-habits of those who succeed in other fields. It doesn’t matter, because they’re masters in their own.

Success–like so many things in life–comes indirectly. If you seek out success in general, you’re almost certainly going to be moving farther away from any hopes of success the longer you think that way. You could spend a lifetime obsessing over how to be the best chess-master, programmer, or 800 meter runner, and you probably won’t get there. And if you spend your life obsessing over how to be successful in general, you definitely won’t get there.

Success isn’t a thing you just go and get. It’s a thing that comes sometimes to some people when they get really good at some other thing.

[1] Also, traits that make you successful in one field are often detrimental in others. Not answering emails might help with your focus if you’re trying to write the great American novel, but good luck with that approach if you’re a mid-level associate trying to make partner at a major law firm. Getting up at 4:30 am might work well for a former Navy SEAL looking to create a media empire, but it’d be a terrible strategy for someone training for a marathon world record. Context always matters.

Why Most Theories Are Like Stories

My high school history teacher had a one-liner he liked to drop into our history lessons. It went something like this:

“Before I had kids, I had two theories about parenting. Now, I have two kids, and I no longer have theories about how to be a good parent.”

It’s a clever turn of phrase, always good for a chuckle with new parents. And it hits at two near-universal truths. First, that parenting is a tough and messy business. Second, that most of our lofty theories, especially the ones that are supposed to guide our behavior, are really just so much BS.

It’s easy, when you have no kids and are fully rested, to sit in a comfy chair and think it’s best to never yell at your kids. But when your four-year-old is smearing fecal matter on your new carpet or headed straight toward an open ledge, sometimes you gotta do what you gotta do.

The way the joke resonates with people shows that, at some level, we know most of our theories are kind of BS.

So why have theories at all?

***

A few scholars have taken the position that the best and most honest way to judge human behavior is to avoid theories.

A law clerk named Andrew Jordan recently published an excellent law review article called Constitutional Anti-Theory. If you’re at all interested in the US Constitution and how it is applied, I couldn’t recommend it more highly. The basic summary of the article is that sound decision-making in applying the Constitution does not require a coherent theory. What judges and policy-makers do when they apply the Constitution, whether they admit it or not, is weigh certain criteria and values, such as the right to privacy, safety concerns, freedom, and the stability of the judiciary and the government, and try to come up with a fair and equitable result based on the facts presented to them. Judges hear the facts and apply the law to them as they understand both. This is what they do. Adding a universal theory about how to interpret the Constitution merely creates an intermediary layer that prioritizes one of these values over another.

According to Jordan, there is no reason we should have to do this.

Jordan’s argument isn’t new. Former Judge Richard Posner, perhaps the most influential jurist of the last fifty years, published another law review article called Against Constitutional Theory over twenty years ago. In it, he criticized Antonin Scalia for making value judgments in the guise of not making value judgments at all, in the name of Strict Constructionist constitutional theory. This theory holds that we should decide modern-day constitutional arguments based on how the original founders intended the Constitution to be read. This ran counter to how most judges have interpreted the law throughout history, as a corpus of decisions and precedents that evolves over time. It wasn’t an accident that Scalia was also the most conservative Supreme Court justice of the past half century. By intellectually pushing the judiciary to tether decision-making to a bygone era, he forced an anti-progressive stance on the way decisions had to be made.[1]

Posner wasn’t a fan of this approach. He considered himself a pragmatist, someone who looked at the specific facts of the cases presented to him, weighed the harms and injuries suffered by the parties, and then tried to make decisions that delivered the most equitable result. As a high-level appeals judge, almost all of the cases he reviewed dealt with difficult facts and circumstances, and making the right decision was rarely easy. He argued that what he wanted was more information, more data, and better social science to guide his decisions. Posner wasn’t interested in more theory, but in a better understanding of the facts.

***

Posner and Jordan notwithstanding, many of us feel inclined to adopt theories. To adopt a rule we deem to be universal that we claim governs our actions.

But if we know that, as fallible humans, our actions are sometimes inconsistent, and that life is always messy, why take this step?

One of the most powerful intellectual tricks you can play (on yourself and) on others is to make a value judgment seem like it is not a value judgment at all, but rather a simple application of a universal rule. That way, it’s not your judgment that’s making a decision against someone’s interest, but rather, there’s something more transcendent at work.

Since the time of David Hume, we’ve known you can’t get an “ought” from an “is.” This is what philosophers call the “is-ought gap.” Value judgments must always derive from other value judgments. Any attempt to derive a value judgment (“this is the way things should be”) from a description of a state of affairs (“this is just the way it is”) is a clever sleight-of-hand trick. When we make value judgments, we’re always applying our values. It’s just that sometimes we pretend we’re not.

This may be one of the reasons why it is useful for people and organizations to enact policies or to publicly state a theory or principle that will guide their behaviors. These public stances aren’t useful because they’re universal truths; they’re helpful to convince ourselves and others that our actions will be predictable or consistent (even when they’re not). In a way, they’re like stories we’re selling to those around us. In the story, there is an arc of truth and universality. In reality, principles and policies are just ways to get other people to do what we want them to do.

***

Richard Posner, despite his impeccable credentials and respect among other members of the judiciary, was never nominated to the Supreme Court. Antonin Scalia and Clarence Thomas were. Scalia and Thomas are also brilliant, but the scope and breadth of their intellectual accomplishments never rivaled Posner’s.

Perhaps one explanation for why Scalia and Thomas made it to the Supreme Court and Posner did not is that Scalia and Thomas were much more predictable. By labeling themselves Strict Constructionists, these men effectively sent a signal to their conservative colleagues in the legislative branch that they could be trusted to deliver conservative verdicts. And that’s what they did (and are still doing, in Thomas’s case).

Posner was also a Reagan appointee and a conservative, but he was less predictable on a left-wing/right-wing axis than his Strict Constructionist colleagues. His pragmatic, detail-oriented approach inspired judges around the world, and his opinions were always deeply concerned with making the right decision based on the facts presented to him. His decisions were rooted in context, not ideology.

And that is perhaps why, despite the fact that he was one of the greatest legal geniuses ever, he never made it to the Supreme Court.

***

There’s plenty of evidence that we don’t come to our moral decisions based on reason. Since judges are people, too, it’s safe to assume the same applies to them.

But it’s problematic for judges and other people with authority to admit that their logic is based on moral intuition rather than something greater.

It’s useful for judges, companies, and parents to be able to appeal to something more transcendent than their own judgment. “Because I say so” just isn’t that compelling. “It’s our company policy that we cannot accommodate your request,” is harder to argue with than “I just don’t want to refund your money because then I won’t have your money.” If a judge tells you, “I think what you’re doing is wrong,” that seems arbitrary and unfair, but “I’m simply applying the law,” is hard to dispute. When it’s company policy or transcendent principle, rather than personal judgment, there’s no one individual person with whom you can take up your debate. It’s outside the control of the agents acting out the order.

We may sense that the real motivation for a decision is personal values rather than principled reasons, but the process of depersonalization lends an air of transcendence to any decision, and is a useful tool for persons in power to justify their actions.

***

It is best to view any theory, principle, or policy that would attempt to impose consistency on the way people act with deep suspicion.

Life is all about making hard decisions in context. Theories and principles that purport to guide our behavior independent of context are almost always an artifice, because the human brain is a modular instrument that was not designed for consistency.

There’s an appeal to the idea of intellectual consistency. It’s a nice thought. But it’s just not how we work. If we’re looking for honesty, it’s probably best to admit that we’re fallible creatures trying our best to make things work in a messy world.

If we’re looking to rise to power, however, it might be in our best interests to tell a different story. All the better if it’s a simple, clear story where you are the author, where you alone know the answers, and everything you say is consistent and universally true.

That’s the kind of story people like to hear.

[1] It’s probably useful here to distinguish between theories such as Einstein’s General Theory of Relativity and Darwin’s Theory of Evolution, on the one hand, and Scalia’s Strict Constructionist Theory, on the other. The former two purport to predict and explain, and currently do predict and explain better than any other theories, natural phenomena in the fields of physics and biology. What Scalia’s theory tries to do is provide normative guidance on how people should behave. Evolution is true whether people believe in it or not; regardless of whether people are aware of it, it will continue as long as there are living creatures. There is nothing obligatory or necessary about Strict Constructionist theory. If people stop trying to interpret the Constitution based on the literal intent of the founders, the theory simply ceases to exist.