Incrementalism

It’s hard to pin down exactly how many revolutions have happened in Afghanistan since Zahir Shah, the last king of Afghanistan, was deposed in a bloodless coup in 1973. There was the Saur Revolution of April 1978; a mini-revolution in October of that year in response to a moderation of marriage laws; the Parchami coup of 1979; the assassination of President Taraki in 1979; the Soviet assassination of President Amin in December of that same year; a ten-year war with the Soviet Union; the rise of the mujahideen; various failed puppet regimes backed by a collapsing Soviet Union; and the complete chaos and lawlessness between 1992 and 1996, when the country was ruled by various warlords, until the takeover of the Taliban in 1996. At that point, the story becomes more familiar to most Westerners, but, as the events of 9/11 and the subsequent 17-year war (and counting) in Afghanistan have amply demonstrated, the situation has not improved.

Afghanistan is good at fomenting revolutions, bad at stable governance.

***

My spine tingles whenever I hear someone arguing for a “need for radical change” or a “revolution in X.”

Most of the time, attempted revolutions fail, and things end badly for the revolutionaries. And when revolutions succeed, at least in the short term, most people are far worse off than before the revolution started.

***

Democracy, peace, and stability are historical anomalies. If you turn on your faucet and clean, potable water magically appears, if you take your trash to a location in front of or near your home and people reliably pick up that trash and take it somewhere safe, if you never (or almost never) have to look at or smell your sewage again after you flush your toilet, then your life is good. If all of the above circumstances apply to you, then your life is top-notch, by historical standards.

You won the lottery.

My philosophy is that people who have won this historical lottery should be cautious about insisting on the need for radical change, lest they get what they ask for. Radical change usually leads to more radical change, and that radical change tends to end badly or, worse still, to never end.

***

Of course, sometimes revolutions work. With the possible exception of Japan, nearly every nation is where it is today because some revolution took place.

But nearly all revolutions were followed by a period of instability or complete chaos. After the American Revolution, the government formed under the Articles of Confederation would have collapsed were it not for the many compromises (many of which now seem horribly fraught) struck in the US Constitution. And the US Constitution set up a government that wasn’t all that different from the constitutional monarchy the United States had previously taken great pains to escape.

The French Revolution was followed by the Reign of Terror, and then the reign of Napoleon, which led to the deaths of hundreds of thousands of French soldiers (and many more English, Prussian, Russian, and Egyptian soldiers). The Irish Revolution was followed by a civil war. The Russian Revolution was followed by a 70-year murderous, autocratic state, which included the purges of an estimated 30 million people and the brutal forced starvation of many millions more in the Ukraine. The Cultural Revolution in China might have been the worst of them all.

***

This is not to suggest that the opposite of revolutionary change—stagnation or unabashed conservatism—is a preferred approach, either. Individuals, systems, and organizations can always get better. And they should try to get better whenever they can.

Since life is competitive and most systems evolve over time, refusing to improve as a matter of principle will lead to obsolescence. If you don’t get better and grow, those around you will, and you will fall behind. Casey Neistat has a nice video illustrating his philosophy on this.

And the more extreme version of this philosophy, the one espoused by reactionaries, who seek not only to stop growth but to go backwards in time, is even harder to fathom.

Ours is the greatest period of prosperity in human history. The average life expectancy today in every country is better than it was in any country in 1800. Women couldn’t vote in the United States until 100 years ago. Black Americans couldn’t reliably vote here until about 50 years ago. Until a few weeks ago, no openly gay person had been elected governor in any state in the United States.

Why would we want to undo any of that?

***

For a brief while, I considered becoming a professional poker player. After six years of practicing law, this seemed like a good way to make a living. I was decent, but not great. I made modest amounts of money regularly. But what I discovered after playing a lot of poker was that playing poker well is incredibly boring, and, as hard as it may be to believe, perhaps even more boring than the practice of law.

If you watch poker on TV, there’s a lot of emphasis on the handful of times when a player goes “all in,” betting all of their chips on one particular hand. In reality, this happens infrequently. Good poker players only make huge commitments of their chips when they are extremely confident that they will win or when circumstances (such as tournament timing and increasing blinds) demand it.

Absent such extraordinary circumstances, poker players who are concerned about their bottom lines play few hands and make small but aggressive, well-timed bets.

Make big bets too often at the wrong times and you’ll go broke. Make no bets and you’ll go broke. Make small, smart bets often at the right times, and you’ll do well.

The goal of a professional poker player is to make more rational bets than the other players at the table. If they do that consistently over time, they’ll make money. If they don’t, they won’t. Do this over the course of tens of thousands of bets and you’ll make hundreds of thousands, if not millions, of dollars.
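
If you want to see that arithmetic in action, here’s a toy simulation with made-up numbers (a sketch, not a claim about real win rates): a small, consistent edge on small bets compounds into exponential growth, while big bets with a negative edge go broke.

```python
import random

# Hypothetical numbers: a pro risks 2% of their stack per even-money bet and
# wins 53% of the time; a gambler risks 50% per bet and wins only 47%.
def simulate(stack, fraction, win_prob, n_bets, seed=1):
    rng = random.Random(seed)
    for _ in range(n_bets):
        bet = stack * fraction
        stack += bet if rng.random() < win_prob else -bet
    return stack

print(simulate(1000, 0.02, 0.53, 10_000))  # small edge, small bets: exponential growth
print(simulate(1000, 0.50, 0.47, 10_000))  # big bets, negative edge: effectively broke
```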

***

Incrementalism is a modest hypothesis with the following tenets:

  • For any existing system, the best way to improve the system is to make small changes to the system, assess their impact, and then re-evaluate (a toy illustration of this tenet follows the list).
  • The fact that a system exists at all is a good indication that it is a better-than-average or better-than-random system—by the simple virtue that it has survived, adapted, and evolved for some period of time.
  • The longer the system has existed, the more likely it is to be better than average or random.[1]
  • Radical change that forces the system to change more than it maintains is likely to have a net negative impact on the system.[2]
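
The first tenet is essentially what computer scientists call hill climbing. Here’s a toy illustration, with a made-up fitness landscape of my own (nothing more than a sketch), of why small-change-then-assess tends to beat throw-everything-out:

```python
import random

def fitness(x):
    return -(x - 3.0) ** 2  # unknown to the optimizer; the best design is x = 3

def incremental(x, steps, rng):
    for _ in range(steps):
        candidate = x + rng.uniform(-0.1, 0.1)  # small change
        if fitness(candidate) > fitness(x):     # assess impact, then re-evaluate
            x = candidate                       # keep only what helps
    return x

def radical(x, steps, rng):
    for _ in range(steps):
        x = rng.uniform(-100, 100)  # discard everything the system has learned
    return x

rng = random.Random(0)
print(fitness(incremental(0.0, 1000, rng)))  # ends near the optimum
print(fitness(radical(0.0, 1000, rng)))      # ends wherever the last leap landed
```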

***

The moment that made me decide to stop playing so much poker came while I was sitting at a table with a particularly bad poker player in Blackhawk, Colorado. The man was probably in his 70s and had a breathing tube.[3] He joined a table where half of the people knew each other and were either on the payroll of the casino or semi-professional players.

He kept making bad decision after bad decision, and he was not aware of what he was up against. Put simply, most of us were playing poker, and he was gambling. He had watched poker on TV and convinced himself that the game was about betting aggressively, regardless of what cards you had in your hand. He thought poker was mostly about bluffing with sunglasses on, and that it didn’t matter if all signals pointed to the opponent having the better hand.

He had a relatively small stack of chips and a big wad of cash. He’d bet aggressively, lose it all, and then go get more chips. He did this at least a half dozen times.

For this man, every hand was all or nothing. He wanted to get rich on every single hand. If he didn’t, he cursed himself and his bad luck. The longer he did this, the farther away he got from the financial position where he started.

The pros are in the business of chipping away. Making one incremental good decision after another until, over time, due to the magic of compounding growth, their stack grows exponentially. The amateur tries to win it all, without patience or diligence, and without putting in the work. And the amateur who does that almost always ends up going home broke.

***

To use a different analogy, a 22-year-old novice runner might get inspired around New Year’s Eve and decide that they want to run in the Olympics some day. That runner might read that to get to the Olympics, a marathon runner typically runs 120 miles a week with three hard workouts each week, and try to replicate that strategy. But without 7-10 years of running to build up to that workload, that strategy will invariably result in injury, illness, and breakdown. A novice runner attempting even a quarter of that workload would likely implode. To anyone with any knowledge of running, this observation is obvious and uninsightful.

But for something much more involved and complex than becoming a good runner, such as creating a new system for healthcare distribution in this country, there are some reasonably intelligent people who think that a complete overhaul of the system could work out. That’s far less likely, in my opinion, than someone jumping off the couch and making an Olympic team.


***

There are lots of people out there selling the idea of the simple solution to all of your problems. It’s incredibly appealing to think that if we do just one simple thing, if we just find one hack or trick, all of the other things we do will get easier.

That’s why blog posts and YouTube videos with titles like “the ONE thing you need to do to get 6-pack abs” are so popular. We want to think that there’s just one thing we can do to fix our insecurities and make our abs look sexy, when in reality the “one thing” we need to do is to exercise obsessively, to eat healthier, and to have the right genes (and perhaps shave our chests and get spray tans).

This business of selling simple solutions is great for getting clicks, but it’s not as great for the people doing the clicking.

People crave simple solutions so much that they will flock to those who offer simple solutions to hard problems even when those solutions are obviously false.[4]

But the hard truth is that individuals get better by getting a little better over time. For institutions to get better, they must work to solve the hard problems that they have not yet solved. If they’ve been around for a while, the solutions to those problems will likely be technocratic, complex, and dull.

***

Of course, there are moments in history when only a dramatic change will do. An easy example of this is human slavery in the United States before 1865. There was no step-by-step, incrementalist way to solve that horrific affront to human dignity. It had to be ended in one fell swoop.

That’s what happened, starting with the most violent and deadly war in American history, which was then followed by a more than 100-year period of violence and struggle, the remnants of which are still around today.

Even when dramatic change needs to happen, history tells us it won’t come easy.

***

To me, it seems obvious that getting better as a person or as an organization is something that requires diligence, patience, and consistent effort. And, most importantly, it requires building on the lessons we have learned in the past.[5]

But I get the feeling that more people, on the right and the left, are calling for radical solutions to our problems, ones that discard the lessons we have learned from the past. I see people saying that our society is going to hell and that we need something other than incremental change to get better. Reasonably intelligent people are claiming that our society is in collapse. That we need to throw out everything and try something new.

***

Right now, I’m reading a book called Red Famine, by Anne Applebaum.

I’ve read a lot of depressing books about horrible things that have happened in history, but man, this one is truly brutal.

If you do not know about the Ukrainian famine of the 1930s (I was only vaguely familiar with it before reading the book), you could be forgiven. It’s not something that gets much press. The Soviet Union pretended it didn’t happen for more than 50 years. And the Western press (and Walter Duranty of the New York Times, in particular) was complicit in covering it up.

The short version is that the Soviet Union’s policy of collective farming in the early 1930s was a disaster. And because it was a disaster, and a decision made by Stalin, it wasn’t possible to admit it was a disaster. Instead, someone had to be blamed. The Soviets blamed Ukrainians, the biggest farming community within the Soviet Union, for hoarding food and refusing to produce enough bread. And the decision was made to actively rob and steal any food possessed by these Ukrainian farmers and give it to the rest of the country (and those loyal to the party, in particular).

In the words of Applebaum:

The result was a catastrophe: At least 5 million people perished of hunger between 1931 and 1934 all across the Soviet Union. Among them were more than 3.9 million Ukrainians. In acknowledgement of its scale, the famine of 1932–3 was described in émigré publications at the time and later as the Holodomor, a term derived from the Ukrainian words for hunger—holod—and extermination—mor.

Examples of how this was enforced are truly horrific.

Our father hid three buckets of barley in the attic and our mother stealthily made porridge in the evening to keep us alive. Then somebody must have denounced us, they took everything and brutally beat our father for not giving up that barley during the searches…they held his fingers and slammed the door to break them, they swore at him and kicked him on the floor. It left us numb to see him beaten and sworn at like that, we were a proper family, always spoke quietly in our father’s presence…

A brigade searching through the roof thatch at the home of Hryhorii Moroz in Sumy province failed to find any food and demanded to know: “With the help of what do you live?” With each passing day, demands became angrier, the language ruder: Why haven’t you disappeared yet? Why haven’t you dropped dead yet? Why are you alive at all? 

The effects this had on the population were predictable.

One railway employee, Oleksandr Honcharenko, remembered “walking along the railroad tracks every morning on the way to work, I would come upon two or three corpses daily, but I would step over them and continue walking. The famine had robbed me of my conscience, human soul and feelings. Stepping over corpses I felt absolutely nothing, as if I were stepping over logs.” Petro Mostovyi remembered the beggars who came to his village seemed “like ghosts,” sat down beside roads or under fences—and died.

***

Often, in politics, “the simple solution to our problem” is to get rid of a scapegoat.

It’s much easier to blame an outsider for our problems than it is to actually solve a problem. Immigrants. Tech magnates. Conservatives. Liberals. Capitalists. Socialists.

In a pluralistic society, it’s almost never true that one sector of society is responsible for all of our problems.[6]

So when people pretend that one is, it isn’t just dangerous rhetoric; it’s language that has the potential to destroy the entire system.

***

Another factor that weighs in favor of incrementalism, regardless of whether you like the idea, is the critical role of habits and inertia in human activity.

As philosopher John Dewey wrote nearly 100 years ago, “[H]abits of thought outlive modifications in habits of overt action.”

Dewey was skeptical of the rapid, sweeping transformations of the “short-cut revolutionist” who did not appreciate the power of habit:

Any one with knowledge of the stability and force of habit will hesitate to propose or prophesy rapid and sweeping social changes. A social revolution may effect abrupt and deep alterations in external customs, in legal and political institutions. But the habits that are behind these institutions … are not so easily modified.

This is why attempts to impose Western-style institutions on countries that have no history of such institutions invariably fail. Our institutions have evolved over hundreds, if not thousands, of years: from the Magna Carta to the English Bill of Rights of 1689 to the US Constitution of 1789, followed by 230 years of judicial precedent. We have habits and inertia behind these institutions. You can’t just drop that history onto a country that has no experience of it and expect it to replicate what we have. A place like Afghanistan or Iraq cannot just wake up from a history of autocracy and transition to representative democracy overnight. Whatever systems of law and justice existed before, however brutal they might seem to us, will persist. You can’t just get rid of that history and expect things to work.

Stalin may have wanted to implement collectivism in the Ukraine and elsewhere in the Soviet Union, but farmers were unable to overcome the deeply held belief that workers should reap the benefits of their own hard work. When farmers were required to produce grain while their own families were left without food, they simply stopped working. That led to less grain being produced, and a deeper famine. Instead of working on collective farms, peasants moved to cities or took to the countryside scrounging for food.

You can’t just undo the way people have always done things in a single action and expect it to go well.

***

I think a major issue today is that many people lack real perspective on what collapse, injustice, and tragedy really are—on how bad institutions can be.

The suffering of the Holodomor, the Ukrainian famine, was not just preventable: Stalin and the Soviet Union made conscious decisions that killed millions of people, with full knowledge that those decisions would do so. The Soviets actively engaged in systematic practices of theft, arrest, murder, and torture to ensure that poor peasants starved to death. The famine was caused by a government that thought it could throw out the way things had always been done and make things better. When things didn’t get better, the government doubled down on its bad decisions and tried to force-feed radical change onto people who weren’t ready for it. As a result, millions starved to death.

Our system of government is imperfect. But by historical standards, our institutions are excellent. We should not forget that fact. We have more privilege, luxury, and good fortune than any other society in human history. Throwing it all away, simply because it isn’t perfect, is as bad an idea as could be proposed.

***

Incremental improvement isn’t sexy. It doesn’t do well on a meme or at a protest march. It probably won’t get you clicks. It’s technocratic and tedious.[7]

Incrementalism is not about inventing the wheel. It’s about building a better wheel. And not convincing ourselves that to build a better wheel, we first have to destroy all existing wheels.

[1] If you meet someone who tells you that they have a great investment system, and they do something other than investing for a living, you should be wary of that person. On the other hand, if you know someone who has made a living off of markets or betting (other than just the fees associated with investing other people’s money) for a long period of time, that person might be worth listening to.

[2] For a more academic articulation of a similar thesis, I recommend Charles Lindblom’s The Science of Muddling Through.

[3] Turns out, taking money from unhappy, addicted people who are making bad decisions right in front of you is not a very satisfying way of earning a living.

[4] While the current incarnation of the “life hack” may be new, the idea of selling simple solutions to complex and hard problems is almost certainly nothing new. People were literally selling snake oil 100 years ago and I’m sure they’ve been selling miracle cures to hard problems for at least a few orders of magnitude longer than that.

[5] This philosophy is consistent with the game theoretic conclusions in Darwin’s Unfinished Symphony, the primary thesis of which is that strategic copying outperforms innovation.

[6] Although in an autocratic system, sometimes it’s easy enough to pinpoint the source of the problem: the autocrats.

[7] For a smart read on how good governance ought to be done, a great place to start is The Cost-Benefit Revolution, by Cass Sunstein, who worked in both the Reagan and Obama administrations.

The Forgotten Art of Spacing Out

If there’s one thing I’m really good at, it’s spacing out.

Not looking at a screen. Not getting anything done. Not listening to a self-improvement podcast or reading a book about what to do next.

When I wake up, after I get back from a run, when I’m not sure what to do, I just stare into space.

It’s taken some work to get back into the habit, but now that I’m doing it again, I think I like it.

I’ve been doing it most of my life. I was great at it even when I was young. When I was a baby, it was pretty much all I did. In middle school and high school, I was always looking for an excuse to sneak away and space out. On lunch breaks and during independent studies, I’d go to the library, find a quiet space, and just stare at the wall. In college, too.

In my 20s, though, I was diagnosed with ADHD. They gave me some pills that sped up my heart and my brain. This had the effect of making me much better at staring at a computer all day. But I also think it had the unwanted side effect of making me less likely to stare at a wall all day. At the time, though, I think I was too busy to notice.

By focusing on work more, some things got better for me. I got a prestigious job, and I made a lot of money. From an outside point of view, I was doing well. But I didn’t like it that much. Everything was such a big damned rush. When I was working, when I wasn’t working. It didn’t feel natural to me.

In 2006, I got my first BlackBerry. In 2008, I got my first iPhone. And then I got all these apps that fed me a constant stream of new information. In 2009 and 2010, I probably checked Twitter an average of 50 times a day.

Looking back at it, for all the information I was getting fed, I can’t remember much of what I learned. I’m sure I was getting something out of it.

But in retrospect, it’s all a bit of fog. A fog of information.

In 2010, I went off my meds. My heart and mind slowed down. In 2012, I quit my job. In 2013, I started meditating regularly.

Around that time, I think I recovered the lost art of spacing out. Maybe it’s because I didn’t have so much to do anymore. Maybe it’s because I was less focused on work. Maybe it’s because I wasn’t on drugs that made me “focus.”

Nowadays, I keep my phone in airplane mode 95% of the time. Nothing in. Nothing out. My number is 720-635-5503. Feel free to text or call. Unless you’re my mom or my wife, I won’t answer.

Having my phone and other devices in forced hermit settings gives me the time and the space to do what I do best: space out (or, when circumstances demand, focus).

I have read that there are some benefits to spacing out. That those who don’t check their phones obsessively have better attention spans and less anxiety.

Maybe that’s true. Maybe it’s not. But to me, spacing out is not a means to an end. Absent other confounding variables, spacing out will not make you rich; it will not give you six-pack abs. Spacing out, to use Kantian terminology, is an end-in-itself.

That’s another way of saying that spacing out is inherently valuable. It’s taking a moment in time and declaring, consciously or unconsciously, that nothing else matters other than this moment in time. That maybe we don’t need to be mainlining more extraneous information into our brains all the waking hours of the day.

Spacing out is when we allow ourselves to appreciate the wonder of the very fact that we are alive. After all, we won’t be forever.

What could be more important than that?

Odysseus and Intentionally Inconvenient Design

You perhaps know the famous story of Odysseus and the Sirens.

In Greek mythology, the Sirens were famous for their gorgeous singing. So very beautiful was their singing that any sailor who heard their songs would stop sailing to listen. With no one steering the ship, the ship would crash, and everyone who heard the Sirens’ songs would die.

In the great Greek epic the Odyssey, the hero Odysseus avoided this trap by telling his men to block their ears with wax. And then he ordered his sailors to tie him to the mast so he could hear the Sirens’ songs without doing anything stupid like intentionally crashing the ship so that he could hear more songs.

Lo and behold, it worked. By planning ahead of time to avoid temptation, Odysseus managed to experience what was most beautiful in his voyage home without killing himself.

Given that this was written about 2,500 years ago, it’s clear that the human struggle between what we know will tempt us and what we know is best for us is nothing new.

But I’m of the opinion that the struggle is harder now than it has ever been. That capitalism is an evolutionary process that continues to push the boundaries of what tempts us, and continues to make those temptations harder to resist. Every new day, we are faced with the most potent new forms of temptation known in human history.

This makes Odysseus-like skills at avoiding temptation more important than ever.

***

Last year, I was listening to one of my favorite podcasts, 99% Invisible, and I learned about a concept called “unpleasant design.”

According to the show:

Benches in parks, train stations, bus shelters and other public places are meant to offer seating, but only for a limited duration. Many elements of such seats are subtly or overtly restrictive. Arm rests, for instance, indeed provide spaces to rest arms, but they also prevent people from lying down or sitting in anything but a prescribed position. This type of design strategy is sometimes classified as “hostile architecture,” or simply: “unpleasant design.”

Unpleasant design is structure intentionally designed to make you uncomfortable. What a brilliant idea! Using discomfort by design to deter certain behavior.

***

Given the challenges I have in keeping my attention focused when I want to focus, I have often thought about the idea of tying myself to the mast in designing my own workspaces. To do work, you have to go online. But when you’re online, the entire information age, with all of its increasingly well-cultivated attention traps, is there to tempt you. It’s not easy for me to get done what I need to get done without hearing the Sirens’ song of web temptation and crashing into a rabbit hole of internet distraction.

So I’m constantly working on better ways to filter the internet I need to use from the internet that is taking me away from what I need to do.

What I’ve been working on is a concept I call “intentionally inconvenient design.” It’s stolen from the idea of unpleasant design, and the basic premise is this: I need to be able to search for things, but the internet now proliferates with sites and attractions that make it hard to do just the bare minimum online, which means I tend to stay online far longer than I need to.

My solution is to make it difficult to go online, particularly to sites that are not designed for productivity.

This is still a work in progress. But here are a few things I have done to improve my own work environment:

  • I use a service called “stay focused” that gives me a nuclear option to block all sites except certain allowed sites while I am online (a rough homemade sketch of the same idea appears after this list).
  • I’ve created impossible-to-remember 25-30 character passwords for Twitter and Facebook. I then printed them out and put them in my sock drawer, and I don’t keep them anywhere on my computer or other devices. That way, if I really want to access social media, I can. But it’s not easy. I have to really want to do it or it’s not going to happen.
  • I created a slightly less challenging password that I must manually input into my computer every time I want to use an internet browser.
  • I have a sit-stand desk with a ball. I alternate between sitting on a ball that’s tiring to sit on for too long and standing, which is also tiring to do for too long. That way, my work time is more concentrated and intentional.
  • I try to never use the internet while sitting comfortably.
  • I store my iPad in the least convenient place I can, in a closet in the least-used part of my house.
  • I have a 4-year-old iPhone with cracked glass and an internet browser that does not work well. The phone will not be replaced until the “phone” feature stops working.
  • I put all desserts in a crawl space only accessible by a rickety ladder and in a 24-inch stainless steel safe that is protected by padlock.[1]
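
If you’re curious how simple that kind of blocking can be, here’s a rough homemade sketch (my own hypothetical version, not the actual service I use): it points distracting domains at your own machine via the hosts file, so the browser can’t reach them. It needs admin/root privileges, and the path shown is for macOS/Linux.

```python
HOSTS_PATH = "/etc/hosts"  # on Windows: r"C:\Windows\System32\drivers\etc\hosts"
MARKER = "# intentionally-inconvenient-design"
BLOCKED = ["twitter.com", "www.twitter.com", "facebook.com", "www.facebook.com"]

def block():
    # Append one redirect line per distracting domain, tagged so it can be undone.
    with open(HOSTS_PATH, "a") as hosts:
        for domain in BLOCKED:
            hosts.write(f"127.0.0.1 {domain} {MARKER}\n")

def unblock():
    # Remove only the lines this script added.
    with open(HOSTS_PATH) as hosts:
        kept = [line for line in hosts if MARKER not in line]
    with open(HOSTS_PATH, "w") as hosts:
        hosts.writelines(kept)

if __name__ == "__main__":
    block()  # make distraction inconvenient; run unblock() to relent
```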

I appreciate that all of these mechanisms are kind of ridiculous. But as silly as they all seem, I think they help. It’s much easier to go online and let my attention wander than to do the things I know will make my life better. And, without prior restraint—true to Zipf’s Principle of Least Effort—that’s what I do. Unless I tie myself to many different metaphorical masts throughout my day, I waste much of it.

The whole world of information and entertainment is right there, at our fingertips, every minute of every day. That’s gotta be every bit as tempting as the Sirens’ song, right?

[1] This one isn’t true.

First Ascents, Google Glass, and Thalidomide

Recently I was climbing a remote mountain near my home.

The route I took up this particular remote mountain that day was ill-advised. It didn’t look all that bad from the bottom; it just looked like the most direct route up the mountain. But after I ascended the first ridge, it became clear that I had put myself in a bad place. I’m a veteran mountain runner, with nearly 30 years of mountain running experience. But on this day, the route I chose was a serious mistake.

It’s a safe bet that every mountain in Colorado has been climbed many times. There’s no such thing as a first ascent on Colorado peaks anymore. But on this day there were a couple of points where I wondered to myself whether any human being had ever been in precisely the position I was in at that moment.

If you find yourself in a place like this in 2018, chances are you’re in a very remote, exciting place. And your life is probably in danger. Because today, with 7 billion-plus people on the blue orb, if no human has been where you are, there’s probably a reason for it.

***

I remember hearing an adage about plane crashes: they don’t happen because of just one mistake. There are fail-safes in place to ensure that a single mistake does not cause a plane crash. Crashes happen when pilots and their co-pilots make a series of novel and interesting mistakes that no other pilots have ever made before, in a sequence that the safety mechanisms do not anticipate and therefore cannot prevent.

I think the same is often true of other accidents, including mountain deaths. Mountain deaths don’t happen when you make just one mistake. It’s usually when you make a series of mistakes. You climb a mountain that is beyond your experience level. And then you get off-route, and then you decide to take a “short cut” down. And then you find yourself in a series of places where the only way out is on a cliff with unstable talus or loose scree.

I try to look at each decision on the mountain as unique. I try not to let a prior bad decision influence my decision-making about what to do next. But on this particular day, I just kept making bad decision after bad decision. I was path breaking in a way that I’m not normally accustomed to. I was path breaking in a way that I prefer not to.

I like to go to remote places in the mountains, but I don’t like to take huge risks when I do. I like to explore, but not to the extreme where there is anything more than a vanishingly small risk of serious danger.

***

Venturing off the beaten path gets good PR. Perhaps undeservedly good PR.

I think this is because of survivorship bias. People tell heroic tales of those who take great risks and are rewarded. It doesn’t always work out that way.

A few years ago, I got a pair of Google Glass. For a brief shining moment, this was the cool new thing in tech. People would stop me on the street and ask to use it. I thought I was being novel and ahead of the game. I was new to the community of startups and technology, and I wanted to signal that I was adventurous and “in the know” when it came to tech.

I should have bought Bitcoin instead.

On a dime, the world turned on Google Glass and decided that this was no longer the coolest thing. It became apparent that the device was buggy and wasteful, a grandiose symbol of the worst forms of perpetual tech distraction. People who wore them became known as “Glassholes.”

My fancy (and expensive) new toy quickly came to signify the opposite of what I had wanted to signal.

I wanted to signal that I was not afraid to take a risk. I took a gamble on being the first to adopt a new technology, and I ended up wasting my money on a form of technology that wasn’t helpful or useful to me at all. Plus, it made me look like a fool.

***

And then there is the lesson of Thalidomide.

According to a 2009 article by Bara Fintel, Athena T. Samaras, and Edson Carias:

Thalidomide first entered the German market in 1957 as an over-the-counter remedy, based on the maker’s safety claims. They advertised their product as “completely safe” for everyone, including mother and child, “even during pregnancy,” as its developers “could not find a dose high enough to kill a rat.” By 1960, thalidomide was marketed in 46 countries, with sales nearly matching those of aspirin.

Around this time, Australian obstetrician Dr. William McBride discovered that the drug also alleviated morning sickness. He started recommending this off-label use of the drug to his pregnant patients, setting a worldwide trend….

However, this practice can also lead to a more prevalent occurrence of unanticipated, and often serious, adverse drug reactions. In 1961, McBride began to associate this so-called harmless compound with severe birth defects in the babies he delivered. The drug interfered with the babies’ normal development, causing many of them to be born with phocomelia, resulting in shortened, absent, or flipper-like limbs. A German newspaper soon reported 161 babies were adversely affected by thalidomide, leading the makers of the drug—who had ignored reports of the birth defects associated with it—to finally stop distribution within Germany. Other countries followed suit and, by March of 1962, the drug was banned in most countries where it was previously sold.

Obviously, this is horrible and a parent’s worst nightmare. Well-intentioned parents took medication they were told was safe and ended up giving their children life-altering and often fatal birth defects.

***

It’s perhaps callous to compare these three distinct phenomena, but they all—with varying degrees of seriousness—touch on the question of “when is it safe and wise to try something new?”

There is no universal rule for how to deal with novel risks, of course. You could be the first person to try a new computer game and the risk of serious harm would be next to nil, and you could be the millionth person to try base-jumping and you could still get splattered against a mountainside.

But while there is no universal rule, there are some game-theoretic insights that may apply. While on the mountain and then afterward, I was reminded of the fantastic book by Kevin Laland, Darwin’s Unfinished Symphony: How Culture Made the Human Mind.

According to the book, for all the praise many people want to give to those who innovate, it is those who strategically observe and exploit who fare the best.

Being the first to go somewhere or adopt a new technology is highly overrated. As a general rule, the author says, watch where others go, see which strategies seem to work, and then copy based on what you observe. That’s the meta-strategic path.

Standing on the side of a cliff, that thought resonated with me. It’s fun to explore novel and interesting places. But in the moment when you are truly exploring and trailblazing where no one has gone before, you appreciate more than ever why some prior strategic reconnaissance would have been helpful.

Sitting and Moving

Sitting and moving. That’s what I spend the majority of my free time doing.

Stated another way, I spend a lot of time meditating and running. Between those two activities, I probably average between three and four hours a day. That may seem like a lot of time to spend on activities with no obvious practical purpose. But to me, they are the most fundamental things I do every day.

It’s hard for me at this point to imagine doing anything different with my life. But almost every other person on this planet chooses to spend their time differently, which means that my choices here are unique. So I figured I’d write a brief homage to why I enjoy these two activities so much.

The moving is beneficial because it is exercise. Evolutionary history tells me that we humans are animals first and cognitive creatures second. If we get caught up too much in our cognition, the machinery stalls out. By moving every day, I bring myself back to the most basic aspects of my physical being—navigating my natural environment. That navigating helps me remind myself every day of the way that I physically inhabit the space around me. It is powerful to move. Without that movement, the other aspects of my day feel less real.

The sitting is beneficial because it stops me from getting too caught up in the things I have going on in my life. The tasks of a professional can seem very important in the moment you are doing them, but on a cosmic scale, it is hard to think that they are. Sitting is a daily reminder of the fragility of our temporary forms of existence. It is a reminder to let go of small things and to remember one’s place in the broader scheme of the universe. That sounds like heady stuff, but in the moment of sitting, there is truly nothing to do. Sitting is not an addition process; it is a subtraction process. Just sit and let go of everything else.

On the surface, one might think that running would be the harder of the two—much harder than sitting. But I certainly find the sitting to be much more challenging than the moving. For me, unless I’m pushing myself to the extremes of my fitness level, running for an hour or even two hours isn’t that much of a challenge. On the other hand, sitting still for an hour is incredibly difficult—and I almost always fail to sit completely still for the entire time that I set out to sit.

So in a sense, I’m a better mover than a sitter. Or perhaps sitting still is inherently more difficult than moving. Either way, it’s the sitting that is the harder of the two for me.

Also, the more I meditate, the more I find commonality in what previously seemed like disparate experiences. I practice a few different kinds of meditation, some of which involve deep attempts at concentration and some that are not as active, but rather attempts to let go of the intention to control one’s attention. Initially, the latter experience didn’t feel much like meditation at all, but rather just an out-of-control wandering. But the more I meditate, the less different the focused meditation and the non-intentional meditation seem to be. Often, after twenty or thirty minutes, there is no difference, or there does not seem to be.

And so too with sitting and moving. The more I meditate and run, the more I find that returning to the breath can be as useful in running as it is in sitting. And after an extended time running, I sometimes notice similar sensations to those I feel when sitting—the challenge of staying focused in the moment, an internal voice suggesting to do something else, even a form of mini-nausea arising from the intensity of the experience.

The more that I sit and move, the more the experiences start to seem alike. Perhaps there might be a lesson in there about the universality of all experience, but I doubt I’m qualified to render such an opinion. What I can attest to is the power of the intentional sitting and moving. It’s a power that feels as deep as anything I have encountered.

Nudging to Non-Distraction

Cass Sunstein has been one of the most influential legal scholars of the 21st century. One of the most influential economists of the 21st century is recent Nobel Prize winner Richard Thaler. In 2008, the two got together to write Nudge, a book about methods and techniques for influencing people’s behavior—without legal coercion—toward better decisions about health, wealth, and happiness.

There are many examples of how simple nudges can lead to better behavior and better life outcomes. For example, it’s possible to influence whether students eat healthy food in school cafeterias by rearranging the placement of the food so that the healthy options are more prominent and the unhealthy options less prominent. It’s also possible to get workers to save more for retirement by making saving a default choice rather than a choice that requires them to opt in. Nudges take a variety of forms, from subtle and seemingly invisible to transparent and educative.

Nudges have been shown to be effective in improving outcomes in the contexts of health, savings, highway safety, employment, discrimination, the environment, and consumer protection (Sunstein 2013, Halpern 2015).

One Finnish academic paper notwithstanding, there has been comparatively little attention paid to the ability to use nudges to limit the amount of distraction in our lives. I think this is a shame.

I believe the following statements to be true:

  • There is more information now than ever before.
  • There are more sources of distraction now than ever before.
  • Those sources of distraction are getting more effective at distracting us.
  • Perhaps more than ever before, people are distracted and unable to focus in a way that prevents them from having healthy relationships and reaching their personal and professional potential.
  • Nudges are a relatively popular and effective way to influence social behavior.
  • Nudges could be used to reduce mental clutter and distractions in our lives.
  • It is worth exploring ways to do this effectively.

There is nothing magical about nudges. Nudges won’t solve these problems immediately or permanently. But carefully crafted nudges could help us design less distracted communities and improve our well-being.

There is ample literature about nudges and their effectiveness and benefits; I think it’s time to start seeing if we can apply this non-coercive policy tool to help make us less distracted.

This is a topic I expect to read and write about much more in the upcoming months and years.

The Feck Off Gates

“Around here,” my uncle said, accentuating with a pause, in his thick, West Cork accent, “those are known as the ‘Feck Off Gates.'”

The Irish do have a way with words.

It was the day after a cousin’s wedding in Ireland, and I was at my uncle’s house (the father of the groom), watching a hurling match between Cork and Limerick. This was the All-Ireland semifinal, and since all of my father’s side of the family is from County Cork, all eyes were on the match. And since this was the groom’s family’s house and it was near the hotel where the reception was held the night before, most of those eyes were in this one particular room.

Imagine a dozen or so Irish men, and a few Irish women (plus me and my wife), crammed into a smallish room, sitting on couches, all exhausted from the prior night, screaming at young men playing a sport with sticks and a ball.

Hurling is an Irish sport. And the sport is governed by an organization called the GAA, or the Gaelic Athletic Association. It’s hard to describe in a few sentences what the GAA means to Irish communities. But it’s possible that outside the church, the GAA is the most influential organization in many Irish towns and small cities. And in some towns and cities, it might be even more important than the church.

The GAA is about the sports, but, like most social endeavors, it’s mostly an excuse for the community to get together. To talk about what’s happening in the community.

At half time, after the habitual and expected armchair punditry about the match, the topic of conversation, as often happens with Irish conversations, switched to local news and gossip. My father, his brothers, and other relations bantered back and forth about who recently had been married, who had died, who was having children, and who had moved where.

In this community, it’s an understatement to say that everyone knows everyone.

If I go back to visit family in the small West Cork town of Kilbrittain, where my father is from, by the end of the third day seemingly everyone in town will know that there’s an American in town, that I’m Mick the Manager’s grandson, that my wife, who may be of Mexican descent, is with me, and that we’re staying in town for a week. They probably even know whether they can expect to see me at Mass on Sunday.

So when a new house is built, or someone new moves to town, it’s news.

With that in mind, my father asked my uncle, “Who owns that huge new house by the strand in [name intentionally omitted]?”

“An American,” my uncle replied.

“Really?” my Dad said. “I didn’t know there were any Americans living around here.”

Upon hearing about the new house, a few other relations commented on the size of the house, and perhaps more newsworthy, the size of the gates in front of the house.

And this was when my uncle dropped the line about the “Feck Off Gates.”

If you’ve never been to Ireland or been close to an Irish person, “feck” is a slightly more polite way of saying fuck. And the fact that this was coming from this particular uncle was a bit of a shock. While most of my family has no compunction about swearing or cursing, this particular uncle is known as the saint of the family. He’s never had a drink in his life (an impressive accomplishment for an Irish man), and he almost never says a bad word about anyone. And he doesn’t really swear.

But clearly this particular American, and that particular house, had made an impression.

I was thinking to myself, the guy who bought the house is probably an Irish-American. He’s probably achingly proud of his Irish roots. At some point, he visited this area and fell in love with it. Maybe his grandparents or great-great grandparents were from the area. I imagine he wanted to feel more connected to it, and since he had done well in life, he probably thought to himself, “you know what, how great would it be if I had a second (or third or fourth or fifth) home around here? I could wake up next to the ocean, go for a walk along the same beach my grandpa (or great-great-great grandpa) used to walk on, go into the local pub. It’ll be great.”

So he bought the land, spent millions of dollars on a gorgeous house with magnificent gates, and he built it on one of the best parcels of land in West Cork. This was the status symbol that was supposed to connect him back to his Irish roots—to show everyone that he’d come back, and that he’d made it in life.

But as with most status symbols, it’s a hard thing to get right. That’s what the whole humblebrag phenomenon is all about. Everyone wants status and respect, but we have to be subtle about how we go about getting it.

Go about it the wrong way, and you’re the guy with the “Feck Off Gates.”

Completeness of Experience

So much of the focus in western culture is on the novelty, intensity, and variety of our experiences.

Where have you traveled? What restaurants have you been to? Have you ever done a full Ironman? Have you ever had this grapefruit-infused IPA? What about the truffle-oil-infused pork belly? Have you gone zip-lining in Costa Rica? How about bike-packing in Colombia?

Have you done a 100-mile race? How about a 200-mile race?

And of course when the goal or focus is on having the greatest variety of intense experiences, there is always a worry. Did I pick the best dish at this restaurant? Is this job the best job I could possibly have at this moment? Am I doing enough?

I think that’s the source of the ubiquitous “fear of missing out.” The fear that there is a better experience somewhere than the one we are currently having. And this creates a pervasive sense that what we are doing now is somehow not quite good enough. And then the feeling that we are not quite good enough generates a desire to perpetually optimize for better experiences.

But the mind that is perpetually optimizing is a mind that is never at rest. The perpetually optimizing mind may struggle to appreciate the present experience, because it is always searching for a better one.

In my 20s and 30s, I was always worried about how to acquire a variety of novel and intense experiences. Now I try to focus more on the completeness of any given experience.

This is a perpetual challenge.

If you are focused on the completeness of your experience, rather than its novelty, intensity, or variety, it doesn’t matter if you are taking out the trash or washing the dishes. It can still be a source of pleasure and peace. But if you are not focused on the completeness of your experience, you could be sipping on a cocktail on a tropical beach and still be deeply dissatisfied or outright miserable (“That private beach over there looks nicer”; “I should have gone to Aruba instead of Jamaica”; “The view here is blocked by those trees”; “This humidity is oppressive”; “I should have ordered that Pina Colada instead of this Bahama Mama”).

This isn’t to say there is anything wrong with good food, travel, or intense experiences. It’s just to say that without a sense of appreciation, and the feeling that the present moment is good enough, all of our experiences may seem incomplete. But if we do our best to appreciate the present moment, it brings a sense of completeness and fulfillment to whatever we do, no matter how mundane or ordinary.

Defaulting to Distraction

Recently, during a minor surgery for a loved one, I was waiting in the hospital lobby. The main waiting area adjacent to the operating rooms was a tight space with about a dozen chairs. All of the chairs faced an extremely large television. The television was turned on, and the volume was up very high.

I looked around and thought, “Does everyone else around me really want to be watching The Golden Girls at full blast at 7:35 in the morning?”

I suspect the answer was no. No one seemed to be enjoying it (say what you will about how hip Betty White is these days, the humor did not age well). Eventually I, and almost everyone else waiting in the lobby, sought out quieter areas of the hospital.

It seemed so obviously ridiculous. But it was still happening, and I suspect it probably happens the other 364 days of the year, too. And the same exercise is repeated at hotels, airports, restaurants, laundromats, and other public spaces around the world.

In so much of the public sphere, the default setting is one of intense distraction. And as was the case this morning, sometimes the distraction is loud, in your face, and almost impossible to avoid.

Many of the smartest policy influencers like to think about Nudges and choice architecture, and how that architecture can impact whether we choose to smoke, whether we invest in our retirement, or whether we donate our organs. I’m a fan of this conscious and thoughtful choice architecture as a low-impact, soft, and non-coercive way to influence people’s decision-making.

I know there are some people who have thought about such things, but perhaps this could be a greater point of emphasis. I’m sure the hospital administrators felt they were providing a service by playing the TV all hours of the day and night. I mean, they spent money on the damned TVs, and I’m sure some people like having the noise to distract them. Maybe for some, even the lowest quality, over-the-top distraction is better than being alone with their thoughts, particularly when we know a loved one may be in distress or danger.

But perhaps we can do better for our default social setting than to force-feed 30-year-old sitcoms on everyone around us. Those anxious times in waiting rooms might also be the moments when we could have the most important, tender, and meaningful conversations of our lives. When we could tell the friend or brother that we often take for granted that we love them.

But it’s hard to do that when you have a laugh track cackling at full volume eight feet from your head.

When I was a kid, people smoked on planes. When I was in college, you couldn’t go out to most restaurants or bars without coming home reeking of cigarette smoke. Over time, we decided that second-hand smoke was gross and that the default setting should be a smoke-free environment.

No one decided smokers shouldn’t be allowed to smoke. But we decided we didn’t want to make non-smokers breathe that dirty air everywhere they go. I’m not suggesting we ban The Golden Girls completely (well, maybe). But it’s not strictly necessary to force everyone to watch TV in shared public spaces as a default social norm, either.

Maybe, over time, we could adjust our default settings of distraction. We could decide that the default setting for public spaces could be one with no distractions, and let the TV-watchers adjourn to a TV lounge in some remote part of the building.

Here’s to hoping.

Reflections on the Passing of Anthony Bourdain

I don’t feel envious of many people, but I was definitely a little envious of Anthony Bourdain.

I remember thinking every time I watched one of his shows, “now that guy has the greatest job on the planet.” Traveling around the world to its most beautiful and exotic places, connecting with its best chefs, eating its best food. Meeting many of its most interesting people. And then crafting fantastic and creative narratives around those experiences. His shows were brilliant.

Who wouldn’t have traded places with Anthony Bourdain?

All the epicurean delights this planet has to offer. Status. Success. Autonomy, Mastery, and Purpose. Eudaimonia. PERMA. By what seemed like every external measure of what a person could want, he had it.

Turns out, it wasn’t enough.

I cannot pretend to know his thoughts or his inner demons. But I think it’s safe to say that, at least for him, having one of the best jobs, and one of the most interesting lives, of any person in the history of this planet wasn’t enough. And if that life wasn’t enough, what could be?

The answer, sadly, is almost certainly nothing.

There is no thing that will ever satisfy you.

Nothing will ever be enough. There is no thing such that, once you get it, life’s problems will go away. We just weren’t designed to be happy. We were designed to feel perpetual dissatisfaction and to think we need and want more. When we see the Instagram photo, Facebook post, or celebrity snapshot of what seems to be a better or more glamorous life than our own, we’ll never know what deep suffering or sadness may be lurking beneath it.

So if I take away one thing from what I believe to be a great man’s passing, it is this: Try not to worry about what you think you want. Let go of the pursuit of the things that are supposed to be the things that will make you happy. They will never be enough.

Who knows if that sentiment could have saved Anthony Bourdain? Probably not. He was well traveled enough that he had likely heard something like it before. We will never know if anything could have saved him.

But what I take from his passing is that all the wonderful things this life has to offer will not be enough to save you, if you are not already at peace with this life.