It’s hard to pin down exactly how many revolutions have happened in Afghanistan since Zahir Shah, the last king of Afghanistan, was deposed in a bloodless coup in 1973. There was the Saur Revolution of April 1978, another mini-revolution in October of that year in response to a moderation of marriage laws, the Parchami coup of 1979, the assassination of President Taraki in 1979, the Soviet assassination of President Amin in December of that same year, a ten-year war with the Soviet Union, the rise of the mujahideen, various failed puppet regimes led by a collapsing Soviet Union, complete chaos and lawlessness between 1992 and 1996, when the country was ruled by various warlords, followed by the takeover of the Taliban in 1996. At that point, the story becomes more familiar to most westerners, but, as the events of 9/11 and the subsequent 17-year war (and counting) in Afghanistan have aptly demonstrated, the situation has not improved.

Afghanistan is good at fomenting revolutions, bad at stable governance.


My spine tingles whenever I hear someone arguing for a “need for radical change” or a “revolution in X.”

Most of the time, attempted revolutions fail, and things end badly for the revolutionaries. And when revolutions succeed, at least in the short term, most people are far worse off than before the revolution started.


Democracy, peace, and stability are historical anomalies. If you turn on your faucet and clean, potable water magically appears, if you take your trash to a location in front of or near your home and people reliably pick up that trash and take it somewhere safe, if you never (or almost never) have to look at or smell your sewage again after you flush your toilet, then your life is good. If all of the above circumstances apply to you, then your life is top-notch, by historical standards.

You won the lottery.

My philosophy is that people who have won this historical lottery should be cautious about insisting on the need for radical change, lest they get what they ask for. Radical change usually leads to more radical change, and that change tends to end badly, or worse still, never to end.


Of course, sometimes revolutions work. With the possible exception of Japan, nearly every nation is where it is today because some revolution took place.

But nearly all revolutions were followed by a period of instability or complete chaos. After the American Revolution, the government under the Articles of Confederation would have collapsed were it not for the many compromises (many of which now seem horribly fraught) embodied in the US Constitution. And the US Constitution set up a government that wasn’t all that different from the constitutional monarchy that the United States had previously taken great pains to escape.

The French Revolution was followed by the Reign of Terror, and then the reign of Napoleon, which led to the death of hundreds of thousands of French (and many more English, Prussian, Russian, and Egyptian soldiers). The Irish Revolution was followed by a civil war. The Russian Revolution was followed by a 70-year murderous, autocratic state, which included the purges of an estimated 30 million people and the brutal forced starvation of many millions more in the Ukraine. The Cultural Revolution in China might have been the worst of them all.


This is not to suggest that the opposite of revolutionary change—stagnation or unabashed conservatism—is a preferred approach, either. Individuals, systems, and organizations can always get better. And they should try to get better whenever they can.

Since life is competitive and most systems evolve over time, refusing to improve as a matter of principle will lead to obsolescence. If you don’t get better and grow, those around you will, and you will fall behind. Casey Neistat has a nice video illustration of his philosophy on this.

And the more extreme version of this philosophy, the one espoused by reactionaries, who try not only to stop growth but to go backwards in time, is even harder to fathom.

Ours is the greatest period of prosperity in human history. The average life expectancy today in every country is better than it was in any country in 1800. Women couldn’t vote in the United States until 100 years ago. Blacks couldn’t vote here reliably until about 50 years ago. Until a few weeks ago, no openly gay person had held the office of governor in any state in the United States.

Why would we want to undo any of that?


For a brief while, I considered becoming a professional poker player. After six years of practicing law, this seemed like a good way to make a living. I was decent, but not great. I made modest amounts of money regularly. But what I discovered after playing a lot of poker was that playing poker well is incredibly boring, and, as hard as it may be to believe, perhaps even more boring than the practice of law.

If you watch poker on TV, there’s a lot of emphasis on the handful of times when a player goes “all in,” betting all of their chips on a single hand. In reality, this happens infrequently, because good poker players only commit huge portions of their chips when they are extremely confident that they will win or when circumstances (such as tournament timing and increasing blinds) demand that they do so.

Absent such extraordinary circumstances, poker players who are concerned about their bottom lines play few hands and, when they do, make small but aggressive, well-timed bets.

Make big bets too often at the wrong times and you’ll go broke. Make no bets and you’ll go broke. Make small, smart bets often at the right times, and you’ll do well.

The goal of a professional poker player is to make more rational bets than the other players at the table, taken together. If they do that enough over time, they’ll make money. If they don’t, they won’t. Do this over the course of tens of thousands of bets and you’ll make hundreds of thousands, if not millions, of dollars.
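A toy sketch of the arithmetic behind this. The numbers here (bankroll, edge, bet count) are invented for illustration, not taken from the essay; the point is only that a tiny per-bet advantage, compounded over tens of thousands of bets, dwarfs the starting stake:

```python
# Hypothetical illustration of compounding a small per-bet edge.
# All figures below are made up for the example.

def compound(bankroll: float, edge_per_bet: float, n_bets: int) -> float:
    """Grow a bankroll by a fixed fractional edge on every bet."""
    for _ in range(n_bets):
        bankroll *= 1 + edge_per_bet
    return bankroll

start = 1_000.0
# A 0.05% edge per bet, applied 10,000 times, multiplies the
# bankroll roughly 148-fold (1.0005 ** 10000 ≈ e ** 5).
final = compound(start, edge_per_bet=0.0005, n_bets=10_000)
print(f"${start:,.0f} -> ${final:,.0f}")
```

The same loop run with a zero edge leaves the bankroll unchanged, which is the amateur’s best case before variance and bad calls eat into it.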


Incrementalism is a modest hypothesis with the following tenets:

  • For any existing system, the best way to improve the system is to make small changes to the system, assess their impact, and then re-evaluate.
  • The fact that a system exists at all is a good indication that it is better than an average or random system—by the simple virtue that it has survived, adapted, and evolved for some period of time.
  • The longer the system has existed, the more likely it is to be better than average or random.[1]
  • Radical change that forces the system to change more than it maintains is likely to have a net negative impact on the system.[2]


The moment that made me decide not to play poker as much anymore was a time when I was sitting at a table with a particularly bad poker player in Blackhawk, Colorado. The man was probably in his 70s and had a breathing tube.[3] He joined a table where half of the people knew each other, and were either on the payroll of the casino or were semi-professional players.

He kept making bad decision after bad decision, and he was not aware of what he was up against. Put simply, most of us were playing poker, and he was gambling. He had watched poker on TV and convinced himself that poker was about betting aggressively, regardless of what cards you had in your hand. He thought poker was mostly about bluffing with sunglasses, and that it didn’t matter if all signals pointed to the opponent having the better hand.

He had a relatively small stack of chips and a big wad of cash. He’d bet aggressively, lose it all, and then go get more chips. He did this at least a half dozen times.

For this man, every hand was all or nothing. He wanted to get rich on every single hand. If he didn’t, he cursed himself and his bad luck. The longer he did this, the farther away he got from the financial position where he started.

The pros are in the business of chipping away, making one incremental good decision after another until, over time, due to the magic of compounding growth, their stack grows exponentially. The amateur tries to win it all, without patience or diligence, and without putting in the work. And the amateur who does that almost always ends up going home broke.


To use a different analogy, a 22-year-old novice runner might get inspired around New Year’s Eve and decide that they want to run in the Olympics some day. That runner might read that to get to the Olympics, a marathon runner typically runs 120 miles a week with three hard workouts each week, and try to replicate that strategy. But without 7-10 years of running to build up to that workload, that strategy will invariably result in injury, illness, and breakdown. A novice runner attempting a quarter of that workload would likely implode. To anyone with any knowledge of running, this observation is obvious and un-insightful.

But for something much more involved or complex than becoming a good runner, such as creating a new system for healthcare distribution in this country, there are some reasonably intelligent people who think that a complete overhaul of the system could work out. That’s far less likely, in my opinion, than someone jumping off the couch and making an Olympic team.



There are lots of people out there selling the idea of the simple solution to all of your problems. It’s incredibly appealing to think that if we do just one simple thing, that if we just find one hack or trick, that all of the other things we do will get easier.

That’s why blog posts and YouTube videos with titles like “the ONE thing you need to do to get 6-pack abs” are so popular. We want to think that there’s just one thing we can do to fix our insecurities and make our abs look sexy, when in reality the “one thing” we need to do is to exercise obsessively, to eat healthier, and to have the right genes (and perhaps shave our chests and get spray tans).

This business of selling simple solutions is great for getting clicks, but it’s not as great for the people doing the clicking.

People crave simple solutions so much that they will flock to those who offer simple solutions to hard problems even when those solutions are obviously false.[4]

But the hard truth is that for individuals to get better, they have to get a little better over time. For institutions to get better, they must work to solve the hard problems that they have not yet solved. If they’ve been around for a while, the solutions to those hard problems will likely be hard, technocratic, complex, and dull.


Of course, there are moments in history when only a dramatic change will do. An easy example of this is human slavery in the United States before 1865. There was no step-by-step, incrementalist way to solve that horrific affront to human dignity. It had to be done in one fell swoop.

That’s what happened, starting with the most violent and deadly war in American history, which was then followed by a more than 100-year period of violence and struggle, the remnants of which are still around today.

Even when dramatic change needs to happen, history tells us it won’t come easy.


To me, it seems obvious that getting better as a person or as an organization is something that requires diligence, patience, and consistent effort. And, most importantly, it requires building on the lessons we have learned in the past.[5]

But I get the feeling that more people, on the right and the left, are calling for radical solutions to our problems, ones that discard the lessons we have learned from the past. I see people saying that our society is going to hell and that we need something other than incremental change to get better. Reasonably intelligent people are claiming that our society is in collapse. That we need to throw out everything and try something new.


Right now, I’m reading a book called Red Famine, by Anne Applebaum.

I’ve read a lot of depressing books about horrible things that have happened in history, but man, this one is truly brutal.

If you do not know about the Ukrainian famine of the 1930s (I was only vaguely familiar before reading the book), you could be forgiven. It’s not something that gets much press. Stalin and the Soviet Union pretended it didn’t happen for 50 years. And the Western press (and Walter Duranty of the New York Times, in particular) was complicit in covering it up.

The short version is that the Soviet Union’s policy of collective farming in the early 1930s was a disaster. And because it was a disaster, and a decision made by Stalin, it wasn’t possible to admit it was a disaster. Instead, someone had to be blamed. The Soviets blamed Ukrainians, the biggest farming community within the Soviet Union, for hoarding food and refusing to produce enough bread. And the decision was made to actively rob and steal any food possessed by these Ukrainian farmers and give it to the rest of the country (and those loyal to the party, in particular).

In the words of Applebaum:

The result was a catastrophe: At least 5 million people perished of hunger between 1931 and 1934 all across the Soviet Union. Among them were more than 3.9 million Ukrainians. In acknowledgement of its scale, the famine of 1932–3 was described in émigré publications at the time and later as the Holodomor, a term derived from the Ukrainian words for hunger—holod—and extermination—mor.

Examples of how this was enforced are truly horrific.

Our father hid three buckets of barley in the attic and our mother stealthily made porridge in the evening to keep us alive. Then somebody must have denounced us, they took everything and brutally beat our father for not giving up that barley during the searches…they held his fingers and slammed the door to break them, they swore at him and kicked him on the floor. It left us numb to see him beaten and sworn at like that, we were a proper family, always spoke quietly in our father’s presence…

A brigade searching through the roof thatch at the home of Hryhorii Moroz in Sumy province failed to find any food and demanded to know: “With the help of what do you live?” With each passing day, demands became angrier, the language ruder: Why haven’t you disappeared yet? Why haven’t you dropped dead yet? Why are you alive at all? 

The effects this had on the population were predictable.

One railway employee, Oleksandr Honcharenko, remembered “walking along the railroad tracks every morning on the way to work, I would come upon two or three corpses daily, but I would step over them and continue walking. The famine had robbed me of my conscience, human soul and feelings. Stepping over corpses I felt absolutely nothing, as if I were stepping over logs.” Petro Mostovyi remembered the beggars who came to his village seemed “like ghosts,” sat down beside roads or under fences—and died.


Often, in politics, “the simple solution to our problem” is to get rid of a scapegoat.

It’s much easier to blame an outsider for our problems than it is to actually solve a problem. Immigrants. Tech magnates. Conservatives. Liberals. Capitalists. Socialists.

In a pluralistic society, it’s almost never true that one sector of society is responsible for all of our problems.[6]

So when people try to pretend that they are, it isn’t just dangerous rhetoric; it’s language that has the potential to destroy the entire system.


Another factor that weighs in favor of incrementalism, regardless of whether you like the idea, is the critical role of habits and inertia in human activity.

As philosopher John Dewey wrote nearly 100 years ago, “[H]abits of thought outlive modifications in habits of overt action.”

Dewey was skeptical of the rapid, sweeping transformation sought by the “short-cut revolutionist” who did not appreciate the power of habit:

Any one with knowledge of the stability and force of habit will hesitate to propose or prophesy rapid and sweeping social changes. A social revolution may effect abrupt and deep alterations in external customs, in legal and political institutions. But the habits that are behind these institutions … are not so easily modified.

This is why attempts to impose Western-style or Western-like institutions in countries that have no history of such institutions invariably fail. Our institutions have evolved over hundreds, if not thousands, of years. From the Magna Carta to the English Bill of Rights of 1689 to the US Constitution of 1789, followed by 230 years of judicial precedent. We have habits and inertia behind these institutions. You can’t just drop that history onto a country that has no experience of it and expect them to replicate what we have. A place like Afghanistan or Iraq cannot just wake up from a history of autocracy and transition to representative democracy overnight. Whatever systems of law and justice existed before, however brutal they might seem to us, will persist. You can’t just get rid of that history and expect things to work.

Stalin may have wanted to implement collectivism in the Ukraine and elsewhere in the Soviet Union, but farmers in the Soviet Union were unable to overcome the deeply held belief that workers should reap the benefits of their own hard work. When farmers were asked to produce grain, and were asked to do so without any food being given to their own families, they simply stopped working. Which led to less grain being produced, and a deeper famine. Instead of working on collective farms, peasants moved to cities or took to the countryside scrounging for food.

You can’t just undo the way people have always done things in a single action and expect it to go well.


I think a major issue today is that many people lack real perspective on what collapse, injustice, and tragedy really are—on how bad institutions can be.

The suffering from the Holodomor, the Ukrainian famine, was not just preventable: Stalin and the Soviet Union made conscious decisions that killed millions of people, with full knowledge that those decisions would kill many people. The Soviets actively engaged in systematic practices of theft, arrest, murder, and torture to ensure that poor peasants starved to death. The famine was caused by a government that thought it could throw out the way things had always been done and make things better. When they didn’t get better, the government doubled down on its bad decisions and tried to force-feed radical change to people who weren’t ready for it. As a result, millions starved to death.

Our system of government is imperfect. But by historical standards, our institutions are excellent. We should not forget that fact. We have more privilege, luxury, and good fortune than any other society in human history. Throwing it all away, simply because it isn’t perfect, is as bad an idea as could be proposed.


Incremental improvement isn’t sexy. It doesn’t do well on a meme or at a protest march. It probably won’t get you clicks. It’s technocratic and tedious.[7]

Incrementalism is not about inventing the wheel. It’s about building a better wheel. And not convincing ourselves that to build a better wheel, we first have to destroy all existing wheels.

[1] If you meet someone who tells you that they have a great investment system, and they do something other than investing for a living, you should be wary of that person. On the other hand, if you know someone who has made a living off of markets or betting (other than just the fees associated with investing other people’s money) for a long period of time, that person might be worth listening to.

[2] For a more academic articulation of a similar thesis, I recommend Charles Lindblom’s The Science of Muddling Through.

[3] Turns out, taking money from unhappy, addicted people who are making bad decisions right in front of you is not a very satisfying way of earning a living.

[4] While the current incarnation of the “life hack” may be new, the idea of selling simple solutions to complex and hard problems is almost certainly nothing new. People were literally selling snake oil 100 years ago and I’m sure they’ve been selling miracle cures to hard problems for at least a few orders of magnitude longer than that.

[5] This philosophy is consistent with the game theoretic conclusions in Darwin’s Unfinished Symphony, the primary thesis of which is that strategic copying outperforms innovation.

[6] Although in an autocratic system, sometimes it’s easy enough to pinpoint the source of the problem: the autocrats.

[7] For a smart read on how good governance ought to be done, a great place to start is The Cost-Benefit Revolution, by Cass Sunstein, who worked in both the Reagan and Obama administrations.