Entropy: The improbability of order (oh, and the miracle of life itself)

“The ultimate purpose of life, mind, and human striving: to deploy energy and information to fight back the tide of entropy and carve out refuges of beneficial order.”

Steven Pinker, The Second Law of Thermodynamics (2017)

In the realm of thermodynamics, the term “entropy” represents the measure of disorder or randomness in a system. While the first law of thermodynamics tells us that the total energy of the universe remains constant (the principle of the conservation of energy), this energy becomes progressively less useful as it is converted and spread out. The second law of thermodynamics says that the total disorder (entropy) of the universe increases over time. Energy is not perfectly recyclable; something is always lost.

Entropy may not be intuitive, but its implications ripple across our understanding of the physical world, social systems (such as companies or families), and even the emergence of life on Earth!

Just as a sandcastle will inevitably erode without constant maintenance, so too do the systems in our lives require energy and attention to stave off disorder.

Irreversibility and chaos

A great way to understand why entropy always increases is by exploring a fundamental property of nature: irreversibility. Most processes cannot be perfectly undone. We would struggle to un-cook an egg, to un-mix two combined paint colors, to un-burn firewood, or to un-birth a child.

Imagine applying heat to an ice cube. As the temperature rises, the rigid, frozen water molecules will begin to loosen and scatter as the water goes through phase transitions of melting and eventually vaporizing. Total entropy (disorder) has increased through this process, as some heat (useful energy) was dissipated to cause the phase transitions. We can re-freeze the water vapor by cooling it, but this will require additional energy use, further increasing entropy.1

Now imagine we simply place our ice cube on the kitchen counter. The warmer counter will transfer some heat to the ice, until their temperatures equalize. Because the atoms in a hotter substance—by definition—are moving more quickly than those in a colder substance, heat tends overwhelmingly to flow in one direction: from a hotter place to a cooler place. The ice cube does not transfer “coldness” to the counter. Critically, this one-directional property means that we are unable to perfectly “reverse” most heat transfers.2

Imagine if physical processes—such as melting an ice cube or operating a car engine—were perfectly reversible: we would have infinitely recyclable energy, and there would be no entropy. But in reality, complex natural processes are generally irreversible, meaning that they involve at least some expenditure of useful energy. Consider how even an incredibly efficient car engine requires a non-zero amount of fuel or electricity to function, and inevitably loses some energy to friction and heat.

This is thermodynamics’ key limitation: because some useful energy must be expended to do work, no process can be perfectly reversed, and entropy must increase.3

The improbability of order

Nature’s tendency towards disorder, whether with ice cubes or engines or businesses, can also be understood through probabilities. Processes tend to move from less probable to more probable states. Because the number of ways a system can be ordered is far smaller than the number of ways it can be disordered, disorder is inevitable.

Imagine a deck of cards thrown on the ground. How likely is it that the cards land in an ordered fashion? There are more than 8×10⁶⁷ ways to arrange a 52-card deck, so the probability of any random arrangement is minuscule. Chaos is much more common. In fact, the change from an ordered arrangement to a disordered one is the source of irreversibility: work must be done in order to re-impose an ordered state on the system!4
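
If you want to check that figure yourself, a few lines of Python will do it; the snippet below is purely illustrative and not part of the original argument.

```python
import math

# Number of distinct orderings of a 52-card deck: 52 factorial
arrangements = math.factorial(52)
print(f"{arrangements:.3e}")       # ~8.066e+67 possible orderings

# Probability of landing on any one specific ordering by chance
print(f"{1 / arrangements:.3e}")   # ~1.24e-68
```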

Carving out refuges of order

Just as we must do more work to re-freeze our ice cube after it inevitably melts, we must constantly expend effort to create and maintain order in our lives, as time tips them towards chaos and disorder. This includes the condition of our homes, relationships, companies, and physical health.

Bottom line: because disorder is always increasing, complacency will eventually lead to failure.

Though invoking thermodynamic laws in everyday life may not earn us many “cool points,” we can drastically improve our decision making by understanding entropy in the context of social systems. For a social system such as an organization or relationship to be productive, it must be in some useful order, and organizing the people and activities into a useful state requires us to actively invest energy and resources.

In business, companies that don’t work constantly to cultivate and evolve their purpose, structure, and processes will inevitably stagnate. Over time, bureaucracy and decay will seep in, as the gap widens between the company’s goals and the activities of its employees. Lines of responsibility will blur, and the product portfolio will become bloated. To stave off entropy, management must periodically redesign processes and reallocate resources to ensure the company channels its competitive energy outward, not inward.5

Life’s dance with disorder

At a cosmic scale, entropy tells us that all things will fall apart eventually, from our own Sun to the entire Milky Way. This is not something to fear; in fact, the law of entropy can help us appreciate the “miracle” of life on our own planet.

If disorder is always increasing in a closed system, how can it be that highly “ordered” organic life and ecosystems have evolved on Earth? The answer stems from the fact that our planet is not a closed system, but a sub-system within a larger solar system, within a larger galaxy, and so on.

While disorder always increases in the universe at large, order can emerge in smaller systems (such as the Earth) which feed off an outside energy source. Our energy source is the Sun, whose dissipated heat helps create the unique conditions on Earth that enable complex organic systems to emerge.6

In this sense, the universe’s march towards disorder is the very source of the rare, beautiful order of our planet!

***

In conclusion, entropy explains why we are seemingly always battling against forces messing things up in our lives. It also serves as a celebration of the precious and transient order that makes life possible in the first place. There is perhaps no better motivation for us to create and sustain order in the families, communities, and organizations that we cherish.

Phase Transitions: Uncovering the hidden tipping points that change everything

A phase transition is the process of change between different states of a system or substance, which occurs when one or more of the system’s control parameters crosses a “critical” threshold.

We tend to take stability for granted, leading us to be caught off-guard when the ground shifts beneath our feet. By better understanding the dynamics of phase transitions, we can learn to anticipate change and manage it to our advantage.

From stability to phase change

As a simple example, consider an ice cube (a solid), whose key control parameter is temperature. If we apply heat to the ice, the temperature will rise, and the frozen water molecules being held in rigid formation by binding forces will begin to scatter as the water goes through phase transitions of melting (into liquid) and eventually vaporizing (into gas).

Between each phase, there is a range of temperatures in which the state of the system remains stable. It is only once its temperature crosses certain critical thresholds (specifically, 0°C and 100°C) that it enters a phase transition.

In fact, this logic helps explain changes in the “phases of matter” (solid, liquid, gas) for all kinds of physical substances as their temperatures fluctuate.1

For us, the real value of the phase transition concept lies in its applicability not only to physical systems (changes in phases of matter), but also to social systems (changes in phases of behavior). In both types, the whole is not only more than the sum of its parts, but it is very different from its parts.

In complex systems, we can’t analyze one component and predict how the whole system will behave, whether it’s one water molecule in a boiling pot or one employee in a company. In each case, we need to consider the system—its collective behaviors, including the control parameters that can tip it into unpredictable phase transitions.

Avalanches of peace

Imagine dropping grains of sand, one-by-one, onto a countertop. A pile will gradually form, and its slope (the key control parameter) will increase. For a time, each additional grain has minimal effect; the pile remains approximately in equilibrium.

Eventually, however, the pile’s slope will increase to an unstable “critical” threshold, beyond which the next grain may cause an avalanche (a type of phase transition).

Near the critical point, there is no way to tell whether the next grain will cause an avalanche, or how big that avalanche will be. All we know is that the probability of an avalanche is much higher beyond the threshold, and that avalanches of any size are possible, though smaller avalanches will happen much more frequently.

Through a series of avalanches, each of which widens the base of the pile, the sandpile (an inanimate complex system) “adapts” itself to maintain overall stability!

Research has observed the stabilizing role of frequent smaller “avalanches” in a variety of systems, including the extinction of species in nature, price bubbles and bursts in financial markets, traffic jams, forest fires, and earthquakes relieving pressure from grinding tectonic plates.2 These systems unconsciously “self-organize” towards a critical state, beyond which they undergo phase transitions that help preserve equilibrium over time.
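
For the curious, the sandpile metaphor can be made concrete with a toy simulation. The Python sketch below is a minimal version of the classic Bak–Tang–Wiesenfeld “sandpile” automaton often used to illustrate self-organized criticality; the grid size, toppling threshold, and number of grains dropped are arbitrary choices for illustration, not parameters from the research cited above.

```python
import random
from collections import Counter

N = 20                                 # grid is N x N; size is arbitrary
grid = [[0] * N for _ in range(N)]

def drop_grain():
    """Drop one grain at a random cell, relax the pile, and return the avalanche size."""
    r, c = random.randrange(N), random.randrange(N)
    grid[r][c] += 1
    topplings = 0
    unstable = [(r, c)] if grid[r][c] >= 4 else []
    while unstable:
        r, c = unstable.pop()
        if grid[r][c] < 4:             # already relaxed by an earlier topple
            continue
        grid[r][c] -= 4                # topple: send one grain to each neighbor
        topplings += 1
        if grid[r][c] >= 4:            # may still be over the threshold
            unstable.append((r, c))
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < N and 0 <= nc < N:
                grid[nr][nc] += 1
                if grid[nr][nc] >= 4:
                    unstable.append((nr, nc))
    return topplings                   # grains pushed past the edges simply fall off

sizes = Counter(drop_grain() for _ in range(50_000))
print("avalanche size -> how often it happened")
for size in sorted(sizes)[:12]:
    print(size, sizes[size])
```

Counting the avalanche sizes after many drops shows the signature pattern described above: tiny avalanches vastly outnumber large ones, yet very large ones remain possible, and the pile keeps returning to the edge of stability.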

Growing to death

For those of us who work in organizations, the science of phase transitions provides insight into how to nurture innovation, and how to avoid destructive breakpoints.

Just as avalanches occur with a steep enough pile, high-performing teams in every creative industry unexpectedly shift as they grow over time, often to their own detriment. Consider the fate of former industry titans who, once on top, failed to adapt to a shifting technological landscape—PanAm, Nokia, Kodak, Blockbuster, and so on.

“Growth will break your company.”

Tony Fadell, Build (2022, pg. 242)

Team size is a key control parameter. With small teams (up to ~15 people), every member’s stake in the results of the project is very high. There’s no need for management. Communication happens naturally. Up to 40-50 people, some silos and sub-teams begin to form, but individual stakes remain high, and most interactions remain informal.

However, as teams and companies scale (particularly beyond 120-150 people), individual stakes in project outcomes decline, while the perks of rank (job titles, salary growth) increase until, when the two cross, the system “snaps” and incentives begin encouraging unwanted behavior: the rejection of risky but potentially groundbreaking ideas.3

Moreover, layers of management form, information becomes siloed, jobs become more specialized, culture drifts, and people-related issues explode. Further growth magnifies these problems.

Managing these transitions successfully requires careful design of incentives, org structure, and roles and responsibilities. To do so, we should consider a portfolio of preemptive actions:4,5

  1. Increase “project-skill fit” — Search for and correct mismatches between employee skills and project needs. Employees who are well-matched to their assignments will take more ownership of the outcomes.
  2. Non-political promotions — In promotion decisions, emphasize independent assessment from multiple sources over politics (reliance on the manager).
  3. Get the incentives right — Motivate others with “soft equity” rewards such as peer recognition, autonomy, and visibility. External rewards such as money should be based on milestones or outcomes that individuals can actually control. Too many incentive structures rely on perverse schemes such as earnings-based compensation for junior employees, who have no direct influence on those metrics.
  4. Decentralize — Once you have multiple products, you will need to split your org into individual product groups, sort of “mini-startups” within the business that are more nimble and autonomous.
  5. Optimize “management spans” — For teams focused on innovative new projects (R&D), consider increasing the average number of direct reports per manager to encourage looser controls and more trial-and-error. For “franchise” groups that focus on growing existing businesses, consider narrowing management spans to encourage tighter controls and measurable outcomes (since failure is more costly).

***

Whether in an organization, relationship, laboratory, or highway, we can improve our decision-making by investigating the control parameters governing the stability of our system—and the thresholds beyond which chaotic phase transitions may occur.

We cannot simply extrapolate linearly from the present. Even if our company, our marriage, or our environment is stable at the moment, small changes in critical factors can create unpredictable change. We can either be surprised by the pervasive phase transitions in our lives, or we can anticipate them and harness them to our advantage.

Emergence: More is different, very different

“We can’t control systems or figure them out. But we can dance with them!”

Donella Meadows, Thinking in Systems (2008, pg. 170)

In complex systems such as human beings, companies, or ecosystems, collective behaviors can create dynamics that cannot be defined or explained by studying the parts on their own. These “emergent” behaviors result from the interactions between the lower-level components of a system and the feedback loops between them.

For example, in our brains, groups of neurons exchanging disordered electrical signals create incredible phenomena, such as senses and memory. Collections of employees swapping information and favors within a company can create factions and hidden bargains. In nature, examples of emergent collective behaviors are everywhere, such as birds flocking, hurricanes or sand dunes forming, social network development, the evolution of life, climate change, the formation of galaxies and stars and planets, and the development of consciousness in an infant.

In all of these examples, the system is not only more than the sum of its parts, it is also very different from those parts. We cannot analyze a single skin cell and infer someone’s personality. Nor can we analyze one employee and infer the behavior of the organization.

If we aspire to operate effectively in complex systems, we cannot afford to underestimate the possibility and power of emergent phenomena. The complex ongoing interactions between system components will cause unique and unexpected behaviors. Developing a better understanding of emergence can not only help us prepare to be unprepared, but also to create truly unique explanations and solutions.

Emergent explanations

Imagine how difficult it would be if we could only learn reductively—that is, by analyzing things into their constituent parts, such as atoms. Even the most basic, everyday events would be overwhelmingly complex. For example, if we put a pot of water over a hot stove, all the world’s supercomputers working for millions of years could not accurately compute what each individual water molecule will do.

With emergent phenomena, however, high-level simplicity “emerges” from low-level complexity. As a result, we may be able to understand systems extremely well and make useful predictions by analyzing phenomena abstractly—that is, at a higher (emergent) level. In fact, as theoretical physicist David Deutsch explains, all knowledge-creation actually depends on emergent phenomena—problems that become explicable when analyzed at a higher level of abstraction.1

Consider thermodynamics, the physical laws governing the behavior of heat and energy. These powerful and fundamental laws do not attempt to describe the world in terms of particles or atoms. They are abstract, describing physical phenomena in terms of their higher-level properties, such as temperature and pressure.

Thermodynamics can help us understand why water turns into ice at low temperatures and gas at high temperatures, without requiring us to analyze the details of each individual water molecule. It is due to the emergent phenomena of phase transitions, sudden transformations that occur when one or more control parameters of a collective system cross a critical threshold. At certain temperatures, the water will freeze, melt, or vaporize.2

Merely analyzing individual water molecules would not enable us to simply “deduce” how these phase transitions happen—or indeed whether they would occur at all. Phase transitions are emergent phenomena.

The emergent (crypto) economy

The economy is an intricate web of behaviors and relationships, emerging from the actions of diverse agents—individuals, companies, investors, regulators—each concurrently pursuing their own goals. These agents are not merely “participants” in the economy; they co-create its dynamics. A great example is the emergence of cryptocurrency markets such as bitcoin.

Imagine a small group of techies that starts investing in bitcoin, driven by a belief that its price will rise in the future. If bitcoin’s value indeed rises, news will spread, potentially igniting a viral wave of speculation. The influx is not just about making money, but also about joining a movement. Stories of newly minted millionaires fuel additional speculation and even attract criminals and fraudsters seeking to exploit the frenzy. Concerned regulators may institute game-changing rules to quell the madness.

Here, thousands of decisions by diverse individuals converge to create a market behavior that is far different than the sum of its parts: a bubble—which is clearly not in the collective interest. Characterized by speculation and volatility, the bubble is a cycle of elation and fear, gains and losses, that transcends individual intentions.

We observe such emergent economic phenomena not only in cryptocurrencies, but also in stock market oscillations, housing bubbles, and even the rise and fall of entire economies. In these systems, predictability is an illusion and periods of stability are tentative, as economic agents constantly respond to and anticipate changes in the dynamic landscape they themselves help shape.3

Managing emergent behavior in economics requires an understanding that plausible-sounding approaches might fail—or backfire entirely.

Dancing with fire

In economics as well as nature, emergent behavior can cause rapid change, and managing it may require counterintuitive interventions.

Consider forest fires. The destructiveness of forest fires follows a power-law distribution, in which there are many small fires that are generally manageable or tolerable, and a few massive, catastrophic fires.

Until 1972, Yellowstone Park rangers were required to extinguish every small fire immediately—a policy that made sense when analyzed reductively (“We should save every tree.”). But this policy led Yellowstone to grow dense with old trees, making a mega-fire like the one in 1988 inevitable. The reductive policy ignored the emergent phenomena of phase transitions: as a forest approaches critical thresholds of tree density and tree age, the potential for a massive fire grows exponentially. Today, most forestry services utilize a “controlled-burn” policy of intentionally sparking smaller fires under careful watch, reducing the potential for a massive fire.4

We must be aware when the potential for emergent behavior exists—that is, any time we are engaged with complex systems. Our focus should not be on trying to perfectly predict the future (which invites us into traps of overconfidence and illusions of control), but rather on being adaptable, to ensure we’re prepared to respond to the unexpected.

***

Systems such as the economy, democracy, science, and forests emerged through centuries of iteration and adaptation. The most complex systems emerge and function without anyone having knowledge of the whole.5

Consequently, we should be skeptical about the potential efficacy of simplistic policies and predictions targeted at complex systems. It is impossible to perfectly anticipate the system’s behavior. What we can do is diligently attempt to observe a system’s dynamics, map out its feedback loops, identify and intervene at key leverage points, and be prepared to learn and adapt as emergent behavior unfolds.

Inertia: Things keep moving and change is hard

If an object is left alone, if no other force (such as friction) acts upon it, it will maintain its current state of motion. Objects already in motion will continue to move forward with a constant velocity, and stationary objects will continue to stand still—unless another force intervenes.1 The more massive the object, the greater the tendency to resist changes in its motion.

The principle of inertia is a fundamental physical rule of motion, first discovered by Galileo and later codified as Isaac Newton’s “First Law” of motion.

More broadly, we can observe inertia effects in the behavior of individuals, systems, organizations, and relationships. Whether obviously or subtly, effects of inertia are pervasive, and leveraging their power can be an effective strategic tool.

Stasis: the easier way forward

Correcting or reversing our current course is costly and effortful. The status quo is easier. Following the forces of inertia allows us to minimize the use of energy, but it can also lead us towards stagnation and decay as the environment shifts beneath our feet.

Consider how we tend to succumb to inertia by continuing to use obsolete technology standards even after new, better technologies have been introduced. For example, the original “QWERTY” keyboard layout has persisted for more than a century, primarily because people have simply become accustomed to it. QWERTY endures despite the fact that an alternative system called the “Dvorak” layout enables more than double the share of keystrokes to be done in the home row and requires about 37% less finger motion than QWERTY.2 Dvorak was too late; inertia prevails.

Recognizing inertia and either combating it or capitalizing on it can be a very productive endeavor, especially in competitive systems such as business, where adaptability and dynamism are key.

Expunging (or exploiting) inertia in business

“There is nothing more difficult… than to take the lead in the introduction of change. Because the innovator has for enemies all those who have done well under the old conditions and lukewarm defenders in those who may benefit under the new.”

Niccolo Machiavelli, The Prince (1532)

In organizations (especially larger ones), inertia ensures that change is difficult. Because people generally fear uncertainty and prefer the status quo, they tend to resist new norms or strategies that undermine their existing responsibilities and routines.

As with resisting the forces of entropy (the universal tendency towards disorder in the physical world), combating inertia requires an outside energy injection.

Let’s consider four key types of inertia in business, each of which can present a threat or a strategic opportunity.

1. Routine inertia

Inertia can live in obsolete or inefficient routines, such as excessively large meetings or complex approval processes. These behaviors are often addressable by retraining or replacing managers who have invested many years in developing and applying the obsolete processes, as well as by reorganizing business units around new patterns of information flow. The routines that worked in the past may be wildly inappropriate for future contexts.3

2. Cultural inertia

Moreover, long-established cultural behaviors can prevent companies from promptly responding to competitive threats. Breaking cultural inertia demands simplification to eliminate the hidden inefficiencies buried beneath complex behaviors and back-door bargains between teams. Simplification may demand eliminating excessive administration functions, non-core operations, coordinating committees, or complex initiatives. It may require breaking up entire organizational units, or even reorienting the company entirely towards a redefined strategy.4

For example, in 2021, Volkswagen—then the world’s largest carmaker—had outspent all rivals in a race to beat Tesla in the development of electric vehicles. Volkswagen attributed its struggles to make an attractive electric vehicle in part to the failure of the company’s managers, lulled into complacency by years of high profitability, to recognize that electric vehicles are more about software than hardware. The culture and competencies that enabled them to produce exquisitely engineered gas vehicles did not translate into coding prowess. The company’s CEO eventually acknowledged, “VW must completely change.”5 He was right.

3. Customers’ inertia

Consumers themselves also exhibit inertia by generally following their past behaviors. For example, we tend to keep the same bank accounts and to auto-renew our insurance policies and subscriptions—generally without conscious choice.6

The big banks know this, and make massive profits as a result. One analysis estimated that U.S. savers missed out on more than $600bn in interest payments from 2014-22 by keeping their savings in the five biggest banks instead of in higher-yield money-market accounts that paid over 10x higher interest rates.7 The giants are banking on inertia: that their customers are unlikely to investigate the alternatives and migrate their accounts to other banks, so there’s no need to offer competitive interest rates.

4. Competitors’ inertia

Business strategist Hamilton Helmer coined the term “counter-positioning” to describe the strategy in which a new entrant adopts a novel business model that would be irrational for an incumbent to mimic (capitalizing on their inertia). If copying a new product or technology would mean undermining their legacy profit streams, incumbents may calculate that inertia is preferable.

For example, around 2000, Netflix pioneered the mail-order DVD business based on the key assumption that Blockbuster, the then-dominant brick-and-mortar DVD rental giant, would be slow to recognize and respond to the threat of Netflix’s new model, allowing Netflix to steal away their customers who were tired of paying $1/day late fees.8 Eventually in 2004, Blockbuster did launch its own DVD subscription service, Blockbuster Online—but it was too late. The costly launch exacerbated Blockbuster’s financial distress. In desperation, Blockbuster actually pivoted back to its brick-and-mortar business.9 By 2010, Blockbuster had filed for bankruptcy, while Netflix went on to become one of the century’s best-performing stocks.

***

In general, the principle of inertia reminds us to be intentional, and to avoid complacency. Deciding not to make a change is making a decision—to preserve the status quo. But what we did in the past may or may not have been optimal at the time, and may be entirely suboptimal in the future. We must be vigilant in reviewing the choices—conscious or otherwise—that we make to keep things as they are to ensure we are adapting appropriately as the complex systems we live in shift beneath our feet.

Randomness: Harnessing the chaos

“Here, on the edge of what we know, in contact with the ocean of the unknown, shines the mystery and beauty of the world. And it’s breathtaking.”

Carlo Rovelli, Seven Brief Lessons on Physics (2016, pg. 81)

Despite our human programming to detect patterns and seek causes or explanations for everything we observe, many events in the world are simply chaotic and unpredictable. Failures to properly account for randomness lead us astray constantly, especially when we are operating in complex systems such as an economy, company, country, or ecosystem.

Our human tendency to craft neat, linear narratives about cause-and-effect can fool us into identifying causal connections between events where none actually exists (such as a relationship between astrological signs and personality traits). It also leads us to naively extrapolate that what has happened in the past will continue into the future. In our interpersonal interactions, we tend to over-attribute people’s behavior to inherent characteristics, versus circumstantial factors or chance. Overall, these fallacies give us false confidence that things are more predictable and explainable than they really are.1

“We are far too willing to reject the belief that much of what we see in life is random.”

Daniel Kahneman, Thinking, Fast and Slow (2011, pg. 117)

But we can, sometimes, harness the chaos. Randomness can work to our advantage, whether in business, computer science, statistics, or—indeed—in the evolution of all life forms. But first…

Why so random?

Is the world inherently random and unpredictable? Physics offers some intriguing insights.

In classical physics, which describes everyday events such as rolling billiard balls and orbiting planets, “random” behavior can emerge from phenomena that are completely orderly and predictable—at least in theory. The problem in practice is twofold. First, perfect prediction requires a flawless understanding of the laws of nature, which we may never achieve. Second, it also requires an impossibly precise knowledge of the system’s initial conditions.2 Whether we’re measuring the motion of a billiard ball or a planet, no instrument can provide infinite precision. Approximation is our best hope. With time, even small errors in these specifications can lead to huge errors in the prediction (the “butterfly effect”). For this reason, many events—the weather, stock market, highway traffic—may appear “random” simply because we can’t gather and process data quickly enough to predict them.
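
To see how quickly tiny measurement errors can swamp a “deterministic” prediction, here is a small Python sketch using the logistic map, a standard toy model of chaos; the equation and starting values are illustrative and not drawn from the text.

```python
# Two runs of the chaotic logistic map, x_next = r * x * (1 - x), started
# one part in a billion apart. The gap between them explodes until the two
# "forecasts" bear no resemblance to each other -- a toy butterfly effect.
r = 4.0
x_a, x_b = 0.400000000, 0.400000001

for step in range(1, 51):
    x_a = r * x_a * (1 - x_a)
    x_b = r * x_b * (1 - x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: {x_a:.6f} vs {x_b:.6f}")
```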

However, in quantum physics, the other pillar of modern physics, which studies microscopic phenomena, unpredictability may go deeper. The observed behaviors of the universe’s most basic particles are notoriously random. Even if we had perfect information about their initial positions and velocities, we could only make probabilistic predictions of where they will go.3 The universe, it seems, will always be full of surprises!

Taming randomness with numbers

But the value of randomness is not limited to the cautionary tale that “we can’t perfectly predict things.” Randomness is a versatile, multidisciplinary mental asset—particularly in the field of statistics.

While individual random events (particle movements, coin flips) are unpredictable, if we know the “distribution” of the underlying data, the probability of different outcomes over a large enough sample size becomes predictable. This principle lies at the heart of statistics, producing tools such as the well-known normal distribution, which can help us quantify uncertainty and make useful inferences and predictions even for random events. In quantum physics, we can predict the probability distribution of a particle’s movements with remarkable accuracy, but we can never be certain of its exact behavior on any particular observation.
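
A quick coin-flip sketch in Python illustrates the point (the sample sizes are arbitrary): no single flip can be predicted, yet the proportion of heads becomes steadily more predictable as the sample grows.

```python
import random

# Individual coin flips are unpredictable, but the *proportion* of heads
# across a large sample converges toward the 50% implied by the
# underlying distribution.
for n in (10, 100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>9} flips: {heads / n:.4f} heads")
```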

When faced with a problem too complex to be understood directly, one of the best tools we have to begin to untangle it is to collect a random sample and closely study the results. In scientific experiments aiming to assess causality (for example, whether a new dieting method causes weight loss), randomness is a critical ingredient. A valid experiment requires randomization both in (1) selecting a sample from the target population to study, and (2) assigning the subjects to “treatment” versus “control” groups. True randomization ensures that on average a sample resembles the population, enabling us to make valid inferences about that population.4

Without truly randomized sampling in our experiments, we are likely to generate biased and misleading results. If, for instance, all the subjects in our clinical trial were American adult men (as was the case for decades), our sample may not be representative of the patients we intend to treat.

Randomness is also the explanation behind regression to the mean: when there is some amount of randomness involved in an event, we should expect extreme outcomes to be followed by more moderate outcomes, because some extreme results are simply blind luck. And luck is transitory.

For example, because our body weight fluctuates daily, the heaviest participants in a new diet study are certainly more likely to have a consistent weight problem (an inherent trait), but they are also more likely to have been at the high-end of their weight range on the day we first weigh them (a random fluctuation). Therefore, the heaviest patients at the beginning of the study should, on average, be expected to lose some weight over time, regardless of the treatment being studied.5 To get a useful signal, we need to compare the results of the “treatment group” to those of a “control group” that did not try the diet. Otherwise, any “discovery” we make could simply be the (predictable) result of randomness!
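
A minimal simulation makes the effect visible; the weights, fluctuations, and group size below are invented purely for illustration. Nobody in this simulated “study” diets at all, yet the heaviest group at the first weigh-in still appears to lose weight by the second.

```python
import random

random.seed(1)

# Each person has a stable "true" weight plus a random day-to-day
# fluctuation. We run two weigh-ins with no intervention in between.
people = [random.gauss(80, 10) for _ in range(10_000)]   # true weights (kg)
first  = [w + random.gauss(0, 2) for w in people]        # weigh-in 1
second = [w + random.gauss(0, 2) for w in people]        # weigh-in 2

# Select the 100 heaviest readings from the first weigh-in
heaviest = sorted(range(len(people)), key=lambda i: first[i], reverse=True)[:100]

avg_first  = sum(first[i] for i in heaviest) / len(heaviest)
avg_second = sum(second[i] for i in heaviest) / len(heaviest)
print(f"first weigh-in:  {avg_first:.1f} kg")
print(f"second weigh-in: {avg_second:.1f} kg  (lower, with no diet at all)")
```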

Solving problems with chaos

Whenever we encounter a problem we’re not sure how to solve, injecting a bit of randomness into the process can often unearth unique and unexpected solutions. If the solution seems elusive, we should ask ourselves whether we can simply try something, learn from whatever happens, and adjust from there.

Nature itself has mastered the art of trial-and-error. In evolutionary biology, random variation in the copying of genes enables the incredible adaptations and life forms we observe in nature. First, the imperfect process of copying genes from parent to offspring creates random mutations, with no regard to what problems those variants might solve. Over time, nature will “select” for the genes most successful at causing themselves to be replicated in the future, such as those that cause better brain function in humans, prettier feathers in peacocks, or longer necks in giraffes.6

Remarkably, without any intentional “design,” randomness breathes complexity, resilience, and beauty into the world. Driven by evolutionary forces, incredibly complex systems—from human beings to organizational cultures to artificial intelligences—can emerge and function without anyone having consciously designed each of their elements.

In the realm of computer science, programmers have embraced randomness as a problem-solving tool. Randomized algorithms can prove extremely useful when we are stuck. For example, checking random values may help crack complex equations. Many effective “optimization” (or “hill-climbing”) algorithms apply random changes to improve the system whenever it looks like it might be stuck on a local peak. We could “jitter” the system with a few small changes, or we could apply a full “random-restart.”7 Netflix invented a useful resiliency-enhancing tool called “chaos monkey,” which randomly shuts down servers and services in production to reveal how the overall system reacts.8
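
As a toy illustration of “jitter plus random restarts,” here is a Python sketch; the bumpy function and all parameters are invented for the example and not taken from any real optimization library.

```python
import math
import random

random.seed(0)

def bumpy(x):
    # One tall peak near x = 0, surrounded by many smaller local peaks.
    return math.cos(3 * x) - 0.1 * x * x

def hill_climb(x, step=0.05, iters=2000):
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)   # small random "jitter"
        if bumpy(candidate) > bumpy(x):               # keep only improvements
            x = candidate
    return x

# Restart the climb from 30 random starting points and keep the best result.
best = max((hill_climb(random.uniform(-10, 10)) for _ in range(30)), key=bumpy)
print(f"best x found: {best:.3f}, value: {bumpy(best):.3f}")
```

A single run of plain hill climbing usually settles on whichever peak happens to be nearest; the random restarts give the search repeated chances to land in the basin of the tallest one.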

***

Embracing randomness can help us to unlock creativity by exploring new ideas and approaches, to eliminate errors of causality, and even to better understand the natural world. We have a choice: we can be astonished or distressed or in denial that the world is unpredictable, or we can admit that we will never have perfect knowledge—and then turn randomness to our advantage.

Relativity: Perspective is paramount

One of the most foundational scientific concepts is the idea of realism, the common-sense view that there exists an objective physical reality independent of any individual’s own consciousness.1 But that doesn’t mean that everything appears the same to everyone; in fact, one of the deepest theories of physics tells us the exact opposite.

Albert Einstein’s groundbreaking theories of relativity demonstrated that time and distance are relative notions, and that they are really two parts of the same thing—a fourth dimension we refer to as “spacetime.” A point in spacetime, called an “event,” is both a location and a moment.

Einstein’s “special theory of relativity” (1905) challenged our intuitive understanding of time and space. According to relativity, observers moving at different speeds will get different answers when measuring lengths and durations.2 For instance, an observer who is moving quickly will experience time more slowly than one who is stationary. The watch of the moving observer would quite literally tick more slowly.3

In other words, time is relative. The only thing that remains fixed is the speed of light—around 186,000 miles per second in a vacuum.

The so-called “twin paradox” provides an incredible example of relative time. Imagine identical twin sisters together on Earth. If one twin hops in a rocket and flies directly away from Earth at a very high speed, while the other one remains stationary on Earth, when the spacefaring twin returns she will be younger than her sister on the ground!4
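
For readers who want numbers, the effect follows from the standard time-dilation formula of special relativity: the traveler’s elapsed time equals the Earth-frame duration multiplied by √(1 - v²/c²). The Python sketch below uses an invented itinerary (a 10-year round trip, as measured on Earth, at 80% of light speed) and ignores the acceleration and turnaround phases.

```python
import math

def traveler_years(earth_years, v_over_c):
    # Special relativity: the moving twin's elapsed ("proper") time is the
    # Earth-frame duration scaled by sqrt(1 - v^2/c^2).
    return earth_years * math.sqrt(1 - v_over_c ** 2)

# Hypothetical trip: 10 Earth-years at 80% of the speed of light.
print(f"{traveler_years(10, 0.8):.1f} years")   # 6.0 -- the traveler ages only 6 years
```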

Even beyond physics, relativity as a concept teaches us that everything depends on our individual vantage point, which rarely paints a complete picture.

Tricks of perspective

We experience relativity whenever we are riding in a car, train, or airplane. When we are traveling in smooth, linear motion at constant velocity, we will not perceive the speed of our movement without an external frame of reference (such as a window), whereas an outside observer could clearly observe this movement. This is also why we don’t “feel” the rotation of the Earth.

Suppose that you are a passenger on a rocket traveling at a uniform velocity and you toss a ball up in the air and then catch it. You will only perceive the vertical change in the ball’s position. Now suppose you are a stationary observer of the rocket passing by overhead (and you have special x-ray vision into the rocket). You will also notice the horizontal movement when the passenger tosses the ball as the rocket moves across your field of vision. The passenger will not notice this horizontal movement, because he and the ball are moving at the same velocity as the rocket!

Even the most beautiful theory is fallible

Observing that his theory of special relativity did not fit with Isaac Newton’s 200-year-old theory of gravity, Einstein published another paper in 1915 providing a complete solution: the “general theory of relativity,” one of the most powerful and elegant theories ever produced by mankind. The theory posits that space can expand and contract, and that it curves in the presence of matter, such as planets or stars.5

Einstein’s groundbreaking theory transformed our understanding of physics and the universe, superseding Newton’s theory of mechanics and gravity that had dominated our thinking for centuries. This is perhaps the best example to remind us that every theory can be wrong. To this day, Newton’s predictions remain incredibly accurate for most practical circumstances; his theory was just incomplete.

“It was a shocking discovery, of course, that Newton’s laws were wrong, after all the years in which they seemed to be accurate. Of course it is clear, not that the experiments were wrong, but that they were done over only a limited range of velocities, so small that the relativistic effects would not have been evident.”

Richard Feynman, The Feynman Lectures on Physics (1963, Vol. I pgs. 16-1—16-2)

Despite its revolutionary impact, we already know that general relativity cannot be a complete description of reality. Although it explains the motions of larger objects (from planetary orbits to billiard-ball trajectories) with great accuracy, it breaks down when describing the bizarre behaviors of microscopic particles. That realm belongs to quantum theory, the other pillar of physics, which—perplexingly—cannot explain gravity.

Fortunately, scientific knowledge progresses not by discovering unattainably perfect theories, but by the repeated toppling of our best theories by stronger, more testable, more unifying ones. We can simultaneously regard both general relativity and quantum theory as our best current explanations in their respective domains, and expect that both theories will eventually be revised, unified, or replaced!

No single perspective rules

The concept of relativity holds profound implications beyond physics, particularly by underscoring the value of diverse perspectives in our everyday thinking and decision making, including in social systems such as organizations or governments.

Everything (except the speed of light) depends on one’s perspective; there is no “ruling” frame of reference. There is no single, shared “present” in the universe, only the appearance of such with the objects that are close to us and moving at similar speeds.

Relativity teaches us that the knowledge an observer can have about a system to which he or she belongs is limited. When we are immersed in our own system, we can become blind to events outside of our immediate experience, and we may not be able to easily detect developments in our system (e.g., culture change, unconscious bias) that outsiders might observe much more readily.

Just as we require a frame of reference in order to notice the rotation of the Earth, we can frequently benefit from an external perspective, whether through formal observational methods, independent research, or even through an outside consultant, coach, or therapist. We must be open to other perspectives if we want to truly understand ourselves and our environment.

“Your personal experiences make up maybe 0.00000001% of what’s happened in the world but maybe 80% of how you think the world works.”

Morgan Housel, investor

***

We can also draw inspiration from relativity to practice more empathy; our concepts of “rationality”, “morality”, or “duty” can be highly relative to the contexts we live in. Before we pass judgment or claim to truly understand something, we should pause to consider alternative perspectives. Ask experts from various backgrounds. Seek out the best arguments against your ideas. The perspective that led us to our initial conclusions is rarely the only valid perspective to take!