Physics Reinterpreted: What Are the Forces Doing?
By Eugene Sandugey · 12 min read
Every physicist knows what the four forces do. Gravity attracts. Electromagnetism binds atoms and carries light. The strong force holds nuclei together. The weak force enables radioactive decay. But nobody asks what they're doing it for.
This page applies the Universal Question to established physics: how does each feature optimize the process of optimization itself? Each feature has its own domain-specific explanation. One principle explains why they all exist together.
Empty space (99.9999% of the Universe)
Established. The universe is mostly empty space. Average matter density is about 5 atoms per cubic meter.
Why it exists. Empty space provides isolation between experiments. One star exploding doesn't destroy every other experiment in the universe. It enables massively parallel processing: billions of galaxies running independent optimization experiments simultaneously. The "emptiness" is the buffer that makes parallel optimization possible.
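The quoted density is easy to sanity-check. A minimal sketch (assuming a Hubble constant of about 70 km/s/Mpc; the exact value is still debated) computes the critical density of a flat universe and converts it into hydrogen-atom equivalents:

```python
import math

# Physical constants (SI units)
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_PROTON = 1.673e-27   # proton (hydrogen atom) mass, kg

# Hubble constant: ~70 km/s per megaparsec, converted to 1/s
MPC_IN_M = 3.086e22
H0 = 70e3 / MPC_IN_M

# Critical density of a flat universe: rho = 3 H^2 / (8 pi G)
rho_crit = 3 * H0**2 / (8 * math.pi * G)

atoms_per_m3 = rho_crit / M_PROTON
print(f"{rho_crit:.2e} kg/m^3  ~=  {atoms_per_m3:.1f} hydrogen atoms per cubic meter")
```

Note that this counts the universe's total energy budget in hydrogen-atom equivalents; actual atoms (baryonic matter) are only about 5% of it, so on average space is even emptier than the headline figure suggests.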
The speed of light
Established. Nothing can travel faster than light. No information, no influence, no cause-and-effect. This speed limit comes from the structure of space and time themselves.
Why it exists. Without a speed limit, the first civilization to reach any capability dominates everything instantly. No parallel experiments, no independent paths, one winner takes all. That is the worst possible optimization strategy. The speed of light guarantees that every pocket of the universe gets to run its own experiment independently. Billions of galaxies, each isolated enough to try different things, none able to prevent the others from exploring. It is the buffer between parallel optimization paths.
Standard physics says c comes from the structure of spacetime. The framework explains why spacetime has this structure: because optimization requires isolation between parallel experiments. For how accelerating expansion deepens this isolation over time and flips game theory from zero-sum conquest to positive-sum creation, see the dark energy section below.
The arrow of time
Established. At the level of individual particles, physics works equally well forward or backward. But in the real world, time clearly has a direction: eggs break but don't unbreak. The standard explanation points to the universe starting in a very ordered state that's been getting messier ever since. Standard physics doesn't explain why it started so ordered. The framework does: maximum initial order gives you maximum opportunity for optimization. The most organized starting state creates the most room for exploration, differentiation, and improvement.
Why it exists. Time is the optimization sequence counter. You can't tell whether things improved without a "before" and "after." The large-scale direction of time (things fall apart, eggs don't unscramble) emerges because the universe is systematically trying new arrangements. Entropy going up IS the universe exploring its options: an exhaustive search of configuration space.
Inflation as initialization
Established. In the first sliver of a second after the Big Bang, the universe ballooned to an enormous size in roughly 10⁻³² seconds. This "inflation" explains several puzzles: why the universe looks the same in every direction, why space is flat, and why we don't see certain exotic particles that should have been created. How inflation worked is still debated.
Why it exists. Inflation is a startup sequence. It spread matter and energy across vast space, carved out separate regions, and set conditions for structure formation. The precision of inflation's timing (both start and stop had to fall within narrow ranges) is real physics that needs explaining. Standard cosmology describes it through specific field dynamics but doesn't explain why those dynamics have the precise values they do. The framework does: maximum initial order gives maximum room for optimization.
Dark Energy and accelerating expansion
Established. The universe isn't just expanding. It's expanding faster and faster. Something is pushing space apart, and we call it "dark energy" because we have no idea what it actually is. It makes up about 68% of all the energy in the universe. Recent data from DESI (2024) suggests dark energy may not even be constant but evolving over time.
Why it exists. The strongest optimization reading of accelerating expansion: it eliminates the arms race. In a decelerating or static universe, civilizations eventually meet. Game theory kicks in: grab resources now or lose them to competitors. Arms races. Destruction. Mutually assured destruction threats. Accelerating expansion removes this incentive entirely. There is no future where you meet your competitors in physical space. The territory is literally moving away faster than anyone can reach it. The dominant strategy is creation, never conquest. The game theory flips from zero-sum to positive-sum. No other explanation for dark energy predicts this specific game-theoretic consequence.
One mechanism, five observable consequences. Accelerating expansion makes physical expansion expensive while simulation gets cheaper (forces civilizations inward). It eliminates inter-civilization arms races (competitors can never meet in physical space). It cools the universe, making computation cheaper over time (Landauer's principle). It grows the holographic boundary (increasing information capacity). And it protects accumulated optimization during the critical window when civilizations emerge.
The counterfactual: without accelerating expansion, you get arms races, resource conflicts, and premature collapse before the cascade completes. Would the universe optimize better without dark energy? No.
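The game-theoretic flip can be illustrated with a toy payoff model (all numbers here are invented for illustration, not derived from physics): conquest only pays off with some probability of ever reaching a competitor, while creation pays off unconditionally. As accelerating expansion drives the meeting probability toward zero, the dominant strategy switches.

```python
def expected_payoff(strategy: str, p_meet: float) -> float:
    """Toy payoffs in arbitrary units. 'conquest' gambles on seizing a
    competitor's resources (loot 10) but always pays an arms-race cost (2).
    'create' grows local value (3) whether or not anyone ever meets."""
    if strategy == "conquest":
        return p_meet * 10 - 2
    return 3.0  # create

def dominant_strategy(p_meet: float) -> str:
    return max(["conquest", "create"], key=lambda s: expected_payoff(s, p_meet))

# Decelerating universe: civilizations eventually meet (p_meet -> 1)
print(dominant_strategy(1.0))   # conquest
# Accelerating expansion: competitors recede forever (p_meet -> 0)
print(dominant_strategy(0.0))   # create
```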
DESI data (2025, 3.9 sigma) suggests dark energy is evolving over time and may eventually go negative. If true, the expansion doesn't need to be permanent; it only needs to last long enough for the cascade to produce its recursive multiplication. A universe that expands to let civilizations explore independently, then contracts to share and consolidate, would fit the framework even better than a permanent constant. This is becoming empirically testable: recent models project a potential recollapse on a timescale of roughly 33 billion years.
Why does gravity exist?
Gravity exists because optimization requires aggregation. You cannot build complex structures without first collecting matter in one place. Gravity is the universe's aggregation mechanism: it pulls matter together into stars, planets, galaxies, and clusters. Without it, matter stays dispersed as a thin gas and nothing complex ever forms. Every structure in the universe, from atoms to superclusters, depends on gravity bringing the raw materials together first.
The four forces as optimization gradients
| Force | Standard Description | Optimization Role |
|---|---|---|
| Gravity | Masses attract | Complexity aggregation. Brings matter together to form structure |
| Electromagnetism | Charges interact | Information exchange. Enables communication, chemistry, and consciousness |
| Strong nuclear | Holds nuclei together | Stable structures. Locks matter into configurations that can persist and grow more complex |
| Weak nuclear | Enables radioactive decay | Transformation. Allows matter to change form, enabling nuclear fusion in stars |
Read as a set, these look like four aspects of one optimization process, each governing a different class of physical change. Physicists have been trying for decades to prove that all four forces are really one force wearing different masks (a single equation that produces all of them). Nobody's succeeded yet. This framework offers a different kind of unification: not one equation, but one purpose. All four serve optimization.
Conservation Laws as optimization infrastructure
Conservation laws look like the optimization process's bookkeeping system.
Energy conservation means you can't lose the score. Energy can't be created or destroyed, only moved around. Momentum conservation means optimization has direction: when things interact, the total push has to balance. Information conservation means the universe never forgets a result. Even black holes, which seemed to destroy information, appear to preserve it on their surfaces (this was a 50-year debate in physics, and the "information is preserved" side is winning).
Mathematician Emmy Noether proved in 1918 that every continuous symmetry in physics automatically produces a conservation law. Physics working the same way today as yesterday gives you energy conservation. Working the same here as over there gives you momentum conservation. Noether explains how symmetries produce conservation laws. The framework asks: why does the universe have these specific symmetries rather than others?
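A one-line sketch of Noether's mechanism in the simplest case: if a Lagrangian L(q, q̇) has no explicit time dependence (time-translation symmetry), the Euler-Lagrange equations force a conserved quantity, the energy.

```latex
\frac{dL}{dt}
  = \sum_i \left( \frac{\partial L}{\partial q_i}\,\dot q_i
                + \frac{\partial L}{\partial \dot q_i}\,\ddot q_i \right)
  = \sum_i \frac{d}{dt}\!\left( \frac{\partial L}{\partial \dot q_i}\,\dot q_i \right)
\quad\Longrightarrow\quad
\frac{d}{dt}\underbrace{\left( \sum_i \frac{\partial L}{\partial \dot q_i}\,\dot q_i - L \right)}_{E} = 0
```

The middle step uses the Euler-Lagrange equation to replace ∂L/∂q with d/dt(∂L/∂q̇); the quantity in parentheses is the Hamiltonian, i.e. the total energy.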
Extreme objects as optimization infrastructure
The universe systematically produces objects at every physical limit. These are optimization infrastructure.
Magnetars carry magnetic fields 10¹⁵ times stronger than Earth's, with surface gravity 200 billion times ours. They create unique quantum states impossible elsewhere. Natural laboratories for physics beyond the standard model. Crust shifts cause "starquakes" detectable across the galaxy.
Neutron stars have nuclear density throughout (a teaspoon weighs a billion tons). Some spin 700 times per second. Matter compressed to the nuclear limit. Natural matter recycling at cosmic scale. They generate gravitational waves detectable across the universe.
Black holes represent maximum information density and sit at the center of every major galaxy. They pack roughly one bit per four Planck-sized patches of surface area, the theoretical maximum. The universe mass-produces the densest possible computational structures, and general relativity guarantees their formation wherever enough mass concentrates.
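The "one bit per four Planck areas" rule of thumb makes the claim concrete. A minimal sketch for a non-rotating, solar-mass black hole (the exact Bekenstein-Hawking count carries an extra factor of ln 2 when converting to bits, which this rule of thumb drops):

```python
import math

G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8           # speed of light, m/s
L_PLANCK = 1.616e-35  # Planck length, m
M_SUN = 1.989e30      # solar mass, kg

# Schwarzschild radius and horizon area
r_s = 2 * G * M_SUN / C**2
area = 4 * math.pi * r_s**2

# Rule of thumb from the text: one bit per four Planck areas
bits = area / (4 * L_PLANCK**2)

print(f"radius ~ {r_s / 1000:.1f} km, capacity ~ {bits:.1e} bits")
```

A solar-mass black hole is only ~3 km in radius yet its horizon holds on the order of 10⁷⁷ bits, which is the sense in which it is maximum-density information storage.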
The pattern: the universe doesn't just enable extreme objects. It systematically creates them at every physical limit.
Entropy as exploration
Why does entropy increase?
Entropy increases because the universe needs to explore. A system stuck in perfect order never discovers if some other arrangement works better. Entropy is the cost of searching every possible configuration. The second law of thermodynamics isn't decay. It's the universe systematically trying alternatives. Without entropy, optimization stalls at whatever the initial state was.
Established. Left alone, things naturally drift toward messiness. Ice melts. Buildings crumble. Organized things become disorganized. This is the second law of thermodynamics. It happens because there are vastly more ways to be messy than to be neat, so random changes almost always make things messier. Standard physics doesn't explain why the universe started so organized. The framework does: maximum initial order = maximum room for optimization.
Why it exists. That drift toward messiness isn't decay. It's the universe trying every possible arrangement. A universe where nothing ever fell apart would be stuck in its original state forever, never discovering whether some other arrangement works better. Entropy is the cost of searching. You have to take things apart to find out what else you can build.
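The "vastly more ways to be messy" claim is pure counting. A minimal sketch with 100 coins: there is exactly one all-heads arrangement, but roughly 10²⁹ half-heads arrangements, so a random shuffle almost never lands on the ordered state.

```python
from math import comb

N = 100  # coins

ordered = comb(N, 0)     # all heads: exactly 1 arrangement
messy = comb(N, N // 2)  # 50 heads / 50 tails: the "messy" macrostate

print(f"ordered microstates: {ordered}")
print(f"messy microstates:   {messy:.3e}")
print(f"ratio: {messy / ordered:.1e}")
```

With only 100 coins the messy macrostate already outnumbers the ordered one by a factor of ~10²⁹; for the ~10⁸⁰ particles in the observable universe the imbalance is beyond astronomical, which is why the drift only ever runs one way.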
Observer-Dependence
Established. Reality depends on who's looking and how. Two people moving at different speeds disagree about when events happen, how long things are, and how fast time passes (Einstein, 1905/1915). Quantum measurement outcomes depend on what you choose to measure. Near a black hole, two observers can see completely different but both correct physics (Susskind et al., 1993). A famous thought experiment called "Wigner's friend" (formalized by Frauchiger and Renner, 2018) shows quantum mechanics gives contradictory descriptions to different observers. The delayed-choice quantum eraser (Kim et al., 2000) shows this observer-dependence even operates across time.
Why it exists. Observer-dependence is what you'd expect from a computational system: different users accessing the same data from different angles see different things, but the underlying system stays consistent. The rules are the same for everyone (the laws of physics don't change between observers). But the data you get depends on how you look. A rock looks the same to everyone. A mathematical proof gives the same answer for everyone. But quantum measurements, reference frames, and black hole horizons are genuinely observer-dependent. Computers work the same way: two programs reading the same database at the same time can get different results depending on timing, without the database itself being inconsistent.
Quantum uncertainty
Established. The Heisenberg uncertainty principle is a fundamental property of quantum mechanics. Certain paired properties (like position and momentum, or energy and time) cannot both be precisely determined simultaneously.
Why it exists. Computational efficiency at its finest: don't calculate what nobody's asking for. Why compute exact position AND exact momentum when only one is being queried? Uncertainty also enables quantum tunneling (particles passing through barriers they shouldn't be able to cross according to classical physics), which is the universe's built-in mechanism for escaping dead ends.
Entanglement as infrastructure
Established. Quantum entanglement is real. Bell's theorem, confirmed experimentally by Aspect, Clauser, and Zeilinger (Nobel Prize 2022), proved the correlations between entangled particles are stronger than any local model allows. Something nonlocal IS happening. The "no signaling" theorem says you can't use it to send messages, but the correlation itself violates locality.
Why it exists. Entanglement may be the construction material of spacetime itself. Maldacena and Susskind (2013) proposed ER=EPR: every entangled pair is connected by a microscopic wormhole. Van Raamsdonk (2010) showed the converse: remove all entanglement from a region and the geometry disconnects. No entanglement, no space. If this is right, entanglement is not a weird quantum side effect. It is the fabric.
Entanglement builds up with every interaction. New correlations form constantly. The arrow of time IS this accumulation. Under the transactional interpretation, the connection between entangled particles is through time (a 4D event), not across space. No faster-than-light needed. See Quantum Simulation for how entanglement, gravity, and the arrow of time may all be the same system.
The computational interpretation of fundamental quantities
Energy as computational rate. One of the most basic equations in quantum physics, E = ħω, says energy equals a tiny constant (the reduced Planck constant) times frequency. Frequency is how often something happens per second. Energy IS computation speed. Energy conservation maps to: computation rate can be transferred but not created or destroyed.
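Reading E = ħω as "energy is event rate" is just unit conversion. A minimal sketch: a 1 eV quantum of energy corresponds to an oscillation rate of about 2.4 × 10¹⁴ cycles per second.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
EV = 1.602176634e-19    # one electronvolt in joules

energy = 1 * EV               # a 1 eV quantum
omega = energy / HBAR         # angular frequency, rad/s
freq = omega / (2 * math.pi)  # cycles per second

print(f"omega = {omega:.3e} rad/s, f = {freq:.3e} Hz")
```

Double the energy and you double the rate: a higher-energy system literally "ticks" faster, which is the computational reading of the equation.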
Gravity as information flow. Erik Verlinde proposed in 2010 that gravity emerges from differences in how information is distributed across space. If ER=EPR is correct, gravity IS entanglement gradients: regions with more entanglement pull on regions with less. This proposal remains controversial and has not achieved consensus. The framework finds it suggestive because it would connect gravity, entanglement, and spacetime into a single system: quantum information generating everything else. Gravity builds structure (stars, planets, galaxies) regardless of whether its ultimate nature turns out to be informational or geometric.
The core mechanism: time as the positive attractor
Here's the key insight: the universe doesn't need to "know" what's optimal. There is only positive pressure. Things that optimize better outcompete things that don't. Over time, what works compounds and what doesn't fades. Nobody is pruning or filtering. There's just a positive attractor: better solutions outperform worse ones, and time is what lets that play out.
Three forces work together. Gravity pulls matter together into structures (stars, planets, galaxies). Time lets better solutions compound and outcompete worse ones. Consciousness selects specific possibilities by making measurements that turn quantum "maybe" into definite "is."
Think of the difference between AlphaGo and Deep Blue. Deep Blue tried to avoid bad chess moves by checking a list of known mistakes. AlphaGo learned to find good moves by following gradients toward what wins. The universe works more like AlphaGo: it doesn't need to know every failure mode in advance. It follows gradients toward better outcomes.
Why mathematics describes reality
Established. Mathematics describes reality with extraordinary effectiveness. Eugene Wigner named the puzzle in 1960:
"The miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve."
The same mathematical structures appear across wildly different physical domains.
Why it exists. Mathematics maps to reality because reality IS computational. In a computational universe, mathematics being the native language is expected, not mysterious. The effectiveness is only unreasonable if you assume the universe isn't a computation. If it is, mathematics being effective is as natural as a program responding to its own programming language.
Try to Break This
Steel-manned objections, strongest counterarguments first.
The claim isn't that constants have human-like purpose. It's that the specific values and structures of physics are consistent with an optimization process. Fine-tuning to 10⁻¹²² precision and computational structure in quantum mechanics are real observations that need explaining. Standard physics has its own explanations (multiverse selection, physical necessity, brute fact). The framework offers a different one. "Anthropomorphism" is a label, not an explanation for why the physics has the specific structure it does.
The claim is testable: if empty space serves computational isolation, then matter density should be optimized for maximum parallel processing. Too dense means cascade failures. Too sparse means wasted potential. The observed density falls in a range that balances independent processes with enough gravitational interaction for structure formation. The counterfactual is clear: pack all matter together and you get no independent experiments, no parallel processing, catastrophically slower optimization.
Standard physics describes WHAT happens but doesn't address WHY these specific features exist. Why does light have a finite speed? Why is the universe mostly empty? What gives time its arrow? Why is the cosmological constant tuned to 10⁻¹²²? "Shut up and calculate" treats these as brute facts. The optimization framework answers all of them with one principle. One principle covering what standard physics needs separate brute facts for is not adding complexity. It's reducing it.
Related
Is the Universe a Quantum Computer?
We already run quantum mechanics inside quantum mechanics. Every transistor does it. The weird parts of QM become engineering features.
What's Bigger Than Evolution?
Evolution trades short-term costs for long-term gains: sex, intelligence, language, culture, technology. Each step extends how far ahead the system can plan.