Time is Not Fundamental — Entropy Is
April 2026
One of the deepest mysteries of the Universe is time.
I am not going to step into the full Einsteinian discussion of relativity, spacetime curvature, or the geometry of the cosmos here. I want to talk about time the way we actually feel it—the way we observe it in daily life.
Time moves.
It does not pause for permission. It does not reverse for regret. It flows only forward.
You remember yesterday, but not tomorrow. A glass falls and shatters, but the shattered pieces never spontaneously reassemble themselves and leap back onto the table.
This forward-only nature of reality is so ordinary that we stop noticing how strange it actually is.
Why does time have a direction?
Why is there an arrow?
To approach that question, we must step into one of the most profound ideas in physics: entropy.
Entropy
Imagine you have prepared balloons for a birthday surprise, all packed tightly inside a large bag.
The moment you open the bag, what happens?
The balloons rise, disperse, and spread in all directions.
Now consider another example.
You have a bottle filled with water, with a small hole on its side. You press your finger against the hole so the water stays inside. The moment you remove your finger, the water immediately spills out.
These feel like ordinary events, almost too ordinary to deserve attention.
But hidden inside them is one of the most fundamental laws of the Universe.
What you are witnessing is entropy at work.
Entropy — A Probabilistic View
Let us return to the balloon example.
State A:
All balloons are packed tightly inside the bag.
State B:
The balloons are scattered and spread everywhere.
Now ask a simple question:
Which of these states is more probable?
Clearly, State B.
Why?
Because there are vastly more ways for balloons to be scattered than there are ways for them to remain neatly packed together.
Physics often calls these possible arrangements microstates.
A macrostate is what we observe (“the balloons are spread out”), while microstates are the enormous number of hidden arrangements that produce that observation.
A low-entropy state has fewer possible microstates.
A high-entropy state has many more.
This is the heart of entropy.
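The counting argument can be made concrete. Here is a toy sketch in Python, assuming each of N balloons independently ends up in one of two halves of a room (the names and numbers are illustrative, not from the original):

```python
from math import comb

# Toy model: N balloons, each independently in the "left" or "right"
# half of a room. A macrostate is the count in the left half;
# microstates are the individual arrangements producing that count.
N = 100

# Macrostate A: all balloons packed on one side -> exactly 1 microstate.
packed = comb(N, 0)

# Macrostate B: balloons evenly spread -> vastly more microstates.
spread = comb(N, N // 2)

print(f"microstates, all on one side: {packed}")
print(f"microstates, evenly spread:  {spread:.3e}")
# The spread macrostate wins by roughly 29 orders of magnitude,
# which is why we always see the balloons disperse.
```

Nothing forbids the packed arrangement; it is simply outnumbered.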
Entropy — Statistical Mechanics
Entropy is not simply “disorder.”
It is a counting problem.
It asks:
How many microscopic arrangements correspond to the same macroscopic state?
Claude Shannon defined entropy as:

$$H(X) = -\sum_{i} p(x_i)\,\log_2 p(x_i)$$

Where:
- $p(x_i)$ = probability of outcome $x_i$
- $H(X)$ = entropy of the random variable $X$

The self-information of a single event is:

$$I(x_i) = -\log_2 p(x_i)$$

Entropy is the expected value of information:

$$H(X) = \mathbb{E}[I(X)]$$

which gives:

$$H(X) = -\sum_{i} p(x_i)\,\log_2 p(x_i)$$
This means:
Entropy = Average Uncertainty = Expected Surprise
Rare events carry more information.
Likely events carry less.
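These formulas are short enough to sketch directly in Python (`shannon_entropy` and `self_information` are illustrative names, not from any library):

```python
from math import log2

def shannon_entropy(probs):
    """H(X) = -sum p * log2(p), in bits; terms with p = 0 contribute 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

def self_information(p):
    """I(x) = -log2 p(x): rarer events carry more bits of surprise."""
    return -log2(p)

# A fair coin is maximally uncertain: exactly 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))

# A heavily biased coin is almost predictable: far less than 1 bit.
print(shannon_entropy([0.99, 0.01]))

# Rare events are more informative than likely ones.
print(self_information(0.01))   # ~6.6 bits
print(self_information(0.99))   # ~0.01 bits
```

The biased coin's low entropy is exactly the "low surprise" of an outcome you could have guessed.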
Boltzmann and Physical Entropy
Ludwig Boltzmann gave the thermodynamic definition:

$$S = k_B \ln W$$

Where:
- $S$ = entropy
- $k_B$ = Boltzmann constant
- $W$ = number of accessible microstates
The Universe evolves toward states that can happen in more ways.
Suppose all the air molecules in your room suddenly gathered into one corner. Physics does not forbid this. It is allowed.
But it is so unimaginably improbable that it effectively never happens.
Nature is not “trying” to create disorder.
It is simply moving toward what is statistically most likely.
That is a very important distinction. Entropy is not chaos as intention.
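The air-in-the-corner argument is easy to quantify. A sketch, assuming each molecule independently has a 1/2 chance of being in (say) the left half of the room:

```python
from math import log10

# Probability that every molecule is in the left half of the room:
# each molecule has p = 1/2 independently, so P = (1/2)^N.
# We work with log10(P), since P itself underflows for large N.
def log10_prob_all_one_side(n_molecules):
    return n_molecules * log10(0.5)

# A handful of molecules: merely unlikely (about one in a thousand).
print(log10_prob_all_one_side(10))

# A real room holds on the order of 10^26 molecules:
# even the *exponent* of the probability is astronomical.
print(log10_prob_all_one_side(10**26))
```

"Allowed but never observed" is precisely what a probability like $10^{-3\times10^{25}}$ looks like in practice.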
Second Law of Thermodynamics
The Second Law states:
The entropy of an isolated system never decreases.
Mathematically:

$$\Delta S \ge 0$$

For reversible processes:

$$\Delta S = 0$$

For irreversible processes:

$$\Delta S > 0$$

For reversible heat transfer:

$$dS = \frac{\delta Q_{\text{rev}}}{T}$$
This is the real source of the arrow of time.
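A quick numerical sanity check, with hypothetical reservoir temperatures: when heat flows from hot to cold, the total entropy change is always positive.

```python
# Heat Q leaving a hot reservoir and entering a cold one:
# dS_total = -Q/T_hot + Q/T_cold, positive whenever T_hot > T_cold.
def entropy_change(q_joules, t_hot, t_cold):
    return -q_joules / t_hot + q_joules / t_cold

# 1000 J flowing from a 400 K reservoir to a 300 K reservoir:
dS = entropy_change(1000.0, 400.0, 300.0)
print(f"dS = {dS:.3f} J/K")   # positive: the process is irreversible

# The reverse flow (cold -> hot) would give dS < 0,
# which is why it never happens spontaneously.
```

The hot reservoir loses entropy, the cold one gains more, and the Second Law keeps the books balanced in one direction only.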
The Arrow of Time
This movement—from low entropy to high entropy—is what gives us the arrow of time.
In a sense, entropy is what allows us to distinguish past from future.
Consider ice and water.
An ice cube is highly ordered: molecules arranged in a tight crystalline structure.
Liquid water is far less ordered: the same molecules moving freely in countless possible configurations.
Ice→Water
We see ice melt into water.
We do not see water spontaneously gather itself into a perfect ice cube at room temperature.
Why?
Not because it is impossible.
Because it is statistically absurd.
Time does not move forward because of some cosmic preference.
It moves forward because forward is the direction of increasing probability.
That is the real meaning of the arrow of time.
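The melting example can even be made quantitative. A small sketch, assuming the standard latent heat of fusion for water (about 334 kJ/kg) and melting at 0 °C:

```python
# Entropy gained when ice melts at its melting point: dS = Q_rev / T,
# with Q = m * L_f absorbed reversibly at constant temperature.
L_FUSION = 334_000.0   # J/kg, latent heat of fusion of water
T_MELT = 273.15        # K, melting point of ice

def melting_entropy(mass_kg):
    q = mass_kg * L_FUSION    # heat absorbed during melting
    return q / T_MELT         # dS = Q_rev / T

print(f"{melting_entropy(1.0):.1f} J/K")   # roughly 1.2 kJ/K per kg of ice
```

That entropy gain reflects the explosion in the number of configurations available to the molecules once the crystal lattice is gone.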
Black Holes and Entropy
Now things become even stranger.
When we study black holes, we encounter a disturbing question.
If information falls into a black hole and never comes out, does that violate the laws of physics?
More specifically:
Does it violate the Second Law of Thermodynamics?
At first glance, it seems like it should.
If information disappears, entropy should decrease.
But nature says otherwise.
A black hole itself has entropy.
Remarkably:

$$S_{BH} = \frac{k_B\,A}{4\,l_P^2}$$

Where:
- $A$ = area of the event horizon
- $l_P$ = Planck length
Notice that the entropy of a black hole is proportional not to its volume—
but to the area of its event horizon.
This leads to the holographic principle:
It suggests that all the information swallowed by the black hole is somehow encoded on its two-dimensional boundary.
Can a three-dimensional world be fully described by information stored on a two-dimensional surface?
Can reality itself be a kind of hologram?
This is not science fiction.
It emerges from black hole thermodynamics.
And it forces us to ask whether space itself may be more like information than substance.
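To get a feel for the scale, here is a rough numerical sketch of the Bekenstein–Hawking formula for a Schwarzschild black hole (`bh_entropy_over_kB` is an illustrative helper; the constants are rounded SI values):

```python
from math import pi

# Physical constants (SI, rounded)
G     = 6.674e-11    # gravitational constant
c     = 2.998e8      # speed of light
hbar  = 1.055e-34    # reduced Planck constant
M_SUN = 1.989e30     # solar mass, kg

def bh_entropy_over_kB(mass_kg):
    """S / k_B = A / (4 l_P^2) for a Schwarzschild black hole."""
    r_s = 2 * G * mass_kg / c**2    # Schwarzschild radius
    area = 4 * pi * r_s**2          # horizon area
    l_p_sq = hbar * G / c**3        # Planck length squared
    return area / (4 * l_p_sq)

# A solar-mass black hole carries about 10^77 units of k_B of entropy,
# vastly more than the Sun itself.
print(f"{bh_entropy_over_kB(M_SUN):.2e}")
```

The fact that this number grows with the horizon's *area* rather than the hole's volume is exactly what motivates the holographic principle above.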