Politics, Power, and Science

Friday, February 13, 2026

The Universe Has No Constants: Why Physical "Laws" Are Just Coordinate Artifacts

J. Rogers, SE Ohio

The formal paper is here.

Physics is currently stuck in a Ptolemaic age. We have built an incredibly precise, predictive model of the universe, but it is built on a fundamental error of perspective. We have placed the observer—and the observer’s arbitrary distinctions between Mass, Length, and Time—at the center of the solar system.

Because the universe does not actually respect these distinctions, our data looks messy. "Planets" appear to move backward. To fix the math, we have added epicycles—correction factors that force the data to fit our geocentric model.

We call these epicycles Physical Constants.

We treat the speed of light (c), Planck’s constant (h), and the gravitational constant (G) as fundamental properties of the universe. We measure them to deeper and deeper precision, marveling at their "fine-tuning."

But what if they aren't properties of the universe at all? What if they are just the conversion factors required to translate a unified reality into a fragmented coordinate system?

The Copernican Shift: Moving the Center to the Substrate

In my recent paper, "The Structure of Physical Law as a Grothendieck Fibration," I argue for a Copernican shift in metrology.

The Standard Framework of physics is Human-Centric. It assumes that Mass, Energy, Frequency, and Length are ontologically distinct dimensions because that is how we perceive them. When we insist on measuring a unified universe using these four separate rulers, the math requires us to insert "constants" to glue the rulers back together.

  • c is the glue between Space and Time.

  • h is the glue between Energy and Frequency.

  • G is the glue between Mass and Geometry.

My framework is Substrate-Centric. It posits a single, unified substrate (S_u)—a "Sun" at the center of the mathematical system. When you shift your coordinate origin to the substrate’s own geometry (using Natural Ratios), the "retrograde motion" stops. The constants all become 1. They vanish.

This isn't just a trick of algebra. It is a structural proof that the constants were never there to begin with.

Naturally, the "High Priests" of the Standard Framework have objections. Here is why they are wrong.

Objection 1: "This is just Philosophy."

The claim: "Physics is about measurement. Saying 'reality is unified' is metaphysics, not science."

The Rebuttal: This is not metaphysics; it is topology.
The paper demonstrates that physical laws are Cartesian liftings within a specific category theory structure (a Grothendieck fibration). This structure makes a rigorous physical claim: that the "constants" are coupled Jacobian connection coefficients. You cannot change one without breaking the geometry of the others.

This leads to a falsifiable prediction: The Inversion Point.
If the universe were truly fragmented, the scaling relationships of mass (m/m_P) and wavelength (l_P/λ) would be independent. But they aren't. They are reciprocal. My framework predicts that if you plot all fundamental physical laws on a logarithmic scale of natural ratios, they must all intersect at exactly one point: (1, 1).

Standard physics views the Planck scale as a limit. I view it as the geometric origin. That is a physical claim, not a philosophical one.
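The reciprocity can be checked numerically. The sketch below is mine, not the paper's: it uses CODATA values in SI, ħ-based Planck scales, and the reduced Compton wavelength λ = ħ/(mc), conventions I am assuming so the identity closes exactly. With those choices, m/m_P and l_P/λ are the same dimensionless number for any mass, and both hit 1 at the Planck mass.

```python
import math

# CODATA values (SI), assumed here for illustration
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c    = 2.99792458e8      # speed of light, m/s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2

m_P = math.sqrt(hbar * c / G)      # Planck mass, kg
l_P = math.sqrt(hbar * G / c**3)   # Planck length, m

def natural_ratios(m):
    """Return (m/m_P, l_P/lambda) for a particle of mass m,
    using the reduced Compton wavelength lambda = hbar/(m c)."""
    lam = hbar / (m * c)
    return m / m_P, l_P / lam

# Electron: the two "independent" scalings are the same number (~4.19e-23).
r_mass, r_len = natural_ratios(9.1093837015e-31)
print(r_mass, r_len)

# At the Planck mass itself, both ratios reach the inversion point (1, 1).
print(natural_ratios(m_P))
```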

Objection 2: "This is just Dimensional Analysis."

The claim: "You’re just using the Buckingham π theorem to rearrange units. That’s a tool we already use."

The Rebuttal: You are confusing the shadow with the object.
Standard physics uses dimensional analysis, but it cannot explain why it works. Why should the universe care about our units? Why should physical laws be invariant under scaling?

In my framework, dimensional analysis isn't a "tool"—it is the Shadow of the Fibration.

  • The Object: The Substrate (S_u).

  • The Light: The coherence of physical law.

  • The Shadow: The dimensionless ratios (π groups) we observe.

Dimensional analysis works because it is tracing the shape of the shadow cast by the substrate. The Standard Framework spends its time measuring the edges of the shadow with extreme precision. I am turning around to look at the object casting it. To dismiss this as "just dimensional analysis" is like dismissing astronomy as "just looking at dots."

Objection 3: "You are a Layman ignoring the King's Fine Clothes."

The claim: "Real physics is hard. It requires Quantum Field Theory, Renormalization, and Gauge Symmetry. You are ignoring the complexity because you don't understand it."

The Rebuttal: The complexity is the problem.
The "finery" of the King's clothes—the parameter salad of the Standard Model, the renormalization schemes—is exactly what you get when you use the wrong coordinate system.

If you try to map a sphere using a square grid, you get mathematical singularities at the poles. You need complex calculus to fix the map. The expert cartographer is proud of his complex calculus.
I am simply pointing out: "It’s a sphere."

The complexity of modern physics is the complexity of the Jacobian Matrix. It is the math required to rotate our arbitrary axes (Mass, Length, Time) to match the natural geometry of the substrate.

The Forensic Evidence: The Equivalence Chain
If you need proof, look at the "Golden Thread."
When you normalize physical quantities by their Planck scales (the Jacobians), 15 different physical laws—Newton’s Gravity, Einstein’s Mass-Energy, Planck’s Frequency—collapse into a single identity:

        X = X.

The probability of 15 independent laws aligning perfectly by accident is less than 10⁻²².

This is the forensic evidence that the "laws" are not independent discoveries. They are 15 different camera angles of the same object. The "constants" are just the focal lengths of the lenses we used.

The Verdict

We have spent centuries measuring the lens instead of the light.

When we measure the Fine-Structure Constant (α ≈ 1/137), we aren't measuring a fundamental property of reality. We are measuring the distortion of our own electromagnetic coordinate slice. We are measuring the shape of the prism, not the light passing through it.

The universe has no constants. It has only a unified geometry, and the clumsy coordinates we use to describe it. It is time to stop counting epicycles and look at the Sun.

The Universe Has No Constants: Why Physical "Laws" Are Just Coordinate Artifacts

 J. Rogers, SE Ohio

The paper with the full category framework is here.

What if everything we call a "fundamental constant" is actually just a conversion factor between arbitrary measurement axes?

A new paper argues that the physical constants we've spent centuries measuring—the speed of light c, Planck's constant h, Newton's gravitational constant G—aren't properties of the universe at all. They're properties of our description of the universe. And this isn't philosophy—it's a mathematical proof with a falsifiable prediction.

The Setup: One Thing, Many Measurements

Here's the core insight: when you measure an electron, you can describe it in multiple ways:

  • Mass: 9.109 × 10⁻³¹ kg
  • Energy: 8.187 × 10⁻¹⁴ J
  • Frequency: 1.236 × 10²⁰ Hz
  • Wavelength: 2.426 × 10⁻¹² m

We typically think of these as four different properties that happen to be related by physical laws: E = mc², E = hf, λ = h/mc. But what if that's backwards?
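The four descriptions above are one arithmetic exercise apart. A minimal Python sketch, using CODATA values (my own check, nothing specific to the paper), reproduces the listed numbers from the mass alone:

```python
m = 9.1093837015e-31   # electron mass, kg
c = 2.99792458e8       # speed of light, m/s
h = 6.62607015e-34     # Planck constant, J*s

E   = m * c**2     # E = mc^2:   ~8.187e-14 J
f   = E / h        # E = hf:     ~1.236e20 Hz
lam = h / (m * c)  # lambda = h/(mc): ~2.426e-12 m

print(E, f, lam)
```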

What if there's only one underlying thing—call it the substrate—and Mass, Energy, Frequency, and Wavelength are just four different ways of measuring it, like describing a location using latitude, longitude, altitude, or distance-from-Chicago?

When you insist on using multiple "independent" coordinate axes to describe a single unified thing, you're forced to introduce conversion factors between them. We call those conversion factors c, h, and G. But they're not measuring properties of nature—they're measuring how misaligned our coordinate system is from the substrate's natural geometry.

The Math Already Knew This

Here's what makes this framework so compelling: it uses exactly the same mathematics as standard physics, but interprets it geometrically rather than as a collection of separate facts.

Jacobians are the diagonal entries of a rotation matrix that converts between unit systems. We incorrectly called these Jacobians "Planck units." But they are not a separate unit chart. They are not God's little rulers. The Planck Jacobians rotate between the SI unit chart and the single set of natural ratios that exist as physical reality in a unified universe. Every unit chart has a set of Jacobians that rotate to the same set of natural ratios.

Consider these relationships, all known for over a century:

  • E = mc² (Einstein, 1905)
  • E = hf (Planck, 1900)
  • λ = h/mc (de Broglie, 1924)
  • F = Gm₁m₂/r² (Newton, 1687)

Standard physics treats these as independent discoveries that happen to be consistent. But if you normalize everything by Planck-scale factors, something remarkable happens:

m/m_P = E/E_P = f·t_P = l_P/λ = T/T_P = p/p_P

Every fundamental quantity, when divided by its corresponding Planck jacobian, equals the same dimensionless number. One number. Not five different properties related by four laws—one substrate measured six different ways.
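The collapse can be verified in a few lines. This is my sketch, not the paper's code: it assumes ħ-based Planck scales, angular frequency, the reduced Compton wavelength, and T = E/k_B, conventions chosen so the chain closes exactly (with h-based conventions the same collapse happens up to 2π factors):

```python
import math

# CODATA values (SI), assumed for illustration
hbar = 1.054571817e-34
c    = 2.99792458e8
G    = 6.67430e-11
k_B  = 1.380649e-23

# hbar-based Planck scales (the Jacobians)
m_P = math.sqrt(hbar * c / G)
l_P = math.sqrt(hbar * G / c**3)
t_P = l_P / c
E_P = m_P * c**2
p_P = m_P * c
T_P = E_P / k_B

# One particle (the electron), measured six different ways
m   = 9.1093837015e-31
E   = m * c**2          # energy
w   = E / hbar          # angular frequency
lam = hbar / (m * c)    # reduced Compton wavelength
T   = E / k_B           # equivalent temperature
p   = m * c             # characteristic momentum

# Six copies of the same dimensionless number X
ratios = [m / m_P, E / E_P, w * t_P, l_P / lam, T / T_P, p / p_P]
print(ratios)
```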

From this single equality, you can derive all 15 pairwise "laws" connecting mass, energy, frequency, temperature, momentum, and wavelength. Each law is just what you get when you solve for one variable in terms of another. The constants appear as the algebraic leftovers—the Jacobian factors needed to convert between misaligned coordinate axes.

The paper proves this isn't a numerical coincidence. The probability that 15+ independently discovered laws would align this perfectly by chance is less than 10⁻²².

The Conceptual Revolution

Standard Framework:

  • The universe has properties: mass, energy, momentum, etc.
  • These properties are related by laws: E = mc², F = ma, etc.
  • The constants in these laws are fundamental parameters to be measured
  • Understanding nature means discovering more laws and measuring constants to higher precision

This Framework:

  • The universe has one unified substrate (mathematically: a terminal object)
  • Mass, Energy, Time, etc. are measurement axes we impose (conceptually independent, but not ontologically independent).
  • Physical "laws" are forced consistency conditions when you describe one thing through multiple fragmented axes.
  • Constants are Jacobian transformation coefficients—they measure how far your coordinate system is displaced from the substrate's natural geometry.  And this is already operationally true since the 2019 redefinition of constants.
  • There exists exactly one natural coordinate system: dimensionless ratios where all units cancel.

The Falsifiable Claim: Natural Ratios

This isn't just mathematical philosophy. The framework makes a specific, testable prediction:

There exists a single, universal physical scale defined by dimensionless natural ratios, independent of any unit system.

When you form the ratios:

  • m/m_P (mass divided by Planck mass)
  • l_P/λ (Planck length divided by wavelength)
  • f·t_P (frequency times Planck time)

These aren't "Planck units" (which is still a choice of coordinates). These are unit-free numbers—all the dimensions cancel.  We are just cancelling out the unit scaling we arbitrarily added. The paper claims these dimensionless ratios converge to a single universal substrate value that is the same regardless of what you're measuring or what units you use.

Experimental test: If this is true, then any sufficiently precise measurement should reveal that all phenomena, when expressed as natural ratios, cluster around integer or simple fractional relationships to each other. The substrate has structure, and that structure should be visible in the dimensionless ratios.

If, instead, the constants really are independent fundamental parameters, then the natural ratios should show no particular pattern—they'd just be whatever values happen to make our arbitrary unit choices work out.

Why This Matters

1. It Explains Why Physics Works

Why can we describe the universe with mathematics at all? Standard answer: "The universe is mathematical" (Tegmark) or "It's just unreasonably effective" (Wigner).

This framework: Mathematics works because consistency is the only requirement. Once you choose to describe a coherent substrate through multiple fragmented axes, dimensional analysis, conservation laws, and "fundamental" relationships are unavoidable consequences. They're not features of the universe—they're features of any consistent description.

2. It Reframes the Search for Unification

Physics has spent a century trying to unify the four forces, reconcile quantum mechanics and relativity, and find a "theory of everything."

This framework suggests: The universe is already unified. The apparent multiplicity (different forces, different particles, different conservation laws) is an artifact of describing one substrate through multiple incompatible coordinate systems. We're not looking for unification—we're looking for the correct description of the substrate we fragmented.

3. It Reinterprets Constants

When we measure c to higher precision or wonder why the fine structure constant α ≈ 1/137, we're asking the wrong question.

The right question: What is the geometric structure of the substrate, and why does our particular choice of conceptual axes (Mass, Length, Time as independent) produce these specific Jacobian factors when we try to express that geometry?

The constants aren't parameters of nature. They're parameters of our perceptual fragmentation of nature.

The Newton Connection

Isaac Newton would have understood this immediately. The Principia is written in pure geometric ratios—areas swept out, distances traversed, forces compared. Newton proves the inverse-square law without ever assigning units or defining constants. The geometric relationships are primary; the constant G only appears when you try to express that geometry in specific measurement units.

Newton knew gravity wasn't "force equals G times stuff"—it was a geometric relationship between curvature and matter. The algebraic formulation came later, and with it, the illusion that G was a property of gravity rather than a property of our coordinate choice.

This paper is Newton's geometric physics, generalized to all of physics, and proven using category theory (specifically, Grothendieck fibrations—though the author discovered the structure before knowing that name).

The Bottom Line

We've been doing physics backwards. We thought:

  1. Measure constants precisely
  2. Discover new laws
  3. Unify the forces
  4. Understand why math works

This framework says:

  1. There is one substrate (terminal object)
  2. We fragmented it into "independent" conceptual axes (Mass, Length, Time)
  3. Constants are forced to appear as connection coefficients (Jacobians) making our fragmented description consistent
  4. Laws are forced as Cartesian liftings of substrate morphisms into coordinate systems
  5. Math works because consistency is the only requirement

The universe doesn't have multiple properties connected by laws. It has one thing. We invented the "multiple," and we're measuring the cost of that invention. The cost is precisely: c, h, G, k_B.


For the Technically Inclined

The paper models this using:

  • 𝓑: Category of conceptual types (Mass, Energy, Time...) with dimensionless morphisms
  • 𝓔: Category of measured quantities (values + units)
  • π : 𝓔 → 𝓑: Grothendieck fibration projecting measurements onto conceptual types
  • S_u: Terminal object in 𝓑 representing the unified substrate

Physical laws are Cartesian liftings of morphisms in 𝓑. Constants are cocycle data encoding the geometric distortion introduced by coordinate choices. The Planck scale is the unique inversion point where reciprocal scaling relationships (m/m_P vs l_P/λ) simultaneously equal unity.

Dimensional analysis (Buckingham π) and conservation laws (Noether's theorem) emerge as functorial consequences, not separate tools.
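As a loose illustration only (not the paper's construction), the projection π can be mimicked in a few lines of Python: objects of 𝓔 are (value, unit) pairs tagged with a conceptual type, π forgets the numeric data, and a lifted law carries its Jacobian factor. All names here are my own toy choices.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quantity:
    """An object of the measurement category E: a number, a unit,
    and the conceptual type it sits over in the base category B."""
    value: float
    unit: str
    ctype: str

def pi(q: Quantity) -> str:
    """The projection pi : E -> B forgets everything but the type."""
    return q.ctype

C = 2.99792458e8  # the SI Jacobian between the Mass and Energy axes

def lift_mass_to_energy(q: Quantity) -> Quantity:
    """Lift of the base morphism Mass -> Energy into SI coordinates.
    The factor c^2 appears only because the axes are misaligned."""
    assert pi(q) == "Mass"
    return Quantity(q.value * C**2, "J", "Energy")

electron = Quantity(9.1093837015e-31, "kg", "Mass")
print(lift_mass_to_energy(electron))   # ~8.187e-14 J
```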


The radical claim: Physical constants aren't fundamental. Unit systems aren't fundamental. And there's exactly one system of dimensionless natural ratios—where the substrate reveals itself directly, with no conversion factors needed.

Everything else is just us, insisting on measuring one thing in dozens of different ways, and then marveling at how precisely the conversion factors work out.

Thursday, February 12, 2026

There Are No Units in Nature: Why Natural Ratios Are Inevitable

J. Rogers, SE Ohio

Physicists like to say we “set c = h = G = k_B = 1” in natural units, and then everything becomes magically simple. But they never say what that actually means, because it is never done with any rigor. If they did, they would see it is about harmonizing the axes of measurement to a single physical scale.

My revised paper, “The Structure of Physical Law as a Grothendieck Fibration,” makes a stronger claim:
  • The simplicity of natural/Planck units is not a convenient trick.
  • It is a proof that there is only one real “thing” underneath all of physics.
And the constants c, h, G, k_B are the unavoidable price we pay for insisting there are separate things like “mass,” “length,” and “time.”

This isn’t just aesthetics. It says: any universe with coherent physics and observers like us must have something like the Planck scale and something like our constants. There is no other way to do measurement on a single underlying reality.

1. The core move: reality is already unified, axes are in your head

The usual story:
  • There are independent fundamental quantities: mass, length, time, energy, temperature…
  • We discover “mysterious” conversion factors between them: c between space and time, h between energy and frequency, G between mass and geometry, k_B between energy and temperature.
My story:
  • There is one coherent, dimensionless physically real substrate Su.
  • Our “fundamental quantities” are just axes we made up to describe that one thing from different perspectives.
  • The independence of Mass, Length, Time, etc. is a perceptual error, not a feature of the world.
Once you fracture a unity into separate axes and put arbitrary scales on each concept, you create gaps between them. The constants are not little bridges out there in nature; they are the measured size of the distortion you introduced by splitting something that was never separate.

2. The categorical skeleton: a fibration of measurement

The paper formalizes this with category theory, but the picture is simple:
  • There’s a base layer of pure concepts (mass‑like, time‑like, energy‑like directions).
  • There’s a measurement layer where you attach numbers and units (kg, m, s, J, K…).
  • There’s a projection from measurements back to concepts.
This projection has a specific structure (a Grothendieck fibration), and physical laws are special arrows (“Cartesian liftings”) that sit above simple, unit‑free relations.

Example:
At the concept level: “energy is equivalent to mass” (no units, just E∼m).
At the measurement level: “E=mc^2” — which is that same relation, expressed in a weird, skewed coordinate system we call SI.

In this language, c^2 is not a mysterious physical ingredient. It’s a Jacobian: the conversion factor that appears because we chose axes that are misaligned with the underlying physical real unity.

3. The new part: Planck scale as a structural inversion point

The biggest addition in this update is an explicit inversion argument that explains why the Planck Jacobians align at the unique point where our distortions cancel out.

Look at three types of ratios:

Mass: m/m_P – direct.
Length: l_P/λ – inverted.
Frequency: f⋅t_P – inverted.

You can see the pattern:

Mass goes up → m/m_P goes up.
Wavelength goes up → l_P/λ goes down.
Frequency goes up → f·t_P goes up, but period goes down.

The substrate ties these together: m/m_P = l_P/λ = f·t_P = ⋯ = X.

That equation is the Equivalence Chain. It says: once you normalize everything correctly, all of these “different” quantities are just different names for the same dimensionless number X.

Now, where do all these ratios equal 1?

m = 1 · m_P
λ = 1 · l_P
f = 1 / t_P

That special point is what we mistakenly call the "Planck scale". Geometrically, it’s the unique “inversion point” where all reciprocal relationships balance and the log‑space curves cross. At that point, our axes are as aligned with reality as they can possibly be, and all the Jacobians collapse to 1.

The key punchline:
You don’t pick the Planck scale; the Planck scale is what you get when you stop lying to yourself about the independence of your axes.

4. Why the constants are structurally unavoidable

The paper proves a “structural necessity” theorem in plain language:

  1. Start with one coherent substrate where everything can interact.
  2. Let an observer describe it using multiple axes (mass, length, time, etc.).
  3. Assume there are real laws – nontrivial relations between those axes.

Then:

There must exist some distinguished system of scales where all laws look simple (the Planck‑like system).

In any other system, you are forced to introduce constants like c, h, G, k_B as correction factors. They are the Jacobians of your choice of description.

So in this view:
  • A universe with coherent physics and no constants is impossible, if you insist on talking in fractured axes like kg, m, s, K.
  • A universe with “many unrelated fundamental scales” is also impossible, if those scales are supposed to interact.
  • The Planck scale and the constants are not oddities of our universe. They are what any universe looks like when a fragmented observer measures a unified substrate.
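The claim that constants are chart-dependent bookkeeping is easy to sanity-check: switch unit charts and the numerical values of ħ, c, G all change, but the natural ratio m/m_P does not. A small Python sketch of my own, with CODATA values in SI and their CGS rescalings:

```python
import math

def X(m, hbar, c, G):
    """Natural mass ratio m / m_P for a given unit chart's constants."""
    return m / math.sqrt(hbar * c / G)

# Electron in SI (kg, m, s)
X_si  = X(9.1093837015e-31, 1.054571817e-34, 2.99792458e8,  6.67430e-11)

# Electron in CGS (g, cm, s): every "constant" gets a different number...
X_cgs = X(9.1093837015e-28, 1.054571817e-27, 2.99792458e10, 6.67430e-8)

# ...but the dimensionless ratio is chart-independent (~4.19e-23 either way).
print(X_si, X_cgs)
```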

5. Why this matters (and to whom)

For working physicists
  • It explains why “natural units” feel natural: they are the coordinates where your axes finally line up with the substrate.
  • It demotes constants from “deep mysteries to be explained by more physics” to geometric bookkeeping for your unit choices.
  • It unifies dimensional analysis (Buckingham π) and Noether/Lie symmetries as operations on the same underlying fibration, not two separate tricks in different textbooks.
If you’ve ever felt like c,h,G,k_B are more about our description than about reality, this gives you a rigorous way to say that.
For philosophers of physics
  • It attacks the idea that “mass,” “length,” and “time” are ontologically primitive.
  • It treats the observer’s concept‑splitting as the source of constants and complexity.
  • It gives a precise sense in which “laws are coordinate artifacts” of a deeper, unit‑free structure.
This is not just wordplay; it’s backed by categorical structure and an explicit Equivalence Chain that reproduces ~15 major laws (Einstein, Planck, de Broglie, Stefan–Boltzmann, Newtonian gravity, etc.) as projections of a single tautology.
For mathematically inclined readers

  • You get a clean Grothendieck fibration π:E→B where laws are Cartesian liftings.
  • Constants become cocycles / connection coefficients in a “measurement bundle.”
  • There’s a clear path to higher‑category generalizations (constants as 2‑morphisms, measurement as a stack, etc.).

If you like the idea that “physics is a bad coordinate chart on something simple,” this paper is almost literally that sentence turned into math.

6. The real philosophical punchline

The old question: “Why do c, h, G, k_B have these specific values, and why do Planck units seem so special?”

The new answer:
  • Because you chose to describe a single thing with multiple conceptual axes.
  • Because you insisted on having “mass” over here and “length” over there and “time” somewhere else.
The constants are the cost of that insistence. The Planck scale is the unique point where the bill sums to zero.

Or in one line:
There are no units in nature. There is only a unified substrate, and the constants you are forced to invent when you split it apart.
