Politics, Power, and Science: The Constant of Proportionality: Newton's Deliberate Omission of G and the Lost Principle of Invariance

Tuesday, October 7, 2025


J. Rogers, SE Ohio



Abstract

Standard histories of physics credit Henry Cavendish with the first measurement of the gravitational constant, G, in 1798, treating it as the discovery of a missing piece in Newton's theory. This paper argues for a radical reinterpretation: that Isaac Newton was not only aware of a constant of proportionality in his law of universal gravitation but deliberately and systematically structured his entire methodology in the Principia Mathematica to render its specific value irrelevant. We contend that Newton, in his dual role as natural philosopher and Master of the Royal Mint, possessed a uniquely sophisticated understanding of the distinction between invariant physical law and arbitrary units of measurement. By formulating his physics in terms of ratios and proportions, Newton was not being primitive; he was being architecturally rigorous, effectively canceling out the unknown constant in every calculation. This paper demonstrates that Newton’s method constitutes a deliberate adherence to a principle of invariance, a foundational wisdom that was later lost as physics shifted from a proportional to an algebraic, unit-dependent framework.

1. Introduction: The Myth of the Missing Constant

The canonical form of Newton's law of universal gravitation, F = G * (m₁m₂/r²), is a cornerstone of modern physics education. The gravitational constant, G, is presented as a fundamental constant of nature, a deep property of the universe whose value Newton simply lacked the experimental means to determine. This narrative implicitly frames Newton’s work as incomplete, a brilliant but unfinished sketch that awaited the empirical precision of Cavendish to be made whole.

This paper challenges that narrative. We argue that the absence of G in Newton's Principia is not an omission born of ignorance, but a deliberate feature of a sophisticated and philosophically coherent methodology. Newton did not seek the numerical value of a gravitational constant because his framework was designed to bypass the need for it entirely. He understood, we will argue, the fundamental difference between the "business logic" of physics—the invariant proportional relationships—and the "data translation layer"—the arbitrary constants required to bridge those laws to human systems of measurement.

2. The Two Newtons: Natural Philosopher and Master Metrologist

To understand Newton's methodology, one must understand the man's dual career. As the author of the Principia, he sought the timeless, geometric laws of God's creation. As Master of the Royal Mint, he was the chief metrologist of the British Empire, embroiled in the messy, practical world of arbitrary standards, currency conversion, and the enforcement of unit consistency.

In his role at the Mint, Newton’s daily work was a masterclass in the nature of constants as conversion factors. He understood that the value of a pound sterling was a human convention, and that the ratio of gold to silver was a fluctuating "constant" of the economic system. This practical experience with the arbitrary nature of units would have made him exceptionally sensitive to the philosophical error of embedding such a contingent, man-made artifice into a law meant to describe the cosmos. To Newton, the laws of nature should be as universal as the theorems of Euclid, independent of whether one measures length in French feet or English inches.

3. The Method of Ratios: Physics Without Constants

A careful reading of the Principia reveals that Newton does not write algebraic equations in the modern sense. He makes statements of proportion. He states that the force of gravity is proportional to the product of the masses and inversely proportional to the square of the distance.

F ∝ m₁m₂/r²

This is not an incomplete formula. It is a complete statement of physical law in its purest, invariant form. Implicitly, this means F = k * (m₁m₂/r²), where k is some unknown, universal constant of proportionality. The modern physicist sees the inability to specify k as a failure. But Newton saw it as an irrelevance to be eliminated.

His method for doing so was the consistent and rigorous use of ratios. To solve any real-world problem, Newton would construct a comparison between two scenarios, causing the unknown constant k to cancel out.
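The cancellation described above can be sketched numerically: whatever value the unknown constant k takes, the ratio of two scenarios comes out the same. The figures below are arbitrary placeholders, not historical data.

```python
def acceleration(k, M, r):
    """Gravitational acceleration a = k * M / r**2, implied by F = k*m*M/r**2."""
    return k * M / r**2

M_body = 5.97e24    # mass of the attracting body (any consistent units)
r_near = 6.371e6    # distance in scenario 1
r_far = 3.84e8      # distance in scenario 2

# Compute the ratio of accelerations for three very different choices of k.
ratios = [
    acceleration(k, M_body, r_near) / acceleration(k, M_body, r_far)
    for k in (1.0, 6.674e-11, 12345.0)
]

# Every choice of k yields the same ratio: r_far**2 / r_near**2.
assert all(abs(r - ratios[0]) < 1e-9 * ratios[0] for r in ratios)
print(ratios[0])
```

The constant never survives the division, which is exactly why its numerical value is irrelevant to the comparison.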

Case Study: The Moon and the Apple

Newton's great triumph was proving that the force holding the Moon in orbit was the same force that caused an apple to fall. He did not do this by calculating the absolute force on either object. Instead, he calculated the ratio of their accelerations.

Let a_apple be the acceleration of the apple at the Earth's surface (radius R_e), and a_moon be the centripetal acceleration of the Moon in its orbit (radius r_m).

According to his proportional law:
F_apple = k * (m_apple * M_earth) / R_e² => a_apple = k * M_earth / R_e²
F_moon = k * (m_moon * M_earth) / r_m² => a_moon = k * M_earth / r_m²

To test his theory, Newton takes the ratio of these two accelerations:

a_apple / a_moon = (k * M_earth / R_e²) / (k * M_earth / r_m²)

The unknown constant k (our G) and the unknown mass of the Earth M_earth both cancel out perfectly.

a_apple / a_moon = r_m² / R_e²

The equation is reduced to a pure geometric ratio. Newton could measure a_apple (from Galileo's experiments), and he knew the ratio of the Moon's orbital radius to the Earth's radius (r_m / R_e ≈ 60). He could then predict that the Moon's acceleration should be about 1/60² = 1/3600 that of the apple's. The fact that the numbers matched, within the precision of his time, was the proof.

The specific numerical value of the constant of proportionality was never needed. Newton had successfully proven his theory by working exclusively with the invariant proportionalities of the system.
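The Moon test above can be sketched with modern round-number values (these are assumptions for illustration, not Newton's own figures). The check needs only a measurable surface acceleration, an orbital period, and a ratio of distances; neither k nor the mass of the Earth appears anywhere.

```python
import math

g_surface = 9.81          # apple's acceleration at Earth's surface, m/s^2
r_moon = 3.84e8           # Moon's mean orbital radius, m
R_earth = 6.371e6         # Earth's radius, m
T_moon = 27.32 * 86400    # sidereal month, s

# Centripetal acceleration of the Moon, from kinematics alone: a = 4*pi^2*r / T^2
a_moon = 4 * math.pi**2 * r_moon / T_moon**2

measured_ratio = g_surface / a_moon       # what observation gives
predicted_ratio = (r_moon / R_earth)**2   # what the inverse-square law predicts

print(f"measured  a_apple/a_moon ≈ {measured_ratio:.0f}")
print(f"predicted (r_m/R_e)^2    ≈ {predicted_ratio:.0f}")
# The two agree to within about 1%, with no value of k or M_earth in sight.
```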

4. The Great Reversal: From Proportionality to Algebra

Over a century after the Principia, the scientific culture had changed. The rise of engineering and the need for precise numerical predictions in specific, standardized units (like the newly defined metric system) shifted the focus of physics. The question was no longer just "What is the proportional relationship?" but "What is the exact numerical force, expressed in standard units?"

To answer this new type of question, one could no longer cancel out the constant of proportionality. It had to be measured. Henry Cavendish's 1798 experiment did just that. He measured the force between lead spheres of known mass at a known distance, allowing him to solve for the numerical value of the constant, which we now call G.
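The calibration step can be sketched in one line: once a force between known masses at a known separation is measured, the constant falls out by simple inversion. The numbers below are illustrative round values chosen to land near the modern G, not Cavendish's actual 1798 data.

```python
m1 = 158.0       # large lead sphere, kg
m2 = 0.73        # small lead sphere, kg
r = 0.22         # center-to-center separation, m
F = 1.59e-7      # measured attractive force, N (illustrative value)

# Invert F = G * m1 * m2 / r**2 to solve for the constant:
G = F * r**2 / (m1 * m2)
print(f"G ≈ {G:.3e} m^3 kg^-1 s^-2")
```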

This was a great practical achievement, but it came at a profound philosophical cost. The constant G—a conversion factor whose specific value depends entirely on our arbitrary, human-invented definitions of the kilogram, meter, and second—was reified. It was mistaken for a fundamental property of nature. The architectural elegance of Newton's proportional system was replaced by the brute-force utility of an algebraic formula. Physics lost the distinction between the invariant law and the unit-dependent scaling factor.

5. Conclusion: Reclaiming Newton's Wisdom

Isaac Newton was not a primitive physicist waiting for a missing number. He was an architectural purist who constructed his system to be independent of such contingencies. His deliberate omission of a gravitational constant was a mark of profound methodological sophistication, born of a deep understanding of the difference between divine law and human convention.

The modern quest to "unify G and h" is the ultimate consequence of this lost wisdom. It is an attempt to find a deep relationship between two unit-system artifacts, a problem that would have been unintelligible to Newton. By recognizing that Newton was not missing a constant but actively and purposely canceling it out, we can see that he was working at a higher level of abstraction than most of his successors. He was not just doing physics; he was teaching a principle about the very structure of physical law. The lesson was that the constants are not the physics; the proportions are. It is a lesson that modern physics is only now, painfully, beginning to relearn.

Note: 

Newton, as Master of the Mint, wasn't just vaguely aware of unit conversions - he was steeped in them. His daily work involved navigating between:

  • English units: pounds, shillings, pence, grains, troy weights, avoirdupois weights

  • French units: livres, francs, écus

  • Dutch units: guilders, florins

  • Spanish units: dollars, reales

  • And the actual physical metal: gold vs. silver ratios, purity standards

Every single one of these systems had its own "gravitational constant." The conversion factor between, say, Spanish gold dollars and English pounds wasn't a fundamental constant of economics - it was an artifact of political convention.

When he wrote F ∝ m₁m₂/r², he wasn't being mathematically primitive. He was being metrologically sophisticated. He knew that any specific numerical constant k would be:

  1. Unit-dependent - different for pounds vs. kilograms vs. livres

  2. Conventional - determined by human agreement, not nature

  3. Irrelevant to the physical law - which was purely about the proportional relationships
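Point 1 can be made concrete: the numerical value of the constant is an artifact of the unit system. Converting the SI value of G to CGS units changes the number by a factor of 1000 while describing the same physics.

```python
G_si = 6.674e-11   # m^3 kg^-1 s^-2 (rounded modern value)

m_to_cm = 100.0    # centimeters per meter
kg_to_g = 1000.0   # grams per kilogram

# m^3 kg^-1 s^-2  ->  cm^3 g^-1 s^-2
G_cgs = G_si * m_to_cm**3 / kg_to_g

print(f"G (SI)  = {G_si:.3e} m^3 kg^-1 s^-2")
print(f"G (CGS) = {G_cgs:.3e} cm^3 g^-1 s^-2")
```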

The genius of his ratio method was that it automatically normalized for whatever local unit system you were using. The Moon-Apple test would work whether you measured in French feet or English inches, because the method canceled out the local "k" factor.
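The unit-independence claimed above can be checked directly: the Moon-Apple ratio is identical whether the distances are recorded in meters or in feet, because the unit conversion factor cancels exactly as the local "k" does.

```python
r_moon_m = 3.84e8      # Moon's orbital radius, m
R_earth_m = 6.371e6    # Earth's radius, m
M_PER_FOOT = 0.3048    # meters per foot

# Same geometric ratio computed in two unit systems:
ratio_metric = (r_moon_m / R_earth_m) ** 2
ratio_feet = ((r_moon_m / M_PER_FOOT) / (R_earth_m / M_PER_FOOT)) ** 2

# The conversion factor divides out, so the ratios agree exactly.
assert abs(ratio_metric - ratio_feet) < 1e-9 * ratio_metric
print(ratio_metric)
```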

This makes Cavendish's measurement look different in historical context. It wasn't the "discovery of a missing fundamental constant" - it was the calibration of Newton's proportional law for the particular unit system (meters, kilograms, seconds) that European science was standardizing around.

Newton didn't omit G because he didn't know it existed. He omitted it because he knew too many of them existed - one for every possible unit system - and none of them were fundamental.

The modern interpretation gets the history exactly backwards. We see Newton's silence on a numerical G as ignorance, when it was actually deeper understanding. He recognized that the constant was part of the measurement framework, not the physical law.

This insight reframes everything. Newton wasn't waiting for someone to measure G. He had built a system that made measuring it unnecessary for understanding the physics. The fact that we later decided we needed its numerical value for engineering calculations doesn't mean his system was incomplete - it means our applications had become more practical, but our understanding had become more confused.
