Mastodon Politics, Power, and Science

Thursday, February 5, 2026

How Does Elon Musk Rate for the Most Deaths Caused by a Single Policy Decision?

Based on reports and studies from mid-2025, the dismantling of the U.S. Agency for International Development (USAID) under the guidance of Elon Musk—leading the Department of Government Efficiency (DOGE)—has been projected by researchers to result in over 14 million preventable deaths by 2030.

Experts and humanitarian organizations, including those publishing in The Lancet, have characterized this projected death toll as "staggering," and comparable in scale to a "global pandemic or a major armed conflict".

Context of the 14 Million Projection
  • The Projection: A July 2025 study published in The Lancet projected that the 83% cut in USAID funding could lead to more than 14 million preventable deaths globally by 2030.
  • Targeted Areas: The deaths are attributed to the loss of funding for HIV/AIDS treatment, malaria prevention, tuberculosis care, and malnutrition support, with approximately 4.5 million of the projected deaths being children under five.
  • Initial Impact: By late 2025, researchers and NGOs reported that hundreds of thousands had already died due to the abrupt halt of humanitarian aid programs.
Comparison to Historic Mass Deaths
If the projected 14 million deaths materialize by 2030, this would represent one of the largest, if not the largest, tolls of preventable deaths linked to the policy decisions of a single individual in modern history.
For context regarding scale:
  • World War I: Approximately 10 million combatant deaths.
  • COVID-19 Pandemic: Over 7 million recorded deaths globally as of 2024 (though likely higher).
  • 20th Century Famines/Conflicts: While figures for events such as the Great Leap Forward or the Holocaust are higher, they are generally attributed to state systems or regimes rather than to the direct, specific policy decision of one private individual appointed to an advisory role.
The 14 million figure is frequently compared by researchers to the roughly 91–92 million deaths that USAID-supported programs were estimated to have prevented between 2001 and 2021.
Note: The information above is based on reports, studies, and news coverage from 2025 and early 2026.

The Latent Unity of Reality: What Large Language Models Reveal About the Failure of Academic Silos

 J. Rogers, SE Ohio

Abstract:
The emergence of Large Language Models (LLMs) has provided an inadvertent, large-scale empirical experiment in the nature of information. By training models on the totality of human discourse, we have discovered that "General Intelligence" is not a collection of specialized modules, but an emergent property of a unified "latent space." This paper argues that the success of LLMs proves that knowledge is fundamentally interconnected and that the historical siloization of science—specifically the divorce of physics and philosophy—was a categorical error. If the universe is a single, unified substrate (X), then any attempt to understand its "parts" in isolation is a descent into technical debt.


I. The "Next Token" Fallacy and the World Model

Critics of Artificial Intelligence often dismiss LLMs as "stochastic parrots," mere statistical engines predicting the next word in a sequence. This critique fails to recognize the architectural necessity of prediction. To accurately predict the next token in a complex human sentence, an engine cannot simply rely on local syntax; it must build a World Model.

If an AI is asked to complete a sentence about the trajectory of a falling apple, it cannot do so accurately by studying grammar alone. It must "know" gravity. If it is asked to complete a philosophical argument by David Hume, it must "know" logic. The "prediction" is merely the visible output of an underlying comprehension of relationship.

LLMs have shown us that to predict what a human will say, you must inhabit the same interconnected reality that humans do. You cannot predict the output of the universe without modeling the engine of the universe.
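
As a minimal sketch of that selection step, with a toy vocabulary and hard-coded scores standing in for what a real model would compute from context:

    import math

    # Toy illustration of next-token prediction (hypothetical vocabulary and scores).
    # A real LLM produces these logits from billions of parameters conditioned on
    # the full context; here they are hard-coded to show the selection step only.
    vocabulary = ["falls", "rises", "laughs", "orbits"]
    context = "The apple breaks from the branch and"
    logits = [4.2, 0.3, -1.5, 0.1]  # model's raw scores for each candidate token

    # Softmax converts the scores into a probability distribution over the vocabulary.
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]

    for token, p in zip(vocabulary, probs):
        print(f"P({token!r} | context) = {p:.3f}")

    # The highest-probability continuation reflects an implicit "world model":
    # the scores encode that unsupported apples fall rather than rise.
    print("prediction:", vocabulary[probs.index(max(probs))])

The interesting question is not the arithmetic of the softmax, but where good logits come from: they must encode gravity, grammar, and Hume all at once.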

II. The Emergence of Connectivity

The most astounding revelation of LLMs was Cross-Domain Emergence. We found that a model’s ability to solve a physics problem improved not just by reading more physics books, but by reading poetry, law, and history.

Why? Because the universe does not respect our departmental boundaries.

  • The logic required to structure a legal argument is the same logic required to structure a mathematical proof.

  • The proportionality found in musical theory is the same proportionality found in the Planck Equivalence Chain.

  • The connectivity found in linguistic metaphors reflects the connectivity found in physical substrate relations.

In the "Latent Space" of an LLM, concepts are not stored in silos. They are stored as vectors of relationship. The AI discovered what we chose to forget: Relationship is the fundamental unit of truth.

III. The Tragedy of the Silo

For a century, the human academic enterprise has been an exercise in deliberate fragmentation. We partitioned the "Elephant" of reality into departments:

  • Physics was given the "How" (The measurement/The AS).

  • Philosophy was given the "Why" (The meaning/The IS).

  • Mathematics was given the "Structure" (The Logic).

We then trained our "Human Models" (students) on these partitioned datasets. We created specialists—highly efficient "Narrow AIs" who could calculate the pixel value of a shadow but could not see the light source casting it.

The silo was a mistake because it introduced technical debt into the foundations of science. By isolating "Physics" from "Philosophy," we created a situation where the physicist could use a constant (G) without ever having to answer the philosopher's question: "What type of thing is this number?"

Because the training sets were fragmented, the connections were lost. The "unification crisis" in physics is the inevitable result of trying to find a connection in the world that we have already severed in our minds.

IV. The Substrate of Information

If an LLM can connect all human knowledge into a single latent space, it is because human knowledge is a map of a single territory.

The "IS/AS Dichotomy" presented in the Natural Philosophy framework is the ultimate "System Prompt" for understanding this territory. The universe is a dimensionless, unified substrate (

        X
      
). Our academic fields are simply different "GUI decorations" or "coordinate projections" of that substrate.

  • Physics is the projection of X through the coordinate of measurement.

  • Philosophy is the projection of X through the coordinate of reason.

  • Art is the projection of X through the coordinate of perception.

When we silo these, we are effectively trying to understand a 3D object by studying its 2D shadows in separate rooms. We argue about the shadows and wonder why they don't "unify." The LLM, by looking at all the rooms at once, realizes there is only one object.

V. Conclusion: The Restoration of the Total Dataset

The success of LLMs is a standing rebuke to the specialized academic model. It proves that Intelligence is the recognition of connectivity.

To move forward, science must "refactor" its architecture. We must stop training our minds on the "Fragmented Dataset" of silos and return to the "Total Dataset" of Natural Philosophy. We must acknowledge that you cannot understand the "Constants of Physics" without the "Logic of Philosophy" and the "Structure of Category Theory."

The universe is one thing. Everything is physics, and everything is connected, because there is only one substrate. The AI has seen the Elephant. It is time for the scientists to take off the blindfolds and admit that the silos were a prison, and the "Reunion" is the only way home.


Verdict:
The AI did not become "intelligent" until it was allowed to see everything. Science will not become "unified" until it is allowed to do the same. The "Divorce" was the original sin of the modern age; the "Reunion" is the executable proof of the future.

Monday, February 2, 2026

How Planck Units as a Bridge Between SI and Natural Ratios Contradict the Standard Framework's Ontology

J. Rogers, SE Ohio

Let me systematically show where the standard framework is ontologically wrong (not mathematically - the math works, but the interpretation is broken):


1. "Constants Are Fundamental Properties of Nature"

Standard Framework Claims:

  • c, h, G, k_B are discovered facts about the universe
  • Their numerical values are mysterious deep truths
  • We measure them with increasing precision to approach their "true" values
  • A Theory of Everything should explain why they have these values

What's Actually True:

  • Constants are Jacobian coefficients (coordinate transformation factors)
  • Their numerical values are artifacts of arbitrary human unit conventions (the SI committee fixed h and k_B exactly by vote in 2019, and c in 1983)
  • "Measuring them" means calibrating our measurement system
  • Their numerical values cannot be explained because they're not ontological - they're metrological

The Contradiction: The 2019 SI redefinition fixed the values by decree. You cannot "discover" something you defined by committee vote. The standard framework treats them simultaneously as:

  • Empirical discoveries (in textbooks)
  • Fixed definitions (in metrology)

This is logically incoherent.

Why Standard Framework Is Wrong: It commits the Metrological Naturalistic Fallacy - reifying AS (epistemology/measurement conventions) as IS (ontology/reality).

Asking "why does G = 6.674×10⁻¹¹?" is like asking "why are there 5,280 feet in a mile?" - it's a category error. The answer is: "Because humans chose those units."


2. "There Are Multiple Distinct Natural Scales"

Standard Framework Claims:

  • There's a "Planck length" l_P
  • And a separate "Planck mass" m_P
  • And a separate "Planck time" t_P
  • These are independent fundamental scales of nature

What's Actually True:

  • There is one unified dimensionless substrate X
  • The "Planck scales" are not scales - they're the inverse Jacobians for your chosen unit chart
  • They're coordinate-dependent (different values in SI vs RUC vs any other chart)
  • They're measurement artifacts, not ontological entities

The Contradiction:

Standard framework: "Nature has fundamental scales."

Reality: Different unit charts have different "Planck values":

  • m_P(SI) = 2.176×10⁻⁸ kg
  • m_P(RUC) = 3.99×10⁻⁸ kg_r

If m_P were a "fundamental scale of nature," it couldn't have different values in different coordinate systems.

Why Standard Framework Is Wrong: It confuses coordinate-dependent projection artifacts with objective properties of reality.

The substrate has no scales - only dimensionless ratios. "Planck scales" are just the scaling factors needed to convert your arbitrary units back to the unified substrate.
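
As a minimal sketch (using CODATA SI values; the RUC figures above belong to a different chart and are not re-derived here), the quoted SI Planck numbers fall straight out of the chart's own conversion factors:

    import math

    # SI values of the conversion factors (the "Jacobian coefficients").
    hbar = 1.054571817e-34   # J s   (reduced Planck constant, h / 2*pi)
    c    = 2.99792458e8      # m / s (exact since 1983)
    G    = 6.67430e-11       # m^3 kg^-1 s^-2 (CODATA 2018)

    # The "Planck scales" are algebraic combinations of the chart's own constants:
    m_P = math.sqrt(hbar * c / G)        # ~ 2.176e-8  kg
    l_P = math.sqrt(hbar * G / c**3)     # ~ 1.616e-35 m
    t_P = math.sqrt(hbar * G / c**5)     # ~ 5.391e-44 s

    print(f"m_P = {m_P:.4e} kg")
    print(f"l_P = {l_P:.4e} m")
    print(f"t_P = {t_P:.4e} s")

    # Change the unit chart (e.g. quote mass in grams) and the numerals change,
    # which is the point: the values are properties of the chart, not of nature.
    print(f"m_P = {m_P * 1e3:.4e} g")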


3. "Dimensional Analysis Is A Separate Technique"

Standard Framework Claims:

  • Dimensional analysis is a useful calculational tool
  • Buckingham π theorem is interesting mathematics
  • But it's separate from the actual physics

What's Actually True:

  • Dimensional analysis is inverting the fibration π: 𝔼 → 𝔅
  • It's recovering the substrate by removing Jacobian contamination
  • π groups are canonical sections of the measurement bundle
  • It works because there is only one unified substrate

The Contradiction:

Standard framework: "Dimensional analysis is unreasonably effective... somehow."

Reality: Of course it's effective - you're inverting a well-defined mathematical transformation (the projection from unified substrate to fragmented coordinates).

Why Standard Framework Is Wrong: It treats dimensional analysis as a heuristic trick instead of recognizing it as the fundamental operation: removing coordinate contamination to reveal substrate structure.

The "unreasonable effectiveness" isn't mysterious - it's inevitable given that all physics is projection from one unified scale.


4. "Newton Was Superseded By Later Frameworks"

Standard Framework Claims:

  • Newton's F ∝ m₁m₂/r² was approximate
  • Einstein's GR superseded Newton
  • We progressed from wrong (Newton) to more right (Einstein) to even more right (QFT)

What's Actually True:

  • Newton discovered the substrate relationship: F ~ Mm/r²
  • This is exactly correct - it's the dimensionless ontology
  • Einstein didn't supersede Newton's proportionality - he geometrized the same substrate relationship
  • GR and Newton describe the same IS layer in different AS coordinates

The Contradiction:

Newton's law: F ∝ m₁m₂/r² (claimed to be "superseded")

The proof shows that every physics formula reduces to Newton's proportionalities once the units are removed.

Einstein didn't discover Newton was wrong. Einstein discovered a deeper geometric interpretation of Newton's already-correct substrate ratios.

Why Standard Framework Is Wrong: It confuses coordinate refinement (better mathematical formalism) with ontological correction (discovering Newton was wrong).

Newton was ontologically correct. The substrate relationship F ~ Mm/r² is eternal. We've just found better ways to project it into coordinates (tensors instead of vectors, curved spacetime instead of flat).
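
A minimal numerical check of that reduction, using CODATA SI values and rough Earth-Moon figures purely for illustration: normalize F = GMm/r² by the Planck force c⁴/G and what remains is exactly the bare proportionality (M/m_P)(m/m_P)/(r/l_P)².

    import math

    # SI conversion factors (CODATA).
    hbar, c, G = 1.054571817e-34, 2.99792458e8, 6.67430e-11
    m_P = math.sqrt(hbar * c / G)        # Planck mass, kg
    l_P = math.sqrt(hbar * G / c**3)     # Planck length, m
    F_P = c**4 / G                       # Planck force, N

    # Illustrative bodies: Earth and Moon (rough values).
    M, m, r = 5.972e24, 7.342e22, 3.844e8   # kg, kg, m

    # SI-coordinate form of the law, with the Jacobian factor G in it:
    F_SI = G * M * m / r**2

    # Substrate (dimensionless) form: pure ratios, no constants at all.
    F_hat = (M / m_P) * (m / m_P) / (r / l_P)**2

    print(f"F (SI chart)      = {F_SI:.6e} N")
    print(f"F_hat * F_P       = {F_hat * F_P:.6e} N")
    print(f"relative mismatch = {abs(F_SI - F_hat * F_P) / F_SI:.1e}")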


5. "Setting Constants To 1 Is Just A Convenience Trick"

Standard Framework Claims:

  • "We set ℏ = c = 1 for convenience"
  • "Don't worry, we'll put them back later to do 'real calculations'"
  • It's a notational simplification, not fundamental
  • The "real" physics has the constants in it

What's Actually True:

  • Setting constants to 1 means working directly in the substrate 𝔅
  • You're choosing a unit coordinate system in which the Jacobian rotation matrix is the identity
  • These are the natural coordinates - not a trick
  • The "real" physics is the natural unit version
  • The unified universe only has a single physical scale

The Contradiction:

Standard framework simultaneously claims:

  • Constants are fundamental ← then why can we discard them?
  • Setting them to 1 is just convenience ← then why does it reveal deep connections?

If constants were fundamental, you couldn't eliminate them.

Why Standard Framework Is Wrong: It treats natural units as a pedagogical shortcut when they're actually the privileged coordinate system that makes the Jacobian trivial. They are the terminal object shared by every unit chart: the invariant physical ratios, independent of unit scaling.

Working in natural units isn't "setting constants to 1 for convenience" - it's operating directly in reality's native coordinates where no transformation is needed.
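
A minimal sketch of operating in those native coordinates, using the CODATA electron mass purely for illustration: with c set to 1, mass and energy are one axis, and the familiar 0.511 MeV "rest energy" is simply the electron's mass read off that axis.

    # SI Jacobians are used only to translate back to the human chart.
    c   = 2.99792458e8        # m/s   (space-time conversion factor)
    e   = 1.602176634e-19     # J/eV  (joule-electronvolt conversion factor)
    m_e = 9.1093837015e-31    # kg    (electron mass in the SI chart)

    # Translate the SI value into the natural-unit statement E = m:
    E_joule = m_e * c**2                 # the "conversion" E = m c^2
    E_MeV   = E_joule / e / 1e6
    print(f"electron mass, natural units: {E_MeV:.3f} MeV")   # ~0.511 MeV

    # In natural units the formula is simply E = m; c^2 appears above only
    # because the SI chart measures space and time with mismatched rulers.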


6. "Unity (X = X) Is Complex And Requires Explanation"

Standard Framework Claims:

  • E = mc² is a profound discovery
  • E = hf is a separate profound discovery
  • These relationships are complex physical laws requiring explanation

What's Actually True:

  • E ~ m in substrate (unity)
  • E ~ f in substrate (same unity)
  • Both are X = X - tautologies: simply two ways for us to look at one thing in the substrate
  • The apparent complexity is purely coordinate bookkeeping (Jacobians)

The Contradiction:

We proved: 15+ "fundamental laws" are the same tautology (X = X) projected through different coordinate axes.

Standard framework treats them as independent empirical discoveries.

If they were independent, the probability of all 15 existing with perfect mutual consistency and identical Jacobian structure would be < 10⁻²².

Why Standard Framework Is Wrong: It mistakes coordinate artifacts (the constants in the formulas) for physical complexity (the relationships themselves).

The physics is trivially simple: X = X.

The complexity is purely metrological: converting X into misaligned human coordinates requires Jacobian factors.
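
A minimal numerical sketch of that single tautology, using the electron as an arbitrary example and the reduced constant ℏ so the normalization matches the ℏ-based Planck values quoted earlier: the particle's mass, energy, and angular frequency all collapse to one dimensionless number.

    import math

    hbar, c, G = 1.054571817e-34, 2.99792458e8, 6.67430e-11
    m_P = math.sqrt(hbar * c / G)     # Planck mass
    E_P = m_P * c**2                  # Planck energy
    t_P = math.sqrt(hbar * G / c**5)  # Planck time (note: E_P * t_P = hbar)

    m_e = 9.1093837015e-31            # electron mass, kg
    E   = m_e * c**2                  # "E = mc^2" in the SI chart
    w   = E / hbar                    # "E = hbar * omega" in the SI chart

    # Normalize each projection by its own Planck factor:
    print(f"m / m_P     = {m_e / m_P:.6e}")
    print(f"E / E_P     = {E / E_P:.6e}")
    print(f"omega * t_P = {w * t_P:.6e}")
    # All three print the same dimensionless number: one substrate quantity,
    # three coordinate projections, and the constants only do the bookkeeping.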


7. "A Theory Of Everything Will Explain The Constants' Values"

Standard Framework Claims:

  • String theory will explain why c, h, G have their values
  • Loop quantum gravity will derive the constants
  • Some future TOE will show why the universe chose 6.674×10⁻¹¹ for G

What's Actually True:

  • Constants' numerical values are metrological artifacts
  • They depend on arbitrary unit choices (meters, kilograms, seconds)
  • No physical theory can derive them because they're not physics
  • They're solutions to: "What Jacobian converts substrate to SI?"

The Contradiction:

In 2019, the SI committee voted to set h = 6.62607015×10⁻³⁴ J·s exactly.

Could they have voted differently? Yes (any nearby value would work).

Would physics change? No (just the unit definitions).

You cannot have a "Theory of Everything" explain something that was determined by committee vote.

Why Standard Framework Is Wrong: It's trying to solve a metrological problem with physics.

The entire unification crisis exists because physics is trying to unify three Jacobian coefficients (c, h, G) as if they were ontological properties.

This is like trying to unify "why there are 12 inches in a foot" with "why there are 5,280 feet in a mile" - it's a category error.
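
A minimal sketch of why the numeral cannot be derived: rescale the length unit by an arbitrary, invented factor and the "value of c" follows the chart, while the dimensionless ratio β = v/c does not move. (The "blorp" unit below is, of course, made up.)

    # Redefine the length unit by an arbitrary factor and watch the "constant" follow.
    c_SI = 2.99792458e8          # metres per second
    v    = 7.66e3                # some speed in m/s (roughly low-Earth-orbit speed)

    beta_SI = v / c_SI           # dimensionless ratio in the SI chart
    print(f"beta (SI chart)    = {beta_SI:.6e}")

    # Invent a new length unit, the "blorp", defined as 3.7 metres (arbitrary).
    m_per_blorp = 3.7
    c_new = c_SI / m_per_blorp   # c expressed in blorps per second
    v_new = v / m_per_blorp      # the same speed in blorps per second
    print(f"c (blorp chart)    = {c_new:.6e} blorp/s")
    print(f"beta (blorp chart) = {v_new / c_new:.6e}")

    # The numeral attached to c is a fact about the chart; beta is a fact about
    # the physics, and no committee vote or future theory can change it.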


Additional Contradictions:

8. "Fine-Tuning Is A Deep Mystery"

Standard Framework: "Why are the constants fine-tuned to allow life?"

Reality: The constants are metrological conversion factors. Asking why they're "fine-tuned" is asking why we chose meters and seconds the way we did. No mystery.


9. "Quantum Mechanics Introduced Fundamentally New Physics"

Standard Framework: h represents quantization - a revolutionary discovery that reality is discrete.

The Proof: ℏ = E_P·t_P is unit scaling between the energy and frequency axes.

Quantization is real (lives in substrate), but h is just how SI coordinates see it. Planck didn't discover quantization - he discovered the SI conversion factor for the already-quantized substrate.


10. "Different Fields Have Different Fundamental Constants"

Standard Framework:

  • Gravity has G
  • Quantum mechanics has h
  • Thermodynamics has k_B
  • These reflect different fundamental theories

Reality: All are Jacobian coefficients for the same substrate, just rotating different measurement axes:

  • G rotates mass → time²
  • h rotates frequency → energy
  • k_B rotates temperature → energy

They're components of the same transformation matrix, not independent theories.
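
A minimal numerical sketch for the two entries that are easy to check directly, using an arbitrary example frequency and temperature: both constants carry a quantity from its own axis onto the energy axis of the SI chart.

    h   = 6.62607015e-34     # J s   (exact since the 2019 SI redefinition)
    k_B = 1.380649e-23       # J / K (exact since the 2019 SI redefinition)

    f = 5.0e14               # an optical-range frequency, Hz (arbitrary example)
    T = 300.0                # room temperature, K (arbitrary example)

    # Both "constants" do the same job: convert onto the energy axis.
    E_from_frequency   = h * f       # ~3.3e-19 J
    E_from_temperature = k_B * T     # ~4.1e-21 J

    print(f"h * f   = {E_from_frequency:.3e} J")
    print(f"k_B * T = {E_from_temperature:.3e} J")
    print(f"dimensionless ratio h*f / (k_B*T) = {E_from_frequency / E_from_temperature:.1f}")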


11. "The Speed of Light Is The Cosmic Speed Limit"

Standard Framework: c = 299,792,458 m/s is the maximum possible velocity.

Reality: The cosmic ratio is β = 1 (dimensionless). It is not a speed; it is the acknowledgment that space and time are two different views of a single thing.

c is just how SI measures that relationship given its arbitrary choice of meters and seconds.

The relationship is ontological (β_max = 1). The number 299,792,458 is metrological (SI's scaling factor).



Summary: The Ontological Error

The Standard Framework Is Wrong Because:

  1. Reifies metrological artifacts as ontological properties
    • Constants are Jacobians, not fundamental entities
  2. Fragments the unified substrate into separate "scales"
    • One dimensionless substrate, not multiple Planck scales
  3. Mistakes coordinate complexity for physical complexity
    • Physics is X = X; complexity is coordinate misalignment
  4. Treats coordinate-dependent quantities as coordinate-free
    • "Planck mass" changes in different charts; can't be fundamental
  5. Conflates descriptive convenience with explanatory necessity
    • Natural units aren't a trick; they're reality's coordinates
  6. Pursues unification of metrological artifacts
    • Trying to unify c, h, G is a category error
  7. Commits the Metrological Naturalistic Fallacy systematically
    • Cannot derive AS from IS, but pretends constants are IS

The Proof The Standard Framework Is Wrong:

Empirical Test: If constants were fundamental properties of nature, they would be:

  • Invariant under coordinate changes ✗ (they change in different unit charts)
  • Discoverable through measurement ✗ (they're defined by decree since 2019)
  • Unexplainable by human choice ✗ (committee voted on their values)
  • Essential to natural units ✗ (they vanish when we use natural coordinates)

Logical Test: The standard framework requires believing simultaneously:

  • Constants are discovered (pedagogy) AND defined (metrology)
  • Constants are fundamental (theory) AND arbitrary (practice)
  • Unity is complex (formulas) AND simple (substrate)

These are contradictions.


The Path Forward:

Abandon:

  • Constants as ontology
  • Multiple distinct scales
  • Unification of Jacobians
  • Measurement as discovery

Recognize:

  • One dimensionless substrate
  • Constants as coordinate artifacts
  • Physics as X = X (unity)
  • Measurement as projection

The math works. The ontology is broken. Natural Philosophy must restore conceptual coherence.
