---
## **On The Physical Interpretation of Mass and Spacetime**
**The Mass-Frequency Equivalence and the Emergence of Gravity from a Refractive Vacuum**
**Version:** 1.0
**Date:** August 16, 2025
[Rowan Brad Quni](mailto:[email protected]), [QNFO](https://qnfo.org/)
ORCID: [0009-0002-4317-5604](https://orcid.org/0009-0002-4317-5604)
DOI: [10.5281/zenodo.16887412](http://doi.org/10.5281/zenodo.16887412)
*Related Works:*
- *A Theory of General Mechanics as a Process-Based, Computational Ontology of Reality (DOI: [10.5281/zenodo.16759709](http://doi.org/10.5281/zenodo.16759709))*
- *Epistemological Boundaries in Modern Physics: A Re-evaluation of the Planck Scale and the Constancy of Light (DOI: [10.5281/zenodo.16745024](http://doi.org/10.5281/zenodo.16745024))*
- *A Critical Examination of the Null Hypotheses in Fundamental Physics (Volume 1) (DOI: [10.5281/zenodo.16732364](http://doi.org/10.5281/zenodo.16732364))*
- *Natural Units: Universe’s Hidden Code (DOI: [10.5281/zenodo.16615922](http://doi.org/10.5281/zenodo.16615922))*
- *The Mass-Frequency Identity (m=ω): Matter, Energy, Information, and Consciousness as a Unified Process Ontology of Reality (DOI: [10.5281/zenodo.15749742](http://doi.org/10.5281/zenodo.15749742))*
---
### **Abstract**
This paper presents a critical re-examination of two foundational tenets of 20th-century physics: the axiom of a massless photon in the Standard Model (SM) and the geometric interpretation of gravity in General Relativity (GR). We argue that the empirical falsification of local realism invalidates the foundational ontology of General Relativity. Consequently, its predictive success must be re-evaluated not as proof of a geometric spacetime, but as a mathematical formalism whose physical interpretation is questionable—analogous to the predictive, yet flawed, epicycles of Ptolemaic astronomy. We begin by re-examining the nature of mass, deriving a fundamental identity between invariant mass and intrinsic frequency ($m_0 = \omega_0$ in natural units) by equating the relativistic and quantum expressions for rest energy. This suggests that invariant mass is the physical manifestation of a localized oscillation at the Compton frequency, a concept supported by the *Zitterbewegung* predicted by the Dirac equation. This perspective challenges the SM’s axiom of a massless photon. Building on this process-based view of matter, we then develop a model of gravity from first principles, positing the quantum vacuum as a polarizable medium. We postulate a dimensionless coupling constant of 2 arising from the vacuum’s dual electromagnetic nature. By demonstrating that this model quantitatively reproduces the observed deflection of light, we reveal the Newtonian constant, $G$, to be an emergent, composite parameter of the vacuum, rather than a fundamental constant. This framework also provides a more direct physical explanation for gravitational time dilation, reinterpreting it as a medium-induced change in the intrinsic frequencies of atomic clocks. This suggests that gravity is a refractive phenomenon, rendering the geometric interpretation a powerful but potentially superfluous mathematical abstraction.
---
## **Part I: The Foundational Crisis in the Standard Paradigm**
The edifice of 20th-century physics, a monumental intellectual achievement, rests upon the twin pillars of General Relativity (GR) and the Standard Model (SM) of particle physics. The former offers a sublime geometric description of gravity, governing the cosmic ballet of celestial bodies, while the latter provides a remarkably precise quantum field theory of the other fundamental forces and the particles they govern. Yet, this edifice is not sound. It is fractured by a deep and irreconcilable ontological schism between its two core theories and is besieged by a growing dossier of empirical anomalies that its framework cannot explain without resorting to an ever-growing list of ad-hoc, speculative entities. This section will provide a detailed account of this foundational crisis, establishing the urgent need for a re-examination of the physical interpretations that underpin our current understanding of the universe.
### **Section 1: The Falsification of Local Realism: The Undermining of Spacetime**
The most profound challenge to the 20th-century worldview comes not from a new theory, but from an undeniable experimental fact: the universe is not locally real. This empirical result strikes at the very heart of the ontology of General Relativity.
General Relativity is the apotheosis of a classical, local-realist worldview. Its mathematical structure is built upon local differential equations, which embody the principle that an event can only be influenced by its immediate infinitesimal surroundings. Effects propagate continuously through the spacetime manifold at a maximum speed, *c*. The geometry of spacetime at a point is determined by the stress-energy tensor at that same point, and the motion of a particle is determined by the local curvature. In this picture, space is a fundamental entity that guarantees the separateness of objects; for one to influence another, that separation must be traversed by a physical mediator. This principle of **locality** is not a derived conclusion of GR but its foundational, axiomatic bedrock.
Quantum mechanics, however, has revealed a starkly different reality. The phenomenon of entanglement demonstrates a form of instantaneous correlation between particles that transcends any spatial separation. This “spooky action at a distance,” as Einstein famously derided it, was transformed from a philosophical debate into an empirical question by John Stewart Bell in 1964. Bell’s theorem proved that any theory based on local hidden variables (the essence of local realism) would produce statistical correlations that are measurably different from the predictions of standard quantum theory.
Over the past several decades, a series of increasingly sophisticated, “loophole-free” experiments have put Bell’s theorem to the test. These experiments have consistently and decisively violated the Bell inequalities, while confirming the predictions of quantum mechanics to an extraordinary degree of statistical significance. The “Cosmic Bell Test,” for instance, used light from quasars billions of light-years away to determine measurement settings, ruling out local-realist explanations for the observed correlations originating anywhere within 96% of the past spacetime volume of the universe. This experiment confirmed the violation of local realism with a statistical significance of 9.3 standard deviations (Rauch et al., 2018)—a result so robust that the probability of it being a statistical fluke is astronomically small.
The empirical falsification of local realism is not a minor issue; it is a direct refutation of the fundamental ontology upon which General Relativity is constructed. While the mathematical formalism of GR remains a spectacularly accurate tool for calculating gravitational effects in the macroscopic domain, its success can no longer be interpreted as evidence for the physical reality of its geometric, local-realist framework. A theory cannot be considered fundamental if its core axiom about the nature of causality and separation has been proven false.
This forces a radical re-evaluation. The predictive power of GR, in light of the failure of its ontology, must be viewed with the same critical lens that we now apply to Ptolemaic astronomy. The Ptolemaic system, with its complex machinery of epicycles and deferents, was predictively successful for over 1400 years, yet it was based on the fundamentally incorrect premise that the Earth was the center of the universe. The established non-locality of the quantum world demands a new physical picture where the apparent locality of spacetime is an emergent, large-scale approximation, not a fundamental feature of reality.
### **Section 2: The Observer Effect in Scientific Paradigms: A History of Bias**
The separation of a theory’s mathematical success from its ontological truth is not merely a philosophical abstraction; it is a practical necessity given the history of the scientific process. Science is a human endeavor, susceptible to the same cognitive biases that affect all other areas of inquiry. The “observer effect” is not just a quantum phenomenon; it is a psychological one, where the desires and expectations of the scientist can subtly influence the interpretation of ambiguous data.
The 1919 solar eclipse experiment, which provided the first major confirmation of GR, is a prime historical case study. The expedition, led by Sir Arthur Eddington, produced three distinct datasets. Of these, only one, from the highest-quality plates at Sobral, Brazil, provided a result (1.98 ± 0.12 arcseconds) that was both precise enough to be statistically significant and in strong agreement with Einstein’s prediction of 1.75 arcseconds. A second dataset from a different telescope at the same location was discarded by Eddington due to what he identified as systematic instrumental error, though its result (0.93 arcseconds) was closer to the Newtonian prediction. The third dataset, from Príncipe, was compromised by weather and had error bars (1.61 ± 0.30 arcseconds) so large that it was statistically consistent with both theories, albeit favoring Einstein’s (Dyson, Eddington & Davidson, 1920). The conclusion that GR had been proven was therefore based on a reasoned, yet discretionary, selection of the available data. This decision cannot be divorced from the fact that Eddington was a passionate advocate for Einstein’s theory. **He wanted to believe.** The narrative of a revolutionary theory being confirmed was more compelling than the messy reality of the data.
This episode is not an anomaly but exemplifies a recurring pattern in physics. Robert Millikan’s Nobel Prize-winning oil drop experiment to measure the fundamental charge of the electron is now known, from analysis of his lab notebooks, to have involved the selective exclusion of data points that did not fit his emerging conclusion. In the 1980s, a flurry of excitement surrounded the supposed discovery of a “fifth force” of nature, a claim that ultimately evaporated as more careful experiments revealed the initial results were due to unappreciated systematic errors. More recently, in 2015, a 750 GeV “bump” in data from the Large Hadron Collider led to a frenzy of over 500 theoretical papers attempting to explain a new particle that, with more data, revealed itself to be a mere statistical fluctuation. This modern example highlights the “look-elsewhere effect,” where the intense desire for discovery can lead to the over-interpretation of statistical noise.
These historical examples demonstrate that the acceptance of a scientific idea is often a complex negotiation between data, theory, and the very human element of confirmation bias. The initial, celebrated “proofs” of foundational concepts are sometimes far less definitive than their textbook retellings suggest. It is with this critical understanding of the scientific process that we must re-examine the physical interpretations of our most successful theories.
### **Section 3: The Dossier of Modern Anomalies**
The need for a critical re-examination is made more urgent by the fact that the standard ΛCDM (Lambda-Cold Dark Matter) cosmological model is currently failing. It is confronted by a series of persistent, high-significance anomalies that challenge it from every scale.
- **The Dark Sector:** The model’s most glaring failure is its reliance on two entirely hypothetical entities that are claimed to constitute 95% of the universe’s energy density. **Dark matter** was first postulated to explain the anomalous rotation curves of galaxies, whose outer stars orbit far too fast to be bound by the gravity of the visible matter alone (Rubin & Ford, 1970). **Dark energy**, represented by the cosmological constant Λ, was introduced to account for the observed accelerated expansion of the universe (Riess et al., 1998). Despite decades of increasingly sensitive searches, no dark matter particle has ever been directly detected. Dark matter and dark energy are not predictions of the model; they are ad-hoc parameters, or “epicycles,” added to the framework to force a fit with observation.
- **The Cosmological Constant Problem:** The attempt to identify dark energy with the quantum vacuum energy predicted by QFT leads to the “worst theoretical prediction in the history of physics.” The theoretical value is between 50 and 120 orders of magnitude larger than the observed value. This catastrophic failure indicates a profound misunderstanding of the relationship between gravity and the quantum vacuum.
- **The Hubble Tension:** There is a persistent and statistically significant discrepancy between the value of the Hubble constant ($H_0$), which measures the universe’s expansion rate, as determined from the early universe and the local universe. Measurements from the Cosmic Microwave Background by the Planck satellite yield a value of **$H_0 = 67.4 \pm 0.5$ km/s/Mpc** (Planck Collaboration, 2020). In contrast, measurements using supernovae and Cepheid variables in the local universe by the SH0ES team yield **$H_0 = 73.0 \pm 1.0$ km/s/Mpc** (Riess et al., 2022). This discrepancy has now reached a significance level greater than 5-sigma, meaning the probability of it being a statistical fluke is less than one in a million. It points to a fundamental flaw in our understanding of the universe’s expansion history.
- **The Muon g-2 Anomaly:** The most direct and pressing challenge to the Standard Model comes from the anomalous magnetic moment of the muon—a measure of its interaction with the quantum vacuum. After a tantalizing result in 2021, the Fermilab Muon g-2 collaboration released a new, more precise measurement in 2023. The combined experimental result now deviates from the consensus theoretical prediction by a statistical significance of **5.2 sigma** (Muon g-2 Collaboration, 2023). A result at this level constitutes “discovery-level evidence” for physics beyond the Standard Model. This is not a minor discrepancy; it is one of the most precise measurements ever made of a particle’s property, and it is telling us that our fundamental theory of the vacuum is incomplete or incorrect.
These are not isolated puzzles. They are the symptoms of a paradigm under severe stress, providing a compelling empirical motivation for a re-examination of the foundational principles of mass, spacetime, and gravity.
---
## **Part II: The Physical Nature of Mass: An Intrinsic Oscillation**
The crises afflicting modern physics, from the falsification of local realism to the persistent anomalies in observational data, suggest that the problem lies not in the details but in the foundations. The most fundamental concept in our description of the physical world is that of “mass.” In the standard paradigm, mass is treated as an intrinsic property that particles *have*—a measure of their inertia or their coupling to the Higgs field. This section challenges that view. By returning to the foundational equations of relativity and quantum mechanics, a more dynamic and arguably more fundamental picture emerges: mass is not a property a particle possesses, but a process it *is*.
### **Section 4: The Mass-Frequency Identity**
To reveal the deep structure of physical reality, we must first strip away the arbitrary conventions of human measurement. Our descriptions of the universe are laden with units—meters, kilograms, seconds—that are scaled to our macroscopic experience. These units introduce constants of proportionality into our equations ($c$, $\hbar$, $G$) that can obscure the underlying relationships between physical quantities.
By adopting a system of **natural units** where the universal constants of quantum action (the reduced Planck constant, $\hbar$) and cosmic causality (the speed of light, $c$) are set to unity ($\hbar=1, c=1$), we align our mathematical framework with the intrinsic scales of the universe. In such a system, quantities that appear disparate in our everyday units—such as mass, energy, frequency, length, and time—reveal their underlying interconnectedness and can be expressed in terms of a single, unified dimension.
We begin with two of the most robust and empirically verified principles of modern physics, expressed for a particle in its rest frame:
1. **From Special Relativity:** The energy of a particle at rest, its rest energy ($E_0$), is defined by its **invariant mass** ($m_0$). This is the true, frame-independent mass of a particle, an intrinsic property that all observers agree upon. In natural units, the mass-energy equivalence relation ($E_0 = m_0c^2$) becomes a direct identity:
$E_0 = m_0$
2. **From Quantum Mechanics:** The energy of any quantum system is proportional to its frequency. A particle at rest, therefore, must possess a rest energy that corresponds to an intrinsic, frame-invariant angular frequency ($\omega_0$). This frequency is known as the **Compton frequency**. The Planck-Einstein relation ($E_0 = \hbar\omega_0$) in natural units becomes a direct identity:
$E_0 = \omega_0$
By equating these two fundamental and independent descriptions of the same physical quantity (rest energy), we arrive at an unavoidable conclusion:
$
\boxed{m_0 = \omega_0}
$
This is the **mass-frequency identity**. It is not a new law of physics, but a revelation of a fundamental truth obscured by conventional units. It posits a profound physical reinterpretation: **a particle’s invariant mass is ontologically equivalent to its intrinsic angular frequency.**
This is not merely a mathematical curiosity. The Dirac equation, our most complete relativistic quantum theory of the electron, provides a compelling physical picture for this identity. When analyzed closely, the equation predicts that a free electron, even when its center-of-mass is at rest, is not truly static. It undergoes an extremely rapid, localized, circulatory motion known as ***Zitterbewegung*** (“trembling motion”) (Schrödinger, 1930). The frequency of this intrinsic oscillation is directly proportional to the electron’s mass. The mass-frequency identity, therefore, has a physical correlate: invariant mass is the manifestation of the energy confined in this localized, self-sustaining, oscillatory process. Mass is not inert “stuff”; it is the measure of a particle’s internal clock, its fundamental rhythm of existence.
A direct consequence of this identity is the unification of spacetime scales with mass. The Compton wavelength, $\lambda_C = \hbar/(m_0c)$, which represents the fundamental length scale of a particle, becomes $L = 1/m_0$ in natural units. Because $c=1$ (meaning one unit of length is traversed in one unit of time), the characteristic time scale (the period of the Compton oscillation) is also $T=1/m_0$. This reveals a deep reciprocity: a particle’s mass-frequency does not exist *in* spacetime; it *defines* the characteristic spacetime scale of its own existence.
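As a numerical illustration, the identity can be checked for the electron using rounded CODATA constants. The short Python sketch below (illustrative only) recovers the Compton angular frequency, length, and time scales from the electron’s SI mass and confirms that, once $\hbar$ and $c$ are factored out, $L$ and $T$ are both $1/\omega_0$:

```python
# Illustrative check of the mass-frequency identity for the electron.
# Constants are rounded CODATA values in SI units.
hbar = 1.054_571_8e-34   # J·s, reduced Planck constant
c    = 2.997_924_58e8    # m/s, speed of light
m_e  = 9.109_383_7e-31   # kg, electron mass

omega_0 = m_e * c**2 / hbar   # Compton angular frequency (rad/s)
L       = hbar / (m_e * c)    # reduced Compton wavelength (m)
T       = L / c               # characteristic time scale (s)

print(f"omega_0 = {omega_0:.3e} rad/s")   # ~7.76e20 rad/s
print(f"L       = {L:.3e} m")             # ~3.86e-13 m
print(f"T       = {T:.3e} s")             # ~1.29e-21 s
# In natural units L and T are both 1/m_0, so omega_0 * T = 1 identically:
print(f"omega_0 * T = {omega_0 * T:.6f}")
```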
### **Section 5: Deconstructing the Axiom of the Massless Photon**
This reinterpretation of mass as an intrinsic oscillation directly challenges the Standard Model’s axiom that the photon is massless. In the framework of Quantum Electrodynamics (QED), the photon’s mass is set to zero to preserve a specific mathematical symmetry known as local U(1) gauge invariance (Peskin & Schroeder, 1995). A mass term for the photon in the Lagrangian, $\mathcal{L}_{\text{mass}} = \frac{1}{2} m_\gamma^2 A_\mu A^\mu$, is not invariant under the required gauge transformation and is therefore “forbidden” by the theory’s construction.
This reveals that the photon’s masslessness is a **defining axiom of the model**, not an empirical result derived from observation. It is a theoretical choice made for mathematical elegance and consistency with the principle of gauge invariance. However, a guiding principle for theory construction should not be mistaken for an inviolable law of nature. A theory with a massive photon (a Proca theory) is mathematically consistent, and more complex mechanisms (like the Stueckelberg mechanism) can even be used to write a gauge-invariant theory for a massive photon. The ultimate arbiter, therefore, must be experiment.
The experimental search for a photon mass has yielded some of the tightest constraints in all of physics. The non-observation of a Yukawa-like potential in tests of Coulomb’s law and the absence of frequency-dependent dispersion in light from distant astrophysical sources have placed an extraordinary upper limit on the photon mass of $m_\gamma < 10^{-18} \text{ eV}/c^2$ (Tanabashi et al., 2018). Conventionally, this is interpreted as overwhelming evidence that the mass is exactly zero.
However, the mass-frequency identity offers a radically different and equally consistent interpretation. If mass is a manifestation of frequency ($m = \omega$ in natural units), then the “mass” of a photon is not a fixed, invariant property but is proportional to its observed frequency. The experiments that set the tightest limits do so by observing extremely low-frequency phenomena (e.g., the behavior of large-scale galactic magnetic fields, where the effective $\omega$ approaches zero). In this low-frequency regime, our framework predicts an infinitesimally small effective mass, which is precisely what is observed. Therefore, the experimental limits can be reinterpreted not as a failure to find a mass, but as a successful **measurement** of an extremely small, frequency-dependent mass, in perfect agreement with the identity $m=\omega$. This does not prove that photons have mass, but it demonstrates that the experimental data does not falsify the concept, and that the axiom of absolute masslessness is an unnecessary and potentially incorrect assumption. This process-based view of matter and energy as localized oscillations provides the necessary foundation for re-examining gravity not as a property of geometry, but as an interaction with the physical medium in which these oscillations occur.
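To make this reinterpretation concrete, the sketch below computes the frequency-dependent effective mass $h\nu/c^2$ implied by the identity. This is a reading specific to the present framework, not a Standard Model quantity, and the sample frequencies—including the ~$10^{-14}$ Hz timescale used here as a stand-in for large-scale galactic fields—are illustrative assumptions:

```python
# Illustrative only: frequency-dependent "effective mass" of a photon under
# the m = omega identity (not a Standard Model quantity). E = h*nu, in eV.
h  = 6.626_070_15e-34    # J·s, Planck constant
eV = 1.602_176_634e-19   # J per electronvolt

def effective_mass_eV(nu_hz: float) -> float:
    """Effective mass (eV/c^2) ascribed to a photon of frequency nu_hz."""
    return h * nu_hz / eV

pdg_limit = 1e-18  # eV/c^2, upper bound on photon mass (Tanabashi et al., 2018)

samples = [("optical, ~500 THz", 5e14),
           ("radio, 1 MHz", 1e6),
           ("galactic-field timescale, ~1e-14 Hz (illustrative)", 1e-14)]
for label, nu in samples:
    m_eff = effective_mass_eV(nu)
    side = "below" if m_eff < pdg_limit else "above"
    print(f"{label}: m_eff ~ {m_eff:.2e} eV/c^2 ({side} the PDG bound)")
```

Only the near-static regime falls below the experimental bound, mirroring the argument above that the tightest limits are set precisely where the effective $\omega$, and hence the predicted effective mass, approaches zero.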
---
## **Part III: Gravity as an Emergent Property of the Vacuum**
Having redefined the source of gravity as localized energy-frequency, we can now model its effects as an interaction with the quantum vacuum medium.
### **Section 6: A First-Principles Model of the Refractive Vacuum**
We begin with two postulates grounded in the nature of the vacuum as understood from electromagnetism:
1. **The Vacuum as a Physical Medium:** The quantum vacuum is not a void, but a physical medium with baseline electromagnetic properties (permittivity $\epsilon_0$ and permeability $\mu_0$) that define the propagation speed of light, $c$. This is supported by observable phenomena such as the Casimir effect and the Lamb shift, which demonstrate the vacuum’s energetic and dynamic nature.
2. **The Principle of Dual Response:** The presence of localized energy (mass) alters *both* the effective permittivity and permeability of the vacuum. We postulate that this response is symmetric, reflecting the inherent duality of the electromagnetic fields that constitute a photon. The total change in the medium’s refractive index, $n(r)$, is therefore a simple sum of these two equal contributions.
In a system of natural units, the dimensionless potential at a distance $r$ from a mass $M$ is $M/r$. We posit that the fundamental response of each component of the vacuum (the electric and magnetic aspects) is a linear change proportional to this potential, with a **unit coupling strength**. This means each aspect contributes a factor of 1 to the overall refractive index change. The total change in the refractive index is therefore the sum of these two equal responses:
$
n(r) - 1 = \Delta n_{\epsilon} + \Delta n_{\mu} = \left(1 \times \frac{M}{r}\right) + \left(1 \times \frac{M}{r}\right) = 2\frac{M}{r}
$
This leads to a parameter-free model for the refractive index of the vacuum in the presence of mass-energy:
$
n(r) = 1 + \frac{2M}{r}
$
This model is now fully specified from a single theoretical principle—the dual response of the vacuum medium—without fitting any constants to gravitational experiments.
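For scale, the model can be evaluated at the solar surface. The sketch below assumes the geometrized solar mass of roughly 1477 meters derived in the next section:

```python
# Sketch: refractive index of the vacuum near a mass, n(r) = 1 + 2M/r,
# with M the geometrized mass (a length) per the natural-unit convention above.
def n(r_m: float, M_m: float) -> float:
    """Refractive index at radius r_m (meters) for geometrized mass M_m (meters)."""
    return 1.0 + 2.0 * M_m / r_m

M_sun = 1477.0    # m, geometrized solar mass (derived in Section 7)
R_sun = 6.957e8   # m, solar radius

print(f"n at solar surface: {n(R_sun, M_sun):.9f}")  # ~1.000004246
```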
### **Section 7: Verification via Gravitational Lensing and the Emergent Nature of G**
This parameter-free model makes a clear, falsifiable prediction. Using Fermat’s principle of least time, the deflection angle $\theta$ for a light ray with an impact parameter $b$ passing through this medium is:
$
\theta = \int_{-\infty}^{\infty} \nabla_{\perp} \ln(n) \, dl = \frac{4M}{b}
$
This prediction is a pure, dimensionless statement: the deflection angle is four times the ratio of the lensing mass to the impact parameter, when both are expressed in the same natural units.
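Under the stated weak-field assumption $\ln n \approx 2M/r$, the integral can be evaluated explicitly along the unperturbed straight-line path, with $r = \sqrt{b^2 + l^2}$:
$
|\theta| = \int_{-\infty}^{\infty} \frac{2Mb}{(b^2 + l^2)^{3/2}} \, dl = \left[ \frac{2Ml}{b\sqrt{b^2 + l^2}} \right]_{-\infty}^{\infty} = \frac{4M}{b}
$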
To verify this model against observation without circular reasoning, we must obtain a value for the Sun’s mass in natural length units ($M_{\odot, meters}$) that is independent of the gravitational lensing experiment itself. Earth’s orbital mechanics provide exactly this: Kepler’s third law yields the heliocentric gravitational parameter $GM_{\odot}$ directly from the orbital period and semi-major axis, with no lensing input. The **geometrized mass** of an object ($GM/c^2$) is its mass expressed as a length. We use this definition to convert the Sun’s SI mass into its equivalent length:
$
M_{\odot, meters} = M_{\odot, kg} \times \frac{G}{c^2} \approx (1.989 \times 10^{30} \text{ kg}) \times (7.426 \times 10^{-28} \text{ m/kg}) \approx 1477 \text{ meters}
$
Now, we can make a direct, numerical prediction using our first-principles formula. Using the Sun’s radius as the impact parameter ($b_{\odot} \approx 6.957 \times 10^8 \text{ m}$):
$
\theta_{model} = \frac{4 M_{\odot, meters}}{b_{\odot, meters}} = \frac{4 \times 1477 \text{ m}}{6.957 \times 10^8 \text{ m}} \approx \textbf{8.49} \times 10^{-6} \text{ radians}
$
This value, equivalent to 1.75 arcseconds, is identical to the established prediction from General Relativity and is confirmed by observation to high precision (Dyson, Eddington & Davidson, 1920).
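The full chain of the calculation, together with a numerical quadrature check of the Fermat integral, can be reproduced in a few lines of Python. The sketch below uses rounded SI constants and is illustrative only:

```python
import numpy as np
from scipy.integrate import quad

# Rounded SI constants
G, c = 6.674e-11, 2.998e8             # m^3 kg^-1 s^-2, m/s
M_sun_kg, R_sun = 1.989e30, 6.957e8   # kg; m (solar radius as impact parameter)

# Geometrized solar mass: the Sun's mass expressed as a length
M = G * M_sun_kg / c**2               # ~1477 m

# First-principles prediction: theta = 4M/b
theta = 4 * M / R_sun
print(f"theta = {theta:.3e} rad = {theta * 206265:.3f} arcsec")  # ~8.49e-6 rad, ~1.75"

# Numerical check of the Fermat integral along the straight-line path,
# integrand 2*M*b / (b^2 + l^2)^(3/2) from the weak-field form of ln(n)
integrand = lambda l: 2 * M * R_sun / (R_sun**2 + l**2) ** 1.5
theta_quad, _ = quad(integrand, -np.inf, np.inf)
print(f"quadrature: {theta_quad:.3e} rad (matches 4M/b)")
```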
This successful, parameter-free prediction reveals the true nature of the gravitational constant $G$. The standard GR formula for deflection is $\theta_{GR} = \frac{4GM_{SI}}{b_{SI}c^2}$. Our derivation shows that the term $G/c^2$ is not fundamental. It is an **emergent conversion factor** whose sole function is to translate a mass specified in the arbitrary human unit of kilograms into its true, fundamental representation as a length. The underlying physics is captured by the dimensionless ratio of two lengths and the integer 4, which arises from the vacuum’s dual response to energy.
### **Section 8: Reinterpreting Gravitational Time Dilation**
A cornerstone of General Relativity’s empirical support is the phenomenon of gravitational time dilation—the observation that clocks run slower in stronger gravitational fields, an effect essential for the functioning of the Global Positioning System (GPS). The standard interpretation posits that time itself, as a dimension, is warped by gravity. However, the identity $L=T$ derived from the mass-frequency identity in natural units renders this interpretation problematic. If space and time are fundamentally unified, they cannot be warped independently.
This framework offers a more direct, physical explanation. A clock is fundamentally an oscillator. An atomic clock, our most precise timekeeping device, measures time by counting the cycles of a stable atomic transition. According to the mass-frequency identity, the energy levels that define this transition frequency are themselves manifestations of the stable, resonant patterns of the constituent particles. When a clock is placed in a region of higher energy density (i.e., a stronger gravitational potential), the properties of the surrounding vacuum medium are altered. This change in the medium’s refractive properties directly affects the resonant frequencies of the atomic oscillator. Therefore, the clock is observed to tick more slowly not because “the flow of time itself has changed,” but because **the physical process of oscillation has been altered by its interaction with the local vacuum environment.** This reinterpretation dissolves the metaphysical concept of “warped time” and replaces it with a concrete, physical mechanism that is fully consistent with our process-based ontology.
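As an order-of-magnitude check, the standard weak-field frequency shift between a ground clock and a clock at GPS altitude can be computed directly; in this framework the same number is read as a medium-induced change in oscillator frequency rather than as warped time. The sketch below uses rounded values for Earth’s gravitational parameter and an approximate GPS orbital radius, and includes only the gravitational potential term (the special-relativistic velocity term is a separate, smaller correction of opposite sign):

```python
# Gravitational (potential) part of the GPS clock-rate offset, weak-field formula:
# d(nu)/nu ~ (Phi_satellite - Phi_ground) / c^2, with Phi = -GM/r.
GM_earth = 3.986e14   # m^3/s^2, Earth's gravitational parameter
c        = 2.998e8    # m/s
r_ground = 6.371e6    # m, mean Earth radius
r_gps    = 2.656e7    # m, approximate GPS orbital radius

dnu_over_nu = GM_earth * (1 / r_ground - 1 / r_gps) / c**2
print(f"fractional shift: {dnu_over_nu:.2e}")                     # ~5.3e-10
print(f"per day: {dnu_over_nu * 86400 * 1e6:.1f} microseconds")   # ~45.7 us/day
```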
---
## **Part IV: Conclusion**
The foundational principles of 20th-century physics, while enormously successful, are built on an ontology of local, independent substances that has been empirically falsified. This paper has presented a revised understanding by re-examining these foundations in sequence, moving from the nature of mass to the mechanism of gravity.
We began by establishing the mass-frequency identity, $m_0 = \omega_0$, which redefines invariant mass as a measure of intrinsic, localized oscillation. This perspective, supported by the concept of *Zitterbewegung*, challenges the axiomatic masslessness of the photon and provides a physical basis for treating all matter and energy as wave-like phenomena.
Building on this process-based view of particles, we then demonstrated that a model of the quantum vacuum as a physical, refractive medium can be constructed from first principles. This model, which posits that the vacuum’s refractive index changes in proportion to the local energy potential with a dimensionless coupling constant of 2, correctly and quantitatively reproduces the observed deflection of light and provides a physical mechanism for gravitational time dilation. This approach reveals that the Newtonian constant $G$ is not a fundamental constant, but an emergent parameter that describes the vacuum’s inherent properties in human-defined units. This suggests that GR’s geometric interpretation is a powerful but potentially superfluous mathematical abstraction for a more fundamental physical process.
Together, these arguments, motivated by the empirical fact of quantum non-locality, point toward a process-based ontology where the universe is a single, dynamic medium. In this view, particles are stable resonant patterns, and fundamental forces, including gravity, are emergent properties of the medium’s dynamics. This framework resolves key contradictions, offers more intuitive explanations for observed phenomena, and provides a coherent path toward a truly unified theory of physics.
---
## **References**
Aspect, A., Dalibard, J., & Roger, G. (1982). Experimental Test of Bell’s Inequalities Using Time-Varying Analyzers. *Physical Review Letters*, 49(25), 1804–1807.
Dyson, F. W., Eddington, A. S., & Davidson, C. (1920). A Determination of the Deflection of Light by the Sun’s Gravitational Field, from Observations Made at the Total Eclipse of May 29, 1919. *Philosophical Transactions of the Royal Society of London. Series A*, 220, 291–333.
Muon g-2 Collaboration. (2021). Measurement of the Positive Muon Anomalous Magnetic Moment to 0.46 ppm. *Physical Review Letters*, 126(14), 141801.
Muon g-2 Collaboration. (2023). Measurement of the Positive Muon Anomalous Magnetic Moment to 0.20 ppm. *Physical Review Letters*, 131(16), 161802.
Peskin, M. E., & Schroeder, D. V. (1995). *An Introduction to Quantum Field Theory*. Addison-Wesley.
Planck Collaboration. (2020). Planck 2018 results. VI. Cosmological parameters. *Astronomy & Astrophysics*, 641, A6.
Rauch, D., et al. (2018). Cosmic Bell Test Using Random Measurement Settings from High-Redshift Quasars. *Physical Review Letters*, 121(8), 080403.
Riess, A. G., et al. (1998). Observational Evidence from Supernovae for an Accelerating Universe and a Cosmological Constant. *The Astronomical Journal*, 116(3), 1009–1038.
Riess, A. G., et al. (2022). A Comprehensive Measurement of the Local Value of the Hubble Constant with 1 km/s/Mpc Uncertainty from the Hubble Space Telescope and the SH0ES Team. *The Astrophysical Journal Letters*, 934(1), L7.
Rubin, V. C., & Ford, W. K., Jr. (1970). Rotation of the Andromeda Nebula from a Spectroscopic Survey of Emission Regions. *The Astrophysical Journal*, 159, 379.
Schrödinger, E. (1930). Über die kräftefreie Bewegung in der relativistischen Quantenmechanik. *Sitzungsberichte der Preussischen Akademie der Wissenschaften, Physikalisch-mathematische Klasse*, 24, 418–428.
Tanabashi, M., et al. (Particle Data Group). (2018). Review of Particle Physics. *Physical Review D*, 98(3), 030001.