## Informational Realism: The Structure and Derivation of Physical Reality
**Author:** Rowan Brad Quni-Gudzinas
**Affiliation:** QNFO
**Contact:** [email protected]
**ORCID:** 0009-0002-4317-5604
**ISNI:** 0000 0005 2645 6062
**DOI:** 10.5281/zenodo.17171020
**Publication Date:** 2025-09-21
**Version:** 1.0
This manuscript presents a comprehensive theory of Informational Realism, a form of Ontic Structural Realism which posits that information is the fundamental constituent of the universe. The framework is developed from first principles, tracing the philosophical evolution from substance-based metaphysics to structural realism. It is then formalized using axiomatic set theory (ZFC) to define the static universe of all possible informational states and category theory to describe the dynamic processes governing them. From a minimal set of axioms—the conservation of information (unitarity), the constraint on information processing (the Data Processing Inequality), and a formal definition of measurement—the theory derives the core phenomena of quantum mechanics. The measurement problem, quantum entanglement, the uncertainty principle, and wave-particle duality are shown to be necessary informational corollaries, not arbitrary features of a material substrate. The framework further demonstrates the emergence of spacetime from entanglement geometry and dynamics from quantum computational processes. It provides an axiomatic resolution to the black hole information paradox, defines the epistemological limits of scientific inquiry, and is distinguished from other quantum interpretations. The theory culminates in a concrete, falsifiable prediction regarding a quantized action floor for measurement, elevating it from a philosophical framework to a testable scientific program.
**Keywords:** Informational Realism, Ontic Structural Realism, Quantum Information Theory, Philosophy of Science, Category Theory, Quantum Foundations, Measurement Problem, Black Hole Information Paradox
---
### **1.0 Ontological Foundations of Informational Realism**
A comprehensive theory of physical reality requires a coherent ontological framework. The theory of **Informational Realism** proposes a fundamental shift from traditional substance-based metaphysics, arguing that the universe’s ultimate constituent is not matter or energy, but information itself. This section traces the philosophical evolution that leads to this position, defines its core tenets, and describes its key variants, thereby setting the stage for the theory’s formal mathematical and physical development.
#### **1.1 The Philosophical Evolution from Substance to Structure**
The proposal that reality is informational is the culmination of a long-standing philosophical refinement in response to challenges posed by the history of science. This evolution begins with scientific realism and, under the pressure of historical evidence, transforms into a defensible structuralist position from which Informational Realism emerges.
##### **1.1.1 The Tenets of Scientific Realism**
Traditional **scientific realism** is the stance that our best scientific theories describe a mind-independent world. This position rests on three core commitments: (1) a metaphysical commitment to a mind-independent reality; (2) a semantic commitment that theories are **truth-apt** statements about that reality; and (3) an epistemic commitment that mature, successful theories are **approximately true**. The “no-miracles argument” supports this epistemic claim, suggesting that the profound predictive success of theories would be inexplicable if they were not latching onto the real structure of the world.
##### **1.1.2 The Historical Challenge: The Pessimistic Meta-Induction**
Traditional scientific realism faces a powerful challenge from the historical record of scientific progress. This challenge, the **pessimistic meta-induction**, uses the history of science as inductive evidence against the truth claims of current theories. Its central premise is that the history of science is a graveyard of discarded ontologies (e.g., phlogiston, caloric, the luminiferous aether) from theories once considered empirically successful. The argument concludes by induction that our current theories’ ontologies, such as those of the Standard Model, are likely also false and will eventually be superseded.
##### **1.1.3 The Response: The Formulation of Structural Realism**
In response to the pessimistic meta-induction, a more nuanced position, **Structural Realism**, was developed. Its core insight, articulated by John Worrall (1989), is that while descriptions of the underlying *nature* of things (the ontology) are often discontinuous between successive theories, the *mathematical equations describing the relations* between phenomena are frequently preserved or subsumed. Structural Realism thus shifts the locus of realist commitment from the intrinsic nature of objects to the objective reality of the relational structures described by the mathematical formalism of our theories. This allows for realism about the structure of the world without commitment to a specific, and likely transient, ontology of objects.
##### **1.1.4 The Bifurcation of Structural Realism**
This structuralist turn evolved into two distinct positions. **Epistemic Structural Realism** (ESR) is a cautious variant maintaining that while objects (“relata”) may exist as the nodes in the relational structure, their intrinsic nature is fundamentally unknowable; our knowledge is limited to the structure of relations. **Ontic Structural Realism** (OSR), championed by James Ladyman (1998), is a more radical variant that makes the metaphysical claim that structure is all there is. Relations are ontologically primary, and what we perceive as “objects” are reducible to their place within this structural network.
#### **1.2 The Ontological Primacy of Information**
Building on this philosophical evolution, Informational Realism emerges as a specific, physically grounded form of Ontic Structural Realism. It provides a concrete candidate for the “structure” that OSR posits is fundamental.
This framework identifies the abstract structure with the concrete physical and mathematical concept of “information,” a position articulated by Luciano Floridi (2008). The laws of physics are re-interpreted as the rules governing the processing and transformation of this fundamental informational substrate. This identification profoundly inverts the traditional materialist hierarchy. In the classical view, information is a secondary property of a material substrate (e.g., ink on paper). In the informational view, the informational structure is primary, and what is perceived as a material substrate—such as an elementary particle—is a manifestation or stable pattern of underlying informational processes.
To make this concept rigorous, the theory defines reality’s basic constituents as **informational objects**. The framework of Object-Oriented Programming provides a powerful analogy: an object is defined not by its underlying substance but by its state (data) and the rules governing its interactions with other objects (methods). Applying this to physics, an electron is defined not by some underlying “stuff,” but by its properties (mass, charge, spin) and its lawful interactions. Its identity is entirely relational and dynamic.
#### **1.3 Key Variants of Informational Realism**
Two major variants offer complementary perspectives on this informational reality.
**Informational Structural Realism** (ISR), primarily associated with Luciano Floridi, provides a comprehensive descriptive ontology. Its main function is to map reality’s structural composition, providing a formal language for analyzing any system in terms of its informational objects and their relations. It posits that the ultimate ground of this structure is **dedomena**—mind-independent data existing as fundamental points of difference from which all information, and thus all reality, is constructed.
A second, generative variant is proposed within this manuscript, described as a **Fisher Information Field Theory** (FIFT). This conceptual model moves beyond a descriptive project to offer a generative account of how the informational universe evolves. It postulates dynamic information fields: a **J-field** as a universal reservoir of latent informational potential (pure possibility), an **I-field** as a transformative agent that actualizes this potential into definite structures, and a **K-field** as a stabilizing agent that ensures systemic coherence and persistence over time.
### **2.0 The Formal Mathematical Architecture**
After establishing the philosophical foundations, the theory requires a rigorous mathematical language. This architecture is twofold: a static set-theoretic foundation defines the universe of all possible informational states, and a dynamic categorical framework describes the processes that govern these states.
#### **2.1 The Static Universe of States: A Set-Theoretic Foundation**
To ground the theory in a paradox-free, universally accepted language, the framework adopts **Zermelo-Fraenkel set theory with the Axiom of Choice** (ZFC). ZFC provides the foundational object (the empty set), generative tools for constructing complex sets (pairing, union, power set), rules for defining subsets (specification), and axioms to enforce a well-founded structure that prevents paradoxes (regularity). The axioms of infinity and choice guarantee the existence of infinite sets and the choice functions required for the mathematics of quantum mechanics.
Using ZFC, we define the **Informational Census**, denoted $\Omega$, as the totality of all possible information, coextensive with the von Neumann cumulative hierarchy (the class of all hereditary well-founded sets). Because a “set of all sets” leads to Russell’s Paradox, $\Omega$ is a **proper class**—a collection too large to be a set itself.
Physical reality, however, corresponds to a distinguishable subset of this space. A physical measurement, with its finite resolution, acts as an **equivalence relation** ($\sim$) on $\Omega$ that partitions it into sets of indistinguishable microstates. The **Physical Poll**, denoted $\mathcal{E}$, is the quotient set $\Omega/\sim$, representing the coarse-grained reality accessible to observation. The Axiom of Replacement ensures that $\mathcal{E}$ is a well-defined set.
This set-theoretic coarse-graining has a profound consequence. Because any finite-resolution measurement groups many distinct microstates into a single equivalence class, the observable space is strictly smaller than the possibility space it partitions: $|\mathcal{E}| < |\Omega|$. The mapping from $\Omega$ to $\mathcal{E}$ is therefore many-to-one (surjective but not injective), which means information is necessarily lost in any observation. This unavoidable loss is the **Information Deficit**.
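To make the deficit concrete, the following minimal sketch uses a toy finite model (the real $\Omega$ is of course a proper class, and names such as `observe` are purely illustrative): a finite-resolution observation induces an equivalence relation whose quotient is strictly smaller than the set it partitions, so the fine-grained microstate cannot be recovered from the observed outcome.

```python
# Toy model of the Information Deficit: a finite stand-in for the Informational
# Census is coarse-grained by a finite-resolution "measurement" into the Physical
# Poll E = Omega / ~. All names here are illustrative.

# Microstates (a finite sample standing in for Omega).
omega = [0.12, 0.17, 0.48, 0.53, 0.91, 0.97]

# A detector of resolution 0.25 cannot tell apart microstates in the same bin:
# this induces the equivalence relation ~ (same bin <=> indistinguishable).
def observe(x, resolution=0.25):
    return round(x / resolution)

# The Physical Poll is the set of equivalence classes (observable outcomes).
poll = {observe(x) for x in omega}

print(len(omega), len(poll))   # 6 microstates collapse onto 4 outcomes
# The map omega -> poll is surjective but not injective: e.g. 0.48 and 0.53 are
# observationally identical, so their difference is irretrievably lost.
```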
#### **2.2 The Dynamic Logic of Processes: A Categorical Framework**
To describe dynamics, evolution, and interaction, the framework uses the language of **category theory**. The field of **Categorical Quantum Mechanics** (CQM), pioneered by Abramsky and Coecke (2004), provides a powerful, process-oriented reformulation of quantum theory where physical processes, not static states, are the fundamental primitives. Its central strength is its focus on **compositionality**, providing rigorous rules for combining processes sequentially (one after another) or in parallel (at the same time).
This perspective is formalized in the **Category of Information** (Inf). The objects of this category are sets of information states from the Physical Poll $\mathcal{E}$. The morphisms ($f: A \to B$) are information-processing channels that represent all physical processes, including interactions and time evolution. To capture the full structure of quantum physics, Inf is endowed with the structure of a **dagger symmetric monoidal category**.
- The **monoidal** structure allows for the description of multiple, composite systems via the tensor product ($\otimes$).
- The **symmetric** structure encodes the physical principle that the ordering of parallel systems is irrelevant.
- The **dagger** ($\dagger$) is an operation that maps a process to its adjoint, representing process reversal. It provides a formal link between processes like state preparation and their corresponding measurements.
This categorical framework has an associated graphical language of **string diagrams**, where systems are represented as wires and processes as boxes. This calculus allows for intuitive, topological reasoning about complex quantum interactions, abstracting away the cumbersome algebra of the standard Hilbert space formalism.
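The compositional core of this structure can be illustrated with a minimal sketch in which finite-dimensional linear maps model the category (the standard example, **FdHilb**): sequential composition is matrix multiplication, parallel composition is the Kronecker product, and the dagger is the adjoint. The helper names `compose`, `tensor`, and `dagger` are illustrative, not a library API.

```python
import numpy as np

def compose(g, f):        # sequential composition g . f  ("first f, then g")
    return g @ f

def tensor(f, g):         # parallel composition f (x) g on composite systems
    return np.kron(f, g)

def dagger(f):            # the dagger: the adjoint (conjugate transpose), i.e. process reversal
    return f.conj().T

# Two example processes on a qubit: a Hadamard and a phase gate.
f = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
g = np.array([[1, 0], [0, 1j]])

# The dagger reverses sequential composition and distributes over parallel composition.
assert np.allclose(dagger(compose(g, f)), compose(dagger(f), dagger(g)))
assert np.allclose(dagger(tensor(f, g)), tensor(dagger(f), dagger(g)))

# Symmetry: a swap morphism exchanges the two parallel wires.
SWAP = np.array([[1, 0, 0, 0], [0, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1]])
assert np.allclose(SWAP @ tensor(f, g) @ SWAP, tensor(g, f))
```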
### **3.0 The Axiomatic Principles of an Informational Universe**
Building on the mathematical architecture, the framework posits universal principles governing the conservation, processing, and observation of information. These axioms are presented as the fundamental laws of physics.
#### **3.1 Axiom I: The Conservation of Information**
The first axiom states that information in an isolated system is conserved. This is the informational equivalent of unitarity in quantum mechanics. Information content is quantified by **von Neumann entropy**. For a physical state represented by a density matrix $\rho$, the uncertainty is given by:
$$
S(\rho) = -\text{Tr}(\rho \log_2 \rho). \quad (3.1)
$$
A **pure** state, known with certainty, has zero entropy; a **mixed** state, representing a probabilistic ensemble, has positive entropy. The axiom formally states that for any isolated system undergoing a unitary transformation $U$, the von Neumann entropy of the state is conserved: $S(U\rho U^\dagger) = S(\rho)$. This axiom expresses the principles of **unitarity** and **reversibility** for all fundamental processes.
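As a purely numerical illustration of this axiom (a two-level toy sketch, not part of the formal development), the invariance of von Neumann entropy under an arbitrary unitary can be checked directly:

```python
import numpy as np
from scipy.linalg import expm

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]             # convention: 0 log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

# A mixed qubit state: the classical ensemble 75% |0>, 25% |1>.
rho = np.diag([0.75, 0.25])

# An arbitrary unitary U = exp(-iH) generated by a Hermitian matrix H.
H = np.array([[0.3, 0.4 - 0.2j], [0.4 + 0.2j, -0.1]])
U = expm(-1j * H)

rho_evolved = U @ rho @ U.conj().T
print(von_neumann_entropy(rho))              # ~0.811 bits
print(von_neumann_entropy(rho_evolved))      # the same value: S(U rho U^dagger) = S(rho)
```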
#### **3.2 Axiom II: The Constraint on Information Processing**
The second axiom elevates the **Data Processing Inequality** (DPI) from information theory to a fundamental physical law governing information flow. The axiom applies to any process that can be modeled as a **Markov chain** ($X \to Y \to Z$), where the future depends only on the present. The DPI states that for any such chain, the **mutual information** between the beginning and the end cannot exceed the mutual information between the beginning and any intermediate stage:
$$
I(X;Z) \le I(X;Y). \quad (3.2)
$$
The physical meaning of this axiom is profound: any local physical process can only preserve or degrade the information a system contains about its source; it can never create or amplify new information. This single principle of causality is sufficient to derive many core “rules” of quantum mechanics, including the **no-cloning theorem** and the prohibition of superluminal communication via entanglement.
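A minimal classical sketch makes the inequality tangible: passing a bit through two noisy channels in sequence ($X \to Y \to Z$) can only degrade the mutual information with the source. The helper functions below (`entropy`, `mutual_information`, `bsc`) are illustrative names, not a standard API.

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, float).ravel()
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from a joint probability table p(x, y)."""
    return entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0)) - entropy(pxy)

def bsc(p_flip):
    """Transition matrix of a binary symmetric channel."""
    return np.array([[1 - p_flip, p_flip], [p_flip, 1 - p_flip]])

pX = np.array([0.5, 0.5])                 # uniform binary source X
pXY = np.diag(pX) @ bsc(0.1)              # joint p(x, y): Y is X through a 10% flip channel
pXZ = pXY @ bsc(0.2)                      # joint p(x, z): Z is Y through a further 20% flip channel

print(mutual_information(pXY))            # I(X;Y) ~ 0.531 bits
print(mutual_information(pXZ))            # I(X;Z) ~ 0.173 bits -- degraded, never amplified
```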
#### **3.3 Axiom III: The Nature of Observation**
The third axiom provides a quantitative, physical definition of measurement. It defines measurement as any physical interaction that reduces an observer’s uncertainty about a system. This uncertainty is termed the **Information Deficit** ($\Delta I$) and is formally quantified by the **conditional entropy**:
$$
\Delta I \equiv H(X|Y) = S(XY) - S(Y). \quad (3.3)
$$
A **measurement** is any physical process that results in a reduction of this deficit: $H(X'|Y') < H(X|Y)$. This definition reveals a key signature of **quantum entanglement**: the possibility of negative conditional entropy. In quantum systems, $H(X|Y)$ can be negative, implying that the entropy of the whole system is less than the entropy of one of its parts ($S(XY) < S(Y)$). This is impossible for classical information, whose conditional entropy is always non-negative; it signifies that in entangled systems, information is stored non-locally in the correlations *between* the parts.
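The following sketch (illustrative helper names, two-qubit toy states) computes $H(X|Y) = S(XY) - S(Y)$ for a maximally entangled Bell pair and for a classically correlated mixture, exhibiting the negative value that signals non-local storage:

```python
import numpy as np

def S(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def reduce_to_Y(rho_XY):
    """Partial trace over X (the first qubit) of a two-qubit density matrix."""
    return rho_XY.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

# Bell state |phi+> = (|00> + |11>)/sqrt(2): a pure, maximally entangled pair.
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_bell = np.outer(phi, phi)

# Classically correlated mixture: 50% |00><00| + 50% |11><11|.
rho_classical = np.diag([0.5, 0.0, 0.0, 0.5])

for label, rho in (("entangled", rho_bell), ("classical", rho_classical)):
    deficit = S(rho) - S(reduce_to_Y(rho))      # H(X|Y) = S(XY) - S(Y)
    print(label, round(deficit, 3))             # entangled: -1.0   classical: 0.0
```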
### **4.0 The Derivation of Quantum Phenomena as Informational Corollaries**
The power of this axiomatic framework lies in its ability to derive the characteristic—and often paradoxical—features of quantum mechanics as necessary consequences of these fundamental laws of information.
#### **4.1 The Measurement Problem and “Wave Function Collapse”**
The measurement problem is resolved by demonstrating that “collapse” is not a fundamental, ad-hoc process distinct from unitary evolution. It is an emergent, irreversible thermodynamic phenomenon that is a necessary consequence of measurement (Axiom III) under the constraint of information conservation (Axiom I).
The proof proceeds by considering an isolated system comprising a quantum object $X$ and an observer/environment $Y$.
1. By Axiom I, the evolution of the total system $XY$ is unitary, and its total entropy $S(XY)$ is conserved.
2. By Axiom III, a measurement is an interaction that reduces the Information Deficit $H(X|Y) = S(XY) - S(Y)$.
3. For $H(X|Y)$ to decrease while $S(XY)$ remains constant, the entropy of the observer/environment, $S(Y)$, must necessarily increase.
4. By Landauer’s principle (1961), the logical erasure of information carries an unavoidable thermodynamic cost: each erased bit dissipates at least $k_B T \ln 2$ of heat, raising the entropy of the surroundings. The increase in $S(Y)$ corresponds to the irreversible erasure of the information about the unobserved outcomes of the superposition, which is dissipated as heat into the environment’s degrees of freedom.
This irreversible, information-erasing process, which appears non-unitary when considering only subsystem $X$, is precisely what is termed “wave function collapse.” It is derived here as the thermodynamic consequence of an information-acquiring interaction within a globally information-conserving universe. The **wave function** is thus reinterpreted as an objective field of potential information, and what appears as “**collapse**” is a two-stage process: first, **decoherence** (Zurek, 2003), where system information becomes entangled with the environment, followed by thermodynamic actualization, where one outcome is stabilized as a classical record. The **Quantum Toll Framework**, proposed herein, models this actualization as a thermodynamic phase transition.
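The entropy bookkeeping in steps 1–3 can be checked numerically in the smallest possible toy model: one object qubit in superposition, one environment qubit, and a unitary CNOT standing in for the pre-measurement interaction. This is an illustrative sketch, not a model of a realistic apparatus.

```python
import numpy as np

def S(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def reduce_to_Y(rho_XY):                       # trace out the object X (first qubit)
    return rho_XY.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

plus = np.array([1, 1]) / np.sqrt(2)           # object X: superposition (|0> + |1>)/sqrt(2)
zero = np.array([1, 0])                        # environment Y: ready state |0>
psi = np.kron(plus, zero)
rho_before = np.outer(psi, psi)

CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])  # unitary coupling
rho_after = CNOT @ rho_before @ CNOT.T

for label, rho in (("before", rho_before), ("after ", rho_after)):
    S_XY, S_Y = S(rho), S(reduce_to_Y(rho))
    print(label, "S(XY) =", round(S_XY, 3), " S(Y) =", round(S_Y, 3),
          " H(X|Y) =", round(S_XY - S_Y, 3))
# S(XY) stays at 0 (Axiom I), S(Y) rises from 0 to 1 bit, and the deficit H(X|Y)
# falls from 0 to -1: the measurement gain is paid for by environmental entropy.
```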
#### **4.2 Quantum Entanglement and Non-Locality**
The paradox of entanglement is resolved by rejecting its foundational premise of **separability**. An entangled pair is redefined as a single, **non-local informational object** whose state is holistically defined. The mathematical inseparability of the state vector reflects a real physical inseparability. As established in Axiom III, the signature of this state is negative conditional entropy, which proves that information is stored non-locally.
A local measurement on one part of the system is therefore an internal update of this unified structure, not a superluminal signal. The notion of “spooky action at a distance” is a category error arising from incorrectly applying a two-object model to what is fundamentally a single, non-local object. Axiom II (the DPI) rigorously guarantees that this internal update cannot be used for superluminal communication.
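A short sketch of this guarantee (illustrative names, the standard two-qubit Bell pair): whatever local measurement Alice performs on her half, Bob’s reduced density matrix, which is all he can access without her classical report, is unchanged.

```python
import numpy as np

def reduce_to_B(rho_AB):                       # trace out Alice's qubit (the first)
    return rho_AB.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

phi = np.array([1, 0, 0, 1]) / np.sqrt(2)      # shared Bell pair |phi+>
rho = np.outer(phi, phi)

I2 = np.eye(2)
P0 = np.diag([1.0, 0.0])                       # Alice's Z measurement: outcome 0 ...
P1 = np.diag([0.0, 1.0])                       # ... or outcome 1

rho_B_idle = reduce_to_B(rho)                  # Bob's state if Alice does nothing
rho_B_after = reduce_to_B(np.kron(P0, I2) @ rho @ np.kron(P0, I2)
                          + np.kron(P1, I2) @ rho @ np.kron(P1, I2))

print(np.allclose(rho_B_idle, rho_B_after))    # True: Bob sees no change, no signal
```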
#### **4.3 The Uncertainty Principle**
The **Heisenberg Uncertainty Principle** is reframed as a fundamental epistemological limit on information extraction. In CQM, an **observable** is an information-extraction process, formalized by the algebraic structure of a **commutative dagger Frobenius algebra**. **Complementary observables**, like position and momentum, correspond to algebraically incompatible algebras. The uncertainty principle emerges as a direct theorem from this formalism: the act of measuring one observable (i.e., projecting onto one basis) necessarily randomizes the state with respect to a complementary observable, creating an unavoidable trade-off in the knowledge an observer can acquire.
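The trade-off can be seen in the simplest case, sketched below: a qubit prepared with a definite $X$-basis value becomes maximally uncertain in that basis after a non-selective $Z$-basis measurement. The snippet is illustrative, using matrices rather than the Frobenius-algebra formalism itself.

```python
import numpy as np

plus = np.array([1, 1]) / np.sqrt(2)              # X-basis eigenstate |+>: X outcome is certain
rho = np.outer(plus, plus)

# Non-selective Z measurement: project onto |0>, |1> and discard the outcome.
projectors = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
rho_after = sum(p @ rho @ p for p in projectors)

Hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # rotate into the X basis to read it off
print(np.diag(Hadamard @ rho @ Hadamard))             # [1, 0]     -> X known with certainty
print(np.diag(Hadamard @ rho_after @ Hadamard))       # [0.5, 0.5] -> X completely randomized
```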
#### **4.4 Wave-Particle Duality**
The paradox of **wave-particle duality** is resolved through the methodological principle of **Levels of Abstraction** (LoA). The wave and particle aspects are not contradictory intrinsic properties but are different manifestations of an underlying informational object, revealed by the informational context of the experiment. An interference experiment operates at the “**wave LoA**,” probing the object’s delocalized, field-like behavior. A position-detection experiment operates at the “**particle LoA**,” probing a localized interaction. The object manifests different aspects of its nature depending on the questions asked of it.
### **5.0 The Emergence of Physical Reality from the Informational Substrate**
The framework culminates by showing how the macroscopic structures of reality—spacetime and dynamics—are not fundamental but are emergent properties of the underlying informational network.
#### **5.1 The Genesis of Spacetime**
**Spacetime** is not a pre-existing container but an emergent property of the universe’s entanglement patterns. The framework begins with the principle of **background independence**, asserting that all geometry must derive from relational structure. Geometric proximity between two systems is defined as a monotonically decreasing function of their mutual information ($d(A, B) \propto 1/I(A:B)^\alpha$), meaning highly entangled systems are geometrically “close.” This defines a pre-geometric network from which a continuous geometric space (a **manifold**) can be derived as the best-fit approximation. A causal, **light-cone structure** is then imprinted on this manifold using **causal categories**, which restrict parallel composition to space-like separated systems, thus turning the space into a spacetime, with causality governed by Axiom II.
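A toy version of this construction (a sketch under simple assumptions, with illustrative names) takes three binary systems forming a chain of noisy copies $A \to B \to C$, computes their pairwise mutual informations, and converts them to distances via $d = 1/I^\alpha$; the weakly correlated pair $A$, $C$ comes out farthest apart, recovering a line-like ordering from correlations alone.

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, float).ravel()
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

def mutual_info(joint):
    return entropy(joint.sum(1)) + entropy(joint.sum(0)) - entropy(joint)

def bsc(f):                                   # binary symmetric channel, flip probability f
    return np.array([[1 - f, f], [f, 1 - f]])

pAB = np.diag([0.5, 0.5]) @ bsc(0.1)          # B is a noisy copy of A
pBC = np.diag(pAB.sum(0)) @ bsc(0.1)          # C is a noisy copy of B
pAC = pAB @ bsc(0.1)                          # A and C correlate only through B

alpha = 1.0
for name, joint in (("A-B", pAB), ("B-C", pBC), ("A-C", pAC)):
    I = mutual_info(joint)
    print(name, "I =", round(I, 3), " d =", round(1 / I ** alpha, 3))
# d(A,B) ~ d(B,C) ~ 1.9 while d(A,C) ~ 3.1: the weakly correlated pair lies farthest
# apart, and B sits between A and C -- a line recovered from correlations alone.
```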
#### **5.2 The Genesis of Dynamics**
The laws of motion (**dynamics**) emerge from the computational evolution of the universe’s informational state. The universe’s evolution is modeled as a gigantic **quantum computation**. The smooth evolution described by the **Schrödinger equation** is seen as an effective, macroscopic approximation of an underlying discrete process, such as a **quantum cellular automaton** (QCA). This algorithmic view is formalized by **Constructor Theory** (Deutsch, 2013), which posits that fundamental laws are timeless statements about which physical transformations are possible versus impossible. Dynamics are a derived consequence of these timeless computational principles.
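The claim that smooth Schrödinger evolution can be the coarse limit of a discrete update rule can be illustrated with a single-qubit Trotter sketch (a toy, not a full quantum cellular automaton): a “tick” built from local, non-commuting pieces, iterated many times, converges to the continuous-time propagator.

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = X + 0.5 * Z                                  # two non-commuting "local" generators

def discrete_evolution(t, n_steps):
    """Iterate one automaton 'tick' built from the local pieces (a first-order Trotter step)."""
    dt = t / n_steps
    tick = expm(-1j * X * dt) @ expm(-1j * 0.5 * Z * dt)
    return np.linalg.matrix_power(tick, n_steps)

U_exact = expm(-1j * H * 1.0)                    # the continuous Schrodinger propagator at t = 1
for n in (10, 100, 1000):
    error = np.max(np.abs(discrete_evolution(1.0, n) - U_exact))
    print(n, error)                              # error shrinks ~1/n: smooth dynamics emerge in the limit
```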
### **6.0 Synthesis, Critique, and Scientific Status**
This final section applies the framework to the black hole information paradox, assesses its epistemological limits, and identifies its scientific status.
#### **6.1 A Unifying Test Case: The Black Hole Information Paradox**
The **black hole information paradox** arises from the conflict between semi-classical gravity, which suggests information is destroyed during black hole evaporation (Hawking, 1975), and quantum mechanics, which requires information to be conserved (Axiom I). Informational Realism resolves this axiomatically: information cannot be lost. The scientific task shifts from asking *if* information escapes to *how*. The mechanism is that information is scrambled and re-encoded in subtle, non-local correlations within the outgoing Hawking radiation. Recent gravitational calculations reproducing the **Page curve** (Page, 1993) show that the entanglement entropy of the radiation follows the precise behavior required for a unitary, information-preserving process. The information is practically inaccessible due to the exponential computational complexity required for its retrieval, but it is not fundamentally lost.
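The qualitative shape of the Page curve is easy to reproduce numerically (a statistical sketch using Haar-random pure states as a stand-in for the scrambled black hole plus radiation): the entanglement entropy of the first $k$ of $N$ qubits rises and then falls, the unitary profile the resolution requires.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8                                           # total qubits: "black hole" + "radiation"

def entanglement_entropy(psi, k):
    """Von Neumann entropy of the first k qubits of a pure state psi on N qubits."""
    schmidt = np.linalg.svd(psi.reshape(2 ** k, 2 ** (N - k)), compute_uv=False)
    p = schmidt ** 2
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

def random_pure_state():
    v = rng.normal(size=2 ** N) + 1j * rng.normal(size=2 ** N)
    return v / np.linalg.norm(v)

for k in range(N + 1):                          # k plays the role of "radiation emitted so far"
    avg = np.mean([entanglement_entropy(random_pure_state(), k) for _ in range(20)])
    print(k, round(avg, 2))                     # rises to ~min(k, N - k), then falls back to 0
```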
#### **6.2 Epistemological Constraints of the Framework**
The framework’s own axioms place fundamental limits on what can be known. Axiom II (the DPI) proves that any observation is an information-losing process, making the universe’s complete, fine-grained state (the Informational Census) fundamentally inaccessible. The goal of science is therefore not to find the “true” model but the best possible approximation, a task formalized as finding models that minimize the **Kullback-Leibler divergence** from the true data-generating process. This information loss also explains the practical necessity of **regularization** in all non-trivial data analysis.
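A minimal sketch of this formalization (illustrative distributions and names): given an inaccessible “true” distribution and several candidate coarse models, the preferred model is simply the one minimizing the Kullback-Leibler divergence.

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i log2(p_i / q_i), in bits."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

true_dist = np.array([0.5, 0.3, 0.2])           # the fine-grained process (inaccessible in practice)
models = {
    "uniform": np.array([1/3, 1/3, 1/3]),
    "skewed":  np.array([0.6, 0.3, 0.1]),
    "matched": np.array([0.5, 0.25, 0.25]),
}
for name, q in models.items():
    print(name, round(kl_divergence(true_dist, q), 4))
# uniform ~0.0995, skewed ~0.0685, matched ~0.0145: the best available model is the
# one that minimizes the divergence, not one that is "true".
```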
#### **6.3 Comparative Analysis with Other Quantum Interpretations**
Informational Realism is a thoroughly realist position. Unlike the **Copenhagen interpretation**, it provides a physical explanation for measurement instead of positing an arbitrary quantum-classical cut. Compared to the **Many-Worlds Interpretation** (MWI), it is more ontologically parsimonious, positing one universe of informational potentials rather than an infinite number of branching actual worlds. In direct opposition to **Quantum Bayesianism** (QBism), which treats information as a subjective belief, IR treats information as an objective, mind-independent feature of reality.
#### **6.4 Philosophical Critiques and Falsifiable Predictions**
The framework answers philosophical critiques such as the **Newman Problem** (the charge that structural claims are trivial) by grounding its structures in dynamic, constrained physical processes. It addresses the charge of being untestable metaphysics by positioning itself as a generative theory from which existing physics should be derivable as an effective approximation. Most importantly, the Quantum Toll model of measurement makes a concrete, **falsifiable prediction**: the formation of a stable, classical measurement outcome is a threshold event requiring a minimum, quantized exchange of physical action ($S = n\hbar$). Experiments probing interactions at extremely low action levels could provide a definitive test, elevating the theory from a philosophical framework to a testable scientific program.
---
### **References**
Abramsky, S., & Coecke, B. (2004). A categorical semantics of quantum protocols. In *Proceedings of the 19th Annual IEEE Symposium on Logic in Computer Science* (pp. 415–425). IEEE Computer Society. https://doi.org/10.1109/LICS.2004.1319612
Bekenstein, J. D. (1973). Black holes and entropy. *Physical Review D, 7*(8), 2333–2346. https://doi.org/10.1103/PhysRevD.7.2333
Bell, J. S. (1964). On the Einstein Podolsky Rosen paradox. *Physics Physique Fizika, 1*(3), 195–200. https://doi.org/10.1103/PhysicsPhysiqueFizika.1.195
Cover, T. M., & Thomas, J. A. (2006). *Elements of information theory* (2nd ed.). Wiley-Interscience.
Deutsch, D. (2013). Constructor theory. *Synthese, 190*(18), 4331–4359. https://doi.org/10.1007/s11229-013-0279-z
Einstein, A., Podolsky, B., & Rosen, N. (1935). Can quantum-mechanical description of physical reality be considered complete? *Physical Review, 47*(10), 777–780. https://doi.org/10.1103/PhysRev.47.777
Everett, H. (1957). “Relative State” formulation of quantum mechanics. *Reviews of Modern Physics, 29*(3), 454–462. https://doi.org/10.1103/RevModPhys.29.454
Floridi, L. (2008). A defence of informational structural realism. *Synthese, 161*(2), 219–253. https://doi.org/10.1007/s11229-007-9163-z
Hawking, S. W. (1975). Particle creation by black holes. *Communications in Mathematical Physics, 43*(3), 199–220. https://doi.org/10.1007/BF02345020
Ladyman, J. (1998). What is structural realism? *Studies in History and Philosophy of Science Part A, 29*(3), 409–424. https://doi.org/10.1016/S0039-3681(98)00021-9
Landauer, R. (1961). Irreversibility and heat generation in the computing process. *IBM Journal of Research and Development, 5*(3), 183–191. https://doi.org/10.1147/rd.53.0183
Nielsen, M. A., & Chuang, I. L. (2010). *Quantum computation and quantum information* (10th anniversary ed.). Cambridge University Press.
Page, D. N. (1993). Information in black hole radiation. *Physical Review Letters, 71*(23), 3743–3746. https://doi.org/10.1103/PhysRevLett.71.3743
Schrödinger, E. (1935). Die gegenwärtige Situation in der Quantenmechanik [The present situation in quantum mechanics]. *Naturwissenschaften, 23*(48), 807–812. https://doi.org/10.1007/BF01491891
Wheeler, J. A. (1990). Information, physics, quantum: The search for links. In W. H. Zurek (Ed.), *Complexity, entropy and the physics of information* (pp. 3–28). Addison-Wesley.
Worrall, J. (1989). Structural realism: The best of both worlds? *Dialectica, 43*(1–2), 99–124. https://doi.org/10.1111/j.1746-8361.1989.tb00933.x
Zurek, W. H. (2003). Decoherence, einselection, and the quantum origins of the classical. *Reviews of Modern Physics, 75*(3), 715–775. https://doi.org/10.1103/RevModPhys.75.715