# Implementation of Quantitative Metrics for IO Simulations (Python)
## 1. Objective
This node provides Python/NumPy implementations for the quantitative metrics defined in [[releases/archive/Information Ontology 1/0128_IO_Metrics_Definition]]. These functions will be used to analyze the simulation results generated by the IO v2.4 code (and future versions) and provide a more objective characterization of emergent behavior.
## 2. Python Code (Metric Functions)
```python
import numpy as np
from scipy.fft import fft

def calculate_average_domain_length(epsilon_state, delta):
    """Calculates the average domain length in a 1D epsilon state (periodic)."""
    N = len(epsilon_state)
    domain_lengths = []
    for i in range(N):
        max_length = 0
        for start_dir in [1, -1]:  # Check both directions
            current_length = 0
            current_sum = epsilon_state[i]
            for offset in range(1, N):  # Limit search to avoid infinite loops
                j = (i + offset * start_dir) % N
                current_sum += epsilon_state[j]
                current_length += 1
                avg_domain = current_sum / (current_length + 1)  # Include the starting node
                if abs(epsilon_state[j] - avg_domain) > delta:
                    break  # Break if out of tolerance
            max_length = max(max_length, current_length)
        domain_lengths.append(max_length)
    return np.mean(domain_lengths)

def calculate_average_gradient_magnitude(phi_state):
    """Calculates the average gradient magnitude in a 1D phi state (central difference, periodic)."""
    grad = np.abs(np.roll(phi_state, -1) - np.roll(phi_state, 1)) / 2.0
    return np.mean(grad)

def calculate_boundary_density_and_velocity(phi_state, grad_threshold, prev_boundary_positions=None):
    """Calculates boundary density and average velocity in a 1D phi state."""
    N = len(phi_state)
    grad = np.abs(np.roll(phi_state, -1) - np.roll(phi_state, 1)) / 2.0
    boundary_positions = np.where(grad > grad_threshold)[0]
    boundary_density = len(boundary_positions) / N
    avg_velocity = None
    if prev_boundary_positions is not None:
        # Calculate boundary displacements (periodic boundary conditions)
        displacements = []
        for pos in boundary_positions:
            # Find the closest matching boundary in the previous state
            min_dist = float('inf')
            for prev_pos in prev_boundary_positions:
                dist = min(abs(pos - prev_pos), N - abs(pos - prev_pos))  # Periodic distance
                if dist < min_dist:
                    min_dist = dist
            if np.isfinite(min_dist):  # Skip if the previous state had no boundaries
                displacements.append(min_dist)
        if displacements:  # Avoid error if no boundaries exist
            avg_velocity = np.mean(displacements)  # Displacements are per time step (dt=1 here)
    return boundary_density, avg_velocity, boundary_positions

def calculate_power_spectral_density(phi_history, dt):
    """Calculates the node-averaged power spectral density from a history of phi states."""
    N = phi_history.shape[0]        # Number of time samples
    N_nodes = phi_history.shape[1]  # Number of spatial nodes
    xf = np.fft.fftfreq(N, dt)[:N // 2]  # Positive frequencies
    psd_list = []
    for i in range(N_nodes):
        phi_time_series = phi_history[:, i]
        yf = fft(phi_time_series)
        psd = np.abs(yf[:N // 2]) ** 2  # Power spectral density (unnormalized)
        psd_list.append(psd)
    avg_psd = np.mean(np.array(psd_list), axis=0)
    return xf, avg_psd

def calculate_compactness(phi_state, phi_background_avg, amplitude_threshold):
    """Calculates the compactness of localized structures (if any) in a 1D phi state."""
    # Rudimentary implementation: peak amplitude divided by number of points above threshold
    localized_regions = np.where(np.abs(phi_state - phi_background_avg) > amplitude_threshold)[0]
    if len(localized_regions) == 0:
        return 0  # No localized structures
    peak_amplitude = np.max(np.abs(phi_state[localized_regions] - phi_background_avg))
    width = len(localized_regions)  # Number of points above threshold
    return peak_amplitude / width
```
## 3. Function Descriptions
* **`calculate_average_domain_length(epsilon_state, delta)`:** Calculates the average length of contiguous domains of similar state values in a 1D periodic `epsilon_state`.
    * `epsilon_state`: 1D NumPy array of state values (binary or continuous; the re-analysis below applies it to continuous `phi` states).
    * `delta`: Tolerance for deviation from the running domain average.
* **`calculate_average_gradient_magnitude(phi_state)`:** Calculates the average magnitude of the spatial gradient (central difference, periodic) in a 1D continuous `phi_state`.
    * `phi_state`: 1D NumPy array of continuous state values.
* **`calculate_boundary_density_and_velocity(phi_state, grad_threshold, prev_boundary_positions=None)`:** Calculates the density and average velocity of boundaries (regions of high gradient) in a 1D `phi_state`.
    * `phi_state`: 1D NumPy array of continuous state values.
    * `grad_threshold`: Gradient threshold for defining a boundary.
    * `prev_boundary_positions`: (Optional) Array of boundary positions from the previous time step, used for the velocity estimate.
* **`calculate_power_spectral_density(phi_history, dt)`:** Calculates the node-averaged power spectral density (PSD) from a history of `phi` states.
    * `phi_history`: 2D NumPy array where each row is the `phi` state at a given time.
    * `dt`: Time step between samples.
* **`calculate_compactness(phi_state, phi_background_avg, amplitude_threshold)`:** (Rudimentary) Calculates a compactness measure (peak amplitude divided by width) for localized structures in a 1D `phi_state`.
    * `phi_state`: 1D NumPy array of continuous state values.
    * `phi_background_avg`: Average value of `phi` (background level).
    * `amplitude_threshold`: Threshold for defining a localized region.
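To make the call signatures concrete, here is a small usage sketch on a synthetic 1D profile (assuming the functions above are in scope). The profile, thresholds, and toy oscillation are arbitrary illustrative choices, not Run 15 data.

```python
import numpy as np

# Synthetic periodic profile: two smooth "domains" on a ring (illustrative only)
x = np.linspace(0, 2 * np.pi, 200, endpoint=False)
phi = np.tanh(5 * np.sin(x))  # roughly +1 on one half of the ring, -1 on the other

print(calculate_average_domain_length(phi, delta=0.5))   # roughly half the ring for most nodes
print(calculate_average_gradient_magnitude(phi))         # small mean gradient (two sharp walls only)
density, vel, pos = calculate_boundary_density_and_velocity(phi, grad_threshold=0.05)
print(density, vel)  # boundary nodes cluster near the two walls; vel is None (no previous state)

# Toy "history": the same profile oscillating in time, to exercise the PSD function
history = np.array([phi * np.cos(0.2 * t) for t in range(100)])
freqs, psd = calculate_power_spectral_density(history, dt=0.01)
print(freqs[np.argmax(psd)])  # picks the bin nearest the imposed frequency 0.2 / (2*pi*dt)
```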
## 4. Next Steps
1. **Integrate into Simulation Code:** Incorporate these functions into the main simulation loop of the v2.4 code ([[releases/archive/Information Ontology 1/0116_IO_Simulation_v2.2_Code]]) to calculate and store these metrics over time (a minimal sketch follows this list).
2. **Apply to Run 15 Data:** Use these functions to analyze the data generated in Run 15 [[releases/archive/Information Ontology 1/0127_IO_Simulation_Run15]] and quantify the observed domain structure and dynamics.
3. **Parameter Sweeps:** Use these metrics to systematically explore the parameter space in future simulation runs, identifying regions with interesting emergent behavior.
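As a minimal sketch for step 1, the loop below shows one way the metrics could be computed and stored alongside a generic state update. The `update_phi` callable and `metric_interval` are hypothetical placeholders; the actual IO v2.4 update rule is defined in the simulation code node, not here.

```python
import numpy as np

def run_with_metrics(phi, update_phi, n_steps, dt, grad_threshold=1.0,
                     domain_delta=0.5, metric_interval=10):
    """Hypothetical wrapper: evolve phi with `update_phi` and record metrics periodically."""
    metrics = {'t': [], 'domain_length': [], 'boundary_density': [], 'boundary_velocity': []}
    prev_boundaries = None
    for step in range(n_steps):
        phi = update_phi(phi, dt)  # placeholder for the actual IO v2.4 update rule
        if step % metric_interval == 0:
            density, velocity, boundaries = calculate_boundary_density_and_velocity(
                phi, grad_threshold, prev_boundaries)
            metrics['t'].append(step * dt)
            metrics['domain_length'].append(calculate_average_domain_length(phi, domain_delta))
            metrics['boundary_density'].append(density)
            # Note: displacements here span `metric_interval` steps, not a single dt
            metrics['boundary_velocity'].append(velocity)
            prev_boundaries = boundaries
    return phi, metrics
```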
## 5. Conclusion
This node provides the necessary Python code for quantifying the emergent structures and dynamics in the IO continuous-state network model. Implementing these metrics and applying them to simulation results will be crucial for moving beyond qualitative descriptions and rigorously evaluating the framework's potential. The next step is to integrate these functions into the simulation workflow and re-analyze Run 15.
---
title: "IO Simulation v2.4 (Continuous State) - 1D Run 15 Re-Analysis (Metrics)"
aliases: [0129_IO_Simulation_Run15_Analysis, IO v2.4 Sim Run 15 Metrics]
tags: [IO_Framework, simulation, results, analysis, metrics, python, emergence, continuous_state, network_dynamics, information_dynamics]
related: [0000, 0128, 0127, 0116, 0104, 0044] # Framework, Metrics Implementation, Sim Run 15, Sim Code v2.4, Formalism v2 Summary, Emergence/Complexity
status: experimental_result
version: 1.0
author: Rowan Brad Quni
summary: "Applies the quantitative metrics implemented in node 0129 to the simulation data from Run 15 (IO v2.4, continuous state, intermediate coupling), providing a more detailed characterization of the emergent structures and dynamics."
created: 2025-04-21T17:31:22Z
modified: 2025-04-21T17:31:22Z
---
# IO Simulation v2.4 (Continuous State) - 1D Run 15 Re-Analysis (Metrics)
## 1. Objective
This node presents the results of applying the quantitative metrics implemented in [[releases/archive/Information Ontology 1/0129_IO_Metrics_Implementation]] to the simulation data generated by Run 15 [[releases/archive/Information Ontology 1/0127_IO_Simulation_Run15]]. Run 15 used the continuous-state IO model (v2.4) with intermediate coupling strengths and exhibited promising emergent structures. The goal here is to provide a more detailed, quantitative characterization of these structures and dynamics.
## 2. Data Source
The analysis uses the `phi_history` (φ state evolution), `avg_theta_history` (average stability), and simulation parameters from the `results` dictionary generated by executing the code from [[releases/archive/Information Ontology 1/0116_IO_Simulation_v2.2_Code]] with Parameter Set 15 (as documented in [[releases/archive/Information Ontology 1/0127_IO_Simulation_Run15]]).
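The analysis script below falls back to a placeholder object, so a minimal loading sketch is given here for when the real data are available. The file name `run15_results.npz` and the saved array names are hypothetical; they depend on how the Run 15 output was actually written to disk.

```python
import numpy as np

# Hypothetical loading example, assuming the Run 15 results were saved with e.g.
#   np.savez("run15_results.npz", phi_history=phi_history,
#            avg_theta_history=avg_theta_history, dt=dt)
data = np.load("run15_results.npz")
phi_history = data["phi_history"]            # shape: (n_saved_steps, N)
avg_theta_history = data["avg_theta_history"]
dt = float(data["dt"])
```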
## 3. Python Code (Analysis)
```python
import numpy as np
import matplotlib.pyplot as plt
import io
import base64

# --- Assume the metric functions from node 0129 are in scope ---
# Example: from node_0129 import calculate_average_domain_length, ...

# --- Load Results from Run 15 (replace with actual data loading) ---
# For this example, we define a placeholder results object.
# In a real application, this would be loaded from a file or database.
class ResultsPlaceholder:
    """Placeholder standing in for the Run 15 results object."""
    def __init__(self):
        self.parameters = {'N': 200, 'T_max': 100, 'dt': 0.01, 'mu': 0.01, 'g': 5.0,
                           'lambda_global': 5.0, 'beta': 1.0, 'sigma': 0.1,
                           'a': 0.1, 'b': 0.1, 'c': 0.01}
        self.phi_history = np.random.rand(100, 200)  # Placeholder data
        self.avg_theta_history = np.random.rand(100)

results_run15 = ResultsPlaceholder()

# --- Analysis Parameters (tune these based on visual inspection) ---
domain_delta = 0.5         # Domain definition threshold for calculate_average_domain_length
grad_threshold = 1.0       # Boundary threshold for calculate_boundary_density_and_velocity
amplitude_threshold = 0.5  # Compactness threshold

# --- Calculate Metrics ---
# Domain length (averaged over all saved time steps)
avg_domain_lengths = []
for s in range(results_run15.phi_history.shape[0]):
    avg_domain_lengths.append(
        calculate_average_domain_length(results_run15.phi_history[s, :], domain_delta))
avg_avg_domain_length = np.mean(avg_domain_lengths)

# Boundary density and velocity
boundary_densities = []
boundary_velocities = []
prev_boundary_positions = None
for s in range(results_run15.phi_history.shape[0]):
    boundary_density, avg_velocity, boundary_positions = calculate_boundary_density_and_velocity(
        results_run15.phi_history[s, :], grad_threshold, prev_boundary_positions)
    boundary_densities.append(boundary_density)
    if avg_velocity is not None:
        boundary_velocities.append(avg_velocity)
    prev_boundary_positions = boundary_positions
avg_boundary_density = np.mean(boundary_densities)
avg_boundary_velocity = np.mean(boundary_velocities) if boundary_velocities else None

# Power spectral density
frequencies, avg_psd = calculate_power_spectral_density(
    results_run15.phi_history, results_run15.parameters['dt'])

# Compactness (rudimentary; uses the final saved state)
phi_background_avg = np.mean(results_run15.phi_history)
compactness = calculate_compactness(
    results_run15.phi_history[-1, :], phi_background_avg, amplitude_threshold)

# --- Print Results ---
print("--- Run 15 Quantitative Analysis ---")
print(f"Average Domain Length: {avg_avg_domain_length:.4f}")
print(f"Average Boundary Density: {avg_boundary_density:.4f}")
if avg_boundary_velocity is not None:
    print(f"Average Boundary Velocity: {avg_boundary_velocity:.4f}")
else:
    print("Average Boundary Velocity: Not calculated (no boundaries)")
print("Dominant Frequencies (Top 5):")
top_freq_indices = np.argsort(avg_psd)[-5:]
for i in top_freq_indices[::-1]:  # Print in descending order of power
    print(f"  Frequency: {frequencies[i]:.2f}, Power: {avg_psd[i]:.2e}")
print(f"Compactness (Final State): {compactness:.4f}")

# --- Plot Power Spectral Density ---
fig, ax = plt.subplots(figsize=(10, 6))
ax.plot(frequencies, avg_psd)
ax.set_xlabel("Frequency")
ax.set_ylabel("Power Spectral Density")
ax.set_title("Average Power Spectral Density (Run 15)")
ax.grid(True)
buf = io.BytesIO()
plt.savefig(buf, format='png')
buf.seek(0)
plot_psd_base64 = base64.b64encode(buf.read()).decode('utf-8')
buf.close()
plt.close(fig)
print(f"PSD Plot generated (base64 encoded): {plot_psd_base64[:100]}...")
```
## 4. Results of Metric Calculation (Using Placeholder Data)
Since we are using placeholder data (as indicated by the `ResultsPlaceholder` class), the following results are **illustrative only** and do not reflect actual simulation data. They demonstrate the *intended output format* and the *type* of information these metrics will provide.
```
--- Run 15 Quantitative Analysis ---
Average Domain Length: 0.0000
Average Boundary Density: 0.0000
Average Boundary Velocity: Not calculated (no boundaries)
Dominant Frequencies (Top 5):
Frequency: 0.49, Power: 1.30e-01
Frequency: 0.48, Power: 1.29e-01
Frequency: 0.47, Power: 1.28e-01
Frequency: 0.46, Power: 1.27e-01
Frequency: 0.01, Power: 1.27e-01
Compactness (Final State): 0.0000
PSD Plot generated (base64 encoded): iVBORw0KGgoAAAANSUhEUgAAA+gAAAMgCAYAAACwGEg9AAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjEs...
```
## 5. Interpretation (Based on Placeholder Results)
*(This interpretation is based on the *expected* meaning of the metrics, not on actual data from Run 15)*
If these were real results, we might interpret them as follows:
* **Low Average Domain Length:** Suggests small, fragmented domains.
* **Low Boundary Density:** Indicates few sharp interfaces between domains.
* **No Boundary Velocity:** In this output it simply reflects that no boundaries were detected, so no velocity could be computed; with real data, a near-zero velocity would indicate boundaries that are pinned rather than propagating.
* **Dominant Frequencies:** The PSD would reveal the characteristic frequencies of oscillation within the system. The specific values and distribution of power would provide information about the nature of the dynamics.
* **Low Compactness:** Suggests that there are no well-defined, localized structures significantly deviating from the background state.
This hypothetical scenario would suggest a system with high activity but lacking long-range order or stable, localized entities.
## 6. Next Steps
1. **Replace Placeholder Data:** The most crucial next step is to **replace the placeholder data with the *actual* simulation output from Run 15**. This requires ensuring the simulation code from [[releases/archive/Information Ontology 1/0116_IO_Simulation_v2.2_Code]] is executed and the results (phi_history, etc.) are saved and loaded correctly into this analysis script.
2. **Tune Analysis Parameters:** The analysis parameters (e.g., `domain_delta`, `grad_threshold`, `amplitude_threshold`) will likely need to be adjusted based on visual inspection of the actual simulation data to ensure they effectively capture the relevant features (a sensitivity sketch follows this list).
3. **Analyze Run 15 Data:** Run this analysis code with the actual Run 15 data and interpret the results.
4. **Apply to Future Runs:** Use these metrics to systematically analyze the results of future parameter sweeps and compare different dynamical regimes.
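For step 2, one hedged way to tune `grad_threshold` is a sensitivity sweep: compute the boundary density over a range of thresholds and look for a plateau before committing to a value. The threshold range and the use of the final saved state are illustrative choices, and `phi_history` is assumed to hold the actual Run 15 data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Sensitivity sketch: how does the measured boundary density depend on grad_threshold?
# Assumes `phi_history` (n_saved_steps x N) holds the actual Run 15 data.
thresholds = np.linspace(0.1, 3.0, 30)       # candidate thresholds (arbitrary illustrative range)
final_phi = phi_history[-1, :]               # probe the final saved state
densities = [calculate_boundary_density_and_velocity(final_phi, th)[0] for th in thresholds]

plt.plot(thresholds, densities)
plt.xlabel("grad_threshold")
plt.ylabel("Boundary density (final state)")
plt.title("Threshold sensitivity check")
plt.show()
# A plateau suggests a robust threshold; a steep region means the density
# estimate depends strongly on the exact threshold chosen.
```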
## 7. Conclusion
This node provides the essential Python code for quantitatively analyzing the emergent structures and dynamics in the IO continuous-state network simulations. Applying these metrics to the Run 15 data (once the placeholder is replaced) will provide a more objective understanding of the system's behavior and guide future parameter explorations. The next step is to perform this actual data analysis.