Unveiling Dynamic Holographic Changes with Bayesian Inference

Holography is a fascinating field, capturing 3D information about objects. But what happens when the holographic data itself is dynamic, changing over time? How can we automatically detect and analyze these changes as they occur? This blog post dives into a Python program that leverages Bayesian inference to continuously monitor a "holographic file" and draw conclusions based on its evolving properties.

The Challenge: Dynamic Data Analysis

Imagine a holographic sensor continuously recording data, or a holographic storage medium undergoing subtle physical changes. Manually inspecting vast amounts of data for shifts in properties like intensity, phase, or diffraction efficiency is impractical. We need an intelligent system that can learn from the data, adapt to new information, and flag significant deviations. This is where Bayesian inference shines.

Find the programs here:

Interintel/Bayesian-Holography: Simple Python apps that show Bayesian inference against a dynamic holographic file

What My Program Does

My Python script acts as a continuous monitor for a file named holo.npy. It performs the following key functions:

  1. File Watching: It periodically checks holo.npy for modifications.

  2. Data Loading & Analysis: If the file changes, it reloads the entire dataset.

  3. Bayesian Inference: It applies sequential Bayesian inference to estimate the underlying mean of the holographic data.

  4. Change Detection: It heuristically identifies significant shifts in this inferred mean, indicating a change in the holographic file's characteristics.

  5. Continuous Plotting: It updates a live plot, showing the raw data, the inferred mean, and its uncertainty, allowing for real-time visualization of the analysis.

Core Concepts Explained

1. Bayesian Inference: Updating Our Beliefs

At its heart, Bayesian inference is about updating our beliefs (probabilities) about a hypothesis as new evidence becomes available. In our case, the "hypothesis" is the true mean value of the holographic data, and the "evidence" is each new observation from the holo.npy file.

  • Prior Distribution: Before seeing any data, we have an initial belief about the mean, represented by a "prior" probability distribution (e.g., a Normal distribution with a prior_mean and prior_variance).

  • Likelihood: Each new observation tells us how likely that observation is, given a particular mean value (this is the "likelihood"). We assume our observations are normally distributed around the true mean with a known variance.

  • Posterior Distribution: Combining the prior belief with the likelihood of the new data gives us a "posterior" distribution. This posterior becomes our new prior for the next observation, allowing for sequential updates.

For simplicity, we use a conjugate prior (Normal prior for a Normal likelihood with known variance). This means the posterior distribution is also Normal, making the mathematical updates straightforward and analytical.
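
Concretely, if our current belief about the mean is Normal with mean mu_0 and variance tau_0_sq, and a new observation x is assumed Normal around the true mean with known variance sigma_sq, the standard conjugate update gives a posterior that is again Normal:

    tau_1_sq = 1 / (1/tau_0_sq + 1/sigma_sq)
    mu_1     = tau_1_sq * (mu_0/tau_0_sq + x/sigma_sq)

In words: precisions (inverse variances) add, and the posterior mean is a precision-weighted average of the prior mean and the new observation. The posterior (mu_1, tau_1_sq) then becomes the prior for the next observation.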

2. Change Point Detection (Heuristic)

While sophisticated Bayesian change point detection algorithms exist, our program employs a simpler heuristic:

  • It continuously tracks the posterior mean of the data.

  • If the current posterior mean deviates significantly (e.g., by more than 1.5 standard deviations) from the mean estimated a few observations ago, it flags a "potential change." This helps identify sustained shifts rather than just noisy fluctuations.
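
A minimal sketch of this heuristic in Python (the constants and names here are illustrative, not necessarily those used in the repository):

    LAG = 10         # how many observations back to compare against (illustrative)
    THRESHOLD = 1.5  # "significant" shift, measured in posterior standard deviations

    def looks_like_change(posterior_means, posterior_stds, i, lag=LAG, threshold=THRESHOLD):
        """Flag a potential change at index i if the current posterior mean has
        drifted more than `threshold` standard deviations from the mean
        estimated `lag` observations earlier."""
        if i < lag:
            return False
        shift = abs(posterior_means[i] - posterior_means[i - lag])
        return shift > threshold * posterior_stds[i]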

3. File Monitoring and Continuous Plotting

To make the analysis dynamic, the script uses:

  • os.path.getmtime(filename): This function retrieves the last modification time of the holo.npy file. By comparing the current modification time with the last recorded one, the program knows if the file has been updated.

  • time.sleep(5): This pauses the script for 5 seconds between checks, defining our monitoring interval.

  • matplotlib.pyplot.ion(): This command turns on "interactive mode" for Matplotlib, allowing plots to be updated without closing and reopening the window.

  • ax.clear(): Before re-plotting, the axes are cleared to remove old data.

  • fig.canvas.draw_idle() and fig.canvas.flush_events(): These commands tell Matplotlib to redraw the canvas and process any pending GUI events, ensuring the plot updates visually.

  • plt.pause(0.1): A small pause is necessary in interactive mode to allow the GUI event loop to run and the plot to refresh.
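
In isolation, the interactive-plotting pattern looks like this (a minimal, self-contained sketch; the real script redraws its Bayesian analysis instead of random placeholder data):

    import numpy as np
    import matplotlib.pyplot as plt

    plt.ion()                          # interactive mode: the window stays responsive
    fig, ax = plt.subplots()

    for step in range(10):             # stand-in for the 5-second monitoring loop
        ax.clear()                     # wipe the previous frame
        ax.plot(np.random.normal(size=50), "b.")  # placeholder for the real analysis
        ax.set_title(f"Refresh {step}")
        fig.canvas.draw_idle()         # request a redraw of the canvas
        fig.canvas.flush_events()      # process pending GUI events
        plt.pause(0.1)                 # let the GUI event loop run so the plot refreshes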

Program Flow

The overall flow, written out in Mermaid-style flowchart notation:


    A[Start Program] --> B{Holo.npy Exists?};
    B -- No --> C[Create Sample holo.npy];
    B -- Yes --> D[Get Initial File Mod Time];
    C --> D;
    D --> E[Enter Monitoring Loop];
    E --> F{Check File Mod Time};
    F -- Unchanged --> G[Wait 5 Seconds];
    F -- Changed or New --> H[Clear Plots];
    H --> I[Load Holo.npy Data];
    I --> J[Initialize Bayesian Updater];
    J --> K[Process Each Observation Sequentially];
    K -- Update Posterior Mean/Std Dev --> L{Detect Change?};
    L -- Yes --> M[Print Change Alert];
    L -- No --> K;
    K --> N[Update Plot with New Data/Inference];
    N --> O[Print Final Conclusions];
    O --> P[Update Last Modified Time];
    P --> G;
    G --> E;
    E -- User Interrupt --> Q[Stop Monitoring];

Simplified Flow:

  1. Setup: Initialize the plot, create holo.npy if missing, and record its initial modification time. Note that the companion program simulate_holo_writerv2.py continuously modifies the holographic file holo.npy.

  2. Loop (every 5 seconds):

    • Check holo.npy's modification time.

    • If modified:

      • Clear the existing plot.

      • Reload the entire holo.npy data.

      • Re-initialize the Bayesian model.

      • Process each data point sequentially:

        • Update the posterior mean and standard deviation.

        • Heuristically check for significant shifts in the mean and print alerts.

      • Update the plot with the new observations, inferred mean, and credible intervals.

      • Update the recorded modification time.

    • If unchanged: Print a "file unchanged" message.

  3. Repeat: Continue looping until manually stopped (e.g., by pressing Ctrl+C).

Code Structure Breakdown

Let's look at the main components of the Python script:

BayesianNormalMeanUpdater Class

This class encapsulates the core Bayesian logic.

  • __init__(prior_mean, prior_variance, observation_variance): Sets up our initial beliefs about the mean and the assumed noise in our observations.

  • update(observation): This is the crucial method. For each new observation, it applies the Bayesian update rules to refine our mu_n (posterior mean) and tau_n_sq (posterior variance).

  • get_posterior(): Returns the current best estimate of the mean and its uncertainty.
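
A minimal version of such an updater might look like this (a sketch of the conjugate update described above; the repository's implementation may differ in details):

    class BayesianNormalMeanUpdater:
        """Sequential conjugate (Normal-Normal) estimation of an unknown mean,
        assuming a known observation variance."""

        def __init__(self, prior_mean, prior_variance, observation_variance):
            self.mu_n = prior_mean          # current posterior mean
            self.tau_n_sq = prior_variance  # current posterior variance
            self.sigma_sq = observation_variance

        def update(self, observation):
            # Precisions (inverse variances) add; the posterior mean is a
            # precision-weighted average of the old mean and the new observation.
            precision = 1.0 / self.tau_n_sq + 1.0 / self.sigma_sq
            new_variance = 1.0 / precision
            new_mean = new_variance * (self.mu_n / self.tau_n_sq +
                                       observation / self.sigma_sq)
            self.mu_n, self.tau_n_sq = new_mean, new_variance

        def get_posterior(self):
            # Current estimate of the mean and its standard deviation.
            return self.mu_n, self.tau_n_sq ** 0.5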

create_sample_holo_file() Function

This helper function is for demonstration. It generates a holo.npy file with simulated data, including a predefined "change point" where the underlying mean of the data shifts. This allows you to easily test the change detection capabilities of the program.

run_bayesian_hologram_analysis_from_file() Function

This is the main analysis engine.

  • It loads the holo.npy data.

  • It initializes the BayesianNormalMeanUpdater.

  • It iterates through each observation in the loaded data, calling updater.update(obs) to sequentially process the information.

  • It collects the posterior_means and posterior_stds over time.

  • It implements the dynamic change detection logic, comparing the current inferred mean to a recent past mean, relative to the uncertainty.

  • It handles the plotting, clearing previous lines and redrawing with the latest inferred states and detected changes.
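
Stripped of plotting, the analysis loop reduces to something like this (a simplified sketch that reuses the BayesianNormalMeanUpdater and looks_like_change sketches above; the prior and noise values are placeholders, not the repository's defaults):

    import numpy as np

    def analyze_holo_data(filename="holo.npy", lag=10, threshold=1.5):
        """Sequentially process the file's observations, tracking the posterior
        and flagging indices where the inferred mean shifts noticeably."""
        data = np.load(filename)
        updater = BayesianNormalMeanUpdater(prior_mean=0.0,
                                            prior_variance=100.0,  # weak prior
                                            observation_variance=4.0)
        posterior_means, posterior_stds, change_points = [], [], []

        for i, obs in enumerate(data):
            updater.update(obs)
            mean, std = updater.get_posterior()
            posterior_means.append(mean)
            posterior_stds.append(std)
            if looks_like_change(posterior_means, posterior_stds, i, lag, threshold):
                change_points.append(i)
                print(f"Detected potential change around observation {i}")

        return posterior_means, posterior_stds, change_points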

Main Execution Block (if __name__ == "__main__":)

This block orchestrates the continuous monitoring:

  • It sets up Matplotlib's interactive mode (plt.ion()) and creates the initial plot figure and axes.

  • It checks for holo.npy and creates a sample if it doesn't exist.

  • It enters an infinite while True loop:

    • Inside the loop, it gets the current_modified_time of the file.

    • If the file has been modified (or is new), it calls run_bayesian_hologram_analysis_from_file() to re-analyze the data and update the plot.

    • fig.canvas.draw_idle() and fig.canvas.flush_events() are critical here for forcing the plot to refresh.

    • It then time.sleep() for the remaining duration of the 5-second interval.

  • A try-except KeyboardInterrupt block allows you to stop the monitoring by pressing Ctrl+C.
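
Putting the pieces together, the orchestration looks roughly like this (a sketch; the call signature of run_bayesian_hologram_analysis_from_file and the exact console messages are assumptions, not copied from the repository):

    import os
    import time
    import matplotlib.pyplot as plt

    FILENAME = "holo.npy"
    CHECK_INTERVAL = 5  # seconds between file checks

    if __name__ == "__main__":
        plt.ion()
        fig, ax = plt.subplots()

        if not os.path.exists(FILENAME):
            create_sample_holo_file(FILENAME)  # helper described earlier

        last_modified_time = 0.0
        try:
            while True:
                current_modified_time = os.path.getmtime(FILENAME)
                if current_modified_time != last_modified_time:
                    ax.clear()
                    run_bayesian_hologram_analysis_from_file(FILENAME, ax)  # signature assumed
                    fig.canvas.draw_idle()
                    fig.canvas.flush_events()
                    last_modified_time = current_modified_time
                else:
                    print("holo.npy unchanged; waiting...")
                plt.pause(0.1)
                time.sleep(CHECK_INTERVAL)
        except KeyboardInterrupt:
            print("Monitoring stopped by user.")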

How to Use It

  1. Save the Code: Save the provided Python code as a .py file (e.g., hologram_monitor.py).

  2. Run from Terminal: Open a terminal or command prompt, navigate to the directory where you saved the file, and run:

    python hologram_monitor.py
    
  3. Observe:

    • A Matplotlib plot window will appear, showing the initial analysis.

    • The console will print updates about the analysis and file checks.

  4. Simulate Changes: While the program is running, you can manually edit or replace the holo.npy file. For example, you could run the create_sample_holo_file() function again in a separate Python interpreter or script, perhaps with different new_mean values, to simulate new data.

    # In a separate Python session or script:
    import numpy as np
    # Assuming create_sample_holo_file is available or copied here
    def create_sample_holo_file(filename="holo.npy", num_points=200, initial_mean=10.0, change_point=100, new_mean=15.0, observation_std_dev=2.0):
        data = np.zeros(num_points)
        data[:change_point] = np.random.normal(initial_mean, observation_std_dev, change_point)
        data[change_point:] = np.random.normal(new_mean, observation_std_dev, num_points - change_point)
        np.save(filename, data)
        print(f"Updated '{filename}' with new data.")
    
    # Example: Change the mean again after some time
    create_sample_holo_file(initial_mean=15.0, new_mean=20.0, change_point=100)
    

    When you save the new holo.npy, the hologram_monitor.py script will detect the modification, reload the data, and update the plots and console output.

Interpreting the Results

  • Console Output: Look for the "Detected Potential Change" messages. These indicate when the Bayesian model identified a statistically significant shift in the underlying mean of your holographic data. The reported "standard deviations of change" quantify how strong that shift was.

  • Top Plot (Holographic Observations and Inferred Mean):

    • The blue dots are your raw data points.

    • The green line is the inferred posterior mean. Watch how this line adapts to the data. When a change occurs, you'll see it shift from its previous stable value to a new one.

    • The shaded green area represents the 95% credible interval. This shows the uncertainty in our estimate of the mean. It will typically narrow as more data confirms a stable mean, and might briefly widen during a change as the model becomes less certain before converging on the new mean (see the note after this list for how the band is computed).

  • Bottom Plot (Evolution of Posterior Mean and Uncertainty):

    • This plot provides a clearer view of the posterior mean's evolution.

    • The orange dashed lines mark the points where the program detected a change.
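
A quick note on the credible interval from the top plot: for a Normal posterior, the 95% band is simply the posterior mean ± 1.96 × the posterior standard deviation (some implementations round this to ±2 standard deviations; either convention covers roughly 95% of the posterior probability).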

Conclusion

This Python program provides a robust and intuitive way to monitor dynamic holographic files (or any sequential numerical data) for changes using Bayesian inference. By continuously updating our beliefs about the data's underlying properties, we can automatically detect and analyze shifts, providing valuable insights into the behavior of dynamic systems.

While this implementation uses a simple model, it lays the groundwork for more complex analyses, such as inferring multiple parameters, handling unknown noise levels, or employing advanced change point detection algorithms.

