Posts

Seeing the Brain Predict in Real-Time: An Animated Python Journey

In our previous post, we explored the core idea of the Free Energy Principle – the brain as a prediction machine striving to minimize the difference between its expectations and sensory reality. We even built a simple Python program to demonstrate this. Now, let's take it a step further and visualize this predictive process as it unfolds. We've enhanced our Python code to create a dynamic animation, giving us a glimpse into how an "internal belief" (our brain's model) continuously adapts to incoming "sensory input."

Animating the Prediction: Our Updated Python Code

Python
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation

def generate_world_state(time_step):
    """A simple, changing "world" state."""
    return np.sin(0.1 * time_step) + np.random.normal(0, 0.1)

def generate_sensory_input(world_state):
    """Sensory input with some ...
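The listing above is cut off by the excerpt. As a rough sketch of how an animation along these lines might be completed (the belief-update rule, noise level for the sensory input, and plotting details below are assumptions for illustration, not the post's actual code):

Python
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation

def generate_world_state(time_step):
    """A simple, changing "world" state."""
    return np.sin(0.1 * time_step) + np.random.normal(0, 0.1)

def generate_sensory_input(world_state):
    """Sensory input with some observation noise (noise level assumed)."""
    return world_state + np.random.normal(0, 0.2)

belief = 0.0            # the internal model's current expectation
learning_rate = 0.1     # how strongly prediction errors pull the belief
beliefs, observations = [], []

fig, ax = plt.subplots()
belief_line, = ax.plot([], [], label="internal belief")
obs_line, = ax.plot([], [], ".", alpha=0.5, label="sensory input")
ax.set_xlim(0, 200)
ax.set_ylim(-2, 2)
ax.legend()

def update(frame):
    global belief
    world = generate_world_state(frame)
    sense = generate_sensory_input(world)
    prediction_error = sense - belief           # the "surprise"
    belief += learning_rate * prediction_error  # nudge the belief to reduce it
    beliefs.append(belief)
    observations.append(sense)
    x = range(len(beliefs))
    belief_line.set_data(x, beliefs)
    obs_line.set_data(x, observations)
    return belief_line, obs_line

ani = animation.FuncAnimation(fig, update, frames=200, interval=50, blit=False)
plt.show()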

The Brain as a Prediction Machine: Friston's Free Energy Principle in Practice

Have you ever felt a jolt of surprise when something didn't go as expected? According to the fascinating Free Energy Principle, proposed by Karl Friston, this feeling isn't just a quirk of consciousness – it might be a fundamental driving force behind how all self-organizing systems, including our brains, work. At its heart, the Free Energy Principle suggests that our brains are constantly trying to minimize the difference between what they expect and what they actually sense. Think of it like this: your brain builds a model of the world, makes predictions based on that model, and then updates the model when those predictions are wrong. This difference between prediction and reality is, in a simplified sense, related to what Friston calls "free energy." To get a more intuitive grasp of this idea, let's dive into a simple Python program that demonstrates this concept of predictive coding, a key aspect of the Free Energy Principle.

Our Simple "World...
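A minimal sketch of the kind of predictive-coding loop the post describes, assuming a fixed hidden "world" value, Gaussian sensory noise, and a simple error-driven belief update; these are illustrative choices, not necessarily the post's exact program:

Python
import numpy as np

true_state = 0.8      # the hidden "world" value the brain is trying to model
belief = 0.0          # the brain's current expectation
learning_rate = 0.1

for step in range(50):
    sensory_input = true_state + np.random.normal(0, 0.05)  # noisy observation
    prediction_error = sensory_input - belief               # the "surprise"
    belief += learning_rate * prediction_error               # update to reduce future surprise
    if step % 10 == 0:
        print(f"step {step:2d}  belief={belief:.3f}  error={prediction_error:+.3f}")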

Unveiling Dynamic Holographic Changes with Bayesian Inference

Holography is a fascinating field, capturing 3D information about objects. But what happens when the holographic data itself is dynamic, changing over time? How can we automatically detect and analyze these changes as they occur? This blog post dives into a Python program that leverages Bayesian inference to continuously monitor a "holographic file" and draw conclusions based on its evolving properties.

The Challenge: Dynamic Data Analysis

Imagine a holographic sensor continuously recording data, or a holographic storage medium undergoing subtle physical changes. Manually inspecting vast amounts of data for shifts in properties like intensity, phase, or diffraction efficiency is impractical. We need an intelligent system that can learn from the data, adapt to new information, and flag significant deviations. This is where Bayesian inference shines.

Find the programs here: Interintel/Bayesian-Holography: Simple python apps that show Bayesian inference against a dynamic hol...
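To make the idea concrete, here is a small illustrative sketch of sequential Bayesian updating on a single monitored property, such as mean intensity. The Gaussian prior, the assumed measurement noise, and the deviation threshold are all assumptions for illustration and are not taken from the linked repository:

Python
import numpy as np

prior_mean, prior_var = 0.5, 0.25   # initial belief about the property (assumed)
obs_var = 0.01                      # assumed measurement noise variance

def bayesian_update(mean, var, observation):
    """Conjugate Gaussian update of the belief after one new observation."""
    gain = var / (var + obs_var)
    new_mean = mean + gain * (observation - mean)
    new_var = (1 - gain) * var
    return new_mean, new_var

mean, var = prior_mean, prior_var
for reading in np.random.normal(0.62, 0.1, size=20):   # simulated holographic readings
    if abs(reading - mean) > 3 * np.sqrt(var + obs_var):
        print(f"Significant deviation flagged: {reading:.3f}")
    mean, var = bayesian_update(mean, var, reading)

print(f"Posterior estimate of the property: {mean:.3f} ± {np.sqrt(var):.3f}")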

Building a Holographic Digit Classifier with NumPy and MNIST

In this post, we’ll explore how to build a simple, yet powerful, holographic memory system to classify handwritten digits from the popular MNIST dataset. Using NumPy, we’ll create a system that represents digits as holographic encodings—dense vector representations—and compares new images using cosine similarity. This approach draws inspiration from vector symbolic architectures (VSAs) and holographic reduced representations (HRRs) used in associative memory and cognitive modeling.

📦 Step 1: Organizing the Dataset

Assume you have MNIST digit data where each digit (0 to 9) is stored in its own folder. Each image has been preprocessed and saved as a .npy file (a NumPy array), possibly as a vector encoding or 2D array.

Folder structure:

/mnist_vectors/
├── 0/
│   ├── img_0.npy
│   └── ...
├── 1/
│   └── ...
...
├── 9/

🧪 Step 2: Create the Holographic Memory File

We’ll create a single file (digit_holograms.npz) that stores a prototype ve...
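The excerpt stops partway into Step 2. A rough sketch of what the prototype-building and cosine-similarity steps could look like, reusing the /mnist_vectors/ layout and the digit_holograms.npz file named above; the function names and internal details are illustrative, not the post's exact code:

Python
import os
import numpy as np

DATA_DIR = "/mnist_vectors"            # folder layout from Step 1
MEMORY_FILE = "digit_holograms.npz"    # memory file named in Step 2

def build_holographic_memory():
    """Average each digit's vectors into one unit-length prototype ("hologram")."""
    prototypes = []
    for digit in range(10):
        folder = os.path.join(DATA_DIR, str(digit))
        vectors = [np.load(os.path.join(folder, name)).ravel()
                   for name in sorted(os.listdir(folder)) if name.endswith(".npy")]
        proto = np.mean(vectors, axis=0)
        prototypes.append(proto / np.linalg.norm(proto))
    np.savez(MEMORY_FILE, prototypes=np.stack(prototypes))

def classify(image_vector):
    """Return the digit whose prototype is most cosine-similar to the input."""
    prototypes = np.load(MEMORY_FILE)["prototypes"]
    v = image_vector.ravel().astype(float)
    v = v / np.linalg.norm(v)
    return int(np.argmax(prototypes @ v))   # dot products of unit vectors = cosine similarity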

Mapping the Drosophila Eye

To emulate the Drosophila eye sensory system, I had to map each ommatidium to its optical sensory neurons. I used the Flywire.ai right-eye map from the Retina Grid. Taking each ommatidium, I first had to find the column id associated with it. Here is that map by column id:

I was able to arrange the eye map into a grid of 28 columns and 28 rows, and load this grid into an MS SQL table:

C1 C2 C3 C4 C5 C6 C7 C8 C9 C10 C11 C12 C13 C14 C15 C16 C17 C18 C19 C20 C21 C22 C23 C24 C25 C26 C27 C28
0 0 0 0 0 0 0 0 787 0 660 207 494 448 476 380 65 295 352 729 41 784 783 0 0 0 0 0
0 0 0 0 0 0 796 788 402 438 322 767 631 138 282 202 186 237 223 343 222 645 8 782 781 0 0 0
0 0 0 0 779 778 629 453 324 570 353 472 621 284 329 680 618 707 625 693 753 744 692 442 7...
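As an illustration of how such a grid might be read back out of MS SQL for the emulation, here is a hedged sketch using pyodbc. The connection string, the table name (RetinaGrid), and the column names are hypothetical, since the post does not give the actual schema:

Python
import numpy as np
import pyodbc  # assumes the Microsoft ODBC Driver for SQL Server is installed

# Hypothetical connection and schema -- the post does not give the actual names.
CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
            "DATABASE=Flywire;Trusted_Connection=yes;")
QUERY = "SELECT RowIdx, ColIdx, ColumnId FROM RetinaGrid"

def load_eye_grid():
    """Read the 28x28 ommatidium-column grid into a NumPy array (0 = empty cell)."""
    grid = np.zeros((28, 28), dtype=int)
    conn = pyodbc.connect(CONN_STR)
    try:
        for row_idx, col_idx, column_id in conn.cursor().execute(QUERY):
            grid[row_idx, col_idx] = column_id
    finally:
        conn.close()
    return grid

if __name__ == "__main__":
    grid = load_eye_grid()
    print(grid.shape)  # (28, 28)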