Posts

AI Review of "Emulating Connectomics, animal and artificial, to develop Biological Machine Intelligence"

This is an AI review of a paper I wrote, basically just jotting down some thoughts. I actually liked the review, and the criticisms were valid. Thank you, Academia.edu, for doing this. Here is the review:

Academia.edu Review

In reviewing the paper titled "Emulating Connectomics, animal and artificial, to develop Biological Machine Intelligence," I find the proposed framework for Biological Machine Intelligence (BMI) to be an ambitious and innovative direction that could potentially expand our understanding of artificial intelligence by leveraging biological principles. The work intends to bridge the gap between computational neuroscience and artificial intelligence, promising adaptability and efficient learning in artificial systems. This paper's contribution is vital in the ongoing discourse surrounding Artificial General Intelligence (AGI) and its viability.

Overview

The manuscript explores the concept of Biological Machine Intelligence, advocating for the use of artifici...

What LLMs say about me: Timothy Busbice

This was a pleasant surprise, and I believe it nailed how I fit into AGI research. I encourage everyone to google themselves and try things like this to find out how the world views them. Prompt: where does Timothy Busbice fit in the AGI research world

Timothy Busbice: A Proponent of Biologically Inspired Artificial General Intelligence

In the diverse and dynamic world of Artificial General Intelligence (AGI) research, Timothy Busbice positions himself as a visionary advocate for a distinct, neuroscience-inspired approach. As the Chief Technology Officer of PROME, he champions "Biologic Intelligence," a methodology centered on the emulation of biological nervous systems, a field known as connectomics. This places him in a unique niche within the broader AGI landscape, which is currently dominated by large-scale deep learning and large language models (LLMs). Busbice's core thesis is that true general intelligence can be achieved by reverse-engineering ...

Seeing the Brain Predict in Real-Time: An Animated Python Journey

In our previous post, we explored the core idea of the Free Energy Principle – the brain as a prediction machine striving to minimize the difference between its expectations and sensory reality. We even built a simple Python program to demonstrate this. Now, let's take it a step further and visualize this predictive process as it unfolds. We've enhanced our Python code to create a dynamic animation, giving us a glimpse into how an "internal belief" (our brain's model) continuously adapts to incoming "sensory input."

Animating the Prediction: Our Updated Python Code

import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation

def generate_world_state(time_step):
    """A simple, changing "world" state."""
    return np.sin(0.1 * time_step) + np.random.normal(0, 0.1)

def generate_sensory_input(world_state):
    """Sensory input with some ...
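Because the excerpt cuts off mid-code, here is a minimal, hedged sketch of how such an animation loop could be completed. It reuses the generate_world_state/generate_sensory_input idea above; the observation noise, learning rate, and FuncAnimation plotting details are my assumptions, not the post's exact code.

import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation

def generate_world_state(time_step):
    """A simple, changing "world" state."""
    return np.sin(0.1 * time_step) + np.random.normal(0, 0.1)

def generate_sensory_input(world_state):
    """Noisy observation of the world state (assumed noise level)."""
    return world_state + np.random.normal(0, 0.2)

belief = 0.0           # the brain's current internal estimate
learning_rate = 0.1    # how strongly each prediction error updates the belief
beliefs, sensations = [], []

fig, ax = plt.subplots()
belief_line, = ax.plot([], [], label="internal belief")
sense_line, = ax.plot([], [], ".", alpha=0.5, label="sensory input")
ax.set_xlim(0, 200)
ax.set_ylim(-2, 2)
ax.legend()

def update(frame):
    global belief
    sensation = generate_sensory_input(generate_world_state(frame))
    prediction_error = sensation - belief       # the "surprise"
    belief += learning_rate * prediction_error  # belief chases the input
    beliefs.append(belief)
    sensations.append(sensation)
    xs = range(len(beliefs))
    belief_line.set_data(xs, beliefs)
    sense_line.set_data(xs, sensations)
    return belief_line, sense_line

ani = animation.FuncAnimation(fig, update, frames=200, interval=50, blit=True)
plt.show()

Watching the belief trace slowly catch up to the noisy sine wave is the whole point: the internal model's expectations are continually revised toward the incoming sensory input.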

The Brain as a Prediction Machine: Friston's Free Energy Principle in Practice

Have you ever felt a jolt of surprise when something didn't go as expected? According to the fascinating Free Energy Principle, proposed by Karl Friston, this feeling isn't just a quirk of consciousness – it might be a fundamental driving force behind how all self-organizing systems, including our brains, work. At its heart, the Free Energy Principle suggests that our brains are constantly trying to minimize the difference between what they expect and what they actually sense. Think of it like this: your brain builds a model of the world, makes predictions based on that model, and then updates the model when those predictions are wrong. This difference between prediction and reality is, in a simplified sense, related to what Friston calls "free energy." To get a more intuitive grasp of this idea, let's dive into a simple Python program that demonstrates this concept of predictive coding, a key aspect of the Free Energy Principle. Our Simple "World...
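Since the excerpt ends just as the program begins, here is a minimal predictive-coding sketch in the same spirit (assumed, not the post's exact code): a belief about a hidden value is repeatedly nudged by the prediction error between what is sensed and what was expected.

import numpy as np

rng = np.random.default_rng(0)
true_value = 1.5        # the hidden "world" state (held constant for simplicity)
belief = 0.0            # the brain's internal model of that state
learning_rate = 0.2     # how much each surprise updates the model

for step in range(20):
    sensation = true_value + rng.normal(0, 0.3)     # noisy sensory input
    prediction_error = sensation - belief           # the "surprise"
    belief += learning_rate * prediction_error      # update the model
    # Squared prediction error stands in, very loosely, for free energy here.
    print(f"step {step:2d}  sensation={sensation:6.2f}  "
          f"belief={belief:6.2f}  error^2={prediction_error ** 2:6.2f}")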

Unveiling Dynamic Holographic Changes with Bayesian Inference

Holography is a fascinating field, capturing 3D information about objects. But what happens when the holographic data itself is dynamic, changing over time? How can we automatically detect and analyze these changes as they occur? This blog post dives into a Python program that leverages Bayesian inference to continuously monitor a "holographic file" and draw conclusions based on its evolving properties.

The Challenge: Dynamic Data Analysis

Imagine a holographic sensor continuously recording data, or a holographic storage medium undergoing subtle physical changes. Manually inspecting vast amounts of data for shifts in properties like intensity, phase, or diffraction efficiency is impractical. We need an intelligent system that can learn from the data, adapt to new information, and flag significant deviations. This is where Bayesian inference shines. Find the programs here: Interintel/Bayesian-Holography: Simple Python apps that show Bayesian inference against a dynamic hol...
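To make the idea concrete, here is a hedged sketch of the kind of update the post describes (not the repo's actual code): a Gaussian belief about a monitored property, say mean intensity, is updated with each new reading, and readings the current belief finds very surprising are flagged. The noise levels, threshold, and simulated data are assumptions.

import numpy as np

# Prior belief about the monitored property (values assumed for illustration)
mu, tau2 = 0.5, 1.0          # posterior mean and variance of the belief
sigma2 = 0.05 ** 2           # assumed measurement noise variance

def update(mu, tau2, reading, sigma2):
    """Conjugate Gaussian update: fold one new reading into the belief."""
    post_tau2 = 1.0 / (1.0 / tau2 + 1.0 / sigma2)
    post_mu = post_tau2 * (mu / tau2 + reading / sigma2)
    return post_mu, post_tau2

rng = np.random.default_rng(1)
# Simulated stream: a stable hologram, then a sudden physical change
stream = np.concatenate([rng.normal(0.5, 0.05, 50),
                         rng.normal(0.8, 0.05, 20)])

for t, reading in enumerate(stream):
    surprise = abs(reading - mu) / np.sqrt(tau2 + sigma2)
    if surprise > 3.0:       # roughly a 3-sigma deviation from the current belief
        print(f"t={t}: significant change detected (reading={reading:.2f})")
    mu, tau2 = update(mu, tau2, reading, sigma2)

The same loop works whether the readings come from a live sensor or from re-reading a holographic file as it changes on disk.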

Building a Holographic Digit Classifier with NumPy and MNIST

In this post, we'll explore how to build a simple, yet powerful, holographic memory system to classify handwritten digits from the popular MNIST dataset. Using NumPy, we'll create a system that represents digits as holographic encodings—dense vector representations—and compares new images using cosine similarity. This approach draws inspiration from vector symbolic architectures (VSAs) and holographic reduced representations (HRRs) used in associative memory and cognitive modeling.

📦 Step 1: Organizing the Dataset

Assume you have MNIST digit data where each digit (0 to 9) is stored in its own folder. Each image has been preprocessed and saved as a .npy file (a NumPy array), possibly as a vector encoding or 2D array. Folder structure:

/mnist_vectors/
├── 0/
│   ├── img_0.npy
│   └── ...
├── 1/
│   └── ...
...
├── 9/

🧪 Step 2: Create the Holographic Memory File

We'll create a single file (digit_holograms.npz) that stores a prototype ve...
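Since the excerpt stops mid-step, here is a minimal sketch of the approach under stated assumptions: the /mnist_vectors/<digit>/*.npy layout above, each image flattened to a vector, and the mean vector per digit used as its prototype. The function names and exact encoding are illustrative, not the post's code.

import os
import glob
import numpy as np

def build_holographic_memory(root="mnist_vectors", out="digit_holograms.npz"):
    """Average all vectors in each digit folder into one prototype per digit."""
    prototypes = {}
    for digit in range(10):
        files = glob.glob(os.path.join(root, str(digit), "*.npy"))
        if not files:
            continue  # skip digits with no stored vectors
        vectors = [np.load(f).astype(np.float64).ravel() for f in files]
        prototypes[str(digit)] = np.mean(vectors, axis=0)
    np.savez(out, **prototypes)
    return out

def classify(image_vector, memory_file="digit_holograms.npz"):
    """Return the digit whose prototype has the highest cosine similarity."""
    memory = np.load(memory_file)
    v = image_vector.ravel()
    best_digit, best_sim = None, -np.inf
    for digit in memory.files:
        p = memory[digit]
        sim = np.dot(v, p) / (np.linalg.norm(v) * np.linalg.norm(p) + 1e-12)
        if sim > best_sim:
            best_digit, best_sim = digit, sim
    return best_digit, best_sim

Once the prototypes are built, classify() can be called on any flattened test image that uses the same encoding as the stored vectors.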

Mapping the Drosophila Eye

To emulate the Drosophila eye sensory system, I had to map the ommatidia to the optical sensory neurons. I used the Flywire.ai right-eye map from this source: Retina Grid. Taking each ommatidium, I first had to find the column id associated with it. Here is that map by column id: I was able to arrange the eye map into a grid of 28 columns and 28 rows, and put this grid into an MS SQL table:

C1 C2 C3 C4 C5 C6 C7 C8 C9 C10 C11 C12 C13 C14 C15 C16 C17 C18 C19 C20 C21 C22 C23 C24 C25 C26 C27 C28
0 0 0 0 0 0 0 0 787 0 660 207 494 448 476 380 65 295 352 729 41 784 783 0 0 0 0 0 0 0 0 0 0 0 796 788 402 438 322 767 631 138 282 202 186 237 223 343 222 645 8 782 781 0 0 0 0 0 0 0 779 778 629 453 324 570 353 472 621 284 329 680 618 707 625 693 753 744 692 442 7...
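For readers who want to reproduce the storage step, here is a hedged sketch (not my original pipeline) of loading a 28x28 grid of column ids into an MS SQL table with pyodbc. The table and column names, connection string, and grid source file are assumptions for illustration; a 0 is treated as "no ommatidium at this grid cell."

import numpy as np
import pyodbc

# Assumed input: a whitespace-delimited 28x28 grid of Flywire column ids
grid = np.loadtxt("eye_grid.txt", dtype=int)
assert grid.shape == (28, 28)

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=Drosophila;Trusted_Connection=yes;"
)
cur = conn.cursor()
cur.execute(
    "IF OBJECT_ID('EyeGrid') IS NULL "
    "CREATE TABLE EyeGrid (GridRow INT, GridCol INT, ColumnId INT)"
)
# One row per occupied grid cell, 1-indexed to match the C1..C28 headers
rows = [(r + 1, c + 1, int(grid[r, c]))
        for r in range(28) for c in range(28) if grid[r, c] != 0]
cur.executemany(
    "INSERT INTO EyeGrid (GridRow, GridCol, ColumnId) VALUES (?, ?, ?)", rows
)
conn.commit()
conn.close()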