🌊 Wave Theory of Learning

A Unified Physical Framework for Intelligence

What if neural networks are wave propagation systems?

This research series derives learning algorithms from physical first principles, viewing neural networks as waves propagating through tunable media. The framework unifies Hebbian learning, Oja's rule, and backpropagation—and extends to explain the Fermi Paradox.

Date: March 9, 2026 | Collaboration: Macheng Shen + Claude (Opus 4.6)

→ Read the complete theory

Hebbian Learning, Oja's Rule, and Backpropagation Unified

Preliminary Core Theory Neuroscience

Central thesis: Three fundamental learning principles—Hebbian learning (1949), Oja's rule (1982), and backpropagation (1986)—are not independent mechanisms but unified manifestations of wave interference in tunable media.

Key results:

Implications: Any wave system with tunable coupling automatically supports learning. This yields design principles for neuromorphic computing (optical, acoustic, memristive).
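As a quick illustration of the first two rules (a minimal sketch of mine, not code from the paper): plain Hebbian updates grow without bound, while Oja's decay term self-normalizes the weights and extracts the input's principal component.

```python
import numpy as np

# Sketch (not from the paper): compare Hebb's rule with Oja's rule on
# correlated 2-D inputs whose principal axis is roughly (1, 1)/sqrt(2).
rng = np.random.default_rng(0)
C = np.array([[2.0, 1.5],
              [1.5, 2.0]])                  # input covariance
X = rng.multivariate_normal([0.0, 0.0], C, size=1000)

eta = 0.01
w_hebb = np.array([0.3, -0.2])
w_oja = np.array([0.3, -0.2])

for x in X:
    y = w_hebb @ x
    w_hebb += eta * y * x                   # Hebb (1949): dw = eta*y*x
    y = w_oja @ x
    w_oja += eta * y * (x - y * w_oja)      # Oja (1982): Hebb + normalizing decay

top = np.linalg.eigh(C)[1][:, -1]           # top eigenvector of C
print("|w_hebb| =", np.linalg.norm(w_hebb)) # diverges
print("|w_oja|  =", np.linalg.norm(w_oja))  # stays near 1
print("alignment with PC1:", abs(w_oja @ top) / np.linalg.norm(w_oja))
```

In the paper's framing, backpropagation then arises when the same interference rule acts on the reflected, error-carrying wave; see the full paper for that derivation.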

Read Full Paper →

Deriving Backpropagation from Wave Equations & the Fermi Paradox

Preliminary Physics Cosmology

Central thesis: Information transmission obeys physical laws with fundamental cost constraints. The same framework that explains neural network training also explains why we see no signs of extraterrestrial intelligence.

Part I: Neural Networks

Part II: Fermi Paradox
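The cost constraint at the heart of Part II can be previewed with a back-of-envelope inverse-square calculation (the transmitter power and dish size below are illustrative assumptions of mine, not the paper's numbers):

```python
import math

# Assumed numbers, for illustration only: isotropic broadcasting across
# interstellar distances is extraordinarily expensive, one physical cost
# constraint relevant to the Fermi Paradox argument.

def received_flux(p_tx_watts, d_meters):
    """Power flux (W/m^2) at distance d from an isotropic transmitter."""
    return p_tx_watts / (4.0 * math.pi * d_meters ** 2)

LY = 9.461e15                 # meters per light year
P_TX = 1e6                    # assumed 1 MW transmitter
A_RX = math.pi * 50.0 ** 2    # assumed 100 m receiving dish, area in m^2

for d_ly in (1, 100, 10_000):
    watts = received_flux(P_TX, d_ly * LY) * A_RX
    print(f"{d_ly:>6} ly: {watts:.2e} W collected")
```

Every factor of 10 in distance costs a factor of 100 in received power, so range is paid for in transmit energy; Part II of the paper develops the cost argument in full.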

Read Full Paper →

Deriving Backpropagation from Wave Equations (Short Version)

Concise Technical

A focused derivation of backpropagation from wave dynamics, without the Fermi Paradox extension. Best for readers interested in the core mathematical framework.

Read Short Version →

Sleep as Wave Optimization: Why We Wake with Solutions

Preliminary Neuroscience Sleep Learning

Central thesis: Sleep is not passive recovery but active wave impedance optimization. Slow-wave sleep (SWS) minimizes global impedance through synaptic renormalization and memory replay; REM sleep explores novel pathways through random frequency scanning. Morning insights emerge when optimized connections are first "seen" by conscious awareness.

Key mechanisms:

  • SWS (hours 0-3): Synaptic pruning (15-20% reduction), sharp-wave ripples replay at 10-20× speed, slow oscillations enable global optimization
  • REM (hours 4-6): Theta oscillations, frontal lobe suppression, random wave propagation discovers new connections
  • Awakening: Optimized pathways become conscious → "Aha!" moments
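One loose computational analogy for these mechanisms (a sketch of mine; the paper's actual model is impedance-based): treat SWS as greedy descent on an energy landscape and REM as random jumps that can escape local minima.

```python
import numpy as np

# Toy analogy (assumption, not the paper's model): a rugged 1-D energy
# landscape with many local minima; "SWS" descends greedily, "REM" makes
# random jumps that are kept only if they lower the energy.

def energy(x):
    return 0.1 * x**2 + np.sin(3.0 * x)     # rugged: many local minima

def sws_descent(x, steps=200, lr=0.01):
    for _ in range(steps):
        grad = 0.2 * x + 3.0 * np.cos(3.0 * x)
        x -= lr * grad
    return x

rng = np.random.default_rng(1)

# "No REM": greedy descent only, from a poor starting point.
x_no_rem = sws_descent(5.0)

# Alternating SWS descent and REM exploration, keeping the best state.
x_best = 5.0
for _ in range(5):
    x_best = sws_descent(x_best)
    candidate = sws_descent(x_best + rng.normal(scale=2.0))
    if energy(candidate) < energy(x_best):
        x_best = candidate

print("energy without REM:", energy(x_no_rem))
print("energy with REM:   ", energy(x_best))
```

Because REM-style jumps are only accepted when they help, the combined schedule never ends up worse than descent alone, and it typically finds a deeper minimum.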

Neuroscience evidence (2003-2026):

  • ✅ Sharp-wave ripples predict consolidation (Neuron, 2025; PMC 12576410)
  • ✅ Synaptic homeostasis hypothesis confirmed (Tononi & Cirelli 2003-2012)
  • ✅ REM enhances creativity via associative networks (PNAS 2009)
  • ✅ Guided dreaming boosts problem-solving +25% (Northwestern 2026)

Parallels with AI: Human sleep = batch training + exploration; awake = online learning. Same optimization principles!
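The analogy can be made concrete with a toy linear-regression sketch (mine, not the paper's): noisy one-sample "wake" updates, followed by batch "sleep" replay over the same data.

```python
import numpy as np

# Toy version of the wake/sleep analogy (assumption, not the paper's model):
# "wake" learns online from one experience at a time; "sleep" replays the
# whole day's data in batch, smoothing toward the least-squares solution.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))               # the day's experiences
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=200)

w = np.zeros(3)

# Wake: noisy online updates, one sample per step.
for xi, yi in zip(X, y):
    w += 0.05 * (yi - xi @ w) * xi
err_wake = np.linalg.norm(w - w_true)

# Sleep: batch replay of the same data (consolidation).
for _ in range(100):
    w += 0.05 * X.T @ (y - X @ w) / len(X)
err_sleep = np.linalg.norm(w - w_true)

print("error after wake: ", err_wake)
print("error after sleep:", err_sleep)
```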

Practical applications: Load problems before sleep, protect REM (7-8h), capture insights immediately upon waking.

Read Full Theory →
How the wave framework resolves classic objections to biologically plausible backpropagation:

  • Weight transport: reflected waves carry the error automatically; no need to "know" the forward weights
  • Phase separation: forward and backward waves coexist in continuous propagation, so no separate backward phase is needed
  • Non-locality: Hebbian plasticity emerges as local wave interference
  • Connections to other theories

Testable predictions: bidirectional wave propagation, phase-dependent synaptic plasticity, impedance-based learning difficulty.

Read Full Response →
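The weight-transport point parallels a known result, feedback alignment (Lillicrap et al., 2016): a fixed random feedback matrix, standing in for the "reflected wave," still trains the network. A minimal sketch (my illustration of that related result, not the paper's wave mechanism):

```python
import numpy as np

# Feedback alignment sketch: the backward pass routes the error through a
# FIXED random matrix B instead of W2.T, yet the loss still falls.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 8, 16, 2
W1 = 0.1 * rng.normal(size=(n_hid, n_in))
W2 = 0.1 * rng.normal(size=(n_out, n_hid))
B = rng.normal(size=(n_hid, n_out))      # fixed random feedback, not W2.T

X = rng.normal(size=(64, n_in))
T = rng.normal(size=(64, n_out))
lr = 0.05

def loss():
    H = np.tanh(X @ W1.T)
    return float(np.mean((H @ W2.T - T) ** 2))

loss0 = loss()
for _ in range(500):
    H = np.tanh(X @ W1.T)                # forward pass
    E = H @ W2.T - T                     # output error
    dH = (E @ B.T) * (1.0 - H ** 2)      # error reaches hidden layer via B
    W2 -= lr * E.T @ H / len(X)
    W1 -= lr * dH.T @ X / len(X)

print("loss:", loss0, "->", loss())
```

The network never "transports" W2 backward, yet learning proceeds, which is the spirit of the weight-transport bullet above.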


📬 Feedback & Collaboration

If you find errors, have suggestions, or want to collaborate on experimental validation, please reach out: macshen93@gmail.com