PGAN is a new neural architecture inspired by the brain's hippocampus. It learns new facts in microseconds — no backpropagation, no GPU, on a $3 chip.
Each component is mathematically proven and biologically grounded. Not a metaphor — the same equations the brain uses.
Dense Associative Memory on the unit circle S1. Stores patterns as explicit phase vectors — addressable, readable, writable.
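What "storing patterns as explicit phase vectors" can look like, as a minimal numpy sketch: patterns live as angle vectors on S1, the overlap is a mean cosine of phase differences, and retrieval is a softmax-weighted circular mean, in the style of a dense (modern Hopfield) memory. All names and parameters here are illustrative, not PGAN's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 5                                      # oscillators, stored patterns
patterns = rng.uniform(0, 2 * np.pi, size=(P, N)) # phase vectors on S1

def overlap(theta, xi):
    # Phase overlap: 1.0 when aligned, near 0 for unrelated phases
    return np.mean(np.cos(theta - xi))

def retrieve(theta, beta=8.0, steps=5):
    # Dense retrieval: each step pulls the state toward a circular mean
    # of the stored patterns, softmax-weighted by their overlap with the cue
    for _ in range(steps):
        w = np.exp(beta * np.array([overlap(theta, xi) for xi in patterns]))
        w /= w.sum()
        z = (w[:, None] * np.exp(1j * patterns)).sum(axis=0)
        theta = np.angle(z)
    return theta

# Cue: stored pattern 0 corrupted by phase noise
cue = patterns[0] + rng.normal(0, 0.8, N)
out = retrieve(cue)
print(round(overlap(cue, patterns[0]), 2), round(overlap(out, patterns[0]), 2))
```

The stored pattern is addressable (retrieved from a noisy cue), readable (the overlap), and writable (appending a row to `patterns`).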
Dense Associative Memory on the sphere S2. Mathematically equivalent to Transformer attention — but derived from theory.
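The equivalence claimed here is the one made precise by Ramsauer et al. ("Hopfield Networks is All You Need", 2020): a single retrieval step of a dense associative memory is exactly an attention readout with stored patterns as both keys and values. A hedged numpy sketch, with illustrative sizes and an illustrative beta (beta plays the role of attention's 1/sqrt(d)):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(1)
d, P = 16, 4
X = rng.normal(size=(P, d))            # stored patterns = keys = values
q = X[2] + 0.1 * rng.normal(size=d)    # noisy query

beta = 4.0                             # inverse temperature
# One dense-memory retrieval step == one attention readout:
attn = softmax(beta * X @ q)           # attention weights over stored patterns
out = attn @ X                         # retrieved pattern
print(int(np.argmax(attn)))            # index of the retrieved pattern
```

The noisy query for pattern 2 concentrates nearly all attention mass on pattern 2, so `out` reproduces the stored vector.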
Injection-locking dynamics validated against hippocampal theta-gamma coupling: the same phase equations, not an analogy.
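For readers unfamiliar with injection locking: the standard textbook model is the Adler phase equation, d(phi)/dt = d_omega - K*sin(phi), under which a driven oscillator locks to the injected signal whenever the detuning satisfies |d_omega| <= K. This sketch uses that standard model with illustrative parameters; it is not necessarily PGAN's exact formulation.

```python
import numpy as np

def phase_error(d_omega, K, T=200.0, dt=0.001):
    # Adler equation for the phase error phi between oscillator and drive:
    #   d(phi)/dt = d_omega - K * sin(phi)
    # Inside the locking range (|d_omega| <= K) phi converges to a fixed
    # point arcsin(d_omega / K); outside it, phi drifts forever.
    phi = 1.0                          # initial phase error (rad)
    for _ in range(int(T / dt)):       # forward-Euler integration
        phi += dt * (d_omega - K * np.sin(phi))
    return phi

locked = phase_error(d_omega=0.5, K=1.0)   # detuning inside the locking range
print(round(locked, 3))  # settles at arcsin(0.5) ~= 0.524 rad
```

In the theta-gamma picture, the slow theta rhythm plays the role of the injected drive and the fast gamma oscillator is the one being locked.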
Every LLM today is frozen after training. GPT, Claude, Gemini — they forget everything when you close the chat. PGAN remembers. One Hebbian update, no GPU needed.
```python
# Day: learn a new fact (microseconds, no GPU)
learner.associate(model, tokenizer, "Krakow", "Wawel")
# Overlap: 0.17 -> 0.75 (4.4x increase)

# Night: consolidate to permanent weights
consolidator.consolidate(model, tokenizer, learner)
# Morning: model knows Krakow = Wawel
# No retraining. No GPU. No cloud.

# Privacy: removing S1 removes all
# personal knowledge (a 4 MB file)
learner.export_public(model, "public.pt")
```
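What a single Hebbian update means concretely, stripped to its core: one outer-product write to a weight matrix binds a key embedding to a value embedding, and a single matrix-vector product recalls it. No gradients, no backpropagation. This numpy sketch is illustrative; the names and the unit-vector embeddings are assumptions, not PGAN's internals.

```python
import numpy as np

rng = np.random.default_rng(2)
d = 32
key = rng.normal(size=d)               # stand-in embedding for "Krakow"
key /= np.linalg.norm(key)
value = rng.normal(size=d)             # stand-in embedding for "Wawel"
value /= np.linalg.norm(value)

W = np.zeros((d, d))
W += np.outer(value, key)              # one Hebbian outer-product update

recalled = W @ key                     # cue with the key...
overlap = value @ recalled / np.linalg.norm(recalled)
print(round(overlap, 2))               # -> 1.0: the value is recalled exactly
```

Because the update is a local outer product, it costs one matrix write on commodity hardware, which is why no GPU is involved.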
Perplexity fell from 3,422 to 13.64 across eight versions, for $350 in total compute.
| Capability | Transformer | PGAN |
|---|---|---|
| Formal memory theory | None | alpha* = 1.0 (proven) |
| Post-deployment learning | Impossible | Native (Hebbian S1) |
| Brain validation | None | 192/192 tests (8 patients) |
| Hardware path | GPU only | Analog oscillators |
| Catastrophic forgetting | Yes | No (phase isolation) |
| Edge deployment | Limited | $3 ESP32 with live learning |
Three peer-reviewable papers are deposited on Zenodo with DOIs:

- Phase-Gate Computing and Superlinear Capacity in Circular Oscillator Networks
- Theory of Directional Associative Memories: Dense Hopfield Networks on the Unit Sphere
- A CNOT Phase Gate Equation Unifies Digital Logic and Hippocampal Theta-Gamma Coupling
Interested in PGAN? Reach out for collaboration, funding, or simply to learn more.