Neural Automata #3: The species of computers
Yash Bonde · 2026-02-15
Notes · Neural Automata · Neural Networks

How do you use a computer?

Previously, in #2 Real Computers, we discussed how we unified theory with real computers.

For most people today a computer means a laptop to surf the internet and work on Slack. Mobile phones are not even considered computers, yet for all software purposes they are "the same".

When we studied theoretical computers we only considered the hierarchy, but there are kinds of computers that predate it.

Now, moving away from how the computers of today are, let's go to how computers could be; after all, this was but one path that unfolded. Von Neumann and Ulam proposed the idea of a self-replicating computer, a Universal Constructor. In order to build any computer you need a definition of the state and the transition rules, and thus Cellular Automata (CA) were born. The original proposition was a complicated setup; it was Conway's Game of Life that popularised the 2D grid with neighbour rules, though you can also do 3D. Each cell of a CA is a Finite State Automaton (FSA). There are cool things like the Wireworld computer but not a lot of commercial or practical applications, though there are examples in urban design, VLSI, and cryptography. A CA can model complex physical flows in parallel, but it struggles with sequential "if-then" code. Norman Margolus proposed a machine called CAM-8 and actually built it.
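To make the "state plus local transition rules" idea concrete, here is a minimal sketch of one step of Conway's Game of Life in plain Python. The grid representation and function names are illustrative, not from any particular library.

```python
def life_step(grid):
    """Apply Conway's rules once to a 2D grid of 0/1 cells (edges stay dead)."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count live cells in the 8-cell Moore neighbourhood.
            live = sum(
                grid[rr][cc]
                for rr in range(max(0, r - 1), min(rows, r + 2))
                for cc in range(max(0, c - 1), min(cols, c + 2))
                if (rr, cc) != (r, c)
            )
            # Birth with exactly 3 neighbours, survival with 2 or 3.
            nxt[r][c] = 1 if (live == 3 or (grid[r][c] and live == 2)) else 0
    return nxt

# A "blinker" oscillates between horizontal and vertical with period 2.
blinker = [[0, 0, 0], [1, 1, 1], [0, 0, 0]]
print(life_step(blinker))  # → [[0, 1, 0], [0, 1, 0], [0, 1, 0]]
```

Note that every cell updates from the same simple rule using only its neighbours, which is exactly what makes CA so naturally parallel and so awkward for sequential control flow.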

Let's go back a bit further in time: in the 1930s, Church proposed a new way to build a universal computer, the lambda calculus. There are three basic parts: a variable x; a function definition written with λ, e.g. x + 1 can be defined as λx.x + 1; and function application, applying a function f to x, written f x. This way one can reuse functions to build complex operations, effectively modelling a TM as a series of operations that any human could do with pen and paper. This is a foundational element of functional programming, with λ becoming a respected cultural symbol (see also SICP).

LISP is a raw implementation of Church's idea: it provides guarantees that any code written in it will execute on a TM. It is an interpreted language and thus requires a Virtual Machine (VM) to run, a pioneering design. Not just that: a program can change itself during runtime, making it a universal TM. This was a time when programs were precise instructions on how the machine operated, which meant a program could be tuned to the machine, juicing out sweet ROI. For this, LISPers were trolled even back in 1982. There were, however, physical machines called Lisp Machines that were designed to run LISP.
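The LISP trick of "code is data" can be sketched in a few lines of Python: programs are nested lists, and a tiny evaluator walks them. This is a toy illustration of the idea, not a faithful LISP; the operator names and environment are made up for the example.

```python
def lisp_eval(expr, env):
    """Evaluate a nested-list expression against an environment dict."""
    if isinstance(expr, str):              # symbol: variable lookup
        return env[expr]
    if not isinstance(expr, list):         # literal (e.g. a number)
        return expr
    op, *args = expr
    if op == "if":                         # special form: (if test then else)
        test, then, other = args
        return lisp_eval(then if lisp_eval(test, env) else other, env)
    if op == "lambda":                     # (lambda (params) body)
        params, body = args
        return lambda *vals: lisp_eval(body, {**env, **dict(zip(params, vals))})
    f = lisp_eval(op, env)                 # ordinary function application
    return f(*(lisp_eval(a, env) for a in args))

env = {"+": lambda a, b: a + b, "<": lambda a, b: a < b}
prog = ["if", ["<", 1, 2], ["+", 40, 2], 0]   # (if (< 1 2) (+ 40 2) 0)
print(lisp_eval(prog, env))  # → 42
```

Because a program is just a list, a running program can build, inspect, and rewrite other programs, which is the self-modification property mentioned above.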

At this point we have defined most of what is useful: arithmetic and decision flows. Computers have been getting faster exponentially, and that forced a certain standardisation. All languages have a data model even if the cost is hidden from the user. As we started hitting the limits of transistor scaling, we pulled another lever: parallel computation alongside sequential computing. The control / master remains with CPUs, while the new "GPUs" are tasked with the things they are good at, like matrix multiplication (matmul) (also read up on BLAS). On top of this we built neural networks; more on that later.
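Why matmul in particular? Every output cell of a matrix product is an independent dot product, so all cells can be computed at once. A plain-Python sketch makes the structure visible; it runs sequentially here, while a GPU would run the per-cell work for all (i, j) pairs simultaneously.

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    rows, inner, cols = len(A), len(B), len(B[0])
    return [
        # Each C[i][j] depends only on row i of A and column j of B,
        # so the grid of cells below is embarrassingly parallel.
        [sum(A[i][k] * B[k][j] for k in range(inner)) for j in range(cols)]
        for i in range(rows)
    ]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # → [[19, 22], [43, 50]]
```

Libraries like BLAS implement exactly this operation (gemm), heavily tuned per hardware target, which is what frameworks dispatch to under the hood.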

Computers are not a modern phenomenon; this is actually a long arc. A very long time ago we built analog-mechanical computers like the one discovered at Antikythera, though an argument can be made that something of equal if not higher caliber would have come from Asia as well. Using this machine, one could calculate astronomical positions up to 223 months in advance. Its use is debated; one side mentions tracking of the Panhellenic games.

Neural networks (NN) are a type of machine learning algorithm inspired by the animal brain. A network takes an input signal, e.g. camera pixels, and passes it through a series of pattern filters to find the most important signal in the noise. This signal is then used to make the best prediction. As of this writing in early 2026, NNs are the hype: used to predict molecule shapes, understand the Earth, and kill the internet. We do not understand how they work, and that's half the magic; the other half is how adaptable they are to different tasks. A convolutional network can be used to predict image labels and can also be used to play superhuman Go, by merely changing the learning algorithm.
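The "series of pattern filters" can be sketched as stacked layers, each a matmul plus a squashing nonlinearity. The weights below are arbitrary numbers chosen for illustration; a real network would learn them from data.

```python
import math

def layer(inputs, weights, biases):
    """One fully connected layer: weighted sum per neuron, then sigmoid."""
    return [
        1 / (1 + math.exp(-(sum(w * x for w, x in zip(row, inputs)) + b)))
        for row, b in zip(weights, biases)
    ]

signal = [0.5, -1.0, 2.0]                       # e.g. three pixel intensities
hidden = layer(signal, [[0.1, 0.2, 0.3], [-0.4, 0.5, 0.6]], [0.0, 0.1])
score = layer(hidden, [[1.0, -1.0]], [0.0])     # collapse to one prediction
print(score)                                     # a single value in (0, 1)
```

Nothing in the architecture is task specific: swap the input encoding, the output size, and the training signal, and the same machinery classifies images or scores Go moves.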

They are, however, empirically designed systems, not rules-based like Software 1.0. Just like in the 1940s, the LLMs of today are being built before the theory has matured. AI is getting more powerful / hyped / insane by the day. We will see more computational tasks moving into the Software 2.0 world (NNs). What we see is that CPUs are still used for all the decision making, but their inputs are much more refined by the increased perception of NNs on GPUs. There has also been a lot of work on whether neural networks can be turned into full end-to-end computers.

Quantum computers, optical computers: there are so many kinds. Here's a table going over some computer types:

| Computer Type | One-Liner Description | History & Origin | Best Task (vs. CPU) | Worst Task (vs. CPU) | Example Physical Implementation |
|---|---|---|---|---|---|
| Cellular Automata | A grid of parallel cells evolving based on local neighbor rules. | Conceptualized by Ulam and von Neumann (1940s) for self-replication studies. | Fluid Dynamics: Simulating complex physical flows in parallel. | Serial Logic: Executing sequential "if-then" code. | CAM-8 Machine |
| Lambda Calculus | A mathematical system computing via function application without using "states." | Developed by Alonzo Church (1930s) to explore the foundations of logic. | Formal Verification: Mathematically proving code is bug-free. | Memory Management: High overhead for basic arithmetic. | Lisp Machines |
| Quantum Computer | Uses qubits and superposition to solve probabilistic problems at scale. | Proposed by Richard Feynman (1981) to simulate quantum physics. | Prime Factorization: Breaking encryption that takes CPUs eons. | Daily Tasks: Simple word processing or web browsing. | IBM Quantum System One |
| Analog Computer | Computes using continuous physical quantities like voltage or rotation. | Ancient (Antikythera, 150 BC) to mid-century electronic differential analyzers. | Real-time Control: Instant processing of electrical signals. | High Precision: Accuracy is limited by physical tolerances. | THAT Analog Computer |
| Mechanical Computer | Uses physical gears and levers to perform discrete logical operations. | Babbage's Difference Engine (1820s) and Pascal's calculators (1640s). | Hostile Environments: Computing in extreme heat or radiation. | Speed: Physically limited by the inertia of moving parts. | Babbage Difference Engine |
| Neuromorphic | Hardware mimicking the brain's "spike" based neural architecture. | Coined by Carver Mead (1980s) to create brain-inspired silicon. | Edge AI: Efficient, low-power pattern recognition. | Exact Arithmetic: Too "fuzzy" for precise accounting. | Intel Loihi 2 |
| Optical Computer | Uses pulses of light (photons) instead of electrons for processing. | Research surged in the 1980s at Bell Labs to beat the "heat wall." | Matrix Math: Instant multiplication for AI workloads via light. | Miniaturization: Hard to shrink to the size of a smartphone. | Lightmatter Envoy |
| Dataflow Computer | Instructions execute only when their data inputs arrive, not by a clock. | Developed by Jack Dennis at MIT (1970s) to bypass the CPU bottleneck. | Big Data Pipelines: Maximizing throughput for massive data streams. | Dynamic Branching: Handling code with unpredictable "if" statements. | SambaNova DataScale |
| Neural Network | A system of interconnected nodes that "computes" by weighting signals to find patterns in data. | Proposed by McCulloch & Pitts (1943); first physical model, the Mark I Perceptron, built in 1958. LLMs are a type of task-specific network. | Pattern Recognition: Identifying faces or voices in "noisy," unstructured data. | Exact Logic: Performing perfect math (like 99,999 × 3.14) without error. | Apple Neural Engine (ANE) |

One more cycle of Exploitation → Exploration is complete. We have built a rough understanding of many ways of computing things; next, various forms of each will be put together to solve something.

I have covered more in #4 Neural Automata.

The opinions expressed herein are solely those of the author in their individual capacity and do not necessarily reflect the official policy or position of any current or former employer, client, or affiliated organization.