
How do you use a computer?
Previously, in # Real Computers, we discussed how we unified theory with real computers.
For most people today a computer means a laptop to surf the internet and work on Slack. Mobile phones are not even considered computers, but for all software purposes they are the "same".
When we studied theoretical computers we only considered the hierarchy, but there are models of computation that predate it.
Now, moving away from how the computers of today are, let's look at how computers could be; after all, this was but one path that unfolded. Von Neumann and Ulam proposed the idea of a self-replicating computer, a Universal Constructor. To build any computer you need a definition of its state and its transition rules, and thus Cellular Automata (CA) was born. The original proposition was a complicated setup; it was Conway's Game of Life that popularised the 2D grid and neighbour rules, though you can also build 1D automata. Each cell of a CA is a Finite State Automaton (FSA). There are cool things like the Wireworld computer but not a lot of commercial or practical applications, though there are examples in urban design, VLSI design, and cryptography. A CA can model complex physical flows in parallel, but it struggles to express sequential "if-then" code. Norman Margolus proposed a machine called CAM-8 and actually built it.
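To make the "state plus local transition rules" idea concrete, here is a minimal sketch of one Game of Life step. It assumes live cells are stored as a set of coordinates on an unbounded 2D grid; the function name `step` and the representation are my own choices, not any standard library API.

```python
from collections import Counter

def step(live):
    """One CA tick: count live neighbours of every cell, apply Conway's rules.

    A cell is alive next tick if it has exactly 3 live neighbours,
    or if it has 2 live neighbours and is alive now.
    """
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

blinker = {(0, 0), (0, 1), (0, 2)}   # a period-2 oscillator
```

Note that every cell applies the same rule using only its neighbours, which is why CA updates parallelise so naturally, and why sequential control flow is awkward to express.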
Let's go back a bit further in time: Church proposed a new way to build a universal computer, the lambda calculus. There are three basic parts. First is a variable, like x. Second is a function definition (abstraction), written λx.M; the identity function, for example, is λx.x. Third is application, applying a function to an argument, written (λx.M) N. This way one can reuse functions to build complex operations, effectively modeling a TM as a series of operations that any human could do with pen and paper. This is a foundational element of functional programming, and it has become a respected cultural symbol, see SICP.
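The three parts map directly onto Python's own lambdas, which is enough to sketch Church's trick of building arithmetic out of nothing but functions. This is the standard Church-numeral encoding; the decoder `to_int` is my own helper for inspection, not part of the calculus.

```python
# Church numerals: the number n is "apply f n times".
zero = lambda f: lambda x: x                       # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))    # one more application
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by applying 'add one' n times to 0."""
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
```

Here variables, abstraction, and application are literally all there is: no numbers, no state, yet `add(two)(two)` behaves like 2 + 2.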

LISP is a raw implementation of Church's idea: it provides guarantees that any code written will execute on a TM. It is an interpreted language and thus requires a Virtual Machine (VM) to run, a pioneering design at the time. Not just that, a program can change itself at runtime, making it a universal TM. This was a time when programs were precise instructions on how the machine operated, which meant a program could be tuned for the machine, juicing out sweet ROI. For this LISPers were trolled even back then. There were, however, physical machines called Lisp Machines that were designed to run LISP.
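The "a program can change itself" property comes from Lisp representing code as ordinary lists. A minimal sketch of that idea in Python: programs are nested lists, and an evaluator walks them. The name `sexp_eval` and this tiny dialect are my own illustration, not any real Lisp implementation.

```python
def sexp_eval(expr, env):
    """Evaluate a Lisp-style expression written as nested Python lists."""
    if isinstance(expr, str):          # a variable: look it up
        return env[expr]
    if not isinstance(expr, list):     # a literal number
        return expr
    op, *args = expr
    if op == "if":                     # special form: evaluate branches lazily
        cond, then, alt = args
        return sexp_eval(then if sexp_eval(cond, env) else alt, env)
    vals = [sexp_eval(a, env) for a in args]
    return env[op](*vals)

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b,
       ">": lambda a, b: a > b, "x": 10}
program = ["if", [">", "x", 5], ["*", "x", 2], 0]   # (if (> x 5) (* x 2) 0)
```

Because `program` is just a list, running code can build, inspect, or rewrite other code before handing it to the evaluator; that homoiconicity is what made Lisp programs able to modify themselves.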
At this point we have defined most of what is useful: arithmetic and decision flows. Computers have been getting exponentially faster, and that forced a certain standardisation. All languages have a data model, even if the cost is hidden from the user. As we started hitting the limits of improvements in transistor size, we pulled another lever: parallel computation instead of sequential computation. The control / master role remains with CPUs, while the new "GPUs" are tasked with the things they are good at, like matrix multiplication (matmul) (also read: BLAS). On top of this we built neural networks, more on that later.
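Matmul is worth seeing in its naive form, because it shows why GPUs help. This is a plain triple-loop sketch, not a BLAS-grade implementation: the point is that every output element is an independent dot product.

```python
def matmul(A, B):
    """Multiply matrices given as lists of rows: C[i][j] = sum_p A[i][p] * B[p][j]."""
    n, k, m = len(A), len(B), len(B[0])
    assert len(A[0]) == k, "inner dimensions must match"
    # Each C[i][j] depends only on row i of A and column j of B,
    # so all n*m dot products can run in parallel -- this independence
    # is exactly what GPUs (and tuned BLAS kernels) exploit.
    return [[sum(A[i][p] * B[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]
```

A CPU walks these loops one element at a time; a GPU assigns each output element (or tile of elements) to its own thread.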
Computers are not a modern phenomenon; this is actually a long arc. A very long time ago we built analog-mechanical computers like the one discovered at Antikythera, though an argument can be made that something of equal if not higher caliber existed in Asia as well. Using this machine, one could calculate planetary positions months in advance. Its purpose is debated; one side mentions tracking of the Panhellenic games.
Neural networks (NN) are a type of machine learning algorithm inspired by the animal brain. A network takes an input signal, e.g. camera pixels, and passes it through a series of pattern filters to find the most important signal in the noise. This signal is then used to make the best prediction. As of this writing, NNs are the hype: used to predict molecule shapes, understand the Earth, and kill the internet. We do not understand how they work, and that's half the magic; the other half is how adaptable they are to different tasks. A convolutional network can be used to predict image labels and can also be used to play superhuman Go, merely by changing the learning algorithm.
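The "learn from errors instead of being programmed" idea fits in a few lines. Here is a sketch of a single perceptron, the ancestor of the networks above, trained on the OR function; the function name and hyperparameters are my own choices for illustration.

```python
def train_perceptron(data, epochs=10, lr=1.0):
    """Rosenblatt-style perceptron: nudge weights whenever a prediction is wrong."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred              # the "learning" signal
            w[0] += lr * err * x[0]     # move the decision boundary
            w[1] += lr * err * x[1]     # toward reducing the error
            b += lr * err
    return w, b

# Truth table for OR: the rules are never written down, only examples.
OR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
```

No "if-then" rules for OR appear anywhere in the code; the behaviour is fit from data, which is exactly the empirical flavour (and the opacity) discussed below.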
They are, however, empirically designed systems, not rules-based like Software 1.0. Just like the early days of computing, LLMs are being built before the theory has matured. AI is getting more powerful / hyped / insane by the day. We will see more computational tasks moving into the Software 2.0 world (NN). What we see is that CPUs are still being used for all the decision making, but their inputs are much more refined by the increased perception of NNs on GPUs. There has been a lot of work on whether neural networks can be turned into full end-to-end computers.
Quantum computers, optical computers, there are so many. Here's a table going over some computer types:
| Computer Type | One-Liner Description | History & Origin | Best Task (vs. CPU) | Worst Task (vs. CPU) | Example Physical Implementation |
|---|---|---|---|---|---|
| Cellular Automata | A grid of parallel cells evolving based on local neighbor rules. | Conceptualized by Ulam and von Neumann (1940s) for self-replication studies. | Fluid Dynamics: Simulating complex physical flows in parallel. | Serial Logic: Executing sequential "if-then" code. | CAM-8 Machine |
| Lambda Calculus | A mathematical system computing via function application without using "states." | Developed by Alonzo Church (1930s) to explore the foundations of logic. | Formal Verification: Mathematically proving code is bug-free. | Memory Management: High overhead for basic arithmetic. | Lisp Machines |
| Quantum Computer | Uses qubits and superposition to solve probabilistic problems at scale. | Proposed by Richard Feynman (1982) to simulate quantum physics. | Prime Factorization: Breaking encryption that takes CPUs eons. | Daily Tasks: Simple word processing or web browsing. | IBM Quantum System One |
| Analog Computer | Computes using continuous physical quantities like voltage or rotation. | Ancient (Antikythera, c. 100 BC) to mid-20th-century electronic differential analyzers. | Real-time Control: Instant processing of electrical signals. | High Precision: Accuracy is limited by physical tolerances. | THAT Analog Computer |
| Mechanical Computer | Uses physical gears and levers to perform discrete logical operations. | Babbage's Difference Engine (1820s) and Pascal's calculators (1640s). | Hostile Environments: Computing in extreme heat or radiation. | Speed: Physically limited by the inertia of moving parts. | Babbage Difference Engine |
| Neuromorphic | Hardware mimicking the brain's "spike" based neural architecture. | Coined by Carver Mead (1980s) to create brain-inspired silicon. | Edge AI: Efficient, low-power pattern recognition. | Exact Arithmetic: Too "fuzzy" for precise accounting. | Intel Loihi |
| Optical Computer | Uses pulses of light (photons) instead of electrons for processing. | Research surged in the 1980s at Bell Labs to beat the "heat wall." | Matrix Math: Instant multiplication for AI workloads via light. | Miniaturization: Hard to shrink to the size of a smartphone. | Lightmatter Envoy |
| Dataflow Computer | Instructions execute only when their data inputs arrive, not by a clock. | Developed by Jack Dennis at MIT (1970s) to bypass the CPU bottleneck. | Big Data Pipelines: Maximizing throughput for massive data streams. | Dynamic Branching: Handling code with unpredictable "if" statements. | SambaNova DataScale |
| Neural Network | A system of interconnected nodes that "computes" by weighting signals to find patterns in data. | Proposed by McCulloch & Pitts (1943); first physical model, the Mark I Perceptron, built in 1958. LLMs are a type of task-specific network. | Pattern Recognition: Identifying faces or voices in "noisy," unstructured data. | Exact Logic: Performing exact arithmetic without error. | Apple Neural Engine (ANE) |
One more cycle of Exploitation → Exploration is complete. We have built a rough understanding of many ways of computing things; now various forms will be combined, each put to work on the problems it suits.
I have covered more in # Neural Automata.
Important Links
- Church 1941
The opinions expressed herein are solely those of the author in their individual capacity and do not necessarily reflect the official policy or position of any current or former employer, client, or affiliated organization.