The model of a brain as a processor doesn’t stand up to scrutiny. People who use that model often qualify it by saying the brain isn’t like a computer with a disk and memory, yet insist it is still computational.
Unfortunately, you can’t call the brain a processor without importing concepts like binary-encoded storage, instructions, programs, data, and many more. If you know what a computation is, how can a brain be a computational machine and yet somehow not a computer?
It’s a strange twist that those who promote the computational model end up with the computer analogy inside the brain whether they want it or not! Our brains work on analogies.
A Processing Example
One of the scientists I have followed closely is Steven Pinker. He has written excellent descriptions of today’s problems in emulating human language, problems that remain unsolved outside of Patom theory, my brain theory. In his book The Language Instinct he gives an example, critiqued below, in which neural gates compute word forms:

In the image, when you ‘activate’ the ‘be’ dictionary neuron, it sends a signal to the ‘is’ word (the only one shown in this simplified diagram), which in turn can activate the sound /iz/. Because ‘is’ is a 3rd person, singular, present tense, habitual aspect word, the logic gate that would normally activate the ‘-s’ suffix (as in stop → stops, walk → walks, eat → eats) is inhibited by the output of the word ‘is’.
I call this an example of a processing model: it computes. The model was constrained to make some points, and so it is more specific than a human brain, but it is still a reasonable design using logic gates to demonstrate the challenge of converting a meaning into a word with the correct morphology. It also requires many connections to be wired up, which makes it less likely to have arisen from brain evolution, since different languages would need different wiring.
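To make the processing model concrete, here is a minimal sketch in code of the gate behaviour described above: an irregular form like ‘is’ is stored whole and inhibits the regular ‘-s’ suffix gate. All names and the feature encoding are my own illustrative assumptions, not taken from Pinker’s diagram.

```python
def generate_form(stem, person, number, tense):
    """Sketch of a logic-gate word-form generator (illustrative only)."""
    # Stored whole-word exception: 'is' for 3rd person singular present 'be'
    irregulars = {("be", 3, "sg", "present"): "is"}
    key = (stem, person, number, tense)
    if key in irregulars:
        # The irregular output inhibits the gate that would add '-s'
        return irregulars[key]
    if person == 3 and number == "sg" and tense == "present":
        return stem + "s"  # the regular '-s' suffix gate fires
    return stem

print(generate_form("be", 3, "sg", "present"))    # → is
print(generate_form("walk", 3, "sg", "present"))  # → walks
```

Note how even this toy version needs explicit conditional logic — the “wiring” the article argues against.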
A Pattern Matching Example
Instead of assuming that brains compute, consider the following alternative: a model that uses only patterns, and is therefore far more constrained.
In Patom theory, a brain is made of patterns that are sets, lists or combinations. Any pattern can only be stored once, but can then be associated as many times as needed.

In Fig 2, a pattern-based system is contrasted with the processing model. A set of dictionary elements (meanings) is associated with word forms, and word forms are in turn associated with a set of features. Selecting 3rd person singular with present tense and habitual aspect pops out ‘is’ when ‘be’ is selected in the dictionary, and ‘walks’ when ‘walk’ is selected. The ‘be’ generation (suppletion) supplies a whole word, while a different approach (inflection) uses a list of letters to generate the walk + s result.
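The Fig 2 contrast can be sketched in code as pure association lookup: a meaning plus a feature set selects a stored form, with no conditional logic at all. This is my reading of the description, with illustrative names; suppletion stores a whole word, inflection stores a stem plus suffix list.

```python
# Stored associations: (meaning, feature set) -> stored form pattern
forms = {
    ("be", frozenset({"3rd", "sg", "present", "habitual"})): ["is"],           # suppletion: whole word
    ("walk", frozenset({"3rd", "sg", "present", "habitual"})): ["walk", "s"],  # inflection: stem + suffix
}

def select_form(meaning, features):
    # The features accompanying the selected meaning 'pop out' the matching form
    return "".join(forms[(meaning, frozenset(features))])

print(select_form("be", {"3rd", "sg", "present", "habitual"}))    # → is
print(select_form("walk", {"3rd", "sg", "present", "habitual"}))  # → walks
```

The difference from the gate model is that nothing here computes a rule: the correct form is simply the pattern associated with the selected meaning and features.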
In practice, once words are learned through one or more experiences, they can be used in new situations as parts of phrases.
Brains recognize sensory inputs. In the case of word generation, a person’s brain generates a word by selecting its meaning. That meaning is accompanied by the rest of the situation, such as who is being talked about (person and singular/plural), when it happened (tense/modality), what aspect of the temporal dimension is being discussed, and so on.
Sentence selection is simply a larger pattern. This model is used today in the language generator of my company, Pat Inc.
There is a split between meaning and form. Form can be in any modality: visual (reading/writing), auditory (speaking/hearing), and even touch with Braille. Similarly, form can come from any language, as long as the phrases are constrained properly as well.
Linguistic theories such as Role and Reference Grammar (RRG) provide detailed elements that help identify which meanings create which forms.
Discussion
Processors run programs of some nature.
In a brain, the program could be innate or learned. If it were innate, we would be born with a language and knowledge presumably handed down through our genetics. And yet we all learn to speak our local language, not some universal one. Since innate languages would also consume DNA capacity, let’s assume for the moment that languages are learned from words and meaning!
A better model than processing is pattern matching. No program is needed beyond very specific patterns: hierarchical, bidirectionally linked sets and lists that allow each pattern to be stored only once. If a pattern is found embedded in another, decomposition again creates indivisible pattern atoms. In this model, recognized patterns connect to others through experience, and the bidirectional nature of those links enables both inputs and outputs.
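A minimal sketch of a store-once, bidirectionally linked pattern store, under my reading of the description above; the class and method names are my own, not part of Patom theory.

```python
class PatternStore:
    """Illustrative store: each pattern is stored once, links are bidirectional."""

    def __init__(self):
        self.patterns = {}  # pattern -> node id (each pattern stored exactly once)
        self.links = {}     # node id -> set of associated node ids

    def store(self, pattern):
        # Tuples stand in for lists (sequences); frozensets stand in for sets
        if pattern not in self.patterns:
            node = len(self.patterns)
            self.patterns[pattern] = node
            self.links[node] = set()
        return self.patterns[pattern]

    def associate(self, a, b):
        na, nb = self.store(a), self.store(b)
        self.links[na].add(nb)  # the link works in both directions,
        self.links[nb].add(na)  # supporting both recognition and generation

store = PatternStore()
store.associate(("w", "a", "l", "k"), frozenset({"verb"}))
# Storing the same pattern again returns the same node — stored only once
assert store.store(("w", "a", "l", "k")) == store.store(("w", "a", "l", "k"))
```

Because associations run both ways, the same structure can map form to meaning (recognition) and meaning to form (generation), as the text describes.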
The representations of both models in the diagrams above look like they need labelling to work. In the pattern model, all the parts map to sets or lists. Considering just today’s example: a word is a sequence of sets (each letter has more than one representation), person is a set of values depending on the language, and singular/plural is a set as well. The dictionary meaning maps to a set of word forms, some of which are made of a list of letters.
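The idea of a word as a sequence of sets can be sketched directly: each position in the word is a set of acceptable letter representations (upper and lower case here, as a stand-in for the multiple representations mentioned above). This is an illustrative assumption, not the article’s own notation.

```python
# A word pattern as a sequence of sets: each position accepts any
# representation of that letter (here, just upper/lower case)
word_pattern = [{"W", "w"}, {"A", "a"}, {"L", "l"}, {"K", "k"}]

def matches(text, pattern):
    """Pattern match: every character must belong to its position's set."""
    return len(text) == len(pattern) and all(
        ch in letter_set for ch, letter_set in zip(text, pattern)
    )

print(matches("Walk", word_pattern))  # → True
print(matches("wale", word_pattern))  # → False
```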
Conclusion
Today’s newsletter started out with the idea that brain models need brain science, not computer science. Patom theory is theoretical neuroscience based on what needs to happen to produce the amazing human capabilities we see.
The challenge with emulating brains with computer models is the “baggage” that computer scientists expect computers to include — encoded data and computations using hardware like ALUs (arithmetic logic units).
In future newsletters I will look at the bigger picture and explain the specifics of how patterns are represented in a brain, but even the simple word selection covered today, from form to meaning and back, illustrates the power of patterns.
Do you want to read more?
If you want to read about the application of brain science to the problems of AI, my latest book, “How to Solve AI with Our Brain: The Final Frontier in Science”, explains the facets of brain science we can apply and why the best analogy today is the brain as a pattern-matcher. The book link is here on Amazon in the US.
In the cover design below, you can see the human brain incorporating its senses, such as the eyes. Its capabilities are being applied to a human-like robot that is being improved with brain science towards full human emulation in looks and capability.