Dogs that know English
Animals are much better than today's AI at key tasks

Those who follow my work may have seen me highlight how animals demonstrate the goals of robotics. They don't fall over when catching a ball, despite cluttered visual backgrounds, uneven footing, and competing auditory signals. What is holding back today's expensive and power-hungry AI from replicating an animal's skills?
It's the use of a computational paradigm that is unlike a brain's.
Patom theory was developed to align with the human brain, and it therefore addresses the myriad known problems, going back seventy years, that inhibit today's AI.
In today's picture, these dogs are dying to eat. They had been waiting impatiently for 15 minutes! I was getting whimpering noises and urgent nudging from both. And yet they have been trained to wait until they hear the sound "OK" before eating. So here, sitting with excitement at the prospect of the meal in front of them, they wait.
Here’s the YouTube video if you want to see their obedience at mealtime (click).
"Chicken!" - no movement. "Aardvark!" - nothing. "Oklahoma!" - an initial move, but they stop because the full sound was not recognized. "OK!" and they move to their bowls and begin eating. They are a little tentative to move when I play this game, as if their brains are double-checking that they heard correctly, compared with when I simply say "OK" right away.
Why Is This Important?
Dogs are somewhat close to humans in evolutionary terms as we are both mammals, but they aren’t primates and their brains are relatively small compared with ours.
And yet they perceive the distinction between "Oklahoma" and "OK," as can be seen in their initial motion and then their stopping when the correct sound isn't matched. Dogs can learn to store the sounds of language and connect them to motor control, somewhat like the semiotic model (the science of signs, popular in the late 1800s) that contributes to human language. Notice that no machine learning was used to train the dogs, just a daily sit command followed by their eat command (i.e., "OK"). The dogs are quite capable of eating regardless of where the bowls are placed, so their execution is quite general.
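The dogs' behavior can be sketched as a tiny sign-to-action matcher. This is only an illustration (not Patom theory's implementation, and the names `EAT_SIGNAL` and `react` are invented for the sketch): a heard word is compared against a stored sound pattern, a shared prefix triggers a tentative start, but the action only completes on a full match.

```python
# Illustrative sketch of full-pattern matching for a sign-to-action link.
# A prefix match ("Oklahoma" starts like "OK") causes a tentative start,
# but only a complete match of the stored pattern releases the action.

EAT_SIGNAL = "ok"  # the stored sound pattern linked to the eat action

def react(heard: str) -> str:
    """Return the response to a heard word."""
    word = heard.lower().strip("!")
    if word == EAT_SIGNAL:
        return "eat"               # full pattern matched: act
    if word.startswith(EAT_SIGNAL):
        return "start, then stop"  # prefix matched, but the rest didn't
    return "no movement"           # no match at all

for w in ["Chicken!", "Aardvark!", "Oklahoma!", "OK!"]:
    print(w, "->", react(w))
```

Run as written, this prints "no movement" for "Chicken!" and "Aardvark!", "start, then stop" for "Oklahoma!", and "eat" for "OK!", mirroring the mealtime game above.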
Their movement skills, together with this example, explain why replicating the dog's brain would help us improve robotics. Dogs rarely fall over when moving, even when jumping on their two back legs. They recognize the same word across variations in pronunciation, and they distinguish different words with similar sounds.
While pure research is often expensive and may lead nowhere, Patom theory is different: not only does it better explain the observations of cognitive scientists, it has also been very successful at emulating human language in conjunction with the linguistic model Role and Reference Grammar, which represents the biggest breakthrough in language science in the 20th and 21st centuries!
Conclusion
The cognitive sciences need some kind of brain model that explains what's going on better than today's model, often stated as, "A brain is like a computer," because a computer doesn't deal with sensory interaction, motor control, or integrations such as initiating eating only on command.
Patom theory starts by explaining the deficits we see as a result of brain damage. It's a new path forward for brain emulation, just waiting to be exploited.
Now is the time to scale up the applications of brain science to enable better tools to work with us. The last interface we'll ever need for devices is human language, and a model that improves on the computer by emulating biological capabilities is a key place to start.
Do you want to read more?
If you want to read about the application of brain science to the problems of AI, you can read my latest book, "How to Solve AI with Our Brain: The Final Frontier in Science," which explains the facets of brain science we can apply and why the best analogy today is the brain as a pattern-matcher. The book link is here on Amazon in the US.
In the cover design below, you can see the human brain incorporating its senses, such as the eyes. Its capabilities are being applied to a human-like robot, which brain science is improving towards full human emulation in looks and capability.



