An interesting article indeed, but by no stretch of the imagination can it be called a “simple read”.
I assume using that word was somewhat tongue-in-cheek, but I often fail to pick up on sarcasm, and hence I gave it a quick read. I admit I didn't read all of it; I skimmed over the quantum physics and mathematics content as I didn't need a refresher course.
But firstly, thanks for sharing it; it did stimulate my mind a little more than normal this morning.
Most importantly, I do hope Brainchip comes up with a method of modelling cortical columns, as that WILL take use cases to the next level and will, as @FactFinder most correctly stated, “blow the semiconductors market socks off”.
My following response is merely an alternate view formed by a casual read of the referenced material. I don't pretend to know much about the subject matter, nor do I intend to discredit the author, or anyone, for sharing it. Well, maybe the author(s) of the scientific paper, slightly, for the way the information is presented.
And I apologise in advance if I am completely off the mark. As I said, it was not a "simple read" and I only skimmed it.
IMHO, the referenced article is quite a difficult read, not from a content stance, but rather because of the way it is presented. I feel it really doesn't need to delve into quantum physics and electron spin theory, nor a lot of the mathematics. The metastable dynamics involved in quantum physics are a completely different, and far more complex, concept; I doubt they are at play in brain cells. For one thing, brain cells are far too big for quantum effects to be at play.
Even 28 nm electronic circuits may be too large, though they are orders of magnitude closer than a brain cell to the scale at which quantum effects have an impact.
Using quantum physics to explain the concept is probably using too big a sledgehammer, and it only alienates many potential readers. But that may be the entire point of the author(s) as well!
I saw a lot of the content as the "I'm so smart" padding that normally fills scientific papers. I've read quite a few in the past. Unfortunately, they are often judged as less useful if they use an economy of words, or if they are written so a lay person can understand them. "Judged by the weight of the paper" is a term that is often used.
Academia certainly has a weird way of creating a protectionist bubble that excludes a lot of open discussion. Somewhat like job protection by obfuscation in some ways. Just my opinion!
A simple read would have been a postulation of the role of cortical columns (or actually metastable dynamics, which is the true subject of the article) in preconception and thinking.
It seems they could physically measure brain activity before an event, and it is true that this activity was fleeting and quickly disappeared, so calling it metastable dynamics is probably appropriate. My objection is to the postulate that no impetus preceded this activity.
IMHO the author is excluding the holistic view of the body and the fact that the human brain does not control ALL aspects of the body. Hormones play a VERY significant role, and maybe, in only looking at brain cell stimuli, the study has not appropriately considered hormones as the cause of these seemingly mystical responses in the supposed absence of stimuli.
The human body can also be trained on timing, like getting hungry at three fairly standard times of the day. There's no physiological reason for having three meals, yet most people do.
Then there are many synaesthesias at play, where one sense seems to autonomously elicit a response in another. You've all heard the saying "We eat with our eyes", for instance.
Now on to thinking. Do we truly understand what thinking is? I won't go down that rat hole here, but needless to say, we certainly don't know how the brain achieves the act of thinking. And then there's thinking about thinking, or meta-thinking, etc.
Most certainly, every response needs a stimulus. Nothing happens spontaneously in the human brain. Even an unperceived stimulus is still a stimulus. We just have to look harder to notice it.