Hi D&S, I haven't posted in a while as the forum gets rather toxic at times.
These podcasts at CES, however, do clearly tell me one thing: the team at Brainchip have listened to feedback regarding communication. These have been great little snippets of updates with a few partners. My favourites so far are Onsemi and Inivation.
I sincerely hope Nandan will be bringing Farshad's (Inivation) comments to Sean Hehir's attention ASAP. I emailed Tony to also follow up with Sean, to hopefully benefit from Farshad's valuable critique.
"I have been involved with Brainchip for less than a year, or maybe a year so far...I've been interacting with the company in terms of learning their capabilities"
"it is very promising, I also think their are certain areas that you're not advertising as best as you could"
"Kind of underestimating yourselves"
I really hope the dot joiners can calm down a little and not dismiss posters just because someone doesn't jump to the premature conclusions that some draw here without facts or announcements. I like this place and hate blatant downramping as much as the next well-intentioned shareholder. But we are still VERY early on in partnerships with some companies, from the sounds of it. And we clearly aren't advertising our full capabilities well enough just yet (one partner's opinion, but they are in a better position to comment on this than any of us). Hopefully this is a very fruitful year for us, and it sounds like a little traction may be headed our way.
Good to hear from you.
As usual, that introduction presages a tongue-lashing/ear-bashing.
This technology is evolving at such a rapid rate that we've met ourselves coming back.
Akida 1 is brilliant, groundbreaking technology - a digital spiking neural network system-on-chip with special sauce (N-of-M coding), using spikes (originally 1-bit) and only activating when an "event" (a change in the input data) occurs.
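To make the "only activate on an event" idea concrete, here is a toy sketch in plain Python. This is purely illustrative of the general event-based principle - it is not Akida's actual design, and the threshold value is an arbitrary assumption for the example.

```python
# Toy illustration of event-based processing: only inputs whose value changes
# beyond a threshold generate "events"; unchanged inputs are simply skipped.
# NOT Akida's implementation -- just the general principle.

def to_events(prev_frame, frame, threshold=10):
    """Return (index, delta) pairs for inputs that changed enough to 'spike'."""
    return [(i, cur - old)
            for i, (old, cur) in enumerate(zip(prev_frame, frame))
            if abs(cur - old) >= threshold]

prev = [0, 0, 100, 100, 50]
cur  = [0, 90, 100, 100, 55]   # only pixel 1 changed significantly
print(to_events(prev, cur))    # -> [(1, 90)]: one event instead of 5 pixels
```

A static scene produces almost no events, so almost no computation - which is where the power savings over frame-by-frame CNN inference come from.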
Previously, the only practicable way of identifying objects in a field of view was with a software program implementing convolutional neural networks (CNNs), either on a CPU (quite slow and power hungry) or a GPU (faster, but proportionally more power hungry).
Mostly in academic circles, attempts have been made to implement analog (ReRAM/memristor) CNNs in silicon, but manufacturing-process and temperature variability have limited the accuracy of such devices.
PvdM's genius was in realizing that digital NNs could avoid these inherent inaccuracies, and in recognizing the genius of Simon Thorpe's N-of-M coding and, just as importantly, how it could be applied in silicon.
This gave rise to Akida 1, a technology years ahead of the competition. The design is highly flexible, allowing for a few nodes (4*NPEs per node), up to a couple of hundred nodes. Akida 1 was applicable to any sensor (video, audio, taste, smell, vibration ... ) so that, with the appropriate model library, it could classify any input signal. Of course, you don't find model libraries just lying around, but there are open source versions available, and Akida 1 has the ability to learn new classes to add to the model on chip. In addition, BRN has developed its own in-house model library "zoo".
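On the "learn new classes on chip" point above, a toy sketch can show why adding a class doesn't require retraining a whole network. This is a simple nearest-prototype classifier of my own construction, purely to illustrate the concept - it is not BrainChip's actual one-shot learning mechanism, and all names and values here are made up for the example.

```python
# Toy illustration of adding a new class from a single example, without
# retraining. NOT Akida's mechanism -- just the general one-shot idea.

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

class PrototypeClassifier:
    def __init__(self):
        self.prototypes = {}                 # label -> stored feature vector

    def learn(self, label, features):
        """One example is enough to add a brand-new class."""
        self.prototypes[label] = features

    def classify(self, features):
        """Return the label whose prototype is nearest to the input."""
        return min(self.prototypes,
                   key=lambda lbl: distance(self.prototypes[lbl], features))

clf = PrototypeClassifier()
clf.learn("cat", [1.0, 0.0])
clf.learn("dog", [0.0, 1.0])
print(clf.classify([0.9, 0.2]))  # -> cat
clf.learn("bird", [0.5, 0.9])    # new class added on the fly
print(clf.classify([0.5, 0.8]))  # -> bird
```

The attraction for edge devices is the same as described above: new classes can be added in the field, after deployment, without a cloud round-trip.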
Akida 1 went through a couple of iterations based on customer feedback. Initially it had 1-bit weights and activations - lightning fast and anorexically power-sipping.
Customers were prepared to forego portions of these advantages for greater accuracy and somewhat higher power consumption.
So Akida 1 switched to 4-bit weights/activations.
In addition, customers required the ability to use their existing CNN models, so Akida 1 includes CNN2SNN conversion capability.
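The 1-bit vs 4-bit trade-off described above can be sketched with a minimal uniform quantizer. This is a generic textbook-style illustration, not BrainChip's quantization scheme; the weight range and values are assumptions for the example.

```python
# Generic sketch of the precision/cost trade-off: fewer bits per weight means
# smaller, faster, cheaper hardware, but coarser representable values.
# NOT BrainChip's quantization scheme -- a standard uniform quantizer.

def quantize(w, bits, w_max=1.0):
    """Quantize w in [-w_max, w_max] to a signed uniform n-bit grid."""
    if bits == 1:                            # binary weights: keep only the sign
        return w_max if w >= 0 else -w_max
    levels = 2 ** (bits - 1) - 1             # e.g. 7 positive steps for 4 bits
    q = max(-levels, min(levels, round(w / w_max * levels)))
    return q / levels * w_max

w = 0.37
print(quantize(w, 1))   # 1-bit keeps only the sign -> 1.0
print(quantize(w, 4))   # 4-bit grid lands on 3/7, much closer to 0.37
```

More bits track the original weight more faithfully (better accuracy), at the cost of more silicon activity per operation - exactly the trade customers asked for.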
BRN has been involved with a few leading-edge sensor makers for some years - e.g., Valeo for lidar and Prophesee for DVS event cameras, both of which are a natural fit for Akida's SNN capabilities. But there is an infinite number of applications for which Akida 1 is the best solution, except for Akida 2.
Our switch to IP-only made a rod for our own back, excluding all but those who had the odd $50 million lying around to invest in developing new chips. Not that this is an unworkable business model - ARM does nicely out of it, although it would like a larger slice of the pie.
So now we come to the stage where we are prepared to sell devices including Akida 1, not so much as an income generating enterprise as a capability demonstration of Akida 1 ...
... and, to top it off, Akida 2 blows the socks off Akida 1.
How many EAPs are primed to explode? Well, in all the excitement, I've kinda forgotten myself, so go ahead, punk, make my day!