Maybe / maybe not posted prev but I've not seen this one yet.
I have a couple of "huhs" from it and further context would be great.
Given the ASIC comment, MegaChips makes sense.
rethinkresearch.biz
9 March 2023
BrainChip’s second-gen neuromorphic silicon gunning for the big boys
By Alex Davies
Neuromorphic chip designer BrainChip has unveiled the second generation of its Akida platform. Taking inspiration from the human brain, these chips promise to slash the compute costs burdening enterprises and operators, by moving those workloads into bespoke edge silicon and out of expensive centralized cloud environments.
Nandan Nayampally, Chief Marketing Officer at BrainChip and formerly a VP at Arm, pointed to the scale of this pricing problem, in conversation with Faultline this week. “It took $6 million to train one model, with ChatGPT. There is something like $50 billion in productivity losses from unplanned downtime in manufacturing alone, and $1.1 trillion in losses from people missing work due to preventable chronic illness in the US. With 1 TB of data per connected car per day too, there are big opportunities for AI, but big challenges.”
The industry is quite used to huge numbers getting hurled around. For context, the World Bank estimated global GDP at around $96.5 trillion in 2021, so even a fraction of a percentage point increase in productivity could have a profound impact.
With billions more connected devices coming online, all that data has to be harvested and fed into the right processing systems, in order to achieve the $15 trillion global benefit of AI that Nayampally anticipates.
“We’re aiming to reduce the amount of compute done in the cloud, by doing intelligent compute on the device. This saves on cloud training, storage, and compute costs, and allows for real-time responses in the devices. This includes portable devices, and there is also the data security angle to consider too. So, that’s the background,” said Nayampally.
This brings us to the Akida platform: the designs that BrainChip licenses to companies that want to add AI-powered silicon functions to their own chips.
BrainChip’s business model is quite similar to Arm’s, in that it does not manufacture chips itself. BrainChip also provides much of the software and code necessary for developers to put these designs to work.
There are three tiers to the platform. The energy-sipping Akida-E series is the smallest, and intended for use in sensors. The Akida-S is a general-purpose design, intended for use in microcontrollers, while the Akida-P is the maximum performance variant, which is most applicable to the Faultline ecosystem.
“These are effectively ASICs,” said Nayampally, of the chips that include the BrainChip IP cores, “and this is where building ASICs is very cost-effective, as you are replacing a thousand-dollar GPU or card with a sub-$10 part.”
These Akida-P designs are intended for advanced speech recognition, object detection and classification, video object detection and tracking, and vision transformer networks.
The second generation adds support for a Vision Transformer (ViT) core, which can be combined with the new Temporal Event-based Neural Network (TENN) cores. These claim advantages over RNNs (Recurrent Neural Networks) and CNNs (Convolutional Neural Networks) through their ability to process both spatial and temporal data.
A company called Prophesee has built an object detection system for road vehicles that claims 30% better precision using the new technology.
More importantly, this uses 50x fewer parameters (inputs) and 30x fewer operations (processing cycles) than the old system. This should mean lower power consumption and faster response times, as well as a cheaper design, as the system requires less on-device memory and storage.
Another benchmark, a video object recognition task, produced a system that could process 1382x512p video at 30 fps using less than 75 mW of power, in a 16 nm silicon design. This needed 50x fewer parameters and 5x fewer operations than the ResNet50 reference design.
Notably, the Akida platform claims to improve with time, thanks to the on-device learning capabilities of the silicon.
“This is the most important thing. We don’t train models on the devices. We extract the model’s features, and then port the object classes. You can design for today and then be able to upgrade to more complex models in time,” said Nayampally.
There are no public second-generation customers yet, but Renesas and MegaChips have both licensed the first-generation designs. For video capture, production, and initial distribution, there are plenty of applications for this neuromorphic silicon, which should mature in time.
However, “I think the pay TV operators are at the very tail end of the technical spectrum, and I say that in a nice way,” said Nayampally. “They are a very margin-driven business, so they work on chips they can build with – the lowest common denominators. So, they will require a silicon partner to promote these new capabilities, and we are working on it.”
As BrainChip’s targets include many Internet of Things (IoT) applications, it was only a matter of time until the smart home opportunity came up. Asked whether operators were showing interest in the Matter-spurred second wave of smart home, Nayampally said “we’re not seeing it yet, to be brutally honest.”
“It’s mostly because we’re not in the phase where we are pushing a proven solution. Most want a complete reference design for their stack, and as the IP model, in our first-generation designs, we were essentially proving the silicon. With the second, we can start to build these reference designs, and help them scale it.”
Founded in Australia in 2004, BrainChip went public in 2015. Its share price has been quite volatile, leaping from AUD $0.41 in October 2021 to AUD $1.76 in January 2022, before declining steadily to around AUD $0.56 today. In 2016 it acquired SpikeNet Technologies, a French firm specializing in computer vision, for around $1.45 million (USD).