If it is still being designed, that leaves scope for the inclusion of recently developed IP which is processor-agnostic and significantly more efficient than Akida 1.
https://brainchip.com/brainchip-mak...ilable-to-advance-state-of-edge-ai-solutions/
BrainChip Makes Second-Generation Akida Platform Available to Advance State of Edge AI Solutions
Laguna Hills, Calif. – OCTOBER 3, 2023 –
BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY),
the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI IP, today announced the Early Access availability of its second-generation Akida™ IP solution for use in a wide range of applications across the Smart Home, Smart City, Industrial and Automotive markets.
The 2nd generation Akida platform is designed for extremely energy-efficient processing of complex neural network models on Edge devices. Support for 8-bit weights, activations, and long-range skip connections expands the range of models that can be accelerated entirely in Akida’s hardware. With the exponential increase in Cloud compute requirements for AI, compounded by the growth of Generative AI, the move towards Hybrid AI solutions demands more capable and efficient compute at the Edge.
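For anyone unsure what a “long-range skip connection” looks like in practice, here is a minimal Keras sketch of my own (illustration only, not BrainChip code); whether MetaTF would quantize this particular graph to 8-bit and map it fully onto Akida 2 hardware is an assumption on my part.

```python
# Minimal sketch (my own, not BrainChip's): a tiny Keras model in which the output
# of an early layer is re-joined several layers later, i.e. a long-range skip connection.
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(64, 64, 3))
x = layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
skip = x                                    # feature map kept aside for the skip path
x = layers.Conv2D(16, 3, padding="same", activation="relu")(x)
x = layers.Conv2D(16, 3, padding="same", activation="relu")(x)
x = layers.Add()([x, skip])                 # long-range skip connection re-joins here
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(10)(x)

model = tf.keras.Model(inputs, outputs)
model.summary()                             # plain float model; 8-bit quantization would be a separate step
```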
The introduction of Temporal Event Based Neural Nets (TENNs) revolutionizes advanced sequential processing for multi-dimensional streaming and time-series data. This can radically reduce model size and improve performance and efficiency without compromising accuracy, which is an important consideration for Edge devices. Reducing the model size and improving compute density by an order of magnitude allows more capable AI use cases to run closer to the sensor, in a more secure fashion.
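The release doesn’t spell out how TENNs work internally, so purely as a stand-in, here is a generic causal temporal convolution over a time series in Keras. It only illustrates the kind of “temporal convolution over streaming data” being referred to; it is not BrainChip’s TENN formulation.

```python
# Generic causal temporal convolution sketch (a stand-in, NOT the TENN algorithm).
# Each output at time t depends only on samples at time t and earlier, which is
# what makes it usable on streaming / time-series data.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

seq = tf.keras.Sequential([
    tf.keras.Input(shape=(None, 1)),                    # (time steps, channels), any length
    layers.Conv1D(8, kernel_size=5, padding="causal",
                  dilation_rate=1, activation="relu"),
    layers.Conv1D(8, kernel_size=5, padding="causal",
                  dilation_rate=2, activation="relu"),  # dilation widens the temporal receptive field
    layers.Conv1D(1, kernel_size=1),
])

x = np.random.randn(1, 128, 1).astype("float32")        # one stream of 128 samples
y = seq(x)
print(y.shape)                                          # (1, 128, 1): one output per time step
```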
Combining this with hardware acceleration of Vision Transformer (ViT) models, which boosts vision performance, unlocks the potential to create game-changing Edge devices that can process advanced vision and video applications within milliwatts, and audio and other similar applications within microwatts, at the sensor.
“Generative AI and LLMs at the Edge are key to intelligent situational awareness in verticals from manufacturing to healthcare to defense,” said Jean-Luc Chatelain, MD of Verax Capital Advisors and former MD and Global CTO at Accenture Applied Intelligence. “Disruptive innovations like BrainChip’s TENNs, which support Vision Transformers built on the foundation of neuromorphic principles, can deliver compelling solutions in ultra-low-power, small-form-factor devices at the Edge without compromising accuracy.”
The second-generation MetaTF software enables developers to evaluate the capabilities of Akida, and to optimize and customize their designs, giving them a head start on architecting their System on a Chip (SoC) along with their software solutions. In addition to TensorFlow, MetaTF will support ONNX, which allows for greater compatibility across various frameworks, including PyTorch.
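The release doesn’t quote MetaTF’s ONNX API, but the PyTorch side of that workflow is standard, so here is a hedged sketch that exports a toy PyTorch model to ONNX; the assumption is that the resulting .onnx file is what MetaTF’s ONNX path would then consume. The model and file names below are hypothetical.

```python
# Hedged sketch: export a toy PyTorch model to ONNX using the standard
# torch.onnx.export call. The MetaTF import step is deliberately not shown,
# because its exact API isn't quoted in the release.
import torch
import torch.nn as nn

class TinyNet(nn.Module):                   # hypothetical toy model, just for the export
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3, padding=1)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(8, 10)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        x = self.pool(x).flatten(1)
        return self.fc(x)

model = TinyNet().eval()
dummy = torch.randn(1, 3, 64, 64)
torch.onnx.export(model, dummy, "tinynet.onnx",
                  input_names=["input"], output_names=["logits"],
                  opset_version=13)
print("wrote tinynet.onnx")                 # this file would then be fed to an ONNX-capable MetaTF
```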
“Multimodal Edge AI is an irreversible trend, and it is intensifying the demands on intelligent compute required from Edge devices,” said Zach Shelby, CEO Edge Impulse. “We’re excited that the 2nd Generation Akida addresses the critical elements of performance, efficiency, accuracy, and reliability needed to accelerate this transition. Most importantly, BrainChip has been a strategic partner that has collaborated closely with Edge Impulse to make their solutions easy to integrate, develop and deploy to the market.”
Akida processors power the next generation of Edge AI devices that enable growth in intelligence in industrial, home, automotive and other IoT environments. Akida’s fully digital, customizable, event-based neural processing solution is ideal for advanced intelligent sensing, medical monitoring and prediction, high-end video-object detection and more. Along with its extreme efficiency, accuracy and performance, Akida also has a unique ability to securely learn on-device without the need for cloud retraining.
“This is a significant step in BrainChip’s vision to bring unprecedented AI processing power to Edge devices, untethered from the cloud,” said Sean Hehir, CEO, BrainChip. “With Akida’s 2nd generation in advanced engagements with target customers, and MetaTF enabling early evaluation for a broader market, we are excited to accelerate the market towards the promise of Edge AI.”
So MetaTF 2 has been available since before October 2023; in fact, it was announced in March 2023:
https://brainchip.com/brainchip-introduces-second-generation-akida-platform/
BrainChip Introduces Second-Generation Akida Platform
Introduces Vision Transformers and Spatial-Temporal Convolution for radically fast, hyper-efficient and secure Edge AIoT products, untethered from the cloud
Laguna Hills, Calif. – March 6, 2023 –
BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY),
the world’s first commercial producer of ultra-low power, fully digital, neuromorphic AI IP, today announced the second generation of its Akida™ platform, which drives extremely efficient and intelligent edge devices for the Artificial Intelligence of Things (AIoT) solutions and services market, expected to exceed $1T by 2030. This hyper-efficient yet powerful neural processing system, architected for embedded Edge AI applications, now adds efficient 8-bit processing alongside advanced capabilities such as time-domain convolutions and vision transformer acceleration, for an unprecedented level of performance in sub-watt devices, taking them from perception towards cognition.
...
“Our customers wanted us to enable expanded predictive intelligence, target tracking, object detection, scene segmentation, and advanced vision capabilities. This new generation of Akida allows designers and developers to do things that were not possible before in a low-power edge device,” said Sean Hehir, BrainChip CEO. “By inferring and learning from raw sensor data, removing the need for digital signal pre-processing, we take a substantial step toward providing a cloudless Edge AI experience.”
Our customers "requested" that Akida 2 incorporate developments which expand predictive intelligence, target tracking, object detection, scene segmentation, and advanced vision capabilities.
We have a couple of rolled-gold automotive customers, Valeo and MB, who could benefit from these capabilities, and MB is gearing up to announce a processor development partner in the near future, so fingers crossed ...
PS: Has anyone mentioned
WO2023250092A1, “Method and System for Processing Event-Based Data in Event-Based Spatiotemporal Neural Networks” (2022-06-22), and
WO2023250093A1, “Method and System for Implementing Temporal Convolution in Spatiotemporal Neural Network” (2022-06-22)?
PPS: These patents date from June 2022, so they would have been in development for at least six months before that, meaning BRN has been working on these developments since at least the start of 2022, which is when the EQXX emerged. So, even while the EQXX was still a chrysalis, BRN was already plotting Akida 2. This ties in with MB's statement, when the EQXX was disclosed, that neuromorphic computing was still in its infancy. MB knew that big things were in the pipeline, even though Akida 1 already delivered remarkable performance compared with the available alternatives. They let slip about Akida 1, but were they already aware of Akida 2?