Hey Mate,
BrainChip's Akida Edge Box, Built in Collaboration with VVDN Technologies, Boosts Vision-Based AI Workloads
BrainChip, in collaboration with VVDN Technologies, has previewed its industry-disrupting Akida Edge Box that is based on neuromorphic AI IP technology. (www.iotevolutionworld.com)
Standby!
If you wait just a little longer it may have "Akida Inside". Well, one can only hope!
Yeah, same here. I think most people on here feel the same. Diogenese, I love your input and high-end posts, but sometimes...
I CARNT SPEEKA YOOR LANGUWICH !
Have a great day, all. By the way, my top-up got hit yesterday. I hope my wife doesn't read TSE, because she wants a new fridge and washing machine; I guess that needs to wait now. Wish me luck, or say a prayer.
6:20 mark:
"Later this year we will decide on a partner to further develop our AI technology." - Markus Schäfer (CTO, Mercedes-Benz)
They've already announced NVIDIA, so it can't be them he's referring to. Sooooo....
But the marriage hasn't been consummated yet... so not official... We're already a partner, Bravo!
Wow! This is a must watch video!
23:10 - Jensen Huang
"We are going to build this fleet like no other fleet has been built.
We're going to put into and adapt into this fleet of Mercedes-Benz the most powerful computer in the world at its time, and it will have plenty of headroom.
It's still being designed.
Hundreds of engineers are currently designing that processor and it will go into a computer that will be outfitted into the entire fleet.
At no time in history has so much computing power been put into a car."
PS: How in the heck are they going to manage all of that massive compute power but still allow the vehicles to maintain high-level efficiency?
PPS: I wonder if they'll look towards the solution they discovered with the Vision EQXX and its incredible efficiency?
If it is still being designed, that leaves scope for inclusion of recently developed IP which is processor agnostic and significantly more efficient than Akida 1.
https://brainchip.com/brainchip-mak...ilable-to-advance-state-of-edge-ai-solutions/
BrainChip Makes Second-Generation Akida Platform Available to Advance State of Edge AI Solutions
Laguna Hills, Calif. – OCTOBER 3, 2023 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI IP, today announced the Early Access availability of its second-generation Akida™ IP solution for use in a wide range of applications across the Smart Home, Smart City, Industrial and Automotive markets.
The 2nd generation Akida platform is designed for extremely energy-efficient processing of complex neural network models on Edge devices. The support for 8-bit weights, activations, and long-range skip connections expands the reach of models that are accelerated completely in Akida’s hardware. With the exponential increase in Cloud compute requirements for AI compounded by the growth of Generative AI, the move towards Hybrid AI solutions needs more capable and efficient compute at the Edge.
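(An aside for anyone wondering what a "long-range skip connection" actually is: below is a minimal, hypothetical Keras sketch, plain TensorFlow rather than the MetaTF API, with made-up layer sizes. The press release's point is that models with this kind of topology, quantized to 8-bit weights and activations, can run entirely in Akida's hardware.)
```python
# Illustrative sketch only (plain TensorFlow/Keras, NOT the MetaTF API):
# a "long-range skip connection" is a feature map saved early in the
# network and added back several layers later. Layer sizes are arbitrary.
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(64, 64, 3))
x = layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
skip = x                                   # saved here ...
x = layers.Conv2D(16, 3, padding="same", activation="relu")(x)
x = layers.Conv2D(16, 3, padding="same", activation="relu")(x)
x = layers.Add()([x, skip])                # ... and re-joined here, skipping two layers
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
model.summary()
```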
The introduction of Temporal Event Based Neural Nets (TENNs) revolutionizes the advanced sequential processing for multi-dimensional streaming and time-series data. This can radically reduce model size and improve performance, as well as efficiency, without compromising accuracy, which is an important consideration for Edge devices. Reducing the model size and improving compute density by order of magnitude, allows more capable AI use cases to compute closer to sensor in a more secure fashion.
Combining this with hardware acceleration of Vision Transformers (ViT) models, which boosts vision performance, unlocks the potential to create game-changing Edge devices that can process advanced vision and video applications in milliwatts or audio and other similar applications in microwatts at the sensor.
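(Similarly, to make "sequential processing for streaming and time-series data" a bit more concrete, here is a tiny illustrative snippet, again in plain TensorFlow rather than BrainChip's TENN implementation; the shapes and channel counts are invented. The key idea shown is a causal convolution, where each output step depends only on present and past samples.)
```python
# Illustrative sketch only (plain TensorFlow/Keras, NOT BrainChip's TENNs):
# a causal 1-D convolution over a multi-channel time series. "Causal" padding
# means each output step sees only present and past inputs, so the filter can
# run incrementally over streaming data. Shapes are arbitrary.
import tensorflow as tf
from tensorflow.keras import layers

stream = tf.random.normal((1, 200, 4))      # 200 timesteps of a 4-channel sensor stream

causal_conv = layers.Conv1D(filters=8, kernel_size=5, padding="causal",
                            dilation_rate=2, activation="relu")
features = causal_conv(stream)
print(features.shape)                        # (1, 200, 8): one feature vector per timestep
```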
“Generative AI and LLMs at the Edge are key to intelligent situational awareness in verticals from manufacturing to healthcare to defense,” said Jean-Luc Chatelain, MD of Verax Capital Advisors and former MD and Global CTO at Accenture Applied Intelligence. “Disruptive innovation like BrainChip TENNs support Vision Transformers built on the foundation of neuromorphic principles, can deliver compelling solutions in ultra-low power, small form factor devices at the Edge, without compromising accuracy.”
The second generation MetaTF software enables developers to evaluate the capabilities of Akida, optimize, and customize their designs to get a head start on architecting their System on a Chip (SoC) along with their software solutions. In addition to TensorFlow, MetaTF will support ONNX which allows for greater compatibility across various frameworks including PyTorch.
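(The ONNX mention matters because ONNX is a framework-neutral exchange format: a model built in PyTorch, for instance, can be exported to a .onnx file and then picked up by other tooling. Below is a minimal, hypothetical export sketch; the toy model and file name are invented for illustration, and I'm not claiming this is MetaTF's own workflow.)
```python
# Illustrative sketch only: exporting a small PyTorch model to ONNX.
# The resulting .onnx file is a framework-neutral description of the network
# that other toolchains can import. Model and file name are made up.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(8 * 32 * 32, 10),
)
model.eval()

dummy_input = torch.randn(1, 3, 32, 32)      # example input the exporter traces with
torch.onnx.export(model, dummy_input, "toy_model.onnx", opset_version=13)
```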
“Multimodal Edge AI is an irreversible trend, and it is intensifying the demands on intelligent compute required from Edge devices,” said Zach Shelby, CEO Edge Impulse. “We’re excited that the 2nd Generation Akida addresses the critical elements of performance, efficiency, accuracy, and reliability needed to accelerate this transition. Most importantly, BrainChip has been a strategic partner that has collaborated closely with Edge Impulse to make their solutions easy to integrate, develop and deploy to the market.”
Akida processors power the next generation of Edge AI devices that enable growth in intelligence in industrial, home, automotive and other IoT environments. Akida’s fully digital, customizable, event-based neural processing solution is ideal for advanced intelligent sensing, medical monitoring and prediction, high-end video-object detection and more. Along with its extreme efficiency, accuracy and performance, Akida also has a unique ability to securely learn on-device without the need for cloud retraining.
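(On the "learn on-device without cloud retraining" claim, here is a purely conceptual sketch of what that kind of capability can look like in general, not how Akida implements it: a frozen feature extractor stays fixed, and a tiny classifier is fitted locally from a handful of new examples. Everything here, names and shapes included, is invented for illustration.)
```python
# Conceptual sketch only: on-device learning as a nearest-class-mean classifier
# over embeddings from a frozen feature extractor. This is NOT Akida's
# mechanism, just an illustration of adding a new class without retraining
# in the cloud. All names and shapes are made up.
import numpy as np

class OnDeviceClassifier:
    def __init__(self):
        self.prototypes = {}              # label -> mean embedding

    def learn(self, label, embeddings):
        # Few-shot "training": store the mean embedding per class.
        self.prototypes[label] = np.mean(embeddings, axis=0)

    def predict(self, embedding):
        # Classify by nearest stored prototype (Euclidean distance).
        return min(self.prototypes,
                   key=lambda lbl: np.linalg.norm(embedding - self.prototypes[lbl]))

# Usage: in practice the embeddings would come from a frozen feature extractor.
clf = OnDeviceClassifier()
clf.learn("keyword_hello", np.random.rand(5, 64))   # 5 example embeddings
clf.learn("keyword_stop",  np.random.rand(5, 64))
print(clf.predict(np.random.rand(64)))
```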
“This is a significant step in BrainChip’s vision to bring unprecedented AI processing power to Edge devices, untethered from the cloud,” said Sean Hehir, CEO, BrainChip. “With Akida’s 2nd generation in advanced engagements with target customers, and MetaTF enabling early evaluation for a broader market, we are excited to accelerate the market towards the promise of Edge AI”.
So MetaTF 2 has been available since before October 2023. In fact, it was announced in March 2023:
https://brainchip.com/brainchip-introduces-second-generation-akida-platform/
BrainChip Introduces Second-Generation Akida Platform
Introduces Vision Transformers and Spatial-Temporal Convolution for radically fast, hyper-efficient and secure Edge AIoT products, untethered from the cloud
Laguna Hills, Calif. – March 6, 2023 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, neuromorphic AI IP, today announced the second generation of its Akida™ platform that drives extremely efficient and intelligent edge devices for the Artificial Intelligence of Things (AIoT) solutions and services market that is expected to be $1T+ by 2030. This hyper-efficient yet powerful neural processing system, architected for embedded Edge AI applications, now adds efficient 8-bit processing to go with advanced capabilities such as time domain convolutions and vision transformer acceleration, for an unprecedented level of performance in sub-watt devices, taking them from perception towards cognition.
...
“Our customers wanted us to enable expanded predictive intelligence, target tracking, object detection, scene segmentation, and advanced vision capabilities. This new generation of Akida allows designers and developers to do things that were not possible before in a low-power edge device,” said Sean Hehir, BrainChip CEO. “By inferring and learning from raw sensor data, removing the need for digital signal pre-processing, we take a substantial step toward providing a cloudless Edge AI experience.”
Our customers "requested" that Akida 2 incorporate developments which expand predictive intelligence, target tracking, object detection, scene segmentation, and advanced vision capabilities.
We have a couple of rolled gold automotive customers, Valeo and MB, who could benefit from these capabilities, and MB is gearing up to announce a processor development partner in the near future, so fingers crossed ...
PS: Has anyone mentioned
WO2023250092A1 - Method and System for Processing Event-Based Data in Event-Based Spatiotemporal Neural Networks (filed 2022-06-22), and
WO2023250093A1 - Method and System for Implementing Temporal Convolution in Spatiotemporal Neural Network (filed 2022-06-22)?
PPS: These patents date from June 2022, so they were in development for at least 6 months before that, meaning that BRN has been working on these developments since at least the start of 2022, which is when the EQXX emerged. So, even while the EQXX was still a chrysalis, BRN was already plotting Akida 2. This ties in with the MB statement when EQXX was disclosed that neuromorphic computing was still in its infancy. MB knew that big things were in the pipeline, even though Akida 1 gave remarkable performance above the available alternatives. They let slip about Akida 1, but were they already aware of Akida 2?
"Forget the buyers... I can't believe there are people willing to sell their shares... especially at these prices. I wonder who could possibly be willing to sell?"Just imagine if all of the buys were real, we would be back in the 20's if they just bit the bullet and changed their orders to at market. Oh well, time to stop dreaming and get back to work.
"Aaaahhhh... makes sense"Painchip is the real name of the company. SP reflecting the companys performance which is basicly trash
"Forget the buyers... I can't believe there are people willing to sell their shares... especially at these prices. I wonder who could possibly be willing to sell?"My thoughts scrolling on TSE today looking for some insights/news about Brainchip
"Forget the buyers... I can't believe there are people willing to sell their shares... especially at these prices. I wonder who could possibly be willing to sell?"
"Aaaahhhh... makes sense"
OK, so I said, "I'm VERY disappointed that the share price has gone nowhere after CES," to which Wiltzy said, "Where was it supposed to go? LOL. Perhaps I am the naive one, and you will enlighten me as to where it should have gone and why."
I think what we need to remember, and it is how I look at it anyway, is that whilst it is a trade show, CES is the "Consumer Electronics Show".
This says to me that most of the actual "exhibitors" are showing off their end-user products: things consumers can buy now or in the near term.
The suites appear to be the real trade show, where OEMs, tech companies, suppliers, component manufacturers etc. have the nuts-and-bolts displays and demos to attract new clients, partners etc., rather than investors.
Obviously, investment may pop up as a side discussion; however, I would think that would be the domain of broker investor meetings or direct one-on-ones between interested companies.
Individual investors would be the ones milling around the exhibition, wide-eyed at new tech products, and not really exposed to the suites anyway.
So for me, I wasn't expecting the SP to change much unless an actual Ann came out, but I do expect that these suite meetings have encouraged other companies to explore Akida and engage with us, over and above the few reveals we did get.
And that those reveals are closer to firm commitments than not.