You are completely wrong IMO. BrainChip's patents are everything, so important to our future worth, and not only from the point of defending but as an attraction in a takeover. It seems to me that you don't really get how unique BrainChip's technology is and how important it could likely become. Let's for a moment think of the time when the limits and sustainability of current AI solutions are inevitably exhausted. It will happen; it can't go on along the present path that Nvidia is squeezing out at the moment. The time will come when BRN's tech is suddenly the tech of choice (they are setting themselves up for it), and then we will get very big very quickly, or someone will want control of our patents so badly they will be willing to pay an incredible premium for them. AIMO.

Patents are worthless if they're defending a concept that's not commercially viable. If BRN doesn't start making serious revenue soon, then it won't have the resources to defend against patent infringements anyway. This is simply a red herring designed to distract would-be investors from what is actually going on.
'The cornerstone of BRN value' — what a bloody joke. The actual cornerstone of BRN value is IP licences, royalties, and big tier-1 companies shouting from the rooftops about how amazing Akida is. None of which appears to be happening.
I have the current version. It requires internet connectivity, is power-inefficient, doesn't have specific facial recognition, and requires fairly regular recharging, 1–2 times monthly. This brand does doorbells too, and very interesting is the 1 TOPS NPU.
My spidey sense is tingling.
Bring it on shorters!
I have no doubt in my mind this has Brainchip Akida S.
I’d like someone to prove me wrong.
Not financial advice.
Hadn't personally seen this project over at Edge Impulse before.
Google search link said late May 23 but who knows.
Have taken all the code sections etc out but full read at the link. Pretty cool running with FOMO.
" A new 16-core Neural Engine is capable of nearly 17 trillion operations per second, enabling even faster machine learning computations for features like Live Voicemail transcriptions in iOS 17 and third-party app experiences — all while protecting critical privacy and security features using the Secure Enclave."iPhone 15 USES akida and so does the new Apple Watch, and AirPods Pro.
Apple revenue is what Sean has been referring to cryptically.
Mark my words.
Will go out on a limb to say the iPhone 16 will have Akida 2.0 P in it.
Monday 18/09/23 is the date we’ve been waiting for!!! I feel it.
Time to shift gears and SEND ITTTTTT !!!
Go Brainchip!!!!
Not financial advice.
Apple debuts iPhone 15 and iPhone 15 Plus
Apple today announced iPhone 15 and iPhone 15 Plus, featuring an industry-first colour-infused back glass with a stunning, textured matte finish.
www.apple.com
We have demonstrated that we can create a ‘smart biosensor’ that could learn to detect a disease, such as cystic fibrosis, without using a computer or software.
The real novelty is that the chips can learn and adapt to their application and environment.
17 trillion operations per second..... Pfffft.

"A new 16-core Neural Engine is capable of nearly 17 trillion operations per second, enabling even faster machine learning computations for features like Live Voicemail transcriptions in iOS 17 and third-party app experiences — all while protecting critical privacy and security features using the Secure Enclave."
Wouldn't it be nice......
Here is another article about Naveen Kumar’s traffic monitoring project, published on Wevolver today - they really seem to love Akida!
Real-Time Traffic Monitoring with Neuromorphic Computing
Article #5 of Spotlight on Innovations in Edge Computing and Machine Learning: A computer vision project that monitors vehicle traffic in real-time using video inferencing performed on the Brainchip Akida Development Kit.
www.wevolver.com
David Tischler
14 Sep, 2023
- Artificial Intelligence
- Edge Processors
- Embedded Machine Learning
- Neural Network
- Transportation
This article is part of Spotlight on Innovations in Edge Computing and Machine Learning. The series features some unique projects from around the world that leverage edge computing and machine learning, showcasing the ways these technological advancements are driving growth, efficiency, and innovation across various domains.
This series is made possible through the sponsorship of Edge Impulse, a leader in providing the platform for building smarter, connected solutions with edge computing and machine learning.
In the ever-evolving landscape of urban planning and development, the significance of efficient real-time traffic monitoring cannot be overstated. Traditional systems, while functional, often fall short when high-performance data processing is required within a tight power budget. Enter neuromorphic computing—a technology inspired by the neural structure of the brain, aiming to combine efficiency with computational power. This article delves into an interesting computer vision project that monitors vehicle traffic using this paradigm.
Utilizing aerial camera feeds, the project can detect moving vehicles with exceptional precision, making it a game-changer for city planners and governments aiming to optimize urban mobility. The key lies in the advanced neuromorphic processor that serves as the project's backbone. This processor is not just about low power consumption—it also boasts high-speed inference capabilities, making it ideal for real-time video inferencing tasks.
But the journey doesn't end at hardware selection. This article covers the full spectrum of the project, from setting up the optimal development environment and data collection methods to model training and deployment strategies. It offers a deep dive into how neuromorphic computing can be applied in real-world scenarios, shedding light on the processes of data acquisition, labeling, model training, and final deployment. As we navigate through the complexities of urban challenges, such insights pave the way for smarter, more efficient solutions in traffic monitoring and beyond.
Traffic Monitoring using the Brainchip Akida Neuromorphic Processor
Created By: Naveen Kumar
Public Project Link: https://studio.edgeimpulse.com/public/222419/latest
Overview
A highly efficient computer-vision system that can detect moving vehicles with great accuracy and relative motion, all while consuming minimal power.
By capturing moving vehicle images, aerial cameras can provide information about traffic conditions, which is beneficial for governments and planners to manage traffic and enhance urban mobility. Detecting moving vehicles with low-powered devices is still a challenging task. We are going to tackle this problem using a Brainchip Akida neural network accelerator.
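The project itself uses a trained object-detection model running on the Akida accelerator, but the underlying task — spotting moving vehicles between successive aerial frames — can be illustrated with a plain frame-differencing sketch. Everything below (the frame sizes, the threshold value, and the `moving_regions` helper) is a hypothetical illustration, not the project's actual code:

```python
import numpy as np

def moving_regions(prev_frame, curr_frame, threshold=25):
    """Return a boolean mask of pixels that changed between two
    grayscale frames -- a crude stand-in for motion detection."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

# Two synthetic 8-bit grayscale "aerial frames": a bright 2x2 'vehicle'
# patch moves two pixels to the right between frames.
prev = np.zeros((10, 10), dtype=np.uint8)
curr = np.zeros((10, 10), dtype=np.uint8)
prev[4:6, 2:4] = 200
curr[4:6, 4:6] = 200

mask = moving_regions(prev, curr)
print(mask.sum())  # 8 changed pixels: 4 vacated + 4 newly occupied
```

A neural approach like the one in this project replaces the fragile fixed threshold with a learned detector, which is where a low-power inference accelerator earns its keep.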
Hardware Selection
In this project, we'll utilize BrainChip’s Akida Development Kit. BrainChip's neuromorphic processor IP uses event-based technology for increased energy efficiency. It allows incremental learning and high-speed inference for various applications, including convolutional neural networks, with exceptional performance and low power consumption.
The kit consists of an Akida PCIe board, a Raspberry Pi Compute Module 4 with Wi-Fi and 8 GB RAM, and a Raspberry Pi Compute Module 4 I/O Board. The disassembled kit is shown below.

[Image: Hardware, unassembled]

The Akida PCIe board connects to the Raspberry Pi Compute Module 4 I/O Board through the PCIe Gen 2 x1 socket available onboard.

[Image: Hardware close-up]
Setting up the Development Environment
(…)
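The setup steps are elided above. For orientation only, a typical environment for this kit is built from BrainChip's MetaTF tooling and the Edge Impulse CLI; treat the exact repository name, script name, and package names below as assumptions to verify against the current BrainChip and Edge Impulse documentation:

```shell
# Akida PCIe kernel driver (repository and install script names assumed)
git clone https://github.com/Brainchip-Inc/akida_dw_edma
cd akida_dw_edma && sudo ./install.sh && cd ..

# MetaTF: BrainChip's Python runtime and tools for Akida
python3 -m pip install akida

# Edge Impulse CLI for Linux boards (requires Node.js)
npm install -g edge-impulse-linux

# Sanity check: list Akida devices visible to the runtime
python3 -c "import akida; print(akida.devices())"
```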
Conclusion
This project highlights the impressive abilities of the Akida PCIe board. Boasting low power consumption, it could be used as a highly effective device for real-time object detection in various industries for numerous use cases.
This article is based on: Traffic Monitoring using the Brainchip Akida Neuromorphic Processor - Expert Projects, a blog by Edge Impulse. It has been edited by the Wevolver team and Electrical Engineer Ravi Y Rao. It's the fifth article from the Spotlight on Innovations in Edge Computing and Machine Learning series.
The first article introduced the series and explored the implementation of a Predictive Maintenance system using a Nordic Thingy:91.
The second article described the implementation of voice control for making appliances smarter using a Nordic Thingy:53.
The third article dives deep into the application of EdgeAI for surface crack detection, showcasing its transformative role in modern industrial predictive maintenance systems.
The fourth article explains the integration of neuromorphic computing for real-time traffic monitoring, offering a technical blueprint for revolutionizing urban management.
About the sponsor: Edge Impulse
Edge Impulse is the leading development platform for embedded machine learning, used by over 1,000 enterprises across 200,000 ML projects worldwide. We are on a mission to enable the ultimate development experience for machine learning on embedded devices for sensors, audio, and computer vision, at scale.
From getting started in under five minutes to MLOps in production, we enable highly optimized ML deployable to a wide range of hardware from MCUs to CPUs, to custom AI accelerators. With Edge Impulse, developers, engineers, and domain experts solve real problems using machine learning in embedded solutions, speeding up development time from years to weeks. We specialize in industrial and professional applications including predictive maintenance, anomaly detection, human health, wearables, and more.
More by David Tischler
David is a Senior Developer Program Manager helping to take care of the Edge Impulse community of developers, and is a fan of computing on small, low power devices. He's also an extreme recycler, so use caution if trying to throw away recyclable objects if he's around.
" A new 16-core Neural Engine is capable of nearly 17 trillion operations per second, enabling even faster machine learning computations for features like Live Voicemail transcriptions in iOS 17 and third-party app experiences — all while protecting critical privacy and security features using the Secure Enclave."
Wouldn't it be nice......![]()
Plus the quadruple witching expiry. It's going to be a wild auction close today.