Dozzaman1977
Regular
Is that a new whitepaper released "Learning how to learn neuromorphic AI at the edge" on BRN website or did i miss its release in the past.
Yessss
Sorry, did I forget to mention someone? **Insert dad joke**
Geez....her name might be hard to forget.
Good morning all,
inspired by a teardown article yesterday, I had a closer look at the specs of the GoPro HERO 10 camera.
GoPro HERO 10 Teardown (gethypoxic.com)
A teardown of the GoPro HERO10, codenamed Kong, including what changed in hardware from the HERO9. The ARMv8 64-bit GP2 processor, also known as a Socionext M20V, is the only major change. The whole codebase was upgraded to 64-bit, so be wary of buying this right out of the gate.
One thing I found interesting is that they removed the "DSP Group DBMD4" that was in the GoPro 9 camera.
The DBMD4 is an "Ultra Low Power Always-On Voice Activation for Any Device" audio DSP pre-processor.
A customer commented on the GoPro forums that this was due to the camera often accidentally powering on, low user interest, and the chip shortage.
GoPro Support (community.gopro.com)
That led me to the DSP Group YouTube channel, and I would like to hear from someone with more expertise about the keyword-spotting benchmarks between AKIDA and DSP Group:
By the way, DSP Group has been acquired by Synaptics, which was discussed here as a competitor earlier this year.
In my opinion brainchip AKIDA IP isn't in this version but will be in the new MB OS system in 2024.

Which will really be in 2023.
Though we know the ASX is not very proactive and they are barely covering off on their basic duties as it is. A lot of companies are getting away with a lot of different things unfortunately.

I am not sure if this was for me or Cardpro, but either way, Trade Secret or not, it makes clear that the ASX can decide that it should be disclosed.
When a company makes an announcement it takes the risk that the ASX will intervene and require the disclosure of information that it wants, needs or has agreed to keep confidential. See 3.1A.2 of Guidance Note 8
Of course in the absence of an announcement if the ASX becomes aware from any source that the company may be sitting on material information it can intervene and force disclosure via the ASX.
My opinion only DYOR
FF
AKIDA BALLISTA
This is a great read. Thanks for posting @Dozzaman1977
In my opinion brainchip AKIDA IP isn't in this version but will be in the new MB OS system in 2024.
And it stinks. I have seen this type of enforcement across so many areas. When enforcement organisations are failing, they try to disguise their failure by selecting high-visibility targets that get them publicity.
Hi Dozz,
Yes, it's new. Great find! I checked the document properties to find the creation date, and it was created on 24 Aug 2022.
See below
View attachment 15388
Tell me this isn't Valeo's next generation LiDAR: "Enabling advanced LiDAR with AKIDA Neuromorphic AI inference at the edge"
Q: BrainChip customers are already deploying smarter endpoints with independent learning and inference capabilities, faster response times, and a lower power budget. Can you give us some real-world examples of how AKIDA is revolutionizing AI inference at the edge?
Peter: Automotive is a good place to start. Using AKIDA-powered sensors and accelerators, automotive companies can now design lighter, faster, and more energy efficient in-cabin systems that enable advanced driver verification and customization, sophisticated voice control technology, and next-level gaze estimation and emotion classification capabilities.
Revolutionizing AI inference at the edge

In addition to redefining the automotive in-cabin experience, AKIDA is helping enable new computer vision and LiDAR systems to detect vehicles, pedestrians, bicyclists, street signs, and objects with incredibly high levels of precision. We’re looking forward to seeing how these fast and energy-efficient ADAS systems help automotive companies accelerate the rollout of increasingly advanced assisted driving capabilities. In the future, we’d also like to power self-driving cars and trucks. But we don’t want to program these vehicles. To achieve true autonomy, cars and trucks must independently learn how to drive in different environmental and topographical conditions such as icy mountain roads with limited visibility, crowded streets, and fast-moving highways.
Enabling advanced LiDAR with AKIDA Neuromorphic AI inference at the edge

With AKIDA, these driving skills can be easily copied and adapted by millions of self-driving vehicles. This is a particularly important point. AKIDA driving updates will be based on real-world knowledge and skills other cars have learned on the road. We want to avoid boilerplate firmware updates pushed out by engineering teams sitting in cubes. A programmed car lacking advanced learning and inferential capabilities can’t anticipate, understand, or react to new driving scenarios. That’s why BrainChip’s AKIDA focuses on efficiently learning, inferring, and adapting new skills.
Q: Aside from automotive, what are some other multimodal edge use cases AKIDA enables?
Peter: Smart homes, automated factories and warehouses, vibration monitoring and analysis of industrial equipment, as well as advanced speech and facial recognition applications. AKIDA is also accelerating the design of robots using sophisticated sensors to see, hear, smell, touch, and even taste.
Yes, brand new, and I have reported a fault with the security access via my iPhone which prevents me from advancing to the document. The thing I find funny is that Ken the Robot with AKIDA can pass all these "select a boat or bus or plane" tests they throw up to eliminate robots from accessing. I have also mentioned the humour involved here to the company as well. LOL
You're 100% on the money.
Hi @Slade,
Exactly. Nintendo has only the Switch console in its program right now. An update is coming in 2023; its technical specs were leaked earlier this year, and there is no Akida inside. According to the usually well-informed games scene, a new console will not appear before 2026. And when a new console with new in-game functionality does arrive, it first needs new games that can handle the new features. Games have a very long development cycle, and there is no way to speed that up.
So anything about controller updates is much closer to reality than speculation about a brand-new console containing Akida IP coming soon.
I don't think anyone could say 100% that there won't be Akida inside the 2023 update. I understand what you're saying about the fact that new games may have to be developed to utilize Akida to its fullest capacity, and that the development time-frame for creating new games is particularly long.
But there's no way of knowing that Akida won't be in the upgrade to assist with other things like AI, self-learning features, noise reduction, etc., while new games are being developed. Just a thought?
Cheers TechGirl
Chapter 3 is exciting. It's all been stated before, but hopefully the world is moving in the Akida direction!!!!
Q: With AKIDA’s neuromorphic architecture, BrainChip is enabling the semiconductor industry to untether edge AI from cloud data centers. This is quite timely, because conventional AI silicon and cloud-centric inference models aren’t performing efficiently at the edge, even as the number of edge-enabled IoT devices is expected to hit 7.8 billion by 2030. Can you elaborate on the notion of untethering?
Peter: Increasing internet congestion is increasing latency as more edge devices upload their data. The power consumption and heat production of massively parallel Von Neumann type processors is also increasing linearly with the computing power required by AI applications. That’s why untethering edge AI from the cloud with AKIDA is a critical step to designing faster and more environmentally sustainable endpoints.
Chapter 3: Untethering edge AI from cloud data centers

Differentiating intelligent endpoint requirements

| Data center / server edge | Endpoint (the edge) |
| --- | --- |
| Power intensive | Power efficient |
| High latency | Ultra-low latency |
| Huge memory requirement | Small memory requirement |
| Big data inference | Small data inference, one-shot learning |
| High bandwidth | Low bandwidth |
| Privacy concerns | On-chip, in-device; privacy enabling |
Data centers hosting cloud-based workloads emitted an estimated 600 megatons of greenhouse gases in 2020 alone, more than the entire United Kingdom. Unless something radically changes, data centers will consume over 20% of the world’s energy by 2050! With its on-chip learning and low-power, high-throughput inference capabilities, we believe AKIDA can help reduce data center carbon emissions by 98% by decentralizing AI processing. Intelligently analyzing data on-chip will help put an end to the yottabytes of raw, unprocessed, and mostly irrelevant data sent to cloud data centers by millions of endpoints, solving the impending internet congestion problem.
Using image recognition as an example, we can quantify the power savings enabled by AKIDA’s on-chip capabilities compared to a GPU in today’s data center. Specifically, AKIDA can efficiently analyze and categorize the 1.2 million images of the ImageNet dataset with a minimal power budget of 300 milliwatts. A GPU performing this task consumes up to 300 watts! This huge difference illustrates why simply scaling down conventional AI hardware to meet the unique requirements of edge endpoints is insufficient.
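For a quick sanity check on the scale of that gap, the two power figures quoted above work out to a factor of about a thousand. A minimal back-of-the-envelope sketch (only the 300 mW and 300 W numbers come from the text; the rest is plain unit arithmetic, not a benchmark):

```python
# Power figures as quoted in the whitepaper excerpt above.
akida_power_w = 0.300  # 300 milliwatts for AKIDA on ImageNet classification
gpu_power_w = 300.0    # up to 300 watts for a data-center GPU on the same task

# Ratio of power draw for the same workload.
ratio = gpu_power_w / akida_power_w
print(f"GPU draws about {ratio:.0f}x the power of AKIDA for this task")
```

Running this prints a ratio of about 1000x, which is the three-orders-of-magnitude difference the text is pointing at.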