BRN Discussion Ongoing

AARONASX

Holding onto what I've got
A lot of dot joining has been shared by wonderful people here.

There are too many to share and go over (otherwise I would be here all night), but if anyone wants some light reading:
Go to: https://ppubs.uspto.gov/pubwebapp/
Search: "Brainchip"
NOTE: While I understand not all of these lead back to Brainchip/Akida or anything directly to do with them, "Brainchip" gets a mention, therefore IMO the author(s) are aware of Brainchip.

- 8 of them in the last year alone.
- 7 or so in the last and most recent 4C time frame.
- 3 from Blumind Inc, Giant AI Inc & Digimarc Corporation.
- Many, many more.
- 10 specifically by Brainchip Inc.


And I don't think this site lists all of them globally; extrapolate that information and we have a lot more out there who are aware.


Reactions: 24 users

Pappagallo

Regular
This is a positive view of where Brainchip is going in automotive. Mercedes-Benz sells an average of around 3 million passenger vehicles a year, and Jerome Nadel places 70 AKIDA chips in each, pre-processing sensor inputs before passing them on as metadata. If Blind Freddie's mental arithmetic is correct, that is 270 million AKIDA smart sensors.


"BrainChip Akida


Mercedes-Benz's EQXX concept car, which debuted at CES earlier this year, uses BrainChip's Akida neuromorphic processor for in-vehicle keyword recognition. Billed as "the most efficient car Mercedes has ever made," the car utilizes neuromorphic technology that consumes less power than a deep learning-based keyword spotting system. That's crucial for a car with a range of 620 miles, or 167 miles more than Mercedes' flagship electric car, the EQS.

Mercedes said at the time that BrainChip's solution was five to 10 times more efficient than traditional voice controls at recognizing the wake word "Hey Mercedes."

[Image: Application of SNN in Vehicle Field]


Mercedes said: “Although neuromorphic computing is still in its infancy, systems like these will be on the market within a few years. When applied at scale throughout a vehicle, they have the potential to radically reduce the power consumption needed to run the latest AI technologies.”


BrainChip's CMO Jerome Nadel said: "Mercedes is focused on big issues like battery management and transmission, but every milliwatt counts, and when you think about energy efficiency, even the most basic inference, like spotting a keyword, is important."

A typical car could have as many as 70 different sensors by 2022, Nadel said. For cockpit applications, these sensors can enable face detection, gaze assessment, emotion classification, and more.

He said: “From a system architecture perspective, we can do a 1:1 approach where there is a sensor that will do some preprocessing and then the data will be forwarded. The AI will do inference near the sensor... Instead of the full array of data from the sensors, the inference metadata is passed forward.”

The idea is to minimize the size and complexity of the packets sent to AI accelerators, while reducing latency and minimizing power consumption. Each vehicle will likely have 70 Akida chips or sensors with Akida technology, each of which will be a "low-cost part that you won't even notice," Nadel said, though he noted that attention needs to be paid to the combined BOM cost of all these sensors.


[Image: Application of SNN in Vehicle Field]


BrainChip expects to have its neuromorphic processor next to every sensor on the vehicle

Going forward, Nadel said, neuromorphic processing will also be used in ADAS and autonomous driving systems. This has the potential to reduce the need for other types of power-hungry AI accelerators.

"If every sensor could have Akida configured on one or two nodes, it would do adequate inference, and the data passed would be an order of magnitude less, because that would be inference metadata...that would affect the servers you need," he said. power."


BrainChip's Akida chip accelerates SNNs (spiking neural networks) and CNNs (by converting them to SNNs). It's not tailored to any specific use case or sensor, so it can be paired with visual sensing for face recognition or people detection, or with audio applications like speaker ID. BrainChip has also demonstrated Akida with smell and taste sensors, although it's hard to imagine how these could be used in cars (perhaps to detect air pollution or fuel quality through smell and taste).

Akida is set up to handle native SNNs or deep learning CNNs that have been converted to SNNs. Unlike native spiking networks, converted CNNs retain some information at the spike level, so they may require 2- or 4-bit computation. However, this approach allows the properties of CNNs to be exploited, including their ability to extract features from large datasets. Both types of networks can be updated at the edge using STDP. In the case of Mercedes-Benz, this might mean retraining the network after deployment to learn more or different keywords.

[Image: Application of SNN in Vehicle Field]


According to Autocar, Mercedes-Benz confirmed that "many innovations" from the EQXX concept car, including "specific components and technologies," will be used in the production model. There's no word yet on whether new Mercedes-Benz models will feature artificial brains."
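To put rough numbers on Nadel's claim that passing inference metadata cuts the data by "an order of magnitude", here is a back-of-envelope sketch in Python. Every figure (frame size, frame rate, metadata size) is my own assumption for illustration, not a BrainChip or Mercedes number.

```python
# Back-of-envelope: shipping raw sensor frames vs. inference metadata.
# All numbers are illustrative assumptions, not BrainChip figures.

RAW_FRAME_BYTES = 1280 * 800 * 2   # one 1280x800 frame at 16 bits per pixel
FPS = 30                           # assumed sensor frame rate
METADATA_BYTES = 64                # assumed inference result: label,
                                   # confidence, bounding box, timestamp

raw_bps = RAW_FRAME_BYTES * FPS    # bytes/second for the raw stream
meta_bps = METADATA_BYTES * FPS    # bytes/second for metadata only

print(f"raw stream:      {raw_bps / 1e6:.1f} MB/s")   # ~61.4 MB/s
print(f"metadata stream: {meta_bps / 1e3:.1f} KB/s")  # ~1.9 KB/s
print(f"reduction:       ~{raw_bps // meta_bps}x")    # several orders of magnitude
```

Even with far more conservative assumptions the reduction stays well beyond an order of magnitude, which is what drives Nadel's point about downstream server power.

On the converted-CNN point (2- or 4-bit computation), here is a minimal sketch of what squeezing CNN activations down to a few bits looks like. This is generic low-bit activation quantisation for illustration only, not BrainChip's actual conversion tooling:

```python
import numpy as np

def quantize_activations(x: np.ndarray, bits: int = 4) -> np.ndarray:
    """Uniformly quantize non-negative activations to `bits` of precision.
    Generic illustration; the real CNN-to-SNN conversion is more involved."""
    levels = 2 ** bits - 1                    # e.g. 15 levels at 4 bits
    x = np.clip(x, 0.0, None)                 # ReLU-style non-negative values
    scale = x.max() / levels if x.max() > 0 else 1.0
    return np.round(x / scale) * scale        # snap each value to a level

acts = np.random.rand(4, 4).astype(np.float32)  # stand-in feature map
print(quantize_activations(acts, bits=4))       # 4-bit: magnitude retained
print(quantize_activations(acts, bits=1))       # 1-bit: closer to pure spikes
```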

I do hope you read the whole post and not just the orange text😂🤣😂🤣 - (🐫x1000)

My opinion only DYOR
FF

AKIDA BALLISTA

Blind Freddie is incorrect but 210 million is still pretty huge so the point stands.
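For the record, the arithmetic is 3,000,000 vehicles × 70 chips per vehicle = 210,000,000, so 210 million rather than the 270 million above.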
 
Reactions: 11 users
Fact Finder, quoting his automotive post above in full, replied:
What is interesting about this claim of 70 AKIDA smart sensors per vehicle, in my opinion, are two facts that have recently emerged:

1. RENESAS is taping out its AKIDA MCU for automotive applications - ideally these budget versions of AKIDA will meet the BOM (Bill of Materials) cost constraints to which Jerome Nadel refers in the article.

AND

2. SOCIONEXT is producing AKIDA for ADAS at 5 and 7 nm, which will ideally remove the need for power-hungry accelerators, again as referenced by Jerome Nadel.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 60 users
Blind Freddie is incorrect but 210 million is still pretty huge so the point stands.
Very brave to say Blind Freddie is wrong about anything; it was the dumb typist. 😂🤣😂🤡😂🤣
Hopefully my admission will satisfy his thirst for revenge.
Post has been corrected.🤓
 
Reactions: 19 users

miaeffect

Oat latte lover
Blind Freddie is incorrect but 210 million is still pretty huge so the point stands.
[Calculator screenshot]

You are wrong. It says 270
 
Reactions: 9 users
Hi Quatrojos,

Unfortunately this is not BrainChip. It is a software system developed by the authors for redistributing internet loads to avoid bottlenecks.

Page 10:
"Our study proposes AKIDA, a new architecture that strategically harvests the untapped compute capacity of the SmartNICs to offload transient workload spikes, thereby reducing the SLA violations. Usage of this untapped compute capacity is more favorable than adding and deploying additional servers, as SmartNICs are economically and operationally more desirable. AKIDA is a low-cost and scalable platform that orchestrates seamless offloading of serverless workloads to the SmartNICs at the network edge, eliminating the need for pre-allocating expensive compute power and over-utilization of host servers."

SmartNICs = Smart Network Interface Cards
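For anyone curious what "offloading transient workload spikes" to a SmartNIC looks like mechanically, here is a toy dispatcher in Python based only on my reading of that abstract. The names, threshold and capacities are invented for illustration and are not from the paper:

```python
# Toy sketch of spike offloading: route overflow requests to SmartNIC
# compute instead of over-utilising the host or adding servers.
# Capacities and names are invented, not taken from the AKIDA paper.

HOST_CAPACITY = 100   # requests/sec the host serves within its SLA
NIC_CAPACITY = 25     # assumed spare compute harvested from the SmartNIC

def route(request_rate: int) -> dict:
    """Split an incoming request rate between host, SmartNIC and drops."""
    host_share = min(request_rate, HOST_CAPACITY)
    spike = request_rate - host_share       # transient overflow
    nic_share = min(spike, NIC_CAPACITY)    # offloaded to the SmartNIC
    dropped = spike - nic_share             # would-be SLA violations
    return {"host": host_share, "smartnic": nic_share, "violations": dropped}

for rate in (80, 110, 140):                 # steady load, mild spike, big spike
    print(rate, route(rate))
```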
I notice a patent application mentioned early on, filed while one of the authors was an intern at HPE, I am guessing: 17/222160.

SC
 
Reactions: 5 users

cassip

Regular
Merry Christmas to all🎄
Regards from Germany

The most prominent wish today, once more related to Christmas, is peace, going along with reason and appreciation in everyday interaction. Let us take care that Akida is always used for the safety and comfort of humankind.

Thank you for this communicative place. Support and success have the same first letter (and "second..ed"). Go, BRN-Team!🧡

For those who like contemplative moments at this time of the year, a poem by Rainer Maria Rilke:

Es gibt so wunderweiße Nächte (There are those nights of white and wonder)

There are those nights of white and wonder,
wherein all has a silver shine,
with many a gleaming star - a sign
as though to guide the shepherds yonder
towards a new infant divine.

Widespread, as under diamond layers,
appear both meadows and the sea,
and in the hearts, in dreamlike glee,
a faith ascends that needs no prayers,
performing wonders silently.

 
Reactions: 31 users

Diogenese

Top 20
I notice a patent application mentioned early on, filed while one of the authors was an intern at HPE, I am guessing: 17/222160.

SC
Yes. That is US11436054B1 Directing queries to nodes of a cluster of a container orchestration platform distributed across a host system and a hardware accelerator of the host system
 
Reactions: 5 users

Tothemoon24

Top 20
Yes. That is US11436054B1 Directing queries to nodes of a cluster of a container orchestration platform distributed across a host system and a hardware accelerator of the host system
Dio
Any chance I could lease 5% of your brain?
It would make me 1000% smarter
 
Reactions: 39 users
Anyone looked into a possible BrainChip link to Sony Depthsensing Solutions (SDS)?

I’ve stated previously that I have no idea about the technology but it seems like BrainChip and SDS are a perfect fit. Has the stuff we did/are doing with Magik Eye Inc helped us showcase Akida in Japan and got us a foot in the door with Sony?

“This relationship opens a new and exciting gateway for BrainChip in Japan.”

Happy to be shot down but always willing to risk that possibility in case it leads somewhere


Current Sony Depthsensing Solutions engineering vacancies here

One for you @Fact Finder 👇

BrainChip Inc and Magik Eye Inc. Partner to Combine Best of AI with 3D Sensing for Total 3D Vision Solution​

Companies to jointly pursue market opportunities using BrainChip AI processor and MagikEye 3D image sensor technology

San Francisco, August 17, 2020 – BrainChip Holdings Ltd (ASX: BRN), a leading provider of ultra-low power high performance AI technology, today announced that it has partnered with Magik Eye Inc., developers of revolutionary 3D sensors that change how machines see the world, to market a breakthrough solution for object detection, object classification and gesture recognition based on MagikEye’s Invertible Light™ 3D depth sensing technology and the Akida™ neuromorphic processor. This relationship opens a new and exciting gateway for BrainChip in Japan. MagikEye’s Invertible Light provides the smallest, fastest and most power-efficient 3D depth sensing. This is done using a standard CMOS image sensor and a regular dot projector along with a proprietary and patented technique to produce 3D point cloud data. Coupled with the Akida neuromorphic processor, the companies intend to jointly provide a total 3D vision solution to customers for fast 3D object detection and recognition in applications, including robotics, automotive and emerging consumer products, such as AR/VR and others. The MagikEye technology addresses the need for devices to see clearly and understand the surrounding environment, which is critical for new classes of 3D vision applications. The BrainChip Akida neuromorphic processor efficiently utilizes AI to gather new insights from the 3D data.

“The combination of advanced neuromorphic processing with a low power 3D sensor is the perfect solution for many products in end-point devices,” said Richard Wawrzyniak, Principal Analyst for ASIC & SoC at Semico Research Corp. “3D imaging is attracting great interest in the market today and the BrainChip architecture, which delivers a power-efficient, scalable solution that enables increased functionality with minimal impact on system cost and the power budget, is the right fit for this class of applications. It is not surprising their solution would be paired with MagikEye’s Invertible Light Technology for real-time object detection in all types of applications, where low power and high throughput are valued elements for success. Semico believes this technology partnership is a winning combination for the market,” said Wawrzyniak.

BrainChip’s groundbreaking Akida neuromorphic processor is uniquely suited to provide the analytics necessary for manufacturers to implement a complete 3D vision system. With ultra-low power and the ability to directly process the 3D image generated by the MagikEye sensor, the companies can jointly address gesture recognition in Smart Home applications, such as gaming and other consumer products. Smart Transportation and Smart City applications are additional primary markets for collaboration. This includes Advanced Driver Assistance Systems (ADAS) and Autonomous Vehicles (AV).

“By combining the strengths of BrainChip’s Neural Network capabilities with MagikEye’s Invertible Light, we are excited about the game-changing benefits that customers will experience, in terms of a total 3D vision solution for robotics, machine vision and many other new applications,” said Takeo Miyazawa, MagikEye founder and CEO.

“Our relationship with MagikEye is exciting,” said Louis DiNardo, BrainChip CEO. “The innovation brought to the market by their proprietary Invertible Light technology is impressive and this collaboration provides both companies an opportunity to address large and growing markets with outstanding technology to solve difficult real-world challenges.”
 
Reactions: 62 users

Diogenese

Top 20
(quoting the Sony Depthsensing Solutions post above in full)
Not Magik Eye:

https://finance.yahoo.com/news/magikeye-present-disruptive-3d-sensing-140000385.html
STAMFORD, Conn., December 20, 2022--(BUSINESS WIRE)--Magik Eye Inc. (www.magik-eye.com), an innovative 3D sensing company will be holding demonstrations for its latest Invertible Light™ Technology (ILT) at the 2023 Consumer Electronics Show in Las Vegas Nevada. ILT is a patented alternative to older Time of Flight and Structured Light solutions, enabling the smallest, fastest and most power-efficient 3D sensing method.
 
Reactions: 16 users

Diogenese

Top 20
I’m not suggesting Magik Eye are involved


Has the stuff we did/are doing with Magik Eye Inc helped us showcase Akida in Japan and got us a foot in the door with Sony?

“This relationship opens a new and exciting gateway for BrainChip in Japan.”
We already have a potential link to Sony through Prophesee.

https://www.prophesee.ai/2021/09/09/sony-event-based-vision-sensors-prophesee-co-development/

Atsugi, Japan — Sony Semiconductor Solutions Corporation (“Sony”) today announced the upcoming release of two types of stacked event-based vision sensors. These sensors designed for industrial equipment are capable of detecting only subject changes, and achieve the industry’s smallest*1 pixel size of 4.86μm.

These two sensors were made possible through a collaboration between Sony and Prophesee, by combining Sony’s CMOS image sensor technology with Prophesee’s unique event-based vision sensing technology.
This enables high-speed, high-precision data acquisition and contributes to improve the productivity of the industrial equipment.

... but the more the merrier.
 
Reactions: 35 users
I have no idea about Magik Eye giving Brainchip exposure to Sony, but the following definitely suggests Socionext has that capacity.

When you open the following link, click on Partners:


My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 30 users

jtardif999

Regular
So if it's in a car ... ?
First principles, in our understanding of how our own brains process vision, maybe? If we are moving in a car, then our brain understands what is expected to be moving and is then aware of what is not moving, or is moving in a different way or direction to the car. The question is how a vision sensor would be able to discern what is expected from what is not, and also how much of that work is done by the sensor versus how much the processor learns in making sense of the input. As has been said before, a DVS camera only registers pixels that are changing within a scene as events. But when everything is in motion, the events of interest will be the stationary objects, or those moving in a different way to the car the camera is mounted on. So do both the camera and the processor contribute to this understanding of what is actually a change event? What happens when the car goes over a pothole? I guess that would all be part of the learning process, and it is probably why multi-modal input is so important: the bump would be felt by the car, which then processes the input from the DVS accordingly. I think that's also why most cars will include LiDAR, radar and camera vision, DVS or otherwise, in the autonomous makeup. AIMO.
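To make the ego-motion idea concrete, here is a toy Python sketch that flags DVS events whose motion does not match what the car's own movement predicts. The flow model, image size and threshold are all invented for illustration and bear no relation to any actual Prophesee or BrainChip pipeline:

```python
import numpy as np

# Toy sketch: keep DVS events that move differently from what the car's
# own forward motion predicts. Model and numbers invented for illustration.

CX, CY = 160, 120                          # assumed image centre (320x240)

def ego_flow(x: float, y: float, speed: float = 5.0):
    """Predicted pixel flow from forward ego-motion: crude radial
    expansion away from the focus of expansion at the image centre."""
    return speed * (x - CX) / CX, speed * (y - CY) / CY

def is_interesting(x: float, y: float, flow, thresh: float = 2.0) -> bool:
    """True if the event's measured flow deviates from the ego prediction."""
    ex, ey = ego_flow(x, y)
    return np.hypot(flow[0] - ex, flow[1] - ey) > thresh

# Background scenery streaming past as ego-motion predicts, versus a
# pedestrian crossing against the flow at the same pixel:
print(is_interesting(300, 200, ego_flow(300, 200)))  # False: just background
print(is_interesting(300, 200, (-4.0, 0.0)))         # True: independent motion
```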
 
Reactions: 16 users

Fair enough - I was hoping it was a different link to Sony though, as Prophesee and Sony Depthsensing Solutions operate in different fields: event-based vision vs mapping and tracking in 3D?

Like I said, I'd prefer to be shot down rather than not have a crack.
 
Reactions: 25 users