BRN Discussion Ongoing

Diogenese

Top 20
Very interesting: the EVS event-based camera is powered in part by Prophesee Metavision. "The Triton2 EVS event-based 2.5GigE camera, powered by Sony’s IMX636 or IMX637 vision sensor and our Arena® SDK as well as PROPHESEE’s Metavision® SDK, is designed to deliver enhanced performance, lower power consumption, and greater flexibility for machine vision applications. It is ideal for motion analysis, vibration monitoring, object tracking, optical flow, autonomous driving, and high-speed detection."
See the US media release dated 19/6/22. We integrated with Prophesee Metavision.
The 2 year timeline is just about there. Either AKIDA 1000 or 1500.
Now Sean at the AGM made it quite clear that deals were closing in on decision time.
Very interesting.
What I'd like to know is how they compensate for camera movement in autonomous driving.

The point of a DVS is that it detects changes in pixel illumination above a threshold level of change. With a static DVS, it will thus detect any movement, but when the camera is moving, all the pixels experience changing illumination (light exposure).

So do they take local averages of change in different regions (a bit like a CNN) and use those as the threshold, or do they have some more sophisticated algorithm?
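For anyone wondering how that thresholding works in practice, here is a minimal sketch of DVS-style event generation, plus the naive "subtract the global average change" idea raised above. This is purely illustrative and my own assumption of the mechanism, not Prophesee's actual Metavision pipeline; the function names and threshold value are made up:

```python
import numpy as np

def dvs_events(prev_log, curr_log, threshold=0.2):
    """Emit ON/OFF event maps where the log-intensity change
    at a pixel exceeds the contrast threshold."""
    diff = curr_log - prev_log
    return diff > threshold, diff < -threshold   # ON events, OFF events

def ego_compensated_events(prev_log, curr_log, threshold=0.2):
    """Naive ego-motion compensation: subtract the mean change
    (a crude stand-in for the global component a moving camera adds)
    before thresholding."""
    diff = curr_log - prev_log
    diff = diff - diff.mean()
    return diff > threshold, diff < -threshold

# Static 4x4 scene with one locally brightening pixel:
prev = np.zeros((4, 4))
curr = prev.copy()
curr[1, 1] = 0.5
on, off = dvs_events(prev, curr)   # only pixel (1, 1) fires an ON event
```

A real sensor does this asynchronously per pixel in analog circuitry; the region-wise (CNN-like) variant would replace `diff.mean()` with a local mean over each neighbourhood rather than the whole frame.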
 
  • Like
  • Fire
  • Thinking
Reactions: 15 users

FiveBucks

Regular
Our share price....

 
  • Haha
  • Like
Reactions: 9 users

manny100

Regular
What I'd like to know is how they compensate for camera movement in autonomous driving.

The point of a DVS is that it detects changes in pixel illumination above a threshold level of change. With a static DVS, it will thus detect any movement, but when the camera is moving, all the pixels experience changing illumination (light exposure).

So do they take local averages of change in different regions (a bit like a CNN) and use those as the threshold, or do they have some more sophisticated algorithm?
So it's not AKIDA??
 
Todd likes it
 

Attachments

  • 271002E5-9AF6-4C5F-8BF6-21026B846890.jpeg (257.8 KB · Views: 176)
  • Like
  • Fire
  • Thinking
Reactions: 12 users
  • Haha
  • Like
  • Fire
Reactions: 7 users
It was actually a great finish for a Friday! :)
It’s going to make Monday a very interesting start to the week moving into October.
I was very surprised that we were flat.
More interested people looking to buy.
Hope the squeeze is coming.
 
  • Like
  • Fire
Reactions: 11 users

CHIPS

Regular
  • Haha
  • Like
Reactions: 7 users

CHIPS

Regular
It was actually a great finish for a Friday! :)

It was still Thursday then in Germany :ROFLMAO:. Today, Friday, the SP is going down. :cry:
 
  • Like
  • Thinking
Reactions: 2 users

Diogenese

Top 20
So it's not AKIDA??
I really don't know.

We do know that Sony and Prophesee went with SynSense for the lo-fi version (380×380 pixels), the apocryphal low-hanging fruit.

From your June 2022 link:
“We’ve successfully ported the data from Prophesee’s neuromorphic-based camera sensor to process inference on Akida with impressive performance,” said Anil Mankar, Co-Founder and CDO of BrainChip. “This combination of intelligent vision sensors with Akida’s ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution.”

“By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings,” said Luca Verre, CEO and co-founder of Prophesee.

Akida IP includes more than the tape-out specs. It would also include, e.g., the copyright-protected software.

To the uninitiated, Anil's comments imply that the Prophesee data was applied to the Akida 1 SoC, whereas Luca's comments can be interpreted as encompassing the incorporation of Akida software, e.g., TeNNs simulation software, into the Prophesee Metavision software.

The patent application for TeNNs was filed 3 days after the article was published, so clearly BRN had been testing it beforehand as software. I think it is probable that TeNNs was used in tests on the Prophesee data, and that combining the Akida-based software IP with Metavision would have been the only available means of testing the Prophesee data against Akida 2, as we still have not seen the SoC. So it is not outside the bounds of possibility that TeNNs/Akida 2 software has been combined with Metavision.
 
  • Like
  • Fire
  • Thinking
Reactions: 44 users

7für7

Top 20
Don't worry, today is the day! It is down 3.8% already; more is to come for sure.

Emma Stone Laughing GIF
-10% in germanistan… what’s up there?
 
  • Thinking
  • Like
Reactions: 2 users

Guzzi62

Regular
-10% in germanistan… what’s up there?
A company listed on several exchanges has to have the same value on all of them.

If not, you could in theory buy them on the cheap one and sell them on the expensive one.
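That cross-listing arbitrage logic is easy to sketch. The prices and fees below are made-up numbers, just to show why small price gaps can persist once trading costs are counted:

```python
def arbitrage_profit(shares, price_cheap, price_rich, fee_per_trade=0.0):
    """Gross profit from buying on the cheaper exchange and selling
    on the dearer one, less one brokerage fee per leg."""
    return shares * (price_rich - price_cheap) - 2 * fee_per_trade

# 1,000 shares, a 2-cent gap, $10 brokerage each way:
# the entire $20 gap is eaten by the two $10 fees.
profit = arbitrage_profit(1000, 0.20, 0.22, fee_per_trade=10.0)
```

Which is why arbitrageurs only step in once the gap exceeds the round-trip cost, keeping the listings aligned but rarely identical.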
 
  • Like
Reactions: 1 user

Tothemoon24

Top 20

“This is the future”​

ITL ventures into neuromorphic computing​

By Megan Saxton
U.S. ARMY ENGINEER RESEARCH AND DEVELOPMENT CENTER
Published Sept. 23, 2024
Updated: Sept. 23, 2024


Neuromorphic Computing

The U.S. Army Engineer Research and Development Center (ERDC) Information Technology Laboratory (ITL) Edge Computing Lab has long been on the cutting-edge of this field and is now exploring something new: neuromorphic computing.

In recent years, edge computing has revolutionized the technology landscape for users situated in remote areas or away from primary devices. By bringing computation and data storage closer to the location where it is needed, response times, reliability and performance are greatly improved, latency and bandwidth costs are reduced and privacy and security are enhanced. The U.S. Army Engineer Research and Development Center (ERDC) Information Technology Laboratory (ITL) Edge Computing Lab has long been on the cutting-edge of this field and is now exploring something new: neuromorphic computing.
“Neuromorphic computing is a process in which computers are designed and engineered to mirror the structure and function of the human brain,” said Dr. Raju Namburu, ITL chief technology officer and a senior scientific technical manager. “Using artificial neurons and synapses, neuromorphic computers simulate the way our brains process information, allowing them to solve problems, recognize patterns and make decisions more quickly and efficiently than the traditional high-performance computing systems we use today.”
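The "artificial neurons and synapses" Dr. Namburu describes are commonly modelled as leaky integrate-and-fire units. A minimal sketch of one such neuron, my own illustration rather than ERDC's or any vendor's implementation, with arbitrary leak and threshold values:

```python
def lif_spikes(inputs, leak=0.9, threshold=1.0):
    """Leaky integrate-and-fire neuron: the membrane potential leaks
    each step, accumulates input, and emits a spike (then resets)
    when it crosses the threshold."""
    v, spikes = 0.0, []
    for x in inputs:
        v = leak * v + x             # leak, then integrate the input
        if v >= threshold:
            spikes.append(1)         # fire...
            v = 0.0                  # ...and reset the potential
        else:
            spikes.append(0)
    return spikes

lif_spikes([0.4, 0.4, 0.4, 0.0, 0.6, 0.6])   # → [0, 0, 1, 0, 0, 1]
```

Because such a neuron only produces output when it fires, downstream circuitry can stay idle between spikes, which is where the power-efficiency claims for neuromorphic chips come from.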
The driving force behind ITL’s research into this emerging technology is the U.S. military’s need to know more, sooner, to allow rapid, decisive action on the multi-domain battlefield. The battlespace has become characterized by highly distributed processing, heterogeneous and mobile assets with limited battery life, communications-dominated but restricted network capacity and operating with time-critical needs in a rapidly changing hostile environment. Distributed and low power edge processing is one of the essential technologies for maintaining overmatch in various emerging operational and contested environments, as is the need to take advantage of machine learning (ML) and generative artificial intelligence (AI).
“Overall, neuromorphic chips offer the DoD community a number of potential benefits including improved performance, resilience, cost-efficiency, security, privacy, power-efficiency, signal processing, ML capabilities and more,” said Dr. Ruth Cheng, a computer scientist in ITL’s Supercomputing Research Center. “By keeping an eye on developments in this technology, the DoD community can ensure it remains at the forefront of military and defense innovation.”
“Computations performed at the molecular, atomic, and neuro scales mimicking the human brain are showing tremendous viability,” added Namburu. “We just started this work on next generation advanced computing, which is significantly different from traditional computing systems historically used at ERDC. Neuromorphic computing represents a paradigm shift in computing, promising significant advancements in ML, generative AI, scientific applications and sensor processing compared to traditional computing. Moreover, neuromorphic chips emulate the brain's plasticity, enabling learning and adaptation over time, unlike traditional systems.”
Ongoing edge computing efforts include agnostic graphics processing unit (GPU) ray tracing development, benchmarking deep neural networks, sensor-data management, ML for underwater invasive plants, railcar inspection, photogrammetry, reservoir frameworks, decentralized edge computing, bi-directional digital twins and algorithms for anomaly detection. ITL is also exploring emerging AI chips for edge computing, including novel algorithms and sustainable software.
“Overall, edge computing is helping to enable new use cases and provide better experiences to the users by making applications faster, more reliable and more secure,” said Cheng. “Neuromorphic chips are well-suited for edge computing, which is becoming increasingly important in military and defense applications, and ITL is already aiding in this process that will touch everything from lowering the cost of deployments by eliminating the need for expensive, high-powered servers and data centers to support of mobile and autonomous systems. This is the future.”
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 27 users

Esq.111

Fascinatingly Intuitive.

Evening Tothemoon24 ,

Good article.

Christ the yanks are slow to get their shite together, now they are talking about bi-directional digital twins.... kinky buggers.


Regards,
Esq.
 
  • Haha
  • Like
Reactions: 9 users
Evening Tothemoon24 ,

Good article.

Christ the yanks are slow to get their shite together, now they are talking about bi-directional digital twins.... kinky buggers.


Regards,
Esq.
I met Bi-twins once. Only problem for me was they were a pigeon pair. Bolted real quick 😆 🤣 😂

SC
 
  • Haha
Reactions: 7 users

manny100

Regular
Sean, in an interview/presentation around a year ago, said that if you buy a camera powered by Prophesee, you want to know it's got AKIDA in it.
At the time I thought it was a hint. BRN and Prophesee would have been working together for sure.
 
  • Like
  • Thinking
  • Wow
Reactions: 12 users

Tothemoon24

Top 20
Last edited:
  • Like
  • Fire
  • Thinking
Reactions: 16 users

Rskiff

Regular
Great September newsletter just received. Plenty going on.
 
  • Like
  • Fire
  • Love
Reactions: 24 users