BRN Discussion Ongoing

According to Ask AI (www.iAsk.ai):


Meta’s New AR Glasses and Neuromorphic Computing

The new augmented reality (AR) glasses developed by Meta, codenamed Orion, do not specifically utilize neuromorphic computing. Instead, they incorporate a combination of advanced technologies including eye tracking, hand tracking, and a neural interface through a wrist-worn device known as the EMG (electromyography) wristband. This wristband interprets electrical signals from the user’s muscles to facilitate gesture control without requiring direct line-of-sight to the glasses’ sensors.

Understanding Neuromorphic Computing

Neuromorphic computing refers to a type of computation that mimics the neural structure and functioning of the human brain. It typically involves specialized hardware designed to process information in ways similar to biological systems, which can lead to more efficient processing for certain types of tasks, particularly those involving sensory data and learning.
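A toy model makes the contrast with conventional computing concrete. The leaky integrate-and-fire (LIF) neuron below is the basic building block most neuromorphic chips implement in hardware: it only emits output (a spike) when enough input has accumulated, so computation is sparse and event-driven. This is an illustrative sketch, not any vendor's actual implementation, and all parameter values are arbitrary.

```python
def lif_neuron(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron: the membrane potential
    leaks toward rest, integrates the input, and emits a spike (an event)
    only when it crosses threshold -- the sparse, event-driven style of
    computation neuromorphic hardware is built around."""
    v = v_reset
    spikes = []
    for t, i_in in enumerate(input_current):
        # Leaky integration: exponential decay toward rest plus the input drive.
        v += dt * (-(v - v_reset) / tau + i_in)
        if v >= v_thresh:
            spikes.append(t)   # record the spike time
            v = v_reset        # reset after firing
    return spikes

# Constant drive produces regularly spaced spikes; zero drive produces none.
print(lif_neuron([0.2] * 50))   # several spike times
print(lif_neuron([0.0] * 50))   # []
```

The point of the model is that nothing happens between spikes, which is where the efficiency claims for neuromorphic hardware come from.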

Orion’s Technology Overview

  1. Eye Tracking: The Orion glasses are equipped with full eye-tracking cameras that allow for precise interaction based on where the user is looking.
  2. Hand Tracking: The glasses also feature multiple cameras that enable hand tracking, allowing users to interact with digital content using natural gestures.
  3. EMG Wristband: The EMG wristband translates muscle signals into commands for controlling the AR interface. This technology allows for complex gestures and haptic feedback without needing to keep hands in view of the glasses’ cameras.
  4. Wireless Compute Puck: Orion requires an external processor puck that wirelessly feeds applications and graphics to the glasses. This setup is necessary because the glasses themselves do not have sufficient processing power on their own.
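For the curious, the EMG idea in point 3 can be sketched in a few lines. Meta has not published the wristband's actual signal pipeline, so the following is a generic, textbook-style detector (rectify the raw signal, smooth it into an amplitude envelope, threshold it); the function name, window size, and threshold are all assumptions for illustration only.

```python
import numpy as np

def detect_gesture(emg, fs=1000, win_ms=50, thresh=0.3):
    """Toy EMG gesture detector (illustrative only): rectify the raw
    signal, smooth it into an amplitude envelope with a moving average,
    and flag a gesture whenever the envelope crosses a threshold."""
    rectified = np.abs(emg)                    # full-wave rectification
    win = max(1, int(fs * win_ms / 1000))
    kernel = np.ones(win) / win
    envelope = np.convolve(rectified, kernel, mode="same")
    return envelope > thresh                   # boolean "gesture active" mask

# Quiet baseline followed by a burst of simulated muscle activity.
rng = np.random.default_rng(0)
quiet = rng.normal(0, 0.05, 1000)
burst = rng.normal(0, 0.8, 1000)
active = detect_gesture(np.concatenate([quiet, burst]))
print(active[:1000].mean(), active[1000:].mean())  # low vs. high activation fraction
```

A production system would classify many gestures with a learned model rather than a single threshold, but the rectify-smooth-decide shape of the pipeline is the standard starting point.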
While these technologies represent significant advancements in AR interfaces and user interaction, they do not fall under the category of neuromorphic computing. Instead, they rely on traditional computational methods enhanced by specific sensor technologies designed for augmented reality experiences.

In summary, Meta’s new AR glasses do not use neuromorphic compute; they utilize advanced sensor technologies including eye tracking, hand tracking, and an EMG wristband for user interaction.
 
  • Like
  • Sad
  • Love
Reactions: 10 users

TheDrooben

Pretty Pretty Pretty Pretty Good
This is the type of SP movement I was hoping for a couple of weeks ago. Burn baby burn
 
  • Like
  • Love
  • Fire
Reactions: 11 users

Calsco

Regular
There is a double crown in the charts also… fingers crossed 🤞
 
  • Like
  • Fire
Reactions: 4 users

Boab

I wish I could paint like Vincent
  • Sad
  • Like
Reactions: 4 users

Xray1

Regular
Yes.

Both have disclosed that they are using software to process sensor signals. I'm certain both have had access to Akida 2/TeNNs simulation software since the patent application was filed over 2 years ago.

Understandably, they would be reluctant to commit to silicon while the tech is so new and in a state of ongoing development. Software provides a way for new developments to be added without needing a soldering iron. My hope is that, longer term when the development plateaus, we may see Akida SoC IP integrated into the Nvidia drive processor silicon, and, if that door opens ...

On top of that, Sean recently disclosed that we had a new product line, algorithms, in addition to IP.

On the lighter side, some months ago there was MB's Magnus Ostberg's "Stay tuned" Linkedin response to my question about whether they were using NN software on the water-cooled processor of the CLA concept vehicle, ... and now Valeo are advising everyone to "Stay tuned" in relation to the SDV.

Of course, as the more sober commentators here will tell you, I could be wrong ...
Never say "I could be wrong", just say "I could be mistaken" :)
 
  • Like
Reactions: 1 user

TheDrooben

Pretty Pretty Pretty Pretty Good
They've just rebuilt the wall😩😩
Yep, more games. Maybe a pump and dump to offload some of the 19.3c SPP shares for a small profit.
One day it will happen......


Happy as Larry
 
  • Like
  • Sad
Reactions: 4 users

7für7

Top 20
Yes I was fooled too… 🫤 whatever
 
  • Love
Reactions: 1 user

manny100

Regular
Looks to be some very interesting camera 📸 technology coming up at Vision 2024


LUCID to Unveil Latest GigE Vision Cameras and Advanced Sensing Technologies at VISION 2024


Richmond, BC, Canada – August 22, 2024 – LUCID Vision Labs, Inc., a leading designer and manufacturer of industrial cameras, will showcase a range of new GigE Vision cameras and advanced sensing technologies at VISION 2024, which takes place from October 8–10, 2024, in Stuttgart, Germany.
LUCID is set to introduce the first member of its intelligent vision camera family, the Triton® Smart camera featuring Sony’s IMX501 intelligent vision sensor with AI processing. The Triton Smart is an easy-to-use, cost-effective intelligent vision camera capable of outputting inference results alongside regular 12.3 MP images for every frame. Its on-sensor AI processing reduces data bandwidth, alleviates processing load on the host PC, and minimizes latency.
Expanding the Triton2 – 2.5GigE camera family, LUCID will showcase two new models for advanced sensing applications. The Triton2 EVS event-based 2.5GigE camera, powered by Sony’s IMX636 or IMX637 vision sensor and our Arena® SDK as well as PROPHESEE’s Metavision® SDK, is designed to deliver enhanced performance, lower power consumption, and greater flexibility for machine vision applications. It is ideal for motion analysis, vibration monitoring, object tracking, optical flow, autonomous driving, and high-speed detection.
Additionally, the new Triton2 4K line scan camera, equipped with Gpixel’s GL3504 image sensor, will also be unveiled. Featuring 4096 (H) x 2 (V) at 3.5 μm pixels, this camera is ideal for high-speed, high-resolution imaging.
Atlas10 10GigE On-Semi 45mp
The Atlas10 – 10GigE camera family welcomes a new high-resolution model featuring the 45-megapixel (8192 x 5460) onsemi XGS45000 CMOS global shutter image sensor, capable of running at 16 fps. This RDMA-enabled camera offers a unique combination of high resolution, high frame rate, and superior image quality, making it well-suited for applications such as flat panel inspection, aerial surveillance, mapping, and electronics inspection.
Helios2 3D ToF Camera with Narrow FoV
LUCID is also expanding its Helios®2 3D Time-of-Flight camera family with the introduction of the Helios2 Narrow Field-of-View (FoV) variant. This model integrates Sony’s DepthSense™ IMX556PLR back-illuminated ToF image sensor. It produces a tighter point cloud, and the narrower illumination area reduces the likelihood of multipath error, making it ideal for applications requiring precise 3D depth measurement in confined spaces.
As part of Industrial VISION Days 2024, organized by VDMA Machine Vision, LUCID’s Director of Product Management, Alexis Teissie, will present “The Benefits of RDMA for 10GigE Cameras and Beyond” on Wednesday, October 9th at 2:40 pm.
Stay tuned for additional product highlights to be unveiled on the show floor. Join LUCID at VISION 2024 from October 8–10, 2024, in Stuttgart, Germany at Booth 10E40.
Very interesting, the EV - Event Based Camera is powered in part by Prophesee Metavision. " The Triton2 EVS event-based 2.5GigE camera, powered by Sony’s IMX636 or IMX637 vision sensor and our Arena® SDK as well as PROPHESEE’s Metavision® SDK, is designed to deliver enhanced performance, lower power consumption, and greater flexibility for machine vision applications. It is ideal for motion analysis, vibration monitoring, object tracking, optical flow, autonomous driving, and high-speed detection."
See US media release dated 19/6/22. We integrated with Prophesee Metavision.
The 2 year timeline is just about there. Either AKIDA 1000 or 1500.
Now Sean at the AGM made it quite clear that deals were closing in on decision time.
Very interesting.
 
  • Like
  • Thinking
  • Love
Reactions: 20 users

manny100

Regular
Further to my previous post, check out the Prophesee website for event-based products. They deal with Sony.

 
  • Like
  • Fire
  • Love
Reactions: 7 users

Diogenese

Top 20
Very interesting, the EV - Event Based Camera is powered in part by Prophesee Metavision. " The Triton2 EVS event-based 2.5GigE camera, powered by Sony’s IMX636 or IMX637 vision sensor and our Arena® SDK as well as PROPHESEE’s Metavision® SDK, is designed to deliver enhanced performance, lower power consumption, and greater flexibility for machine vision applications. It is ideal for motion analysis, vibration monitoring, object tracking, optical flow, autonomous driving, and high-speed detection."
See US media release dated 19/6/22. We integrated with Prophesee Metavision.
The 2 year timeline is just about there. Either AKIDA 1000 or 1500.
Now Sean at the AGM made it quite clear that deals were closing in on decision time.
Very interesting.
What I'd like to know is how they compensate for camera movement in autonomous driving.

The point of a DVS is that it detects changes in pixel illumination above a threshold level of change. With a static DVS, it will thus detect any movement, but when the camera is moving, all the pixels experience changing illumination (light exposure).

So do they take local averages of change in different regions (a bit like a CNN), and use those as the threshold, or do they have some more sophisticated algorithm?
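For what it's worth, both the plain DVS model and the "local averages" idea can be sketched in code. The first function below is the standard event-generation model (a pixel fires when its log-intensity change exceeds a contrast threshold); the second is a purely speculative version of the local-average suggestion, subtracting each block's mean change before thresholding so that change shared across a whole region (as under camera motion or a global illumination shift) is suppressed. Prophesee's actual ego-motion handling is not public, so every detail here is an assumption.

```python
import numpy as np

def dvs_events(prev_log, curr_log, theta=0.2):
    """Basic DVS model: a pixel emits an ON (+1) or OFF (-1) event when
    its log-intensity changes by more than the contrast threshold theta."""
    diff = curr_log - prev_log
    return np.sign(diff) * (np.abs(diff) > theta)

def dvs_events_ego(prev_log, curr_log, theta=0.2, block=8):
    """Speculative 'local average' variant: subtract each block's mean
    change before thresholding, so uniform change caused by camera motion
    or global illumination shifts is suppressed and only locally
    distinctive motion survives."""
    diff = curr_log - prev_log
    h, w = diff.shape
    out = np.zeros_like(diff)
    for i in range(0, h, block):
        for j in range(0, w, block):
            patch = diff[i:i+block, j:j+block]
            local = patch - patch.mean()   # remove the shared component
            out[i:i+block, j:j+block] = np.sign(local) * (np.abs(local) > theta)
    return out

# Uniform brightening (e.g. a global exposure change): the plain model
# fires at every pixel, the ego-compensated model fires at none.
prev = np.zeros((16, 16))
curr = prev + 0.5
print(np.count_nonzero(dvs_events(prev, curr)))      # 256
print(np.count_nonzero(dvs_events_ego(prev, curr)))  # 0
```

Real pipelines typically do something more sophisticated, such as warping events by an estimated ego-motion before accumulation, but the block-mean subtraction shows why a locally adaptive threshold helps at all.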
 
  • Like
  • Fire
  • Thinking
Reactions: 15 users

FiveBucks

Regular
Our share price....

 
  • Haha
  • Like
Reactions: 9 users

manny100

Regular
What I'd like to know is how they compensate for camera movement in autonomous driving.

The point of a DVS is that it detects changes in pixel illumination above a threshold level of change. With a static DVS, it will thus detect any movement, but when the camera is moving, all the pixels experience changing illumination (light exposure).

So do they take local averages of change in different regions (a bit like a CNN), and use those as the threshold, or do they have some more sophisticated algorithm?
So it's not AKIDA??
 
Todd likes it
 

  • Like
  • Fire
  • Thinking
Reactions: 12 users
Yes, I am afraid so. The day ended with +11%.
Fuck me dead we never ended up red.

 
  • Haha
  • Like
  • Fire
Reactions: 7 users
It was actually a great finish for a Friday! :)
It’s going to make Monday a very interesting start to the week moving into October
I was very surprised that we were flat
More interested people looking to buy
Hope the Squeeze is coming
 
  • Like
  • Fire
Reactions: 11 users

CHIPS

Regular
  • Haha
  • Like
Reactions: 7 users

CHIPS

Regular
It was actually a great finish for a Friday! :)

It was still Thursday then in Germany :ROFLMAO:. Today, Friday, the SP is going down. :cry:
 
  • Like
  • Thinking
Reactions: 2 users

Diogenese

Top 20
So its not AKIDA??
I really don't know.

We do know that Sony and Prophesee went with Synsense for the lo-fi version (380×380 pixels), the apocryphal low-hanging fruit.

From your June 2022 link:
“We’ve successfully ported the data from Prophesee’s neuromorphic-based camera sensor to process inference on Akida with impressive performance,” said Anil Mankar, Co-Founder and CDO of BrainChip. “This combination of intelligent vision sensors with Akida’s ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution.”

“By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings,” said Luca Verre, CEO and co-founder of Prophesee."

Akida IP includes more than the tape-out specs. It would also include, eg, the copyright-protected software.

To the uninitiated, Anil's comments imply that the Prophesee data was applied to the Akida 1 SoC, whereas Luca's comments can be interpreted as encompassing the incorporation of Akida software, eg, TeNNs simulation software, into the Prophesee Metavision software.

The patent application for TeNNs was filed 3 days after the article was published, so clearly BRN had been testing it beforehand as software. I think it is probable that TeNNs was used in tests on the Prophesee data, and that combining the Akida-based software IP with Metavision would have been the only available means of testing the Prophesee data against Akida 2, as we still have not seen the SoC. So it is not outside the bounds of possibility that TeNNs/Akida 2 software has been combined with Metavision.
 
  • Like
  • Fire
  • Thinking
Reactions: 45 users
Top Bottom