BRN Discussion Ongoing

Not sure if this has already been posted or discussed, but here's an interesting article about the U.S. Army Engineer Research and Development Center's very keen interest in exploring the wonderful world of neuromorphic computing:


"The driving force behind ITL’s research into this emerging technology is the U.S. military’s need to know more, sooner, to allow rapid, decisive action on the multi-domain battlefield. The battlespace has become characterized by highly distributed processing, heterogeneous and mobile assets with limited battery life, communications- dominated but restricted network capacity and operating with time-critical needs in a rapidly changing hostile environment. Distributed and low power edge processing is one of the essential technologies for maintaining overmatch in various emerging operational and contested environments, as is the need to take advantage of machine learning (ML) and generative artificial intelligence (AI)."

"Overall, edge computing is helping to enable new use cases and provide better experiences to the users by making applications faster, more reliable and more secure,” said Cheng. “Neuromorphic chips are well-suited for edge computing, which is becoming increasingly important in military and defense applications, and ITL is already aiding in this process that will touch everything from lowering the cost of deployments by eliminating the need for expensive, high-powered servers and data centers to support of mobile and autonomous systems. This is the future."
 
  • Like
  • Fire
  • Love
Reactions: 30 users

Tothemoon24

Top 20
[Two image attachments]
 
  • Like
  • Fire
  • Love
Reactions: 42 users

cosors

👀
I wonder why they do this ban

I am guessing so the USA don’t look stupid
Asia is way more advanced than the USA
From a European perspective, there's nothing to be surprised about. It's so that they can't press the button and control what happens there. We Europeans know their blackmail methods, especially those of us in the graphite group. Downstream protection of the market, I think.
Cyber attacks cause around €205B of economic damage in Germany every year, and China and Russia are responsible for 89% of it, in roughly equal shares. It's about control, strategic and economic, and about expanding power. See also the EU ban on Huawei tech.
They have chosen this path back to the old blocs.
...there is a scene with EVs in the movie "Leave the World Behind".
I hope BrainChip acts carefully with China. They won't pay anyway, but will simply copy Akida, as they have copied nearly everything else, and later build it on their own.
So I think this approach by the USA could be positive for BrainChip. Just my gut feeling; these politics are too high-level for a small citizen like me.
 
  • Like
  • Fire
Reactions: 17 users

Tothemoon24

Top 20
Looks to be some very interesting camera 📸 technology coming up at Vision 2024


LUCID to Unveil Latest GigE Vision Cameras and Advanced Sensing Technologies at VISION 2024​


Richmond, BC, Canada – August 22, 2024 – LUCID Vision Labs, Inc., a leading designer and manufacturer of industrial cameras, will showcase a range of new GigE Vision cameras and advanced sensing technologies at VISION 2024, which takes place from October 8–10, 2024, in Stuttgart, Germany.
LUCID is set to introduce the first member of its intelligent vision camera family, the Triton® Smart camera featuring Sony’s IMX501 intelligent vision sensor with AI processing. The Triton Smart is an easy-to-use, cost-effective intelligent vision camera capable of outputting inference results alongside regular 12.3 MP images for every frame. Its on-sensor AI processing reduces data bandwidth, alleviates processing load on the host PC, and minimizes latency.
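To put the on-sensor AI bandwidth saving in rough numbers, here is a back-of-the-envelope sketch in Python. The sensor geometry is approximated from the 12.3 MP spec, and the frame rate and metadata size are assumptions for illustration, not LUCID or Sony figures:

```python
# Illustrative numbers only: ~12.3 MP sensor (4056 x 3040), assumed 8-bit
# mono readout at an assumed 10 fps, and inference output assumed to be
# ~4 KB of metadata per frame. None of these are published specs.
W, H, FPS = 4056, 3040, 10

image_stream = W * H * FPS        # bytes/s if every full frame ships to the host
inference_stream = 4096 * FPS     # bytes/s if only inference results ship out

print(f"full images: {image_stream / 1e6:.0f} MB/s")         # ~123 MB/s
print(f"inference only: {inference_stream / 1e3:.0f} KB/s")  # ~41 KB/s
```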
Expanding the Triton2 2.5GigE camera family, LUCID will showcase two new models for advanced sensing applications. The Triton2 EVS event-based 2.5GigE camera, powered by Sony’s IMX636 or IMX637 vision sensor and our Arena® SDK as well as PROPHESEE’s Metavision® SDK, is designed to deliver enhanced performance, lower power consumption, and greater flexibility for machine vision applications. It is ideal for motion analysis, vibration monitoring, object tracking, optical flow, autonomous driving, and high-speed detection.
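For anyone who hasn't worked with event-based sensors: instead of frames, IMX636-class sensors emit a sparse stream of per-pixel brightness-change events, which is where the bandwidth and latency advantages come from. A minimal sketch of consuming such a stream with PROPHESEE's Metavision Python API, assuming the SDK is installed and a .raw recording exists (the file name is made up):

```python
# Minimal sketch, assuming the Metavision SDK's Python bindings are
# installed and a .raw recording from an IMX636-class sensor exists
# ("recording.raw" is a made-up file name).
from metavision_core.event_io import EventsIterator

# Slice the event stream into 10 ms batches.
mv_iterator = EventsIterator(input_path="recording.raw", delta_t=10000)

for events in mv_iterator:
    # 'events' is a numpy structured array with fields x, y, p (polarity), t (us).
    if events.size == 0:
        continue  # a static scene produces no events at all
    on_ratio = (events["p"] == 1).mean()
    print(f"{events.size} events, {on_ratio:.0%} ON polarity, "
          f"t = {events['t'][0]}..{events['t'][-1]} us")
```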
Additionally, the new Triton2 4K line scan camera, equipped with Gpixel’s GL3504 image sensor, will also be unveiled. Featuring 4096 (H) x 2 (V) at 3.5 μm pixels, this camera is ideal for high-speed, high-resolution imaging.
Atlas10 10GigE onsemi 45 MP
The Atlas10 10GigE camera family welcomes a new high-resolution model featuring the 45-megapixel (8192 x 5460) onsemi XGS45000 CMOS global shutter image sensor, capable of running at 16 fps. This RDMA-enabled camera offers a unique combination of high resolution, high frame rate, and superior image quality, making it well-suited for applications such as flat panel inspection, aerial surveillance, mapping, and electronics inspection.
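As a sanity check on why this model needs the 10GigE link, the raw pixel rate can be worked out directly. The sketch below assumes 8-bit-per-pixel readout; actual pixel formats and protocol overhead would shift the figure:

```python
# Sanity check, assuming 8-bit-per-pixel readout (illustrative; actual
# pixel formats and GigE Vision overhead will shift the numbers).
W, H, FPS = 8192, 5460, 16

pixels_per_s = W * H * FPS                # ~0.72 Gpx/s
link_gbit_s = pixels_per_s * 8 / 1e9      # bits on the wire at 8 bpp
print(f"{W * H / 1e6:.1f} MP x {FPS} fps -> {link_gbit_s:.1f} Gbit/s")
# ~5.7 Gbit/s: fits a 10GigE link with headroom, while RDMA keeps the
# host CPU out of the copy path.
```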
Helios2 3D ToF Camera with Narrow FoV
LUCID is also expanding its Helios®2 3D Time-of-Flight camera family with the introduction of the Helios2 Narrow Field-of-View (FoV) variant. This model integrates Sony’s DepthSense™ IMX556PLR back-illuminated ToF image sensor. It produces a tighter point cloud, and the narrower illumination area reduces the likelihood of multipath error, making it ideal for applications requiring precise 3D depth measurement in confined spaces.
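For context on how indirect ToF cameras like this recover depth: distance comes from the phase shift between emitted and returned modulated light, which is also why multipath reflections bias the measurement. A toy illustration follows; the modulation frequency is an assumed example, not a published IMX556PLR spec:

```python
# Toy indirect-ToF math; 100 MHz modulation is an invented example,
# not a published IMX556PLR figure.
import numpy as np

C = 299_792_458.0         # speed of light, m/s
f_mod = 100e6             # assumed modulation frequency, Hz

# Beyond the unambiguous range, measured depth wraps around.
d_max = C / (2 * f_mod)   # ~1.5 m at 100 MHz

# Depth from the phase shift between emitted and returned light.
phase = np.deg2rad(120.0)                 # example measured phase shift
depth = (phase / (2 * np.pi)) * d_max
print(f"unambiguous range {d_max:.3f} m, depth {depth:.3f} m")
# Multipath: if light reaches a pixel via two paths, the measured phase
# is a mix of both and the depth is biased. A narrower illuminated FoV
# gives stray reflections fewer chances to contribute.
```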
As part of Industrial VISION Days 2024, organized by VDMA Machine Vision, LUCID’s Director of Product Management, Alexis Teissie, will present “The Benefits of RDMA for 10GigE Cameras and Beyond” on Wednesday, October 9th at 2:40 pm.
Stay tuned for additional product highlights to be unveiled on the show floor. Join LUCID at VISION 2024 from October 8–10, 2024, in Stuttgart, Germany at Booth 10E40.
 
  • Like
  • Fire
  • Love
Reactions: 21 users

Boab

I wish I could paint like Vincent
Overnight in the US. Not that it means much as they seem to follow our market with these itty bitty stocks??
[Overnight US price chart]
 
  • Like
  • Fire
  • Love
Reactions: 26 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Love
  • Wow
Reactions: 17 users

miaeffect

Oat latte lover
  • Haha
  • Like
  • Thinking
Reactions: 6 users

7für7

Top 20
Overnight in the US. Not that it means much as they seem to follow our market with these itty bitty stocks??
If you ask me, we can happily stay isolated from the U.S. market for quite a while… they’re wired differently up there and change their minds as often as others change their underwear, if you get what I mean.

For example:

When things go up: “Yessss, pump it up… we’re all going to get rich…”

Five hours later, when it drops a little: “This stock is garbage… it’s nothing but a cheap fast-food joint… been in the red for years like ketchup.”

The next day, when it goes up again: “I knew it! We’re in a bull market, let’s goooo!!!”

The day after, when it drops again: “I sold that junk… it’s going nowhere… everyone, short it.”

A few hours later, when it rises again: “Yessss, we’re back!!! Just secured 5000000000000 shares… it’s time! We’re going to the moon, baby!”
 
  • Like
Reactions: 3 users

7für7

Top 20
Today would be a perfect day for a price-sensitive announcement, I would say… either this morning or later, after market! 🥹 Am I right?
 
  • Like
  • Thinking
Reactions: 3 users
According to Ask Ai www.iAsk.ai:


Meta’s New AR Glasses and Neuromorphic Computing

The new augmented reality (AR) glasses developed by Meta, codenamed Orion, do not specifically utilize neuromorphic computing. Instead, they incorporate a combination of advanced technologies including eye tracking, hand tracking, and a neural interface through a wrist-worn device known as the EMG (electromyography) wristband. This wristband interprets electrical signals from the user’s muscles to facilitate gesture control without requiring direct line-of-sight to the glasses’ sensors.
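For a sense of what interpreting muscle signals involves at the signal level, here is a toy sketch of a classic EMG pipeline: square the raw signal, smooth it into an RMS envelope, and threshold it into a gesture event. This is a generic illustration with invented numbers, not Meta's actual processing:

```python
# Generic EMG toy pipeline with invented numbers; not Meta's algorithm.
import numpy as np

def emg_envelope(raw, fs=1000, win_ms=50):
    """Moving-RMS envelope of a raw EMG trace sampled at fs Hz."""
    n = int(fs * win_ms / 1000)
    padded = np.pad(raw.astype(float) ** 2, (n - 1, 0), mode="edge")
    return np.sqrt(np.convolve(padded, np.ones(n) / n, mode="valid"))

fs = 1000                                  # 1 kHz sample rate, 1 s of signal
t = np.arange(fs) / fs
rest = 0.02 * np.random.default_rng(1).standard_normal(fs)
emg = rest + 0.5 * np.sin(2 * np.pi * 80 * t) * (t > 0.5)  # fake contraction

gesture = emg_envelope(emg, fs) > 0.1      # threshold the envelope
print("gesture detected at", t[np.argmax(gesture)], "s")
```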

Understanding Neuromorphic Computing

Neuromorphic computing refers to a type of computation that mimics the neural structure and functioning of the human brain. It typically involves specialized hardware designed to process information in ways similar to biological systems, which can lead to more efficient processing for certain types of tasks, particularly those involving sensory data and learning.
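As a concrete illustration of that brain-mimicking style, below is a toy leaky integrate-and-fire neuron, the basic unit that most neuromorphic hardware implements in some form. A generic textbook sketch, not any particular chip's neuron model:

```python
# Toy leaky integrate-and-fire neuron in plain Python; a textbook sketch,
# not any vendor's silicon model.
import numpy as np

def lif(spikes_in, leak=0.9, threshold=1.0, weight=0.6):
    """Map a binary input spike train to an output spike train."""
    v, out = 0.0, []
    for s in spikes_in:
        v = leak * v + weight * s   # membrane potential leaks, inputs add
        if v >= threshold:          # fire when the threshold is crossed...
            out.append(1)
            v = 0.0                 # ...then reset the membrane
        else:
            out.append(0)
    return out

rng = np.random.default_rng(0)
print(lif(rng.integers(0, 2, size=20)))
```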

Orion’s Technology Overview

  1. Eye Tracking: The Orion glasses are equipped with full eye-tracking cameras that allow for precise interaction based on where the user is looking.
  2. Hand Tracking: The glasses also feature multiple cameras that enable hand tracking, allowing users to interact with digital content using natural gestures.
  3. EMG Wristband: The EMG wristband translates muscle signals into commands for controlling the AR interface. This technology allows for complex gestures and haptic feedback without needing to keep hands in view of the glasses’ cameras.
  4. Wireless Compute Puck: Orion requires an external processor puck that wirelessly feeds applications and graphics to the glasses. This setup is necessary because the glasses themselves do not have sufficient processing power on their own.
While these technologies represent significant advancements in AR interfaces and user interaction, they do not fall under the category of neuromorphic computing. Instead, they rely on traditional computational methods enhanced by specific sensor technologies designed for augmented reality experiences.

In summary, Meta’s new AR glasses do not use neuromorphic compute; they utilize advanced sensor technologies including eye tracking, hand tracking, and an EMG wristband for user interaction.
 
  • Like
  • Sad
  • Love
Reactions: 10 users

TheDrooben

Pretty Pretty Pretty Pretty Good
This is the type of SP movement I was hoping for a couple of weeks ago. Burn baby burn
 
  • Like
  • Love
  • Fire
Reactions: 11 users

Calsco

Regular
There is a double crown in the charts also… fingers crossed 🤞
 
  • Like
  • Fire
Reactions: 4 users

Boab

I wish I could paint like Vincent
This is the type of SP movement I was hoping for a couple of weeks ago. Burn baby burn
They've just rebuilt the wall😩😩
 
  • Sad
  • Like
Reactions: 4 users

Xray1

Regular
Yes.

Both have disclosed that they are using software to process sensor signals. I'm certain both have had access to Akida 2/TeNNs simulation software since the patent application was filed over 2 years ago.

Understandably, they would be reluctant to commit to silicon while the tech is so new and in a state of ongoing development. Software provides a way for new developments to be added without needing a soldering iron. My hope is that, longer term when the development plateaus, we may see Akida SoC IP integrated into the Nvidia drive processor silicon, and, if that door opens ...

On top of that, Sean recently disclosed that we had a new product line, algorithms, in addition to IP.

On the lighter side, some months ago there was MB's Magnus Ostberg's "Stay tuned" LinkedIn response to my question about whether they were using NN software on the water-cooled processor of the CLA concept vehicle, ... and now Valeo are advising everyone to "Stay tuned" in relation to the SDV.

Of course, as the more sober commentators here will tell you, I could be wrong ...
Never say you "could be wrong", just say "I could be mistaken" :)
 
  • Like
Reactions: 1 user

TheDrooben

Pretty Pretty Pretty Pretty Good
They've just rebuilt the wall😩😩
Yep, more games. Maybe a pump and dump to offload some of the 19.3c SPP stock for a small profit.
One day it will happen......


Happy as Larry
 
  • Like
  • Sad
Reactions: 4 users

7für7

Top 20
Yes I was fooled too… 🫤 whatever
 
  • Love
Reactions: 1 user

manny100

Regular
[Quoted: Tothemoon24’s LUCID VISION 2024 announcement above]
Very interesting. The Triton2 EVS event-based camera is powered in part by Prophesee Metavision: "The Triton2 EVS event-based 2.5GigE camera, powered by Sony’s IMX636 or IMX637 vision sensor and our Arena® SDK as well as PROPHESEE’s Metavision® SDK, is designed to deliver enhanced performance, lower power consumption, and greater flexibility for machine vision applications. It is ideal for motion analysis, vibration monitoring, object tracking, optical flow, autonomous driving, and high-speed detection."
See the US media release dated 19/6/22: we integrated with Prophesee Metavision.
The two-year timeline is just about up. Either AKIDA 1000 or 1500.
And Sean made it quite clear at the AGM that deals were closing in on decision time.
Very interesting.
 
  • Like
  • Thinking
  • Love
Reactions: 20 users

manny100

Regular
Further to my previous post, check out the Prophesee website for event-based products. They deal with Sony.

 
  • Like
  • Fire
  • Love
Reactions: 7 users