I wonder why they do this ban.

"US proposes ban on smart cars with Chinese and Russian tech"
https://edition.cnn.com/2024/09/23/tech/us-car-software-ban-china-russia/index.html
It was explained on the radio, where I first heard of it, that it would possibly not stop there, and that the EU, for example, would habitually follow suit.
From a European perspective, there's nothing to be surprised about: it's so that they can't press the button and control what happens over there. We Europeans know their blackmail methods, especially those of us in the graphite group. Downstream, it's to protect the market, I think.

I wonder why they do this ban.
I am guessing it's so the USA doesn't look stupid.
Asia is way more advanced than the USA
View attachment 69928
"Users will be able to interact with the glasses by hand-tracking, voice and wrist based NEURAL INTERFACE".
I set the video link to play at the above point.
If you ask me, we can happily stay isolated from the U.S. market for quite a while… they're wired differently over there and change their minds as often as others change their underwear, if you get what I mean.

Overnight in the US. Not that it means much, as they seem to follow our market with these itty bitty stocks??
View attachment 69925
They've just rebuilt the wall.

This is the type of SP movement I was hoping for a couple of weeks ago. Burn baby burn.
Never say you "could be wrong", just say "I could be mistaken".

Yes.
Both have disclosed that they are using software to process sensor signals. I'm certain both have had access to Akida 2/TeNNs simulation software since the patent application was filed over 2 years ago.
Understandably, they would be reluctant to commit to silicon while the tech is so new and in a state of ongoing development. Software provides a way for new developments to be added without needing a soldering iron. My hope is that, longer term when the development plateaus, we may see Akida SoC IP integrated into the Nvidia drive processor silicon, and, if that door opens ...
On top of that, Sean recently disclosed that we had a new product line, algorithms, in addition to IP.
On the lighter side, some months ago there was MB's Magnus Ostberg's "Stay tuned" LinkedIn response to my question about whether they were using NN software on the water-cooled processor of the CLA concept vehicle ... and now Valeo are advising everyone to "Stay tuned" in relation to the SDV.
Of course, as the more sober commentators here will tell you, I could be wrong ...
Yep, more games. Maybe a pump and dump to offload some of the 19.3c SPP for a small profit.

They've just rebuilt the wall.
Very interesting, the EVS event-based camera is powered in part by Prophesee Metavision. "The Triton2 EVS event-based 2.5GigE camera, powered by Sony’s IMX636 or IMX637 vision sensor and our Arena® SDK as well as PROPHESEE’s Metavision® SDK, is designed to deliver enhanced performance, lower power consumption, and greater flexibility for machine vision applications. It is ideal for motion analysis, vibration monitoring, object tracking, optical flow, autonomous driving, and high-speed detection."

Looks to be some very interesting camera technology coming up at Vision 2024.
LUCID to Unveil Latest GigE Vision Cameras and Advanced Sensing Technologies at VISION 2024
Richmond, BC, Canada – August 22, 2024 – LUCID Vision Labs, Inc., a leading designer and manufacturer of industrial cameras, will showcase a range of new GigE Vision cameras and advanced sensing technologies at VISION 2024, which takes place from October 8–10, 2024, in Stuttgart, Germany.
LUCID is set to introduce the first member of its intelligent vision camera family, the Triton® Smart camera featuring Sony’s IMX501 intelligent vision sensor with AI processing. The Triton Smart is an easy-to-use, cost-effective intelligent vision camera capable of outputting inference results alongside regular 12.3 MP images for every frame. Its on-sensor AI processing reduces data bandwidth, alleviates processing load on the host PC, and minimizes latency.
Expanding the Triton2 – 2.5GigE camera family, LUCID will showcase two new models for advanced sensing applications. The Triton2 EVS event-based 2.5GigE camera, powered by Sony’s IMX636 or IMX637 vision sensor and our Arena® SDK as well as PROPHESEE’s Metavision® SDK, is designed to deliver enhanced performance, lower power consumption, and greater flexibility for machine vision applications. It is ideal for motion analysis, vibration monitoring, object tracking, optical flow, autonomous driving, and high-speed detection.

Additionally, the new Triton2 4K line scan camera, equipped with Gpixel’s GL3504 image sensor, will also be unveiled. Featuring 4096 (H) x 2 (V) at 3.5 μm pixels, this camera is ideal for high-speed, high-resolution imaging.
The Atlas10 – 10GigE camera family is welcoming a new high-resolution model featuring the 45-megapixel (8192 x 5460) onsemi XGS45000 CMOS global shutter image sensor, capable of running at 16 fps. This RDMA-enabled camera offers a unique combination of high resolution, high frame rate, and superior image quality, making it well-suited for applications such as flat panel inspection, aerial surveillance, mapping, and electronics inspection.
LUCID is also expanding its Helios®2 3D Time-of-Flight camera family with the introduction of the Helios2 Narrow Field-of-View (FoV) variant. This model integrates Sony’s DepthSense™ IMX556PLR back-illuminated ToF image sensor. It produces a tighter point cloud, and the narrower illumination area reduces the likelihood of multipath error, making it ideal for applications requiring precise 3D depth measurement in confined spaces.
As part of Industrial VISION Days 2024, organized by VDMA Machine Vision, LUCID’s Director of Product Management, Alexis Teissie, will present “The Benefits of RDMA for 10GigE Cameras and Beyond” on Wednesday, October 9th at 2:40 pm.
Stay tuned for additional product highlights to be unveiled on the show floor. Join LUCID at VISION 2024 from October 8–10, 2024, in Stuttgart, Germany at Booth 10E40.
What I'd like to know is how they compensate for camera movement in autonomous driving.

Very interesting, the EVS event-based camera is powered in part by Prophesee Metavision. "The Triton2 EVS event-based 2.5GigE camera, powered by Sony’s IMX636 or IMX637 vision sensor and our Arena® SDK as well as PROPHESEE’s Metavision® SDK, is designed to deliver enhanced performance, lower power consumption, and greater flexibility for machine vision applications. It is ideal for motion analysis, vibration monitoring, object tracking, optical flow, autonomous driving, and high-speed detection."
See the US media release dated 19/6/22. We integrated with Prophesee Metavision.
The 2 year timeline is just about there. Either AKIDA 1000 or 1500.
Now Sean at the AGM made it quite clear that deals were closing in on decision time.
Very interesting.
BrainChip Partners with Prophesee Optimizing Computer Vision AI Performance and Efficiency
BrainChip and Prophesee partner to optimize computer vision AI, delivering enhanced performance and efficiency.
brainchip.com