Markus Schäfer on LinkedIn: #mercedesbenz #ces #leadintechnology | 13 comments
Just two more days to go until CES 2024 in Las Vegas! I’m really looking forward to presenting the next major step towards our vision of the… | 13 comments on LinkedIn
A question if you have looked into @Diogenese
https://www.valeo.com/wp-content/uploads/2023/12/press-release_valeo-at-ces2024.pdf
Valeo Unveils Groundbreaking Innovations at CES 2024, Paving the Way for Greener, Safer Mobility for All, Everywhere
CES 2024 Innovation Award Honoree Valeo SCALA™3 LiDAR:
For the first time, we will give visitors the opportunity to experience and learn more about our AI-based perception software and how it helps classify objects identified by the LiDAR in its point cloud.
I suppose they could use software for an ICE, but surely there must be a better way for EVs?
Valeo will display a number of cutting-edge technologies at CES 2024. Here is a link I found regarding Valeo at CES 2024.
It has more info but also quite a lot of ads.
https://inf.news/en/auto/82114ffb5c9c664534fc32a89d6a178b.html
Valeo SCALA3 lidar: improving the reliability of autonomous driving
Lidar is an important part of a car's autonomous driving system, allowing the vehicle to detect and respond to obstacles in the surrounding environment, which is difficult to achieve with other sensing systems. For example, lidar sensors produce more accurate and comprehensive images than millimeter-wave radar; they have a wider detection range than ultrasonic sensors; and they deliver better results than cameras in adverse weather and lighting conditions. In addition, lidar systems can better generate real-time visualization information for autonomous vehicles.

Valeo SCALA3 is Valeo's third-generation lidar (light detection and ranging) sensor. Innovations in hardware and sensing software make its performance industry-leading. Valeo SCALA3 delivers powerful sensing capabilities in any condition and meets the highest standards of quality and safety in the automotive industry.
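Since the blurb above is about lidar point clouds feeding obstacle detection, here's a toy sketch (my own illustration, nothing to do with Valeo's actual perception software) of the basic idea: group nearby lidar returns into clusters, where each cluster stands in for a detected obstacle.

```python
import math

def cluster_points(points, eps=1.0):
    """Greedy single-link clustering of 2D lidar returns.

    points: list of (x, y) tuples; eps: max distance to join a cluster.
    Returns a list of clusters (each a list of points). Real perception
    stacks use far more sophisticated methods; this is only a sketch.
    """
    clusters = []
    for p in points:
        merged = None
        for c in clusters:
            if any(math.dist(p, q) <= eps for q in c):
                if merged is None:
                    c.append(p)          # join the first matching cluster
                    merged = c
                else:
                    merged.extend(c)     # p bridges two clusters: merge them
                    c.clear()
        clusters = [c for c in clusters if c]
        if merged is None:
            clusters.append([p])         # start a new cluster
    return clusters

# Two well-separated "obstacles" in front of the vehicle
scan = [(2.0, 0.1), (2.1, 0.0), (2.05, -0.1),   # obstacle A
        (8.0, 3.0), (8.1, 3.1)]                  # obstacle B
obstacles = cluster_points(scan, eps=0.5)
print(len(obstacles))  # 2 clusters
```

The classifier the press release mentions would then run on top of clusters like these, labelling each one (car, pedestrian, etc.).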
Well the paper's from December 2022, and they appear to be trying to reinvent the wheel, but BrainChip does CNN2SNN in software, while the authors propose a hardware implementation as a next step.

A question if you have looked into @Diogenese
Valeo were running a project/program, SpikiLi, with Tempo, and this is the conclusion of one paper I was reading.
Thoughts on whether we would fit within the CNN2SNN development and/or the hardware implementation as part of our joint development agreement?
It was in 4 bits as well.
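On the 4-bit point: here's a toy sketch of what symmetric 4-bit weight quantization looks like (my own illustration, not the paper's actual calibration scheme — real flows use per-layer, calibrated scales).

```python
def quantize_4bit(weights):
    """Symmetric 4-bit quantization: map floats to integers in [-7, 7].

    Toy illustration only. scale is chosen so the largest-magnitude
    weight lands on +/-7; all others round to the nearest integer step.
    """
    scale = max(abs(w) for w in weights) / 7 or 1.0  # avoid div-by-zero
    q = [max(-7, min(7, round(w / scale))) for w in weights]
    return q, scale

w = [0.9, -0.35, 0.05, -0.7]
q, s = quantize_4bit(w)
print(q)                      # integers in [-7, 7]
dequant = [v * s for v in q]  # approximate reconstruction of w
```

The appeal for edge hardware is that a 4-bit integer multiply-accumulate is far cheaper in silicon area and energy than a float one, at a small accuracy cost.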
SpikiLi: A Spiking Simulation of LiDAR based Real-time Object Detection for Autonomous Driving
Spiking Neural Networks are a recent and new neural network design approach that promises tremendous improvements in power efficiency, computation efficiency, and processing latency. They do so by using asynchronous sp…
ar5iv.labs.arxiv.org
SpikiLi: A Spiking Simulation of LiDAR based Real-time Object Detection for Autonomous Driving
Sambit Mohapatra*1, Thomas Mesquida*2, Mona Hodaei1, Senthil Yogamani3, Heinrich Gotzig1, Patrick Mäder4
*Equal contribution 1Valeo Germany 2CEA-List, France 3Valeo Ireland 4TU Ilmenau, Germany
Conclusion
We have presented CNN to SNN conversion and simulation that adapts existing CNN building blocks to simulate spiking behavior for a complex real world application. We have shown that SNNs can be applied to complex tasks such as object detection and achieve comparable performance while achieving much better energy efficiency when implemented in hardware. This is a nascent research area, and we aim to progress from simulations to hardware implementation to realize the full potential of SNNs using other vital techniques such as event-based processing, differential signal processing, power-saving using sparsity, and low activation of neurons.
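For anyone wondering what "CNN to SNN conversion" boils down to, the simplest ingredient is rate coding: a real-valued CNN activation becomes a spike train whose firing rate approximates that value. A toy sketch (my own illustration, not the authors' pipeline):

```python
import random

def rate_encode(activation, timesteps=16, seed=0):
    """Rate-code a normalized CNN activation (0..1) as a binary spike train.

    Higher activation -> more spikes across the time window, so the mean
    spike count approximates the original analog value. Energy savings
    come from sparsity: zeros cost (almost) nothing in spiking hardware.
    """
    rng = random.Random(seed)
    return [1 if rng.random() < activation else 0 for _ in range(timesteps)]

train = rate_encode(0.75, timesteps=1000)
print(sum(train) / len(train))  # mean firing rate, close to 0.75
```

Actual conversion flows also rescale weights and thresholds layer by layer so the converted network's firing rates track the original activations, but the coding idea above is the core of it.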
Just like the EQXX, Akida is being used in this vehicle also. Still only a concept car, but still good news for BRN.
Hi All
As the cost of developing an academic theory or a product has become a topic, and being a technophobe, I thought I would investigate this question.
It turns out that if you use the Edge Impulse platform, it is likely to be free for most academic and individual users.
Who would have thought? Struggling academics and developers no longer have to put up their hard-earned to run an idea through BrainChip AKIDA or any other company's supported hardware.
I guess that's why BrainChip stopped including support with their boards, something which they had promoted strongly.
Rather a clever move, this Edge Impulse partnership. Who would have thought there would be completely free access to experiment with BrainChip AKIDA technology?
“Edge Impulse
Edge Impulse is the leading development platform for machine learning on edge devices, free for developers and trusted by enterprises. Founded in 2019 by Zach Shelby and Jan Jongboom, we are on a mission to enable developers to create the next generation of intelligent devices. We believe that machine learning can enable positive change in society, and we are dedicated to support applications for good.”
We're Updating Our Free Community Plan — Here Are the Details
On January 10, 2024 we updated the Edge Impulse Community Plan. Here is everything you need to know.
www.edgeimpulse.com
My opinion only DYOR
Fact Finder
Absolutely correct. These BrainChip products were low-volume, partly assembled and packaged by staff at BrainChip in limited numbers (literally a few hundred), primarily as demonstrators for new and existing customers. There were three: the $499 Raspberry Pi kit, then the two larger board packages at $4,999 and $9,999.
Thanks D.
Bit of a blurb on MagikEye's CES attendance. @MDhere mentioned not long ago that they were resurfacing.
Haven't heard much from them or BRN since, so I wonder what's happening there.
MagikEye's Pico Image Sensor: Pioneering the Eyes of AI for the Robotics Age at CES
From Businesswire.
December 20, 2023 09:00 AM Eastern Standard Time
STAMFORD, Conn.--(BUSINESS WIRE)--Magik Eye Inc. (www.magik-eye.com), a trailblazer in 3D sensing technology, is set to showcase its groundbreaking Pico Depth Sensor at the 2024 Consumer Electronics Show (CES) in Las Vegas, Nevada. Embarking on a mission to "Provide the Eyes of AI for the Robotics Age," the Pico Depth Sensor represents a key milestone in MagikEye’s journey towards AI and robotics excellence.
The heart of the Pico Depth Sensor’s innovation lies in its use of MagikEye’s proprietary Invertible Light™ Technology (ILT), which operates efficiently on a “bare-metal” ARM M0 processor within the Raspberry Pi RP2040. This noteworthy feature underscores the sensor's ability to deliver high-quality 3D sensing without the need for specialized silicon. Moreover, while the Pico Sensor showcases its capabilities using the RP2040, the underlying technology is designed with adaptability in mind, allowing seamless operation on a variety of microcontroller cores, including those based on the popular RISC-V architecture. This flexibility signifies a major leap forward in making advanced 3D sensing accessible and adaptable across different platforms.
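MagikEye's Invertible Light Technology is proprietary, so as a purely generic illustration of how an MCU-class chip can turn pixel measurements into depth at all, here is the classic triangulation relation used by structured-light and stereo systems, z = f·b/d (focal length in pixels, baseline in metres, disparity in pixels). This is an assumption-laden stand-in, not ILT itself.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic triangulation: z = f * b / d.

    Generic structured-light / stereo depth relation, shown only to
    illustrate that depth recovery is cheap arithmetic once a pixel
    offset is known; MagikEye's actual ILT algorithm is not public.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# 600 px focal length, 5 cm baseline, 15 px disparity -> 2 m depth
print(depth_from_disparity(600.0, 0.05, 15.0))
```

The point of the press release is that this kind of per-pixel arithmetic is light enough to run "bare-metal" on a Cortex-M0 class core, with no dedicated depth-processing silicon.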
Takeo Miyazawa, Founder & CEO of MagikEye, emphasizes the sensor's transformative potential: “Just as personal computers democratized access to technology and spurred a revolution in productivity, the Pico Depth Sensor is set to ignite a similar transformation in the realms of AI and robotics. It is not just an innovative product; it’s a gateway to new possibilities in fields like autonomous vehicles, smart home systems, and beyond, where AI and depth sensing converge to create smarter, more intuitive solutions.”
Attendees at CES 2024 are cordially invited to visit MagikEye's booth for an exclusive first-hand experience of the Pico Sensor. Live demonstrations of MagikEye's latest ILT solutions for next-gen 3D sensing will be held from January 9-11 at the Embassy Suites by Hilton Convention Center Las Vegas. Demonstration times are limited, and private reservations will be accommodated by contacting ces2024@magik-eye.com.
AKIDA development kits aren't specifically intended for backyard hobbyists and aren't priced as such.
The Quantum Ventura AKIDA USB stick at ~US$50 (should that product eventuate) is. And although he can't possibly do a price comparison on a product that doesn't exist yet, his "product" at ~US$100 doesn't exist yet either.
I still can't see how you can agree with his argument on price point.
But I guess we'll just have to agree to disagree, again.
Posting on behalf of @Patient
TATA demoing with BRN at CES along with NVISO and VVDN
CES 2024
Check back frequently for updates, as we’ll be adding new information throughout CES 2024 and beyond.
brainchip.com
This NVISO logo is from NVISO.eu, not NVISO.ai (the one we already know), so maybe we have another European partner in the cybersecurity space.
Nice pick up B.