BRN Discussion Ongoing

IloveLamp

Top 20


Screenshot_20230304_083957_LinkedIn.jpg
 
  • Like
  • Fire
Reactions: 9 users
Interesting link via Edge Impulse to AI wearables and Rett Syndrome. Not sure if there are any BRN holders here who also hold NEU


 
  • Like
Reactions: 7 users

jtardif999

Regular
Great find TTM,

Drive Pilot is "current".

Valeo SCALA 3 is in Drive Pilot!?

Luminar uses "foveated" LiDAR, which focuses on points of interest (denser laser point distribution) and relegates less interesting data to "peripheral vision".

Will Mercedes mandate the use of Akida with Luminar?
I think from recollection SCALA 2 is limited to speeds of 60 km/h; SCALA 3 is supposed to double that limit due to being able to see further ahead. Level 3 certification is currently limited to 60 km/h speeds.
 
  • Like
  • Sad
Reactions: 6 users
EE Times – posted an hour ago:


Experts Weigh Impact of Prophesee-Qualcomm Deal​

By Sunny Bains 03.03.2023


Prophesee this week announced a collaboration to “allow native compatibility” between its Metavision neuromorphic event-based cameras and Qualcomm’s Snapdragon mobile platforms in a multi-year deal to co-develop their tech.
Prophesee’s brain-inspired sensors inherently compress data by only detecting pixels that change their brightness level at any given time. This means the sensors can work at very high effective frame rates but, for most tasks, at very low power and bandwidth. The technology is already routinely used in automation and inspection applications to measure vibrations as well as count and track objects.
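The compression principle described above can be sketched in a few lines. This is an illustrative toy model only — not Prophesee's actual pixel circuit or API — showing how a mostly static scene produces far fewer events than raw pixels:

```python
import numpy as np

def frame_to_events(prev, curr, threshold=0.15):
    """Return (y, x, polarity) events for pixels whose brightness
    changed by more than the threshold between two frames."""
    diff = curr.astype(float) - prev.astype(float)
    ys, xs = np.nonzero(np.abs(diff) > threshold * 255)
    polarity = np.sign(diff[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return list(zip(ys.tolist(), xs.tolist(), polarity.tolist()))

rng = np.random.default_rng(0)
prev = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
curr = prev.copy()
curr[100:110, 200:210] = 255  # a small bright object moves into view

events = frame_to_events(prev, curr)
# A full VGA frame is 480*640 = 307,200 pixel values; only the changed
# pixels become events, a tiny fraction of the frame's raw bandwidth.
print(len(events), "events vs", prev.size, "pixels per full frame")
```

The static background generates no output at all, which is why event sensors draw almost no power when nothing in the scene is moving.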

The data provided by the sensors can be used to enhance images from a conventional frame-based camera: removing blur where the light is low or the subjects are moving quickly. This is particularly important for small cameras that have less light-gathering power.
Frédéric Guichard, CEO and CTO of DXOMARK, a French company that specializes in testing cameras and other consumer electronics, and that is unconnected with Paris-based Prophesee, told EE Times that the ability to deblur in these circumstances could provide definite advantages.
“Reducing motion blur [without increasing noise] would be equivalent to virtually increasing camera sensitivity,” Guichard said, noting two potential benefits: “For the same sensitivity [you could] reduce the sensor size and therefore camera thickness,” or you could maintain the sensor size and use longer exposures without motion blur.

Rendition of the Prophesee and Qualcomm technology collaboration. By combining a conventional frame camera with data from an event-based imager, motion blur can be eliminated. See this video. (Source: Prophesee)
Judd Heape, VP for product management of camera, computer vision and video at Qualcomm Technologies, told EE Times that they can get this image enhancement with probably a 20-30% increase in power consumption to run the extra image sensor and execute the processing.
“The processing can be done slowly and offline because you don’t really care about how long it takes to complete,” Heape added.

A richer feature set

Artist's rendition of the dual event/frame sensor camera concept to be produced by Prophesee and Qualcomm. (Source: Prophesee)
Event sensors, however, should make other functionalities possible, too.
Tobi Delbruck, a professor at the Institute of Neuroinformatics in Zurich, Switzerland, and founder of Prophesee competitor IniVation, told EE Times that a big group at Samsung was looking at “trying to integrate something like a DVS [event-based camera] into smartphones, and they successfully demonstrated a whole bunch of cool [features] like gesture recognition.”
At the time, Delbruck explained, it wasn’t technically feasible to execute the signal processing required to make an event-based camera work on a phone; now that neural accelerators in mobile platforms (such as Qualcomm’s Snapdragon) have become increasingly powerful and efficient, this is no longer a barrier.
Qualcomm’s Heape said he is also aware of, and interested in, these other possibilities.
“We have many, many low-power use cases,” he said. Lifting a phone to your ear to wake it up is one example. Gesture-recognition to control the car when you’re driving is another.
“These event-based sensors are much more efficient for that because they can be programmed to easily detect motion at very low power,” he said. “So, when the sensor is not operating, when there’s no movement or no changes in the scene, the sensor basically consumes almost no power. So that’s really interesting to us.”
Eye-tracking could also be very useful, Heape added, because Qualcomm builds devices for augmented and virtual reality. “Eye-tracking, motion-tracking of your arms, hands, legs… are very efficient with image sensors,” he said. “In those cases, it is about power, but it’s also about frame rate. We need to track the eyes at like 900 frames per second. It’s harder to do that with a standard image sensor.”

Toward mass production

Heape explained how the collaboration will work: Qualcomm’s OEMs, such as Oppo, Vivo, Xiaomi, OnePlus, Honor, and Samsung, can “purchase a chipset and the software from [Qualcomm] and then, from Prophesee, they would also purchase the image sensor and the software… but they would have both been pre-tested by us.”
Product lines, however, are not being combined. “We’re working together to pre-integrate them before they get incorporated into the final product,” he said.
This highlights another advantage of the collaboration with Qualcomm, one that Delbruck points out: it gives Prophesee access to integration with the Mobile Industry Processor Interface (MIPI), making it possible for the company to move into these mobile applications. Licensing this technology is expensive, so this would otherwise be a barrier to entering the mobile market.
Prophesee CEO Luca Verre told EE Times the company is close to launching its first mobile product with one OEM. “The target is to enter into mass production next year,” he said.
However, Delbruck cautioned that an intellectual property battle could get in the way—because there has long been contention about whether the Prophesee camera is too similar to earlier designs, particularly those invented at INI Zurich.
“It’s not an issue at all right now because nothing is in mass production,” he said, “But it could become an issue for them later, as happened with Caltech and the basic APS [Active Pixel Sensor] patent.”
“Samsung can purchase a chipset and the software from [Qualcomm] and then, from Prophesee, they would also purchase the image sensor and the software… but they would have both been pre-tested by us.”

If we accept the above as correct reporting of how this engagement will work, then there is no reason why Prophesee cannot complete its vision by adding the AKIDA brain to the vision sensor, processing the vision data locally and sending only metadata to Snapdragon.

In doing so it would have the advantage of reduced latency and reduced power, which would otherwise be consumed by Snapdragon converting the unprocessed spikes from Prophesee’s sensor and then processing that data.

Given that Prophesee is after multiple markets, completing their sensor by adding the missing brain discussed in the podcast with Rob Telson just makes logical sense.

Additionally, if there is some issue, as this article suggests, with the earlier designs by third parties, adding the brain may actually work to Prophesee’s advantage.

Remember the Mercedes-Benz words: it is incredible what a little bit of human-like intelligence can achieve.

By the way, as it now seems Prophesee will be mass producing its own sensor to supply to Qualcomm, the need for the $50 million becomes a little clearer.

My opinion only so DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 47 users

Taproot

Regular
 
  • Like
  • Love
  • Fire
Reactions: 19 users

MDhere

Top 20
Well it's confirmed, I've been approved my time off to attend the AGM once again 😀
 
  • Like
  • Fire
  • Haha
Reactions: 15 users

Love the wording.

Anil’s profile photo

Anil Mankar reposted this


BrainChip
BrainChip 8,600 followers
17h •



BrainChip teams for AIoT – Teksun in India has joined the Essential AI Ecosystem around BrainChip's Akida low-power event-driven neural networking chip to develop a wide range of AI-featured devices in the Internet of Things (AIoT). eeNews Europe via Nick Flaherty
https://lnkd.in/e-TDVuKP
Another independent opinion of AKIDA technology being publicly stated by a significant industry player.

Does this opinion have greater weight than that expressed by anonymous posters and WANCAs? You be the judge:

“Teksun is committed to innovation and dedicated to providing our customers with the most advanced IoT solutions,” said Brijesh Kamani, founder & CEO of Teksun.

“With BrainChip’s Akida processor, we will be able to deliver next-generation AIoT devices that are faster, more efficient, and more intelligent than ever before and not just meet, but exceed the expectations of our customers.”

Does this statement not make all the negative commentators no more than white noise to be tuned out?

I certainly believe so.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 85 users

BaconLover

Founding Member

The recent price drop (especially since the LDA announcement) may not have been a major factor in this rebalance.

Unless we see some actual material sales, next re-balance would be worth noting.
 
  • Like
  • Fire
  • Love
Reactions: 7 users

IloveLamp

Top 20
  • Like
  • Love
  • Wow
Reactions: 11 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Talking about the new Mercedes operating system MBOS. "The entire operating system would be a game changer, Iyer stated."


Screen Shot 2023-03-04 at 11.50.561png.png


Screen Shot 2023-03-04 at 11.51.12 2png..png



Screen Shot 2023-03-04 at 11.51.23png..png


 
  • Like
  • Love
Reactions: 15 users
Had my 1st chance to take a look into Teksun this morning and only one word springs to mind


Now try and find the time, to look into the company we actually partnered with 🙄..

Hopefully you're not underwhelmed..

jim-carrey-stupid-stupid.gif
 
  • Like
  • Haha
Reactions: 11 users

Gazzafish

Regular
BMW I7. Autonomous driving and tracks eye movement… interesting

 
  • Like
  • Fire
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Another article this morning espousing the new partnership between Google and Mercedes. I think at this stage even Blind Freddie would have to agree there's no way that we won't be working with Google at some stage, given "the Googlefied architecture" will employ the latest version of the Hyperscreen, a feature of the UI/UX which incorporates AKIDA technology in order to increase efficiency and to help the car and driver work together as one.


Google.png


Screen Shot 2023-03-04 at 12.06.0.png




 
  • Like
  • Love
  • Fire
Reactions: 18 users
Talking about the new Mercedes operating system MBOS. "The entire operating system would be a game changer, Iyer stated."


View attachment 31132

View attachment 31133


View attachment 31134

Reading this I was reminded of a survey done a few years ago by a car magazine of new car buyers and a significant percentage which I cannot now remember said they would not buy their particular vehicle again because of issues related to cup holders.😂🤣😂
 
  • Haha
Reactions: 4 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
This is from an old post on Magnus Ostberg's LinkedIn (Chief Software Officer - Mercedes).

This tells us that Continental, BMW, Volkswagen and Mercedes all use the same tool chain. What the significance of this is, I know not, but Dr Bonin does admit the Hyperscreen is the most stunning in the industry.


Screen Shot 2023-03-04 at 12.03.3.png




Screen Shot 2023-03-04 at 12.02.59.png




 
  • Like
  • Haha
Reactions: 9 users

Sirod69

bavarian girl ;-)
This is from an old post on Magnus Ostberg's LinkedIn (Chief Software Officer - Mercedes).

This tells us that Continental, BMW, Volkswagen and Mercedes all use the same tool chain. What the significance of this is, I know not, but Dr Bonin does admit the Hyperscreen is the most stunning in the industry.


View attachment 31143



View attachment 31142



Really interesting @Bravo, these are all German firms! 🥰
 
  • Like
  • Fire
  • Love
Reactions: 11 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Reading this I was reminded of a survey done a few years ago by a car magazine of new car buyers and a significant percentage which I cannot now remember said they would not buy their particular vehicle again because of issues related to cup holders.😂🤣😂

They mustn't have bought the magical spill-proof cup holder then. Now THAT was a game changer!



vhdanxtsmll2sbbflext.gif
 
  • Haha
  • Like
  • Love
Reactions: 18 users

Tothemoon24

Top 20

Seeing is Collecting​

Audio-video giant Sony is working to take vision systems to the next level. Similar to Nikon’s transformation, Sony aims to carve out its spot in the event-based vision system industry by creating sensors that act like retinas in the human eye.

The tiny sensors are becoming ever smaller, which allows more of them to be fitted on a device to boost data collection volumes. The use of these sensors goes far beyond the manufacturing floor. As the technology improves, Sony sees deployment within collision avoidance systems, drones, and event-based 3D cameras.

Sony recently introduced what it touts as the world's first intelligent vision sensors equipped with AI processing functionality. One highlight: the new chip will be able to identify people and objects.

This would allow cameras with the chip to identify stock levels on a store shelf or use heat maps to track and analyze customer behavior. It could even count and forecast the number of customers in a given location, providing valuable data to calculate when foot traffic is highest.

Where the technology stands to shine the most in manufacturing is around data management. Advanced sensors can identify objects and send a description of what they see without having to include an accompanying image that takes up space in the database. This could reduce storage requirements by up to 10,000 times, leaving companies with more space to gather critical data that they previously haven’t been able to access, while giving AI a looser leash to capture relevant information.
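The storage arithmetic behind that claim is easy to sketch. The frame size and JSON fields below are hypothetical — chosen purely to illustrate the order of magnitude of sending a description instead of the image itself:

```python
import json

# One uncompressed 1080p RGB frame: width * height * 3 bytes per pixel.
frame_bytes = 1920 * 1080 * 3

# A short, hypothetical metadata description of the same scene.
description = {
    "objects": [
        {"label": "person", "count": 3},
        {"label": "shelf_item", "count": 42},
    ],
    "timestamp": "2023-03-04T08:39:57Z",
}
metadata_bytes = len(json.dumps(description).encode("utf-8"))

# Ratio of raw-image storage to metadata storage.
reduction = frame_bytes / metadata_bytes
print(f"{frame_bytes} B frame vs {metadata_bytes} B metadata "
      f"(~{reduction:,.0f}x smaller)")
```

Even this crude comparison lands in the tens of thousands, which is the same ballpark as the "up to 10,000 times" figure quoted above.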

Working Together​

As technology evolves, partnerships between companies in the event-based vision system space and those that want to deploy across other industries will become commonplace.

Datalogic is joining forces with Paris-based Prophesee, a company that invented advanced neuromorphic vision systems and is working to build the next generation of industrial products.

“We are conducting a very fruitful partnership with Prophesee,” said Michele Benedetti, chief technology officer at Datalogic. “Neuromorphic vision is a fascinating technology inspired by the behavior of the human biological system, exactly like neural networks. We believe that the combination of these technologies will provide innovative solutions to our customers.”

As investment in this space increases, the market is expected to drive growth in other industries at an exponential rate through at least 2030, according to a Grand View Research report on the U.S. machine vision market. The increasing demand for quality inspection, as well as the need for vision-guided robotic systems, is expected to fuel that growth.

While long-term forecasts for emerging technologies are far from an exact science, the future for event-based vision systems looks promising—giving manufacturers cause to be fitted with a pair of 20/20 rose-colored specs.
 
  • Like
  • Fire
Reactions: 19 users

Slade

Top 20
“Samsung can purchase a chipset and the software from [Qualcomm] and then, from Prophesee, they would also purchase the image sensor and the software… but they would have both been pre-tested by us.”

If we accept the above as correct reporting of how this engagement will work, then there is no reason why Prophesee cannot complete its vision by adding the AKIDA brain to the vision sensor, processing the vision data locally and sending only metadata to Snapdragon.

In doing so it would have the advantage of reduced latency and reduced power, which would otherwise be consumed by Snapdragon converting the unprocessed spikes from Prophesee’s sensor and then processing that data.

Given that Prophesee is after multiple markets, completing their sensor by adding the missing brain discussed in the podcast with Rob Telson just makes logical sense.

Additionally, if there is some issue, as this article suggests, with the earlier designs by third parties, adding the brain may actually work to Prophesee’s advantage.

Remember the Mercedes-Benz words: it is incredible what a little bit of human-like intelligence can achieve.

By the way, as it now seems Prophesee will be mass producing its own sensor to supply to Qualcomm, the need for the $50 million becomes a little clearer.

My opinion only so DYOR
FF

AKIDA BALLISTA
Well, one thing is for certain: Brainchip's astute management team must have had a very good reason for drawing down the $50 million. With everything that's going on, it is hard not to be extremely confident that Brainchip is going to fly this year.
 
  • Like
  • Love
  • Fire
Reactions: 34 users
Top Bottom