BRN Discussion Ongoing

Dhm

Regular
The full report, which costs US$4,750, mentions us in tons of places, but most probably it's stuff we already know.
[Screenshots from the report attached.]
 
  • Like
  • Fire
  • Wow
Reactions: 38 users

chapman89

Founding Member
EE Times, posted an hour ago:


Experts Weigh Impact of Prophesee-Qualcomm Deal

By Sunny Bains, 03.03.2023


Prophesee this week announced a collaboration to “allow native compatibility” between its Metavision neuromorphic event-based cameras and Qualcomm’s Snapdragon mobile platforms in a multi-year deal to co-develop their tech.
Prophesee’s brain-inspired sensors inherently compress data by only detecting pixels that change their brightness level at any given time. This means the sensors can work at very high effective frame rates but, for most tasks, at very low power and bandwidth. The technology is already routinely used in automation and inspection applications to measure vibrations as well as count and track objects.
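To make the compression claim concrete, here is a minimal Python sketch of the event-generation principle: only pixels whose log-brightness changes beyond a threshold produce any output. This is a simplified frame-difference model for illustration, not Prophesee's actual pixel circuit.

```python
import numpy as np

def events_from_frames(prev_frame, curr_frame, threshold=0.2):
    """Toy event-sensor model: emit an event only for pixels whose
    log-brightness changes by more than `threshold` between two frames.
    Returns an (N, 3) array of (row, col, polarity)."""
    eps = 1e-6                                   # avoid log(0)
    delta = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    rows, cols = np.where(np.abs(delta) > threshold)
    polarity = np.sign(delta[rows, cols]).astype(np.int64)  # +1 brighter, -1 darker
    return np.stack([rows, cols, polarity], axis=1)

# A mostly static scene produces very few events.
rng = np.random.default_rng(0)
prev = rng.random((480, 640))
curr = prev.copy()
curr[100:110, 200:210] *= 2.0                    # small patch gets brighter
events = events_from_frames(prev, curr)
print(f"{len(events)} events out of {prev.size} pixels "
      f"({100 * len(events) / prev.size:.3f}% of the frame)")
```

In a static scene almost nothing is emitted, which is where the power and bandwidth savings come from.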

The data provided by the sensors can be used to enhance images from a conventional frame-based camera: removing blur where the light is low or the subjects are moving quickly. This is particularly important for small cameras that have less light-gathering power.
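For the deblurring use case, one way to picture how the event data helps: during a long exposure the events record how each pixel's log-brightness drifted, so the blurred frame (a temporal average) can be divided by the average drift factor to estimate the sharp frame at the start of the exposure. The numpy sketch below is a toy version of that idea, loosely based on one approach from the event-camera deblurring literature, not on whatever Prophesee and Qualcomm actually implement.

```python
import numpy as np

def deblur_with_events(blurred, event_frames, contrast=0.2):
    """Toy event-assisted deblurring: a blurred frame is the temporal average
    of sharp frames, and the events record how each pixel's log-brightness
    drifted during the exposure. Dividing the blur by the mean drift factor
    gives an estimate of the sharp frame at the start of the exposure.

    blurred      : HxW float image averaged over the exposure
    event_frames : list of HxW arrays of summed event polarities, one per
                   sub-interval of the exposure (toy representation)
    """
    drift = np.zeros_like(blurred, dtype=float)   # cumulative log-brightness change
    mean_factor = np.zeros_like(blurred, dtype=float)
    for ev in event_frames:
        drift += contrast * ev                    # each event ~ one contrast step
        mean_factor += np.exp(drift)
    mean_factor /= len(event_frames)
    return blurred / np.maximum(mean_factor, 1e-6)

# Synthetic example: a flat grey exposure plus a small patch that brightened.
blur = np.full((64, 64), 0.5)
evs = [np.zeros((64, 64)) for _ in range(8)]
for k in range(8):
    evs[k][30, 20 + k] = 1.0                      # one positive event per step
sharp_estimate = deblur_with_events(blur, evs)
```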
Frédéric Guichard, CEO and CTO of DXOMARK, a French company that specializes in testing cameras and other consumer electronics, and that is unconnected with Paris-based Prophesee, told EE Times that the ability to deblur in these circumstances could provide definite advantages.
“Reducing motion blur [without increasing noise] would be equivalent to virtually increasing camera sensitivity,” Guichard said, noting two potential benefits: “For the same sensitivity [you could] reduce the sensor size and therefore camera thickness,” or you could maintain the sensor size and use longer exposures without motion blur.

[Image: Rendition of the Prophesee and Qualcomm technology collaboration. By combining a conventional frame camera with data from an event-based imager, motion blur can be eliminated. (Source: Prophesee)]
Judd Heape, VP for product management of camera, computer vision and video at Qualcomm Technologies, told EE Times that they can get this image enhancement with probably a 20-30% increase in power consumption to run the extra image sensor and execute the processing.
“The processing can be done slowly and offline because you don’t really care about how long it takes to complete,” Heape added.

A richer feature set

[Image: Concept of the dual event/frame sensor camera to be produced by Prophesee and Qualcomm. (Source: Prophesee)]
Event sensors, however, should make other functionalities possible, too.
Tobi Delbruck, a professor at the Institute of Neuroinformatics in Zurich, Switzerland, and founder of Prophesee competitor IniVation, told EE Times that a big group at Samsung was looking at “trying to integrate something like a DVS [event-based camera] into smartphones, and they successfully demonstrated a whole bunch of cool [features] like gesture recognition.”
At the time, Delbruck explained, it wasn’t technically feasible to execute the signal processing required to make an event-based camera work on a phone, but now, with the neural accelerators that have become increasingly powerful and efficient in mobile platforms (as on Qualcomm’s Snapdragon), this is no longer a barrier.
Qualcomm’s Heape said he is also aware of, and interested in, these other possibilities.
“We have many, many low-power use cases,” he said. Lifting a phone to your ear to wake it up is one example. Gesture-recognition to control the car when you’re driving is another.
“These event-based sensors are much more efficient for that because they can be programmed to easily detect motion at very low power,” he said. “So, when the sensor is not operating, when there’s no movement or no changes in the scene, the sensor basically consumes almost no power. So that’s really interesting to us.”
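Heape's wake-on-motion example boils down to watching the event rate: a static scene produces almost no events, so the host can sleep until the rate spikes. A hypothetical sketch of that logic follows; the threshold, window and event rates are made-up numbers, not Qualcomm or Prophesee figures.

```python
import numpy as np

def wake_on_motion(events_per_ms, wake_threshold=50, window_ms=10):
    """Toy wake-on-motion logic: stay asleep until the number of events in a
    short sliding window exceeds a threshold, i.e. something in the scene moved.
    Returns the wake time in ms, or None if the scene stayed static."""
    for t in range(len(events_per_ms) - window_ms + 1):
        if events_per_ms[t:t + window_ms].sum() > wake_threshold:
            return t + window_ms
    return None

rng = np.random.default_rng(1)
quiet = rng.poisson(0.5, size=500)   # static scene: ~0.5 noise events per ms
burst = rng.poisson(40, size=20)     # hand enters the frame: burst of events
print(wake_on_motion(np.concatenate([quiet, burst])))   # wakes shortly after 500 ms
```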
Eye-tracking could also be very useful, Heape added, because Qualcomm builds devices for augmented and virtual reality. “Eye-tracking, motion-tracking of your arms, hands, legs… are very efficient with image sensors,” he said. “In those cases, it is about power, but it’s also about frame rate. We need to track the eyes at like 900 frames per second. It’s harder to do that with a standard image sensor.”

Toward mass production

Heape explained how the collaboration will work: Qualcomm’s OEMs, such as Oppo, Vivo, Xiaomi, OnePlus, Honor, and Samsung, can “purchase a chipset and the software from [Qualcomm] and then, from Prophesee, they would also purchase the image sensor and the software… but they would have both been pre-tested by us.”
Product lines, however, are not being combined. “We’re working together to pre-integrate them before they get incorporated into the final product,” he said.
This highlights another advantage of the collaboration with Qualcomm, one that Delbruck points out: It gives Prophesee access to integrate with the Mobile Industry Processor Interface (MIPI), making it possible for the company to move into these mobile applications. Licensing this technology is expensive, so this would otherwise be a barrier to entering the mobile market.
Prophesee CEO Luca Verre told EE Times the company is close to launching its first mobile product with one OEM. “The target is to enter into mass production next year,” he said.
However, Delbruck cautioned that an intellectual property battle could get in the way—because there has long been contention about whether the Prophesee camera is too similar to earlier designs, particularly those invented at INI Zurich.
“It’s not an issue at all right now because nothing is in mass production,” he said, “but it could become an issue for them later, as happened with Caltech and the basic APS [Active Pixel Sensor] patent.”
 
  • Like
  • Fire
  • Love
Reactions: 41 users

Steve10

Regular
[Quoting the EE Times article “Experts Weigh Impact of Prophesee-Qualcomm Deal” (Sunny Bains, 03.03.2023), reproduced in full above.]

So Prophesee is supplying a sensor and software compatible with the Qualcomm SoC for camera manufacturers to buy. Looks like Qualcomm's AI tech will do the processing.
 
  • Like
  • Sad
Reactions: 9 users

Baisyet

Regular
[Quoting the EE Times article “Experts Weigh Impact of Prophesee-Qualcomm Deal” (Sunny Bains, 03.03.2023), reproduced in full above.]
Are we in, @chapman89?
 

chapman89

Founding Member
No, @Diogenese was right that Qualcomm can do the processing themselves.

My money is on consumer applications, we will find out more in the months ahead 😁
 
  • Like
Reactions: 15 users

Baisyet

Regular
No, @Diogenese was right that Qualcomm can do the processing themselves.

My money is on consumer applications, we will find out more in the months ahead 😁
Yes, hoping so, it's been a long wait :p
 
  • Like
Reactions: 6 users

TheFunkMachine

seeds have the potential to become trees.
I encourage everyone to re-listen to the Prophesee podcast with Luca and Rob.



In light of this recent news with Prophesee and Qualcomm it is very interesting listening to this conversation. At the beginning of the podcast Luca said that BrainChip is helping unlock the neuromorphic paradigm fully, and Rob said the purpose of our collaboration is to create advanced vision solutions based upon Prophesee's event-based vision system and BrainChip's Akida ultra-low power etc., creating best-in-class vision solutions.

They are talking about it being a perfect fit and that they complement each other very well!

There are so many nuggets in this podcast, and it's especially interesting in light of the Qualcomm deal.

There is no question about the excitement around the partnership of BrainChip and Prophesee, and there is even a “you complete me” moment in there where Luca said that their technology was really only half the story and now, through Akida, they have a complete system they can confidently offer their prospective customers!

And now the Qualcomm deal has surfaced. It is evident that they had been talking with Qualcomm during this podcast, as they discussed mobile integration for motion blur, but it seemed like a very recent discussion at the time. They have moved very quickly, in my understanding, to get the deal with Qualcomm. The question is: does Akida have a part in this? Does anyone have a better understanding of how Akida can help the Prophesee solution with motion blur?
 
  • Like
  • Fire
  • Love
Reactions: 18 users
Interesting link via Edge Impulse to AI wearables and Rett Syndrome. Not sure if there are any BRN holders here who also hold NEU


 
  • Like
Reactions: 7 users

jtardif999

Regular
Great find TTM,

Drive Pilot is "current".

Valeo SCALA 3 is in Drive Pilot!?

Luminar LiDAR uses “foveated” LiDAR, which focuses on point(s) of interest (denser laser point distribution) and relegates less interesting data to “peripheral vision”.

Will Mercedes mandate the use of Akida with Luminar?
I think from recollection SCALA 2 is limited to speeds of 60 km/h; SCALA 3 is supposed to double that limit due to being able to see further ahead. Level 3 certification is currently limited to 60 km/h.
 
  • Like
  • Sad
Reactions: 6 users
[Quoting the EE Times article “Experts Weigh Impact of Prophesee-Qualcomm Deal” (Sunny Bains, 03.03.2023), reproduced in full above.]
Quoting the article: “…Samsung, can purchase a chipset and the software from [Qualcomm] and then, from Prophesee, they would also purchase the image sensor and the software… but they would have both been pre-tested by us.”

If we accept the above as a correct report of how this engagement will work, then there is no reason why Prophesee cannot complete its vision by adding the AKIDA brain to the vision sensor, processing the vision there and sending metadata to Snapdragon.

In doing so it would have the advantage of reduced latency, and of reduced power that would otherwise be taken up by Snapdragon converting the unprocessed spikes from Prophesee's sensor and then processing that data.
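As a back-of-the-envelope illustration of that latency and power argument, compare shipping raw event packets to the SoC with shipping only pre-processed metadata. Every number in the sketch below is an illustrative assumption, not a measured BrainChip, Prophesee or Qualcomm figure.

```python
# Back-of-the-envelope comparison (every number is an illustrative assumption,
# not a measured BrainChip, Prophesee or Qualcomm figure): shipping raw event
# packets to the SoC versus shipping only pre-processed metadata.
EVENT_BYTES = 8                  # assumed packed size of one event (x, y, t, polarity)
EVENTS_PER_SECOND = 2_000_000    # assumed event rate for a busy scene
METADATA_BYTES = 64              # assumed per-detection summary (label + bounding box)
DETECTIONS_PER_SECOND = 30       # assumed rate of objects/gestures worth reporting

raw_bw = EVENT_BYTES * EVENTS_PER_SECOND           # bytes/s the SoC must ingest
meta_bw = METADATA_BYTES * DETECTIONS_PER_SECOND   # bytes/s if reduced at the sensor
print(f"raw events : {raw_bw / 1e6:.1f} MB/s")
print(f"metadata   : {meta_bw / 1e3:.2f} kB/s  (~{raw_bw / meta_bw:,.0f}x less data)")
```

Under those assumed numbers the SoC moves and processes several thousand times less data, which is the intuition behind the reduced-power claim.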

It just makes sense: given that Prophesee is after multiple markets, completing their sensor by adding the missing brain discussed in the podcast with Rob Telson is the logical step.

Additionally, if there is some issue with the earlier designs by third parties, as this article suggests, adding the brain may actually work to Prophesee's advantage.

Remember the Mercedes Benz words that it is incredible what a little bit of human like intelligence can achieve.

By the way, as it now seems Prophesee will be mass-producing its own sensor to supply to Qualcomm, the need for the $50 million becomes a little clearer.

My opinion only so DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 47 users

Taproot

Regular
 
  • Like
  • Love
  • Fire
Reactions: 19 users

MDhere

Regular
Well, it's confirmed, I've had my time off approved to attend the AGM once again 😀
 
  • Like
  • Fire
  • Haha
Reactions: 15 users

Love the wording.

Anil Mankar reposted this:

BrainChip (8,600 followers), 17 hours ago:

BrainChip teams for AIoT - Teksun in India has joined the Essential AI Ecosystem around BrainChip's Akida low-power event-driven neural networking chip to develop a wide range of AI-featured devices in the Internet of Things (AIoT). eeNews Europe via Nick Flaherty
https://lnkd.in/e-TDVuKP
Another independent opinion of AKIDA technology being publicly stated by a significant industry player.

Does this opinion have greater weight than that expressed by anonymous posters and WANCAs? You be the judge:

“Teksun is committed to innovation and dedicated to providing our customers with the most advanced IoT solutions,” said Brijesh Kamani, founder & CEO of Teksun.

“With BrainChip’s Akida processor, we will be able to deliver next-generation AIoT devices that are faster, more efficient, and more intelligent than ever before and not just meet, but exceed the expectations of our customers.”

Does this statement not make all the negative commentators no more than white noise to be tuned out?

I certainly believe so.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 85 users

BaconLover

Founding Member

The recent price drop (especially since the LDA announcement) may not have been a major factor in this rebalance.

Unless we see some actual material sales, the next rebalance will be worth noting.
 
  • Like
  • Fire
  • Love
Reactions: 7 users

IloveLamp

Top 20
  • Like
  • Love
  • Wow
Reactions: 11 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Talking about the new Mercedes operating system MBOS: "The entire operating system would be a game changer," Iyer stated.


[Screenshots of the article attached.]


 
  • Like
  • Love
Reactions: 15 users
Had my 1st chance to take a look into Teksun this morning and only one word springs to mind


Now try and find the time to look into the company we actually partnered with 🙄..

Hopefully you're not underwhelmed..

[GIF: Jim Carrey "stupid, stupid" reaction]
 
  • Like
  • Haha
Reactions: 11 users

Gazzafish

Regular
BMW i7. Autonomous driving and eye-movement tracking… interesting

 
  • Like
  • Fire
Reactions: 13 users