BRN Discussion Ongoing

Damo4

Regular
Since management began giving the ASX the silent treatment on news feeds about 13 months ago, daily volume has dropped from a peak of 200m to around 5m these days.
IMO, trading over the past 13 months has been dominated by selling pressure.
The broader market does not follow a BRN news feed but may screen the ASX for company announcements.
Because of management's ASX silent treatment, we have been missing out on new interest from the broader market that might have provided some balance against the selling pressure and reduced our share price depreciation over that period. IMO.
We were rich last year and looking forward to multiples more, but have rapidly suffered a 75% fall from those highs during the silent period.
Is it any wonder some astute investors have concerns? Especially while targets continue to be missed.
If Sean gets a $450k incentive bonus in shares at today's price, he will receive 300% more shares than he would have at last year's highs.
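
To make that share-count arithmetic concrete, here is a quick hypothetical sketch in Python; the $450k figure comes from the post above, while the prices are purely illustrative of a 75% fall from the high:

bonus_aud = 450_000

price_at_high = 2.00                   # illustrative all-time-high price
price_today = price_at_high * 0.25     # a 75% fall from that high

shares_at_high = bonus_aud / price_at_high
shares_today = bonus_aud / price_today

extra_pct = (shares_today / shares_at_high - 1) * 100
print(f"{shares_at_high:,.0f} shares at the high vs {shares_today:,.0f} today -> {extra_pct:.0f}% more")
# 225,000 shares at the high vs 900,000 today -> 300% more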

200m shares changing hands at over $2 is not organic; it was hype.
Pumping results in dumping unless something backs it up, e.g. revenue.
Releasing loose announcements designed to prop up the SP will just delay the inevitable.
Plus, most of you seem to think the reported revenue isn't good enough, so what would have happened on the day the 4Cs and 4Es were released while we were propped up at $2?
Finally, time management spends worrying about the SP is time they could spend attracting clients or closing deals.
I doubt a sudden $5B MC means the likes of Qualcomm finally decide to do business with BrainChip. It's the product/service that matters, and I'm glad that's their focus.
It means that with a little patience we will all be handsomely rewarded. Despite the bickering here, we all have a collective goal to build wealth, and a stable, no-fluff company that delivers on its promises will get us there in the end.

Perhaps I see investing differently, but I don't hold management responsible for the SP.
Results are important, and the Mercedes press release artificially lifted the SP to its all-time high.
That never would have happened if we had let the reports speak for themselves.
 
Reactions: 18 users

BaconLover

Founding Member
I think "the secret message group" isn't a secret message group as such, but the "Conversations" tab at top banner to the right of users name. Anyone can open up a conversation with someone else if the other is willing to reply. Or am I wrong and there is a secret group plotting Brainchips explosive growth with timelines, forecasts, NDA's, EAP, oh wait that maybe Brainchip itself. IMO

No idea, Rskiff. I have had private messages, but I am not part of a group, so I don't know what's shared there and what's not.
I was only replying to what Damo said in an earlier message: "the useful stuff is shared via private message rather than on these discussion threads."
Whether someone is part of it or not is not important, but the fact that such a statement was made is interesting.
 
Reactions: 6 users

Lex555

Regular
An announcement could only be made if there is a binding contract or licence signed. Dollar figures or numbers could not be mentioned in an announcement unless they can be substantiated to the ASX's satisfaction. They will not allow another 'GetSwift' to get through.

I tend to agree. If you don't bring the 'bacon' home, you don't get to eat. Our stock is volatile; the SP flies on a 'sniff' of revenue and falls like a stone when it does not materialise. That is why the ASX is hard on announcements by BRN. They do not want another 'GetSwift' issue.
Having said that, BRN with Akida is, in my opinion, simply ahead of its time. That is why NASA is on board. They want cutting-edge stuff.
Great invention, and we are just waiting for the rest of the world to catch up.
This ChatGPT 'thingo' has woken the world to AI, and we are one up being at the edge.
I can only speak for myself, but for the first time I can actually sense we are getting closer to revenue agreements.
With the world awakening, there will definitely be some who want to commercialise quickly to be first in, which will lead to some revenue.
I think hindsight will show we were smart to hire Sean Hehir, who quickly transformed marketing and sales.
From the podcast, they hinted that consumables and industrial would be short-term wins.
“This ChatGPT 'thingo' has woken the world to AI, and we are one up being at the edge.”

Great point. Of all the headwinds in 2023, the revelation that has been ChatGPT and other large language models (LLMs) is putting AI front and centre of every new start-up around the world. As a well-known VC commentator recently noted, if a software company doesn't include AI in its pitch deck, or at least discuss leveraging off it, they don't want to know about it.

I believe this will cause companies to dive deep into the AI sector to differentiate themselves from other LLM users, and what better way than a bit of science fiction mixed in.

This is a paradigm shift, turbocharging the tech industry.
 
Reactions: 10 users

Damo4

Regular
I think "the secret message group" isn't a secret message group as such, but the "Conversations" tab at top banner to the right of users name. Anyone can open up a conversation with someone else if the other is willing to reply. Or am I wrong and there is a secret group plotting Brainchips explosive growth with timelines, forecasts, NDA's, EAP, oh wait that maybe Brainchip itself. IMO
"edit" , we are the secret group here at TSex so am thankful for the opportunity that zeebot provided us. The vast majority of shareholders probably don't know this even exits!!
No idea Rskiff, I have had private messages, but I am not part of a group, so don't know what's shared there, what's not.
I was only replying to what Damo said in a message earlier ''the useful stuff is shared via private message rather than on these discussion threads.''
Whether someone is a part of it or not is not important, but the fact that such a statement was made is interesting.

There you go, BL. Perhaps I wasn't very clear, but Rskiff gets it.
"Shared via private message" means just that.
You used the word "group".
 

BaconLover

Founding Member
There you go, BL. Perhaps I wasn't very clear, but Rskiff gets it.
"Shared via private message" means just that.
You used the word "group".
You're also able to add as many people as you want to this "private message".
If you are not in it, you would not know whether it is a group or not, would you?

Again, no worries. It is not unconstitutional; it's a free country/forum.
 
Reactions: 7 users

Damo4

Regular
You're also able to add as many people as you want to this "private message".
If you are not in it, you would not know whether it is a group or not, would you?

Again, no worries. It is not unconstitutional; it's a free country/forum.

.

Edit: I also need to stop pursuing this useless conversation.
BRN is doing exactly what I need it to do.
 
Reactions: 13 users
Far out

So much hate here

Get over yourselves

Go find something useful to do

This used to be a happy place when I had a spare 10 minutes

Not anymore
 
Reactions: 21 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Howdy All,

This is showing as having been published 3 hours ago. I don't think it's been posted, but if so, I'll delete.

💋


Neuromorphic vision sensors are coming to smartphones

By Dan O'Shea, Feb 28, 2023, 03:45pm


A new partnership teaming Prophesee and Qualcomm will optimize event-based neuromorphic vision sensors for use in mobile devices, resulting in better images from device cameras. (Prophesee).

Prophesee, a provider of event-based neuromorphic vision sensor technology, announced a partnership with Qualcomm Technologies at the massive Mobile World Congress event in Barcelona, Spain, this week, a collaboration that will see Prophesee’s Metavision sensors optimized for use with Qualcomm’s Snapdragon platform to bring neuromorphic-enabled image capture capabilities to mobile devices.

Event-based neuromorphic vision sensor technology could be a game changer for camera performance, with neuromorphic capabilities processing movement and moments closer to the way the human brain processes them–with less blurring and more clarity than a frame-based camera can manage. Specifically, the technology allows cameras to perform better while capturing fast movements and scenes in low lighting in their photos and videos. For the most part, the consumer marketplace is still waiting to get their hands on devices with these capabilities, but the new partnership means that wait is growing shorter.

Later this year, Prophesee plans to release a development kit to support the integration of the Metavision sensor technology for use with devices that contain next-generation Snapdragon platforms. After that, it will not be too much longer before consumers can experience the benefits of event-based neuromorphic vision sensor technology themselves.

Luca Verre, co-founder and CEO of Prophesee, told Fierce Electronics, “We expect phones with this feature/capability to be in the market by 2024. It will likely appear in ‘flagship’ models first.”

When the technology arrives, it is not expected to replace traditional frame-based sensors, but instead work in tandem with them, with much of the camera performance improvement coming through Prophesee’s event-based continuous and asynchronous pixel sensing approach which will help in the “de-blurring of images” and the highlighting of focal points that otherwise might become lost where low lighting intrudes on a captured moment.


That could make consumers much happier about the quality of the pictures they take on their mobile devices, although there is a good chance they may not even know they will have neuromorphic sensors to thank for the improvement, as they probably will not have to think about switching into a different photo capture mode to take advantage of neuromorphic sensing.

“It’s unlikely that smartphones would have a ‘neuromorphic mode,’ but instead would work seamlessly with the existing image capture capabilities - but, in theory, that could be something the OEM could consider,” Verre said. “Note that using an event based camera actually reduces the amount of data processed, capturing only things in a scene that move, which are often ‘invisible’ to traditional cameras, so it is largely an augmentation of traditional frame based cameras (and other sensors, such as lidar in a car), not a replacement, especially in consumer applications.”

These sensors already are used in other kinds of applications, including business and industrial use cases such as security cameras, surveillance, preventative maintenance, vibration monitoring, high speed counting, and others where event cameras can work “as a standalone machine vision option,” Verre said, adding, “There is vast potential in the idea of sensor fusion, combining event-based sensors with other types of sensors, like frame-based cameras.”

The Qualcomm partnership comes almost a year after Prophesee announced a partnership with Sony that revolved around enabling improved integration of event-based sensing technology into devices, and Verre said the migration of the technology to mobile phones likely will be smoother as a result of the earlier partnership. Working with Sony, a leading CMOS sensor provider for the mobile industry, helped make the sensors “more applicable for mobile (smaller size, lower power, etc.) with 3D stacking manufacturing techniques,” he said.

Prophesee also sees the technology as having potential in other mobile and wearable device applications, such as augmented reality headsets. Prophesee already is talking to OEMs about moving in that direction, and Verre said the company believes that “by enabling a paradigm shift in sensing modalities with this approach, there are countless applications that can benefit.”
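
To illustrate Verre's point about event cameras reporting only what changes in a scene, here is a minimal hypothetical sketch in Python. It models events as per-pixel brightness changes between two frames that exceed a threshold; real event sensors, including Prophesee's Metavision, operate asynchronously per pixel on log intensity, so this is just a toy model of why a mostly static scene produces so little data:

import numpy as np

def frame_to_events(prev_frame, curr_frame, threshold=15):
    # Return (row, col, polarity) for every pixel whose brightness changed
    # by more than `threshold`; static pixels generate no events at all.
    diff = curr_frame.astype(np.int16) - prev_frame.astype(np.int16)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    polarity = np.sign(diff[rows, cols])   # +1 brighter, -1 darker
    return np.stack([rows, cols, polarity], axis=1)

# Toy scene: 100x100 pixels, only a 5x5 patch changes between frames.
prev = np.full((100, 100), 100, dtype=np.uint8)
curr = prev.copy()
curr[40:45, 60:65] += 50    # the only "motion" in the scene

events = frame_to_events(prev, curr)
print(f"{curr.size} pixels per frame, but only {len(events)} events")
# 10000 pixels per frame, but only 25 events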

 
Reactions: 90 users

Deleted member 2799

Guest
I simply cannot believe that in today's age, there are still people who invest in stocks and base their actions on what anonymous individuals write in a forum. Not only because it is always emphasized that one should do their own research, but also because it is frightening to think that one is entrusting their hard-earned money to strangers without knowing them. A forum serves to exchange information in order to get a rough idea of the company and to catch up on anything missed. But it is by no means a platform for investment advice! So don't blame those who post positive or negative comments here, but blame yourselves because you allow yourselves to be influenced. Seek out an investment advisor and attend seminars if you don't know how the stock market works! Just my opinion... no trading recommendations.
 
Reactions: 22 users

Deleted member 118

Guest
@DingoBorat up to 10 more bloody days. Hopefully we will be at 38c lol
 
Reactions: 6 users

BaconLover

Founding Member
Reactions: 17 users
@DingoBorat up to 10 more bloody days. Hopefully we will be at 38c lol View attachment 30881
Yeah, I got that one too...

"There is nothing additional you need to do"

Until they send you an email saying something like..

"There's a problem with completing your application blah blah go into your nearest branch, sit on a chair for half an hour and then verify/change a couple of things, or you could just send an email, which we will reply to within 5 business days and then ask you to come into a branch"..

Check your emails regularly.

I'd call every couple of days and check everything is on track 👍
 
Reactions: 5 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Wonder if they used Prophesee's camera?

Extract
In a new study, researchers at the Indian Institute of Science (IISc) show how a brain-inspired image sensor can go beyond the diffraction limit of light to detect miniscule objects such as cellular components or nanoparticles invisible to current microscopes.

 
Reactions: 8 users

Dozzaman1977

Regular
(Quoting Bravo's post above: the Fierce Electronics article "Neuromorphic vision sensors are coming to smartphones".)

FINGERS CROSSED 🤞
It would be nice to see a partnership with a Fortune 500 company or household name drop at some stage (via social media, of course!!!!)
 
Reactions: 17 users

Learning

Learning to the Top 🕵‍♂️
Reactions: 17 users

equanimous

Norse clairvoyant shapeshifter goddess
(Quoting Bravo's post above: the Fierce Electronics article "Neuromorphic vision sensors are coming to smartphones".)

Nice find
 
Reactions: 5 users

Steve10

Regular
(Quoting Bravo's post above: the Fierce Electronics article "Neuromorphic vision sensors are coming to smartphones".)


Qualcomm mentioned a 2023 launch, but Prophesee has mentioned 2024.

Based on all the new phone models being released in February/March, Akida/Prophesee will most likely be in the top-end models of Samsung, Oppo, Vivo etc. next year.

So if the new models will be hitting stores in February/March 2024, they must go into production a few months earlier, towards the end of CY2023.

For Akida IP to be used in production, a licence agreement will have to be signed between now and, say, October/November this year. It all depends on the production commencement date.
 
Reactions: 23 users

Susie

Emerged
I don't normally post, but I research almost daily while waiting for the Qualcomm deal to come to fruition one day.

I know it's painful watching the SP trending down.

I'd suggest anyone with a long-term goal not focus on the daily share price swings, but instead look for any business progress along the way to that goal.
 

wilzy123

Founding Member
(Quoting Bravo's post above: the Fierce Electronics article "Neuromorphic vision sensors are coming to smartphones".)

Love it! Thanks @Bravo
 
Reactions: 11 users