BRN Discussion Ongoing

mrgds

Regular
LET'S GET THIS PARTY STARTED ..........................

BOOOOOOOOOOM ...................... .44c


AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 23 users

charles2

Regular
LET'S GET THIS PARTY STARTED ..........................

BOOOOOOOOOOM ...................... .44c


AKIDA BALLISTA
My take.........

THE PARTY HAS ALREADY STARTED
 
Last edited:
  • Like
  • Haha
  • Fire
Reactions: 15 users

Tothemoon24

Top 20

Prophesee Announces Collaboration with Qualcomm to Optimize Neuromorphic Vision Technologies For the Next Generation of Smartphones, Unlocking a New Image Quality Paradigm for Photography and Video​




Companies agree on a multi-year collaboration to enable native compatibility between Prophesee’s Event-Based Metavision® Sensors & Software and premium Snapdragon® mobile platforms.


Psee-Qualcomm.jpg

Highlights



  • The world is neither raster-based nor frame-based. Inspired by the human eye, Prophesee Event-Based sensors repair motion blur and other image quality artefacts caused by conventional sensors, especially in high dynamic scenes and low-light conditions, bringing Photography and Video closer to our true experiences.
  • Collaborating with Qualcomm Technologies, Inc., a leading provider of premium mobile technologies, to help accelerate mobile industry adoption of Prophesee’s solutions.
  • Companies join forces to optimize Prophesee’s neuromorphic Event-Based Metavision Sensors and software for use with the premium Snapdragon mobile platforms. Development kits expected to be available from Prophesee this year.
Prophesee-Image-Deblurring-Mobile-Event-Based-Metavision-HD.jpg


PARIS, February 27, 2023 – Prophesee today announced a collaboration with Qualcomm Technologies, Inc. that will optimize Prophesee’s Event-based Metavision sensors for use with premium Snapdragon® mobile platforms to bring the speed, efficiency, and quality of neuromorphic-enabled vision to mobile devices.
The technical and business collaboration will provide mobile device developers a fast and efficient way to leverage the Prophesee sensor’s ability to dramatically improve camera performance, particularly in fast-moving dynamic scenes (e.g. sport scenes) and in low light, through its breakthrough event-based continuous and asynchronous pixel sensing approach. Prophesee is working on a development kit to support the integration of the Metavision sensor technology for use with devices that contain next generation Snapdragon platforms.



“Prophesee is a clear leader in applying neuromorphic techniques to address limitations of traditional cameras and improve the overall user experience. We believe this is game-changing technology for taking mobile photography to the next level, and our collaboration on both the technical and business levels will help drive adoption by leading OEMs,” said Judd Heape, VP, Product Management at Qualcomm Technologies, Inc. “Their pioneering achievements with event cameras’ shutter-free capability offer a significant enhancement to the quality of photography available in the next generation of mobile devices powered by Snapdragon, even in the most demanding environments, unlocking a range of new possibilities for Snapdragon customers.”

Video 2 – Judd Heape, VP, Product Management at Qualcomm Technologies, introducing Prophesee Metavision technologies during Snapdragon Summit.
“We are excited to be working with the provider of one of the world’s most popular mobile platforms to incorporate event-based vision into the Snapdragon ecosystem. Through this collaboration, product developers will be able to dramatically enhance the user experience with cameras that deliver image quality and operational excellence not available using just traditional frame-based methods,” said Luca Verre, CEO and co-founder of Prophesee.
How it works
Prophesee’s breakthrough sensors add a new sensing dimension to mobile photography. They change the paradigm in traditional image capture by focusing only on changes in a scene, pixel by pixel, continuously, at extreme speeds.

Each pixel in the Metavision sensor embeds a logic core, enabling it to act as a neuron.
They each activate themselves intelligently and asynchronously, depending on the amount of photons they sense. A pixel activating itself is called an event. In essence, events are driven by the scene’s dynamics, not an arbitrary clock anymore, so the acquisition speed always matches the actual scene dynamics.
High-performance event-based deblurring is achieved by synchronizing a frame-based sensor with Prophesee’s event-based sensor. The system then fills the gaps between and inside the frames with microsecond events to algorithmically extract pure motion information and repair motion blur.
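To make the idea of events concrete, here is a minimal Python sketch of a contrast-threshold event pixel. This is my own illustration under simplified assumptions (intensity frames as input, a single global threshold), not Prophesee's Metavision SDK: each pixel fires an event whenever its log intensity has changed by more than the threshold since that pixel's last event.

```python
# Minimal sketch of event-based pixel sensing (illustrative only, not the
# Metavision SDK). Each pixel emits an event when its log intensity has
# changed by more than a contrast threshold since that pixel's last event.
import numpy as np

def generate_events(frames, timestamps_us, threshold=0.2):
    """frames: (T, H, W) array of intensities; timestamps_us: length-T times.
    Returns a list of (t_us, x, y, polarity) tuples."""
    log_ref = np.log(frames[0].astype(np.float64) + 1e-6)  # per-pixel reference
    events = []
    for frame, t in zip(frames[1:], timestamps_us[1:]):
        log_i = np.log(frame.astype(np.float64) + 1e-6)
        diff = log_i - log_ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)   # pixels that "fire"
        for y, x in zip(ys, xs):
            events.append((t, x, y, 1 if diff[y, x] > 0 else -1))
            log_ref[y, x] = log_i[y, x]                  # reset that pixel's reference
    return events

# Example: a bright dot moving across an otherwise static scene only
# produces events along its path; static pixels stay silent.
frames = np.full((5, 8, 8), 10.0)
for i in range(5):
    frames[i, 4, i] = 200.0
events = generate_events(frames, timestamps_us=[0, 100, 200, 300, 400])
print(len(events), "events")
```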

Availability


A development kit featuring compatibility with Prophesee sensor technologies is expected to be available this year.
 
  • Like
  • Fire
  • Love
Reactions: 46 users
Welcome to the ASX300, DroneShield..
Edit - All Ords, not ASX300

20240304_145810.jpg


20240304_145837.jpg


And they have strong revenue growth, have just become strongly profitable and have a very healthy balance sheet.

20240304_150156.jpg


I don't have shares in them, but I have kept an eye on them since they were mentioned in a BrainChip post a few years ago..

I remember commenting that the technological advantage of their products was at risk from more intelligent drones, so they would have to be across what our technology can do for drones, and possibly for themselves.

The shorters salivate, whenever a new company enters one of those indices.
 
Last edited:
  • Like
  • Sad
  • Love
Reactions: 15 users

Teach22

Regular
Welcome to the ASX300, DroneShield..

View attachment 58400

View attachment 58401

And they have strong revenue growth, have just become strongly profitable and have a very healthy balance sheet.

View attachment 58402

I don't have shares in them, but I have kept an eye on them since they were mentioned in a BrainChip post a few years ago..

I remember commenting that the technological advantage of their products was at risk from more intelligent drones, so they would have to be across what our technology can do for drones, and possibly for themselves.

The shorters salivate, whenever a new company enters one of those indices.
Since when was/is DRO admitted to the ASX300?
 

Diogenese

Top 20

Prophesee Announces Collaboration with Qualcomm to Optimize Neuromorphic Vision Technologies For the Next Generation of Smartphones, Unlocking a New Image Quality Paradigm for Photography and Video​




Companies agree on a multi-year collaboration to enable native compatibility between Prophesee’s Event-Based Metavision® Sensors & Software and premium Snapdragon® mobile platforms.


Psee-Qualcomm.jpg

Highlights



  • The world is neither raster-based nor frame-based. Inspired by the human eye, Prophesee Event-Based sensors repair motion blur and other image quality artefacts caused by conventional sensors, especially in high dynamic scenes and low-light conditions, bringing Photography and Video closer to our true experiences.
  • Collaborating with Qualcomm Technologies, Inc., a leading provider of premium mobile technologies, to help accelerate mobile industry adoption of Prophesee’s solutions.
  • Companies join forces to optimize Prophesee’s neuromorphic Event-Based Metavision Sensors and software for use with the premium Snapdragon mobile platforms. Development kits expected to be available from Prophesee this year.
Prophesee-Image-Deblurring-Mobile-Event-Based-Metavision-HD.jpg


PARIS, February 27, 2023 – Prophesee today announced a collaboration with Qualcomm Technologies, Inc. that will optimize Prophesee’s Event-based Metavision sensors for use with premium Snapdragon® mobile platforms to bring the speed, efficiency, and quality of neuromorphic-enabled vision to mobile devices.
The technical and business collaboration will provide mobile device developers a fast and efficient way to leverage the Prophesee sensor’s ability to dramatically improve camera performance, particularly in fast-moving dynamic scenes (e.g. sport scenes) and in low light, through its breakthrough event-based continuous and asynchronous pixel sensing approach. Prophesee is working on a development kit to support the integration of the Metavision sensor technology for use with devices that contain next generation Snapdragon platforms.





Video 2 – Judd Heape, VP, Product Management at Qualcomm Technologies, introducing Prophesee Metavision technologies during Snapdragon Summit.

How it works
Prophesee’s breakthrough sensors add a new sensing dimension to mobile photography. They change the paradigm in traditional image capture by focusing only on changes in a scene, pixel by pixel, continuously, at extreme speeds.

Each pixel in the Metavision sensor embeds a logic core, enabling it to act as a neuron.
They each activate themselves intelligently and asynchronously, depending on the amount of photons they sense. A pixel activating itself is called an event. In essence, events are driven by the scene’s dynamics, not an arbitrary clock anymore, so the acquisition speed always matches the actual scene dynamics.
High-performance event-based deblurring is achieved by synchronizing a frame-based sensor with Prophesee’s event-based sensor. The system then fills the gaps between and inside the frames with microsecond events to algorithmically extract pure motion information and repair motion blur.

Availability


A development kit featuring compatibility with Prophesee sensor technologies is expected to be available this year.

This is the Prophesee patent for combining the DVS with frame camera.

US2022329771A1 METHOD OF PIXEL-BY-PIXEL REGISTRATION OF AN EVENT CAMERA TO A FRAME CAMERA 20210402

1709526743592.png


A method for registering pixels provided in a pixel event stream comprising:
acquiring image frames from a frame-based camera, each image frame being generated using an exposure period;
generating a first point matrix from one or more of the image frames, the first point matrix being associated with an acquisition period of the image frames;
acquiring a pixel event stream generated during the acquisition period;
generating a second point matrix from pixel events of the pixel event stream, occurring during the acquisition period of the first point matrix;
computing a correlation scoring function applied to at least a part of the points of the first and second point matrices, and
estimating respective positions of points of the second point matrix in the first point matrix, due to depths of the points of the first point matrix related to the second point matrix, by maximizing the correlation scoring function.
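
As a rough illustration of what the claim describes, the sketch below reduces the registration to a single global shift that maximises a correlation score between a frame-derived point matrix and an event-derived point matrix. This is my own simplification, not Prophesee's implementation: the patent estimates positions pixel by pixel, with offsets that depend on scene depth.

```python
# Heavily simplified sketch of the claim above: find one global shift that
# maximises a correlation score between a point matrix built from a frame and
# a point matrix built from events. The patent does this per pixel, with the
# shift depending on scene depth.
import numpy as np

def point_matrix_from_frame(frame):
    # Use gradient magnitude as the frame's "points" (edges are where motion
    # produces events).
    gy, gx = np.gradient(frame.astype(np.float64))
    return np.hypot(gx, gy)

def point_matrix_from_events(events, shape):
    # Accumulate events that occurred during the frame's acquisition period.
    m = np.zeros(shape)
    for t_us, x, y, p in events:
        m[y, x] += 1.0
    return m

def register(frame_pm, event_pm, max_shift=5):
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(event_pm, dy, axis=0), dx, axis=1)
            score = float(np.sum(frame_pm * shifted))   # correlation score
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift, best_score
```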


FIG. 4 depicts the exposure timings of a vertical rolling shutter sensor. Each row R0, R1, . . . , Rn of pixels of the rolling shutter sensor is exposed during an exposure period of the same duration Tr, shown in FIG. 4 by a rectangle RE. The start of the exposure period RE of each row R0-Rn is offset by the rolling shutter skew Rs compared to the previous row. In contrast, the exposure periods of the pixel rows in a global shutter sensor are all identical, as shown in FIG. 4 by a single central rectangle GE. The duration T of the exposure period of the global shutter sensor may be the same as or different from the duration Tr of the row exposure periods.

If the sensor has a vertical rolling shutter starting from top to bottom, and the y coordinate varies from 0 (top of the sensor) to height-1 (bottom or last row of the sensor), function Es(x,y) can be computed as Es(x,y)=ts+y·Rs. Function Es(x,y) has a same value for all pixels lying on the same row.

Function Ee(x,y) can be computed as Ee(x,y)=Es(x,y)+Tr, where Tr is the duration of the exposure period of one row of image I. Alternatively, function Ee can also be computed as Ee(x,y)=te+y·Rs, te being the time of the end of the exposure period at the first row of the frame I.
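
The exposure-window functions Es and Ee described above are straightforward to compute; a small sketch using the symbols from the text (the ts, Rs and Tr values below are assumed examples):

```python
# Sketch of the exposure-window functions described above: ts/te are the
# start/end of exposure of the first row, Rs is the rolling shutter skew
# between rows, Tr is the per-row exposure duration.
def exposure_start(y, ts, Rs):
    """Es(x, y): start of exposure for a pixel on row y (same for all x)."""
    return ts + y * Rs

def exposure_end(y, ts, Rs, Tr):
    """Ee(x, y) = Es(x, y) + Tr; equivalently te + y * Rs."""
    return exposure_start(y, ts, Rs) + Tr

# Example: row 480 of a sensor with 10 us skew and a 4 ms row exposure.
ts_us, Rs_us, Tr_us = 0.0, 10.0, 4000.0
print(exposure_start(480, ts_us, Rs_us), exposure_end(480, ts_us, Rs_us, Tr_us))
```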

It can be observed that the point matrix J is defined only for points corresponding to pixel events occurring during the exposure period T. The other points are undefined and may be set to any arbitrary value, for example 0 or the value of the nearest defined point.
 
  • Like
  • Fire
  • Thinking
Reactions: 22 users

Shadow59

Regular
  • Like
Reactions: 1 users

Teach22

Regular
  • Like
Reactions: 3 users
Since when was/is DRO admitted to the ASX300?
March rebalance.
Edit - this is for DRO's inclusion into the All Ords, not ASX300.

20240304_152958.jpg


It's supposed to be effective from the 18th of March, but some are getting in early..

Since they are in such a strong position, these shorts are probably more about pushing the price down ahead of the funds having to rebalance, as the price may rise on actual inclusion.
 
Last edited:
  • Like
  • Fire
Reactions: 6 users

Teach22

Regular
Just to clarify.
There is a difference between ASX200, ASX300 and All Ords.
WBT will be removed from the ASX200 but is still in the ASX300.
DRO will be added to the All Ords, not the ASX300
 
  • Like
Reactions: 7 users
Just to clarify.
There is a difference between ASX200, ASX300 and All Ords.
WBT will be removed from the ASX200 but is still in the ASX300.
DRO will be added to the All Ords, not the ASX300
"DRO will be added to the All Ords, not the ASX300"
Edit - I read the rebalance incorrectly.

Man you're smarter than that..
I'm pretty sure the All Ords represents pretty much everything?

Edit..
What is the difference between the S&P/ASX 200 and the All Ordinaries? The S&P/ASX 200 index is rebalanced every quarter and has a set minimum market capitalisation and liquidity requirement. The All Ordinaries index is rebalanced annually and consists of the 500 largest ASX listed stocks by market capitalisation.

Read the March quarterly rebalance announcement.
Edit - He read it better than me.
 
Last edited:
  • Like
Reactions: 1 users

Teach22

Regular
"DRO will be added to the All Ords, not the ASX300"

Man you're smarter than that..
I'm pretty sure the All Ords represents pretty much everything?

Read the March quarterly rebalance announcement.

All Ords is, I believe, the top 500 companies.
 
Last edited:
  • Like
  • Love
Reactions: 5 users
All Ords is, I believe, the top 500 companies.
Yes you are correct, it was their entry into the All Ords, not ASX300.

I will edit my previous posts 👍

I apologise to you 👍
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 11 users

7für7

Top 20
I’m so glad this forum is all about BRN 🙋🏻‍♂️
 
  • Haha
  • Like
  • Love
Reactions: 17 users

wilzy123

Founding Member
Maybe tell your friend to follow their own advice. Frankly, if they can't be more generous and less of a narcissist, if they can't be slightly distracted by the improving SP, then move on. Why do they want to target individual contributors anyway?

At least you're consistent

72884c7f98149bd422e488510277f2b0b9-20-dumpster-fire.rsquare.w700.gif
 
  • Haha
  • Like
  • Fire
Reactions: 11 users

Boab

I wish I could paint like Vincent
In an article in the Weekend Australian it was reported that a group of Australian investors, from James Packer's CPH to Macquarie, had amassed a combined US$10Bn (A$15.4Bn) holding in Nvidia.
Using this forum's 1% rule: $15,400,000,000 ÷ 100 = $154,000,000, which at 40c/share is 385,000,000 shares.
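For anyone who wants to sanity-check that arithmetic, a minimal sketch using the figures quoted above (the 1% weighting is only this forum's rule of thumb, not any disclosed allocation):

```python
# Sanity check of the arithmetic above. The 1% weighting is just the forum's
# rule of thumb, not a disclosed allocation.
holding_aud = 15_400_000_000                   # combined Nvidia holding, A$
hypothetical_brn_slice = holding_aud * 0.01    # the "1% rule"
share_price = 0.40                             # A$ per share
shares = hypothetical_brn_slice / share_price
print(f"1% slice: A${hypothetical_brn_slice:,.0f} -> {shares:,.0f} shares")
# -> 1% slice: A$154,000,000 -> 385,000,000 shares
```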
It's easy to see how these institutional buyers can amass huge amounts of shares for what appears to be a tiny part of their portfolio.
On top of all the news that we know about, it will only take one decent ASX announcement to put a rocket under this share price, and all the faithful retail holders will be handsomely rewarded.
GLTA
Cheers
 
  • Like
  • Love
  • Fire
Reactions: 44 users

mrgds

Regular
THANKS @Bravo ,,,,,,,,,,,,,,,,,,,,,,,,,,,,,Job well done today !!!!!

You deserve this well-earned cat-nap ..........................................(y)

Screenshot (81).png



Just make sure you're up and at 'em again tomorrow!



BTW ............ anyone notice the XT after the auction of nearly 1.8 million @ .435? :eek:;)

AKIDA BALLISTA
 
  • Like
  • Haha
  • Wow
Reactions: 32 users

Sirod69

bavarian girl ;-)

Easing automotive software migration: From discrete ECUs to Zonal Controllers in emerging EE architectures​


....

One way to achieve FFI (freedom from interference) is by sandboxing each software component into virtual machines isolated by a separation kernel. Armv8-R supports this feature by means of real-time virtualization. By using a hypervisor, or a simpler separation kernel, on Armv8-R based processors like Cortex-R52 and Cortex-R52+, it is possible to achieve FFI among multiple software workloads.

Therefore, Cortex-R52 and Cortex-R52+ processors offer an ideal platform for building zonal controllers that can be useful for deploying multiple software workloads, which are currently running on discrete ECUs, many of which are based on Arm Cortex-M processors. For more information on virtualization supported by Armv8-R, please refer to the Best Practices for Armv8-R Cortex-R52+ Software Consolidation.
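To illustrate the freedom-from-interference idea in the abstract, here is a toy Python model of a separation kernel confining each consolidated workload to its own memory window. It is purely conceptual: real Armv8-R consolidation relies on the hardware MPU and hypervisor/separation-kernel software, not anything like this, and the workload names below are made up.

```python
# Toy model of freedom-from-interference (FFI) between consolidated workloads.
# Conceptual illustration only, not Armv8-R, MPU, or hypervisor code: each
# workload is confined to its own memory window, and the "separation kernel"
# rejects any access outside that window.
class SeparationKernel:
    def __init__(self):
        self.memory = bytearray(1024)
        self.partitions = {}                      # workload -> (base, size)

    def create_partition(self, name, base, size):
        self.partitions[name] = (base, size)

    def write(self, name, offset, data):
        base, size = self.partitions[name]
        if offset < 0 or offset + len(data) > size:
            raise PermissionError(f"{name}: access outside its partition")
        self.memory[base + offset: base + offset + len(data)] = data

kernel = SeparationKernel()
kernel.create_partition("brake_ecu_workload", base=0, size=512)
kernel.create_partition("lighting_ecu_workload", base=512, size=512)
kernel.write("brake_ecu_workload", 0, b"ok")          # allowed
try:
    kernel.write("lighting_ecu_workload", 600, b"x")  # spills past its window
except PermissionError as e:
    print(e)
```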
1709537042231.png

 
  • Like
  • Thinking
  • Fire
Reactions: 10 users