BRN Discussion Ongoing

stan9614

Regular
"...The second-generation Akida processor IP is available now from BrainChip for inclusion in any SoC and comes complete with a software stack tuned for this unique architecture. We encourage companies to investigate this technology, especially those implementing time series or sequential data applications."

....nice wording....!
I don't understand how Akida 2.0 IP can be available while the company is yet to announce general availability of Akida 2.0 by the end of September?
 
  • Like
  • Fire
  • Thinking
Reactions: 5 users

Diogenese

Top 20
Prophesee has just released an industrial event-based vision sensor:

Prophesee releases industrial-grade neuromorphic sensor


Prophesee has released what it says is the first industrial event-based vision sensor in a commercially viable, industry standard package.
Event-based vision technology takes a different imaging approach to traditional frame-based sensors. Each pixel in Prophesee’s Metavision sensor only activates if it detects a change in the scene – an event – which means low power, latency and data processing requirements compared to traditional frame-based systems.
The third-generation VGA sensor is aimed at developers of cameras for industrial automation and IoT systems such as robots, inspection equipment, monitoring, and surveillance devices.
Prophesee has been working with camera maker Imago Technologies, which has integrated Prophesee’s sensor inside its VisionCam smart camera. This Metavision sensor release opens up event-based imaging to many more camera makers and the wider industrial imaging market.
‘A lot has been said about the performance [of event-based sensors]; a lot of claims have been made, but most of this comes from academia,’ Luca Verre, co-founder and CEO of Prophesee, told Imaging and Machine Vision Europe. ‘When we started the company we worked to prove the technology, and it took effort to industrialise it. Now it’s ready to use.’
Prophesee was formed five years ago, originally under the name of Chronocam. It has raised $68m in investment to date, and now has more than 100 employees, most of them based in Paris, France, but with offices in Grenoble, Silicon Valley, Shanghai and Tokyo. Its neuromorphic vision technology – it mimics the function of biological sight – can run at an equivalent of 10,000 images per second, with high dynamic range, and excellent power efficiency.
Each pixel is independent and asynchronous, adjusting at the pixel level according to the dynamic and the light in the scene. If part of the scene is static – like the floor in an industrial setting – no information will be recorded. Slow changes in the scene will be sampled slowly; if something fast happens the pixel will react quickly.
Because the sensor is only recording dynamic events, the data volume is relatively low. ‘The amount of data we produce is orders of magnitude lower than what you would get with a frame camera,’ Verre said.
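The per-pixel, change-driven readout described above can be illustrated with a simplified frame-differencing model. This is a sketch only: a real event sensor such as the Metavision chip is asynchronous per pixel with no frame clock, and the function name and threshold here are illustrative assumptions, not Prophesee's API.

```python
import numpy as np

def frame_to_events(prev, curr, threshold=0.1, t=0.0):
    """Emit (x, y, t, polarity) events for pixels whose log-intensity
    changed by more than `threshold` between two snapshots.

    Simplified frame-differencing model of an event camera: a real
    sensor is asynchronous and per-pixel, with no global frame clock,
    so this is illustrative only.
    """
    eps = 1e-6  # avoid log(0)
    delta = np.log(curr + eps) - np.log(prev + eps)
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    polarity = np.sign(delta[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return [(int(x), int(y), t, int(p)) for x, y, p in zip(xs, ys, polarity)]

# A mostly static 640x480 scene (VGA, like the sensor) with one small
# 10x10 patch that brightens between snapshots:
prev = np.full((480, 640), 0.5)
curr = prev.copy()
curr[100:110, 200:210] = 0.9

events = frame_to_events(prev, curr)
print(len(events))  # 100 events, vs the 640*480 = 307,200 pixel reads of a full frame
```

On this toy scene only the 100 changed pixels produce events, against the 307,200 pixel reads a frame camera would make every frame, which is the "orders of magnitude" data reduction Verre describes.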
He said that the efficiency of the sensor means high-speed counting, operating at thousands of frames per second, can be run on a mobile system-on-chip, such as a Snapdragon 845. Imago’s VisionCam product carries out image processing internally on a computer board.
Prophesee is working on a project counting droplets being deposited very quickly on a target object. ‘We can do this at high precision and with low compute,’ Verre said.
Monitoring vibration of machines for predictive maintenance is another application Prophesee is involved in. The sensor can measure vibration frequency and amplitude in real time to give an indication of when a machine might be about to fail.
A more traditional approach would be to use an accelerometer, but an image sensor is non-intrusive and can monitor inaccessible parts of the machine or areas that get very hot.
Prophesee has also deployed its sensor with a large machine tools manufacturer to monitor laser welding. The exposure of each pixel in the Metavision sensor can be adjusted independently, giving a wide dynamic range, making it useful for imaging a bright weld spot.
The customer is looking to track the laser spot as it moves along the seam, as well as monitoring the spatter or debris produced by the weld – the size of debris particles can give an indication of the quality of the weld. The aim is for the system to provide a closed feedback loop, so that the laser parameters can be controlled in real time for a higher quality weld.
Prophesee is also exploring ways of inspecting mobile phone screens for surface damage, by vibrating the phone and measuring how light is scattered from the surface – light will reflect differently depending on whether the phone is scratched or completely flat and undamaged.
It has also shown that the sensor can be used for volume measurements by combining it with structured light.
The sensor is supported by a software development kit (SDK), a full set of drivers, the Prophesee Player tool for recording sequences and visualising data, and an online knowledge centre containing useful resources for developers.
The chip is available in a 13 x 15mm mini PBGA package. The sensor has a 640 x 480-pixel resolution with 15μm pixels in a 3/4-inch optical format. It is manufactured in a 0.18μm specialised CIS process.
Verre said that the company’s technology follows the same rules as conventional CMOS sensors in terms of geometry and optics. The difference comes at the level of processing, he added. The firm’s SDK includes turnkey solutions for counting, tracking, optical flow, 3D measurements, and vibration measurements, but it is also working with and training distributors and system integrators.
Engineering samples of the fourth generation of the sensor are already available.

©2023 Europa Science Ltd


Some of their upcoming developments look interesting:


1. Prophesee is working on a project counting droplets being deposited very quickly on a target object. ‘We can do this at high precision and with low compute,’ Verre said.
2. Monitoring vibration of machines for predictive maintenance is another application Prophesee is involved in. The sensor can measure vibration frequency and amplitude in real time to give an indication of when a machine might be about to fail.
A more traditional approach would be to use an accelerometer, but an image sensor is non-intrusive and can monitor inaccessible parts of the machine or areas that get very hot.
 
  • Like
  • Love
  • Fire
Reactions: 21 users

Iseki

Regular
And the half yearly revenue from contracts with customers is...
And the testing of the taped out Akida1500 will be released on...
24 hours to fret
 
  • Like
  • Fire
Reactions: 3 users
Good post. 🙌 Now how do I get a mate to stop with the silent treatment towards me? Obviously he purchased higher, and he also refused my offer of topping him up at these prices, no strings attached and no need to pay me back.
Send Wilzy to him/her.. 😝
 
  • Haha
Reactions: 3 users
Morning Chippers,

Screen shot from the other channel.......

Still nothing registering at my end ?????

POSSIBLE INCOMING HALF YEARLY

Regards,
Esq,
View attachment 42684
Hi Esq, I believe it may be Aug 30 report..
That’s on Trading View and also reported on OTC in the US..
 
  • Like
  • Fire
Reactions: 3 users

Esq.111

Fascinatingly Intuitive.
Hi Esq, I believe it may be Aug 30 report..
That’s on Trading View and also reported on OTC in the US..
Morning Schnitzel Lover,

That is what I thought ..... until the screen shot provided .... intimated otherwise.

Regards,
Esq.
 
  • Like
Reactions: 2 users

Xray1

Regular
Morning Schnitzel Lover,

That is what I thought ..... until the screen shot provided .... intimated otherwise.

Regards,
Esq.
Last year's half-yearly financial report was released on Wednesday the 24th of August 2022 ...... so maybe sometime today is highly likely.
 
  • Like
Reactions: 2 users
Last year's half-yearly financial report was released on Wednesday the 24th of August 2022 ...... so maybe sometime today is highly likely.
I've emailed TD. See if he can shed any light
 
  • Fire
  • Like
Reactions: 5 users
  • Like
  • Love
  • Fire
Reactions: 19 users

chapman89

Founding Member
Valeo announces 2 new major contracts for its third-generation LiDAR
Valeo, the global leader in ADAS, is proud to announce that its SCALA 3 LiDAR has been chosen by a leading Asian manufacturer and a leading American robotaxi company. Valeo has now registered orders worth more than 1 billion euros for SCALA 3.

So thanks to @chapman89 , we all now know how likely it is that Akida is inside the new SCALA 3.
Based on this article that I'm sure was posted here before, as of March there was over $1B worth of orders on their books.

Now there are 2 ways we can be included: through Renesas/MegaChips, or as a stand-alone licence.
Given the lack of an IP announcement I wouldn't be surprised if it was Renesas, but I'm hoping it's neither, as that would mean a minimum of 3 products.

Either way, does anyone have any idea of the rough value of a LiDAR system? I'd like to estimate the quantity of orders on their books.
Alternatively, does anyone have any links to the orders they have, as far as quantity goes?
I know @Diogenese is probably exhausted talking about Valeo but maybe he can explain again why he believes it is us?!

My reasons are: we have a joint development agreement with them, ASX-announced back in June 2020, and there is NOBODY right now, or especially back then, with commercially available neuromorphic IP. Can somebody point me to another company that had commercial neuromorphic IP available back in 2020??

Nandan has recently said that they (BrainChip) have even worked on a prototype with Valeo, so what other prototype could it be?
It’s been 3 years since that announcement with BrainChip and Valeo, and by the time 2024 rolls around it will be almost 4 years to market, which is standard for the automotive industry.

We’ve got confirmation from a 27-year veteran currently working at Valeo that they achieved SCALA 3 through neuromorphic and spiking neural networks, and they appear in our presentations as early adopters. It is us, I bet my life on it (my opinion ASIC 😆)

There may be a standalone announcement but, as @Diogenese has said, we already have an ASX announcement about the joint development and its terms, with BrainChip to be paid when undisclosed milestones are met.

So when Sean Hehir says financials, I believe this will be one of them in the not too distant future. I’d say by March next year we will have had a stand-alone announcement about revenue owing to BrainChip, as it will be substantial. If there’s, let’s say, A$2 billion, can BrainChip demand a 5% royalty? Well of course they can, but let’s just go with 2%. Do the maths, guys; that will be recurring for many years and growing every year as more automotive companies adopt SCALA 3.
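The back-of-envelope royalty maths above can be sketched as follows. Note the A$2 billion order book and the 2%/5% royalty rates are the post's speculative assumptions, not company figures:

```python
# Speculative inputs from the post above -- assumptions, not guidance.
order_book_aud = 2_000_000_000       # assumed A$2bn of SCALA 3 orders

for royalty_rate in (0.02, 0.05):    # the 2% and 5% rates floated above
    royalty_aud = order_book_aud * royalty_rate
    print(f"{royalty_rate:.0%} royalty on A$2bn -> A${royalty_aud:,.0f}")
# 2% -> A$40,000,000 ; 5% -> A$100,000,000
```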

Also by years end I am expecting a launch customer and another 1-2 customers to sign, maybe I am expecting too much but that’s what I am thinking.

There are also companies, as you all would know: just because there isn’t an ASX announcement right now does not mean companies aren’t working on designing and implementing Akida 1 or 2 as we speak.
It is very simple: you only have to read the announcements on all our partnerships in the last year and a half or so. They don’t say they will evaluate BrainChip; they say that BrainChip is basically the key differentiator, revolutionary, and that they will start designing right away.

So, in my opinion, many companies as we speak are working on getting their prototypes working and getting ready to get to market for themselves and their customers.

Of course my opinion but this makes sense based on the research I have done!!
 
  • Like
  • Love
  • Fire
Reactions: 119 users

Xray1

Regular
I've emailed TD. See if he can shed any light
IMO ..... I think TD will most likely be unable to provide you with that information, because it will be a designated "Price Sensitive" announcement.
 
  • Like
  • Fire
Reactions: 3 users

Damo4

Regular
I know @Diogenese is probably exhausted talking about Valeo but maybe he can explain again why he believes it is us?! [...] Of course my opinion but this makes sense based on the research I have done!!

Another fantastic post by you.

I agree re neuromorphic availability; a quick Google search has yet to show me any other partnerships or EAP agreements that would suggest they are using someone else.

Also, the below spells out why we may have seen some revenue (in the hundreds of thousands) in our reports, despite no new licences.

(screenshot attachment)
 
  • Like
  • Love
  • Fire
Reactions: 38 users

Diogenese

Top 20
I know @Diogenese is probably exhausted talking about Valeo but maybe he can explain again why he believes it is us?! [...] Of course my opinion but this makes sense based on the research I have done!!
Hi Jesse,

This Valeo patent application was published in May 2023.

US2023146935A1 CONTENT CAPTURE OF AN ENVIRONMENT OF A VEHICLE USING A PRIORI CONFIDENCE LEVELS - 20211109

A method for the content capture of an environment of a vehicle is disclosed. The method uses an artificial intelligence neural network, on the basis of a point cloud generated by an environment sensor of the vehicle. The method involves performing reference measurements by the environment sensor to capture reference objects depending on positions in the environment in relation to the environment sensor, generating confidence values depending on positions in the environment in relation to the environment sensor on the basis of the reference measurements by the environment sensor to capture the reference objects, training the artificial intelligence for the content capture of the environment on the basis of training point clouds for the environment sensor, capturing the environment by the environment sensor to generate the point cloud, and processing the point cloud generated by the environment sensor using the trained artificial intelligence for the content capture of the environment.

[0010] The basic concept of the present invention is thus to capture properties of the at least one environment sensor in relation to positions in the environment of the vehicle via the reference measurements and to generate therefrom a priori confidence values on the basis of which the contents of the environment can be reliably captured. As a result, a statement about the reliability of the detection of the object is immediately made possible depending on the respective position of a captured object. In practice, the confidence level for different positions decreases here the further away an object is from the vehicle. Even when objects are the same distance away, differences in the confidence values, for example due to different angular positions in relation to the vehicle, can also result, depending on sensor properties of the at least one environment sensor.

[0011] Calibration of the confidence level—the problem of predicting probability estimates which are representative of the actual correctness probability—is important in many applications, for example for models of deep neural networks. Modern, very deep neural networks are often poorly calibrated and may be reliant on the underlying training data. Factors such as depth, resolution, weight decay and batch normalization (even for the training processes) are important factors which influence the calibration. These problems can be reduced by using the a priori determined confidence values.

[0012] Calibrated confidence level estimates are important for the interpretability of an artificial intelligence here. As opposed to artificial intelligence, human beings have a natural cognitive intuition for probabilities. Good confidence level estimates constitute valuable additional information in order to establish the trustworthiness of the artificial intelligence with respect to a human user and thus to increase the acceptance thereof—in particular for an artificial intelligence, the classification decisions of which can be difficult to interpret. Good probability estimates can furthermore be used in order, for example, to integrate neural networks into other probabilistic models
.

[0013] The steps of performing reference measurements, of generating confidence values and of training the artificial intelligence are typically performed once for the method. These steps can furthermore be performed once centrally for the purpose of providing a plurality of driving assistance systems such that the confidence values and the trained artificial intelligence can be equally used for the plurality of driving assistance systems. As will be apparent from the following statements, the training of the artificial intelligence can take place in part independently of the confidence values and these steps can thus take place in a different order.

[0014] The steps of capturing the environment of the vehicle by the at least one environment sensor and of processing the point cloud using the trained artificial intelligence for the content capture of the environment relate to the operation of the vehicle. These steps are therefore each performed individually in each driving assistance system. These steps are furthermore performed repeatedly in the driving assistance system in order to perform continuous content capture of the environment.
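The calibration problem paragraph [0011] refers to — predicting confidence values that match the actual probability of being correct — is commonly measured with Expected Calibration Error (ECE). A minimal sketch (my illustration, not from the patent):

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Expected Calibration Error: the average gap between a model's
    stated confidence and its actual accuracy, weighted by how many
    predictions fall in each confidence bin.

    A perfectly calibrated model has ECE = 0: among predictions made
    with ~70% confidence, ~70% are correct.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap   # weight by bin population
    return ece

# An overconfident toy model: claims 95% confidence, right half the time.
conf = np.full(100, 0.95)
hits = np.array([1, 0] * 50)
print(round(expected_calibration_error(conf, hits), 2))  # 0.45
```

An overconfident network (the "poorly calibrated" deep networks the patent mentions) scores a large ECE; the patent's a priori, position-dependent confidence values are one way of supplying better-grounded estimates.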

These paragraphs explain that (pre-Akida) training NNs was a central one-off process, but this invention utilizes (on-chip) individual training, which, as we all know, was only available from one chip back in 2021.

Also keep in mind that there could be another 18 months worth of patents in the confidentiality pipeline.

Note the patent is jointly owned with Sensoren GmbH, an ultrasonic sensor manufacturer.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 68 users

Damo4

Regular
I love this forum

Brain Dead GIF
 
  • Haha
  • Like
  • Love
Reactions: 14 users

robsmark

Regular
Yeah, me too.

This Is Fine GIF
 
  • Haha
Reactions: 16 users

Xray1

Regular
Well, like we have been told by the Co many times .... "Watch the Financials" ....... so let's hope that we see some significant improvement in this upcoming Half Yearly Report due out at any moment.
 
  • Like
  • Sad
  • Haha
Reactions: 12 users

HopalongPetrovski

I'm Spartacus!
He'll still hold, he sees the great potential we have here and I've bludgeoned him with page upon page upon page/reams of info related to BRN. Since he really only looks at the ASX for news, one knows they have released two-fifths of F-all through that channel, conveniently, ever since he made his first purchase 🤯
And still good mates no matter what
Lolz I'm staying at his place for about a month soon just to get out of the city and enjoy some picturesque views.
In the meantime tell your mate to have a look at the 4DS and WBT charts over the past few years to get an idea of what it's been like for them, and imagine just how they are feeling today.
This is the nature of the small cap tech sector in which we have all placed our bets.
It can be volatile. Which is precisely why we invest here. 🤣

Yes, we research it to the best of our abilities and then invest to the extent that our personal risk/reward computation allows, but ultimately we are hoping to get in before it is common sense and risk free. 🤣
When it goes up as we expect, we are self-proclaimed geniuses, but when it goes down, we are fools, or we cast about seeking others to blame.
It's a grand game, isn't it? 🤣

There are many seeking to tip our hand and influence us one way or t'other, either in the hope of directly or indirectly benefiting, or in the paid service of those who run the game.
Filter all you read with a grain of salt, take your own wise counsel, plan for contingencies both good and bad and never let money come before friendships.
That is the path to the darkside, and whilst they may have the sexy gear and clothing, ultimately it is just sad. 🤣

Happy for the much deserved run 4DS holders are embarking upon with their just announced success and the stellar rise WBT has enjoyed over the last 2-3 years.
I still believe BrainChip's ripening success will have their runs pale into insignificance by comparison. 🤣
Our share price has been pushed this far and yet our management and Company still stands strong and defiant in the knowledge of what is to come.
They will either succeed or fail in their endeavour, but I intend to ride this rocket all the way.
Either to Glory, or to Smithereens. 🤣
Our day is coming.
Bring It, BrainChip!
GLTAH
 
  • Like
  • Love
  • Fire
Reactions: 46 users

AusEire

Founding Member. It's ok to say No to Dot Joining
Good post. 🙌 Now how do I get a mate to stop with the silent treatment towards me? Obviously he purchased higher, and he also refused my offer of topping him up at these prices, no strings attached and no need to pay me back.
You can top up my holdings if you want? 😂
 
  • Haha
  • Like
Reactions: 10 users

Damo4

Regular
Deleted or moderated @robsmark ? Brutal if modded, the reporting is as frustrating as ever.
Was quicker than me finding the gif

(screenshot attachment)
 
  • Like
  • Thinking
Reactions: 2 users
Top Bottom