BRN Discussion Ongoing


Deleted member 118

Guest
A mention for Akida, and I don’t think I’ve seen an article reference work from Brainchip either.


[image attachment]


And let’s not forget @Fact Finder

 
  • Like
  • Haha
Reactions: 8 users

Quatrojos

Regular
  • Like
  • Fire
Reactions: 52 users

Deleted member 118

Guest
  • Haha
  • Like
  • Sad
Reactions: 11 users
Thanks for the correction J

Have edited post.
Hi FMF
I think you also need to give weight to the fact that in presentations Valeo has consented to having its logo displayed (at least ten others, as we know, have refused to do so) alongside Mercedes, NASA and Vorago as Early Adopters of Brainchip’s AKIDA technology.

In the semiconductor space the term ‘Early Adopter’ has significance and goes well beyond being just an EAP.

So the only question outstanding with Valeo is, in my opinion, what product Valeo is using AKIDA technology in; and for all of the reasons you have referred to, in my opinion, if it walks like a duck and quacks like a duck, it has to be LiDAR at least.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 29 users

Baisyet

Regular

Just a refresher for all of us :)

AI player BrainChip on a roll; signs two contracts within a month

via KalkineMedia

Artificial Intelligence is expected to have a firm grip on the market in the coming years. A recent report published in May 2020 by the Australian Government highlighted that there have been a variety of high-profile demonstrations of Artificial Intelligence, with significant progress made in the fields of self-driving cars, game-playing machines and virtual assistants. Further, AI has had a considerable role to play in managing the current COVID-19 crisis.

During the last decade, there have been five vital areas where significant growth has been witnessed. These include:

  • Image understanding
  • Intelligent decision making
  • Artificial creativity
  • Natural Language Processing
  • Physical automation
This list is by no means exhaustive; however, the above five areas have shown significant change in the past ten years.

ASX-listed BrainChip Holdings Ltd (ASX:BRN) is one such technology company, engaged in developing an innovative neuromorphic processor that brings AI to the edge in a manner beyond the abilities of other neural network devices. The solution is high-speed, small and low-power, enabling a broad range of edge capabilities including continual learning, local training and inference.

BRN, during April 2020, introduced its AKD1000 to attendees at the Linley Group's Processor Virtual Conference. The AKD1000's neural processor can run a standard convolutional neural network by transforming it into an event-based one, allowing it to perform incremental learning and transfer learning on-chip.

A CNN, or convolutional neural network, is a type of deep neural network used for analysing images. CNNs have a specially designed architecture that makes them comparatively easy to train, even for relatively deep networks.
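
To make the frame-based vs event-based distinction concrete, here is a minimal toy sketch in plain Python (not BrainChip's actual tooling; the scene size, change rate and threshold are invented for illustration). A frame-based pass touches every pixel each frame, while an event-based pass only processes pixels whose values changed:

```python
import numpy as np

rng = np.random.default_rng(0)
h, w = 64, 64

# Simulate a mostly static scene: only ~2% of pixels change between frames.
prev_frame = rng.random((h, w))
next_frame = prev_frame.copy()
changed = rng.random((h, w)) < 0.02
next_frame[changed] += 0.5

# Frame-based processing touches every pixel, every frame.
frame_ops = h * w

# Event-based processing only touches pixels that emitted an event
# (i.e. whose value changed by more than a threshold).
events = np.argwhere(np.abs(next_frame - prev_frame) > 0.1)
event_ops = len(events)

print(f"frame-based ops per frame: {frame_ops}")
print(f"event-based ops per frame: {event_ops} "
      f"({100 * event_ops / frame_ops:.1f}% of frame-based)")
```

The sparser the activity, the fewer computations an event-driven processor performs, which is the intuition behind the power savings claimed for this class of hardware.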

After the introduction of the AKD1000, BrainChip signed two agreements, following which the Company noted a significant improvement in its share price over a couple of weeks. BRN shares, which settled at A$0.058 on 22 May 2020, reached A$0.120 on 9 June 2020, representing growth of ~106.9%.

On 9 June 2020, the share price skyrocketed after the release of the Company's announcement of its joint agreement with Valeo Corporation. The stock settled at A$0.110 on 10 June 2020, down 8.333%.
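
For readers checking the quoted percentages, the arithmetic works out (a throwaway check, using only the prices given above):

```python
low, peak, next_close = 0.058, 0.120, 0.110  # A$: 22 May, 9 Jun, 10 Jun 2020

rally = (peak / low - 1) * 100
pullback = (next_close / peak - 1) * 100

print(f"22 May -> 9 Jun : +{rally:.1f}%")    # +106.9%
print(f"9 Jun -> 10 Jun : {pullback:.3f}%")  # -8.333%
```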

Let us look at the two recent deals signed by the Company that led to the stock rally.

Joint Agreement with Tier-1 Automotive Supplier

On 8 June 2020, BrainChip Holdings Ltd entered a joint development agreement, based on BRN's Akida neuromorphic SoC, with Valeo Corporation, a Tier-1 European automotive supplier of sensors and systems for Advanced Driver Assistance Systems (ADAS) and Autonomous Vehicles (AV).

The agreement specifies certain performance milestones and payments that are anticipated to cover the Company's expenditures. The term of the deal is determined by the achievement of performance goals and the availability of Akida devices. Each party has the option to terminate the agreement for convenience with specified notice.

The confirmation of the Company’s Akida device by a Tier-1 supplier of sensors & systems to the automotive industry is believed to be significant progress.

In ADAS and AV applications, real-time processing of data is vital for the security and dependability of autonomous systems. Suppliers and manufacturers in the automotive industry have acknowledged that the advanced and highly efficient neuromorphic nature of the Akida SoC makes it ideally suited to processing data at the "Edge" for their advanced system solutions.

With the integration of the Akida neural network processor with sensors, the resulting system can attain ultra-low power, minimal latency, maximum reliability and incremental learning.

The Akida neural processor’s game-changing high performance and ultra-low power consumption enable smart sensor fusion by resolving power and footprint difficulties for a range of sensor technologies. Further, it consumes less power than alternative AI solutions while maintaining the necessary performance and accuracy in a fraction of the physical space.

Agreement with Ford Motor Company for the Evaluation of Akida™ Neural Processor

On 24 May 2020, the Company signed a joint agreement with the Detroit-based Ford Motor Company for evaluation of the Akida neural network System-on-Chip for Advanced Driver Assistance Systems and Autonomous Vehicle applications.

The evaluation agreement, signed with Ford Motor Company, was binding on execution and is not subject to a fixed term. The deal is based on a partnership to assess Akida as it relates to the automotive industry; payments under the agreement are proposed to cover related expenses and will be received periodically during the evaluation process.

The Akida NSoC has an advanced and highly efficient neuromorphic architecture, and the partners in the collaboration have recognised that these features offer a broad range of potential solutions to complex problems such as driver behaviour assessment and real-time object detection.

The Akida NSoC exemplifies ground-breaking neural processing for Edge AI systems and devices. Each Akida NSoC has 10 billion synapses and 1.2 million neurons, delivering orders of magnitude better efficiency than other neural processing devices available.

The unique combination of very low power, high performance and on-chip learning enables real-time processing at the sensor along with continuous learning. The objective is to facilitate personalisation of every driver's experience in real time, with constant updates to the system as environmental conditions change.
 
  • Like
  • Fire
  • Love
Reactions: 38 users
I have a question about Lidar or radar for AVs.

It's great if only 1 AV is using it, but what about when there are two or more cars in the 200 m range, whether going in the same direction or going the other way?

How does each AV know which reflections are from its Lidar, let alone the directly impinging beams from oncoming traffic?

The receiver is going to need to distinguish one set of reflections from maybe 50 sets of reflections and direct beams.
Hi @Diogenese

I read somewhere that for an autonomous vehicle to be given life it will need more than one source of sensory input, and if one of the inputs is in conflict, the majority will rule.

This need for multiple sources demands ultra-low latency processing, which is why AKIDA technology is essential.

The second thing I would say is that I remember from high school science something about the angle of incidence equalling the angle of reflection. So assuming my LiDAR sends out one pulse of light which collides with an object, I can expect that pulse to come back at a known angle and in a known time frame, and by reference to these two things, time and angle, I can tell the distance and the location of the object.
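
For what it's worth, the time half of that argument is just the time-of-flight formula d = c·t/2 (halved because the pulse travels out and back). A minimal sketch, with made-up timing numbers and a hypothetical plausibility check:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to target from a pulse's round-trip time: d = c * t / 2."""
    return C * t_seconds / 2

# A pulse that returns after ~1.33 microseconds came from ~200 m away.
t_return = 1.334e-6
print(f"estimated range: {range_from_round_trip(t_return):.1f} m")

def is_plausible_echo(t_measured: float, expected_range_m: float,
                      tolerance_m: float = 1.0) -> bool:
    """Reject returns whose timing doesn't match where we expect the
    object to be (e.g. a stray pulse from someone else's laser)."""
    return abs(range_from_round_trip(t_measured) - expected_range_m) <= tolerance_m

print(is_plausible_echo(1.334e-6, expected_range_m=200.0))  # True: our echo
print(is_plausible_echo(0.5e-6,  expected_range_m=200.0))   # False: stray pulse
```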

As I am very clever if the angle does not match the time then I will know that the pulse of light hitting my sensor is not the pulse of light I sent out and therefore must be @Diogenese fooling around with the laser pointer he got for Christmas.

Now in there somewhere which is well above my pay grade is the Doppler Effect but I think I will leave that to someone who knows what they are talking about.

Suffice to say I think random pulses of light must always be in play even if @Diogenese is the only one who could not sleep and has gone for a drive in the early hours to watch the transit of Venus or something.

Having more than one sensor and majority rules will deal with this issue of random inputs.

My opinion only made up completely out of my own head with nothing but high school science DYOR
FF


AKIDA BALLISTA
 
  • Haha
  • Like
  • Fire
Reactions: 12 users

MrNick

Regular
Are we watching further roll-out amongst lesser-known marques?

[screenshot attachment]
 
  • Like
  • Thinking
  • Fire
Reactions: 19 users

Evermont

Stealth Mode
Here is a name I don't believe we have seen in association yet.

Early days for Fisker, production of the Ocean One is scheduled for November.

[image attachment]
 
  • Like
  • Fire
  • Thinking
Reactions: 20 users
The reality is that Brainchip is the real deal.

The technology is proven.

The technology lead is obvious and patent protected.

The management, sales and marketing team is in place and rock solid from top to bottom.

The customers are materialising and so far have CVs that confirm the description given to them by Mr Dinardo in 2020 of being ‘household name’ and ‘Fortune 500’ companies. Eleven or so more ‘household name’ and/or ‘Fortune 500’ companies remain to be revealed.

Those who have stood in the street hurling insults have only one final rock they can throw at rusted-on retail investors: ‘where’s the income’.

Well, given Brainchip has proven its integrity and has made clear and unqualified statements that income will commence ramping up in the second half of 2022, I for one am unconcerned about this final insult.
Just now someone gave me a like for a post that was written by me quite a while ago.

Recently I had a to-and-fro with Realinfo and someone who shall not be named, who no longer exists, and who @Diogenese privately pointed out used identical phrases to another earlier poster who also departed this cyberspace.

What I said in this to-and-fro was that my post, the subject of their commentary, was being taken completely out of context and should be read with my earlier statements regarding income.

I have extracted the above from that earlier post, which I could not be bothered searching out but which has been dropped in my lap just now, so to speak.

As can be seen, I had absolutely no expectations around income in the last quarter or the current quarter, and the third quarter is not a certainty given the language used by Brainchip.

Why am I raising this now? Well, because I have a confession. In the post that Realinfo attempted to pull me up on there was a deliberate error.

I said if the $2 million from MegaChips was evenly distributed it would mean $1 million income.

It is an advocate's technique to make an obvious error, knowing it will be corrected by the judge, to reinforce the point being made.

This was a quarterly, so evenly distributed it would be half a million, not the million I stated.
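
For anyone following along, the corrected arithmetic, assuming the $2 million is spread evenly across a year's four quarters (a minimal check, not a statement about the actual payment schedule):

```python
total_fee = 2_000_000   # the MegaChips figure quoted above, in dollars
quarters = 4            # quarters in a year

per_quarter = total_fee / quarters
print(f"evenly distributed: ${per_quarter:,.0f} per quarter")
# -> $500,000 per quarter: half a million, not the million stated.
```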

No one pulled me up or corrected me; such was the irrational exuberance being expressed by others that there was no room for clear-headed thinking or rational debate.

So the point I am making early is one which another poster made: income ramping up in the second half of 2022 will potentially not appear until the 4th quarter, and the 4C reflecting this will not be available until the end of January 2023.

So please take this on board and temper your expectations now, and do not allow manipulation to create an environment where you are emotionally vulnerable when false expectations are not achieved.

Learn from what took place and understand the strength of Brainchip shares in the market now, and how, despite their best sophisticated efforts, the false expectations created by manipulators did not collapse the price.

My opinion only DYOR - and yes this sounds a bit pompous but I am a retired lawyer so cut me some slack.
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Haha
Reactions: 34 users
Hi FMF
I think you also need to give weight to the fact that in presentations Valeo has consented to having its logo displayed (at least ten others, as we know, have refused to do so) alongside Mercedes, NASA and Vorago as Early Adopters of Brainchip’s AKIDA technology.

In the semiconductor space the term ‘Early Adopter’ has significance and goes well beyond being just an EAP.

So the only question outstanding with Valeo is, in my opinion, what product Valeo is using AKIDA technology in; and for all of the reasons you have referred to, in my opinion, if it walks like a duck and quacks like a duck, it has to be LiDAR at least.

My opinion only DYOR
FF

AKIDA BALLISTA
Agree FF that all indications are that Akida is being used, and given, as you point out, that Valeo are not hiding behind an NDA, it would be beneficial to have confirmation of same imo.

That was more the point of my post: there doesn't appear to be any reason not to release that information if and when Valeo integrate it into products.
 
  • Like
  • Fire
  • Love
Reactions: 10 users
Agree FF that all indications are that Akida is being used, and given, as you point out, that Valeo are not hiding behind an NDA, it would be beneficial to have confirmation of same imo.

That was more the point of my post: there doesn't appear to be any reason not to release that information if and when Valeo integrate it into products.
Sorry, we are on the same page then. I do think, because Brainchip will play with anyone, that Valeo will leave any AKIDA reveal to the very last minute to keep competitors guessing.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
Reactions: 10 users

Diogenese

Top 20
Hi @Diogenese

I read somewhere that for an autonomous vehicle to be given life it will need more than one source of sensory input, and if one of the inputs is in conflict, the majority will rule.

This need for multiple sources demands ultra-low latency processing, which is why AKIDA technology is essential.

The second thing I would say is that I remember from high school science something about the angle of incidence equalling the angle of reflection. So assuming my LiDAR sends out one pulse of light which collides with an object, I can expect that pulse to come back at a known angle and in a known time frame, and by reference to these two things, time and angle, I can tell the distance and the location of the object.

As I am very clever if the angle does not match the time then I will know that the pulse of light hitting my sensor is not the pulse of light I sent out and therefore must be @Diogenese fooling around with the laser pointer he got for Christmas.

Now in there somewhere which is well above my pay grade is the Doppler Effect but I think I will leave that to someone who knows what they are talking about.

Suffice to say I think random pulses of light must always be in play even if @Diogenese is the only one who could not sleep and has gone for a drive in the early hours to watch the transit of Venus or something.

Having more than one sensor and majority rules will deal with this issue of random inputs.

My opinion only made up completely out of my own head with nothing but high school science DYOR
FF


AKIDA BALLISTA
Well, it's true that the position of the receiving pixel defines the angle of incidence of the incoming beam (the central ray of the incoming beam passes straight through the lens), but a direct beam from an oncoming vehicle will be orders of magnitude more powerful than a scattered return beam, and may at least temporarily "blind" the receiving pixels as they reach "saturation". I suppose that this has already been addressed in relation to night-vision goggles.

Also, the beam from the oncoming vehicle is not synchronized with the receiving vehicle's outgoing bursts, so it cannot be used directly to determine the distance.

It would, of course, be possible to track the angular movement of the oncoming vehicle from the moving location of the "blinded" pixels.

The point density will decrease with distance (unless the laser beams have a beam spread angle proportional to the distance), so the probability of a direct hit on the receiving sensor will increase as the vehicles get closer.

Oncoming vehicles can produce both direct laser impingement and scattered light impingement. Following vehicles only produce scattered light impingement on a forward facing LiDaR.

One solution may be that, for pulsed LiDaR (send a laser pulse and wait for reflection before sending another pulse) the LiDaR receiver pixels are only queried for a short period determined by the number (N) of laser pulses per frame (1/25 of a second). So if we use N = 4000*, the individual pixels are scanned every 0.04 sec for a period of 0.000005sec (assuming 50% duty cycle). So I guess what we are looking at is the probability of an incoming direct or scattered beam arriving in the time window a pixel is being queried by the Akida SNN.

As you point out, there will be only one pulse from the oncoming vehicle which could possibly impinge on the receiving pixel being examined, and that pulse likewise occupies 0.5/4000 of each 1/25th-of-a-second frame. So the probability of a direct hit while a pixel is being examined is quite low.

PS: I wonder if there is a correlation between the number of pulses per frame and the number of pixels.

*Valeo uses 25 frames per second, but the 4000 pulses per frame and 50% duty cycle are my guesstimates by way of example only. 50% duty cycle means the pulse duration and the waiting period for reflection are equal.
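
To put numbers on those guesstimates, here is a back-of-the-envelope sketch using the same assumed figures (only the 25 frames per second is Valeo's; the pulse count and duty cycle are the guesstimates above):

```python
FRAME_RATE = 25           # frames per second (Valeo's published figure)
PULSES_PER_FRAME = 4000   # guesstimate from the post above
DUTY_CYCLE = 0.5          # pulse time vs. wait-for-reflection time, guesstimate

frame_period = 1 / FRAME_RATE                 # 0.04 s
pulse_slot = frame_period / PULSES_PER_FRAME  # 1e-05 s per pulse cycle
query_window = pulse_slot * DUTY_CYCLE        # 5e-06 s a pixel is "live"

# An unsynchronised pulse from an oncoming vehicle arrives at an
# effectively random time, so the chance it lands inside one pixel's
# live window is the window's share of the frame: 0.5 / 4000.
p_hit = query_window / frame_period

print(f"frame period: {frame_period} s")
print(f"query window: {query_window} s")
print(f"P(direct hit during one pixel's window) = {p_hit}")  # 0.000125
```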
 
  • Like
  • Fire
  • Thinking
Reactions: 16 users

Diogenese

Top 20
Well, it's true that the position of the receiving pixel defines the angle of incidence of the incoming beam (the central ray of the incoming beam passes straight through the lens), but a direct beam from an oncoming vehicle will be orders of magnitude more powerful than a scattered return beam, and may at least temporarily "blind" the receiving pixels as they reach "saturation". I suppose that this has already been addressed in relation to night-vision goggles.

Also, the beam from the oncoming vehicle is not synchronized with the receiving vehicle's outgoing bursts, so it cannot be used directly to determine the distance.

It would, of course, be possible to track the angular movement of the oncoming vehicle from the moving location of the "blinded" pixels.

The point density will decrease with distance (unless the laser beams have a beam spread angle proportional to the distance), so the probability of a direct hit on the receiving sensor will increase as the vehicles get closer.

Oncoming vehicles can produce both direct laser impingement and scattered light impingement. Following vehicles only produce scattered light impingement on a forward facing LiDaR.

One solution may be that, for pulsed LiDaR (send a laser pulse and wait for reflection before sending another pulse) the LiDaR receiver pixels are only queried for a short period determined by the number (N) of laser pulses per frame (1/25 of a second). So if we use N = 4000*, the individual pixels are scanned every 0.04 sec for a period of 0.000005sec (assuming 50% duty cycle). So I guess what we are looking at is the probability of an incoming direct or scattered beam arriving in the time window a pixel is being queried by the Akida SNN.

As you point out, there will be only one pulse from the oncoming vehicle which could possibly impinge on the receiving pixel being examined, and that pulse likewise occupies 0.5/4000 of each 1/25th-of-a-second frame. So the probability of a direct hit while a pixel is being examined is quite low.

PS: I wonder if there is a correlation between the number of pulses per frame and the number of pixels.

*Valeo uses 25 frames per second, but the 4000 pulses per frame and 50% duty cycle are my guesstimates by way of example only. 50% duty cycle means the pulse duration and the waiting period for reflection are equal.

Here is one of the random doodles I made showing the relation of incoming beams impinging on the pixels of a light sensor. If you use, say, the central pixel (blue) as the reference, the angles of other beams can be determined precisely by the pixels each beam strikes.


[image attachment: doodle of incoming beams impinging on sensor pixels]


(This is from one of the experiments I conducted with my Christmas laser pointer while I was endeavoring to watch the transit of Venus)
(I find sometimes the trees obscure one's view of the forest)
 
  • Like
  • Fire
  • Love
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Here is a name I don't believe we have seen in association yet.

Early days for Fisker, production of the Ocean One is scheduled for November.

View attachment 5371


Hi Evermont,

Fisker worked with a startup in Texas called 'Uhnder', which innovated the ICON digital radar in partnership with Magna. The system uses advanced military radar technology to enable precise image detection at more than 1,000 feet (about 300 m), continuously scanning to determine distance, height, depth and speed.

I wonder if BrainChip are involved in some way? I guess it wouldn't be too surprising considering BrainChip's links with DARPA and, more recently, with Information Systems Laboratories, which is developing artificial intelligence technology to support the Air Force Research Laboratory's radar projects.

Pure conjecture...🧦

Nonetheless, looks pretty impressive!



This article states:

This proactive safety system of automated electronic sensors (radar, ultrasonic sensors, and cameras) and processing software continuously senses inputs, adds intelligence, and then engages when necessary to anticipate and prevent accidents.

 
  • Like
  • Thinking
  • Fire
Reactions: 15 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hi Evermont,

Fisker worked with a startup in Texas called 'Uhnder', which innovated the ICON digital radar in partnership with Magna. The system uses advanced military radar technology to enable precise image detection at more than 1,000 feet (about 300 m), continuously scanning to determine distance, height, depth and speed.

I wonder if BrainChip are involved in some way? I guess it wouldn't be too surprising considering BrainChip's links with DARPA and, more recently, with Information Systems Laboratories, which is developing artificial intelligence technology to support the Air Force Research Laboratory's radar projects.

Pure conjecture...🧦

Nonetheless, looks pretty impressive!



This article states:

This proactive safety system of automated electronic sensors (radar, ultrasonic sensors, and cameras) and processing software continuously senses inputs, adds intelligence, and then engages when necessary to anticipate and prevent accidents.




Oh, and rather curiously, edge computing is discussed in Fisker's 2020 Annual Report.


[screenshot: excerpt from Fisker's 2020 Annual Report]

 
  • Like
  • Thinking
  • Fire
Reactions: 12 users
Here is one of the random doodles I made showing the relation of incoming beams impinging on the pixels of a light sensor. If you use, say, the central pixel (blue) as the reference, the angles of other beams can be determined precisely by the pixels each beam strikes.


View attachment 5379

(This is from one of the experiments I conducted with my Christmas laser pointer while I was endeavoring to watch the transit of Venus)
(I find sometimes the trees obscure one's view of the forest)
The probability of, say, three LiDAR sensors each receiving direct hits at exactly the same time, where you have a majority decision-making process, would also need to be factored into the associated risk of such an occurrence. Not that I could do the maths, but as you add each sensor, the initial odds of a direct hit become factors smaller.
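
For illustration, a quick sketch of why majority voting helps (the per-sensor hit probability here is invented purely for the example; independence between sensors is also assumed):

```python
from math import comb

p = 1e-4  # assumed chance a single sensor is blinded at a given instant
n = 3     # independent sensors voting

def p_at_least(k: int) -> float:
    """Probability that at least k of the n sensors are hit at once."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(f"any one sensor hit : {p_at_least(1):.2e}")  # ~3.0e-04
print(f"majority (2+) hit  : {p_at_least(2):.2e}")  # ~3.0e-08
print(f"all three hit      : {p_at_least(3):.2e}")  # ~1.0e-12
```

Each added sensor multiplies the odds of defeating the vote by roughly another factor of p, which is the "factors smaller" intuition.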

Something I remembered from my dark and distant past, when I had to learn all about the police radar in case I had to prosecute a defended case: the reason the police radar operator was instructed to obtain a clear, solid signal on the radar instrument for at least three seconds before a prosecution for a given excess speed could proceed was to overcome any possibility of interference of the type you are referring to.

I could try to explain it in greater detail, but it is such a long time ago (we are talking the 1970s when I did the training course) that it would get a bit messy; but I know what I mean and trust you @Diogenese will immediately understand.

In the real world, based upon this limited knowledge, the likelihood of interference causing a problem at greater distances will be less than at shorter distances, as many more pulses will engage the object as it comes closer.

The amazing Brembo braking breakthrough, whereby stopping distances are dramatically reduced, would also be of assistance, as it would allow more time, on every occasion there is a need to brake, for the LiDAR to send and receive many more pulses of light.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 10 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Some interesting people in the Fisker team.





[screenshot: Fisker team members]
 
  • Like
  • Fire
  • Thinking
Reactions: 14 users