Again yr on fire Rocket and i especially also love the last page of this article -
I do try and balance my contributions.
Hi FMF, thanks for the correction.
Have edited post.
Hi @Diogenese, I have a question about LiDAR or radar for AVs.
It's great if only one AV is using it, but what about when there are two or more cars within the 200 m range, whether travelling in the same direction or coming the other way?
How does each AV know which reflections are from its Lidar, let alone the directly impinging beams from oncoming traffic?
The receiver is going to need to distinguish one set of reflections from maybe 50 sets of reflections and direct beams.
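One common way real LiDARs tackle this mutual-interference problem is to give each unit its own pseudorandom pulse timing, so a return is only accepted if its arrival time fits a plausible round trip from one of that unit's own recent pulses. Here is a minimal toy sketch of that idea; the 5-15 µs inter-pulse gaps and 200 m rated range are assumed example values, not any vendor's specification:

```python
import random

C = 299_792_458.0            # speed of light, m/s
MAX_RANGE = 200.0            # assumed rated sensor range, m
MAX_TOF = 2 * MAX_RANGE / C  # longest plausible round trip (~1.33 microseconds)

def fire_times(n, seed):
    """Pseudorandom pulse schedule unique to this unit (timing dither)."""
    rng = random.Random(seed)
    t, out = 0.0, []
    for _ in range(n):
        t += rng.uniform(5e-6, 15e-6)  # assumed 5-15 us between pulses
        out.append(t)
    return out

def is_own_return(arrival_s, own_fires, tol=5e-9):
    """Accept a return only if it fits a round trip from one of our own pulses."""
    return any(0.0 < arrival_s - t <= MAX_TOF + tol for t in own_fires)
```

A foreign LiDAR with a different (uncorrelated) schedule will mostly produce arrivals that fall outside every acceptance window, so its pulses are rejected even though they hit the same receiver.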
Just now someone gave me a like for a post that was written by me quite a while ago. The reality is that Brainchip is the real deal.
The technology is proven.
The technology lead is obvious and patent protected.
The management, sales and marketing team is in place and rock solid from top to bottom.
The customers are materialising and so far have CVs that confirm the description given to them by Mr. Dinardo in 2020 of being 'household name' and 'Fortune 500' companies. Eleven or so more 'household name' and/or 'Fortune 500' companies are yet to be revealed.
Those who have stood in the street hurling insults have only one final rock they can throw at rusted-on retail investors: 'where's the income?'
Well, given Brainchip has proven its integrity and has made clear and unqualified statements that income will commence to ramp up in the second half of 2022, I for one am unconcerned about this final insult.
Hope it comes in other colours.

FF

Here is a name I don't believe we have seen in association yet.
Early days for Fisker, production of the Ocean One is scheduled for November.
Agree FF that all indications are that Akida is being used and, given, as you point out, that Valeo are not hiding behind an NDA, it would be beneficial to have confirmation of same imo.

Hi FMF
I think you also need to give weight to the fact that in presentations Valeo has consented to having its logo displayed (as we know, at least ten others have refused to do so) alongside Mercedes, NASA and Vorago as Early Adopters of Brainchip's AKIDA technology.
In the semiconductor space the term ‘Early Adopter’ has significance and is well beyond being just an EAP.
So the only outstanding question with Valeo is, in my opinion, which product Valeo is using AKIDA technology in, and for all of the reasons you have referred to, if it walks like a duck and quacks like a duck it has to be LiDAR at least.
My opinion only DYOR
FF
AKIDA BALLISTA
Sorry, we are then on the same page. I do think, because Brainchip will play with anyone, that Valeo will leave any AKIDA reveal to the very last minute to keep competitors guessing.
That was more the point of my post: there doesn't appear to be any reason not to release that information if and when Valeo are integrating it into products.
Well, it's true that the position of the receiving pixel defines the angle of incidence of the incoming beam (the central ray of the incoming beam passes straight through the lens), but a direct beam from an oncoming vehicle will be orders of magnitude more powerful than a scattered return beam, and may at least temporarily "blind" the receiving pixels as they reach "saturation". I suppose that this has already been addressed in relation to night-vision goggles.

Hi @Diogenese
I read somewhere that for an autonomous vehicle to be given life it will need more than one source of sensory input, and if one of the inputs is in conflict, majority will rule.
This need for multiple sources necessitates ultra-low-latency processing, which is why AKIDA technology is essential.
The second thing I would say is that I remember from high school science something about the angle of incidence equalling the angle of reflection. So assuming my LiDAR sends out one pulse of light which collides with an object, I can expect that pulse to come back at a known angle and in a time frame which will tell me, by reference to these two things, time and angle, the distance and the location of the object.
As I am very clever if the angle does not match the time then I will know that the pulse of light hitting my sensor is not the pulse of light I sent out and therefore must be @Diogenese fooling around with the laser pointer he got for Christmas.
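FF's angle-and-time sanity check can be sketched in a few lines: range follows from the round-trip time as d = c·t/2, and a return is only believed if it comes back from (near) the firing direction within the time window for the sensor's rated range. The 200 m range and 0.5 degree tolerance below are illustrative assumptions only:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_s):
    """Round-trip time -> one-way distance: d = c * t / 2."""
    return C * round_trip_s / 2.0

def plausible_return(sent_azimuth_deg, recv_azimuth_deg, round_trip_s,
                     max_range_m=200.0, angle_tol_deg=0.5):
    """Accept a return only if it arrives from (near) the firing direction
    and within the time window for the sensor's rated range."""
    if not (0.0 < round_trip_s <= 2.0 * max_range_m / C):
        return False  # too early/late to be our pulse (or someone's laser pointer)
    return abs(sent_azimuth_deg - recv_azimuth_deg) <= angle_tol_deg

# Example: a target 75 m away, straight back along the firing direction
tof = 2 * 75.0 / C
print(range_from_tof(tof))  # 75.0
```

A pulse arriving from the wrong angle, or outside the plausible time window, fails the check and is discarded, which is exactly the "angle does not match the time" filter described above.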
Now in there somewhere which is well above my pay grade is the Doppler Effect but I think I will leave that to someone who knows what they are talking about.
Suffice to say I think random pulses of light must always be in play even if @Diogenese is the only one who could not sleep and has gone for a drive in the early hours to watch the transit of Venus or something.
Having more than one sensor and majority rules will deal with this issue of random inputs.
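The "majority rules" idea across redundant sensors can be sketched simply: collect each sensor's verdict and accept the state a strict majority agrees on, flagging a conflict otherwise. This is a toy illustration of the voting logic only, not any vendor's fusion algorithm:

```python
from collections import Counter

def majority_vote(readings):
    """Accept the detection state most sensors agree on.
    readings: list of per-sensor verdicts, e.g. 'obstacle' / 'clear'."""
    verdict, count = Counter(readings).most_common(1)[0]
    # Require a strict majority; otherwise flag the conflict for fallback logic
    if count > len(readings) / 2:
        return verdict
    return "conflict"

print(majority_vote(["obstacle", "obstacle", "clear"]))  # obstacle
```

With three sensors, a single spurious input (the laser-pointer scenario) is simply outvoted by the other two.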
My opinion only made up completely out of my own head with nothing but high school science DYOR
FF
AKIDA BALLISTA
Well, it's true that the position of the receiving pixel defines the angle of incidence of the incoming beam (the central ray of the incoming beam passes straight through the lens), but a direct beam from an oncoming vehicle will be orders of magnitude more powerful than a scattered return beam, and may at least temporarily "blind" the receiving pixels as they reach "saturation". I suppose that this has already been addressed in relation to night-vision goggles.
Also, the beam from the oncoming vehicle is not synchronized with the receiving vehicle's outgoing bursts, so it cannot be used directly to determine the distance.
It would, of course, be possible to track the angular movement of the oncoming vehicle from the moving location of the "blinded" pixels.
The point density will decrease with distance (unless the laser beams have a beam spread angle proportional to the distance), so the probability of a direct hit on the receiving sensor will increase as the vehicles get closer.
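The point-density argument can be put in rough numbers with a 1-D sketch: the beam footprint and the gap between adjacent scan points both grow linearly with distance, while the receiver aperture stays fixed, so the chance that the aperture sits under a scan point rises as the vehicles close. All three parameters below (divergence, angular step, aperture) are my own assumed example values:

```python
def direct_hit_probability(distance_m,
                           divergence_rad=1e-3,  # assumed beam divergence
                           step_rad=2e-3,        # assumed angular step between scan points
                           aperture_m=0.02):     # assumed receiver aperture diameter
    """Rough 1-D sketch: chance that a fixed-size aperture at the given
    distance is directly illuminated by at least one scan point."""
    footprint = distance_m * divergence_rad  # beam diameter at that distance
    spacing = distance_m * step_rad          # gap between adjacent scan points
    return min(1.0, (footprint + aperture_m) / spacing)

for d in (200, 50, 10):
    print(d, round(direct_hit_probability(d), 3))
```

The fixed aperture term shrinks relative to the point spacing at long range, which is why the direct-hit probability climbs as the gap closes.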
Oncoming vehicles can produce both direct laser impingement and scattered light impingement. Following vehicles only produce scattered light impingement on a forward facing LiDaR.
One solution may be that, for pulsed LiDaR (send a laser pulse and wait for reflection before sending another pulse) the LiDaR receiver pixels are only queried for a short period determined by the number (N) of laser pulses per frame (1/25 of a second). So if we use N = 4000*, the individual pixels are scanned every 0.04 sec for a period of 0.000005sec (assuming 50% duty cycle). So I guess what we are looking at is the probability of an incoming direct or scattered beam arriving in the time window a pixel is being queried by the Akida SNN.
As you point out, there will be only one pulse from the oncoming vehicle which could possibly impinge on the receiving pixel being examined, and the examination window is only 0.5/4000 of each 1/25th-of-a-second frame (about 5 microseconds). So the probability of a direct hit while a pixel is being examined is quite low.
PS: I wonder if there is a correlation between the number of pulses per frame and the number of pixels.
*Valeo uses 25 frames per second, but the 4000 pulses per frame and 50% duty cycle are my guesstimates by way of example only. 50% duty cycle means the pulse duration and the waiting period for reflection are equal.
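The arithmetic above is easy to reproduce. Using the same figures (25 fps from Valeo, with the 4000 pulses per frame and 50% duty cycle being guesstimates per the footnote), the per-pixel query window and the chance of one uncorrelated foreign pulse landing in it work out as:

```python
FPS = 25                 # Valeo frame rate (from the post)
PULSES_PER_FRAME = 4000  # guesstimate from the post
DUTY = 0.5               # guesstimated duty cycle

frame_s = 1 / FPS                            # 0.04 s per frame
pulse_period_s = frame_s / PULSES_PER_FRAME  # 10 us between pulses
window_s = DUTY * pulse_period_s             # 5 us query window (the 0.000005 s above)

# Probability that one uncorrelated foreign pulse lands inside the single
# window in which a given pixel is queried during a frame:
p_hit = window_s / frame_s                   # = 0.5/4000 = 0.000125
print(frame_s, pulse_period_s, window_s, p_hit)
```

So an unsynchronized pulse has roughly a 1-in-8000 chance per frame of coinciding with any given pixel's query window, which bears out the "quite low" conclusion.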
Here is a name I don't believe we have seen in association yet.
Early days for Fisker, production of the Ocean One is scheduled for November.
Hi Evermont,
Fisker worked with a startup in Texas called 'Uhnder', which developed the ICON digital radar in partnership with Magna. The system uses advanced military radar technology to enable precise image detection at more than 1,000 feet (about 300 m), continuously scanning to determine distance, height, depth and speed.
I wonder if BrainChip are involved in some way? I guess it wouldn't be too surprising considering BrainChip's links with DARPA and, more recently, with Information Systems Laboratories' assistance in the development of an artificial intelligence technology meant to support the Air Force Research Laboratory's radar projects.
Pure conjecture...
Nonetheless, looks pretty impressive!
This article states:
This proactive safety system of automated electronic sensors (radar, ultrasonic sensors, and cameras) and processing software continuously senses inputs, adds intelligence, and then engages when necessary to anticipate and prevent accidents.
The Fisker Ocean All-Electric SUV – LA Reveal For 17.1-Inch Rotating Revolve1 Screen
Fisker Inc. has officially revealed its Fisker Ocean SUV on the first media day of the Los Angeles Auto Show. One of the new features revealed is a 17.1-inch central high-resolution Revolve screen...
www.autofutures.tv
The probability of, say, three LiDAR sensors each receiving direct hits at exactly the same time, where you have a majority decision-making process, would also need to be factored into the associated risk of such an occurrence. Not that I could do the maths, but as you add each sensor, your initial odds of a direct hit become factors less.

Here is one of the random doodles I made showing the relation of incoming beams impinging on the pixels of a light sensor. If you use, say, the central pixel (blue) as the reference, the angles of other beams can be determined precisely by the pixels each beam strikes.
(This is from one of the experiments I conducted with my Christmas laser pointer while I was endeavoring to watch the transit of Venus)
(I find sometimes the trees obscure one's view of the forest)
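The geometry in that doodle (the central ray passing straight through the lens, so the struck pixel fixes the beam's angle of incidence) reduces to one line of trigonometry: angle = atan(pixel offset / focal length). The pixel pitch and focal length below are assumed example values, not any actual sensor's specification:

```python
import math

def beam_angle_deg(pixel_index, center_index,
                   pixel_pitch_m=10e-6,   # assumed 10 um pixel pitch
                   focal_length_m=0.01):  # assumed 10 mm focal length
    """Angle of an incoming beam, inferred from the pixel it strikes.
    The central ray passes straight through the lens, so the offset of the
    struck pixel from the reference (central) pixel gives the angle."""
    offset = (pixel_index - center_index) * pixel_pitch_m
    return math.degrees(math.atan2(offset, focal_length_m))

print(beam_angle_deg(320, 320))             # 0.0 -> on-axis beam hits the central pixel
print(round(beam_angle_deg(420, 320), 2))   # beam striking 100 pixels off-centre
```

This is why a precise angle can be read off directly from the pixel position, without any timing information at all.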