I’d vote for FF all day long
I'm meeting your Blackadder and raising you a python ptang ptang ole biscuit barrel election result. Not sure if FF is involved, but probably is.
I love the analogy, I love your thinking.

Morning Diogenese,
Interesting thought.
If most, if not all, future cars, drones etc. have this new-fangled technology, how does one distinguish one's own output/input results from another machine's?
That's a hell of a lot of laser / lidar / radar / sonar signals being fired around willy-nilly.
Not being at all clued up in this area, I would think each autonomous machine must have a unique signature somehow incorporated into its onboard navigation system.
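Purely as an illustration of how such a unique signature could work (no vendor specifics implied): each machine could stamp its pulses with its own pseudorandom chip sequence, and sequences generated from different seeds are nearly uncorrelated, so a receiver can tell its own code from a stranger's. A toy Python sketch with invented seeds:

```python
import random

def make_code(seed, length=64):
    """Generate a machine-specific pseudorandom +/-1 chip sequence."""
    rng = random.Random(seed)
    return [rng.choice((-1, 1)) for _ in range(length)]

def correlation(a, b):
    """Normalised correlation between two equal-length codes."""
    return sum(x * y for x, y in zip(a, b)) / len(a)

# Each vehicle seeds its code generator with its own (hypothetical) ID.
code_a = make_code(seed=1001)
code_b = make_code(seed=2002)

print(correlation(code_a, code_a))  # 1.0 -- perfect match with itself
print(correlation(code_a, code_b))  # near 0 -- another machine's code
```

The longer the chip sequence, the closer the cross-correlation between different machines' codes sits to zero, which is the same trick GPS satellites use to share one frequency band.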
On the same train of thought, nature seems to have figured it out....
Dolphins, whales and bats (not sure about jellyfish) all use echolocation to navigate / communicate.
Imagine several thousand bats flying in tight formation, all sending out little echoes which allow them to navigate blind (not intoxicated), at night, without accident.
Amazing.
I'm certain that if nature has figured out how to overcome this problem, then the boffins at Mercedes-Benz would have a fair idea, never mind those folks at NASA & DARPA.
Regards,
Esq.
You're starting to sound like Diana Fisher from The Inventors that aired on the ABC back in the day. She loved pink lol.

Hope it comes in other colours.

FF
Well, Ouster have heard of neural networks, and there is plenty of scope for them to incorporate Akida:

"Digital lidar is built on the idea that if you can consolidate all of the important functionality of a lidar sensor into semiconductors fabricated in a standard CMOS process, you can put your core technology on a radically different price/performance improvement curve than is possible with other analog, MEMS, or silicon photonics-based approaches.
At this point you may be asking, if using SPADs and VCSELs has all of these performance and cost advantages, why doesn’t everyone use them? The short answer is that it’s really hard to make them work. Back in 2015 when we first began down the path of digital lidar, the current state of the art detectors and lasers would have produced a sensor with a range of only a few meters. Today, our OS2 has a range of over 200 meters and will continue to improve significantly over time. Crucially, its cost – and the cost of all our sensors – will come down at the same time.
At Ouster, we envision a future where lidar-powered solutions are ubiquitous, with high-performing and affordable 3D perception capabilities built for every industry. We are convinced CMOS digital lidar technology is what will get us there."
Why Digital Lidar is the Future | Ouster
Lidar sensors for high-resolution, long-range use in autonomous vehicles, robotics, mapping. Low-cost & reliable for any use case. Shipping today. (ouster.com)
Why Apple chose digital lidar | Ouster
Lidar sensors for high-resolution, long-range use in autonomous vehicles, robotics, mapping. Low-cost & reliable for any use case. Shipping today. (ouster.com)
As a side note to this radar / lidar discussion,
Some will recognise Ouster
Manny + Ouster
Manny + Brainchip
For those who prefer to watch rather than read.
From 50 seconds.
Ouster + Sense Photonics = Ouster Automotive
The best example I can give is reflectors. Reflector tape only shines back at the light source.

Hi @Diogenese
I read somewhere that for an autonomous vehicle to be given life it will need more than one source of sensory input, and if one of the inputs is in conflict, majority will rule.
This necessity for multiple sources demands ultra-low-latency processing, which is why AKIDA technology is essential.
The second thing I would say is that I remember from high school science the angle of incidence equalling the angle of reflection (refraction, the bending of light through a medium, is a different thing). So assuming my lidar sends out one pulse of light which collides with an object, I can expect that pulse to come back at a known angle and in a time frame which will tell me, by reference to these two things, time and angle, the distance and the location of the object.
As I am very clever, if the angle does not match the time, then I will know that the pulse of light hitting my sensor is not the pulse of light I sent out, and therefore must be @Diogenese fooling around with the laser pointer he got for Christmas.
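FF's time-and-distance reasoning can be put in code. The range is half the round-trip time multiplied by the speed of light, and any return arriving outside the window our own pulse could possibly occupy can be rejected as someone else's (the 200 m maximum range below is an assumed figure, not any particular sensor's spec):

```python
C = 299_792_458.0     # speed of light, m/s
MAX_RANGE_M = 200.0   # assumed maximum sensor range

def distance_from_tof(round_trip_s):
    """Range from time of flight: the pulse travels out and back, so halve."""
    return C * round_trip_s / 2.0

def is_plausible_return(fire_time_s, arrival_time_s):
    """Reject any return that arrives outside the physically possible window."""
    delay = arrival_time_s - fire_time_s
    return 0.0 < delay <= 2.0 * MAX_RANGE_M / C

rt = 2.0 * 100.0 / C                   # round trip for a target 100 m away
print(distance_from_tof(rt))           # ~100.0 m
print(is_plausible_return(0.0, rt))    # True
print(is_plausible_return(0.0, 1e-3))  # False -- 1 ms would imply ~150 km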
Now in there somewhere which is well above my pay grade is the Doppler Effect but I think I will leave that to someone who knows what they are talking about.
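Since the Doppler Effect got a mention: for radar, the standard two-way Doppler shift is f_d = 2 * v * f0 / c, where v is the radial closing speed and f0 the carrier frequency. A quick sketch (77 GHz is a common automotive radar band; the 30 m/s speed is just an example):

```python
C = 299_792_458.0  # speed of light, m/s

def radar_doppler_shift(v_radial_mps, carrier_hz):
    """Two-way radar Doppler shift: f_d = 2 * v * f0 / c."""
    return 2.0 * v_radial_mps * carrier_hz / C

# A target closing at 30 m/s, seen by a 77 GHz automotive radar:
print(radar_doppler_shift(30.0, 77e9))  # ~15410 Hz, i.e. about 15.4 kHz
```

That kilohertz-scale shift on a gigahertz carrier is what lets a radar read off relative speed directly, which a single lidar pulse cannot.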
Suffice it to say, I think random pulses of light must always be in play, even if @Diogenese is the only one who could not sleep and has gone for a drive in the early hours to watch the transit of Venus or something.
Having more than one sensor and majority rules will deal with this issue of random inputs.
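The "majority will rule" fusion described above is easy to sketch (the sensor names and labels are invented for illustration):

```python
from collections import Counter

def majority_rules(readings):
    """Fuse object labels from independent sensors by simple majority vote."""
    votes = Counter(readings.values())
    label, count = votes.most_common(1)[0]
    # Require a strict majority; otherwise flag the conflict for review.
    if count > len(readings) / 2:
        return label
    return "conflict"

print(majority_rules({"lidar": "pedestrian", "radar": "pedestrian", "camera": "pole"}))
# pedestrian -- two of three sensors agree, so the stray input is outvoted
print(majority_rules({"lidar": "car", "radar": "pedestrian"}))
# conflict -- no strict majority with only two disagreeing sensors
```

This is also why an odd number of independent inputs is the comfortable case: with two sensors, any disagreement is a stalemate.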
My opinion only, made up completely out of my own head with nothing but high school science. DYOR.
FF
AKIDA BALLISTA
In a nutshell, it's like trying to retrieve a squash shot from the back corner with your $200 cane racquet, which you destroy against the wall in the attempt, only different (sob, sob, ...)

First of all, I am a Luddite with some science knowledge. Is that the equivalent of accidental radio jamming, or interference?
Diogenese Pty Ltd (incorporated in Seychelles) apologizes for the previous post, but pleads provocation because someone posted Monty Python above.
... and if you're old enuf to remember cane squash racquets, you'll know I don't harbour a grudge ... for very long.
Now what brought that up?
Oh yes, "jamming" - yep - that's it in a nutshell - nutshells are quite small so you have to jam it in.
I doubt it.

Australia needs a person like you in politics. I have no idea of your leaning, but I would love to see a brain like yours supporting the Left; you would be an asset and the country would benefit from your contribution. You sound like a nice bloke. Just imagine FF for Attorney-General ... whhooo whoooo!
A bit late, but:

Yes. Why? How else am I supposed to keep count of that many big numbers?
Thanks for the refresher!

Just a refresher for all of us:
AI player BrainChip on a roll; signs two contracts within a month
via KalkineMedia
Artificial Intelligence is expected to have a firm grip on the market in the future. A report published in May 2020 by the Australian Government highlighted a variety of high-profile demonstrations of Artificial Intelligence and significant progress in the fields of self-driving cars, game-playing machines and virtual assistants. Further, AI has had a considerable role to play in managing the current COVID-19 crisis.
During the last decade, there have been five vital areas where significant growth has been witnessed. These include:
- Image understanding
- Intelligent decision making
- Artificial creativity
- Natural Language Processing
- Physical automation

The scope of AI is not exhausted by these; however, the above five areas have shown significant change in the past ten years.
ASX-listed BrainChip Holdings Ltd (ASX:BRN) is one such technology company, engaged in developing an innovative neuromorphic processor that brings AI to the edge in a manner beyond the abilities of other neural network devices. The solution is high-speed, small and low-power. It enables a broad range of edge capabilities comprising continual learning, local training and inference.
BRN, during April 2020, introduced its AKD1000 to attendees at the Processor Virtual Conference held by the Linley Group. The AKD1000's neural processor is capable of running a standard convolutional neural network by transforming it into an event-based one, allowing it to perform incremental learning on-chip.
A convolutional neural network (CNN) is a type of deep neural network used for analysing images. CNNs have a specially designed architecture that makes them comparatively easy to train, even for relatively deep networks.
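As a rough illustration of what "transforming a CNN into an event-based network" can mean (this is a generic rate-coding toy, not BrainChip's actual conversion pipeline, which is not described here): activation magnitudes become spike counts over a time window, so downstream layers process discrete events rather than continuous values.

```python
def to_spike_counts(activations, timesteps=10):
    """Toy rate coding: map each normalised activation (0..1) to a spike
    count over a fixed time window, turning floats into discrete events."""
    counts = []
    for a in activations:
        a = max(0.0, min(1.0, a))          # clamp, like a saturating ReLU
        counts.append(round(a * timesteps))
    return counts

print(to_spike_counts([0.0, 0.25, 0.9, 1.3]))
# [0, 2, 9, 10]  (Python rounds halves to even, so 2.5 -> 2)
```

The efficiency argument for event-based hardware is that zero activations produce zero spikes, so silent neurons cost essentially nothing downstream.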
After the introduction of the AKD1000, BrainChip recently signed two agreements, after which the Company noted a significant improvement in its share price over the past couple of weeks. BRN shares, which settled at A$0.058 on 22 May 2020, reached A$0.120 on 9 June 2020, representing growth of ~106.9%.
On 9 June 2020, the share price skyrocketed after the release of the Company's announcement of its joint agreement with Valeo Corporation. The stock settled at A$0.110 on 10 June 2020, down 8.333%.
Let us look at the two recent deals signed by the Company that led to the stock rally.
Joint Agreement with Tier-1 Automotive Supplier
On 8 June 2020, BrainChip Holdings Ltd entered a joint development agreement, based on BRN's Akida neuromorphic SoC, with Valeo Corporation, a Tier-1 European automotive supplier of sensors and systems for Advanced Driver Assistance Systems (ADAS) and Autonomous Vehicles (AV).
Under the agreement, there are certain performance milestones and payments that are anticipated to cover the Company's expenditures. The term of the deal is defined by the accomplishment of performance goals and the availability of the Akida devices. Either party has the option to end the agreement for convenience with specified notice.
The validation of the Company's Akida device by a Tier-1 supplier of sensors and systems to the automotive industry is believed to be significant progress.
In ADAS and AV applications, the real-time processing of data is vital for the safety and dependability of autonomous systems. Suppliers and manufacturers in the automotive industry have acknowledged that the advanced and highly efficient neuromorphic nature of the Akida SoC makes it ideally suited to processing data at the "Edge" for their advanced system solutions.
By integrating the Akida neural network processor with sensors, the resulting system can attain ultra-low power, minimal latency, maximum reliability and incremental learning.
The Akida neural processor's game-changing high performance and ultra-low power consumption enable smart sensor fusion by resolving power and footprint difficulties for a range of sensor technologies. Further, it consumes less power than alternative AI solutions while maintaining the necessary performance and accuracy in a fraction of the physical space.
Agreement with Ford Motor Company for the Evaluation of the Akida™ Neural Processor
On 24 May 2020, the Company signed a joint agreement with the Detroit-based Ford Motor Company for evaluation of the Akida neural network System-on-Chip for Advanced Driver Assistance Systems and Autonomous Vehicle applications.
The evaluation agreement, signed with Ford Motor Company, was binding on execution and is not subject to a fixed term. The deal is based upon a partnership to assess Akida as it relates to the automotive industry; payments under the agreement are proposed to cover related expenses and are received periodically during the evaluation process.
The Akida NSoC has an advanced and highly efficient neuromorphic nature, and the partners in the collaboration have realised that these features offer a broad range of potential solutions to complex problems such as driver behaviour assessment and real-time object detection.
The Akida NSoC exemplifies ground-breaking neural processing for Edge AI systems and devices. Each Akida NSoC has 10 billion synapses and 1.2 million neurons, demonstrating orders-of-magnitude better efficiency than other neural processing devices available.
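Taking those headline numbers at face value, a quick back-of-envelope division gives the average connectivity per neuron:

```python
synapses = 10_000_000_000   # 10 billion synapses per Akida NSoC (as quoted)
neurons = 1_200_000         # 1.2 million neurons (as quoted)

print(round(synapses / neurons))  # 8333 -- average synapses per neuron
```

For comparison, biological cortical neurons are often cited at several thousand synapses each, so the quoted figures are at least in a brain-like ballpark.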
The unique combination of ultra-low power, high performance and on-chip learning enables real-time processing at the sensor along with continuous learning. The objective is to facilitate personalisation of each driver's experience in real time, with constant updates to the system as environmental conditions change.
I received another email this morning after I asked where he can see BrainChip being involved. His reply:
‘Nothing will come out of this neuromorphic silicon neurons approaches. Brainchip is behind Intel and they can’t make it!’
He also stated in a previous email:
‘I believe there is no future of neural board replicating neurons, the use is very limited, HOTS as an example could work on a spiking neural network but this is not silicon friendly. The idea is to adapt what you want to compute to the available substrate.’
Thanks Taproot,
This may help convince @Fact Finder that I wasn't blathering on just to air my tonsils.
And, of course, Akida is perfect for sieving out the specific digitally coded radar signals from each individual vehicle.
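To make "sieving out digitally coded signals" concrete: the classical signal-processing tool for this is the matched filter, i.e. cross-correlating the received stream with your own code so that the peak marks your own echo. (This is a toy, noise-free sketch of that generic technique; the codes and offsets are invented, and nothing here describes how Akida itself does it.)

```python
def cross_correlate(signal, code):
    """Slide the code along the received signal; a sharp peak in the
    score marks where our own coded echo begins."""
    n = len(code)
    return [sum(signal[i + j] * code[j] for j in range(n))
            for i in range(len(signal) - n + 1)]

my_code = [1, -1, 1, 1, -1]
# Received stream: a fragment of someone else's pulse, then our echo at offset 7.
received = [0, -1, 1, 0, 0, 0, 0] + my_code + [0, 0]

scores = cross_correlate(received, my_code)
print(scores.index(max(scores)))  # 7 -- our own echo starts here
print(max(scores))                # 5 -- full match on all 5 chips
```

With longer codes the peak towers further above the clutter, which is what lets many emitters share the same band without confusing each other.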
Just like Warnie's flipper/doosra, you'd be surprised.
See this from the venerable Daily Mail:
https://www.dailymail.co.uk/science...w-theory-completely-change-view-universe.html
Was Einstein WRONG about the speed of light? New theory could completely change our view of the universe
- Albert Einstein believed the speed of light was constant in a vacuum
- This constant has formed the basis of many theories in modern physics
- But the model of inflation leaves a conundrum called the Horizon Problem
- Physicists believe that light may have travelled faster in the early universe, before slowing to present levels, which could be tested with observations