Dougie54
Regular
Coupala daaaays

Don’t know if you remember Con but if he was here he would say:
Beautifuuuul.
FF
AKIDA BALLISTA
I'll see your: "process inference on Akida", and raise:

I said it was speculation so you can put your ogre back under the bridge.
The words used are to “process inference on Akida”, which is not how Anil Mankar or others from Brainchip normally put it.
I think that with LSTM lurking in the background your ogre has jumped too early.
My opinion only DYOR
FF
AKIDA BALLISTA
Well they do not appear on their list of partners. If Brainchip fills the gap on the list over the next week or so then I suspect they have moved on.

Can we assume from the Prophesee / Brainchip partnership that Prophesee are no longer using SynSense for their neuromorphic chips? I can’t see how there would be a need for Prophesee to use Akida as well as any of the SynSense hardware or software?!
Partnering up with Prophesee is great….but edging out their existing neuromorphic partner is even better….and a real validation of our technology versus a direct and now proven inferior competitor.
Therein lies the weakness of your argument ‘as far as we know’.

I'll see your: "process inference on Akida", and raise:
"We've successfully ported the data from Prophesee's neuromorphic-based camera sensor to process inference on Akida with impressive performance," said Anil Mankar, Co-Founder and CDO of BrainChip. "This combination of intelligent vision sensors with Akida's ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution."
I think the first "process" could be replaced by "perform" or "carry out" without altering the meaning.
Also "to process inference on Akida with impressive performance" implies that the inferencing was carried out on an Akida SoC rather than in software simulation, and, as far as we know, there is only the Akida 1000 SoC.
I rest my briefs.
PS: As to "speculation" consider it a pre-emptive ogre.
More importantly, this partnership once again imo validates our AI / IP technology with a well known and well connected Co.
Yeah…interesting that SynSense aren’t listed as a partner on the About tab here…
https://www.prophesee.ai/about-prophesee/

Well they do not appear on their list of partners. If Brainchip fills the gap on the list over the next week or so then I suspect they have moved on.
Perhaps SynSense web site will tell the story.
My opinion only DYOR
FF
AKIDA BALLISTA
Imagine Renault using Akida, phew
I think our Co name would look a lot better if it was placed at the "top" of the partners lists.... lol

This looks better
View attachment 9679
DYOR. I've saved Prophesee the hard work by showing how nice their page would look with our company logo.
Could we be in this?…possibly the driver for announcing the partnership if they’re starting to make some sales?…
If you take the following part of Prophesee’s statement above:

"We've successfully ported the data from Prophesee's neuromorphic-based camera sensor to process inference on Akida with impressive performance," said Anil Mankar, Co-Founder and CDO of BrainChip. "This combination of intelligent vision sensors with Akida's ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution."
"By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings, said Luca Verre, CEO and co-founder of Prophesee."
It's great to be a shareholder
WASCA - What's a SynSense Chip Anyway

If you take the following part of Prophesee’s statement above:
“"By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs”
with what they said when they partnered with SynSense in October 2021, it is clear that SynSense was not all they thought it was going to be:
SynSense and Prophesee develop one-chip event-based smart sensing solution
Partnership leverages respective leadership in neuromorphic computing and sensing to realize on-sensor processing integration leading to small form-factor, low-power and cost-effective IoT solutions
www.prophesee.ai
If you were SynSense you would cringe at the statement that Brainchip is ‘better able’ to do what they were supposed to in the above announcement.
Ubiquitous. DEFAULT STANDARD.
My opinion only DYOR
FF
AKIDA BALLISTA
Hi FF,

Therein lies the weakness of your argument ‘as far as we know’.
Given we all know we do not know everything that is taking place behind the Brainchip veil, your certainty is certainly misplaced.
If you read what Anil Mankar said at the 2021 AI Field Day, he left wide open the opportunity that LSTM could be added to the AKD1000 neural fabric by software, without changes to the fabric, as it already uses ‘integrate and fire’ neurons.
So put your briefs back on; for the moment my speculation is still taking up the surface of the table.
My opinion only DYOR
FF
AKIDA BALLISTA
I am still speculating. The logic is that the next generation with LSTM has been available to early access customers and robotics needs LSTM and Prophesee have jumped ship mid voyage.

Hi FF,
Personally I disagree with all the comments said here about this. The word inference has different meanings depending on which industry you are referring to. In the context of machine learning, inference refers to running data through a trained model or algorithm to calculate an output (see link below).
So in the context here, processing inference simply refers to processing sensor data. Nothing further to read into it, and definitely no reference to LSTMs.
On top of this, since English isn't Anil's primary language, I think some leniency may be required, so there's not much point reading too deeply into how his sentences are structured.
Pure speculation, DYOR
Model inference overview | BigQuery | Google Cloud
cloud.google.com
Machine learning inference is the process of running data points into a machine learning model to calculate an output such as a single numerical score. This process is also referred to as “operationalizing a machine learning model” or “putting a machine learning model into production.”
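Since a few posts turn on what "inference" actually means, the Google Cloud definition above can be sketched in a few lines of Python: one data point goes into a trained model, and a single numerical score comes out. The weights here are invented purely for illustration; they have nothing to do with Akida or any real deployed model.

```python
# Toy sketch of ML inference: run one data point through a "trained"
# model and get back a single numerical score (logistic regression).
import math

# Hypothetical trained parameters -- made up for this example only.
WEIGHTS = [0.8, -0.3, 0.5]
BIAS = 0.1

def infer(features):
    """Compute a score in (0, 1) for one data point."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z to a score

# "Operationalizing" the model is just calling it on incoming data.
score = infer([1.0, 2.0, 0.5])
print(score)
```

Training is the expensive part that produces WEIGHTS and BIAS; inference is only this cheap forward pass, which is exactly the workload an edge device like a sensor-attached accelerator is asked to run.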
"offers manufacturers a ready-to-implement solution" is interesting. We might see revenue brought forward.

Good morning
BrainChip Partners with Prophesee Optimizing Computer Vision AI Performance and Efficiency
LAGUNA HILLS, CA / ACCESSWIRE / June 19, 2022 / BrainChip Holdings Ltd (ASX:BRN)(OTCQX:BRCHF)(ADR:BCHPY), the world's first commercial producer of neuromorphic AI IP, and Prophesee, the inventor of the world's most advanced neuromorphic…
crweworld.com
Hi IDD,